s3nukem – Delete large Amazon S3 buckets

s3nukem is a slightly improved version of s3nuke, a Ruby script by Steve Eley that deletes an Amazon Web Services (AWS) Simple Storage Service (S3) bucket containing many objects (millions) relatively quickly by using multiple threads to retrieve and delete the individual objects.

Improvements include:

  • The key-retrieval thread pauses when the queue reaches 1000 * thread_count items; the original script’s queue grew without bound, eating up memory unnecessarily (see the sketch after this list).
  • All output is flushed immediately, so you can watch progress in real time.
  • The output now includes the number of seconds elapsed since the script started, so you can calculate the rate at which items are being deleted.
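
The queue cap is the important change. Here is a minimal sketch of the bounded producer/consumer pattern, with flushed output and an elapsed-time counter; the list_some_keys and delete_key methods are hypothetical stand-ins for the S3 calls, not s3nukem’s actual code:

    require 'thread'   # SizedQueue and Mutex (needed on older Rubies)

    THREADS   = 10
    QUEUE_CAP = 1000 * THREADS

    # Hypothetical stand-ins so the sketch runs without AWS; the real
    # script lists and deletes S3 objects here instead.
    PAGES = Array.new(5) { |p| Array.new(1000) { |i| "key-#{p}-#{i}" } }
    def list_some_keys; PAGES.shift || []; end  # one "page" of key names
    def delete_key(key); end                    # stands in for the S3 DELETE

    $stdout.sync = true                  # flush all output immediately
    queue   = SizedQueue.new(QUEUE_CAP)  # push blocks when the queue is full
    start   = Time.now
    deleted = 0
    lock    = Mutex.new

    # Producer: the key-retrieval thread. Because the queue is a SizedQueue,
    # push blocks once QUEUE_CAP items are waiting, keeping memory bounded.
    producer = Thread.new do
      loop do
        keys = list_some_keys
        break if keys.empty?
        keys.each { |k| queue.push(k) }
      end
      THREADS.times { queue.push(:done) }  # one sentinel per delete thread
    end

    # Delete threads: pop keys, delete them, and report progress along
    # with the seconds elapsed since the script started.
    workers = Array.new(THREADS) do
      Thread.new do
        while (key = queue.pop) != :done
          delete_key(key)
          lock.synchronize do
            deleted += 1
            puts "#{(Time.now - start).round}s: #{deleted} deleted" if deleted % 1000 == 0
          end
        end
      end
    end

    producer.join
    workers.each(&:join)

SizedQueue provides the back-pressure for free: the retrieval thread simply blocks on push until the delete threads catch up.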

Background

Like many others, I needed to delete an S3 bucket with many objects. As you may know, you first have to delete all the objects in the bucket, which is not a quick task when the bucket holds hundreds of thousands or millions of objects.

The bucket I needed to delete had 99 million objects. Attempts to delete the bucket through S3fox and even through the AWS Management Console would fail!

s3cmd, which uses a single thread, was deleting objects at a rate of about 1,800/minute (2.5 million/day). At that rate, the deletion would have taken about 40 days.

s3nuke/s3nukem, which I ran with the default 10 threads, deleted objects at a rate of about 9,000/minute (13 million/day), reducing the job to about 7.5 days.

Since my deletion was a bit larger than Steve’s (his bucket held about 260,000 objects), I had to make a couple of improvements to s3nuke (listed above) so that it wouldn’t slow down and crash, and so that I could keep an eye on its progress. You can find my fork at http://github.com/lathanh/s3nukem

Quick download: http://github.com/lathanh/s3nukem/raw/master/s3nukem


6 Responses to s3nukem – Delete large Amazon S3 buckets

  1. Just wanted to add that those who want to use a GUI to delete buckets with millions of objects can use S3 Browser for Windows.

    It can also use multiple threads to quickly delete very large numbers of objects.

  2. Travis says:

    Thanks for this… really saved me. Doing it via Cyberduck was not going to cut it for over 82K items.

  3. Tony says:

    There is also DragonDisk (http://www.dragondisk.com/). It is very fast, and it is available for Windows, Mac, and Linux.

  4. Shaun says:

    How hard would it be to re-purpose s3nukem to only delete files with a common prefix that were older than a set date? For example, delete all items in ‘/queue’ created before 1/1/2011.
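
    Probably not very hard; it is mostly a filtering change. Here is a rough single-threaded sketch of that filter, assuming the right_aws gem (s3nukem’s own client code may differ); the credentials, bucket name, prefix, and cutoff below are placeholders:

        require 'rubygems'
        require 'right_aws'
        require 'time'

        s3     = RightAws::S3.new('ACCESS_KEY', 'SECRET_KEY')  # placeholders
        bucket = s3.bucket('my-bucket')
        cutoff = Time.parse('2011-01-01')

        # List only the keys under the prefix, then delete the ones older
        # than the cutoff; last_modified comes back as a string in the
        # bucket listing, so parse it before comparing.
        bucket.keys('prefix' => 'queue/').each do |key|
          key.delete if Time.parse(key.last_modified) < cutoff
        end

    For a bucket with millions of objects you would want to list incrementally and hand the deletes to a thread pool, as s3nukem does, rather than load every key into memory at once.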

  5. Dean says:

    Good tool – wondering how it works on versioned buckets? (It seems prudent to suspend versioning before running this tool.)

  6. josh franta says:

    S3 Browser is excellent.

    I was able to manage a bucket with 29 million objects extremely quickly.
