All Your Bucket Are Belong to Us… Protect S3 Buckets from Data Leaks and Hacks


Bot of the Week: S3 Bucket Permissions

What does this Bot do: Identify buckets exposing data with permissive access lists

This bot continuously monitors storage containers such as AWS S3 buckets and identifies any that have read, write, or delete permissions open to the world.
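
As an illustration of the kind of check this bot automates (a minimal sketch, not DivvyCloud’s actual implementation), the snippet below uses the boto3 AWS SDK to list every bucket in an account and flag any ACL grant made to the world-accessible AllUsers or AuthenticatedUsers groups:

```python
# Minimal sketch of an open-bucket scan with boto3 (assumes AWS credentials
# are already configured); illustrative only, not DivvyCloud's implementation.
import boto3

# ACL grantee URIs that mean "anyone on the internet" / "any AWS account".
PUBLIC_GRANTEE_URIS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def find_open_buckets():
    """Return (bucket_name, permission) pairs whose ACLs are open to the world."""
    s3 = boto3.client("s3")
    findings = []
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        for grant in acl["Grants"]:
            grantee = grant["Grantee"]
            if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEE_URIS:
                findings.append((bucket["Name"], grant["Permission"]))
    return findings

if __name__ == "__main__":
    for name, permission in find_open_buckets():
        print(f"{name}: {permission} is open to the world")
```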

Why do I care?

Amazon Web Services S3 buckets are storage containers in the cloud that are used to house data, documents, and images, or to host static websites. If a bucket’s permissions are left open, anyone can read, modify, or delete its contents. Running buckets with this type of access policy can result in data loss, data exposure, and, in the case of static website hosting, downtime.

Permissions Matter

With S3, you can set access control permissions on your buckets. These govern who can read, write to, and delete from each bucket. Let’s say you’re hosting a website from S3. You’ll want visitors to be able to read everything on the site, but you wouldn’t want them to be able to modify or delete your content. On the other hand, if you have a bucket that stores personal or sensitive information such as customer records, you’ll want to remove public read permissions from your policy. By setting and automating specific permissions, you can prevent the viewing or altering of stored data, protecting the organization and its clients.
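
As a concrete example of the website case, the sketch below attaches a public-read-only bucket policy with boto3. The bucket name my-static-site is a placeholder; visitors get s3:GetObject and nothing else, so they can view content but never modify or delete it. (For this call to succeed, the bucket’s Block Public Access settings must allow public policies.)

```python
# Sketch: grant the world read-only access to a static-website bucket.
# "my-static-site" is a placeholder bucket name.
import json
import boto3

PUBLIC_READ_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadOnly",
            "Effect": "Allow",
            "Principal": "*",                             # anyone
            "Action": "s3:GetObject",                     # read objects only
            "Resource": "arn:aws:s3:::my-static-site/*",  # every object in the bucket
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket="my-static-site", Policy=json.dumps(PUBLIC_READ_POLICY))
```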

Guard your buckets

The best way to guard your buckets is to tailor access control list (ACL) permissions to the minimum level of access required. Typically, only static website buckets should be open to the world; all other buckets should be locked down to authorized stakeholders only.
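
For buckets that should never be public, one way to enforce that lock-down is S3’s Block Public Access feature. The sketch below turns on all four settings for a placeholder bucket named customer-records, so public ACLs and public bucket policies are ignored or rejected going forward.

```python
# Sketch: lock down a sensitive bucket with S3 Block Public Access.
# "customer-records" is a placeholder bucket name.
import boto3

s3 = boto3.client("s3")
s3.put_public_access_block(
    Bucket="customer-records",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,        # reject new public ACLs
        "IgnorePublicAcls": True,       # ignore any existing public ACLs
        "BlockPublicPolicy": True,      # reject new public bucket policies
        "RestrictPublicBuckets": True,  # restrict access granted by any public policy
    },
)
```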

Give it a try!

Over 100 out-of-the-box Bots are available on the DivvyCloud GitHub repo. Sign up at BotFactory.io for a free test drive.
