An Open Letter To Cloud Admins: Secure Your S3 Buckets

Creating Cloud-Based Applications That Are Secure AND Usable Is Hard

Another day, another headline about data security “In the Cloud”.  Here’s a sampling of past headlines about data loss in the Cloud:

I’m sure there are plenty of others.  Here’s the most unfortunate part of this trove of reporting — your existing impression of Cloud security will determine how you interpret these headlines.

If you have no previous Cloud experience (as is the case with most Semiconductor Executives), your takeaway on this is simply “Wow, the Cloud is really insecure.  No way am I putting any of my data out there”.

If you have experience with the Cloud, your takeaway is likely “It doesn’t take much effort at all to secure your S3 bucket.  Leaving a bucket wide open is just foolish”.

The fact of the matter is that under Amazon’s Shared Responsibility Model, it’s up to YOU, the admin, to make proper use of the infrastructure they provide.  There are tons of legitimate reasons you might want a bucket wide open to read.  Maybe you’re hosting a static website on S3, or you’ve got video assets that you link to from your dynamic web site and want to take advantage of S3’s low-cost options to deliver them.  It’s not Amazon’s place to force you to choose a model; it’s up to you to enforce security based on what your requirements are.  In the cases above, the use cases demanded locked-down buckets.  These breaches are not a problem with the service or Amazon’s delivery of the service.  They are failures of administrative controls.

Open Buckets Don’t Just Happen

Here are the default settings that pop up when you create a new bucket:


Amazon now helpfully makes the default bucket policy private.  So let’s be clear here, for any new buckets created, an administrator explicitly has to open up the policy.  Now let’s say that you decide to open up the permissions.  Here’s what AWS has to say to you:

It’s pretty obvious that you’re about to do something that would negatively impact your data security posture, assuming of course that this bucket isn’t meant to be public.

Furthermore, if you go to the S3 bucket overview for the bucket you just created, you’ll see that AWS highlights public access for the bucket:

Fortunately, locking down permissions is straightforward.  Just select the bucket, go to the Permissions tab, and select Public Access:

At this point, AWS tells you 3 times on this page that the bucket is public.  All you have to do is de-select “List objects” and save, and voila, your bucket is no longer public.
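The same lockdown the console steps above perform can also be applied programmatically via S3’s “Block Public Access” feature.  Here’s a minimal sketch that builds the configuration document; the bucket name is a hypothetical placeholder, and the boto3 call shown in the comment is how you’d actually apply it:

```python
def public_access_block_config():
    """Build the configuration that blocks all four public-access vectors."""
    return {
        "BlockPublicAcls": True,        # reject new public ACLs on objects/bucket
        "IgnorePublicAcls": True,       # ignore any public ACLs already in place
        "BlockPublicPolicy": True,      # reject bucket policies that grant public access
        "RestrictPublicBuckets": True,  # restrict public/cross-account policy access
    }

if __name__ == "__main__":
    config = public_access_block_config()
    # With boto3, this would be applied roughly as:
    #   boto3.client("s3").put_public_access_block(
    #       Bucket="my-example-bucket",  # hypothetical bucket name
    #       PublicAccessBlockConfiguration=config,
    #   )
    print(config)
```

Turning on all four settings at once is the safe default for any bucket that has no business being public.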

Going Further: Encryption and Bucket Policies

If you have sensitive data that you are storing in an S3 bucket, you’ll also want to do things like enabling encryption and establishing a bucket policy.

Unfortunately, if you’ve already created a bucket and populated it with data, simply turning on encryption will not encrypt the existing data.  You’ll either need to create a new bucket to migrate your data into, or create a temp bucket if you need to keep the existing bucket name.  Note that bucket names are globally unique across all AWS regions; you can’t have the same bucket name in two regions.
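One common way to re-encrypt existing objects without migrating buckets is an in-place copy: S3 rewrites the object with a server-side-encryption header.  Here’s a hedged sketch that just builds the arguments for such a copy; the bucket and key names are hypothetical, and very large objects (over 5 GB) would need a multipart copy instead:

```python
def reencrypt_copy_args(bucket, key):
    """Build arguments for an in-place copy that adds SSE-S3 (AES-256)."""
    return {
        "Bucket": bucket,
        "Key": key,
        "CopySource": {"Bucket": bucket, "Key": key},  # copy the object onto itself
        "ServerSideEncryption": "AES256",              # request SSE-S3 on the new copy
        "MetadataDirective": "COPY",                   # preserve existing metadata
    }

if __name__ == "__main__":
    # Hypothetical bucket and object names, for illustration only.
    args = reencrypt_copy_args("my-example-bucket", "reports/q3.csv")
    # With boto3: boto3.client("s3").copy_object(**args)
    print(args["ServerSideEncryption"])
```

Looping this over a bucket listing gets existing data encrypted without changing the bucket name.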

AWS gives you 2 options for encrypting your buckets:  S3 managed keys or KMS managed keys.  Here’s Amazon’s description of S3 managed keys:

Amazon S3 encrypts each object with a unique key. As an additional safeguard, it encrypts the key itself with a master key that it regularly rotates. Amazon S3 server-side encryption uses one of the strongest block ciphers available, 256-bit Advanced Encryption Standard (AES-256), to encrypt your data.

The key takeaway here is that AWS manages the master key.  If you would prefer to manage the master key yourself, you should look at using KMS (Key Management Service):

The highlights of SSE-KMS are:

  • You can choose to create and manage encryption keys yourself, or you can choose to use your default service key, uniquely generated on a customer-by-service-by-region level.
  • The ETag in the response is not the MD5 of the object data.
  • The data keys used to encrypt your data are also encrypted and stored alongside the data they protect.
  • Auditable master keys can be created, rotated, and disabled from the IAM console.
  • The security controls in AWS KMS can help you meet encryption-related compliance requirements.

KMS lets you manage the master key yourself, which is an option I’m sure most Semiconductor companies will prefer.
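Whichever option you choose, you can set it as the bucket’s default encryption so every new object is encrypted automatically.  Here’s a sketch of the default-encryption rule for SSE-KMS; the KMS key ARN is a made-up placeholder, and the boto3 call in the comment is how it would be applied:

```python
def sse_kms_encryption_rule(kms_key_id):
    """Build a bucket default-encryption configuration using SSE-KMS."""
    return {
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",     # use "AES256" here for SSE-S3 instead
                    "KMSMasterKeyID": kms_key_id,  # your customer-managed key
                }
            }
        ]
    }

if __name__ == "__main__":
    # Placeholder key ARN for illustration only.
    rule = sse_kms_encryption_rule("arn:aws:kms:us-east-1:111122223333:key/EXAMPLE")
    # Applied roughly as:
    #   boto3.client("s3").put_bucket_encryption(
    #       Bucket="my-example-bucket",
    #       ServerSideEncryptionConfiguration=rule,
    #   )
    print(rule["Rules"][0]["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"])
```

Swapping the algorithm string between "aws:kms" and "AES256" is the only change needed to move between the two options discussed above.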

AWS has a great section on S3 bucket policies called How Do I Add an S3 Bucket Policy?  I highly suggest reading through that, assuming that you are familiar with Identity and Access Management (IAM) principles already.
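To give a flavor of what that guide covers, here’s a sketch of one common bucket policy: denying any request that isn’t made over TLS.  The bucket name is hypothetical, and this is one example pattern, not the only policy you’d want:

```python
import json

def deny_insecure_transport_policy(bucket):
    """Build a bucket policy that denies all non-TLS (plain HTTP) requests."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket}",      # the bucket itself
                    f"arn:aws:s3:::{bucket}/*",    # every object in it
                ],
                # Condition matches requests made without TLS
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }
        ],
    }

if __name__ == "__main__":
    policy = deny_insecure_transport_policy("my-example-bucket")  # hypothetical name
    # Applied as the bucket's policy document, e.g.:
    #   aws s3api put-bucket-policy --bucket my-example-bucket \
    #       --policy file://policy.json
    print(json.dumps(policy, indent=2))
```

Note the Resource list needs both ARNs: one for bucket-level actions and one for object-level actions, a detail that trips up many first-time policy authors.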

With Great Power Comes Great Responsibility

Cloud technologies give the user tremendous power, but just like on-prem infrastructure, it’s up to you how you use it.  This is probably one of the most dangerous aspects of “Shadow IT” in the Semiconductor world.  It is not unheard of for software development groups to set up their own Cloud accounts and start putting their builds up out of frustration with plodding IT processes.  Suddenly, all of the shackles of IT are gone and they can just “get stuff done”.  Unfortunately, there is great opportunity for them to create highly insecure data access patterns for the sake of expediency.  That’s why it’s critical for IT shops to become more Cloud savvy and quickly provide both secure and productive Cloud environments.  There are simple things you can do to protect your data, such as restricting bucket access, enabling encryption, and establishing bucket policies.  It will take some testing and understanding of Cloud principles such as IAM and resource names, but that investment is well worth it to stay out of the headlines, or the Courtroom.


Derek Magill has been in Engineering IT for over 20 years, starting out on the UNIX Help Desk at Texas Instruments. While at TI, he held several technical and leadership roles, mainly focused on the areas of license management and HPC. While at Qualcomm, he led the global EDA License Infrastructure team, the Grid Administration Team, and was the primary Engineering Cloud Architect. He currently is an HPC Solutions Architect at Flux7 Labs, a DevOps Cloud Consultancy. Derek has also served as the Chairman of CELUG since 2015 and is the Executive Director of the Association of High Performance Computing Professionals. He also is a member of the Executive Committee of the 2020 Design Automation Conference.

