Let’s Keep the Bad Stuff Out

  • 10 November 2021
  • 1 reply


Hi all,

I am reaching out about a major issue we have been hearing a lot about recently. It is not new (it has been around for a couple of years), but it is of vital importance and has become a "standard" bad practice: organizations allow direct traffic to and from any AWS S3 bucket without analyzing, sanitizing, or blocking it with web filters, potentially exposing the organization to all sorts of attacks.

A few words for those of you who don’t know exactly what AWS S3 buckets are. S3 is the AWS Simple Storage Service, which according to Amazon is commonly used to “store and protect any amount of data for a range of use cases, such as websites, mobile applications, backup and restore, archive, enterprise applications, IoT devices, and big data analytics”. At a high level, S3 consists of “buckets” and “objects”, where objects are the files stored in a bucket. Because the service is so versatile, various users at an organization (IT, Dev, Sales, Finance, etc.) need it and can set up one or more S3 buckets for their operations fairly quickly. That setup may follow an industry standard, organizational security requirements, best practice, or simply the default settings.
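To make the bucket/object structure concrete from a network-filtering point of view, here is a minimal sketch of how S3 traffic shows up in URLs. The two hostname patterns (virtual-hosted-style and path-style) are the documented S3 request styles; the bucket name, region, and object key in the example are hypothetical.

```python
from urllib.parse import urlparse

def parse_s3_url(url: str):
    """Return (bucket, object_key) if the URL points at S3, else None."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    path = parsed.path.lstrip("/")
    # Virtual-hosted-style: https://<bucket>.s3.<region>.amazonaws.com/<key>
    if ".s3." in host and host.endswith(".amazonaws.com"):
        bucket = host.split(".s3.")[0]
        return bucket, path
    # Path-style: https://s3.<region>.amazonaws.com/<bucket>/<key>
    if host.startswith("s3.") and host.endswith(".amazonaws.com"):
        bucket, _, key = path.partition("/")
        return bucket, key
    return None

# Hypothetical bucket and object, for illustration only:
print(parse_s3_url("https://reports.s3.eu-west-1.amazonaws.com/q3/summary.pdf"))
# → ('reports', 'q3/summary.pdf')
```

Because any tenant of AWS (including an attacker) can create a bucket under these same amazonaws.com hostnames, a gateway rule that trusts the domain alone cannot distinguish your buckets from anyone else's.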

Just as organizational users use S3 buckets, so do threat actors: they may set buckets up intentionally as part of an attack, or take over unprotected buckets and leverage them to jump-start a campaign. The organization's security team must consider this threat as part of the security framework and work on lowering the risk.

This can be achieved by doing the following:

  1. Prevent wildcard routing to AWS services. Enforce this through the networking teams or other stakeholders in charge of the web gateway.
  2. Use SSL inspection as part of the web gateway's capabilities.
  3. When a specific vendor or team within the organization requires a connection to AWS services, add the specific IP or domain to the web gateway's rules.
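The allowlist idea in steps 1 and 3 can be sketched as follows: rather than a wildcard rule that passes anything under amazonaws.com, the gateway permits only explicitly approved bucket hostnames. This is a simplified illustration of the policy logic, not any particular gateway's configuration, and the bucket hostnames below are hypothetical.

```python
# Hypothetical approved buckets, maintained by the gateway owners:
ALLOWED_S3_HOSTS = {
    "finance-backups.s3.eu-west-1.amazonaws.com",
    "vendor-data.s3.us-east-1.amazonaws.com",
}

def gateway_allows(hostname: str) -> bool:
    """Allow AWS traffic only to explicitly approved bucket hostnames."""
    if hostname.endswith(".amazonaws.com"):
        # No wildcard pass-through: unknown AWS hosts are blocked.
        return hostname in ALLOWED_S3_HOSTS
    # Non-AWS traffic falls through to the gateway's other rules.
    return True
```

With this in place, a bucket an attacker stood up five minutes ago is blocked by default, while the approved buckets your teams actually depend on keep working.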

Once this is set up, it is important to make sure that no changes at the web gateway re-enable this traffic over time. This can easily be validated periodically by scheduling an automated web gateway assessment on the Cymulate platform that tests the files and exploits categories on a daily basis.

So let’s manage our ‘doors’ to the organization better, and keep the bad stuff out.

1 reply

Something I encourage many orgs to do is document what each S3 bucket is used for and the type of information that SHOULD be stored in it. This encourages the bucket's users to create a better hygiene process, and of course it helps for audit purposes.