S3 Bucket Enumeration and Exploitation
Amazon S3 (Simple Storage Service) is an object storage service from Amazon Web Services (AWS) for storing and retrieving any amount of data from anywhere on the web, billed only for what you actually use. It supports use cases ranging from simple static files to websites, mobile apps, and machine learning pipelines.
Core operations on S3 include creating buckets, uploading and downloading objects, and granting or denying permissions.
In this article, we will see how misconfigured S3 buckets can be enumerated and exploited, and how to spot such loopholes before an attacker does.
Access Control Lists (ACLs):
Misconfigured ACLs are the main reason S3 buckets become vulnerable. S3 access control lists can be applied at the bucket level as well as at the object level, and it is best to set ACLs on an object-by-object basis rather than only at the bucket or global level. An ACL grants a grantee one of the following permissions: READ, WRITE, READ_ACP, WRITE_ACP, or FULL_CONTROL.
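To make the ACL check concrete, here is a minimal sketch that inspects a saved ACL response for a public grant. In practice the JSON would come from `aws s3api get-bucket-acl --bucket <bucketname>`; the sample below is hand-written, not output from a real bucket:

```shell
#!/bin/sh
# Hypothetical sample of a get-bucket-acl response (not from a real bucket).
cat > /tmp/acl.json <<'EOF'
{
  "Owner": {"DisplayName": "owner", "ID": "abc123"},
  "Grants": [
    {
      "Grantee": {
        "Type": "Group",
        "URI": "http://acs.amazonaws.com/groups/global/AllUsers"
      },
      "Permission": "READ"
    }
  ]
}
EOF

# A grant to the AllUsers group means anyone on the internet holds that
# permission on the bucket.
if grep -q 'global/AllUsers' /tmp/acl.json; then
  echo "bucket ACL grants access to AllUsers (public)"
else
  echo "no public AllUsers grant found"
fi
```

The same check works on `get-object-acl` output, which is why auditing object-level ACLs matters even when the bucket-level ACL looks safe.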
S3 Bucket Enumeration:
S3 bucket enumeration is the process of querying S3 buckets and the objects inside them. This can be done with AWS API calls such as ListBuckets, ListObjectsV2, and GetObject. The goal is to determine which objects are present in a given bucket. You can use this information to understand a data resource better and decide how best to manage (or, from an attacker's perspective, abuse) it.
Let’s see how we can find such loopholes:
Step 1: The first step in finding out whether a website uses an S3 bucket is to perform PASSIVE RECON.
- We can use nslookup to identify the region of the server when the web server using the S3 bucket is not kept behind a WAF; if the server is behind a WAF, we need to find other ways to fingerprint the target.
Note: Here we ran nslookup on the flaws.cloud IP address, which shows it is located in us-west-2.
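The recon step above boils down to a reverse lookup whose answer embeds the region. A minimal sketch, with the network call skipped: the hostname below is the kind of PTR answer nslookup returns for the flaws.cloud IP, and we strip the service prefix and suffix to isolate the region:

```shell
#!/bin/sh
# Example PTR-style hostname as returned by a reverse lookup of an
# S3-website IP (network lookup skipped in this sketch).
ptr="s3-website-us-west-2.amazonaws.com"

# Strip the "s3-website-" prefix and the ".amazonaws.com" suffix,
# leaving only the region identifier.
region=$(echo "$ptr" | sed -e 's/^s3-website-//' -e 's/\.amazonaws\.com$//')
echo "$region"   # us-west-2
```

In a live run you would first do `nslookup flaws.cloud` to get the IP, then `nslookup <ip>` to get a hostname of this shape.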
Step 2: The next thing we can do is perform ACTIVE RECON.
- Now that you have the region, the next step is general querying and enumeration of bucket names. Selecting the region ahead of time is not required, but it will save time later when querying AWS.
- You should enumerate subdomains, domains, and top-level domains to confirm that your target has a bucket on S3. For example, if you are looking for S3 buckets belonging to www.geeksforgeeks.com, you should try the bucket names geeksforgeeks.com and www.geeksforgeeks.com.
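The name-guessing step can be sketched as a small candidate generator. The domain is the example from the text; the prefixes and suffixes (dev, backup, and so on) are common guesses, not confirmed bucket names:

```shell
#!/bin/sh
# Generate candidate S3 bucket names from a target domain.
domain="geeksforgeeks.com"
base=${domain%%.*}          # -> "geeksforgeeks"

candidates="$domain www.$domain $base"
for word in dev staging backup assets logs; do
  # Common naming conventions: base-word and word-base (guesses only).
  candidates="$candidates $base-$word $word-$base"
done

for name in $candidates; do
  echo "$name"
  # Each candidate could then be probed with:
  #   aws s3 ls "s3://$name/" --no-sign-request
done
```

A real engagement would feed a much larger permutation wordlist through the same loop.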
- To check a bucket name, you can directly visit the automatically assigned S3 URL given by Amazon, where the format is https://bucketname.s3.amazonaws.com/ (virtual-hosted style) or https://s3.region.amazonaws.com/bucketname/ (path style).
Otherwise, use the command line given below (sudo is not needed, and --no-sign-request makes the request anonymously, without AWS credentials):
aws s3 ls s3://$bucketname/ --region $region --no-sign-request
S3 Bucket Exploitation:
Now that we have listed the bucket, let’s come to the exploitation part.
There is much more sensitive material to find on S3, which can be a serious problem for a company: log files, usernames, passwords, database queries, etc.
Step 1: After listing the bucket contents with the aws s3 ls command above, we found a secret file.
Step 2: When we navigate to that file, we are finally able to solve the CTF.
Step 3: With Level 1 exploited, we can move on to Level 2.
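The listing-and-fetching flow above can be sketched as below. The bucket is the flaws.cloud one from the recon steps, but the key name is a placeholder, since the real secret filename is part of the CTF:

```shell
#!/bin/sh
# Build public URLs for an object revealed by `aws s3 ls`.
bucket="flaws.cloud"
key="some-secret-file.html"   # hypothetical key, not the real CTF filename

# Virtual-hosted-style and path-style URLs for the same object:
vhost_url="https://$bucket.s3.amazonaws.com/$key"
path_url="https://s3.amazonaws.com/$bucket/$key"
echo "$vhost_url"
echo "$path_url"

# The object could then be fetched anonymously with either of:
#   curl -O "$vhost_url"
#   aws s3 cp "s3://$bucket/$key" . --no-sign-request
```

If either URL returns the object instead of an AccessDenied error, the bucket is readable by anyone.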
There are quite a few famous tools that you can use to find the S3 bucket of a website. Some of them are as follows:
- Lazy S3
- S3 Bucket Finder
- BURP Suite (just spider the target web application and you might get amazing results)
- Amazon buckets can also be found in Content-Security-Policy response headers
- You can also open any of the target website’s images in a new tab; the URL will tell you whether it is served from S3.
- bucket finder
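The header and image-URL checks in the list above amount to pattern-matching saved responses for S3 hostnames. A small sketch, using made-up sample content (the `assets-example` bucket name is hypothetical):

```shell
#!/bin/sh
# Hypothetical saved response headers and HTML (not from a real site).
cat > /tmp/page.txt <<'EOF'
Content-Security-Policy: img-src https://assets-example.s3.amazonaws.com
<img src="https://s3.us-west-2.amazonaws.com/assets-example/logo.png">
EOF

# Match both virtual-hosted-style and path-style S3 hostnames.
grep -Eo '[a-z0-9.-]*s3[.-][a-z0-9.-]*amazonaws\.com' /tmp/page.txt
```

In practice you would save the target's responses with curl or export them from BURP Suite, then run the same grep over the capture.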