# Pentesting Cloud Methodology

{{#include ../banners/hacktricks-training.md}}
## Basic Methodology

Each cloud has its own peculiarities, but in general there are a few **common things a pentester should check** when testing a cloud environment:

- **Benchmark checks**
  - This will help you understand the **size of the environment** and the **services used**
  - It will also let you find some **quick misconfigurations**, since you can perform most of these tests with **automated tools**
- **Services Enumeration**
  - You probably won't find many more misconfigurations here if you performed the benchmark tests correctly, but you might find some that weren't checked by the benchmark tests.
  - This will let you know **exactly what is being used in the cloud environment**
  - This will help a lot in the next steps
- **Check exposed assets**
  - This can be done during the previous section; you need to **find out everything that is potentially exposed** to the Internet in some way and how it can be accessed.
    - Here I'm talking about **manually exposed infrastructure**, such as instances with web pages or other ports exposed, and also about other **cloud-managed services that can be configured** to be exposed (such as DBs or buckets)
  - Then you should check **whether that resource can be exposed or not** (confidential information? vulnerabilities? misconfigurations in the exposed service?)
- **Check permissions**
  - Here you should **find out all the permissions of each role/user** inside the cloud and how they are used
    - Too many **highly privileged** (control everything) accounts? Generated keys that aren't being used? ...
    - Most of these checks should already have been done by the benchmark tests
  - If the client is using OpenID, SAML or another **federation**, you may need to ask them for further **information** about **how each role is assigned** (it's not the same if the admin role is assigned to 1 user or to 100)
  - It's **not enough to find** which users have **admin** permissions "*:*". There are many **other permissions** that, depending on the services used, can be very **sensitive**.
  - Moreover, there are **potential privesc** paths to follow by abusing permissions. All these things should be taken into account and **as many privesc paths as possible** should be reported.
- **Check Integrations**
  - It's highly likely that **integrations with other clouds or SaaS** are being used inside the cloud environment.
  - For **integrations of the cloud you are auditing** with other platforms, you should notice **who has access to (ab)use that integration**, and you should ask how **sensitive** the action being performed is.\
    For example, who can write to an AWS bucket from which GCP is getting data (ask how sensitive the action in GCP is when processing that data).
  - For **integrations inside the cloud you are auditing** from external platforms, you should ask **who has external access to (ab)use that integration** and check how that data is used.\
    For example, if a service is using a Docker image hosted in GCR, you should ask who has access to modify it, and what sensitive information and access that image will get when it's run inside an AWS cloud.

## Multi-Cloud tools

There are several tools that can be used to test different cloud environments. The installation steps and links are indicated in this section.
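As a quick manual complement to the permission review, you can grep exported IAM policy documents for admin-equivalent "*:*" statements before (or alongside) running automated tools. A minimal sketch, assuming the policy JSON has already been exported (the sample file and its contents below are made up for illustration):

```bash
# Hypothetical sample policy; in a real AWS audit you would export the real
# documents first, e.g. with: aws iam get-account-authorization-details
cat > /tmp/sample-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": "*", "Resource": "*" }
  ]
}
EOF

# Flag policies that allow every action on every resource (admin-equivalent)
if grep -q '"Action": "\*"' /tmp/sample-policy.json \
   && grep -q '"Resource": "\*"' /tmp/sample-policy.json; then
    echo "ADMIN-LIKE POLICY: /tmp/sample-policy.json"
fi
```

This only catches the most blatant case; many narrower permissions are just as sensitive, so a real review still needs the benchmark tools or proper policy analysis.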
### [PurplePanda](https://github.com/carlospolop/purplepanda)

A tool to **identify bad configurations and privesc paths in clouds and across clouds/SaaS.**

{{#tabs }}
{{#tab name="Install" }}
```bash
# You need to install and run neo4j also
git clone https://github.com/carlospolop/PurplePanda
cd PurplePanda
python3 -m venv .
source bin/activate
python3 -m pip install -r requirements.txt
export PURPLEPANDA_NEO4J_URL="bolt://neo4j@localhost:7687"
export PURPLEPANDA_PWD="neo4j_pwd_4_purplepanda"
python3 main.py -h # Get help
```
{{#endtab }}

{{#tab name="GCP" }}
```bash
export GOOGLE_DISCOVERY=$(echo 'google:
- file_path: ""
- file_path: ""
  service_account_id: "some-sa-email@sidentifier.iam.gserviceaccount.com"' | base64)

python3 main.py -a -p google #Get basic info of the account to check it's correctly configured
python3 main.py -e -p google #Enumerate the env
```
{{#endtab }}
{{#endtabs }}

### [Prowler](https://github.com/prowler-cloud/prowler)

It supports **AWS, GCP & Azure**. Check how to configure each provider at [https://docs.prowler.cloud/en/latest/#aws](https://docs.prowler.cloud/en/latest/#aws)

```bash
# Install
pip install prowler
prowler -v

# Run
prowler
# Example
prowler aws --profile custom-profile [-M csv json json-asff html]

# Get info about checks & services
prowler --list-checks
prowler --list-services
```

### [CloudSploit](https://github.com/aquasecurity/cloudsploit)

AWS, Azure, Github, Google, Oracle, Alibaba

{{#tabs }}
{{#tab name="Install" }}
```bash
# Install
git clone https://github.com/aquasecurity/cloudsploit.git
cd cloudsploit
npm install
./index.js -h
## Docker instructions in github
```
{{#endtab }}

{{#tab name="GCP" }}
```bash
## You need to have creds for a service account and set them in config.js file
./index.js --cloud google --config
```
{{#endtab }}
{{#endtabs }}

### [ScoutSuite](https://github.com/nccgroup/ScoutSuite)

AWS, Azure, GCP, Alibaba Cloud, Oracle Cloud Infrastructure

{{#tabs }}
{{#tab name="Install" }}
```bash
mkdir scout; cd scout
virtualenv -p python3 venv
source venv/bin/activate
pip install scoutsuite
scout --help
## Using Docker: https://github.com/nccgroup/ScoutSuite/wiki/Docker-Image
```
{{#endtab }}

{{#tab name="GCP" }}
```bash
scout gcp --report-dir /tmp/gcp --user-account --all-projects
## use "--service-account KEY_FILE" instead of "--user-account" to use a service account

SCOUT_FOLDER_REPORT="/tmp"
for pid in $(gcloud projects list --format="value(projectId)"); do
    echo "================================================"
    echo "Checking $pid"
    mkdir "$SCOUT_FOLDER_REPORT/$pid"
    scout gcp --report-dir "$SCOUT_FOLDER_REPORT/$pid" --no-browser --user-account --project-id "$pid"
done
```
{{#endtab }}
{{#endtabs }}

### [Steampipe](https://github.com/turbot)

{{#tabs }}
{{#tab name="Install" }}
Download and install Steampipe ([https://steampipe.io/downloads](https://steampipe.io/downloads)). Or use Brew:

```bash
brew tap turbot/tap
brew install steampipe
```
{{#endtab }}

{{#tab name="GCP" }}
```bash
# Install gcp plugin
steampipe plugin install gcp

# Use https://github.com/turbot/steampipe-mod-gcp-compliance.git
git clone https://github.com/turbot/steampipe-mod-gcp-compliance.git
cd steampipe-mod-gcp-compliance
# To run all the checks from the dashboard
steampipe dashboard
# To run all the checks from the CLI
steampipe check all
```
**Check all Projects**

In order to check all the projects you need to generate the `gcp.spc` file indicating all the projects to test. You can just follow the instructions of the following script:

```bash
FILEPATH="/tmp/gcp.spc"
rm -rf "$FILEPATH" 2>/dev/null

# Generate a connection block for each project
for pid in $(gcloud projects list --format="value(projectId)"); do
    echo "connection \"gcp_$(echo -n $pid | tr "-" "_" )\" {
    plugin  = \"gcp\"
    project = \"$pid\"
}" >> "$FILEPATH"
done

# Generate the aggregator to call
echo 'connection "gcp_all" {
    plugin      = "gcp"
    type        = "aggregator"
    connections = ["gcp_*"]
}' >> "$FILEPATH"

echo "Copy $FILEPATH in ~/.steampipe/config/gcp.spc if it was correctly generated"
```
To check **other GCP insights** (useful for enumerating services) use: [https://github.com/turbot/steampipe-mod-gcp-insights](https://github.com/turbot/steampipe-mod-gcp-insights)

To check Terraform GCP code: [https://github.com/turbot/steampipe-mod-terraform-gcp-compliance](https://github.com/turbot/steampipe-mod-terraform-gcp-compliance)

More GCP plugins of Steampipe: [https://github.com/turbot?q=gcp](https://github.com/turbot?q=gcp)
{{#endtab }}

{{#tab name="AWS" }}
```bash
# Install aws plugin
steampipe plugin install aws

# Modify the spec indicating in "profile" the profile name to use
nano ~/.steampipe/config/aws.spc

# Get some info on how the AWS account is being used
git clone https://github.com/turbot/steampipe-mod-aws-insights.git
cd steampipe-mod-aws-insights
steampipe dashboard

# Get the services exposed to the internet
git clone https://github.com/turbot/steampipe-mod-aws-perimeter.git
cd steampipe-mod-aws-perimeter
steampipe dashboard

# Run the benchmarks
git clone https://github.com/turbot/steampipe-mod-aws-compliance
cd steampipe-mod-aws-compliance
steampipe dashboard # To see results in browser
steampipe check all --export=/tmp/output4.json
```

To check Terraform AWS code: [https://github.com/turbot/steampipe-mod-terraform-aws-compliance](https://github.com/turbot/steampipe-mod-terraform-aws-compliance)

More AWS plugins of Steampipe: [https://github.com/orgs/turbot/repositories?q=aws](https://github.com/orgs/turbot/repositories?q=aws)
{{#endtab }}
{{#endtabs }}

### [~~cs-suite~~](https://github.com/SecurityFTW/cs-suite)

AWS, GCP, Azure, DigitalOcean.\
It requires python2.7 and looks unmaintained.

### Nessus

Nessus has an _**Audit Cloud Infrastructure**_ scan supporting: AWS, Azure, Office 365, Rackspace, Salesforce. Some extra configurations in **Azure** are needed to obtain a **Client Id**.
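The "check exposed assets" step can also get a cheap first pass without any dedicated tool: dump an instance listing once and filter for anything with a public IP. A minimal sketch for GCP (the dump below is invented sample data; a real one would come from `gcloud compute instances list`):

```bash
# Hypothetical dump; in a real audit generate it with:
#   gcloud compute instances list > /tmp/instances.txt
cat > /tmp/instances.txt <<'EOF'
NAME        ZONE           INTERNAL_IP  EXTERNAL_IP
web-1       us-central1-a  10.0.0.2     34.68.1.10
db-internal us-central1-a  10.0.0.3
EOF

# Keep only rows that actually have an external IP (4th column present)
awk 'NR>1 && NF>=4 {print $1, $4}' /tmp/instances.txt
# → web-1 34.68.1.10
```

Every host this prints is a candidate for the usual port/service review; the same idea works over `aws ec2 describe-instances` output.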
### [**cloudlist**](https://github.com/projectdiscovery/cloudlist)

Cloudlist is a **multi-cloud tool for getting Assets** (Hostnames, IP Addresses) from Cloud Providers.

{{#tabs }}
{{#tab name="Install" }}
```bash
cd /tmp
wget https://github.com/projectdiscovery/cloudlist/releases/latest/download/cloudlist_1.0.1_macOS_arm64.zip
unzip cloudlist_1.0.1_macOS_arm64.zip
chmod +x cloudlist
sudo mv cloudlist /usr/local/bin
```
{{#endtab }}

{{#tab name="GCP" }}
```bash
## For GCP it requires service account JSON credentials
cloudlist -config
```
{{#endtab }}
{{#endtabs }}

### [**cartography**](https://github.com/lyft/cartography)

Cartography is a Python tool that consolidates infrastructure assets and the relationships between them in an intuitive graph view powered by a Neo4j database.

{{#tabs }}
{{#tab name="Install" }}
```bash
# Installation
docker image pull ghcr.io/lyft/cartography
docker run --platform linux/amd64 ghcr.io/lyft/cartography cartography --help
## Install a Neo4j DB version 3.5.*
```
{{#endtab }}

{{#tab name="GCP" }}
```bash
docker run --platform linux/amd64 \
    --volume "$HOME/.config/gcloud/application_default_credentials.json:/application_default_credentials.json" \
    -e GOOGLE_APPLICATION_CREDENTIALS="/application_default_credentials.json" \
    -e NEO4J_PASSWORD="s3cr3t" \
    ghcr.io/lyft/cartography \
    --neo4j-uri bolt://host.docker.internal:7687 \
    --neo4j-password-env-var NEO4J_PASSWORD \
    --neo4j-user neo4j

# It only checks for a few services inside GCP (https://lyft.github.io/cartography/modules/gcp/index.html)
## Cloud Resource Manager
## Compute
## DNS
## Storage
## Google Kubernetes Engine
### If you can run starbase or purplepanda you will get more info
```
{{#endtab }}
{{#endtabs }}

### [**starbase**](https://github.com/JupiterOne/starbase)

Starbase collects assets and relationships from services and systems, including cloud infrastructure, SaaS applications, security controls, and more, into an intuitive graph view backed by the Neo4j database.

{{#tabs }}
{{#tab name="Install" }}
```bash
# You are going to need Node version 14, so install nvm following https://tecadmin.net/install-nvm-macos-with-homebrew/
npm install --global yarn
nvm install 14
git clone https://github.com/JupiterOne/starbase.git
cd starbase
nvm use 14
yarn install
yarn starbase --help

# Configure manually config.yaml depending on the env to analyze
yarn starbase setup
yarn starbase run

# Docker
git clone https://github.com/JupiterOne/starbase.git
cd starbase
cp config.yaml.example config.yaml
# Configure manually config.yaml depending on the env to analyze
docker build --no-cache -t starbase:latest .
docker-compose run starbase setup
docker-compose run starbase run
```
{{#endtab }}

{{#tab name="GCP" }}
```yaml
## Config for GCP
### Check out: https://github.com/JupiterOne/graph-google-cloud/blob/main/docs/development.md
### It requires service account credentials

integrations:
  - name: graph-google-cloud
    instanceId: testInstanceId
    directory: ./.integrations/graph-google-cloud
    gitRemoteUrl: https://github.com/JupiterOne/graph-google-cloud.git
    config:
      SERVICE_ACCOUNT_KEY_FILE: "{Check https://github.com/JupiterOne/graph-google-cloud/blob/main/docs/development.md#service_account_key_file-string}"
      PROJECT_ID: ""
      FOLDER_ID: ""
      ORGANIZATION_ID: ""
      CONFIGURE_ORGANIZATION_PROJECTS: false

storage:
  engine: neo4j
  config:
    username: neo4j
    password: s3cr3t
    uri: bolt://localhost:7687 #Consider using host.docker.internal if from docker
```
{{#endtab }}
{{#endtabs }}

### [**SkyArk**](https://github.com/cyberark/SkyArk)

Discover the most privileged users in the scanned AWS or Azure environment, including the AWS Shadow Admins. It uses PowerShell.
```powershell
Import-Module .\SkyArk.ps1 -force
Start-AzureStealth

# in the Cloud Console
IEX (New-Object Net.WebClient).DownloadString('https://raw.githubusercontent.com/cyberark/SkyArk/master/AzureStealth/AzureStealth.ps1')
Scan-AzureAdmins
```

### [Cloud Brute](https://github.com/0xsha/CloudBrute)

A tool to find a company's (target's) infrastructure, files, and apps on the top cloud providers (Amazon, Google, Microsoft, DigitalOcean, Alibaba, Vultr, Linode).

### [CloudFox](https://github.com/BishopFox/cloudfox)

- CloudFox is a tool to find exploitable attack paths in cloud infrastructure (currently only AWS & Azure are supported, with GCP coming soon).
- It is an enumeration tool intended to complement manual pentesting.
- It doesn't create or modify any data within the cloud environment.

### More lists of cloud security tools

- [https://github.com/RyanJarv/awesome-cloud-sec](https://github.com/RyanJarv/awesome-cloud-sec)

## Google

### GCP

{{#ref}}
gcp-security/
{{#endref}}

### Workspace

{{#ref}}
workspace-security/
{{#endref}}

## AWS

{{#ref}}
aws-security/
{{#endref}}

## Azure

{{#ref}}
azure-security/
{{#endref}}

### Attack Graph

[**Stormspotter**](https://github.com/Azure/Stormspotter) creates an “attack graph” of the resources in an Azure subscription. It enables red teams and pentesters to visualize the attack surface and pivot opportunities within a tenant, and gives defenders a big boost to quickly orient themselves and prioritize incident response work.

### Office365

You need **Global Admin** or at least **Global Admin Reader** (but note that Global Admin Reader is somewhat limited). However, those limitations appear in some PS modules and can be bypassed by accessing the features via the web application.

{{#include ../banners/hacktricks-training.md}}