Cloud Methodology
Check exposed assets
This can be done during the previous section: you need to find out everything that is potentially exposed to the Internet somehow and how it can be accessed.
Here I'm talking about manually exposed infrastructure, like instances with web pages or other ports exposed, and also about cloud-managed services that can be configured to be exposed (such as DBs or buckets).
Then you should check whether that resource should be exposed at all (confidential information? vulnerabilities? misconfigurations in the exposed service?).
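For example, assuming an Azure environment with the az CLI already logged in (as described later in this page), a minimal sketch to enumerate common publicly reachable entry points could be:
# List all public IP addresses allocated in the subscription
az network public-ip list --query "[].{name:name, ip:ipAddress}" -o table
# Inbound NSG rules that allow traffic from any source
az network nsg list --query "[].{nsg:name, openRules:securityRules[?direction=='Inbound' && access=='Allow' && sourceAddressPrefix=='*'].name}" -o json
# Storage accounts that allow public blob access
az storage account list --query "[?allowBlobPublicAccess].name" -o table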
Check permissions
Here you should find out all the permissions of each role/user inside the cloud and how they are used.
Too many highly privileged (control everything) accounts? Generated keys that are not used?... Most of these checks should have been done in the benchmark tests already.
If the client is using OpenID, SAML or another federation you might need to ask them for further information about how each role is being assigned (it's not the same if the admin role is assigned to 1 user or to 100).
It's not enough to find which users have admin ("*:*") permissions. There are a lot of other permissions that, depending on the services used, can be very sensitive.
Moreover, there are potential privesc paths that can be followed by abusing permissions. All these things should be taken into account and as many privesc paths as possible should be reported.
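As an illustration, assuming an AWS environment with read access via the aws CLI (the output file names here are arbitrary), the raw data for this review can be collected like this:
# Dump every user/group/role and all their attached and inline policies
aws iam get-account-authorization-details > authz-details.json
# Credential report: shows password/access-key last-used dates (spot unused keys)
aws iam generate-credential-report
aws iam get-credential-report --query Content --output text | base64 -d > cred-report.csv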
Check Integrations
It's highly probable that integrations with other clouds or SaaS platforms are being used inside the cloud environment.
For integrations from the cloud you are auditing to other platforms, you should report who has access to (ab)use that integration and ask how sensitive the action being performed is. For example, who can write to an AWS bucket that GCP is getting data from (ask how sensitive the action in GCP treating that data is).
For integrations into the cloud you are auditing from external platforms, you should ask who has external access to (ab)use that integration and check how that data is being used. For example, if a service is using a Docker image hosted in GCR, you should ask who has access to modify that image and which sensitive info and access that image will get when executed inside an AWS cloud.
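For the AWS bucket example above, a minimal sketch (the bucket name is a placeholder) to see who can write to it:
# Bucket policy: look for Principal/Action combinations that allow s3:PutObject
aws s3api get-bucket-policy --bucket <bucket_name> --query Policy --output text
# Bucket ACL: look for grants to AllUsers or AuthenticatedUsers
aws s3api get-bucket-acl --bucket <bucket_name>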
GCP
Workspace
Azure
Access the portal here: http://portal.azure.com/ To start the tests you should have access with a user with Reader permissions over the subscription and the Global Reader role in Azure AD. If even in that case you are not able to access the content of the Storage accounts you can fix it with the Storage Account Contributor role.
It is recommended to install azure-cli in both Linux and Windows virtual machines (to be able to run PowerShell and Python scripts): https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest
Then, run az login to log in. Note that the account information and token will be saved inside ~/.azure (in both Windows and Linux).
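A minimal sketch of the login and some sanity checks afterwards:
az login
# Verify which account and subscription the session belongs to
az account show
az account list --output table
# Cached profile and token information
ls ~/.azure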
Remember that if the Security Center Standard Pricing Tier is being used (and not the free tier), you can generate a CIS compliance scan report from the Azure portal. Go to Policy & Compliance -> Regulatory Compliance (or try to access https://portal.azure.com/#blade/Microsoft_Azure_Security/SecurityMenuBlade/22). If the company is not paying for a Standard account you may need to review the CIS Microsoft Azure Foundations Benchmark by hand (you can get some help using the following tools). Download it from here.
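If you prefer the CLI, the pricing tier in use can also be checked there (a sketch, assuming your az version includes the security commands):
# Shows the Free/Standard tier configured per resource type in Security Center / Defender for Cloud
az security pricing list -o table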
Run scanners
Run the scanners to look for vulnerabilities and compare the security measures implemented with CIS.
#Scout Suite (https://github.com/nccgroup/ScoutSuite)
pip3 install scoutsuite
scout azure --cli --report-dir <output_dir>
#cs.py is part of cs-suite (https://github.com/SecurityFTW/cs-suite)
#Fix azureaudit.py before launching cs.py
#Adding "j_res = {}" on line 1074
python cs.py -env azure
#Azucar is an Azure security scanner for PowerShell (https://github.com/nccgroup/azucar)
#Run it from its folder
.\Azucar.ps1 -AuthMode Interactive -ForceAuth -ExportTo EXCEL
#Azure-CIS-Scanner,CIS scanner for Azure (https://github.com/kbroughton/azure_cis_scanner)
pip3 install azure-cis-scanner #Install
azscan #Run, login before with az login
Attack Graph
**Stormspotter** creates an “attack graph” of the resources in an Azure subscription. It enables red teams and pentesters to visualize the attack surface and pivot opportunities within a tenant, and supercharges your defenders to quickly orient and prioritize incident response work.
More checks
Check for a high number of Global Admins (between 2 and 4 are recommended). Access it at: https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview
Global admins should have MFA activated. Go to Users and click on the Multi-Factor Authentication button.
Dedicated admin accounts shouldn't have mailboxes (they can only have mailboxes if they have an Office 365 license).
Local AD shouldn't be synced with Azure AD if not needed (https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/AzureADConnect). And if synced, Password Hash Sync should be enabled for reliability.
Global Administrators shouldn't be synced from a local AD. Check if Global Administrator emails use the domain onmicrosoft.com. If not, check the source of the user: the source should be Azure Active Directory; if it comes from Windows Server AD, report it.
Standard tier is recommended instead of the free tier (see the tier being used in Pricing & Settings or in https://portal.azure.com/#blade/Microsoft_Azure_Security/SecurityMenuBlade/24)
Periodic SQL server scans: Select the SQL server --> Make sure that 'Advanced data security' is set to 'On' --> Under 'Vulnerability assessment settings', set 'Periodic recurring scans' to 'On' and configure a storage account for storing vulnerability assessment scan results --> Click Save
Lack of App Services restrictions: Look for "App Services" in Azure (https://portal.azure.com/#blade/HubsExtension/BrowseResource/resourceType/Microsoft.Web%2Fsites) and check if any are being used. If so, go through each App checking for "Access Restrictions"; if there are no rules, report it. Access to the App Service should be restricted according to the needs (see the sketch after this list).
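A quick sketch to review the App Services check with the CLI (the resource group and app names are placeholders):
# List App Services in the subscription
az webapp list --query "[].{name:name, rg:resourceGroup, host:defaultHostName}" -o table
# Check the access restrictions of a given app (an empty/default rule set means it is unrestricted)
az webapp config access-restriction show --resource-group <resource_group> --name <app_name>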
Office365
You need Global Admin or at least Global Reader (but note that Global Reader is a bit limited). However, those limitations only appear in some PS modules and can be bypassed by accessing the features via the web application.