Security log analytics in Google Cloud
This guide shows security practitioners how to onboard Google Cloud logs to be used in security analytics. By performing security analytics, you help your organization prevent, detect, and respond to threats like malware, phishing, ransomware, and poorly configured assets.
This guide shows you how to do the following:
- Enable the logs to be analyzed.
- Route those logs to a single destination depending on your choice of security analytics tool, such as Log Analytics, BigQuery, Google Security Operations, or a third-party security information and event management (SIEM) technology.
- Analyze those logs to audit your cloud usage and detect potential threats to your data and workloads, using sample queries from the Community Security Analytics (CSA) project.
The information in this guide is part of Google Cloud Autonomic Security Operations, which includes engineering-led transformation of detection and response practices and security analytics to improve your threat detection capabilities.
In this guide, logs provide the data source to be analyzed. However, you can apply the concepts from this guide to analysis of other complementary security-related data from Google Cloud, such as security findings from Security Command Center. Security Command Center Premium provides a list of regularly updated managed detectors that are designed to identify threats, vulnerabilities, and misconfigurations within your systems in near real time. By analyzing these signals from Security Command Center and correlating them with logs ingested in your security analytics tool as described in this guide, you can achieve a broader perspective of potential security threats.
The following diagram shows how security data sources, security analytics tools,and CSA queries work together.

The diagram starts with the following security data sources: logs from Cloud Logging, asset changes from Cloud Asset Inventory, and security findings from Security Command Center. The diagram then shows these security data sources being routed into the security analytics tool of your choice: Log Analytics in Cloud Logging, BigQuery, Google Security Operations, or a third-party SIEM. Finally, the diagram shows using CSA queries with your analytics tool to analyze the collated security data.
Security log analytics workflow
This section describes the steps to set up security log analytics in Google Cloud. The workflow consists of the three steps shown in the following diagram and described in the following paragraphs:

Enable logs: There are many security logs available in Google Cloud. Each log has different information that can be useful in answering specific security questions. Some logs like Admin Activity audit logs are enabled by default; others need to be manually enabled because they incur additional ingestion costs in Cloud Logging. Therefore, the first step in the workflow is to prioritize the security logs that are most relevant for your security analysis needs and to individually enable those specific logs.
To help you evaluate logs in terms of the visibility and threat detection coverage they provide, this guide includes a log scoping tool. This tool maps each log to relevant threat tactics and techniques in the MITRE ATT&CK® Matrix for Enterprise. The tool also maps Event Threat Detection rules in Security Command Center to the logs on which they rely. You can use the log scoping tool to evaluate logs regardless of the analytics tool that you use.
Route logs: After identifying and enabling the logs to be analyzed, the next step is to route and aggregate the logs from your organization, including any contained folders, projects, and billing accounts. How you route logs depends on the analytics tool that you use.
This guide describes common log routing destinations, and shows you how to use a Cloud Logging aggregated sink to route organization-wide logs into a Cloud Logging log bucket or a BigQuery dataset, depending on whether you choose to use Log Analytics or BigQuery for analytics.
Analyze logs: After you route the logs into an analytics tool, the next step is to perform an analysis of these logs to identify any potential security threats. How you analyze the logs depends on the analytics tool that you use. If you use Log Analytics or BigQuery, you can analyze the logs by using SQL queries. If you use Google Security Operations, you analyze the logs by using YARA-L rules. If you are using a third-party SIEM tool, you use the query language specified by that tool.
In this guide, you'll find SQL queries that you can use to analyze the logs in either Log Analytics or BigQuery. The SQL queries provided in this guide come from the Community Security Analytics (CSA) project. CSA is an open-source set of foundational security analytics designed to provide you with a baseline of pre-built queries and rules that you can reuse to start analyzing your Google Cloud logs.
The following sections provide detailed information on how to set up and apply each step in the security log analytics workflow.
Enable logs
The process of enabling logs involves the following steps:
- Identify the logs you need by using the log scoping tool in this guide.
- Record the log filter generated by the log scoping tool for use later when configuring the log sink.
- Enable logging for each identified log type or Google Cloud service. Depending on the service, you might have to also enable the corresponding Data Access audit logs as detailed later in this section.
Identify logs using the log scoping tool
To help you identify the logs that meet your security and compliance needs, you can use the log scoping tool shown in this section. This tool provides an interactive table that lists valuable security-relevant logs across Google Cloud, including Cloud Audit Logs, Access Transparency logs, network logs, and several platform logs. This tool maps each log type to the following areas:
- MITRE ATT&CK threat tactics and techniques that can be monitored with that log.
- CIS Google Cloud Computing Platform compliance violations that can be detected in that log.
- Event Threat Detection rules that rely on that log.
The log scoping tool also generates a log filter, which appears immediately after the table. As you identify the logs that you need, select those logs in the tool to automatically update that log filter.
The following short procedures explain how to use the log scoping tool:
- To select or remove a log in the log scoping tool, click the toggle next to the name of the log.
- To select or remove all the logs, click the toggle next to the Log type heading.
- To see which MITRE ATT&CK techniques can be monitored by each log type, click the expander next to the MITRE ATT&CK tactics and techniques heading.
Log scoping tool
Record the log filter
The log filter that is automatically generated by the log scoping tool contains all of the logs that you have selected in the tool. You can use the filter as is, or you can refine the log filter further depending on your requirements. For example, you can include (or exclude) resources only in one or more specific projects. After you have a log filter that meets your logging requirements, you need to save the filter for use when routing the logs. For instance, you can save the filter in a text editor or save it in an environment variable as follows:
- In the "Auto-generated log filter" section that follows the tool, copy thecode for the log filter.
- Optional: Edit the copied code to refine the filter.
- In Cloud Shell, create a variable to save the log filter:

  export LOG_FILTER='LOG_FILTER'

  Replace LOG_FILTER with the code for the log filter.
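For illustration, a saved filter might look like the following sketch. The log IDs shown here are examples only; your auto-generated filter will differ based on the logs that you selected in the tool.

```bash
# Hypothetical example only: a filter that keeps Admin Activity audit logs,
# Data Access audit logs, and Cloud DNS query logs.
export LOG_FILTER='log_id("cloudaudit.googleapis.com/activity") OR log_id("cloudaudit.googleapis.com/data_access") OR log_id("dns.googleapis.com/dns_queries")'
```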
Enable service-specific platform logs
Each platform log that you select in the log scoping tool must be enabled, typically at the resource level, on a service-by-service basis. For example, Cloud DNS logs are enabled at the VPC network level. Likewise, VPC Flow Logs are enabled at the subnet level for all VMs in the subnet, and logs from Firewall Rules Logging are enabled at the individual firewall rule level.
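As an illustration of this per-service enablement, the following sketch shows gcloud commands for the three examples above. The subnet, firewall rule, network, and DNS policy names are placeholders; the service-specific documentation linked from the log scoping tool remains the authoritative reference.

```bash
# Enable VPC Flow Logs on a subnet (placeholder subnet and region).
gcloud compute networks subnets update my-subnet \
    --region=us-central1 \
    --enable-flow-logs

# Enable Firewall Rules Logging on an individual firewall rule (placeholder name).
gcloud compute firewall-rules update allow-ssh \
    --enable-logging

# Enable Cloud DNS query logging for a VPC network through a DNS server policy (placeholder names).
gcloud dns policies create dns-logging-policy \
    --networks=my-vpc \
    --enable-logging \
    --description="Enable DNS query logging"
```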
Each platform log has its own instructions on how to enable logging. However, you can use the log scoping tool to quickly open the relevant instructions for each platform log.
To learn how to enable logging for a specific platform log, do the following:
- In the log scoping tool, locate the platform log that you want to enable.
- In the Enabled by default column, click the Enable link that corresponds to that log. The link takes you to detailed instructions on how to enable logging for that service.
Enable the Data Access audit logs
As you can see in the log scoping tool, the Data Access audit logs from Cloud Audit Logs provide broad threat detection coverage. However, their volume can be quite large. Enabling these Data Access audit logs might therefore result in additional charges related to ingesting, storing, exporting, and processing these logs. This section both explains how to enable these logs and presents some best practices to help you with making the tradeoff between value and cost.
Note: Data Access audit logs might contain personally identifiable information (PII) like caller identities and IP addresses. You must apply the appropriate access control and retention settings available in your analytics tool to secure your log data, retain that data only as long as needed, and then dispose of that data securely.

Data Access audit logs—except for BigQuery—are disabled by default. To configure Data Access audit logs for Google Cloud services other than BigQuery, you must explicitly enable them either by using the Google Cloud console or by using the Google Cloud CLI to edit Identity and Access Management (IAM) policy objects. When you enable Data Access audit logs, you can also configure which types of operations are recorded. There are three Data Access audit log types:
- ADMIN_READ: Records operations that read metadata or configuration information.
- DATA_READ: Records operations that read user-provided data.
- DATA_WRITE: Records operations that write user-provided data.
Note that you can't configure the recording of ADMIN_WRITE operations, which are operations that write metadata or configuration information. ADMIN_WRITE operations are included in Admin Activity audit logs from Cloud Audit Logs and therefore can't be disabled.
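As a minimal sketch of the Google Cloud CLI approach mentioned above, the following commands edit a project's IAM policy to enable all three Data Access audit log types for all services. The project ID and file path are placeholders, and you should merge the auditConfigs section into your existing policy rather than replace it.

```bash
# Download the project's current IAM policy (placeholder project ID).
gcloud projects get-iam-policy PROJECT_ID --format=json > /tmp/policy.json

# Merge an auditConfigs section like the following into /tmp/policy.json:
# "auditConfigs": [
#   {
#     "service": "allServices",
#     "auditLogConfigs": [
#       {"logType": "ADMIN_READ"},
#       {"logType": "DATA_READ"},
#       {"logType": "DATA_WRITE"}
#     ]
#   }
# ]

# Write the updated policy back to the project.
gcloud projects set-iam-policy PROJECT_ID /tmp/policy.json
```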
Manage the volume of Data Access audit logs
When enabling Data Access audit logs, the goal is to maximize their value in terms of security visibility while also limiting their cost and management overhead. To help you achieve that goal, we recommend that you do the following to filter out low-value, high-volume logs:
- Prioritize relevant services such as services that host sensitive workloads, keys, and data. For specific examples of services that you might want to prioritize over others, see Example Data Access audit log configuration.
- Prioritize relevant projects such as projects that host production workloads as opposed to projects that host developer and staging environments. To filter out all logs from a particular project, add the following expression to your log filter for your sink. Replace PROJECT_ID with the ID of the project from which you want to filter out all logs:

| Project | Log filter expression |
|---|---|
| Exclude all logs from a given project | NOT logName=~"^projects/PROJECT_ID" |
- Prioritize a subset of data access operations such as ADMIN_READ, DATA_READ, or DATA_WRITE for a minimal set of recorded operations. For example, some services like Cloud DNS write all three types of operations, but you can enable logging for only ADMIN_READ operations. After you have configured one or more of these three types of data access operations, you might want to exclude specific operations that are particularly high volume. You can exclude these high-volume operations by modifying the sink's log filter. For example, you decide to enable full Data Access audit logging, including DATA_READ operations on some critical storage services. To exclude specific high-traffic data read operations in this situation, you can add the following recommended log filter expressions to your sink's log filter:

| Service | Log filter expression |
|---|---|
| Exclude high volume logs from Cloud Storage | NOT (resource.type="gcs_bucket" AND (protoPayload.methodName="storage.buckets.get" OR protoPayload.methodName="storage.buckets.list")) |
| Exclude high volume logs from Cloud SQL | NOT (resource.type="cloudsql_database" AND protoPayload.request.cmd="select") |
- Prioritize relevant resources such as resources that host your most sensitive workloads and data. You can classify your resources based on the value of the data that they process, and their security risk such as whether they are externally accessible or not. Although Data Access audit logs are enabled per service, you can filter out specific resources or resource types through the log filter.
- Exclude specific principals from having their data accesses recorded. For example, you can exempt your internal testing accounts from having their operations recorded. To learn more, see Set exemptions in the Data Access audit logs documentation.
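For reference, a principal exemption is expressed as exemptedMembers in the same auditLogConfigs structure shown earlier; the following sketch assumes a hypothetical internal testing account and should be merged into your existing policy before it is written back.

```bash
# Merge an exemption like the following into the auditLogConfigs section of
# /tmp/policy.json before applying the policy:
# {
#   "logType": "DATA_READ",
#   "exemptedMembers": ["user:test-account@example.com"]
# }
gcloud projects set-iam-policy PROJECT_ID /tmp/policy.json
```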
Note: You can also use these filter expressions as exclusion filters on the _Default sink that routes logs (including Data Access audit logs) to the _Default log bucket. Exclusion filters have the opposite effect of a log filter, which is an inclusion filter. Thus, when configuring these expressions as exclusion filters, you need to remove the preceding NOT Boolean operator from the filter expressions that are shown in this section.

Example Data Access audit log configuration
The following table provides a baseline Data Access audit log configuration that you can use for Google Cloud projects to limit log volumes while gaining valuable security visibility:
| Tier | Services | Data Access audit log types | MITRE ATT&CK tactics |
|---|---|---|---|
| Authentication & authorization services | IAM Identity-Aware Proxy (IAP)1 Cloud KMS Secret Manager Resource Manager | ADMIN_READ DATA_READ | Discovery Credential Access Privilege Escalation |
| Storage services | BigQuery (enabled by default) Cloud Storage1, 2 | DATA_READ DATA_WRITE | Collection Exfiltration |
| Infrastructure services | Compute Engine Organization Policy | ADMIN_READ | Discovery |
1 Enabling Data Access audit logs for IAP or Cloud Storage can generate large log volumes when there is high traffic to IAP-protected web resources or to Cloud Storage objects.
2 Enabling Data Access audit logs for Cloud Storage might break the use of authenticated browser downloads for non-public objects. For more details and suggested workarounds to this issue, see the Cloud Storage troubleshooting guide.
In the example configuration, notice how services are grouped in tiers of sensitivity based on their underlying data, metadata, or configuration. These tiers demonstrate the following recommended granularity of Data Access audit logging:
- Authentication & authorization services: For this tier of services, werecommend auditing all data access operations. This level of auditing helpsyou monitor access to your sensitive keys, secrets, and IAMpolicies. Monitoring this access might help you detect MITRE ATT&CK tacticslikeDiscovery,Credential Access, andPrivilege Escalation.
- Storage services: For this tier of services, we recommend auditing data access operations that involve user-provided data. This level of auditing helps you monitor access to your valuable and sensitive data. Monitoring this access might help you detect MITRE ATT&CK tactics like Collection and Exfiltration against your data.
- Infrastructure services: For this tier of services, we recommend auditing data access operations that involve metadata or configuration information. This level of auditing helps you monitor for scanning of infrastructure configuration. Monitoring this access might help you detect MITRE ATT&CK tactics like Discovery against your workloads.
Route logs
After the logs are identified and enabled, the next step is to route the logs to a single destination. The routing destination, path, and complexity vary depending on the analytics tools that you use, as shown in the following diagram.

The diagram shows the following routing options:
If you use Log Analytics, you need an aggregated sink to aggregate the logs from across your Google Cloud organization into a single Cloud Logging bucket.
If you use BigQuery, you need an aggregated sink to aggregate the logs from across your Google Cloud organization into a single BigQuery dataset.
If you use Google Security Operations and this predefined subset of logs meets your security analysis needs, you can automatically aggregate these logs into your Google Security Operations account using the built-in Google Security Operations ingest. You can also view this predefined set of logs by looking at the Exportable directly to Google Security Operations column of the log scoping tool. For more information about exporting these predefined logs, see Ingest Google Cloud logs to Google Security Operations.
If you use BigQuery or a third-party SIEM, or want to export an expanded set of logs into Google Security Operations, the diagram shows that an additional step is needed between enabling the logs and analyzing them. This additional step consists of configuring an aggregated sink that routes the selected logs appropriately. If you're using BigQuery, this sink is all that you need to route the logs to BigQuery. If you're using a third-party SIEM, you need to have the sink aggregate the selected logs in Pub/Sub or Cloud Storage before the logs can be pulled into your analytics tool.
The routing options to Google Security Operations and a third-party SIEM aren't covered in this guide. However, the following sections provide the detailed steps to route logs to Log Analytics or BigQuery:
- Set up a single destination
- Create an aggregated log sink.
- Grant access to the sink.
- Configure read access to the destination.
- Verify that the logs are routed to the destination.
Set up a single destination
Log Analytics
Note: You can skip this step if you use a Cloud Logging bucket that already exists in the Google Cloud project where you want to aggregate the logs. You can use the _Default bucket, but we recommend that you create a separate bucket for this use case.

Open the Google Cloud console in the Google Cloud project that you want to aggregate logs into.
In a Cloud Shell terminal, run the following gcloud command to create a log bucket:

gcloud logging buckets create BUCKET_NAME \
    --location=BUCKET_LOCATION \
    --project=PROJECT_ID

Replace the following:

- PROJECT_ID: the ID of the Google Cloud project where the aggregated logs will be stored.
- BUCKET_NAME: the name of the new Logging bucket.
- BUCKET_LOCATION: the geographical location of the new Logging bucket. The supported locations are global, us, or eu. To learn more about these storage regions, refer to Supported regions. If you don't specify a location, then the global region is used, which means that the logs could be physically located in any of the regions. Note: After you create your bucket, you can't change your bucket's region.
Verify that the bucket was created:
gcloud logging buckets list --project=PROJECT_ID

(Optional) Set the retention period of the logs in the bucket. The following example extends the retention of logs stored in the bucket to 365 days:
gcloud logging buckets update BUCKET_NAME \
    --location=BUCKET_LOCATION \
    --project=PROJECT_ID \
    --retention-days=365

Upgrade your new bucket to use Log Analytics by following these steps.
BigQuery
Open the Google Cloud console in the Google Cloud project that you want to aggregate logs into.
In a Cloud Shell terminal, run the following bq mk command to create a dataset:

bq --location=DATASET_LOCATION mk \
    --dataset \
    --default_partition_expiration=PARTITION_EXPIRATION \
    PROJECT_ID:DATASET_ID

Replace the following:

- PROJECT_ID: the ID of the Google Cloud project where the aggregated logs will be stored.
- DATASET_ID: the ID of the new BigQuery dataset.
- DATASET_LOCATION: the geographic location of the dataset. After a dataset is created, the location can't be changed. Note: If you choose EU or an EU-based region for the dataset location, your Core BigQuery Customer Data resides in the EU. Core BigQuery Customer Data is defined in the Service Specific Terms.
- PARTITION_EXPIRATION: the default lifetime (in seconds) for the partitions in the partitioned tables that are created by the log sink. You configure the log sink in the next section. The log sink that you configure uses partitioned tables that are partitioned by day based on the log entry's timestamp. Partitions (including associated log entries) are deleted PARTITION_EXPIRATION seconds after the partition's date. If you do not set a default partition expiration at the dataset level, and you do not set a partition expiration when the table is created, the partitions never expire.

Best practice: Set the default partition expiration property of the dataset based on your log retention requirements so older logs age out and expire. You can do so during or after you create the dataset. This allows you to retain logs as long as needed, while limiting the total size of the log storage and associated cost. If you have more granular retention requirements based on the log type, you can override this property at the table level after the log sink has started routing logs and has created their corresponding partitioned tables. For example, you might be required to keep Cloud Audit Logs data for three years, but VPC Flow Logs and Firewall Rules Logs need only be retained for 90 days.
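As a worked example of such a table-level override, the following sketch keeps one table's partitions for 90 days (90 × 24 × 3600 = 7,776,000 seconds). The table name is a placeholder that assumes the sink has already created the table.

```bash
# Override the dataset default for a single table created by the log sink
# (placeholder project, dataset, and table names).
bq update \
    --time_partitioning_expiration 7776000 \
    PROJECT_ID:DATASET_ID.compute_googleapis_com_vpc_flows
```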
Create an aggregated log sink
You route your organization logs into your destination by creating an aggregated sink at the organization level. To include all the logs you selected in the log scoping tool, you configure the sink with the log filter generated by the log scoping tool.
Note: Routing logs to this new destination doesn't mean that your logs are redirected to it. Instead, your logs are stored twice: once in their parent Google Cloud project and then again in the new destination. To avoid this duplicate storage of your logs, add an exclusion filter to the _Default sink of every child Google Cloud project in your organization. To stop logs from being ingested into the _Default sinks of future Google Cloud projects in your organization, disable the _Default sink in the default settings of your organization.

Log Analytics
In a Cloud Shell terminal, run the following gcloud command to create an aggregated sink at the organization level:

gcloud logging sinks create SINK_NAME \
    logging.googleapis.com/projects/PROJECT_ID/locations/BUCKET_LOCATION/buckets/BUCKET_NAME \
    --log-filter="LOG_FILTER" \
    --organization=ORGANIZATION_ID \
    --include-children

Replace the following:

- SINK_NAME: the name of the sink that routes the logs.
- PROJECT_ID: the ID of the Google Cloud project where the aggregated logs will be stored.
- BUCKET_LOCATION: the location of the Logging bucket that you created for log storage.
- BUCKET_NAME: the name of the Logging bucket that you created for log storage.
- LOG_FILTER: the log filter that you saved from the log scoping tool.
- ORGANIZATION_ID: the resource ID for your organization.
The --include-children flag is important so that logs from all the Google Cloud projects within your organization are also included. For more information, see Collate and route organization-level logs to supported destinations.

Verify the sink was created:

gcloud logging sinks list --organization=ORGANIZATION_ID

Get the name of the service account associated with the sink that you just created:

gcloud logging sinks describe SINK_NAME --organization=ORGANIZATION_ID

The output looks similar to the following:
writerIdentity: serviceAccount:p1234567890-12345@logging-o1234567890.iam.gserviceaccount.com

Copy the entire string for writerIdentity starting with serviceAccount:. This identifier is the sink's service account. Until you grant this service account write access to the log bucket, log routing from this sink will fail. You grant write access to the sink's writer identity in the next section.
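If you prefer to capture the writer identity in a shell variable for the steps that follow, a minimal sketch (the variable name is arbitrary):

```bash
# Extract only the writerIdentity field from the sink description.
export SINK_SA=$(gcloud logging sinks describe SINK_NAME \
    --organization=ORGANIZATION_ID \
    --format='value(writerIdentity)')
echo "${SINK_SA}"
```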
BigQuery
In a Cloud Shell terminal, run the following gcloud command to create an aggregated sink at the organization level:

gcloud logging sinks create SINK_NAME \
    bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID \
    --log-filter="LOG_FILTER" \
    --organization=ORGANIZATION_ID \
    --use-partitioned-tables \
    --include-children

Replace the following:

- SINK_NAME: the name of the sink that routes the logs.
- PROJECT_ID: the ID for the Google Cloud project you want to aggregate the logs into.
- DATASET_ID: the ID of the BigQuery dataset you created.
- LOG_FILTER: the log filter that you saved from the log scoping tool.
- ORGANIZATION_ID: the resource ID for your organization.
The --include-children flag is important so that logs from all the Google Cloud projects within your organization are also included. For more information, see Collate and route organization-level logs to supported destinations.

The --use-partitioned-tables flag is important so that data is partitioned by day based on the log entry's timestamp field. This simplifies querying of the data and helps reduce query costs by reducing the amount of data scanned by queries. Another benefit of partitioned tables is that you can set a default partition expiration at the dataset level to meet your log retention requirements. You have already set a default partition expiration when you created the dataset destination in the previous section. You might also choose to set a partition expiration at the individual table level, providing you with fine-grained data retention controls based on log type.

Verify the sink was created:

gcloud logging sinks list --organization=ORGANIZATION_ID

Get the name of the service account associated with the sink that you just created:

gcloud logging sinks describe SINK_NAME --organization=ORGANIZATION_ID

The output looks similar to the following:
writerIdentity: serviceAccount:p1234567890-12345@logging-o1234567890.iam.gserviceaccount.com

Copy the entire string for writerIdentity starting with serviceAccount:. This identifier is the sink's service account. Until you grant this service account write access to the BigQuery dataset, log routing from this sink will fail. You grant write access to the sink's writer identity in the next section.
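As noted in the routing note earlier in this section, you can avoid storing the routed logs twice by adding an exclusion filter to, or disabling, the _Default sink of each child project. The following is a minimal sketch that assumes the LOG_FILTER variable saved earlier still holds your aggregated sink's filter; adjust the exclusion name and project scope to your needs.

```bash
# Option 1: exclude the aggregated logs from a child project's _Default sink.
gcloud logging sinks update _Default \
    --project=CHILD_PROJECT_ID \
    --add-exclusion=name=exclude-aggregated-security-logs,filter="${LOG_FILTER}"

# Option 2: disable the _Default sink in a child project entirely.
gcloud logging sinks update _Default \
    --project=CHILD_PROJECT_ID \
    --disabled
```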
Grant access to the sink
After creating the log sink, you must grant your sink access to write to its destination, be it the Logging bucket or the BigQuery dataset.
Note: To route logs to a resource protected by a service perimeter, you must also add the service account for that sink to an access level and then assign it to the destination service perimeter. This isn't necessary for non-aggregated sinks. For details, see VPC Service Controls: Cloud Logging.

Log Analytics
To add the permissions to the sink's service account, follow these steps:
In the Google Cloud console, go to the IAM page:
Make sure that you've selected the destination Google Cloud project that contains the Logging bucket you created for central log storage.
Click Grant access.
In the New principals field, enter the sink's service account without the serviceAccount: prefix. Recall that this identity comes from the writerIdentity field you retrieved in the previous section after you created the sink.

In the Select a role drop-down menu, select Logs Bucket Writer.
Click Add IAM condition to restrict the service account's access to only the log bucket you created.
Enter a Title and Description for the condition.
In the Condition type drop-down menu, select Resource > Name.
In the Operator drop-down menu, select Ends with.
In the Value field, enter the bucket's location and name as follows:
locations/BUCKET_LOCATION/buckets/BUCKET_NAME

Click Save to add the condition.

Click Save to set the permissions.
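If you prefer the command line over the console, the following sketch grants the equivalent role with a matching IAM condition. It assumes the SINK_SA variable captured earlier and uses placeholder title and description values.

```bash
# Grant the sink's service account write access to the log bucket only.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="${SINK_SA}" \
    --role="roles/logging.bucketWriter" \
    --condition='expression=resource.name.endsWith("locations/BUCKET_LOCATION/buckets/BUCKET_NAME"),title=central-log-bucket-writer,description=Restrict sink writes to the central log bucket'
```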
BigQuery
To add the permissions to the sink's service account, follow these steps:
In the Google Cloud console, go to BigQuery:
Open the BigQuery dataset that you created for central log storage.
In the Dataset info tab, click the Sharing drop-down menu, and then click Permissions.
In the Dataset Permissions side panel, click Add Principal.
In the New principals field, enter the sink's service account without the serviceAccount: prefix. Recall that this identity comes from the writerIdentity field you retrieved in the previous section after you created the sink.

In the Role drop-down menu, select BigQuery Data Editor.
Click Save.
After you grant access to the sink, log entries begin to populate the sink destination: the Logging bucket or the BigQuery dataset.
Configure read access to the destination
Now that your log sink routes logs from your entire organization into a single destination, you can search across all of these logs. Use IAM permissions to manage and grant access as needed.
Log Analytics
To grant access to view and query the logs in your new log bucket, follow these steps.
In the Google Cloud console, go to the IAM page:
Make sure you've selected the Google Cloud project you're using to aggregate the logs.
Click Add.
In the New principal field, add your email account.
In the Select a role drop-down menu, select Logs Views Accessor.
This role provides the newly added principal with read access to all views for any buckets in the Google Cloud project. To limit a user's access, add a condition that lets the user read only from your new bucket.
Click Add condition.
Enter a Title and Description for the condition.
In the Condition type drop-down menu, select Resource > Name.
In the Operator drop-down menu, select Ends with.
In the Value field, enter the bucket's location and name, and the default log view _AllLogs, as follows:

locations/BUCKET_LOCATION/buckets/BUCKET_NAME/views/_AllLogs

Note: Cloud Logging automatically creates the _AllLogs view for every bucket, which shows all the logs in the bucket. For more granular control over which logs can be viewed and queried within that log bucket, you can create and use a custom log view instead of _AllLogs.

Click Save to add the condition.

Click Save to set the permissions.
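If you want to try the custom log view mentioned in the note above, the following is a minimal sketch; the view name and filter are illustrative only.

```bash
# Create a custom log view that exposes only Admin Activity audit logs
# from the central bucket (placeholder view name and filter).
gcloud logging views create admin-activity-only \
    --bucket=BUCKET_NAME \
    --location=BUCKET_LOCATION \
    --project=PROJECT_ID \
    --log-filter='LOG_ID("cloudaudit.googleapis.com/activity")'
```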
BigQuery
To grant access to view and query the logs in your BigQuery dataset, follow the steps in the Granting access to a dataset section of the BigQuery documentation.
Verify that the logs are routed to the destination
Log Analytics
When you route logs to a log bucket upgraded to Log Analytics, you can view and query all log entries through a single log view with a unified schema for all log types. Follow these steps to verify the logs are correctly routed.
In the Google Cloud console, go to the Log Analytics page:
Make sure you've selected the Google Cloud project you're using to aggregate the logs.
Click the Log Views tab.
Expand the log views under the log bucket that you have created (that is, BUCKET_NAME) if it is not expanded already.

Select the default log view _AllLogs. You can now inspect the entire log schema in the right panel, as shown in the following screenshot:
Next to _AllLogs, click Query. This populates the Query editor with a SQL sample query to retrieve recently routed log entries.

Click Run query to view recently routed log entries.
Depending on the level of activity in the Google Cloud projects in your organization, you might have to wait a few minutes until some logs are generated and then routed to your log bucket.
BigQuery
When you route logs to a BigQuery dataset, Cloud Logging creates BigQuery tables to hold the log entries, as shown in the following screenshot:

The screenshot shows how Cloud Logging names each BigQuery table based on the name of the log to which a log entry belongs. For example, the cloudaudit_googleapis_com_data_access table that is selected in the screenshot contains Data Access audit logs whose log ID is cloudaudit.googleapis.com%2Fdata_access. In addition to being named based on the corresponding log entry, each table is also partitioned based on the timestamps for each log entry.
Depending on the level of activity in the Google Cloud projects in your organization, you might have to wait a few minutes until some logs are generated and then routed to your BigQuery dataset.
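To confirm from the command line that the sink has started creating tables, you can list the dataset's tables; a minimal sketch, assuming the dataset created earlier:

```bash
# List the log tables that the sink has created so far.
bq ls PROJECT_ID:DATASET_ID
```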
Note: Both Admin Activity and Data Access logs are loaded into BigQuery with their protoPayload log entry field renamed to protoPayload_auditlog in BigQuery. For more information about schema conversions done by Cloud Logging before writing to BigQuery, see Fields in exported audit logs.

Analyze logs
You can run a broad range of queries against your audit and platform logs. The following list provides a set of sample security questions that you might want to ask of your own logs. For each question in this list, there are two versions of the corresponding CSA query: one for use with Log Analytics and one for use with BigQuery. Use the query version that matches the sink destination that you previously set up.
Log Analytics
Before using any of the SQL queries below, replace MY_PROJECT_ID with the ID of the Google Cloud project where you created the log bucket (that is, PROJECT_ID), MY_LOG_BUCKET_REGION with the region of that log bucket (that is, BUCKET_LOCATION), and MY_LOG_BUCKET_NAME with the name of that log bucket (that is, BUCKET_NAME).
BigQuery
Before using any of the SQL queries below, replace MY_PROJECT_ID with the ID of the Google Cloud project where you created the BigQuery dataset (that is, PROJECT_ID), and MY_DATASET_ID with the name of that dataset (that is, DATASET_ID).
- Login and access questions
- Permission changes questions
- Provisioning activity questions
- Workload usage questions
- Data access questions
- Which users most frequently accessed data in the past week?
- Which users accessed the data in the "accounts" table last month?
- What tables are most frequently accessed and by whom?
- What are the top 10 queries against BigQuery in the past week?
- What are the most common actions recorded in the data access log over the past month?
- Network security questions
Login and access questions
These sample queries perform analysis to detect suspicious login attempts or initial access attempts to your Google Cloud environment.
Note: Login activity is captured in Cloud Identity logs that are included in Google Workspace Login Audit. To analyze login activity and use some of the queries in this section, you need to enable Google Workspace data sharing with Google Cloud. To learn more about sharing Google Workspace audit logs with Google Cloud, see View and manage audit logs for Google Workspace.

Any suspicious login attempt flagged by Google Workspace?
By searching Cloud Identity logs that are part of Google Workspace Login Audit, the following query detects suspicious login attempts flagged by Google Workspace. Such login attempts might be from the Google Cloud console, Admin console, or the gcloud CLI.
Log Analytics
SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.request_metadata.caller_ip,
  proto_payload.audit_log.method_name,
  parameter
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`,
  UNNEST(JSON_QUERY_ARRAY(proto_payload.audit_log.metadata.event[0].parameter)) AS parameter
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
  AND proto_payload.audit_log IS NOT NULL
  AND proto_payload.audit_log.service_name = "login.googleapis.com"
  AND proto_payload.audit_log.method_name = "google.login.LoginService.loginSuccess"
  AND JSON_VALUE(parameter.name) = "is_suspicious"
  AND JSON_VALUE(parameter.boolValue) = "true"

BigQuery
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.requestMetadata.callerIp,
  protopayload_auditlog.methodName
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_data_access`,
  UNNEST(JSON_QUERY_ARRAY(protopayload_auditlog.metadataJson, '$.event[0].parameter')) AS parameter
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
  AND protopayload_auditlog.metadataJson IS NOT NULL
  AND protopayload_auditlog.serviceName = "login.googleapis.com"
  AND protopayload_auditlog.methodName = "google.login.LoginService.loginSuccess"
  AND JSON_VALUE(parameter, '$.name') = "is_suspicious"
  AND JSON_VALUE(parameter, '$.boolValue') = "true"

Any excessive login failures from any user identity?
By searching Cloud Identity logs that are part of Google Workspace Login Audit, the following query detects users who have had three or more successive login failures within the last 24 hours.
Log Analytics
SELECT
  proto_payload.audit_log.authentication_info.principal_email,
  MIN(timestamp) AS earliest,
  MAX(timestamp) AS latest,
  count(*) AS attempts
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND proto_payload.audit_log.service_name = "login.googleapis.com"
  AND proto_payload.audit_log.method_name = "google.login.LoginService.loginFailure"
GROUP BY 1
HAVING attempts >= 3

BigQuery
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail,
  MIN(timestamp) AS earliest,
  MAX(timestamp) AS latest,
  count(*) AS attempts
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_data_access`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND protopayload_auditlog.serviceName = "login.googleapis.com"
  AND protopayload_auditlog.methodName = "google.login.LoginService.loginFailure"
GROUP BY 1
HAVING attempts >= 3

Any access attempts violating VPC Service Controls?
By analyzing Policy Denied audit logs from Cloud Audit Logs, the following query detects access attempts blocked by VPC Service Controls. Any query results might indicate potential malicious activity like access attempts from unauthorized networks using stolen credentials.
Log Analytics
SELECT
  timestamp,
  log_name,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.request_metadata.caller_ip,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.service_name,
  JSON_VALUE(proto_payload.audit_log.metadata.violationReason) as violationReason,
  IF(JSON_VALUE(proto_payload.audit_log.metadata.ingressViolations) IS NULL, 'ingress', 'egress') AS violationType,
  COALESCE(
    JSON_VALUE(proto_payload.audit_log.metadata.ingressViolations[0].targetResource),
    JSON_VALUE(proto_payload.audit_log.metadata.egressViolations[0].targetResource)
  ) AS targetResource,
  COALESCE(
    JSON_VALUE(proto_payload.audit_log.metadata.ingressViolations[0].servicePerimeter),
    JSON_VALUE(proto_payload.audit_log.metadata.egressViolations[0].servicePerimeter)
  ) AS servicePerimeter
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND proto_payload.audit_log IS NOT NULL
  AND JSON_VALUE(proto_payload.audit_log.metadata, '$."@type"') = 'type.googleapis.com/google.cloud.audit.VpcServiceControlAuditMetadata'
ORDER BY timestamp DESC
LIMIT 1000

BigQuery
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.requestMetadata.callerIp,
  protopayload_auditlog.methodName,
  protopayload_auditlog.serviceName,
  JSON_VALUE(protopayload_auditlog.metadataJson, '$.violationReason') as violationReason,
  IF(JSON_VALUE(protopayload_auditlog.metadataJson, '$.ingressViolations') IS NULL, 'ingress', 'egress') AS violationType,
  COALESCE(
    JSON_VALUE(protopayload_auditlog.metadataJson, '$.ingressViolations[0].targetResource'),
    JSON_VALUE(protopayload_auditlog.metadataJson, '$.egressViolations[0].targetResource')
  ) AS targetResource,
  COALESCE(
    JSON_VALUE(protopayload_auditlog.metadataJson, '$.ingressViolations[0].servicePerimeter'),
    JSON_VALUE(protopayload_auditlog.metadataJson, '$.egressViolations[0].servicePerimeter')
  ) AS servicePerimeter
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_policy`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 400 DAY)
  AND JSON_VALUE(protopayload_auditlog.metadataJson, '$."@type"') = 'type.googleapis.com/google.cloud.audit.VpcServiceControlAuditMetadata'
ORDER BY timestamp DESC
LIMIT 1000

Any access attempts violating IAP access controls?
By analyzing external Application Load Balancer logs, the following query detects access attempts blocked by IAP. Any query results might indicate an initial access attempt or vulnerability exploit attempt.
Log Analytics
SELECT
  timestamp,
  http_request.remote_ip,
  http_request.request_method,
  http_request.status,
  JSON_VALUE(resource.labels.backend_service_name) AS backend_service_name,
  http_request.request_url
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "http_load_balancer"
  AND JSON_VALUE(json_payload.statusDetails) = "handled_by_identity_aware_proxy"
ORDER BY timestamp DESC

BigQuery
SELECT
  timestamp,
  httpRequest.remoteIp,
  httpRequest.requestMethod,
  httpRequest.status,
  resource.labels.backend_service_name,
  httpRequest.requestUrl,
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].requests`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "http_load_balancer"
  AND jsonpayload_type_loadbalancerlogentry.statusdetails = "handled_by_identity_aware_proxy"
ORDER BY timestamp DESC

Permission changes questions
These sample queries perform analysis over administrator activity that changes permissions, including changes in IAM policies, groups and group memberships, service accounts, and any associated keys. Such permission changes might provide a high level of access to sensitive data or environments.
Note: Group changes are captured in Google Workspace Admin Audit. To analyze group changes activity and use some of the queries in this section, you need to enable Google Workspace data sharing with Google Cloud. To learn more about sharing Google Workspace audit logs with Google Cloud, see View and manage audit logs for Google Workspace.

Any user added to highly-privileged groups?
By analyzing Google Workspace Admin Audit audit logs, the following query detects users who have been added to any of the highly-privileged groups listed in the query. You use the regular expression in the query to define which groups (such as admin@example.com or prod@example.com) to monitor. Any query results might indicate a malicious or accidental privilege escalation.
Log Analytics
SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.resource_name,
  (SELECT JSON_VALUE(x.value)
   FROM UNNEST(JSON_QUERY_ARRAY(proto_payload.audit_log.metadata.event[0].parameter)) AS x
   WHERE JSON_VALUE(x.name) = "USER_EMAIL") AS user_email,
  (SELECT JSON_VALUE(x.value)
   FROM UNNEST(JSON_QUERY_ARRAY(proto_payload.audit_log.metadata.event[0].parameter)) AS x
   WHERE JSON_VALUE(x.name) = "GROUP_EMAIL") AS group_email,
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 120 DAY)
  AND proto_payload.audit_log.service_name = "admin.googleapis.com"
  AND proto_payload.audit_log.method_name = "google.admin.AdminService.addGroupMember"
  AND EXISTS(
    SELECT *
    FROM UNNEST(JSON_QUERY_ARRAY(proto_payload.audit_log.metadata.event[0].parameter)) AS x
    WHERE
      JSON_VALUE(x.name) = "GROUP_EMAIL"
      AND REGEXP_CONTAINS(JSON_VALUE(x.value), r'(admin|prod).*')
      -- Update regexp with other sensitive groups if applicable
  )

BigQuery
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.methodName,
  protopayload_auditlog.resourceName,
  (SELECT JSON_VALUE(x, '$.value')
   FROM UNNEST(JSON_QUERY_ARRAY(protopayload_auditlog.metadataJson, '$.event[0].parameter')) AS x
   WHERE JSON_VALUE(x, '$.name') = "USER_EMAIL") AS userEmail,
  (SELECT JSON_VALUE(x, '$.value')
   FROM UNNEST(JSON_QUERY_ARRAY(protopayload_auditlog.metadataJson, '$.event[0].parameter')) AS x
   WHERE JSON_VALUE(x, '$.name') = "GROUP_EMAIL") AS groupEmail,
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 120 DAY)
  AND protopayload_auditlog.serviceName = "admin.googleapis.com"
  AND protopayload_auditlog.methodName = "google.admin.AdminService.addGroupMember"
  AND EXISTS(
    SELECT *
    FROM UNNEST(JSON_QUERY_ARRAY(protopayload_auditlog.metadataJson, '$.event[0].parameter')) AS x
    WHERE
      JSON_VALUE(x, '$.name') = 'GROUP_EMAIL'
      AND REGEXP_CONTAINS(JSON_VALUE(x, '$.value'), r'(admin|prod).*')
      -- Update regexp with other sensitive groups if applicable
  )

Any permissions granted over a service account?
By analyzing Admin Activity audit logs from Cloud Audit Logs, the following query detects any permissions that have been granted to any principal over a service account. Examples of permissions that might be granted are the ability to impersonate that service account or create service account keys. Any query results might indicate an instance of privilege escalation or a risk of credentials leakage.
Log Analytics
SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email as grantor,
  JSON_VALUE(bindingDelta.member) as grantee,
  JSON_VALUE(bindingDelta.role) as role,
  proto_payload.audit_log.resource_name,
  proto_payload.audit_log.method_name
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`,
  UNNEST(JSON_QUERY_ARRAY(proto_payload.audit_log.service_data.policyDelta.bindingDeltas)) AS bindingDelta
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 400 DAY)
  -- AND log_id = "cloudaudit.googleapis.com/activity"
  AND (
    (resource.type = "service_account"
     AND proto_payload.audit_log.method_name LIKE "google.iam.admin.%.SetIAMPolicy")
    OR
    (resource.type IN ("project", "folder", "organization")
     AND proto_payload.audit_log.method_name = "SetIamPolicy"
     AND JSON_VALUE(bindingDelta.role) LIKE "roles/iam.serviceAccount%")
  )
  AND JSON_VALUE(bindingDelta.action) = "ADD"
  -- Principal (grantee) exclusions
  AND JSON_VALUE(bindingDelta.member) NOT LIKE "%@example.com"
ORDER BY timestamp DESC

BigQuery
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail as grantor,
  bindingDelta.member as grantee,
  bindingDelta.role,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName,
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`,
  UNNEST(protopayload_auditlog.servicedata_v1_iam.policyDelta.bindingDeltas) AS bindingDelta
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 180 DAY)
  AND (
    (resource.type = "service_account"
     AND protopayload_auditlog.methodName LIKE "google.iam.admin.%.SetIAMPolicy")
    OR
    (resource.type IN ("project", "folder", "organization")
     AND protopayload_auditlog.methodName = "SetIamPolicy"
     AND bindingDelta.role LIKE "roles/iam.serviceAccount%")
  )
  AND bindingDelta.action = 'ADD'
  -- Principal (grantee) exclusions
  AND bindingDelta.member NOT LIKE "%@example.com"
ORDER BY timestamp DESC

Any service accounts or keys created by non-approved identity?
By analyzing Admin Activity audit logs, the following query detects any service accounts or keys that have been manually created by a user. For example, you might follow a best practice to only allow service accounts to be created by an approved service account as part of an automated workflow. Therefore, any service account creation outside of that workflow is considered non-compliant and possibly malicious.
Log Analytics
SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.resource_name,
  JSON_VALUE(proto_payload.audit_log.response.email) as service_account_email
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "service_account"
  AND proto_payload.audit_log.method_name LIKE "%CreateServiceAccount%"
  AND proto_payload.audit_log.authentication_info.principal_email NOT LIKE "%.gserviceaccount.com"

BigQuery
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.methodName,
  protopayload_auditlog.resourceName,
  JSON_VALUE(protopayload_auditlog.responseJson, "$.email") as serviceAccountEmail
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 180 DAY)
  AND resource.type = "service_account"
  AND protopayload_auditlog.methodName LIKE "%CreateServiceAccount%"
  AND protopayload_auditlog.authenticationInfo.principalEmail NOT LIKE "%.gserviceaccount.com"

Any user added to (or removed from) sensitive IAM policy?
By searching Admin Activity audit logs, the following query detects any user or group access change for an IAP-secured resource such as a Compute Engine backend service. The following query searches all IAM policy updates for IAP resources involving the IAM role roles/iap.httpsResourceAccessor. This role provides permissions to access the HTTPS resource or the backend service. Any query results might indicate attempts to bypass the defenses of a backend service that might be exposed to the internet.
Log Analytics
SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email,
  resource.type,
  proto_payload.audit_log.resource_name,
  JSON_VALUE(binding, '$.role') as role,
  JSON_VALUE_ARRAY(binding, '$.members') as members
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`,
  UNNEST(JSON_QUERY_ARRAY(proto_payload.audit_log.response, '$.bindings')) AS binding
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  -- AND log_id = "cloudaudit.googleapis.com/activity"
  AND proto_payload.audit_log.service_name = "iap.googleapis.com"
  AND proto_payload.audit_log.method_name LIKE "%.IdentityAwareProxyAdminService.SetIamPolicy"
  AND JSON_VALUE(binding, '$.role') = "roles/iap.httpsResourceAccessor"
ORDER BY timestamp DESC

BigQuery
SELECT
  timestamp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  resource.type,
  protopayload_auditlog.resourceName,
  JSON_VALUE(binding, '$.role') as role,
  JSON_VALUE_ARRAY(binding, '$.members') as members
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`,
  UNNEST(JSON_QUERY_ARRAY(protopayload_auditlog.responseJson, '$.bindings')) AS binding
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 400 DAY)
  AND protopayload_auditlog.serviceName = "iap.googleapis.com"
  AND protopayload_auditlog.methodName LIKE "%.IdentityAwareProxyAdminService.SetIamPolicy"
  AND JSON_VALUE(binding, '$.role') = "roles/iap.httpsResourceAccessor"
ORDER BY timestamp DESC

Provisioning activity questions
These sample queries perform analysis to detect suspicious or anomalous admin activity like provisioning and configuring resources.
Any changes made to logging settings?
By searching Admin Activity audit logs, the following query detects any change made to logging settings. Monitoring logging settings helps you detect accidental or malicious disabling of audit logs and similar defense evasion techniques.
Log Analytics
SELECT
  receive_timestamp,
  timestamp AS eventTimestamp,
  proto_payload.audit_log.request_metadata.caller_ip,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.resource_name,
  proto_payload.audit_log.method_name
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  proto_payload.audit_log.service_name = "logging.googleapis.com"
  AND log_id = "cloudaudit.googleapis.com/activity"

BigQuery
SELECT
  receiveTimestamp,
  timestamp AS eventTimestamp,
  protopayload_auditlog.requestMetadata.callerIp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`
WHERE
  protopayload_auditlog.serviceName = "logging.googleapis.com"

Any VPC Flow Logs actively disabled?
By searching Admin Activity audit logs, the following query detects any subnet whose VPC Flow Logs were actively disabled. Monitoring VPC Flow Logs settings helps you detect accidental or malicious disabling of VPC Flow Logs and similar defense evasion techniques.
Log Analytics
SELECT
  receive_timestamp,
  timestamp AS eventTimestamp,
  proto_payload.audit_log.request_metadata.caller_ip,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.resource_name,
  proto_payload.audit_log.method_name
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  proto_payload.audit_log.method_name = "v1.compute.subnetworks.patch"
  AND (
    JSON_VALUE(proto_payload.audit_log.request, "$.logConfig.enable") = "false"
    OR JSON_VALUE(proto_payload.audit_log.request, "$.enableFlowLogs") = "false"
  )

BigQuery
SELECT
  receiveTimestamp,
  timestamp AS eventTimestamp,
  protopayload_auditlog.requestMetadata.callerIp,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`
WHERE
  protopayload_auditlog.methodName = "v1.compute.subnetworks.patch"
  AND JSON_VALUE(protopayload_auditlog.requestJson, "$.logConfig.enable") = "false"

Any unusually high number of firewall rules modified in the past week?
By searching Admin Activity audit logs, the following query detects any unusually high number of firewall rule changes on any given day in the past week. To determine whether there is an outlier, the query performs statistical analysis over the daily counts of firewall rule changes. Averages and standard deviations are computed for each day by looking back at the preceding daily counts with a lookback window of 90 days. A day is considered an outlier when its count is more than two standard deviations above the mean. The query, including the standard deviation factor and the lookback window, can all be configured to fit your cloud provisioning activity profile and to minimize false positives.
Log Analytics
SELECT *
FROM (
  SELECT
    *,
    AVG(counter) OVER (ORDER BY day ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS avg,
    STDDEV(counter) OVER (ORDER BY day ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS stddev,
    COUNT(*) OVER (RANGE BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS numSamples
  FROM (
    SELECT
      EXTRACT(DATE FROM timestamp) AS day,
      ARRAY_AGG(DISTINCT proto_payload.audit_log.method_name IGNORE NULLS) AS actions,
      ARRAY_AGG(DISTINCT proto_payload.audit_log.authentication_info.principal_email IGNORE NULLS) AS actors,
      COUNT(*) AS counter
    FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
    WHERE
      timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
      AND proto_payload.audit_log.method_name LIKE "v1.compute.firewalls.%"
      AND proto_payload.audit_log.method_name NOT IN ("v1.compute.firewalls.list", "v1.compute.firewalls.get")
    GROUP BY day
  )
)
WHERE
  counter > avg + 2 * stddev
  AND day >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
ORDER BY counter DESC

BigQuery
SELECT
  *,
  AVG(counter) OVER (ORDER BY day ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS avg,
  STDDEV(counter) OVER (ORDER BY day ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS stddev,
  COUNT(*) OVER (RANGE BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS numSamples
FROM (
  SELECT
    EXTRACT(DATE FROM timestamp) AS day,
    ARRAY_AGG(DISTINCT protopayload_auditlog.methodName IGNORE NULLS) AS actions,
    ARRAY_AGG(DISTINCT protopayload_auditlog.authenticationInfo.principalEmail IGNORE NULLS) AS actors,
    COUNT(*) AS counter
  FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`
  WHERE
    timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
    AND protopayload_auditlog.methodName LIKE "v1.compute.firewalls.%"
    AND protopayload_auditlog.methodName NOT IN ("v1.compute.firewalls.list", "v1.compute.firewalls.get")
  GROUP BY day
)
WHERE TRUE
QUALIFY
  counter > avg + 2 * stddev
  AND day >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
ORDER BY counter DESC

Any VMs deleted in the past week?
By searching Admin Activity audit logs, the following query lists any Compute Engine instances deleted in the past week. This query can help you audit resource deletions and detect potential malicious activity.
Log Analytics
SELECT
  timestamp,
  JSON_VALUE(resource.labels.instance_id) AS instance_id,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.resource_name,
  proto_payload.audit_log.method_name
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  resource.type = "gce_instance"
  AND proto_payload.audit_log.method_name = "v1.compute.instances.delete"
  AND operation.first IS TRUE
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
ORDER BY timestamp desc, instance_id
LIMIT 1000

BigQuery
SELECT
  timestamp,
  resource.labels.instance_id,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.resourceName,
  protopayload_auditlog.methodName
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`
WHERE
  resource.type = "gce_instance"
  AND protopayload_auditlog.methodName = "v1.compute.instances.delete"
  AND operation.first IS TRUE
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
ORDER BY timestamp desc, resource.labels.instance_id
LIMIT 1000

Workload usage questions
These sample queries perform analysis to understand who and what is consuming your cloud workloads and APIs, and help you detect potential malicious behavior internally or externally.
Any unusually high API usage by any user identity in the past week?
By analyzing all Cloud Audit Logs, the following query detects unusually high API usage by any user identity on any given day in the past week. Such unusually high usage might be an indicator of potential API abuse, insider threat, or leaked credentials. To determine whether there is an outlier, this query performs statistical analysis over the daily count of actions per principal. Averages and standard deviations are computed for each day and for each principal by looking back at the preceding daily counts with a lookback window of 60 days. A user's day is considered an outlier when their daily count is more than three standard deviations above their mean. The query, including the standard deviation factor and the lookback window, is configurable to fit your cloud provisioning activity profile and to minimize false positives.
Log Analytics
SELECT
  *
FROM (
  SELECT
    *,
    AVG(counter) OVER (
      PARTITION BY principal_email
      ORDER BY day
      ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS avg,
    STDDEV(counter) OVER (
      PARTITION BY principal_email
      ORDER BY day
      ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS stddev,
    COUNT(*) OVER (
      PARTITION BY principal_email
      RANGE BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS numSamples
  FROM (
    SELECT
      proto_payload.audit_log.authentication_info.principal_email,
      EXTRACT(DATE FROM timestamp) AS day,
      ARRAY_AGG(DISTINCT proto_payload.audit_log.method_name IGNORE NULLS) AS actions,
      COUNT(*) AS counter
    FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
    WHERE
      timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
      AND proto_payload.audit_log.authentication_info.principal_email IS NOT NULL
      AND proto_payload.audit_log.method_name NOT LIKE "storage.%.get"
      AND proto_payload.audit_log.method_name NOT LIKE "v1.compute.%.list"
      AND proto_payload.audit_log.method_name NOT LIKE "beta.compute.%.list"
    GROUP BY proto_payload.audit_log.authentication_info.principal_email, day
  )
)
WHERE
  counter > avg + 3 * stddev
  AND day >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
ORDER BY counter DESC
BigQuery
SELECT
  *,
  AVG(counter) OVER (
    PARTITION BY principalEmail
    ORDER BY day
    ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS avg,
  STDDEV(counter) OVER (
    PARTITION BY principalEmail
    ORDER BY day
    ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING) AS stddev,
  COUNT(*) OVER (
    PARTITION BY principalEmail
    RANGE BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) AS numSamples
FROM (
  SELECT
    protopayload_auditlog.authenticationInfo.principalEmail,
    EXTRACT(DATE FROM timestamp) AS day,
    ARRAY_AGG(DISTINCT protopayload_auditlog.methodName IGNORE NULLS) AS actions,
    COUNT(*) AS counter
  FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_*`
  WHERE
    timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
    AND protopayload_auditlog.authenticationInfo.principalEmail IS NOT NULL
    AND protopayload_auditlog.methodName NOT LIKE "storage.%.get"
    AND protopayload_auditlog.methodName NOT LIKE "v1.compute.%.list"
    AND protopayload_auditlog.methodName NOT LIKE "beta.compute.%.list"
  GROUP BY protopayload_auditlog.authenticationInfo.principalEmail, day
)
WHERE TRUE
QUALIFY
  counter > avg + 3 * stddev
  AND day >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
ORDER BY counter DESC
What is the autoscaling usage per day in the past month?
By analyzing Admin Activity audit logs, the following query reports the autoscaling usage by day for the last month. You can use this query to identify patterns or anomalies that warrant further security investigation.
Log Analytics
SELECT
  TIMESTAMP_TRUNC(timestamp, DAY) AS day,
  proto_payload.audit_log.method_name,
  COUNT(*) AS counter
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  resource.type = "gce_instance_group_manager"
  AND log_id = "cloudaudit.googleapis.com/activity"
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY 1, 2
ORDER BY 1, 2
BigQuery
SELECT
  TIMESTAMP_TRUNC(timestamp, DAY) AS day,
  protopayload_auditlog.methodName AS methodName,
  COUNT(*) AS counter
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_activity`
WHERE
  resource.type = "gce_instance_group_manager"
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY 1, 2
ORDER BY 1, 2
Data access questions
These sample queries perform analysis to understand who is accessing or modifying data in Google Cloud.
Which users most frequently accessed data in the past week?
The following query uses the Data Access audit logs to find the user identities that most frequently accessed BigQuery table data over the past week.
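If you want to focus on human users rather than service accounts, a hedged refinement (verify the pattern against the principals in your own logs) is to add a filter such as AND proto_payload.audit_log.authentication_info.principal_email NOT LIKE "%gserviceaccount.com" to the Log Analytics query below, and an equivalent principalEmail filter to the BigQuery query.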
Log Analytics
SELECT
  proto_payload.audit_log.authentication_info.principal_email,
  COUNT(*) AS COUNTER
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  (proto_payload.audit_log.method_name = "google.cloud.bigquery.v2.JobService.InsertJob"
    OR proto_payload.audit_log.method_name = "google.cloud.bigquery.v2.JobService.Query")
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND log_id = "cloudaudit.googleapis.com/data_access"
GROUP BY 1
ORDER BY 2 desc, 1
LIMIT 100
BigQuery
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail,
  COUNT(*) AS COUNTER
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_data_access`
WHERE
  (protopayload_auditlog.methodName = "google.cloud.bigquery.v2.JobService.InsertJob"
    OR protopayload_auditlog.methodName = "google.cloud.bigquery.v2.JobService.Query")
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY 1
ORDER BY 2 desc, 1
LIMIT 100
Which users accessed the data in the "accounts" table last month?
The following query uses the Data Access audit logs to find the user identities that most frequently queried a given accounts table over the past month. Besides the MY_DATASET_ID and MY_PROJECT_ID placeholders for your BigQuery export destination, the following query uses the DATASET_ID and PROJECT_ID placeholders. You need to replace the DATASET_ID and PROJECT_ID placeholders to specify the target table whose access is being analyzed, such as the accounts table in this example.
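For illustration only, if the target table lived in a hypothetical project named my-analytics-project and a dataset named finance, the resource filter in the queries below would become projects/my-analytics-project/datasets/finance/tables/accounts.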
Log Analytics
SELECT
  proto_payload.audit_log.authentication_info.principal_email,
  COUNT(*) AS COUNTER
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`,
  UNNEST(proto_payload.audit_log.authorization_info) authorization_info
WHERE
  (proto_payload.audit_log.method_name = "google.cloud.bigquery.v2.JobService.InsertJob"
    OR proto_payload.audit_log.method_name = "google.cloud.bigquery.v2.JobService.Query")
  AND authorization_info.permission = "bigquery.tables.getData"
  AND authorization_info.resource = "projects/[PROJECT_ID]/datasets/[DATASET_ID]/tables/accounts"
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY 1
ORDER BY 2 desc, 1
LIMIT 100
BigQuery
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail,
  COUNT(*) AS COUNTER
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_data_access`,
  UNNEST(protopayload_auditlog.authorizationInfo) authorizationInfo
WHERE
  (protopayload_auditlog.methodName = "google.cloud.bigquery.v2.JobService.InsertJob"
    OR protopayload_auditlog.methodName = "google.cloud.bigquery.v2.JobService.Query")
  AND authorizationInfo.permission = "bigquery.tables.getData"
  AND authorizationInfo.resource = "projects/[PROJECT_ID]/datasets/[DATASET_ID]/tables/accounts"
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY 1
ORDER BY 2 desc, 1
LIMIT 100
What tables are most frequently accessed and by whom?
The following query uses the Data Access audit logs to find the BigQuery tables whose data was most frequently read and modified over the past month. It displays the associated user identity along with a breakdown of the total number of times data was read versus modified.
Log Analytics
SELECT
  proto_payload.audit_log.resource_name,
  proto_payload.audit_log.authentication_info.principal_email,
  COUNTIF(JSON_VALUE(proto_payload.audit_log.metadata, "$.tableDataRead") IS NOT NULL) AS dataReadEvents,
  COUNTIF(JSON_VALUE(proto_payload.audit_log.metadata, "$.tableDataChange") IS NOT NULL) AS dataChangeEvents,
  COUNT(*) AS totalEvents
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  STARTS_WITH(resource.type, 'bigquery') IS TRUE
  AND (JSON_VALUE(proto_payload.audit_log.metadata, "$.tableDataRead") IS NOT NULL
    OR JSON_VALUE(proto_payload.audit_log.metadata, "$.tableDataChange") IS NOT NULL)
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY 1, 2
ORDER BY 5 DESC, 1, 2
LIMIT 1000
BigQuery
SELECT
  protopayload_auditlog.resourceName,
  protopayload_auditlog.authenticationInfo.principalEmail,
  COUNTIF(JSON_EXTRACT(protopayload_auditlog.metadataJson, "$.tableDataRead") IS NOT NULL) AS dataReadEvents,
  COUNTIF(JSON_EXTRACT(protopayload_auditlog.metadataJson, "$.tableDataChange") IS NOT NULL) AS dataChangeEvents,
  COUNT(*) AS totalEvents
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_data_access`
WHERE
  STARTS_WITH(resource.type, 'bigquery') IS TRUE
  AND (JSON_EXTRACT(protopayload_auditlog.metadataJson, "$.tableDataRead") IS NOT NULL
    OR JSON_EXTRACT(protopayload_auditlog.metadataJson, "$.tableDataChange") IS NOT NULL)
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY 1, 2
ORDER BY 5 DESC, 1, 2
LIMIT 1000
What are the top 10 queries against BigQuery in the past week?
The following query uses the Data Access audit logs to find the most common queries over the past week. It also lists the corresponding users and the referenced tables.
Log Analytics
SELECT
  COALESCE(
    JSON_VALUE(proto_payload.audit_log.metadata, "$.jobChange.job.jobConfig.queryConfig.query"),
    JSON_VALUE(proto_payload.audit_log.metadata, "$.jobInsertion.job.jobConfig.queryConfig.query")) as query,
  STRING_AGG(DISTINCT proto_payload.audit_log.authentication_info.principal_email, ',') as users,
  ANY_VALUE(COALESCE(
    JSON_EXTRACT_ARRAY(proto_payload.audit_log.metadata, "$.jobChange.job.jobStats.queryStats.referencedTables"),
    JSON_EXTRACT_ARRAY(proto_payload.audit_log.metadata, "$.jobInsertion.job.jobStats.queryStats.referencedTables"))) as tables,
  COUNT(*) AS counter
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  (resource.type = 'bigquery_project' OR resource.type = 'bigquery_dataset')
  AND operation.last IS TRUE
  AND (JSON_VALUE(proto_payload.audit_log.metadata, "$.jobChange") IS NOT NULL
    OR JSON_VALUE(proto_payload.audit_log.metadata, "$.jobInsertion") IS NOT NULL)
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY query
ORDER BY counter DESC
LIMIT 10
BigQuery
SELECT
  COALESCE(
    JSON_EXTRACT_SCALAR(protopayload_auditlog.metadataJson, "$.jobChange.job.jobConfig.queryConfig.query"),
    JSON_EXTRACT_SCALAR(protopayload_auditlog.metadataJson, "$.jobInsertion.job.jobConfig.queryConfig.query")) as query,
  STRING_AGG(DISTINCT protopayload_auditlog.authenticationInfo.principalEmail, ',') as users,
  ANY_VALUE(COALESCE(
    JSON_EXTRACT_ARRAY(protopayload_auditlog.metadataJson, "$.jobChange.job.jobStats.queryStats.referencedTables"),
    JSON_EXTRACT_ARRAY(protopayload_auditlog.metadataJson, "$.jobInsertion.job.jobStats.queryStats.referencedTables"))) as tables,
  COUNT(*) AS counter
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_data_access`
WHERE
  (resource.type = 'bigquery_project' OR resource.type = 'bigquery_dataset')
  AND operation.last IS TRUE
  AND (JSON_EXTRACT(protopayload_auditlog.metadataJson, "$.jobChange") IS NOT NULL
    OR JSON_EXTRACT(protopayload_auditlog.metadataJson, "$.jobInsertion") IS NOT NULL)
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY query
ORDER BY counter DESC
LIMIT 10
What are the most common actions recorded in the data access log over the past month?
The following query uses the Data Access audit logs to find the 100 most frequent actions recorded over the past month.
Log Analytics
SELECT
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.service_name,
  resource.type,
  COUNT(*) AS counter
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND log_id = "cloudaudit.googleapis.com/data_access"
GROUP BY proto_payload.audit_log.method_name, proto_payload.audit_log.service_name, resource.type
ORDER BY counter DESC
LIMIT 100
BigQuery
SELECT
  protopayload_auditlog.methodName,
  protopayload_auditlog.serviceName,
  resource.type,
  COUNT(*) AS counter
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].cloudaudit_googleapis_com_data_access`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY protopayload_auditlog.methodName, protopayload_auditlog.serviceName, resource.type
ORDER BY counter DESC
LIMIT 100
Network security questions
These sample queries perform analysis over your network activity in Google Cloud.
Any connections from a new IP address to a specific subnetwork?
The following query detects connections from any new source IP address to a given subnet by analyzing VPC Flow Logs. In this example, a source IP address is considered new if it was seen for the first time in the last 24 hours, over a lookback window of 60 days. You might want to use and tune this query on a subnet that is in scope for a particular compliance requirement like PCI.
Log Analytics
SELECT
  JSON_VALUE(json_payload.connection.src_ip) as src_ip,
  -- TIMESTAMP supports up to 6 digits of fractional precision, so drop any more digits to avoid parse errors
  MIN(TIMESTAMP(REGEXP_REPLACE(JSON_VALUE(json_payload.start_time), r'\.(\d{0,6})\d+(Z)?$', '.\\1\\2'))) AS firstInstance,
  MAX(TIMESTAMP(REGEXP_REPLACE(JSON_VALUE(json_payload.start_time), r'\.(\d{0,6})\d+(Z)?$', '.\\1\\2'))) AS lastInstance,
  ARRAY_AGG(DISTINCT JSON_VALUE(resource.labels.subnetwork_name)) as subnetNames,
  ARRAY_AGG(DISTINCT JSON_VALUE(json_payload.dest_instance.vm_name)) as vmNames,
  COUNT(*) numSamples
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
  AND JSON_VALUE(json_payload.reporter) = 'DEST'
  AND JSON_VALUE(resource.labels.subnetwork_name) IN ('prod-customer-data')
GROUP BY src_ip
HAVING firstInstance >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY lastInstance DESC, numSamples DESC
BigQuery
SELECT
  jsonPayload.connection.src_ip as src_ip,
  -- TIMESTAMP supports up to 6 digits of fractional precision, so drop any more digits to avoid parse errors
  MIN(TIMESTAMP(REGEXP_REPLACE(jsonPayload.start_time, r'\.(\d{0,6})\d+(Z)?$', '.\\1\\2'))) AS firstInstance,
  MAX(TIMESTAMP(REGEXP_REPLACE(jsonPayload.start_time, r'\.(\d{0,6})\d+(Z)?$', '.\\1\\2'))) AS lastInstance,
  ARRAY_AGG(DISTINCT resource.labels.subnetwork_name) as subnetNames,
  ARRAY_AGG(DISTINCT jsonPayload.dest_instance.vm_name) as vmNames,
  COUNT(*) numSamples
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].compute_googleapis_com_vpc_flows`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
  AND jsonPayload.reporter = 'DEST'
  AND resource.labels.subnetwork_name IN ('prod-customer-data')
GROUP BY src_ip
HAVING firstInstance >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY lastInstance DESC, numSamples DESC
Any connections blocked by Google Cloud Armor?
The following query helps detect potential exploit attempts by analyzing external Application Load Balancer logs to find any connection blocked by the security policy configured in Google Cloud Armor. This query assumes that you have a Google Cloud Armor security policy configured on your external Application Load Balancer. This query also assumes that you have enabled external Application Load Balancer logging as described in the instructions that are provided by the Enable link in the log scoping tool.
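As a complementary view, the following is a minimal Log Analytics sketch, assuming the same log bucket view and fields used in the queries below, that aggregates blocked requests by source IP address to surface the most persistent blocked clients:

-- Hedged sketch: count requests denied by the Google Cloud Armor security policy,
-- grouped by client IP address, over the past 30 days.
SELECT
  http_request.remote_ip,
  COUNT(*) AS blocked_requests
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "http_load_balancer"
  AND JSON_VALUE(json_payload.statusDetails) = "denied_by_security_policy"
GROUP BY http_request.remote_ip
ORDER BY blocked_requests DESC
LIMIT 20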
Log Analytics
SELECT
  timestamp,
  http_request.remote_ip,
  http_request.request_method,
  http_request.status,
  JSON_VALUE(json_payload.enforcedSecurityPolicy.name) AS security_policy_name,
  JSON_VALUE(resource.labels.backend_service_name) AS backend_service_name,
  http_request.request_url,
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "http_load_balancer"
  AND JSON_VALUE(json_payload.statusDetails) = "denied_by_security_policy"
ORDER BY timestamp DESC
BigQuery
SELECT
  timestamp,
  httpRequest.remoteIp,
  httpRequest.requestMethod,
  httpRequest.status,
  jsonpayload_type_loadbalancerlogentry.enforcedsecuritypolicy.name,
  resource.labels.backend_service_name,
  httpRequest.requestUrl,
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].requests`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "http_load_balancer"
  AND jsonpayload_type_loadbalancerlogentry.statusdetails = "denied_by_security_policy"
ORDER BY timestamp DESC
Any high-severity virus or malware detected by Cloud IDS?
The following query shows any high-severity virus or malware detected by Cloud IDS by searching Cloud IDS Threat Logs. This query assumes that you have a Cloud IDS endpoint configured.
Log Analytics
SELECT
  JSON_VALUE(json_payload.alert_time) AS alert_time,
  JSON_VALUE(json_payload.name) AS name,
  JSON_VALUE(json_payload.details) AS details,
  JSON_VALUE(json_payload.application) AS application,
  JSON_VALUE(json_payload.uri_or_filename) AS uri_or_filename,
  JSON_VALUE(json_payload.ip_protocol) AS ip_protocol,
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "ids.googleapis.com/Endpoint"
  AND JSON_VALUE(json_payload.alert_severity) IN ("HIGH", "CRITICAL")
  AND JSON_VALUE(json_payload.type) = "virus"
ORDER BY timestamp DESC
BigQuery
SELECT
  jsonPayload.alert_time,
  jsonPayload.name,
  jsonPayload.details,
  jsonPayload.application,
  jsonPayload.uri_or_filename,
  jsonPayload.ip_protocol
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].ids_googleapis_com_threat`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND resource.type = "ids.googleapis.com/Endpoint"
  AND jsonPayload.alert_severity IN ("HIGH", "CRITICAL")
  AND jsonPayload.type = "virus"
ORDER BY timestamp DESC
What are the top Cloud DNS queried domains from your VPC network?
The following query lists the top 10 domains queried through Cloud DNS from your VPC networks over the last 60 days. This query assumes that you have enabled Cloud DNS logging for your VPC networks as described in the instructions that are provided by the Enable link in the log scoping tool.
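If you are hunting for lookups of one specific domain rather than the overall top talkers, a minimal Log Analytics sketch follows. The domain example-malicious.test is hypothetical, and the trailing dot is an assumption about the fully qualified form in which names are logged; confirm the exact format against your own entries.

-- Hedged sketch: daily lookup counts for a single (hypothetical) domain
-- in Cloud DNS query logs over the past 60 days.
SELECT
  EXTRACT(DATE FROM timestamp) AS day,
  COUNT(*) AS lookups
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
  AND log_id = "dns.googleapis.com/dns_queries"
  AND JSON_VALUE(json_payload.queryName) = "example-malicious.test."
GROUP BY day
ORDER BY day DESC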
Log Analytics
SELECT
  JSON_VALUE(json_payload.queryName) AS query_name,
  COUNT(*) AS total_queries
FROM `[MY_PROJECT_ID].[MY_LOG_BUCKET_REGION].[MY_LOG_BUCKET_NAME]._AllLogs`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
  AND log_id = "dns.googleapis.com/dns_queries"
GROUP BY query_name
ORDER BY total_queries DESC
LIMIT 10
BigQuery
SELECT
  jsonPayload.queryname AS query_name,
  COUNT(*) AS total_queries
FROM `[MY_PROJECT_ID].[MY_DATASET_ID].dns_googleapis_com_dns_queries`
WHERE
  timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 60 DAY)
GROUP BY query_name
ORDER BY total_queries DESC
LIMIT 10
What's next
Look at how to stream logs from Google Cloud to Splunk.
Explore reference architectures, diagrams, and best practices about Google Cloud. Take a look at our Cloud Architecture Center.