Introduction to audit logs in BigQuery
Logs are text records that are generated in response to particular events or actions. For instance, BigQuery creates log entries for actions such as creating or deleting a table, purchasing slots, or running a load job.
Google Cloud also writes logs, including audit logs that provide insight into operational concerns related to your use of Google Cloud services. For more information about how Google Cloud handles logging, see the Cloud Logging documentation and Cloud Audit Logs overview.
Audit logs versus INFORMATION_SCHEMA views
Your Google Cloud projects contain audit logs only for the resources that are directly within the Google Cloud project. Other Google Cloud resources, such as folders, organizations, and billing accounts, contain their own audit logs.
Audit logs help you answer the question "Who did what, where, and when?" within your Google Cloud resources. Audit logs are the definitive record of system activity by users and of access patterns, and they should be your primary source when investigating audit or security questions.
INFORMATION_SCHEMA views in BigQuery are another source of insights that you can use along with metrics and logs. These views contain metadata about jobs, datasets, tables, and other BigQuery entities. For example, you can get real-time metadata about which BigQuery jobs ran during a specified time. Then, you can group or filter the results by project, user, tables referenced, and other dimensions.
INFORMATION_SCHEMA views provide information that helps you perform a more detailed analysis of your BigQuery workloads, such as the following:
- What is the average slot utilization for all queries over the past seven days for a given project?
- What streaming errors occurred in the past 30 minutes, grouped by error code?
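As a hedged sketch (not an official sample), the first question above, average slot utilization over the past seven days, can be expressed as a query string against the INFORMATION_SCHEMA.JOBS view. The project ID and region qualifier are placeholder assumptions; run the string with a BigQuery client of your choice.

```python
PROJECT = "myproject"   # hypothetical project ID
REGION = "region-us"    # INFORMATION_SCHEMA views are regional

MS_IN_SEVEN_DAYS = 7 * 24 * 60 * 60 * 1000

# Dividing total slot-milliseconds by the milliseconds in the window
# approximates the average number of slots in use over that window.
avg_slot_utilization_sql = f"""
SELECT
  SUM(total_slot_ms) / {MS_IN_SEVEN_DAYS} AS avg_slot_utilization
FROM
  `{PROJECT}.{REGION}.INFORMATION_SCHEMA.JOBS`
WHERE
  job_type = 'QUERY'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
"""

print(avg_slot_utilization_sql)
```

Because INFORMATION_SCHEMA views are regional, the region qualifier must match where your jobs ran.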
BigQuery audit logs contain log entries for API calls, but they don't describe the impact of the API calls. A subset of API calls creates jobs (such as query and load) whose information is captured by INFORMATION_SCHEMA views. For example, you can find information about the time and slots that are utilized by a specific query in INFORMATION_SCHEMA views but not in the audit logs.
To get insights into the performance of your BigQuery workloads in particular, see jobs metadata, streaming metadata, and reservations metadata.
For more information about the types of audit logs that Google Cloud services write, see Types of audit logs.
Audit log format
Google Cloud services write audit logs in a structured JSON format. The base data type for Google Cloud log entries is the LogEntry structure. This structure contains the name of the log, the resource that generated the log entry, the timestamp (UTC), and other basic information.
Logs include details of the logged event in a subfield that's called the payload field. For audit logs, the payload field is named protoPayload. This field's type (protoPayload.@type) is set to type.googleapis.com/google.cloud.audit.AuditLog, which indicates that the field uses the AuditLog log structure.
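As a concrete illustration, the payload can be read out of a log entry with ordinary dictionary access. This is a minimal sketch; the sample entry below is hypothetical but follows the structure just described.

```python
# Hypothetical LogEntry, abbreviated to the fields discussed above.
entry = {
    "logName": "projects/myproject/logs/cloudaudit.googleapis.com%2Factivity",
    "timestamp": "2024-01-01T00:00:00Z",
    "resource": {"type": "bigquery_project"},
    "protoPayload": {
        "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
        "methodName": "google.cloud.bigquery.v2.JobService.InsertJob",
    },
}

# For audit logs, the payload field is protoPayload, and its @type
# identifies the AuditLog structure.
payload = entry.get("protoPayload", {})
is_audit_log = (
    payload.get("@type") == "type.googleapis.com/google.cloud.audit.AuditLog"
)

print(is_audit_log)           # True: this entry uses the AuditLog structure
print(payload["methodName"])  # google.cloud.bigquery.v2.JobService.InsertJob
```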
For operations on datasets, tables, and jobs, BigQuery writes audit logs in two different formats, although both formats share the AuditLog base type.
The older format includes the following fields and values:
- The value for the resource.type field is bigquery_resource.
- BigQuery writes the details about an operation in the protoPayload.serviceData field. The value of this field uses the AuditData log structure.
The newer format includes the following fields and values:
- The value for the resource.type field is either bigquery_project or bigquery_dataset. The bigquery_project resource has log entries about jobs, while the bigquery_dataset resource has log entries about storage.
- BigQuery writes the details about an operation in the protoPayload.metadata field. The value of this field uses the BigQueryAuditMetadata structure.
We recommend consuming logs in the newer format. For more information, see Audit logs migration guide.
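Assuming a given entry carries exactly one of the two payload fields listed above, the distinction can be sketched as a small classifier. The helper name and the sample entries are hypothetical.

```python
def audit_log_format(entry: dict) -> str:
    """Classify a BigQuery audit-log entry by the payload field it uses."""
    payload = entry.get("protoPayload", {})
    if "metadata" in payload:
        return "new"      # BigQueryAuditMetadata in protoPayload.metadata
    if "serviceData" in payload:
        return "old"      # AuditData in protoPayload.serviceData
    return "unknown"      # neither detail field is present


# Hypothetical entries in each format, abbreviated to the relevant fields.
old_entry = {"resource": {"type": "bigquery_resource"},
             "protoPayload": {"serviceData": {}}}
new_entry = {"resource": {"type": "bigquery_project"},
             "protoPayload": {"metadata": {}}}

print(audit_log_format(old_entry))  # old
print(audit_log_format(new_entry))  # new
```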
The following is an abbreviated example of a log entry that shows a failed operation:
{
  "protoPayload": {
    "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
    "status": {
      "code": 5,
      "message": "Not found: Dataset myproject:mydataset was not found in location US"
    },
    "authenticationInfo": { ... },
    "requestMetadata": { ... },
    "serviceName": "bigquery.googleapis.com",
    "methodName": "google.cloud.bigquery.v2.JobService.InsertJob",
    "metadata": {}
  },
  "resource": {
    "type": "bigquery_project",
    "labels": { ... }
  },
  "severity": "ERROR",
  "logName": "projects/myproject/logs/cloudaudit.googleapis.com%2Fdata_access",
  ...
}

For operations on BigQuery reservations, the protoPayload field uses the AuditLog structure, and the protoPayload.request and protoPayload.response fields contain more information. You can find the field definitions in BigQuery Reservation API. For more information, see Monitoring BigQuery reservations.
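A failed entry like the one above can also be inspected programmatically. This is a minimal sketch, assuming the gRPC convention that a nonzero status.code indicates an error (code 5 is NOT_FOUND).

```python
# Hypothetical failed-operation entry, abbreviated from the example above.
entry = {
    "protoPayload": {
        "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
        "status": {
            "code": 5,  # gRPC NOT_FOUND
            "message": "Not found: Dataset myproject:mydataset was not found in location US",
        },
        "methodName": "google.cloud.bigquery.v2.JobService.InsertJob",
    },
    "severity": "ERROR",
}

# A missing or zero status.code means the operation succeeded.
status = entry["protoPayload"].get("status", {})
if status.get("code", 0) != 0:
    print(f"{entry['severity']}: {status['message']}")
```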
For a deeper understanding of the audit log format, see Understand audit logs.
Limitations
Log messages have a size limit of 100,000 bytes. For more information, see Truncated log entry.
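To gauge whether an entry is at risk of truncation, one rough check (a sketch only; the sample entry is hypothetical) is to serialize it and compare the result against the documented 100,000-byte cap.

```python
import json

LOG_ENTRY_SIZE_LIMIT = 100_000  # bytes, per the limit above

# Hypothetical entry with an oversized-looking payload field.
entry = {"protoPayload": {"metadata": {"jobChange": {"job": "x" * 200}}}}

# Serialized size is an approximation of what Logging stores.
size = len(json.dumps(entry).encode("utf-8"))
print(size <= LOG_ENTRY_SIZE_LIMIT)  # True for this small sample
```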
Visibility and access control
BigQuery audit logs can include information that users might consider sensitive, such as SQL text, schema definitions, and identifiers for resources such as tables and datasets. For information about managing access to this information, see the Cloud Logging access control documentation.
What's next
- To learn how to use Cloud Logging to audit activities that are related to policy tags, see Audit policy tags.
- To learn how to use BigQuery to analyze logged activity, see BigQuery audit logs overview.
Last updated 2026-02-18 UTC.