Cloud Logging API overview

The Cloud Logging API lets you programmatically accomplish logging-related tasks, including reading and writing log entries, creating log-based metrics, and managing sinks to route logs.

See the reference documentation for the Logging API.

For details on the limits that apply to your usage of the Logging API, see Logging API quotas and limits.

Service endpoint

A service endpoint is a base URL that specifies the network address of an API service. Logging has both global and regional endpoints. You can use either a global or a regional service endpoint to make requests to Logging.
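As a minimal sketch of what this means in practice, the helper below picks a base URL for Logging API requests. The global endpoint shown is the standard one; the regional-endpoint format (`logging.REGION.rep.googleapis.com`) is an assumption based on Google Cloud's regional service endpoint naming, so verify it against the current endpoint list before relying on it.

```python
# Sketch: choosing a base URL for Logging API requests.
# NOTE: the regional hostname format below is an assumption -- check the
# current Logging documentation for the authoritative endpoint list.
from typing import Optional

def logging_endpoint(region: Optional[str] = None) -> str:
    """Return the Logging service endpoint, global by default."""
    if region is None:
        return "https://logging.googleapis.com"
    return f"https://logging.{region}.rep.googleapis.com"

print(logging_endpoint())                # global endpoint
print(logging_endpoint("europe-west1"))  # regional endpoint for europe-west1
```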

Enable the Logging API

The Logging API must be enabled before you can use it. For instructions, see Enable the Logging API.

Access the Logging API

You can invoke the Logging API indirectly by using a command-line interface or a client library written for a high-level programming language. For more information, see the following reference documentation:

  • To learn how to set up client libraries and authorize the Logging API, with sample code, see Client libraries.
  • To try the API without writing any code, use the APIs Explorer. The APIs Explorer appears on REST API method reference pages in a panel titled Try this API. For instructions, see Using the APIs Explorer.
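If you call the REST API directly instead of through a client library, a write is a POST of a JSON body to the entries.write method. The sketch below assembles such a body by hand; `PROJECT_ID` and the log ID `my-log` are placeholders, and the minimal field set shown (`logName`, `resource`, `entries`) is an assumption about the smallest useful request, not a complete schema.

```python
# Sketch: assembling a JSON body for
# POST https://logging.googleapis.com/v2/entries:write
# PROJECT_ID and "my-log" are placeholders for illustration only.
import json

def build_write_body(project_id, messages):
    """Build a minimal entries.write request body from text messages."""
    return {
        "logName": f"projects/{project_id}/logs/my-log",
        "resource": {"type": "global"},
        "entries": [{"textPayload": m} for m in messages],
    }

body = build_write_body("PROJECT_ID", ["hello", "world"])
print(json.dumps(body, indent=2))
```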

Optimize usage of the Logging API

Following are some tips for using the Logging API effectively.

Read and list logs efficiently

To use your entries.list quota efficiently, try the following:

  • Set a large pageSize: In the request body, you can set the pageSize parameter up to and including the maximum value of an int32 (2,147,483,647). Setting the pageSize parameter to a higher value lets Logging return more entries per query, which reduces the number of queries needed to retrieve the full set of entries that you're targeting.

  • Set a large deadline: When a query nears its deadline, Logging terminates early and returns the log entries scanned so far. If you set a large deadline, then Logging can retrieve more entries per query.

  • Retry quota errors with exponential backoff: If your use case isn't time-sensitive, then you can wait for the quota to replenish before retrying your query. The pageToken parameter is still valid after a delay.
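The tips above can be combined into a single read loop: request large pages, follow the pageToken, and back off exponentially when a quota error occurs. The sketch below is illustrative only; `fetch_page` is a hypothetical stand-in for the real entries.list call, and `QuotaError` stands in for an HTTP 429 response.

```python
# Sketch: paging through entries.list results with exponential backoff on
# quota errors. fetch_page and QuotaError are stand-ins, not real API names.
import random
import time

class QuotaError(Exception):
    """Stand-in for an HTTP 429 / RESOURCE_EXHAUSTED response."""

def list_all_entries(fetch_page, page_size=1000, max_retries=5, initial_delay=1.0):
    """Collect every entry, retrying quota errors with backoff and jitter."""
    entries, token = [], None
    delay, retries = initial_delay, 0
    while True:
        try:
            page, token = fetch_page(page_size=page_size, page_token=token)
        except QuotaError:
            if retries >= max_retries:
                raise
            time.sleep(delay + random.uniform(0, delay / 2))  # backoff + jitter
            delay *= 2
            retries += 1
            continue  # the pageToken is still valid after the delay
        entries.extend(page)
        delay, retries = initial_delay, 0
        if token is None:  # no nextPageToken: every entry retrieved
            return entries

# Demo with a stub: three pages, with one transient quota error in the middle.
pages = [(["e1", "e2"], "t1"), (["e3"], "t2"), (["e4"], None)]
state = {"i": 0, "failed_once": False}

def fake_fetch(page_size, page_token):
    if page_token == "t1" and not state["failed_once"]:
        state["failed_once"] = True
        raise QuotaError()
    page = pages[state["i"]]
    state["i"] += 1
    return page

result = list_all_entries(fake_fetch, initial_delay=0.01)
print(result)  # ['e1', 'e2', 'e3', 'e4']
```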

Write logs efficiently

To use your entries.write quota efficiently, batch more log entries into each request: a larger batch reduces the number of write requests that you make. Logging supports requests with up to 10 MB of data.
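As a sketch of this batching, the helper below groups entries so that each batch stays under a byte limit. The 10 MB figure comes from the text above; estimating an entry's wire size from its JSON length is a simplification for illustration, not the API's exact accounting.

```python
# Sketch: batching log entries so each entries.write request stays under a
# size limit. The JSON-length size estimate is a simplification.
import json

MAX_BATCH_BYTES = 10 * 1024 * 1024  # 10 MB per write request (from the docs)

def batch_entries(entries, max_bytes=MAX_BATCH_BYTES):
    """Split entries into batches whose estimated size fits one request."""
    batches, current, size = [], [], 0
    for entry in entries:
        entry_size = len(json.dumps(entry).encode("utf-8"))
        if current and size + entry_size > max_bytes:
            batches.append(current)
            current, size = [], 0
        current.append(entry)
        size += entry_size
    if current:
        batches.append(current)
    return batches

# Demo: a tiny limit forces multiple batches of small entries.
demo = [{"textPayload": "x" * 40} for _ in range(5)]
print([len(b) for b in batch_entries(demo, max_bytes=120)])  # [2, 2, 1]
```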

Bulk retrieval of log entries

To retrieve log entries, use the entries.list method. However, this method isn't intended for high-volume retrieval of log entries; using it that way can quickly exhaust your quota for read requests.

If you need up-to-date or continuous querying, or bulk retrieval of log entries, then configure sinks to send your log entries to Pub/Sub. When you create a Pub/Sub sink, you route the log entries that you want to process to a Pub/Sub topic, and then consume the log entries from there.

This approach has the following advantages:

  • It doesn't exhaust your read-request quota. For more information about quotas, see Logging usage limits.
  • It captures log entries that might have been written out of order, without workarounds that seek back and re-read recent entries to ensure that nothing was missed.
  • It automatically buffers the log entries if the consumer becomes unavailable.
  • The log entries don't count toward your free allotment because they aren't stored in log buckets.

You can create Pub/Sub sinks to route log entries to a variety of analytics platforms. For an example, see Scenarios for routing Cloud Logging data: Splunk.
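On the consumer side, each Pub/Sub message delivered from a Logging sink carries the LogEntry serialized as JSON in the message data. The sketch below decodes a push-delivery request body; the sample message is constructed here for illustration, and the exact push envelope fields should be verified against the Pub/Sub documentation.

```python
# Sketch: decoding a LogEntry delivered by a Pub/Sub push subscription.
# A push delivery wraps the message in JSON with base64-encoded "data";
# for a Logging sink, that data is the LogEntry as JSON.
import base64
import json

def log_entry_from_push(push_body):
    """Extract the LogEntry dict from a Pub/Sub push request body."""
    data = push_body["message"]["data"]
    return json.loads(base64.b64decode(data))

# Sample push body, constructed for illustration.
sample_entry = {"logName": "projects/PROJECT_ID/logs/my-log",
                "textPayload": "hello from a sink"}
push_body = {"message": {
    "data": base64.b64encode(json.dumps(sample_entry).encode()).decode(),
    "messageId": "1234",
}}
print(log_entry_from_push(push_body)["textPayload"])  # hello from a sink
```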

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-15 UTC.