Direct Library Usage

We recommend that you use the google-cloud-logging library by integrating it with the Python logging standard library. However, you can also use the library to interact with the Google Cloud Logging API directly.

In addition to writing logs, you can use the library to manage logs, sinks, metrics, and other resources.

Setup

Create a Client

You must set up a Client to use the library:

import google.cloud.logging

# if project is not given, it will be inferred from the environment
client = google.cloud.logging.Client(project="my-project")

To use HTTP, disable gRPC when you set up the Client:

http_client = google.cloud.logging.Client(_use_grpc=False)

Create a Logger

Loggers read, write, and delete logs from Google Cloud.

You use your Client to create a Logger.

client = google.cloud.logging.Client(project="my-project")
logger = client.logger(name="log_id")
# logger will bind to logName "projects/my-project/logs/log_id"

To add custom labels, do so when you initialize a Logger. These labels are then added to each LogEntry written by the Logger:

custom_labels = {"my-key": "my-value"}
label_logger = client.logger(log_id, labels=custom_labels)

By default, the library adds a Monitored Resource field associated with the environment the code is run on. For example, code run on App Engine will have a gae_app resource, while code run locally will have a global resource field.

To manually set the resource field, do so when you initialize the Logger:

from google.cloud.logging_v2.resource import Resource

resource = Resource(type="global", labels={})
global_logger = client.logger(log_id, resource=resource)

Write Log Entries

You write logs by using Logger.log:

logger.log("A simple entry")  # API call

You can add LogEntry fields by passing them as keyword arguments:

logger.log(
    "an entry with fields set",
    severity="ERROR",
    insert_id="0123",
    labels={"my-label": "my-value"},
)  # API call

Logger.log chooses the appropriate LogEntry type based on the input type. To specify the type explicitly, you can use the following Logger methods:

- Logger.log_text: creates a TextEntry from a string payload
- Logger.log_struct: creates a StructEntry from a dict payload
- Logger.log_proto: creates a ProtobufEntry from a protobuf payload
- Logger.log_empty: creates a LogEntry with no payload
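As a rough illustration of that dispatch (a pure-Python sketch, not the library's actual implementation), Logger.log behaves as if it routed payloads by type like this:

```python
# Pure-Python sketch (NOT the library's source) of how Logger.log picks a
# typed logging method from the payload's Python type.
def pick_log_method(payload):
    """Return the name of the Logger method a payload would map to."""
    if payload is None:
        return "log_empty"    # no payload
    if isinstance(payload, str):
        return "log_text"     # text payload -> TextEntry
    if isinstance(payload, dict):
        return "log_struct"   # JSON-like payload -> StructEntry
    return "log_proto"        # protobuf message payload -> ProtobufEntry

print(pick_log_method("A simple entry"))   # → log_text
print(pick_log_method({"key": "value"}))   # → log_struct
```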

Batch Write Logs

By default, each log write takes place in an individual network request, which may be inefficient at scale.

Using the Batch class, logs are batched together and only sent out when batch.commit is called.

batch = logger.batch()
batch.log("first log")
batch.log("second log")
batch.commit()

To simplify things, you can also use Batch as a context manager:

with logger.batch() as batch:
    batch.log("first log")
    # do work
    batch.log("last log")

In the previous example, the logs are automatically committed when the code exits the “with” block.
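The commit-on-exit behavior can be pictured with a minimal pure-Python sketch (an illustration of the pattern only, not the library's implementation): entries are buffered locally, and the buffer is flushed exactly once when the with block exits.

```python
# Minimal sketch of the batching pattern (illustration only): log() buffers
# locally; commit() sends everything in one call; __exit__ auto-commits.
class SketchBatch:
    def __init__(self, send):
        self._send = send        # callable invoked once with all buffered entries
        self._entries = []

    def log(self, entry):
        self._entries.append(entry)   # no network call here

    def commit(self):
        if self._entries:
            self._send(list(self._entries))   # one "API call" for the batch
            self._entries.clear()

    def __enter__(self):
        return self

    def __exit__(self, *exc_info):
        self.commit()            # auto-commit when the with-block exits

sent = []
with SketchBatch(sent.append) as batch:
    batch.log("first log")
    batch.log("last log")
print(sent)  # → [['first log', 'last log']]
```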

Retrieve Log Entries

You retrieve log entries for the default project using list_entries() on a Client or Logger object:

for entry in client.list_entries():  # API call(s)
    do_something_with(entry)

Entries returned by Client.list_entries() or Logger.list_entries() are instances of TextEntry, StructEntry, or ProtobufEntry, depending on the payload type.

To filter retrieved entries, use the Advanced Logs Filters syntax. For example, to fetch filtered entries for the default project:

filter_str = "logName:log_name AND textPayload:simple"
for entry in client.list_entries(filter_=filter_str):  # API call(s)
    do_something_with(entry)

To sort entries in descending timestamp order:

from google.cloud.logging import DESCENDING

for entry in client.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)

To retrieve entries for a single logger, sorting in descending timestamp order:

from google.cloud.logging import DESCENDING

for entry in logger.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)

For example, to retrieve all GKE Admin Activity audit logs from the past 24 hours:

import google.cloud.logging
from datetime import datetime, timedelta, timezone
import os

# pull your project id from an environment variable
project_id = os.environ["GOOGLE_CLOUD_PROJECT"]
# construct a date object representing yesterday
yesterday = datetime.now(timezone.utc) - timedelta(days=1)
# Cloud Logging expects a timestamp in RFC3339 UTC "Zulu" format
# https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry
time_format = "%Y-%m-%dT%H:%M:%S.%f%z"
# build a filter that returns GKE Admin Activity audit Logs from
# the past 24 hours
# https://cloud.google.com/kubernetes-engine/docs/how-to/audit-logging
filter_str = (
    f'logName="projects/{project_id}/logs/cloudaudit.googleapis.com%2Factivity"'
    f' AND resource.type="k8s_cluster"'
    f' AND timestamp>="{yesterday.strftime(time_format)}"'
)
# query and print all matching logs
client = google.cloud.logging.Client()
for entry in client.list_entries(filter_=filter_str):
    print(entry)

Delete Log Entries

To delete all logs associated with a logger, use the following call:

logger.delete()  # API call

Manage Log Metrics

Logs-based metrics are counters of entries which match a given filter. They can be used within Cloud Monitoring to create charts and alerts.

To list all logs-based metrics for a project:

for metric in client.list_metrics():  # API call(s)
    do_something_with(metric)

To create a logs-based metric:

metric = client.metric(metric_name, filter_=filter, description=description)
assert not metric.exists()  # API call
metric.create()  # API call
assert metric.exists()  # API call

To refresh local information about a logs-based metric:

existing_metric = client.metric(metric_name)
existing_metric.reload()  # API call

To update a logs-based metric:

existing_metric.filter_ = updated_filter
existing_metric.description = updated_description
existing_metric.update()  # API call

To delete a logs-based metric:

metric.delete()  # API call

Log Sinks

Sinks allow exporting of log entries which match a given filter to Cloud Storage buckets, BigQuery datasets, or Cloud Pub/Sub topics.
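Each sink type uses a destination string of a fixed form, shown in the sections that follow. As an illustrative sketch (the helper below is hypothetical, not part of the library), the three formats can be assembled like this:

```python
# Hypothetical helper (NOT part of google-cloud-logging) assembling the
# destination strings used by the three sink types shown below.
def sink_destination(kind, *, bucket=None, dataset_path=None, topic_name=None):
    if kind == "storage":
        # bucket is a Cloud Storage bucket name
        return "storage.googleapis.com/%s" % bucket
    if kind == "bigquery":
        # dataset_path already begins with "/projects/...", so no extra slash
        return "bigquery.googleapis.com%s" % dataset_path
    if kind == "pubsub":
        # topic_name has the form "projects/<project>/topics/<topic>"
        return "pubsub.googleapis.com/%s" % topic_name
    raise ValueError("unknown sink kind: %s" % kind)

print(sink_destination("storage", bucket="my-bucket"))
# → storage.googleapis.com/my-bucket
```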

Cloud Storage Sink

Ensure the storage bucket that you want to export logs to has cloud-logs@google.com as an owner. See Setting permissions for Cloud Storage.

Ensure that cloud-logs@google.com is an owner of the bucket:

bucket.acl.reload()  # API call
logs_group = bucket.acl.group("cloud-logs@google.com")
logs_group.grant_owner()
bucket.acl.add_entity(logs_group)
bucket.acl.save()  # API call

To create a Cloud Storage sink:

destination = "storage.googleapis.com/%s" % (bucket.name,)
sink = client.sink(sink_name, filter_=filter, destination=destination)
assert not sink.exists()  # API call
sink.create()  # API call
assert sink.exists()  # API call

BigQuery Sink

To export logs to BigQuery, you must log into the Cloud Console and add cloud-logs@google.com to a dataset.

See: Setting permissions for BigQuery

from google.cloud.bigquery.dataset import AccessEntry

entry_list = dataset.access_entries
entry_list.append(AccessEntry("WRITER", "groupByEmail", "cloud-logs@google.com"))
dataset.access_entries = entry_list
client.update_dataset(dataset, ["access_entries"])  # API call

To create a BigQuery sink:

destination = "bigquery.googleapis.com%s" % (dataset.path,)
sink = client.sink(sink_name, filter_=filter_str, destination=destination)
assert not sink.exists()  # API call
sink.create()  # API call
assert sink.exists()  # API call

Pub/Sub Sink

To export logs to Pub/Sub, you must log into the Cloud Console and add cloud-logs@google.com to a topic.

See: Setting permissions for Pub/Sub

topic_path = client.topic_path(project_id, topic_id)
topic = client.create_topic(request={"name": topic_path})

policy = client.get_iam_policy(request={"resource": topic_path})  # API call
policy.bindings.add(role="roles/owner", members=["group:cloud-logs@google.com"])
client.set_iam_policy(
    request={"resource": topic_path, "policy": policy}
)  # API call

To create a Cloud Pub/Sub sink:

destination = "pubsub.googleapis.com/%s" % (topic.name,)
sink = client.sink(sink_name, filter_=filter_str, destination=destination)
assert not sink.exists()  # API call
sink.create()  # API call
assert sink.exists()  # API call

Manage Sinks

To list all sinks for a project:

for sink in client.list_sinks():  # API call(s)
    do_something_with(sink)

To refresh local information about a sink:

existing_sink = client.sink(sink_name)
existing_sink.reload()  # API call

To update a sink:

existing_sink.filter_ = updated_filter
existing_sink.update()  # API call

To delete a sink:

sink.delete()  # API call

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-10-30 UTC.