Write and query log entries using a Python script

This quickstart introduces you to some of the capabilities of Cloud Logging and shows you how to do the following:

  • Write log entries using a Python script.
  • View log entries using a Python script.
  • Delete log entries using a Python script.
  • Route logs to a Cloud Storage bucket.

Logging can route log entries to the following destinations:

  • Cloud Storage buckets
  • BigQuery datasets
  • Pub/Sub topics
  • Logging buckets
  • Google Cloud projects

Before you begin

You must have a Google Cloud project with billing enabled to complete this quickstart. If you don't have a Google Cloud project, or if you don't have billing enabled for your Google Cloud project, do the following:
  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
    Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

    Go to project selector

  3. If you're using an existing project for this guide, verify that you have the permissions required to complete this guide. If you created a new project, then you already have the required permissions.

  4. Verify that billing is enabled for your Google Cloud project.

This quickstart uses Cloud Logging and Cloud Storage. Use of these resources can incur a cost to you. When you finish this quickstart, you can avoid continued billing by deleting the resources that you created. See Clean up on this page for more details.

Required roles

To get the permissions that you need to create, list, and delete log entries and Cloud Storage buckets, ask your administrator to grant you the following IAM roles on your project:

  • Create, list, and delete log entries: Logging Admin (roles/logging.admin)
  • Create, list, and delete Cloud Storage buckets: Storage Admin (roles/storage.admin)

For more information about granting roles, see Manage access to projects, folders, and organizations.

You might also be able to get the required permissions through custom roles or other predefined roles.

The Logs Writer (roles/logging.logWriter) and Logs Viewer (roles/logging.viewer) roles contain the permissions to create and list log entries. To delete log entries, grant the Logging Admin (roles/logging.admin) role, which contains the permissions to create, list, and delete log entries. Note that the Logging Admin (roles/logging.admin) role also grants permissions to perform all actions in Logging.

Getting started

You can use the Cloud Shell environment or a generic Linux environment to complete this quickstart. Python is preinstalled in the Cloud Shell.

Cloud Shell

  1. Open the Cloud Shell and verify your Google Cloud project configuration:

    1. From the Google Cloud console, click Activate Cloud Shell.

      The Cloud Shell opens in a window and displays a welcome message.

    2. The welcome message echoes the configured Google Cloud project ID. If this isn't the Google Cloud project that you want to use, run the following command after replacing PROJECT_ID with your project's ID:

      gcloud config set project PROJECT_ID

Linux

  1. Ensure that Python is installed and configured. For information about preparing your machine for Python development, see Setting up a Python development environment.

  2. Install the Cloud Logging client library:

    pip install --upgrade google-cloud-logging
  3. Set up the Identity and Access Management permissions for your Google Cloud project. In the following steps, you create a service account for your Google Cloud project, and then you generate and download a file to your Linux workstation.

    1. In the Google Cloud console, go to the Service Accounts page:

      Go to Service Accounts

      If you use the search bar to find this page, then select the result whose subheading is IAM & Admin.

    2. Select your quickstart Google Cloud project, and then click Create Service Account:

      • Enter an account name.
      • Enter an account description.
      • Click Create and continue.
    3. Click the Select a role field and select Logging Admin.

    4. Click Done to finish creating the service account.

    5. Create a key file and download it to your workstation:

      • For your service account, click More options, and select Manage keys.
      • In the Keys pane, click Add key.
      • Click Create new key.
      • For the Key type, select JSON and then click Create. After a moment, a window displays a message similar to the following:

        Private key saved to your computer.

  4. On your Linux workstation, provide your authentication credentials to your application by setting the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path to your key file. For example:

    export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/FILE_NAME.json"

    This environment variable only applies to your current shell session, so if you open a new session, set the variable again.
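Before running the quickstart scripts, you can sanity-check that the variable points at a readable service-account key file. The following sketch is illustrative and not part of the quickstart scripts; the helper name `credentials_ok` is an assumption, though the `"type": "service_account"` field is standard in downloaded key files:

```python
import json
import os


def credentials_ok(var="GOOGLE_APPLICATION_CREDENTIALS"):
    """Return True if the variable points at a parseable service-account key file."""
    path = os.environ.get(var)
    if not path or not os.path.isfile(path):
        return False
    try:
        with open(path) as f:
            key = json.load(f)
    except (OSError, json.JSONDecodeError):
        return False
    # Downloaded service-account key files carry a "type" field of "service_account".
    return key.get("type") == "service_account"


if __name__ == "__main__":
    print("credentials OK" if credentials_ok() else "credentials missing or unreadable")
```

If the check fails, re-export the variable with the correct path before running the scripts.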

Clone source

To configure your Cloud Shell for this quickstart, do the following:

  1. Clone the GitHub project python-logging:

    git clone https://github.com/googleapis/python-logging

    The directory samples/snippets contains the two scripts used in this quickstart:

    • snippets.py lets you manage entries in a log.
    • export.py lets you manage log exports.
  2. Change to the snippets directory:

    cd python-logging/samples/snippets

Write log entries

The snippets.py script uses the Python client libraries to write log entries to Logging. When the write option is specified on the command line, the script writes the following log entries:

  • An entry with unstructured data and no specified severity level.
  • An entry with unstructured data and a severity level of ERROR.
  • An entry with JSON structured data and no specified severity level.

To write new log entries to the log my-log, run the snippets.py script with the write option:

python snippets.py my-log write
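The three entries listed above can be modeled as plain payloads; the sketch below is a local illustration only (the actual write in snippets.py goes through the google-cloud-logging client, which requires credentials), and the field names mirror the Logging API's textPayload, jsonPayload, and severity fields:

```python
# The three entries the quickstart writes, modeled as plain payloads.
ENTRIES = [
    {"textPayload": "Hello, world!"},                         # unstructured, default severity
    {"textPayload": "Goodbye, world!", "severity": "ERROR"},  # unstructured, ERROR severity
    {"jsonPayload": {"name": "King Arthur",                   # structured, default severity
                     "quest": "Find the Holy Grail",
                     "favorite_color": "Blue"}},
]


def default_severity_count(entries):
    """Count entries written without an explicit severity."""
    return sum(1 for e in entries if "severity" not in e)


print(default_severity_count(ENTRIES))  # 2 of the 3 entries use the default severity
```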

View log entries

To view the log entries in the Cloud Shell, run the snippets.py script with the list option:

python snippets.py my-log list

The command completes with the result:

Listing entries for logger my-log:
* 2025-11-13T16:05:35.548471+00:00: Hello, world!
* 2025-11-13T16:05:35.647190+00:00: Goodbye, world!
* 2025-11-13T16:05:35.726315+00:00: {u'favorite_color': u'Blue', u'quest': u'Find the Holy Grail', u'name': u'King Arthur'}

If the result doesn't show any entries, then retry the command. It takes a few moments for Logging to receive and process log entries.
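If you script the check, a small retry loop saves manual re-runs. This generic helper is not part of snippets.py; it simply retries any callable until it returns a non-empty result:

```python
import time


def retry_until(fetch, attempts=5, delay=2.0):
    """Call fetch() until it returns a truthy result or attempts run out."""
    result = None
    for i in range(attempts):
        result = fetch()
        if result:
            return result
        if i < attempts - 1:
            time.sleep(delay)  # give Logging time to ingest the entries
    return result


# Usage sketch: pass a function that lists your entries, for example a
# client-library call or a subprocess invocation of snippets.py.
```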

You can also view the log entries you wrote by using the Logs Explorer. For more details, see View logs by using the Logs Explorer.

Delete log entries

To delete all of the log entries in the log my-log, run the snippets.py script with the option delete:

python snippets.py my-log delete

The command completes with the result:

Deleted all logging entries for my-log.

Route logs

In this section, you do the following:

  • Create a Cloud Storage bucket as the destination for your data.
  • Create a sink that copies new log entries to the destination.
  • Update the permissions of your Cloud Storage bucket.
  • Write log entries to Logging.
  • Optionally, verify the content of your Cloud Storage bucket.

Create destination

The export destination for this quickstart is a Cloud Storage bucket. To create a Cloud Storage bucket, do the following:

  1. In the Google Cloud console, go to the Buckets page:

    Go to Buckets

    If you use the search bar to find this page, then select the result whose subheading is Cloud Storage.

  2. Click Create bucket.
  3. Enter a name for your bucket. This quickstart uses the name myloggingproject-1.
  4. For Location type, select Region, which selects a bucket location with the lowest latency.
  5. For Set a default class, select Standard.
  6. For Access control, select Fine-grained.
  7. For Protection tools, select None, and then click Create.

Create sink

A sink is a rule that determines whether Logging routes a newly arrived log entry to a destination. A sink has three attributes:

  • Name
  • Destination
  • Filter

For more information about sinks, see About log sinks.

If a newly arrived log entry meets the query conditions, then that log entryis routed to the destination.

The export.py script uses the Python client libraries to create, list, modify, and delete sinks. To create the sink mysink, which exports all log entries with a severity of at least INFO to the Cloud Storage bucket myloggingproject-1, run the following command:

python export.py create mysink myloggingproject-1 "severity>=INFO"

The script returns the following:

Created sink mysink

To view your sinks, run the export.py script with the list option:

python export.py list

The script returns the following:

mysink: severity>=INFO -> storage.googleapis.com/myloggingproject-1

Update destination permissions

The permissions of the destination, in this case your Cloud Storage bucket, aren't modified when you create a sink by using the export.py script. You must change the permission settings of your Cloud Storage bucket to grant your sink permission to write to it. For information about service accounts, access scopes, and Identity and Access Management roles, see Service accounts.

To update the permissions on your Cloud Storage bucket:

  1. Identify your sink's Writer Identity:

    1. In the Google Cloud console, go to the Log Router page:

      Go to Log Router

      If you use the search bar to find this page, then select the result whose subheading is Logging.

      You see a summary table of your sinks.

    2. Find your sink in the table, select Menu, and then select View sink details.

    3. Copy the writer identity to your clipboard.

  2. In the Google Cloud console, go to the Buckets page:

    Go to Buckets

    If you use the search bar to find this page, then select the result whose subheading is Cloud Storage.

  3. To open the detailed view, click the name of your bucket.

  4. Select Permissions and click Grant Access.

  5. Paste the writer identity into the New principals box. Remove the serviceAccount: prefix from the writer identity address.

  6. Set the Role to Storage Admin, then click Save.

For more information, see Set destination permissions.

Validate sink

To validate that your sink and destination are properly configured, do the following:

  1. Write new log entries to the log my-log:

    python snippets.py my-log write
  2. View your Cloud Storage bucket's contents:

    1. In the Google Cloud console, go to the Buckets page:

      Go to Buckets

      If you use the search bar to find this page, then select the result whose subheading is Cloud Storage.

    2. To open the detailed view, click the name of your bucket. The detailed view lists the folders that contain data. If there isn't data in your bucket, the following message is displayed:

      There are no live objects in this bucket.

      As described in Late-arriving log entries, it might take 2 or 3 hours before the first entries appear at the destination, or before you are notified of a configuration error.

      After your bucket has received data, the detailed view shows a result similar to:

      Bucket contents detailed view.

    3. The data is organized in a hierarchy of folders: the top-level folder is the log name, followed successively by year, month, and day subfolders. To view the data that was exported by your sink, click the folder name my-log, and then click through the year, month, and day subfolders until you reach a file that ends with json:

      Bucket contents subfolder view.

    4. The JSON file contains the log entries that were exported to your Cloud Storage bucket. Click the name of the JSON file to see its contents. The contents are similar to:

      {
        "insertId": "yf1cshfoivz48",
        "logName": "projects/loggingproject-222616/logs/my-log",
        "receiveTimestamp": "2018-11-15T23:06:14.738729911Z",
        "resource": {
          "labels": {
            "project_id": "loggingproject-222616"
          },
          "type": "global"
        },
        "severity": "ERROR",
        "textPayload": "Goodbye, world!",
        "timestamp": "2018-11-15T23:06:14.738729911Z"
      }

      Because the severity level of ERROR is greater than the severity level of INFO, the log entry containing the string "Goodbye, world!" was exported to the sink destination. The other log entries weren't exported to the destination because their severity level was set to the default value, and the default severity level is less than INFO.
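The comparison above follows Logging's numeric severity ranking. The sketch below is a local illustration of a severity>=INFO filter, not the service's query engine; the rank values mirror the Cloud Logging LogSeverity enum:

```python
# Numeric ranks from the Cloud Logging LogSeverity enum.
SEVERITY_RANK = {
    "DEFAULT": 0, "DEBUG": 100, "INFO": 200, "NOTICE": 300,
    "WARNING": 400, "ERROR": 500, "CRITICAL": 600,
    "ALERT": 700, "EMERGENCY": 800,
}


def matches_filter(entry, minimum="INFO"):
    """Local illustration of a severity>=MINIMUM sink filter."""
    severity = entry.get("severity", "DEFAULT")
    return SEVERITY_RANK[severity] >= SEVERITY_RANK[minimum]


entries = [
    {"textPayload": "Hello, world!"},                         # default severity
    {"textPayload": "Goodbye, world!", "severity": "ERROR"},
    {"jsonPayload": {"name": "King Arthur"}},                 # default severity
]
exported = [e for e in entries if matches_filter(e)]
print(exported)  # only the ERROR entry passes the filter and reaches the bucket
```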

Troubleshooting

There are several reasons why a Cloud Storage bucket might be empty:

  • The bucket hasn't received data. It might take 2 or 3 hours before the first entries appear at the destination, or before you are notified of a configuration error. For more information, see Late-arriving log entries.

  • There is a configuration error. In this case, you receive an email message with a subject line similar to the following:

     [ACTION REQUIRED] Logging export config error in myloggingproject.

    The content of the email body describes the configuration issue. For example, if you don't update your destination permissions, then the email lists the following error code:

     bucket_permission_denied

    To correct this particular condition, see Update destination permissions on this page.

  • No log entries were written after the sink was created. The sink is applied only to newly arriving log entries. To correct this situation, write new log entries:

    python snippets.py my-log write

Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.

  1. (Optional) Delete the log entries you created. If you don't delete your log entries, they will expire and be removed. See Quotas and limits.

    To delete all log entries in the log my-log, run the following command:

    python snippets.py my-log delete
  2. Delete your Google Cloud project or delete your quickstart resources.

    • To delete your Google Cloud project, from the Google Cloud console Project Info pane, click Go to project settings, and then click Shut down.

    • To delete your quickstart resources:

      1. Delete your sink by running the following command:

        python export.py delete mysink
      2. Delete your Cloud Storage bucket. Go to the Google Cloud console and click Storage > Buckets. Place a check in the box next to your bucket name, and then click Delete.

Except as otherwise noted, the content of this page is licensed under theCreative Commons Attribution 4.0 License, and code samples are licensed under theApache 2.0 License. For details, see theGoogle Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2026-02-19 UTC.