Use the BigQuery JupyterLab plugin
Preview
This product or feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA products and features are available "as is" and might have limited support. For more information, see the launch stage descriptions.
To request feedback or support for this feature, send an email to bigquery-ide-plugin@google.com.
This document shows you how to install and use the BigQuery JupyterLab plugin to do the following:
- Explore your BigQuery data.
- Use the BigQuery DataFrames API.
- Deploy a BigQuery DataFrames notebook to Cloud Composer.
The BigQuery JupyterLab plugin includes all the functionality of the Dataproc JupyterLab plugin, such as creating a Dataproc Serverless runtime template, launching and managing notebooks, developing with Apache Spark, deploying your code, and managing your resources.
Install the BigQuery JupyterLab plugin
To install and use the BigQuery JupyterLab plugin, follow these steps:

1. In your local terminal, check to make sure you have Python 3.8 or later installed on your system:

   ```
   python3 --version
   ```

2. In your local terminal, initialize the gcloud CLI:

   ```
   gcloud init
   ```

3. Install Pipenv, a Python virtual environment tool:

   ```
   pip3 install pipenv
   ```

4. Create a new virtual environment:

   ```
   pipenv shell
   ```

5. Install JupyterLab in the new virtual environment:

   ```
   pipenv install jupyterlab
   ```

6. Install the BigQuery JupyterLab plugin:

   ```
   pipenv install bigquery-jupyter-plugin
   ```

7. If your installed version of JupyterLab is earlier than 4.0.0, then enable the plugin extension:

   ```
   jupyter server extension enable bigquery_jupyter_plugin
   ```

8. Launch JupyterLab:

   ```
   jupyter lab
   ```

   JupyterLab opens in your browser.
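To confirm that the plugin package installed into the virtual environment, you can run a quick import check. This is a minimal sketch, assuming only that the Python module name matches the extension name used in the enable step above:

```python
# Run inside the pipenv virtual environment.
# If the import succeeds, the plugin package is installed and importable.
import bigquery_jupyter_plugin

print(bigquery_jupyter_plugin.__file__)
```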
Note: If you get an SSL: CERTIFICATE_VERIFY_FAILED error in your terminal when you launch JupyterLab, update your Python SSL certificate by executing /Applications/Python 3.11/Install Certificates.command. This file is located in the Python home directory.
Update your project and region settings
By default, your session runs in the project and region that you set when you ran gcloud init. To change the project and region settings for your session, do the following:
- In the JupyterLab menu, click Settings > Google BigQuery Settings.
You must restart the plugin for the changes to take effect.
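These settings apply to the plugin session. If you also work with BigQuery DataFrames in code, the billing project and location can be set programmatically as well. A minimal sketch, assuming the bigframes package is installed and using placeholder values:

```python
import bigframes.pandas as bpd

# Placeholder values; replace with your own project ID and BigQuery region.
bpd.options.bigquery.project = "your-project-id"
bpd.options.bigquery.location = "US"
```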
Explore data
To work with your BigQuery data in JupyterLab, do the following:
- In the JupyterLab sidebar, open the Dataset Explorer pane: click the datasets icon.
- To expand a project, in the Dataset Explorer pane, click the expander arrow next to the project name.

The Dataset Explorer pane shows all of the datasets in a project that are located in the BigQuery region that you configured for the session. You can interact with a project and dataset in various ways:
- To view information about a dataset, click the name of the dataset.
- To display all of the tables in a dataset, click the expander arrow next to the dataset.
- To view information about a table, click the name of the table.
- To change the project or BigQuery region, update your settings.
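The same exploration can also be scripted. The following is a minimal sketch using the google-cloud-bigquery client library, which is separate from the plugin itself; "your-project-id" is a placeholder:

```python
from google.cloud import bigquery

# List every dataset in the project, and the tables in each dataset.
client = bigquery.Client(project="your-project-id")  # placeholder project ID
for dataset in client.list_datasets():
    print(dataset.dataset_id)
    for table in client.list_tables(dataset.dataset_id):
        print(f"  {table.table_id}")
```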
Execute notebooks
To query your BigQuery data from JupyterLab, do the following:
- To open the launcher page, click File > New Launcher.
- In the BigQuery Notebooks section, click the BigQuery DataFrames card. A new notebook opens that shows you how to get started with BigQuery DataFrames.
BigQuery DataFrames notebooks support Python development in a local Python kernel. BigQuery DataFrames operations are executed remotely on BigQuery, but the rest of the code is executed locally on your machine. When an operation is executed in BigQuery, a query job ID and a link to the job appear below the code cell (see the example after this list).
- To view the job in the Google Cloud console, click Open Job.
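For example, the following sketch runs a BigQuery DataFrames aggregation remotely; the public table is used only for illustration:

```python
import bigframes.pandas as bpd

# Reading the table and the aggregation below run as BigQuery query jobs;
# a job ID and link appear below the cell in JupyterLab.
df = bpd.read_gbq("bigquery-public-data.usa_names.usa_1910_2013")
top_names = df.groupby("name")["number"].sum().sort_values(ascending=False).head(10)

# Materializing the result brings only this small frame to the local kernel.
print(top_names.to_pandas())
```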
Deploy a BigQuery DataFrames notebook
You can deploy a BigQuery DataFrames notebook to Cloud Composer by using a Dataproc Serverless runtime template. You must use runtime version 2.1 or later.
- In your JupyterLab notebook, click Job Scheduler.
- For Job name, enter a unique name for your job.
- For Environment, enter the name of the Cloud Composer environment to which you want to deploy the job.
- If your notebook is parameterized, add parameters (see the sketch after these steps).
- Enter the name of the Serverless runtime template.
- To handle notebook execution failures, enter an integer for Retry count and a value (in minutes) for Retry delay.
- Select which execution notifications to send, and then enter the recipients. Notifications are sent using the Airflow SMTP configuration.
- Select a schedule for the notebook.
- Click Create.
When you successfully schedule your notebook, it appears on the list of scheduled jobs in your selected Cloud Composer environment.
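For the parameters step, one common convention for parameterized notebooks (a papermill-style assumption, not documented plugin behavior) is to define overridable defaults in a dedicated cell tagged "parameters":

```python
# Cell tagged "parameters": default values that a papermill-style scheduler
# can override with the parameters you enter in the Job Scheduler form.
source_table = "bigquery-public-data.usa_names.usa_1910_2013"  # placeholder
row_limit = 100
```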
What's next
- Try the BigQuery DataFrames quickstart.
- Learn more about the BigQuery DataFrames Python API.
- Use JupyterLab for serverless batch and notebook sessions with Dataproc.