3.0.0 Migration Guide
New Required Dependencies
Some of the previously optional dependencies are now required in 3.x versions of the library, namely google-cloud-bigquery-storage (minimum version 2.0.0) and pyarrow (minimum version 3.0.0).
The behavior of some of the package “extras” has thus also changed:
- The pandas extra now requires the db-dtypes package.
- The bqstorage extra has been preserved for compatibility reasons, but it is now a no-op and should be omitted when installing the BigQuery client library.

  Before:

  $ pip install google-cloud-bigquery[bqstorage]

  After:

  $ pip install google-cloud-bigquery
- The bignumeric_type extra has been removed, as the BIGNUMERIC type is now automatically supported. That extra should thus not be used.

  Before:

  $ pip install google-cloud-bigquery[bignumeric_type]

  After:

  $ pip install google-cloud-bigquery
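To verify that an existing environment already satisfies the new dependency constraints, you can check the installed versions at runtime. A minimal sketch, assuming both packages expose a __version__ attribute (recent releases do):

import google.cloud.bigquery_storage
import pyarrow

# Both should meet the minimums required by google-cloud-bigquery 3.x.
print(google.cloud.bigquery_storage.__version__)  # expected >= 2.0.0
print(pyarrow.__version__)                        # expected >= 3.0.0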
Type Annotations

The library is now type-annotated and declares itself as such. If you use a static type checker such as mypy, you might start getting errors in places where the google-cloud-bigquery package is used.

It is recommended to update your code and/or type annotations to fix these errors, but if this is not feasible in the short term, you can temporarily ignore type annotations in google-cloud-bigquery, for example by using a special # type: ignore comment:

from google.cloud import bigquery  # type: ignore

But again, this is only recommended as a possible short-term workaround if immediately fixing the type check errors in your project is not feasible.
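If you would rather keep # type: ignore comments out of your source files, the checker can usually be told to skip the package in its configuration instead. A minimal sketch for mypy, assuming a mypy.ini file; follow_imports = skip makes mypy treat the matched modules as untyped (Any):

[mypy]
# global options go here

[mypy-google.cloud.bigquery.*]
# Treat the package as untyped rather than reporting errors at use sites.
follow_imports = skip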
Re-organized Types
The auto-generated parts of the library have been removed, and proto-based types formerly found in google.cloud.bigquery_v2 have been replaced by the new implementation (but see the section below).

For example, the standard SQL data types should now be imported from a new location:
Before:

from google.cloud.bigquery_v2 import StandardSqlDataType
from google.cloud.bigquery_v2.types import StandardSqlField
from google.cloud.bigquery_v2.types.standard_sql import StandardSqlStructType

After:

from google.cloud.bigquery import StandardSqlDataType
from google.cloud.bigquery.standard_sql import StandardSqlField
from google.cloud.bigquery.standard_sql import StandardSqlStructType

The TypeKind enum defining all possible SQL types for schema fields has been renamed and is no longer nested under StandardSqlDataType:

Before:

from google.cloud.bigquery_v2 import StandardSqlDataType

if field_type == StandardSqlDataType.TypeKind.STRING:
    ...

After:

from google.cloud.bigquery import StandardSqlTypeNames

if field_type == StandardSqlTypeNames.STRING:
    ...

Issuing queries with Client.create_job preserves destination table
The Client.create_job method no longer removes the destination table from a query job's configuration. The destination table for the query can thus be explicitly defined by the user.
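For example, a destination table can now be set directly on the configuration passed to Client.create_job. A minimal sketch, assuming the REST-style dictionary form of the job configuration and placeholder project, dataset, and table IDs:

from google.cloud import bigquery

client = bigquery.Client()

# "my-project", "my_dataset", and "my_table" are placeholder IDs.
job = client.create_job(
    job_config={
        "query": {
            "query": "SELECT 17 AS answer",
            "useLegacySql": False,
            "destinationTable": {
                "projectId": "my-project",
                "datasetId": "my_dataset",
                "tableId": "my_table",
            },
        }
    }
)
job.result()  # wait for the job to finish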
Changes to data types when reading a pandas DataFrame
The default dtypes returned by the to_dataframe method have changed.

- Now, the BigQuery BOOLEAN data type maps to the pandas boolean dtype. Previously, this mapped to the pandas bool dtype when the column did not contain NULL values and the pandas object dtype when NULL values are present.
- Now, the BigQuery INT64 data type maps to the pandas Int64 dtype. Previously, this mapped to the pandas int64 dtype when the column did not contain NULL values and the pandas float64 dtype when NULL values are present.
- Now, the BigQuery DATE data type maps to the pandas dbdate dtype, which is provided by the db-dtypes package. If any date value is outside of the range of pandas.Timestamp.min (1677-09-22) and pandas.Timestamp.max (2262-04-11), the data type maps to the pandas object dtype. The date_as_object parameter has been removed.
- Now, the BigQuery TIME data type maps to the pandas dbtime dtype, which is provided by the db-dtypes package.
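A short sketch of the new defaults, assuming a configured client (the query values are arbitrary):

from google.cloud import bigquery

client = bigquery.Client()
df = client.query(
    "SELECT TRUE AS flag, 42 AS amount, DATE '2021-06-01' AS day, TIME '12:00:00' AS moment"
).to_dataframe()

# Expected dtypes under 3.x: flag -> boolean, amount -> Int64,
# day -> dbdate, moment -> dbtime.
print(df.dtypes)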
Changes to data types when loading a pandas DataFrame
In the absence of schema information, pandas columns with naive datetime64[ns] values, i.e. without timezone information, are recognized and loaded using the DATETIME type. On the other hand, for columns with timezone-aware datetime64[ns, UTC] values, the TIMESTAMP type continues to be used.
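For instance, when loading a DataFrame without an explicit schema, the naive column below would be sent as DATETIME and the timezone-aware one as TIMESTAMP; a sketch with a placeholder table ID:

import pandas as pd
from google.cloud import bigquery

client = bigquery.Client()
df = pd.DataFrame(
    {
        "created": pd.to_datetime(["2021-06-01 12:00:00"]),            # naive -> DATETIME
        "updated": pd.to_datetime(["2021-06-01 12:00:00"], utc=True),  # aware -> TIMESTAMP
    }
)
job = client.load_table_from_dataframe(df, "my_dataset.my_table")  # placeholder table ID
job.result()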
Changes to Model, Client.get_model, Client.update_model, and Client.list_models

The types of several Model properties have been changed.

- Model.feature_columns now returns a sequence of google.cloud.bigquery.standard_sql.StandardSqlField.
- Model.label_columns now returns a sequence of google.cloud.bigquery.standard_sql.StandardSqlField.
- Model.model_type now returns a string.
- Model.training_runs now returns a sequence of dictionaries, as received from the BigQuery REST API.
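A short sketch of reading these properties under 3.x, assuming an existing model and a placeholder model ID:

from google.cloud import bigquery

client = bigquery.Client()
model = client.get_model("my_dataset.my_model")  # placeholder model ID

print(model.model_type)  # now a plain string, e.g. "LINEAR_REGRESSION"
for column in model.feature_columns:  # StandardSqlField instances
    print(column.name, column.type.type_kind)
for run in model.training_runs:  # plain dictionaries from the REST API
    print(run.get("startTime"))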
Legacy Protocol Buffers Types
For compatibility reasons, the legacy proto-based types still exist as static code and can be imported:

from google.cloud.bigquery_v2 import Model  # a subclass of proto.Message

Note, however, that importing them will issue a warning, because aside from being importable, these types are not maintained anymore. They may differ both from the types in google.cloud.bigquery, and from the types supported on the backend.
Maintaining compatibility with google-cloud-bigquery version 2.0

If you maintain a library or system that needs to support both google-cloud-bigquery version 2.x and 3.x, it is recommended that you detect when version 2.x is in use and convert properties that use the legacy protocol buffer types, such as Model.training_runs, into the types used in 3.x.
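One possible approach is sketched below; the helper name is hypothetical, and the sketch assumes the packaging library is available for version parsing:

import google.cloud.bigquery
from packaging import version

_IS_V2 = version.parse(google.cloud.bigquery.__version__).major < 3

def training_runs_as_dicts(model):
    # Hypothetical helper: return training runs as plain dictionaries on
    # both 2.x (protocol buffer objects) and 3.x (already dictionaries).
    if _IS_V2:
        return [run.to_dict() for run in model.training_runs]
    return list(model.training_runs)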
Call the to_dict method on the protocol buffers objects to get a JSON-compatible dictionary.

from google.cloud.bigquery_v2 import Model

training_run: Model.TrainingRun = ...
training_run_dict = training_run.to_dict()

2.0.0 Migration Guide
The 2.0 release of the google-cloud-bigquery client drops support for Python versions below 3.6. The client surface itself has not changed, but the 1.x series will not be receiving any more feature updates or bug fixes. You are thus encouraged to upgrade to the 2.x series.
If you experience issues or have questions, please file an issue.
Supported Python Versions
WARNING: Breaking change
The 2.0.0 release requires Python 3.6+.
Supported BigQuery Storage Clients
The 2.0.0 release requires BigQuery Storage >= 2.0.0, which dropped support for the v1beta1 and v1beta2 versions of the BigQuery Storage API. If you want to use a BigQuery Storage client, it must be the one supporting the v1 API version.
Changed GAPIC Enums Path
WARNING: Breaking change
Generated GAPIC enum types have been moved under types. Import paths need to be adjusted.
Before:
from google.cloud.bigquery_v2.gapic import enums

distance_type = enums.Model.DistanceType.COSINE

After:
from google.cloud.bigquery_v2 import types

distance_type = types.Model.DistanceType.COSINE