3.0.0 Migration Guide

New Required Dependencies

Some of the previously optional dependencies are now required in 3.x versions of the library, namely google-cloud-bigquery-storage (minimum version 2.0.0) and pyarrow (minimum version 3.0.0).

The behavior of some of the package “extras” has thus also changed:

  • The pandas extra now requires the db-dtypes package.

  • The bqstorage extra has been preserved for compatibility reasons, but it is now a no-op and should be omitted when installing the BigQuery client library.

Before:

$ pip install google-cloud-bigquery[bqstorage]

After:

$ pip install google-cloud-bigquery
  • The bignumeric_type extra has been removed, as the BIGNUMERIC type is now automatically supported. That extra should thus not be used.

Before:

$ pip install google-cloud-bigquery[bignumeric_type]

After:

$ pip install google-cloud-bigquery

Type Annotations

The library is now type-annotated and declares itself as such. If you use a static type checker such as mypy, you might start getting errors in places where the google-cloud-bigquery package is used.

It is recommended to update your code and/or type annotations to fix these errors, but if this is not feasible in the short term, you can temporarily ignore type annotations in google-cloud-bigquery, for example by using a special # type: ignore comment:

from google.cloud import bigquery  # type: ignore

But again, this is only recommended as a possible short-term workaround if immediately fixing the type check errors in your project is not feasible.

Re-organized Types

The auto-generated parts of the library have been removed, and proto-based types formerly found in google.cloud.bigquery_v2 have been replaced by the new implementation (but see the section below).

For example, the standard SQL data types should now be imported from a new location:

Before:

from google.cloud.bigquery_v2 import StandardSqlDataType
from google.cloud.bigquery_v2.types import StandardSqlField
from google.cloud.bigquery_v2.types.standard_sql import StandardSqlStructType

After:

from google.cloud.bigquery import StandardSqlDataType
from google.cloud.bigquery.standard_sql import StandardSqlField
from google.cloud.bigquery.standard_sql import StandardSqlStructType

The TypeKind enum defining all possible SQL types for schema fields has been renamed and is no longer nested under StandardSqlDataType:

Before:

from google.cloud.bigquery_v2 import StandardSqlDataType

if field_type == StandardSqlDataType.TypeKind.STRING:
    ...

After:

from google.cloud.bigquery import StandardSqlTypeNames

if field_type == StandardSqlTypeNames.STRING:
    ...

Issuing queries with Client.create_job preserves destination table

The Client.create_job method no longer removes the destination table from a query job's configuration. The destination table for the query can thus be explicitly defined by the user.
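As an illustrative sketch, a query job configuration in its REST (dict) form can now carry an explicit destination table that create_job leaves intact. The project, dataset, and table IDs below are placeholders, and the actual API call is shown only in a comment since it requires an authenticated client:

```python
# Sketch of a query job configuration in its REST (dict) form, as
# accepted by Client.create_job. The project/dataset/table IDs are
# placeholders for this example.
job_config = {
    "query": {
        "query": "SELECT 17 AS answer",
        "useLegacySql": False,
        # In 3.x this destination table is preserved rather than
        # stripped from the configuration.
        "destinationTable": {
            "projectId": "my-project",
            "datasetId": "my_dataset",
            "tableId": "my_table",
        },
    }
}

# With an authenticated client, the job could then be created with:
#     job = client.create_job(job_config)
```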

Changes to data types when reading a pandas DataFrame

The default dtypes returned by the to_dataframe method have changed.

  • Now, the BigQuery BOOLEAN data type maps to the pandas boolean dtype. Previously, this mapped to the pandas bool dtype when the column did not contain NULL values and the pandas object dtype when NULL values were present.

  • Now, the BigQuery INT64 data type maps to the pandas Int64 dtype. Previously, this mapped to the pandas int64 dtype when the column did not contain NULL values and the pandas float64 dtype when NULL values were present.

  • Now, the BigQuery DATE data type maps to the pandas dbdate dtype, which is provided by the db-dtypes package. If any date value is outside of the range of pandas.Timestamp.min (1677-09-22) and pandas.Timestamp.max (2262-04-11), the data type maps to the pandas object dtype. The date_as_object parameter has been removed.

  • Now, the BigQuery TIME data type maps to the pandas dbtime dtype, which is provided by the db-dtypes package.
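The nullable pandas dtypes mentioned above can be illustrated with pandas alone (this sketch assumes only pandas is installed, no BigQuery client needed):

```python
import pandas as pd

# The nullable "Int64" and "boolean" dtypes represent missing values
# as pd.NA, so a column's dtype no longer changes depending on
# whether NULLs are present.
ints = pd.array([1, 2, None], dtype="Int64")
flags = pd.array([True, None, False], dtype="boolean")

print(ints.dtype)        # Int64
print(flags.dtype)       # boolean
print(ints[2] is pd.NA)  # True
```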

Changes to data types loading a pandas DataFrame

In the absence of schema information, pandas columns with naive datetime64[ns] values, i.e. without timezone information, are recognized and loaded using the DATETIME type. On the other hand, for columns with timezone-aware datetime64[ns, UTC] values, the TIMESTAMP type continues to be used.
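The distinction between the two cases can be seen from the pandas dtypes alone (a small sketch using only pandas):

```python
import pandas as pd

# A naive datetime column (no timezone) would be loaded as DATETIME...
naive = pd.Series(pd.to_datetime(["2021-01-01 12:00:00"]))
print(naive.dtype)  # datetime64[ns]

# ...while a timezone-aware column keeps mapping to TIMESTAMP.
aware = pd.Series(pd.to_datetime(["2021-01-01 12:00:00"], utc=True))
print(aware.dtype)  # datetime64[ns, UTC]
```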

Changes to Model, Client.get_model, Client.update_model, and Client.list_models

The types of several Model properties have been changed.

  • Model.feature_columns now returns a sequence of google.cloud.bigquery.standard_sql.StandardSqlField.

  • Model.label_columns now returns a sequence of google.cloud.bigquery.standard_sql.StandardSqlField.

  • Model.model_type now returns a string.

  • Model.training_runs now returns a sequence of dictionaries, as received from the BigQuery REST API.
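Because Model.training_runs entries are now plain dictionaries, they are read with string keys rather than proto attribute access. A hedged sketch with a made-up training run payload (the field names follow the REST representation; the values are invented for illustration):

```python
# A made-up training run dict shaped like the BigQuery REST API's
# Model.trainingRuns entries; the values here are illustrative only.
training_runs = [
    {
        "startTime": "2022-01-01T00:00:00Z",
        "trainingOptions": {"maxIterations": "10"},
    }
]

# Dictionary access replaces the old proto attribute access.
for run in training_runs:
    max_iterations = run.get("trainingOptions", {}).get("maxIterations")
```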

Legacy Protocol Buffers Types

For compatibility reasons, the legacy proto-based types still exist as static code and can be imported:

from google.cloud.bigquery_v2 import Model  # a subclass of proto.Message

Mind, however, that importing them will issue a warning because, aside from being importable, these types are no longer maintained. They may differ both from the types in google.cloud.bigquery and from the types supported on the backend.

Maintaining compatibility with google-cloud-bigquery version 2.0

If you maintain a library or system that needs to support both google-cloud-bigquery version 2.x and 3.x, it is recommended that you detect when version 2.x is in use and convert properties that use the legacy protocol buffer types, such as Model.training_runs, into the types used in 3.x.
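One way to branch on the installed major version is a small helper like the following (a hypothetical sketch; needs_proto_conversion is not part of the library, and the version string would typically come from google.cloud.bigquery.__version__):

```python
def needs_proto_conversion(bigquery_version: str) -> bool:
    """Return True when google-cloud-bigquery 2.x (proto-based types)
    is in use, based on its version string.

    Hypothetical helper, not part of the library.
    """
    major = int(bigquery_version.split(".")[0])
    return major < 3


# On 2.x, Model.training_runs entries would need converting, e.g. via
# to_dict(); on 3.x they are already dictionaries.
print(needs_proto_conversion("2.34.4"))  # True
print(needs_proto_conversion("3.0.0"))   # False
```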

Call the to_dict method on the protocol buffers objects to get a JSON-compatible dictionary.

from google.cloud.bigquery_v2 import Model

training_run: Model.TrainingRun = ...
training_run_dict = training_run.to_dict()

2.0.0 Migration Guide

The 2.0 release of the google-cloud-bigquery client drops support for Python versions below 3.6. The client surface itself has not changed, but the 1.x series will not be receiving any more feature updates or bug fixes. You are thus encouraged to upgrade to the 2.x series.

If you experience issues or have questions, please file an issue.

Supported Python Versions

WARNING: Breaking change

The 2.0.0 release requires Python 3.6+.

Supported BigQuery Storage Clients

The 2.0.0 release requires BigQuery Storage >= 2.0.0, which dropped support for the v1beta1 and v1beta2 versions of the BigQuery Storage API. If you want to use a BigQuery Storage client, it must be the one supporting the v1 API version.

Changed GAPIC Enums Path

WARNING: Breaking change

Generated GAPIC enum types have been moved under types. Import paths need to be adjusted.

Before:

from google.cloud.bigquery_v2.gapic import enums

distance_type = enums.Model.DistanceType.COSINE

After:

from google.cloud.bigquery_v2 import types

distance_type = types.Model.DistanceType.COSINE

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-16 UTC.