Create and query BigLake Iceberg tables in BigQuery


This page explains how to create and modify BigLake Iceberg tables in BigQuery using Airflow operators in your Cloud Composer environment.

About BigLake Iceberg tables in BigQuery

BigLake Iceberg tables in BigQuery provide the foundation for building open-format lakehouses on Google Cloud. BigLake Iceberg tables in BigQuery offer the same fully managed experience as standard BigQuery tables, but store data in customer-owned storage buckets. BigLake Iceberg tables in BigQuery support the open Iceberg table format for better interoperability with open-source and third-party compute engines on a single copy of data.

Before you begin

Create a BigLake Iceberg table in BigQuery

To create a BigLake Iceberg table in BigQuery, use BigQueryCreateTableOperator in the same way as for other BigQuery tables. In the biglakeConfiguration field, provide the configuration for the table.

```python
import datetime

from airflow.models.dag import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateTableOperator

with DAG(
    "bq_iceberg_dag",
    start_date=datetime.datetime(2025, 1, 1),
    schedule=None,
) as dag:

    create_iceberg_table = BigQueryCreateTableOperator(
        task_id="create_iceberg_table",
        project_id="PROJECT_ID",
        dataset_id="DATASET_ID",
        table_id="TABLE_NAME",
        table_resource={
            "schema": {
                "fields": [
                    {"name": "order_id", "type": "INTEGER", "mode": "REQUIRED"},
                    {"name": "customer_id", "type": "INTEGER", "mode": "REQUIRED"},
                    {"name": "amount", "type": "INTEGER", "mode": "REQUIRED"},
                    {"name": "created_at", "type": "TIMESTAMP", "mode": "REQUIRED"},
                ]
            },
            "biglakeConfiguration": {
                "connectionId": "CONNECTION_NAME",
                "storageUri": "STORAGE_URI",
                "fileFormat": "PARQUET",
                "tableFormat": "ICEBERG",
            },
        },
    )
```

Replace the following:

  • PROJECT_ID: the project ID.
  • DATASET_ID: an existing dataset.
  • TABLE_NAME: the name of the table you're creating.
  • CONNECTION_NAME: the name of the Cloud Resource connection in the projects/PROJECT_ID/locations/REGION/connections/CONNECTION_ID format.
  • STORAGE_URI: a fully qualified Cloud Storage URI for the table. For example, gs://example-bucket/iceberg-table.
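The same table can also be created with a CREATE TABLE DDL statement submitted through BigQueryInsertJobOperator, since BigQuery DDL for these tables accepts the connection, file format, table format, and storage URI as options. The following is a minimal sketch of such a job configuration; it assumes the same placeholder values (PROJECT_ID, DATASET_ID, TABLE_NAME, CONNECTION_NAME, STORAGE_URI) as the example above and is not the page's canonical method.

```python
# Sketch: a BigQuery job configuration that creates the same BigLake Iceberg
# table with DDL instead of BigQueryCreateTableOperator. Placeholders follow
# the conventions of the example above and must be replaced before use.
create_table_ddl = """
CREATE TABLE `PROJECT_ID.DATASET_ID.TABLE_NAME` (
  order_id INT64 NOT NULL,
  customer_id INT64 NOT NULL,
  amount INT64 NOT NULL,
  created_at TIMESTAMP NOT NULL
)
WITH CONNECTION `CONNECTION_NAME`
OPTIONS (
  file_format = 'PARQUET',
  table_format = 'ICEBERG',
  storage_uri = 'STORAGE_URI'
);
"""

# Pass this dict as the `configuration` argument of BigQueryInsertJobOperator.
ddl_job_configuration = {
    "query": {
        "query": create_table_ddl,
        "useLegacySql": False,
    }
}
```

This approach can be convenient when the rest of your DAG already runs SQL through BigQueryInsertJobOperator and you want table creation expressed in the same way.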

Query a BigLake Iceberg table in BigQuery

After you create a BigLake Iceberg table, you can query it with BigQueryInsertJobOperator as usual. The operator doesn't need additional configuration specifically for BigLake Iceberg tables.

```python
import datetime

from airflow.models.dag import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    "bq_iceberg_dag_query",
    start_date=datetime.datetime(2025, 1, 1),
    schedule=None,
) as dag:

    insert_values = BigQueryInsertJobOperator(
        task_id="iceberg_insert_values",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `TABLE_ID` (order_id, customer_id, amount, created_at)
                    VALUES
                        (101, 19, 1, TIMESTAMP '2025-09-15 10:15:00+00'),
                        (102, 35, 2, TIMESTAMP '2025-09-14 10:15:00+00'),
                        (103, 36, 3, TIMESTAMP '2025-09-12 10:15:00+00'),
                        (104, 37, 4, TIMESTAMP '2025-09-11 10:15:00+00')
                """,
                "useLegacySql": False,
            }
        },
    )
```

Replace the following:

  • TABLE_ID: the table ID, in the PROJECT_ID.DATASET_ID.TABLE_NAME format.
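The example above inserts rows, but the same operator runs SELECT statements against the table too. The following is a minimal sketch of a job configuration for a read query; the aggregation itself is illustrative, and TABLE_ID is the same placeholder as above.

```python
# Sketch: a job configuration for reading from a BigLake Iceberg table with
# BigQueryInsertJobOperator. The SELECT statement is an illustrative example;
# replace TABLE_ID with your PROJECT_ID.DATASET_ID.TABLE_NAME value.
select_job_configuration = {
    "query": {
        "query": """
            SELECT
              customer_id,
              SUM(amount) AS total_amount
            FROM `TABLE_ID`
            GROUP BY customer_id
            ORDER BY total_amount DESC
        """,
        "useLegacySql": False,
    }
}
```

Pass this dict as the `configuration` argument of a BigQueryInsertJobOperator task, exactly as in the INSERT example.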

What's next

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-15 UTC.