Perform anomaly detection with a multivariate time-series forecasting model
This tutorial shows you how to do the following tasks:
- Create an ARIMA_PLUS_XREG time series forecasting model.
- Detect anomalies in the time series data by running the ML.DETECT_ANOMALIES function against the model.
This tutorial uses the following tables from the public epa_historical_air_quality dataset, which contains daily PM2.5, temperature, and wind speed information collected from multiple US cities:

- epa_historical_air_quality.pm25_nonfrm_daily_summary
- epa_historical_air_quality.wind_daily_summary
- epa_historical_air_quality.temperature_daily_summary
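If you want to get a feel for the source data first, you can preview a few Seattle rows from the PM2.5 table. This optional query isn't part of the tutorial steps, but the columns it references (date_local, arithmetic_mean, city_name, parameter_name) are the same ones used later to build the training table:

SELECT
  date_local,
  city_name,
  parameter_name,
  arithmetic_mean
FROM
  `bigquery-public-data.epa_historical_air_quality.pm25_nonfrm_daily_summary`
WHERE
  city_name = 'Seattle'
LIMIT 10;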
Required permissions
- To create the dataset, you need the bigquery.datasets.create IAM permission.
- To create the model, you need the following permissions:
  - bigquery.jobs.create
  - bigquery.models.create
  - bigquery.models.getData
  - bigquery.models.updateData
- To run inference, you need the following permissions:
  - bigquery.models.getData
  - bigquery.jobs.create
For more information about IAM roles and permissions in BigQuery, see Introduction to IAM.
Costs
In this document, you use the following billable components of Google Cloud:
- BigQuery: You incur costs for the data you process in BigQuery.
To generate a cost estimate based on your projected usage, use the pricing calculator.
For more information, see BigQuery pricing.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
  Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.
  Roles required to select or create a project
  - Select a project: Selecting a project doesn't require a specific IAM role; you can select any project that you've been granted a role on.
  - Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
- Verify that billing is enabled for your Google Cloud project.
- Enable the BigQuery API.
  Roles required to enable APIs
  To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.
Create a dataset
Create a BigQuery dataset to store your ML model.
Console
In the Google Cloud console, go to the BigQuery page.

In the Explorer pane, click your project name.

Click View actions > Create dataset.

On the Create dataset page, do the following:

- For Dataset ID, enter bqml_tutorial.
- For Location type, select Multi-region, and then select US (multiple regions in United States).
- Leave the remaining default settings as they are, and click Create dataset.
bq
To create a new dataset, use the bq mk command with the --location flag. For a full list of possible parameters, see the bq mk --dataset command reference.

Create a dataset named bqml_tutorial with the data location set to US and a description of BigQuery ML tutorial dataset:

bq --location=US mk -d \
    --description "BigQuery ML tutorial dataset." \
    bqml_tutorial

Instead of using the --dataset flag, the command uses the -d shortcut. If you omit -d and --dataset, the command defaults to creating a dataset.

Confirm that the dataset was created:

bq ls
API
Call the datasets.insert method with a defined dataset resource.

{
  "datasetReference": {
    "datasetId": "bqml_tutorial"
  }
}
BigQuery DataFrames
Before trying this sample, follow the BigQuery DataFrames setup instructions in the BigQuery quickstart using BigQuery DataFrames. For more information, see the BigQuery DataFrames reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up ADC for a local development environment.
import google.cloud.bigquery

bqclient = google.cloud.bigquery.Client()
bqclient.create_dataset("bqml_tutorial", exists_ok=True)

Prepare the training data
The PM2.5, temperature, and wind speed data are in separate tables. Create the bqml_tutorial.seattle_air_quality_daily table of training data by combining the data in these public tables. bqml_tutorial.seattle_air_quality_daily contains the following columns:

- date: the date of the observation
- pm25: the average PM2.5 value for each day
- wind_speed: the average wind speed for each day
- temperature: the highest temperature for each day
The new table has daily data from August 11, 2009 to January 31, 2022.
Go to the BigQuery page.
In the SQL editor pane, run the following SQL statement:
CREATE TABLE `bqml_tutorial.seattle_air_quality_daily`
AS
WITH
  pm25_daily AS (
    SELECT
      avg(arithmetic_mean) AS pm25,
      date_local AS date
    FROM `bigquery-public-data.epa_historical_air_quality.pm25_nonfrm_daily_summary`
    WHERE
      city_name = 'Seattle'
      AND parameter_name = 'Acceptable PM2.5 AQI & Speciation Mass'
    GROUP BY date_local
  ),
  wind_speed_daily AS (
    SELECT
      avg(arithmetic_mean) AS wind_speed,
      date_local AS date
    FROM `bigquery-public-data.epa_historical_air_quality.wind_daily_summary`
    WHERE
      city_name = 'Seattle'
      AND parameter_name = 'Wind Speed - Resultant'
    GROUP BY date_local
  ),
  temperature_daily AS (
    SELECT
      avg(first_max_value) AS temperature,
      date_local AS date
    FROM `bigquery-public-data.epa_historical_air_quality.temperature_daily_summary`
    WHERE
      city_name = 'Seattle'
      AND parameter_name = 'Outdoor Temperature'
    GROUP BY date_local
  )
SELECT
  pm25_daily.date AS date,
  pm25,
  wind_speed,
  temperature
FROM pm25_daily
JOIN wind_speed_daily USING (date)
JOIN temperature_daily USING (date)
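Optionally, verify the new table before training. The following query is a quick sanity check rather than part of the original steps; it reports how many days of data the table contains and the date range it covers:

SELECT
  COUNT(*) AS num_days,
  MIN(date) AS first_date,
  MAX(date) AS last_date
FROM
  `bqml_tutorial.seattle_air_quality_daily`;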
Create the model
Create a multivariate time series model, using the data from bqml_tutorial.seattle_air_quality_daily as training data.
Go to the BigQuery page.
In the SQL editor pane, run the following SQL statement:
CREATE OR REPLACE MODEL `bqml_tutorial.arimax_model`
  OPTIONS (
    model_type = 'ARIMA_PLUS_XREG',
    auto_arima = TRUE,
    time_series_data_col = 'temperature',
    time_series_timestamp_col = 'date'
  )
AS
SELECT
  *
FROM
  `bqml_tutorial.seattle_air_quality_daily`
WHERE
  date < "2023-02-01";
The query takes several seconds to complete, after which the model arimax_model appears in the bqml_tutorial dataset and can be accessed in the Explorer pane. Because the query uses a CREATE MODEL statement to create a model, there are no query results.
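Optionally, you can inspect the ARIMA model that auto_arima selected. The following query is an extra step beyond this tutorial and assumes the ML.ARIMA_EVALUATE function, which reports fitted model parameters and evaluation metrics for ARIMA-based models:

SELECT
  *
FROM
  ML.ARIMA_EVALUATE(MODEL `bqml_tutorial.arimax_model`);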
Perform anomaly detection on historical data
Run anomaly detection against the historical data that you used to train the model.
Go to the BigQuery page.
In the SQL editor pane, run the following SQL statement:
SELECT
  *
FROM
  ML.DETECT_ANOMALIES(
    MODEL `bqml_tutorial.arimax_model`,
    STRUCT(0.6 AS anomaly_prob_threshold))
ORDER BY
  date ASC;
The results look similar to the following:
+-------------------------+-------------+------------+--------------------+--------------------+---------------------+
| date                    | temperature | is_anomaly | lower_bound        | upper_bound        | anomaly_probability |
+-------------------------+-------------+------------+--------------------+--------------------+---------------------+
| 2009-08-11 00:00:00 UTC | 70.1        | false      | 67.647370742988727 | 72.552629257011262 | 0                   |
| 2009-08-12 00:00:00 UTC | 73.4        | false      | 71.7035428351283   | 76.608801349150838 | 0.20478819992561115 |
| 2009-08-13 00:00:00 UTC | 64.6        | true       | 67.740408724826068 | 72.6456672388486   | 0.945588334903206   |
+-------------------------+-------------+------------+--------------------+--------------------+---------------------+
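Because ML.DETECT_ANOMALIES returns one row per timestamp, you may want to keep only the flagged days. The following optional variation of the previous query filters on the is_anomaly column and sorts the most anomalous days first:

SELECT
  *
FROM
  ML.DETECT_ANOMALIES(
    MODEL `bqml_tutorial.arimax_model`,
    STRUCT(0.6 AS anomaly_prob_threshold))
WHERE
  is_anomaly
ORDER BY
  anomaly_probability DESC;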
Perform anomaly detection on new data
Run anomaly detection on the new data that you generate.
Go to the BigQuery page.
In the SQL editor pane, run the following SQL statement:
SELECT
  *
FROM
  ML.DETECT_ANOMALIES(
    MODEL `bqml_tutorial.arimax_model`,
    STRUCT(0.6 AS anomaly_prob_threshold),
    (
      SELECT
        *
      FROM
        UNNEST(
          [
            STRUCT<date TIMESTAMP, pm25 FLOAT64, wind_speed FLOAT64, temperature FLOAT64>
            ('2023-02-01 00:00:00 UTC', 8.8166665, 1.6525, 44.0),
            ('2023-02-02 00:00:00 UTC', 11.8354165, 1.558333, 40.5),
            ('2023-02-03 00:00:00 UTC', 10.1395835, 1.6895835, 46.5),
            ('2023-02-04 00:00:00 UTC', 11.439583500000001, 2.0854165, 45.0),
            ('2023-02-05 00:00:00 UTC', 9.7208335, 1.7083335, 46.0),
            ('2023-02-06 00:00:00 UTC', 13.3020835, 2.23125, 43.5),
            ('2023-02-07 00:00:00 UTC', 5.7229165, 2.377083, 47.5),
            ('2023-02-08 00:00:00 UTC', 7.6291665, 2.24375, 44.5),
            ('2023-02-09 00:00:00 UTC', 8.5208335, 2.2541665, 40.5),
            ('2023-02-10 00:00:00 UTC', 9.9086955, 7.333335, 39.5)
          ]
        )
    )
  );
The results look similar to the following:
+-------------------------+-------------+------------+--------------------+--------------------+---------------------+------------+------------+
| date                    | temperature | is_anomaly | lower_bound        | upper_bound        | anomaly_probability | pm25       | wind_speed |
+-------------------------+-------------+------------+--------------------+--------------------+---------------------+------------+------------+
| 2023-02-01 00:00:00 UTC | 44.0        | true       | 36.89918003713138  | 41.8044385511539   | 0.88975675709801583 | 8.8166665  | 1.6525     |
| 2023-02-02 00:00:00 UTC | 40.5        | false      | 34.439946284051572 | 40.672021330796483 | 0.57358239699845348 | 11.8354165 | 1.558333   |
| 2023-02-03 00:00:00 UTC | 46.5        | true       | 33.615139992931191 | 40.501364463964549 | 0.97902867696346974 | 10.1395835 | 1.6895835  |
+-------------------------+-------------+------------+--------------------+--------------------+---------------------+------------+------------+
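In practice, new observations usually arrive in a table rather than as inline literals. As a sketch, if you loaded the new rows into a hypothetical table named bqml_tutorial.new_air_quality_daily with the same columns (date, pm25, wind_speed, temperature), you could pass that table to the function instead of the UNNEST query:

-- bqml_tutorial.new_air_quality_daily is a hypothetical table with the
-- same schema as the inline data in the previous query.
SELECT
  *
FROM
  ML.DETECT_ANOMALIES(
    MODEL `bqml_tutorial.arimax_model`,
    STRUCT(0.6 AS anomaly_prob_threshold),
    TABLE `bqml_tutorial.new_air_quality_daily`);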
Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, delete the project that you created, or keep the project and delete the bqml_tutorial dataset.
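If you prefer to keep the project and remove only the objects created in this tutorial, you can drop them individually. The following statements are one way to do that (deleting the whole bqml_tutorial dataset also removes both objects):

DROP MODEL IF EXISTS `bqml_tutorial.arimax_model`;
DROP TABLE IF EXISTS `bqml_tutorial.seattle_air_quality_daily`;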