Export and import using SQL dump files
This page describes how to export data from and import data into Cloud SQL instances using SQL dump files.
Note: If you're migrating an entire database from a supported database server (on-premises, in AWS, or Cloud SQL) to a new Cloud SQL instance, you can use Database Migration Service instead of exporting and then importing files. If you're exporting to create a new instance from the exported file, consider restoring from a backup to a different instance or cloning the instance.
You can cancel the import of data into Cloud SQL instances and the export of data from them. The data in these operations is contained in SQL dump files. For more information about cancelling an import or export operation, see Cancel the import and export of data.
Before you begin
Important: Before starting a large export, ensure that at least 25 percent of the database size is free (on the instance). Doing so helps prevent issues with aggressive autogrowth, which can affect the availability of the instance. Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.
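If you want to confirm how much storage is provisioned before starting, one option (a sketch, not an official step on this page; INSTANCE_NAME is a placeholder) is to read the disk settings from the instance description:

# Show the provisioned data disk size (in GB) and whether automatic storage increase is enabled.
gcloud sql instances describe INSTANCE_NAME \
    --format="value(settings.dataDiskSizeGb,settings.storageAutoResize)"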
For best practices, see Best Practices for Importing and Exporting Data.
After completing an import operation, verify the results.
Export data from Cloud SQL for PostgreSQL
Required roles and permissions for exporting from Cloud SQL for PostgreSQL
To export data from Cloud SQL into Cloud Storage, the user initiating the export must have one of the following roles:
- The Cloud SQL Editor role
- A custom role, including the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.export

Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin Identity and Access Management (IAM) role
- A custom role, including the following permissions:
  - storage.objects.create
  - storage.objects.list (for exporting files in parallel only)
  - storage.objects.delete (for exporting files in parallel only)
For help with IAM roles, see Identity and Access Management.
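The following sketch shows one way to grant these roles with the gcloud CLI; the member addresses and bucket name are placeholders, not values defined on this page:

# Grant the Cloud SQL Editor role to the user who runs the export.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:EXPORT_USER_EMAIL" \
    --role="roles/cloudsql.editor"
# Grant the storage.objectAdmin role on the destination bucket to the instance's service account.
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role="roles/storage.objectAdmin"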
Note: The changes that you make to IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.
Export to a SQL dump file from Cloud SQL for PostgreSQL
When you use Cloud SQL to perform an export, whether from the Google Cloud console, the gcloud CLI, or the API, you are using the pg_dump utility, with the options required to ensure that the resulting export file is valid for import back into Cloud SQL.
If you plan to import your data into Cloud SQL, you must follow the instructions provided in Exporting data from an external database server so that your SQL dump file is formatted correctly for Cloud SQL.
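As a rough sketch of what that formatting looks like (the authoritative flags are on the linked page), a plain-format pg_dump that omits ownership and privilege statements generally imports cleanly into Cloud SQL; the host, user, and database names here are placeholders:

# Dump one database as plain SQL, without OWNER or GRANT/REVOKE statements.
pg_dump --host=SOURCE_HOST --username=SOURCE_USER \
    --format=plain --no-owner --no-acl DATABASE_NAME > DATABASE_NAME.sql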
Note: If your data contains large objects (blobs), the export can consume a large amount of memory, impacting instance performance. For help, see Known Issues.
To export data from a database on a Cloud SQL instance to a SQL dump file in a Cloud Storage bucket:
Console
In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- Click Export.
- In the File format section, click SQL to create a SQL dump file.
- In the Data to export section, use the drop-down menu to select the database you want to export from.
- In the Destination section, select Browse to search for a Cloud Storage bucket or folder for your export.
- Click Export to begin the export.
gcloud
- Create a Cloud Storage bucket.
- Find the service account for the Cloud SQL instance you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
gcloud sql instances describe INSTANCE_NAME
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For help with setting IAM permissions, see Using IAM permissions.
- Export the database to your Cloud Storage bucket:
Note: If you want to use serverless exports, then use the offload parameter. If you want to include the DROP <object> SQL statement that's required to drop (clean) database objects before you import them, then use the clean parameter. If you want to include the IF EXISTS SQL statement with each DROP statement that's produced by the clean parameter, then use the if-exists parameter. Otherwise, remove these parameters from the following command.
gcloud sql export sql INSTANCE_NAME gs://BUCKET_NAME/sqldumpfile.gz \
--database=DATABASE_NAME \
--offload
The export sql command does not contain triggers or stored procedures, but does contain views. To export triggers and/or stored procedures, use the pg_dump tool.
For more information about using the export sql command, see the sql export sql command reference page.
- If you do not need to retain the IAM role you set previously, revoke it now.
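One way to revoke the role granted earlier is sketched below; SERVICE_ACCOUNT_EMAIL is the address you copied from the instance description:

# Remove the storage.objectAdmin binding that was added for the export.
gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role="roles/storage.objectAdmin"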
REST v1
- Create a bucket for the export:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME
This step is not required, but strongly recommended, so you do not open up access to any other data.
- Provide your instance with the legacyBucketWriter IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_DUMP_FILE: the path to the SQL dump file
- DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
- DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
Note: If you want to use serverless exports, then use the offload parameter. If you want to include the DROP <object> SQL statement that's required to drop (clean) database objects before you import them, then use the clean parameter. If you want to include the IF EXISTS SQL statement with each DROP statement that's produced by the clean parameter, then use the ifExists parameter. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.
Serverless exports cost extra. See the pricing page.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{ "exportContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE", "databases": ["DATABASE_NAME"], "offload":TRUE |FALSE, "sqlExportOptions": { "clean": [TRUE|FALSE], "ifExists": [TRUE|FALSE] } } }
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response
{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID"}
- If you do not need to retain the IAM role you set previously, remove it now.
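The export runs as a long-running operation, and the response above only shows it as PENDING. As a hedged example of following its progress from the gcloud CLI (not a required step):

# List recent operations for the instance, including the export you just started.
gcloud sql operations list --instance=INSTANCE_ID --limit=3
# Block until a specific operation finishes; OPERATION_ID comes from the name field in the response.
gcloud sql operations wait OPERATION_ID --timeout=3600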
REST v1beta4
- Create a bucket for the export:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME
This step is not required, but strongly recommended, so you do not open up access to any other data.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- PROJECT_ID: the project ID
- INSTANCE_ID: the instance ID
- BUCKET_NAME: the Cloud Storage bucket name
- PATH_TO_DUMP_FILE: the path to the SQL dump file
- DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
- DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
Note: If you want to use serverless exports, then use the offload parameter. If you want to include the DROP <object> SQL statement that's required to drop (clean) database objects before you import them, then use the clean parameter. If you want to include the IF EXISTS SQL statement with each DROP statement that's produced by the clean parameter, then use the ifExists parameter. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.
Serverless exports cost extra. See the pricing page.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{ "exportContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE", "databases": ["DATABASE_NAME"], "offload":TRUE |FALSE, "sqlExportOptions": { "clean": [TRUE|FALSE], "ifExists": [TRUE|FALSE] } } }
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response
{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID"}
- If you do not need to retain the IAM role you set previously, revoke it now.
Import data to Cloud SQL for PostgreSQL
Required roles and permissions for importing to Cloud SQL for PostgreSQL
To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:
- The Cloud SQL Admin role
- A custom role, including the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.import

Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin IAM role
- A custom role, including the following permissions:
  - storage.objects.get
  - storage.objects.list (for importing files in parallel only)
For help with IAM roles, see Identity and Access Management.
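As with exporting, the sketch below shows one way to grant the required role to the user who runs the import; the member address is a placeholder:

# Grant the Cloud SQL Admin role to the user who initiates the import.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:IMPORT_USER_EMAIL" \
    --role="roles/cloudsql.admin"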
Note: The changes that you make to IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.
Import a SQL dump file to Cloud SQL for PostgreSQL
SQL files are plain text files with a sequence of SQL commands.
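If you want to sanity-check a dump before importing it, one approach (a sketch; it assumes the file is already in your bucket) is to stream its first lines from Cloud Storage:

# Preview the start of an uncompressed dump file.
gcloud storage cat gs://BUCKET_NAME/IMPORT_FILE_NAME | head -n 20
# For a compressed (.gz) dump, decompress on the fly.
gcloud storage cat gs://BUCKET_NAME/IMPORT_FILE_NAME.gz | zcat | head -n 20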
Console
In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- Click Import.
- In the Choose the file you'd like to import data from section, enter the path to the bucket and SQL dump file to use for the import, or browse to an existing file. You can import a compressed (.gz) or an uncompressed (.sql) file.
- For Format, select SQL.
- Select the database you want the data to be imported into. This causes Cloud SQL to run the USE DATABASE statement before the import. If your SQL dump file includes a USE DATABASE statement, it overrides the database you set in the Google Cloud console.
- If you want to specify a user to perform the import, select the user. If your import file contains statements that must be performed by a specific user, use this field to specify that user.
- Click Import to start the import.
gcloud
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Describe the instance you are importing to:
gcloud sql instances describe INSTANCE_NAME
- Copy the serviceAccountEmailAddress field.
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
--member=serviceAccount:SERVICE-ACCOUNT \
--role=roles/storage.objectAdmin
- Import the database:
gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
--database=DATABASE_NAME
For information about using the import sql command, see the sql import sql command reference page.
If the command returns an error like ERROR_RDBMS, review the permissions; this error is often due to permissions issues.
- If you do not need to retain the IAM permissions you set previously, remove them using gcloud storage buckets remove-iam-policy-binding.
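Returning to the import command above, two variations may be useful; they are sketches, so confirm the flags against the sql import sql reference for your gcloud version:

# Import a compressed dump; a .gz file can be imported directly.
gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME.gz \
    --database=DATABASE_NAME
# Run the import as a specific PostgreSQL user (for dumps whose statements require that user).
gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
    --database=DATABASE_NAME --user=POSTGRES_USER_NAME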
REST v1
Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.
- If you are importing data from an on-premises PostgreSQL server:
- Create a SQL dump file using the instructions in Exporting data using pg_dump.
- Create a bucket in Cloud Storage using the instructions in Create buckets.
- Upload the SQL dump file to the Cloud Storage bucket using the procedure in Upload objects.
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Import your dump file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_sql_file: The path to the SQL file
- database_name: The name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name" }}
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response
{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id", "targetProject": "project-id"}
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for the request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove them now.
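For example, the sketch below writes a request.json that sets importContext.importUser; IMPORT_USER_NAME is a placeholder for the database user that should run the import:

# Create a request body that names the importing database user.
cat > request.json <<'EOF'
{
  "importContext": {
    "fileType": "SQL",
    "uri": "gs://bucket_name/path_to_sql_file",
    "database": "database_name",
    "importUser": "IMPORT_USER_NAME"
  }
}
EOF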
REST v1beta4
Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.
- If you are importing data from an on-premises PostgreSQL server:
- Create a SQL dump file using the instructions in Exporting data using pg_dump.
- Create a bucket in Cloud Storage using the instructions in Create buckets.
- Upload the SQL dump file to the Cloud Storage bucket using the procedure in Upload objects.
- Create a Cloud Storage bucket.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Import your dump file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_sql_file: The path to the SQL file
- database_name: The name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name" }}
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response
{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id", "targetProject": "project-id"}
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for the request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove them now.
What's next
- Learn how to check the status of import and export operations.
- Learn more about best practices for importing and exporting data.
- Learn more about Cloud Storage.
- Known issues for imports and exports.