Export and import using SQL dump files


This page describes exporting and importing data into Cloud SQL instances using SQL dump files.

Note: If you're migrating an entire database from a supported database server (on-premises, in AWS, or Cloud SQL) to a new Cloud SQL instance, you can use Database Migration Service instead of exporting and then importing files. If you're exporting to create a new instance from the exported file, consider restoring from a backup to a different instance or cloning the instance.

You can cancel the import of data into Cloud SQL instances and the export of data from the instances. This data is contained in SQL dump files. For more information about cancelling an import or export operation, see Cancel the import and export of data.
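
For example, the following sketch shows one way to find and cancel a running operation with the gcloud CLI. It assumes that your gcloud version provides the gcloud sql operations cancel command; INSTANCE_NAME and OPERATION_ID are placeholders.

# List recent operations on the instance to find the ID of the running import or export.
gcloud sql operations list --instance=INSTANCE_NAME --limit=5

# Cancel the operation by its ID.
gcloud sql operations cancel OPERATION_ID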

Before you begin

Important: Before starting a large export, ensure that at least 25 percent of the database size is free (on the instance). Doing so helps prevent issues with aggressive autogrowth, which can affect the availability of the instance.

Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.

For best practices, see Best Practices for Importing and Exporting Data.

After completing an import operation, verify the results.

Note: MySQL 8.0 for Cloud SQL uses mysql_native_password as the default authentication plugin for new users instead of caching_sha2_password. However, users imported from external instances might be configured to use the caching_sha2_password plugin for authentication. This might introduce problems when connecting. For more information, see MySQL 8 authentication.
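
If an imported user is affected, one possible remediation is to switch that user to the default plugin. The following is a minimal sketch, not guidance from this page; app_user and NEW_PASSWORD are hypothetical placeholders.

# Reconfigure an imported user to authenticate with mysql_native_password.
mysql -h INSTANCE_IP -u root -p \
-e "ALTER USER 'app_user'@'%' IDENTIFIED WITH mysql_native_password BY 'NEW_PASSWORD';"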

Export data from Cloud SQL for MySQL

Required roles and permissions for exporting from Cloud SQL for MySQL

To export data from Cloud SQL into Cloud Storage, the user initiating the export must have one of the following roles:

Additionally, the service account for the Cloud SQL instance must have one of the following roles:

  • The storage.objectAdmin Identity and Access Management (IAM) role
  • A custom role, including the following permissions:
    • storage.objects.create
    • storage.objects.list (for exporting files in parallel only)
    • storage.objects.delete (for exporting files in parallel only)

For help with IAM roles, see Identity and Access Management.

Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Export to a SQL dump file from Cloud SQL for MySQL

To create a SQL dump file, you export data from Cloud SQL to Cloud Storage. Once the file is in Cloud Storage, you can import it into another Cloud SQL database. You can also download data from Cloud Storage to your local environment if you want to access it locally.

Exporting from Cloud SQL uses the mysqldump utility with the --single-transaction and --hex-blob options. With the --single-transaction option, mysqldump starts a transaction before running. Rather than lock the entire database, this lets mysqldump read the database in its current state, which produces a consistent data dump.
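
Conceptually, the export behaves like the following mysqldump invocation. This is a simplified sketch, not the exact command Cloud SQL runs; INSTANCE_IP, USERNAME, and DATABASE_NAME are placeholders.

# Dump one database inside a single transaction, hex-encoding binary values.
mysqldump --single-transaction --hex-blob \
-h INSTANCE_IP -u USERNAME -p DATABASE_NAME > dump.sql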

If your SQL dump file contains DEFINER clauses (on views, triggers, stored procedures, and so on), then depending on the order in which these statements are executed, importing this file could fail. Learn more about DEFINER usage and potential workarounds in Cloud SQL.
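
One common workaround, sketched here under the assumption of GNU sed and a dump file named SQL_FILE.sql, is to strip DEFINER clauses before importing so that objects are created as the importing user:

# Remove DEFINER=`user`@`host` clauses from the dump file in place.
sed -i 's/\sDEFINER=`[^`]*`@`[^`]*`//g' SQL_FILE.sql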

Note: If your data contains large objects (blobs), the export can consume a large amount of memory, impacting instance performance. For help, see Known Issues.

To export data from a database on a Cloud SQL instance to a SQL dump file in a Cloud Storage bucket:

Console

  1. In the Google Cloud console, go to the Cloud SQL Instances page.

    Go to Cloud SQL Instances

  2. To open the Overview page of an instance, click the instance name.
  3. Click Export.
  4. In the File format section, click SQL to create a SQL dump file.
  5. In the Data to export section, click One or more databases in this instance to export specific databases.
  6. Use the drop-down menu to select the databases you want to export from.
  7. In the Destination section, select Browse to search for a Cloud Storage bucket or folder for your export.
  8. Click Export to begin the export.

gcloud

  1. Create a Cloud Storage bucket.
  2. Find the service account for the Cloud SQL instance you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
    gcloud sql instances describe INSTANCE_NAME
  3. Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account (see the example command after these steps). For help with setting IAM permissions, see Using IAM permissions.
  4. Export the database to your Cloud Storage bucket. Note: If you want to use serverless exports, use the offload parameter. Otherwise, remove it from the following command.
    gcloud sql export sql INSTANCE_NAME gs://BUCKET_NAME/sqldumpfile.gz \
    --database=DATABASE_NAME \
    --offload

    The export sql command does not include triggers or stored procedures, but does include views. To export triggers or stored procedures, use the mysqldump tool.

    For more information about using the export sql command, see the sql export sql command reference page.

  5. If you do not need to retain the IAM role you set previously, revoke it now.
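
The grant in step 3 might look like the following sketch, where SERVICE_ACCOUNT_EMAIL is a placeholder for the address you copied from the serviceAccountEmailAddress field; the revocation in step 5 is the matching gcloud storage buckets remove-iam-policy-binding call with the same arguments.

# Allow the instance's service account to write the dump file to the bucket.
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
--member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
--role=roles/storage.objectAdmin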

REST v1

  1. Create a bucket for the export:
    gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Provide your instance with the legacyBucketWriter IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  3. Export your database:

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: the project ID
    • INSTANCE_ID: the instance ID
    • BUCKET_NAME: the Cloud Storage bucket name
    • PATH_TO_DUMP_FILE: the path to the SQL dump file
    • DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
    • DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
    Note: To use serverless exports, set the value of the offload parameter to TRUE. Serverless exports cost extra. See the pricing page.
  4. HTTP method and URL:

    POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export

    Request JSON body:

    { "exportContext":   {      "fileType": "SQL",      "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE",      "databases": ["DATABASE_NAME"],      "offload":TRUE |FALSE    }}

    To send your request, expand one of these options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "OPERATION_ID",  "targetId": "INSTANCE_ID",  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID",  "targetProject": "PROJECT_ID"}
  5. If you do not need to retain the IAM role you set previously, remove it now.
For the complete list of parameters for the request, see the instances:export page.
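
The export runs asynchronously, so the response arrives while the operation is still PENDING. To check whether the export has finished, you can poll the operation at the selfLink URL from the response, for example (a sketch; OPERATION_ID comes from the name field of the response):

curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID"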

REST v1beta4

  1. Create a bucket for the export:
    gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  3. Export your database:

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: the project ID
    • INSTANCE_ID: the instance ID
    • BUCKET_NAME: the Cloud Storage bucket name
    • PATH_TO_DUMP_FILE: the path to the SQL dump file
    • DATABASE_NAME_1: the name of a database inside the Cloud SQL instance
    • DATABASE_NAME_2: the name of a database inside the Cloud SQL instance
    Note: To use serverless exports, set the value of the offload parameter to TRUE. Serverless exports cost extra. See the pricing page.
  4. HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export

    Request JSON body:

    {  "exportContext":    {       "fileType": "SQL",       "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE",       "databases": ["DATABASE_NAME"],       "offload":TRUE |FALSE     } }

    To send your request, expand one of these options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "OPERATION_ID",  "targetId": "INSTANCE_ID",  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/operations/OPERATION_ID",  "targetProject": "PROJECT_ID"}
  5. If you do not need to retain the IAM role you set previously, revoke it now.
For the complete list of parameters for the request, see the instances:export page.

Export from your local MySQL server using mysqldump

If you are exporting data from an on-premises MySQL database for import into a Cloud SQL database, you must use the mysqldump utility with the following flags:

  • --databases: You must use the --databases option to specify an explicit list of databases to export, and this list must not contain the mysql system database.
  • --hex-blob: If your database contains any binary fields, you must use this flag to ensure that your binary fields are imported correctly.
  • --set-gtid-purged=OFF: GTID information must not be included in the SQL dump file, and binary logging must not be disabled by the SQL dump file. (Not required for MySQL 5.5 or external replication.)
  • --single-transaction: Starts a transaction before running. Rather than lock the entire database, this lets mysqldump read the database in its current state, which produces a consistent data dump.
Note: If you intend to import the SQL dump file into a Cloud SQL database, do not export customer-created MySQL users. You must add users directly to the Cloud SQL database. See Creating a user.

From a command line, run mysqldump:

mysqldump --databases DATABASE_NAME -h INSTANCE_IP -u USERNAME -p \
--hex-blob --single-transaction --set-gtid-purged=OFF \
--default-character-set=utf8mb4 > SQL_FILE.sql
Note: To export stored procedures from a Cloud SQL instance, include the --routines flag. To export triggers from a Cloud SQL instance, make sure that --skip-triggers is not specified. If binary logging is enabled and you want to export triggers and stored procedures, you must also set the log_bin_trust_function_creators flag in your MySQL database.
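
If you need to set the log_bin_trust_function_creators flag on a Cloud SQL instance, one way is through database flags, as in the following sketch; confirm that the flag is supported for your MySQL version.

# Caution: --database-flags replaces the instance's entire set of database flags,
# so include any flags you have already set.
gcloud sql instances patch INSTANCE_NAME \
--database-flags=log_bin_trust_function_creators=on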

For help with mysqldump, see the mysqldump reference.

External replication to Cloud SQL for MySQL

To create a dump file for use in an external server configuration, see Replicating from an external server.

Import data to Cloud SQL for MySQL

Required roles and permissions for importing to Cloud SQL for MySQL

To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:

Additionally, the service account for the Cloud SQL instance must have one of the following roles:

  • The storage.objectAdmin IAM role
  • A custom role, including the following permissions:
    • storage.objects.get
    • storage.objects.list (for importing files in parallel only)

For help with IAM roles, see Identity and Access Management.

Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Import a SQL dump file to Cloud SQL for MySQL

SQL files are plain text files with a sequence of SQL commands.
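
For example, the beginning of a dump file might look like the following illustrative excerpt (example_db and widgets are hypothetical names; the exact header depends on your mysqldump version and flags):

-- Illustrative excerpt of a SQL dump file.
CREATE DATABASE IF NOT EXISTS example_db;
USE example_db;
CREATE TABLE widgets (id INT PRIMARY KEY, name VARCHAR(100));
INSERT INTO widgets VALUES (1, 'anvil');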

Console

  1. In the Google Cloud console, go to the Cloud SQL Instances page.

    Go to Cloud SQL Instances

  2. To open the Overview page of an instance, click the instance name.
  3. Click Import.
  4. In the Choose the file you'd like to import data from section, enter the path to the bucket and SQL dump file to use for the import, browse to an existing file, or upload a local file.

    You can import a compressed (.gz) or an uncompressed (.sql) file.

  5. For Format, select SQL.
  6. Select the database you want the data to be imported into.

    This causes Cloud SQL to run the USE DATABASE statement before the import.

    If your SQL dump file includes a USE DATABASE statement, it overrides the database you set in the Google Cloud console.
  7. If you want to specify a user to perform the import, select the user.

    If your import file contains statements that must be performed by a specific user, use this field to specify that user.

  8. Click Import to start the import.

gcloud

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Describe the instance you are importing to:
    gcloud sql instances describe INSTANCE_NAME
  4. Copy the serviceAccountEmailAddress field.
  5. Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account for the bucket.
    gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:SERVICE-ACCOUNT \
    --role=roles/storage.objectAdmin
    For help with setting IAM permissions, see Using IAM permissions.
  6. Import the database:
    gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
    --database=DATABASE_NAME

    For information about using the import sql command, see the sql import sql command reference page.

    If the command returns an error like ERROR_RDBMS, review the permissions; this error is often caused by a permissions issue.

  7. If you do not need to retain the IAM permissions you set previously, remove them using gcloud storage buckets remove-iam-policy-binding, as shown in the sketch after these steps.
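
For steps 6 and 7, you can inspect and then clean up the bucket's IAM policy as in the following sketch (BUCKET_NAME and SERVICE-ACCOUNT are the same placeholders as in step 5):

# Inspect which members hold roles on the bucket; useful when debugging ERROR_RDBMS.
gcloud storage buckets get-iam-policy gs://BUCKET_NAME

# Revoke the role granted in step 5 once the import finishes.
gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
--member=serviceAccount:SERVICE-ACCOUNT \
--role=roles/storage.objectAdmin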

REST v1

  1. Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.

  2. Create a Cloud Storage bucket.
  3. Upload the file to your bucket.

    For help with uploading files to buckets, seeUploading objects.

  4. Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  5. Import your dump file:

    Before using any of the request data, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_sql_file: The path to the SQL file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import

    Request JSON body:

    { "importContext":   {      "fileType": "SQL",      "uri": "gs://bucket_name/path_to_sql_file",      "database": "database_name"    }}

    To send your request, expand one of these options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "operation-id",  "targetId": "instance-id",  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",  "targetProject": "project-id"}
    For the complete list of parameters for the request, see the instances:import page.
  6. If you do not need to retain the IAM permissions you set previously, remove them now.

REST v1beta4

  1. Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.

  2. Create a Cloud Storage bucket.
  3. Upload the file to your bucket.

    For help with uploading files to buckets, seeUploading objects.

  4. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  5. Import your dump file:

    Before using any of the request data, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_sql_file: The path to the SQL file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    { "importContext":   {      "fileType": "SQL",      "uri": "gs://bucket_name/path_to_sql_file",      "database": "database_name"    }}

    To send your request, expand one of these options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "operation-id",  "targetId": "instance-id",  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id",  "targetProject": "project-id"}
    For the complete list of parameters for the request, see the instances:import page.
  6. If you do not need to retain the IAM permissions you set previously, remove them now.
Note: If you get an error message about needing the SUPER privilege when you attempt to import data, your export file contains elements that cannot be imported into Cloud SQL. Make sure you are following the instructions in exporting data for import into Cloud SQL when you create your export file.
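
To locate the offending statements before re-exporting, you can search the dump file for constructs that typically require SUPER. This is a heuristic sketch, not an exhaustive check:

# Flag DEFINER clauses and global variable writes, which commonly trigger SUPER errors.
grep -nE 'DEFINER=|@@GLOBAL|SQL_LOG_BIN' SQL_FILE.sql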
