Export and import files in parallel
This page describes exporting and importing files into Cloud SQL instances in parallel.
Note: If you're migrating an entire database from a supported database server (on-premises, in AWS, or Cloud SQL) to a new Cloud SQL instance, then use Database Migration Service instead of exporting and importing files in parallel. If you're exporting because you want to create a new instance from the exported file, consider restoring from a backup to a different instance or cloning the instance.
You can verify that the import or export operation for multiple files in parallel completed successfully by checking the operation's status. You can also cancel the import of data into Cloud SQL instances and the export of data from the instances. For more information about cancelling an import or export operation, see Cancel the import and export of data.
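For example, you can check on an operation from the command line by listing recent operations for the instance and waiting for a specific one to finish. This is a minimal sketch using gcloud CLI commands; replace INSTANCE_NAME and OPERATION_ID with your own values:

# List the most recent operations for an instance, including their status.
gcloud sql operations list --instance=INSTANCE_NAME --limit=10

# Block until a specific operation completes.
gcloud sql operations wait OPERATION_ID --timeout=unlimited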
Before you begin
Before you begin an export or import operation:
- Ensure that your database has adequate free space.
- Follow the best practices for exporting and importing data.
- After completing an import operation, verify the results.
Export and import operations use database resources, but they don't interfere with typical database operations unless the instance is under-provisioned.
Important: Before starting a large operation, ensure that at least 25 percent of the disk is free on the instance. Doing so helps prevent issues with aggressive autogrowth, which can adversely affect the availability of the instance.

Export data from Cloud SQL for MySQL to multiple files in parallel
The following sections contain information about exporting data from Cloud SQL for MySQL to multiple files in parallel.
Required roles and permissions for exporting data from Cloud SQL for MySQL to multiple files in parallel
To export data from Cloud SQL into Cloud Storage, the user initiating the export must have one of the following roles:
- The Cloud SQL Editor role
- A custom role, including the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.export
Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin Identity and Access Management (IAM) role
- A custom role, including the following permissions:
  - storage.objects.create
  - storage.objects.list (for exporting files in parallel only)
  - storage.objects.delete (for exporting files in parallel only)
For help with IAM roles, see Identity and Access Management.
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Export data to multiple files in parallel
You can export data in parallel from Cloud SQL to multiple files in Cloud Storage. The gcloud sql export sql command with the --parallel flag uses the dumpInstance utility to export to multiple files. After the files are in Cloud Storage, you can import them into another Cloud SQL database. If you want to access the data in the files locally, then download the data from Cloud Storage into your local environment.
If your files contain DEFINER clauses (views, triggers, stored procedures, and so on), then depending on the order in which these statements are run, using these files for import can fail. Learn more about DEFINER usage and potential workarounds in Cloud SQL.
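For example, to download the exported files into your local environment, you can copy the dump folder from Cloud Storage. This is a minimal sketch; the bucket and folder names are the same placeholders used in the procedures that follow:

# Recursively copy the exported dump folder to the current directory.
gcloud storage cp --recursive gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME .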
gcloud
To export data from Cloud SQL to multiple files in parallel, complete the following steps:
- Create a Cloud Storage bucket.
Note: You don't have to create a folder in the bucket. If the folder doesn't exist, then Cloud SQL creates it for you as part of the process of exporting multiple files in parallel. However, if the folder exists, then it must be empty or the export operation fails.
- To find the service account for the Cloud SQL instance that you're exporting files from, use the gcloud sql instances describe command:

  gcloud sql instances describe INSTANCE_NAME

  Replace INSTANCE_NAME with the name of your Cloud SQL instance. In the output, look for the value that's associated with the serviceAccountEmailAddress field.
- To grant the storage.objectAdmin IAM role to the service account, use the gcloud storage buckets add-iam-policy-binding command (a sketch of this command appears after this procedure). For help with setting IAM permissions, see Use IAM permissions.
- To export data from Cloud SQL to multiple files in parallel, use the gcloud sql export sql command:
gcloud sql export sqlcommand:gcloud sql export sqlINSTANCE_NAME gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME \--offload \--parallel \--threads=THREAD_NUMBER \--database=DATABASE_NAME \--table=TABLE_EXPRESSION
Make the following replacements:
- INSTANCE_NAME: the name of the Cloud SQL instance from which you're exporting files in parallel.
- BUCKET_NAME: the name of the Cloud Storage bucket.
- BUCKET_PATH: the path to the bucket where the export files are stored.
- FOLDER_NAME: the folder where the export files are stored.
- THREAD_NUMBER: the number of threads that Cloud SQL uses to export files in parallel. For example, if you want to export three files at a time in parallel, then specify 3 as the value for this parameter.
- DATABASE_NAME (optional): the name of the databases inside the Cloud SQL instance from which the export is made. If you don't specify any databases, then Cloud SQL exports all databases for the instance.
- TABLE_EXPRESSION: the tables to export from the specified database.
Note: If you want to use serverless exports, then use the offload parameter. If you want to export multiple files in parallel, then use the parallel parameter. Otherwise, remove these parameters from the command.

The export sql command doesn't export triggers or stored procedures, but does export views. To export triggers or stored procedures, use a single thread for the export. This thread uses the mysqldump tool.

After the export completes, you should have files in a folder in the Cloud Storage bucket in the MySQL Shell dump format.
- If you don't need the IAM role that you set in Required roles and permissions for exporting from Cloud SQL for MySQL, then revoke it.
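The following is a minimal sketch of the service account lookup and role grant from the preceding procedure; SERVICE_ACCOUNT_EMAIL stands for the address returned by the describe command:

# Print only the instance's service account email address.
gcloud sql instances describe INSTANCE_NAME \
  --format="value(serviceAccountEmailAddress)"

# Grant that service account the storage.objectAdmin role on the bucket.
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
  --role=roles/storage.objectAdmin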
REST v1
To export data from Cloud SQL to multiple files in parallel, complete the following steps:
- Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME

Make the following replacements:
- BUCKET_NAME: the name of the bucket, subject to naming requirements. For example, my-bucket.
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you're creating.
- LOCATION_NAME: the location of the bucket where you want to store the files you're exporting. For example, us-east1.
Note: You don't have to create a folder in the bucket. If the folder doesn't exist, then Cloud SQL creates it for you as part of the process of exporting multiple files in parallel. However, if the folder exists, then it must be empty or the export operation fails.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Use IAM permissions.
- Export data from Cloud SQL to multiple files in parallel:
Before using any of the request data, make the following replacements:
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you created.
- INSTANCE_NAME: the name of the Cloud SQL instance from which you're exporting files in parallel.
- BUCKET_NAME: the name of the Cloud Storage bucket.
- BUCKET_PATH: the path to the bucket where the export files are stored.
- FOLDER_NAME: the folder where the export files are stored.
- DATABASE_NAME (optional): the name of the databases inside the Cloud SQL instance from which the export is made. If you don't specify any databases, then Cloud SQL exports all databases for the instance.
- THREAD_NUMBER: the number of threads that Cloud SQL uses to export files in parallel. For example, if you want to export three files at a time in parallel, then specify 3 as the value for this parameter.
Note: The offload parameter enables you to use serverless exports for up to 2 threads. The parallel parameter enables you to export multiple files in parallel. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.

HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/INSTANCE_NAME/export
Request JSON body:
{ "exportContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME", "databases": ["DATABASE_NAME"], "offload": [TRUE|FALSE], "sqlExportOptions": { "parallel": [TRUE|FALSE], "threads": [THREAD_NUMBER] } }}To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/INSTANCE_NAME/export"PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/INSTANCE_NAME/export" | Select-Object -Expand ContentYou should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/DESTINATION_INSTANCE_NAME",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_NAME",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/operations/OPERATION_ID",
  "targetProject": "PROJECT_NAME"
}

- If you don't need the IAM role that you set in Required roles and permissions for exporting from Cloud SQL for MySQL, then revoke it.
After the export completes, you should have files in a folder in the Cloud Storage bucket in the MySQL Shell dump format.
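To follow the progress of the export, you can poll the operation that the response returns. This is a minimal sketch, assuming OPERATION_ID is the value of the name field in the response:

curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/operations/OPERATION_ID"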
REST v1beta4
To export data from Cloud SQL to multiple files in parallel, complete the following steps:
- Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME

Make the following replacements:
- BUCKET_NAME: the name of the bucket, subject to naming requirements. For example, my-bucket.
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you're creating.
- LOCATION_NAME: the location of the bucket where you want to store the files you're exporting. For example, us-east1.
Note: You don't have to create a folder in the bucket. If the folder doesn't exist, then Cloud SQL creates it for you as part of the process of exporting multiple files in parallel. However, if the folder exists, then it must be empty or the export operation fails.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Use IAM permissions.
- Export data from Cloud SQL to multiple files in parallel:
Before using any of the request data, make the following replacements:
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you created.
- INSTANCE_NAME: the name of the Cloud SQL instance from which you're exporting files in parallel.
- BUCKET_NAME: the name of the Cloud Storage bucket.
- BUCKET_PATH: the path to the bucket where the export files are stored.
- FOLDER_NAME: the folder where the export files are stored.
- DATABASE_NAME (optional): the name of the databases inside the Cloud SQL instance from which the export is made. If you don't specify any databases, then Cloud SQL exports all databases for the instance.
- THREAD_NUMBER: the number of threads that Cloud SQL uses to export files in parallel. For example, if you want to export three files at a time in parallel, then specify 3 as the value for this parameter.
Note: The offload parameter enables you to use serverless exports for up to 2 threads. The parallel parameter enables you to export multiple files in parallel. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.

HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/INSTANCE_NAME/export
Request JSON body:
{ "exportContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME", "databases": ["DATABASE_NAME"], "offload": [TRUE|FALSE], "sqlExportOptions": { "parallel": [TRUE|FALSE], "threads": [THREAD_NUMBER] } }}To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/INSTANCE_NAME/export"PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/INSTANCE_NAME/export" | Select-Object -Expand ContentYou should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/DESTINATION_INSTANCE_NAME",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_NAME",
  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/operations/OPERATION_ID",
  "targetProject": "PROJECT_NAME"
}

- If you don't need the IAM role that you set in Required roles and permissions for exporting from Cloud SQL for MySQL, then revoke it.
After the export completes, you should have files in a folder in the Cloud Storage bucket in the MySQL Shell dump format.
Import data from multiple files in parallel to Cloud SQL for MySQL
The following sections contain information about importing data from multiple files in parallel to Cloud SQL for MySQL.
Required roles and permissions for importing data from multiple files in parallel to Cloud SQL for MySQL
To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:
- The Cloud SQL Admin role
- A custom role, including the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.import
Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin IAM role
- A custom role, including the following permissions:
  - storage.objects.get
  - storage.objects.list (for importing files in parallel only)
For help with IAM roles, see Identity and Access Management.
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Import data to Cloud SQL for MySQL
You can import data in parallel from multiple files that reside in Cloud Storage to your database. To do this, use the loadDump utility.
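Before you start the import, you can confirm that the dump folder in Cloud Storage is complete. This is a minimal sketch; a MySQL Shell dump normally includes metadata files such as @.json alongside the per-table data files:

# List the contents of the uploaded dump folder.
gcloud storage ls gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME/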
gcloud
To import data from multiple files in parallel into Cloud SQL, complete the following steps:
- Create a Cloud Storage bucket.
Upload the files to your bucket.
Note: Make sure that the files that you're uploading are in the MySQL Shell dump format. For more information, see Export data from multiple files in parallel.

For help with uploading files to buckets, see Upload objects from files.
- To find the service account for the Cloud SQL instance that you're importing files to, use the gcloud sql instances describe command:

  gcloud sql instances describe INSTANCE_NAME

  Replace INSTANCE_NAME with the name of your Cloud SQL instance. In the output, look for the value that's associated with the serviceAccountEmailAddress field.
- To grant the storage.objectAdmin IAM role to the service account, use the gcloud storage buckets add-iam-policy-binding command. For help with setting IAM permissions, see Use IAM permissions.
- To import data from multiple files in parallel into Cloud SQL, use the gcloud sql import sql command:
gcloud sql import sqlcommand:gcloud sql import sqlINSTANCE_NAME gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME \--parallel \--threads=THREAD_NUMBER \--database=DATABASE_NAME
Make the following replacements:
- INSTANCE_NAME: the name of the Cloud SQL instance to which you're importing files in parallel.
- BUCKET_NAME: the name of the Cloud Storage bucket.
- BUCKET_PATH: the path to the bucket where the import files are stored.
- FOLDER_NAME: the folder where the import files are stored.
- THREAD_NUMBER: the number of threads that Cloud SQL uses to import files in parallel. For example, if you want to import three files at a time in parallel, then specify 3 as the value for this parameter.
- DATABASE_NAME (optional): the name of the databases inside the Cloud SQL instance into which the import is made. If you don't specify any databases, then Cloud SQL imports all databases for the instance.
Note: If you want to import multiple files in parallel, then use the parallel parameter. Otherwise, remove this parameter from the command.
If the command returns an error like ERROR_RDBMS, then review the permissions; this error is often due to permissions issues.

- If you don't need the IAM permissions that you set in Required roles and permissions for importing to Cloud SQL for MySQL, then use gcloud storage buckets remove-iam-policy-binding to remove them (a sketch of this command appears after this procedure).
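The following is a minimal sketch of removing the binding from the last step of the preceding procedure; SERVICE_ACCOUNT_EMAIL stands for the service account of the instance:

# Revoke the storage.objectAdmin role that was granted for the import.
gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
  --role=roles/storage.objectAdmin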
REST v1
To import data from multiple files in parallel into Cloud SQL, complete the following steps:
- Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME

Make the following replacements:
- BUCKET_NAME: the name of the bucket, subject to naming requirements. For example, my-bucket.
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you're creating.
- LOCATION_NAME: the location of the bucket where you want to store the files you're importing. For example, us-east1.
Upload the files to your bucket.
Note: Make sure that the files that you're uploading are in the MySQL Shell dump format. For more information, see Export data from multiple files in parallel.

For help with uploading files to buckets, see Upload objects from files.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Use IAM permissions.
- Import data from multiple files in parallel into Cloud SQL:
Before using any of the request data, make the following replacements:
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you created.
- INSTANCE_NAME: the name of the Cloud SQL instance to which you're importing files in parallel.
- BUCKET_NAME: the name of the Cloud Storage bucket.
- BUCKET_PATH: the path to the bucket where the import files are stored.
- FOLDER_NAME: the folder where the import files are stored.
- DATABASE_NAME (optional): the name of the databases inside the Cloud SQL instance into which the import is made. If you don't specify any databases, then Cloud SQL imports all databases for the instance.
- THREAD_NUMBER: the number of threads that Cloud SQL uses to import files in parallel. For example, if you want to import three files at a time in parallel, then specify 3 as the value for this parameter.
Note: The offload parameter enables you to use serverless imports for up to 2 threads. The parallel parameter enables you to import multiple files in parallel. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/INSTANCE_NAME/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME", "databases": ["DATABASE_NAME"], "offload": [TRUE|FALSE], "sqlImportOptions": { "parallel": [TRUE|FALSE], "threads": [THREAD_NUMBER] } }}To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/INSTANCE_NAME/import"PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/INSTANCE_NAME/import" | Select-Object -Expand ContentYou should receive a JSON response similar to the following:
For the complete list of parameters for the request, see the Cloud SQL Admin API page.

Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/instances/DESTINATION_INSTANCE_NAME",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_NAME",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_NAME/operations/OPERATION_ID",
  "targetProject": "PROJECT_NAME"
}

- If you don't need the IAM permissions that you set in Required roles and permissions for importing to Cloud SQL for MySQL, then use gcloud storage buckets remove-iam-policy-binding to remove them.
REST v1beta4
To import data from multiple files in parallel into Cloud SQL, complete the following steps:
- Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME

Make the following replacements:
- BUCKET_NAME: the name of the bucket, subject to naming requirements. For example, my-bucket.
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you're creating.
- LOCATION_NAME: the location of the bucket where you want to store the files you're importing. For example, us-east1.
Upload the files to your bucket.
Note: Make sure that the files that you're uploading are in the MySQL Shell dump format. For more information, see Export data from multiple files in parallel.

For help with uploading files to buckets, see Upload objects from files.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Use IAM permissions.
- Import data from multiple files in parallel into Cloud SQL:
Before using any of the request data, make the following replacements:
- PROJECT_NAME: the name of the Google Cloud project that contains the Cloud Storage bucket you created.
- INSTANCE_NAME: the name of the Cloud SQL instance to which you're importing files in parallel.
- BUCKET_NAME: the name of the Cloud Storage bucket.
- BUCKET_PATH: the path to the bucket where the import files are stored.
- FOLDER_NAME: the folder where the import files are stored.
- DATABASE_NAME (optional): the name of the databases inside the Cloud SQL instance into which the import is made. If you don't specify any databases, then Cloud SQL imports all databases for the instance.
- THREAD_NUMBER: the number of threads that Cloud SQL uses to import files in parallel. For example, if you want to import three files at a time in parallel, then specify 3 as the value for this parameter.
Note: The offload parameter enables you to use serverless imports for up to 2 threads. The parallel parameter enables you to import multiple files in parallel. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/INSTANCE_NAME/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/BUCKET_PATH/FOLDER_NAME", "databases": ["DATABASE_NAME"], "offload": [TRUE|FALSE], "sqlImportOptions": { "parallel": [TRUE|FALSE], "threads": [THREAD_NUMBER] } } }To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/INSTANCE_NAME/import"PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/INSTANCE_NAME/import" | Select-Object -Expand ContentYou should receive a JSON response similar to the following:
For the complete list of parameters for the request, see the Cloud SQL Admin API page.

Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/instances/DESTINATION_INSTANCE_NAME",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_NAME",
  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_NAME/operations/OPERATION_ID",
  "targetProject": "PROJECT_NAME"
}

- If you don't need the IAM permissions that you set in Required roles and permissions for importing to Cloud SQL for MySQL, then use gcloud storage buckets remove-iam-policy-binding to remove them.
Limitations
If you specify too many threads when you import or export data from multiple files in parallel, then you might use more memory than your Cloud SQL instance has. If this occurs, then an internal error message appears. Check the memory usage of your instance and increase the instance's size, as needed. For more information, see About instance settings.
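For example, a minimal sketch of scaling the instance's machine resources with placeholder values; note that changing CPU or memory restarts the instance:

# Increase the instance's vCPUs and memory (causes a restart).
gcloud sql instances patch INSTANCE_NAME --cpu=4 --memory=16GiB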
When performing an export, commas in database names or table names in the databases or tables fields aren't supported.

Make sure that you have enough disk space for the initial dump file download. Otherwise, a no space left on disk error appears.

If your instance has only one virtual CPU (vCPU), then you can't import or export multiple files in parallel. The number of vCPUs for your instance can't be smaller than the number of threads that you're using for the import or export operation, and the number of threads must be at least two.
Multi-threaded (parallel) imports and exports aren't compatible with single-threaded imports and exports. For example, dump files generated by a single-threaded export can only be imported by single-threaded imports. Similarly, dump files generated by parallel exports can only be imported by parallel imports.
If you write data definition language (DDL) statements such as CREATE, DROP, or ALTER during an export operation, then the operation might fail or the exported data might be inconsistent with the point-in-time recovery snapshot.

If an import operation fails, then you might have partially imported data remaining. MySQL commits DDL statements automatically. If this occurs, then before you import the data again, clean up the DDL statements and the data.
As with a single-database parallel import operation, before you run a parallel import operation for an entire instance, make sure that all databases have finished creation.
What's next
- Learn how to check the status of import and export operations.
- Learn how to cancel the import and export of data.
- Learn about best practices for importing and exporting data.
- Learn about known issues for imports and exports.