Export and import using CSV files


This page describes how to export data from and import data into Cloud SQL instances using CSV files.

Note: If you're migrating an entire database from a supported database server (on-premises, in AWS, or Cloud SQL) to a new Cloud SQL instance, you can use the Database Migration Service instead of exporting and then importing files. If you're exporting because you want to create a new instance from the exported file, consider restoring from a backup to a different instance or cloning the instance.

You can cancel the import of CSV data into Cloud SQL instances and the export of CSV data from them. For more information about canceling an import or export operation, see Cancel the import and export of data.

Before you begin

Before you begin an export or import operation:

Export data from Cloud SQL for PostgreSQL

Required roles and permissions for exporting from Cloud SQL for PostgreSQL

To export data from Cloud SQL into Cloud Storage, the user initiating the export must have one of the following roles:

Additionally, the service account for the Cloud SQL instance must have one of the following roles:

  • The storage.objectAdmin Identity and Access Management (IAM) role
  • A custom role, including the following permissions:
    • storage.objects.create
    • storage.objects.list (for exporting files in parallel only)
    • storage.objects.delete (for exporting files in parallel only)

For help with IAM roles, see Identity and Access Management.

Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Export data to a CSV file from Cloud SQL for PostgreSQL

You can export your data in CSV format, which is usable by other tools and environments. Exports happen at the database level. During a CSV export, you can specify the schemas to export. All schemas at the database level are eligible for export.

Note: Cloud SQL uses double quotes (hex value "22") as the default escape character. This can be a problem for databases where values for NULL are entered as string literals. When you import a file that was exported using the default escape character, the value isn't treated as NULL but as the string "NULL". We recommend that you use --escape="5C" to override the default when you export the file.

Note: You cannot export to a CSV file from a read replica instance. The export operation creates an export user and grants that user select permissions on the database that the user wants to export. Because read replica instances run in read-only mode, these operations fail.

To export data from a database on a Cloud SQL instance to a CSV file in a Cloud Storage bucket:

Console

  1. In the Google Cloud console, go to the Cloud SQL Instances page.

    Go to Cloud SQL Instances

  2. To open the Overview page of an instance, click the instance name.
  3. Click Export.
  4. Select Offload export to allow other operations to occur while the export is ongoing.
  5. In the Cloud Storage export location section, add the name of the bucket, folder, and file that you want to export, or click Browse to find or create a bucket, folder, or file.

    If you click Browse:

    1. In the Location section, select a Cloud Storage bucket or folder for your export.

      Note: If you need to create a new storage bucket, click the bucket with the + sign in the Location section. This opens Create a bucket, where you can add a name and other configurations for the bucket. Select a standard storage class for your bucket.

    2. In the Name box, add a name for the CSV file, or select an existing file from the list in the Location section.

      You can use a file extension of .gz (the complete extension would be .csv.gz) to compress your export file.

    3. Click Select.
  6. In the Format section, click CSV.
  7. In the Database for export section, select the name of the database from the drop-down menu.
  8. For SQL query, enter a SQL query to specify the table to export data from.

    For example, to export the entire contents of the entries table in the guestbook database, you enter

    SELECT * FROM guestbook.entries;

    Your query must specify a table in the specified database. You can't export an entire database in CSV format.

    Note: While in transit, the query might be processed in intermediate locations other than the location of the target instance.

  9. Click Export to start the export.
  10. The Export database? box opens with a message that the export process can take an hour or more for large databases. During the export, the only operation you can perform on the instance is viewing information. After the export starts, you can cancel the operation. If this is a good time to start an export, click Export. Otherwise, click Cancel.

gcloud

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Find the service account for the Cloud SQL instance you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
    gcloud sql instances describe INSTANCE_NAME
  4. Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the Cloud SQL instance service account. For help with setting IAM permissions, see Using IAM permissions.
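
    For example, the following sketch grants the role on your bucket; BUCKET_NAME and SERVICE_ACCOUNT_EMAIL are placeholders for your bucket and the serviceAccountEmailAddress value from the previous step:

    gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
    --role=roles/storage.objectAdmin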
  5. Export the database:

    Note: Use the --offload flag if you want to use serverless export. Otherwise, remove it from the following command.

    gcloud sql export csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
    --database=DATABASE_NAME \
    --offload \
    --query=SELECT_QUERY

    For information about using the export csv command, see the sql export csv command reference page.

    Note: While in transit, the SELECT_QUERY might be processed in intermediate locations other than the location of the target instance.

  6. If you do not need to retain the IAM role you set previously, revoke it now.
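
    A sketch of revoking the role granted earlier, assuming the same placeholder bucket and service account email:

    gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
    --role=roles/storage.objectAdmin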

REST v1

  1. Create a bucket for the export:
    gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Provide your instance with the legacyBucketWriter IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
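
    One way to grant this role (a sketch, assuming the gcloud CLI, a placeholder bucket name, and your instance's service account email):

    gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
    --role=roles/storage.legacyBucketWriter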
  3. Export your database:

    Before using any of the request data, make the following replacements:

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export

    Request JSON body:

    { "exportContext":   {      "fileType": "CSV",      "uri": "gs://bucket_name/path_to_csv_file",      "databases": "database_name",      "offload":true |false      "csvExportOptions":       {           "selectQuery":"select_query",           "escapeCharacter":"escape_character",           "quoteCharacter":"quote_character",           "fieldsTerminatedBy":"fields_terminated_by",           "linesTerminatedBy":"lines_terminated_by"       }   }}

    To send your request, use one of the following options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "operation-id",  "targetId": "instance-id",  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",  "targetProject": "project-id"}

    You must specify exactly one database with the databases property, and if the select query specifies a database, it must be the same.

  4. If you do not need to retain the IAM permissions you set previously, remove them now.
For the complete list of parameters for the request, see the instances:export page.

REST v1beta4

  1. Create a bucket for the export:
    gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

    This step is not required, but strongly recommended, so you do not open up access to any other data.

  2. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  3. Export your database:

    Before using any of the request data, make the following replacements:

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export

    Request JSON body:

    { "exportContext":   {      "fileType": "CSV",      "uri": "gs://bucket_name/path_to_csv_file",      "databases": "database_name",      "offload":true |false      "csvExportOptions":       {           "selectQuery": "select_query",           "escapeCharacter":  "escape_character",           "quoteCharacter": "quote_character",           "fieldsTerminatedBy": "fields_terminated_by",           "linesTerminatedBy": "lines_terminated_by"       }   }}

    To send your request, use one of the following options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "operation-id",  "targetId": "instance-id",  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id",  "targetProject": "project-id"}

    You must specify exactly one database with the databases property, and if the select query specifies a database, it must be the same.

  4. If you do not need to retain the IAM role you set previously, revoke it now.
For the complete list of parameters for the request, see the instances:export page.

Customize the format of a CSV export file

You can use gcloud or the REST API to customize your CSV file format. When you perform an export, you can specify the following formatting options:

  • Escape
    • Default value: "5C" (ASCII hex code for backslash)
    • gcloud flag: --escape
    • REST API property: escapeCharacter
    • Description: Character that appears before a data character that needs to be escaped. Available only for MySQL and PostgreSQL.
  • Quote
    • Default value: "22" (ASCII hex code for double quotes)
    • gcloud flag: --quote
    • REST API property: quoteCharacter
    • Description: Character that encloses values from columns that have a string data type. Available only for MySQL and PostgreSQL.
  • Field delimiter
    • Default value: "2C" (ASCII hex code for comma)
    • gcloud flag: --fields-terminated-by
    • REST API property: fieldsTerminatedBy
    • Description: Character that splits column values. Available only for MySQL and PostgreSQL.
  • Newline character
    • Default value: "0A" (ASCII hex code for newline)
    • gcloud flag: --lines-terminated-by
    • REST API property: linesTerminatedBy
    • Description: Character that splits line records. Available only for MySQL.

For example, a gcloud command using all of these arguments could look like the following:

gcloud sql export csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--offload \
--query=SELECT_QUERY \
--quote="22" \
--escape="5C" \
--fields-terminated-by="2C" \
--lines-terminated-by="0A"

The equivalent REST API request body would look like this:

{"exportContext":{"fileType":"CSV","uri":"gs://bucket_name/path_to_csv_file","databases":"DATABASE_NAME","offload":true,"csvExportOptions":{"selectQuery":"SELECT_QUERY","escapeCharacter":"5C","quoteCharacter":"22","fieldsTerminatedBy":"2C","linesTerminatedBy":"0A"}}}

CSV export creates standard CSV output by default. If you need even more options than Cloud SQL provides, you can use the following statement in a psql client:

\copy [table_name] TO '[csv_file_name].csv' WITH (FORMAT csv, ESCAPE '[escape_character]', QUOTE '[quote_character]', DELIMITER '[delimiter_character]', ENCODING 'UTF8', NULL '[null_marker_string]');
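
For example, a hypothetical invocation that exports the entries table from the guestbook example, using a backslash escape and the literal string NULL as the null marker (the table and file names are placeholders):

\copy entries TO 'entries.csv' WITH (FORMAT csv, ESCAPE '\', QUOTE '"', DELIMITER ',', ENCODING 'UTF8', NULL 'NULL');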

Import data to Cloud SQL for PostgreSQL

Required roles and permissions for importing to Cloud SQL for PostgreSQL

To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:

Additionally, the service account for the Cloud SQL instance must have one of the following roles:

  • The storage.objectAdmin IAM role
  • A custom role, including the following permissions:
    • storage.objects.get
    • storage.objects.list (for importing files in parallel only)

For help with IAM roles, see Identity and Access Management.

Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Import data from a CSV file to Cloud SQL for PostgreSQL

Note: The default behavior for Cloud SQL is to import columns in your CSV file in the same order as the table schema. If the order is different in your CSV file or if some columns are skipped, use importContext.csvImportOptions.columns[] to import data from the CSV file, as shown in the sketch that follows.

Note: While you can import a CSV file that was created using one database engine into an instance created using another database engine, it's not recommended. If you choose to do so, make sure that the CSV file format is compatible with the database to which you want to import.
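
A minimal sketch of a request body that uses the columns[] property (the column names here are placeholders; the listed columns are presumably mapped to the CSV fields in the order given):

{
  "importContext": {
    "fileType": "CSV",
    "uri": "gs://bucket_name/path_to_csv_file",
    "database": "database_name",
    "csvImportOptions": {
      "table": "table_name",
      "columns": ["column_1", "column_2"]
    }
  }
}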

CSV file format requirements

CSV files must have one line for each row of data and use comma-separated fields.
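
For example, a two-column file for the guestbook entries example might look like the following (an illustrative sketch; the values are invented):

"Alice","First entry"
"Bob","Second entry"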

To import data to a Cloud SQL instance using a CSV file:

Console

  1. In the Google Cloud console, go to the Cloud SQL Instances page.

    Go to Cloud SQL Instances

  2. To open the Overview page of an instance, click the instance name.
  3. Click Import.
  4. In the Choose the file you'd like to import data from section, enter the path to the bucket and CSV file to use for the import. Or to browse to the file:
    1. Click Browse.
    2. In the Location section, double-click the name of the bucket in the list.
    3. Select the file in the list.
    4. Click Select.

    You can import a compressed (.gz) or an uncompressed (.csv) file.

  5. In the Format section, select CSV.
  6. Specify the Database and Table in your Cloud SQL instance where you want to import the CSV file.
  7. Optionally, specify a user for the import operation.
  8. Click Import to start the import.

gcloud

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Identify the service account for the Cloud SQL instance you're importing to. You can do this by running the gcloud sql instances describe command with the instance name. Look for the serviceAccountEmailAddress field in the output.
    gcloud sql instances describe INSTANCE_NAME
  4. Copy the serviceAccountEmailAddress field.
  5. Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the Cloud SQL instance service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
  6. Import the file:

    gcloud sql import csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
    --database=DATABASE_NAME \
    --table=TABLE_NAME

    For information about using the import csv command, see the sql import csv command reference page.

  7. If you do not need to retain the IAM permissions you set previously, remove them using gcloud storage buckets remove-iam-policy-binding.

REST v1

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import the file:

    Before using any of the request data, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_csv_file: The path to the CSV file
    • database_name: The name of a database inside the Cloud SQL instance
    • table_name: The name of the database table
    • escape_character: The character that should appear before a data character that needs to be escaped. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
    • quote_character: The character that encloses values from columns that have a string data type. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
    • fields_terminated_by: The character that splits column values. The value of this argument must be in ASCII hex format. For example, "2C" represents a comma. (optional)
    • lines_terminated_by: The character that splits line records. The value of this argument must be in ASCII hex format. For example, "0A" represents a newline. (optional)

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import

    Request JSON body:

    { "importContext":   {      "fileType": "CSV",      "uri": "gs://bucket_name/path_to_csv_file",      "database": "database_name",      "csvImportOptions":       {         "table": "table_name",         "escapeCharacter": "escape_character",         "quoteCharacter": "quote_character",         "fieldsTerminatedBy": "fields_terminated_by",         "linesTerminatedBy": "lines_terminated_by"       }   }}

    To send your request, use one of the following options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "operation-id",  "targetId": "instance-id",  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",  "targetProject": "project-id"}
    For the complete list of parameters for the request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove the permissions.

REST v1beta4

  1. Create a Cloud Storage bucket.
  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import the file:

    Before using any of the request data, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_csv_file: The path to the CSV file
    • database_name: The name of a database inside the Cloud SQL instance
    • table_name: The name of the database table
    • escape_character: The character that should appear before a data character that needs to be escaped. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
    • quote_character: The character that encloses values from columns that have a string data type. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
    • fields_terminated_by: The character that splits column values. The value of this argument must be in ASCII hex format. For example, "2C" represents a comma. (optional)
    • lines_terminated_by: The character that splits line records. The value of this argument must be in ASCII hex format. For example, "0A" represents a newline. (optional)

    HTTP method and URL:

    POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    { "importContext":   {      "fileType": "CSV",      "uri": "gs://bucket_name/path_to_csv_file",      "database": "database_name",      "csvImportOptions":       {         "table": "table_name",         "escapeCharacter": "escape_character",         "quoteCharacter": "quote_character",         "fieldsTerminatedBy": "fields_terminated_by",         "linesTerminatedBy": "lines_terminated_by"       }   }}

    To send your request, use one of the following options:

    curl (Linux, macOS, or Cloud Shell)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"

    PowerShell (Windows)

    Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

    Save the request body in a file named request.json, and execute the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json; charset=utf-8" `
    -InFile request.json `
    -Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content

    You should receive a JSON response similar to the following:

    Response

    {  "kind": "sql#operation",  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id",  "status": "PENDING",  "user": "user@example.com",  "insertTime": "2020-01-21T22:43:37.981Z",  "operationType": "UPDATE",  "name": "operation-id",  "targetId": "instance-id",  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id",  "targetProject": "project-id"}
    For the complete list of parameters for the request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove the permissions.

Customize the format of a CSV file for Cloud SQL for PostgreSQL

You can use gcloud or the REST API to customize your CSV file format.

A sample gcloud command follows:

gcloud sql import csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--table=TABLE_NAME \
--quote="22" \
--escape="5C" \
--fields-terminated-by="2C" \
--lines-terminated-by="0A"

The equivalent REST API request body would look like this:

{"importContext":{"fileType":"CSV","uri":"gs://bucket_name/path_to_csv_file","database":"DATABASE_NAME","csvImportOptions":{"table":"TABLE_NAME","escapeCharacter":"5C","quoteCharacter":"22","fieldsTerminatedBy":"2C","linesTerminatedBy":"0A"}}}
Note:If you use custom format options in your importcommands, make sure the exported file was created with the same options.

If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
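
One way to inspect the bucket's IAM bindings while troubleshooting permissions (a sketch using the gcloud CLI with a placeholder bucket name):

gcloud storage buckets get-iam-policy gs://BUCKET_NAME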

To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.
