Create an instance in a private network and then import a database


Migrating a workload from another platform to Cloud SQL for PostgreSQL often involves using the Google Cloud console to import data from a SQL dump file that you export from your previous environment.

This tutorial shows you how to create the Google Cloud resources that you need and then import a SQL database to a Cloud SQL for PostgreSQL instance. The tutorial demonstrates best practices for migrating to Cloud SQL for PostgreSQL, including the use of a Virtual Private Cloud (VPC) network with private services access and enabling private IP for your Cloud SQL instance.

As you work through the steps, retain the default values for settings unless otherwise specified.

Objectives

  1. Download a sample SQL dump file.
  2. Create a new Virtual Private Cloud network with private services access.
  3. Create a Cloud Storage bucket and upload a SQL dump file to it.
  4. Create a Cloud SQL for PostgreSQL instance configured for private IP.
  5. Create a user.
  6. Create a destination database.
  7. Import from the dump file to a new database.
  8. Verify that the database was successfully imported by viewing the structure and running a query.

Costs

In this document, you use the following billable components of Google Cloud:

To generate a cost estimate based on your projected usage, use the pricing calculator.

New Google Cloud users might be eligible for a free trial.

When you finish the tasks that are described in this document, you can avoid continued billing by deleting the resources that you created. For more information, see Clean up.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
    Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

    Go to project selector

  3. If you're using an existing project for this guide, verify that you have the permissions required to complete it. If you created a new project, then you already have the required permissions.

  4. Verify that billing is enabled for your Google Cloud project.

  5. Enable the Cloud SQL, Cloud SQL Admin, Compute Engine, and Cloud Storage APIs.

    Roles required to enable APIs

    To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.

    Enable the APIs
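
As an alternative to the console button, you can enable the same APIs from a terminal with the gcloud CLI. This sketch assumes that you have the Google Cloud CLI installed and authenticated against your project; servicenetworking.googleapis.com is included as an assumption because the private services access steps later in this tutorial depend on it.

```shell
# Enable the APIs used in this tutorial (run once per project).
gcloud services enable \
    sqladmin.googleapis.com \
    compute.googleapis.com \
    storage.googleapis.com \
    servicenetworking.googleapis.com
```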

  6. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains theresourcemanager.projects.create permission.Learn how to grant roles.
    Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

    Go to project selector

  7. If you're using an existing project for this guide,verify that you have the permissions required to complete this guide. If you created a new project, then you already have the required permissions.

  8. Verify that billing is enabled for your Google Cloud project.

  9. Enable the Cloud SQL, Cloud SQL Admin, Compute Engine, CloudStorage APIs.

    Roles required to enable APIs

    To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains theserviceusage.services.enable permission.Learn how to grant roles.

    Enable the APIs

Required roles

To get the permissions that you need to complete this tutorial, ask your administrator to grant you the following IAM roles on your project:

For more information about granting roles, see Manage access to projects, folders, and organizations.

You might also be able to get the required permissions through custom roles or other predefined roles.

Obtain a sample database dump file

For this tutorial, you'll use a small sample database that contains country codes and world capitals.

Download the SQL file for the countries database to your local environment: countries-postgres.sql.

Create a network with private services access

Before you create the rest of your resources, create a VPC network to run your services on. Private services access lets you restrict access to your Cloud SQL database by establishing a private connection between your external network and Cloud SQL for PostgreSQL using internal IPv4 addresses.

  1. In the Google Cloud console, go to the VPC networks page.

    Go to the VPC networks page

  2. Click Create VPC network.

  3. In the name field, enter tutorial-network.

  4. For Subnet creation mode, select Custom.

  5. In the name field, enter tutorial-subnet.

  6. Select a region near you.

  7. For IPv4 range, enter 10.0.0.0/24.

  8. For Private Google Access, select On.

  9. Click Done.

  10. At the bottom of the page, click Create.

After the VPC network creation process finishes, you can configure private services access for the network.

  1. On the VPC networks screen, click tutorial-network.
  2. In the menu bar for tutorial-network, click Private services access.
  3. Click Allocated IP ranges for services.
  4. Click Allocate IP range.
  5. For Name, enter tutorial-range.
  6. For IP address range, select Custom.
  7. In the Range field, enter 192.168.0.0/20.
  8. Click Allocate.
  9. In the submenu, click Private connections to services.
  10. Click Create connection.
  11. In the Assigned allocation drop-down, select tutorial-range.
  12. Click Connect. In a minute or two, the Google Cloud console displays a message letting you know that you have successfully created a private connection.
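
If you prefer the command line, the console steps in this section map roughly to the following gcloud commands. The region us-central1 is an assumption; substitute the region that you selected.

```shell
# Custom-mode VPC network.
gcloud compute networks create tutorial-network --subnet-mode=custom

# Subnet with Private Google Access enabled (region is an assumption).
gcloud compute networks subnets create tutorial-subnet \
    --network=tutorial-network \
    --region=us-central1 \
    --range=10.0.0.0/24 \
    --enable-private-ip-google-access

# Allocate the 192.168.0.0/20 range for private services access.
gcloud compute addresses create tutorial-range \
    --global \
    --purpose=VPC_PEERING \
    --addresses=192.168.0.0 \
    --prefix-length=20 \
    --network=tutorial-network

# Create the private connection to Google services.
gcloud services vpc-peerings connect \
    --service=servicenetworking.googleapis.com \
    --ranges=tutorial-range \
    --network=tutorial-network
```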

Create a Cloud Storage bucket

Next, create a Cloud Storage bucket to store the SQL dump file that you downloaded earlier. The Cloud SQL import tool expects the dump file to be in a bucket. A Cloud Storage bucket must have a globally unique name.

  1. In the Google Cloud console, go to the Cloud Storage Buckets page.

    Go to Buckets

  2. Click Create.
  3. For the bucket name, create a globally unique name consisting of lowercase letters, numbers, and hyphens. You might want to use a random string generator, such as random.org/strings, to generate the name. Note the name that you choose.
  4. Click Continue.
  5. Under Location type, select Region. This is the lowest-cost option.
  6. Choose the same region that you chose for your subnet. Locating your Google Cloud resources in the same region reduces latency, improves speed, lowers data transfer costs, and simplifies networking.
  7. Click Continue.
  8. Click Create.
  9. If prompted with a dialog, leave Enforce public access prevention on this bucket selected, and click Confirm.

The Bucket details page for the new bucket opens with the Objects pane selected.
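
From a terminal, a roughly equivalent command is the following sketch; BUCKET_NAME is a placeholder for the unique name that you chose, and us-central1 is an assumed region.

```shell
# Create a regional bucket; bucket names are globally unique.
gcloud storage buckets create gs://BUCKET_NAME \
    --location=us-central1
```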

Upload your dump file to the bucket

Upload to your bucket the sample SQL dump file that you downloaded earlier.

  1. On the Objects tab, click Upload and then click Upload files.
  2. Navigate to and select the countries-postgres.sql file.
  3. Click Open. Cloud Storage uploads the dump file to the bucket.
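
The upload can also be sketched as a single gcloud command, assuming the dump file is in your current directory and BUCKET_NAME is the placeholder from the previous section.

```shell
# Copy the dump file from the local machine to the bucket.
gcloud storage cp countries-postgres.sql gs://BUCKET_NAME/
```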

Create a Cloud SQL instance

Create a Cloud SQL instance in the Google Cloud console using the following settings. For all other settings, keep the defaults. Although you don't need to enable private IP for the import process, using private IP for a production workload is a best practice.

  1. Go to the Cloud SQL Instances page in the Google Cloud console.
    Go to the Cloud SQL Instances page
  2. Click Create Instance.
  3. Click Choose PostgreSQL.
  4. For Choose a Cloud SQL edition, choose Enterprise.
  5. For Edition preset, select Sandbox.
  6. For Instance ID, enter tutorial-instance.
  7. Choose and enter a password for the default user account and save it for future use.
  8. Choose the same region that you chose for your subnet and bucket.
  9. For Zonal availability, select Single zone.
  10. Expand Show configuration options.
  11. Expand Connections.
  12. Clear Public IP.
  13. Select Private IP.
  14. From the Network drop-down, select tutorial-network. This places the new Cloud SQL instance in the private network that you created earlier.
  15. Click Create instance, and then wait until the instance initializes and starts. The initialization process can take more than five minutes.

Tip: You can monitor the status of Google Cloud console operations, including your instance-creation operation, in the long-running operation drawer at the bottom of the Google Cloud console.
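
The console flow above can be approximated with one gcloud command. The PostgreSQL version, the machine tier (db-custom-1-3840, one vCPU and 3.75 GB of RAM), the region us-central1, and the DEFAULT_USER_PASSWORD placeholder are all assumptions; the console's Sandbox preset may pick different values.

```shell
# Private-IP-only PostgreSQL instance attached to tutorial-network.
gcloud sql instances create tutorial-instance \
    --database-version=POSTGRES_15 \
    --edition=enterprise \
    --tier=db-custom-1-3840 \
    --region=us-central1 \
    --availability-type=zonal \
    --no-assign-ip \
    --network=tutorial-network \
    --root-password=DEFAULT_USER_PASSWORD
```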

Add a user

Before you can read or write to a database, you must create a database user that is different from the default user.

  1. In the SQL navigation menu, click Users.
  2. Click Add user account.
  3. In the pane that opens, select Built-in authentication.
  4. In the User name field, enter tutorial-user.
  5. Enter a password for the new user. Save this password for future use.
  6. Click Add.
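
A command-line equivalent is sketched below; USER_PASSWORD is a placeholder for the password that you chose.

```shell
# Create a built-in-authentication user on the instance.
gcloud sql users create tutorial-user \
    --instance=tutorial-instance \
    --password=USER_PASSWORD
```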

Create a destination database

The import workflow requires you to select a destination database to import to, so you need to create an empty database.

Note: The sample SQL file that you're using for this tutorial creates its own database and overrides the destination database that you choose in the console, but not all dump files work this way.

  1. In the SQL navigation menu, click Databases.
  2. Click Create database.
  3. For Database Name, type countries.
  4. Click Create.
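
The same empty database can be created from a terminal:

```shell
# Create the empty destination database on the instance.
gcloud sql databases create countries --instance=tutorial-instance
```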

Import from the dump file

Now you're ready to import the countries database using the countries-postgres.sql dump file that you uploaded to your Cloud Storage bucket.

  1. In the SQL navigation menu, click Overview.
  2. On the overview page, click Import.
  3. Under File format, select SQL.
  4. Under Select source file, choose Select file from Google Cloud Storage.
  5. Click Browse.
  6. Expand the storage bucket that you created earlier.
  7. Click countries-postgres.sql.
  8. Click Select.
  9. In the Destination section, click the Database drop-down and then select countries.
  10. Click Import.

When the import process is complete and the countries database has been imported to Cloud SQL for PostgreSQL, a success message is displayed.
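
A CLI sketch of the same import follows; BUCKET_NAME is the placeholder for your bucket. Note that, unlike the console flow, the command-line path may require you to first grant the instance's service account read access to the bucket.

```shell
# Import the dump file from Cloud Storage into the countries database.
gcloud sql import sql tutorial-instance \
    gs://BUCKET_NAME/countries-postgres.sql \
    --database=countries
```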

Validate the imported data in Cloud SQL for PostgreSQL

After the import operation is complete, you can verify that the database was imported by connecting to it using Cloud SQL Studio, inspecting the list of tables, and running a test query against the data.

Authenticate to Cloud SQL Studio

With the user account that you created earlier, connect to the new database using Cloud SQL Studio.

  1. In the SQL navigation menu, click Cloud SQL Studio. A login dialog is displayed.
  2. In the Database drop-down, choose countries.
  3. Select Built-in database authentication.
  4. In the User drop-down, select tutorial-user.
  5. In the Password field, enter the password that you chose for the user in the Add a user section.
  6. Click Authenticate. Cloud SQL Studio opens.

View and query the tables

  1. In the Explorer pane, examine the countries database and confirm that the database has two tables: capitals and country_codes.
  2. Click Untitled Query to open the query editor.
  3. Paste the following code into the query editor:

    SELECT "capitals"."country_capital", "country_codes"."country_name"
    FROM "capitals"
    JOIN "country_codes"
      ON "capitals"."alpha_2_code" = "country_codes"."alpha_2_code"
    ORDER BY "capitals"."country_capital";

  4. Click Run.

The results pane displays an alphabetical list of world capitals and their countries.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, do one of the following:

  • Delete the project that contains the resources.
  • Keep the project and delete the individual resources.

Delete the project

    Caution: Deleting a project has the following effects:
    • Everything in the project is deleted. If you used an existing project for the tasks in this document, when you delete it, you also delete any other work you've done in the project.
    • Custom project IDs are lost. When you created this project, you might have created a custom project ID that you want to use in the future. To preserve the URLs that use the project ID, such as an appspot.com URL, delete selected resources inside the project instead of deleting the whole project.

    If you plan to explore multiple architectures, tutorials, or quickstarts, reusing projects can help you avoid exceeding project quota limits.

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete individual resources

If you want to keep the project but avoid incurring charges, delete the Cloud SQL instance, Cloud Storage bucket, and VPC network that you created during the tutorial.

Delete the Cloud SQL instance

First, disable deletion protection and then delete the tutorial Cloud SQL instance:

  1. In the SQL navigation menu, click Overview.
  2. Click Edit.
  3. Expand the Data Protection section.
  4. In Instance deletion protection, deselect all options.
  5. Click Save. When the operation is complete, Delete is selectable.
  6. Click Delete. A dialog appears.
  7. In the Instance ID field, enter tutorial-instance.
  8. Click Delete.
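
The same two-step teardown (disable protection, then delete) can be sketched from the CLI:

```shell
# Deletion protection must be turned off before the instance can be deleted.
gcloud sql instances patch tutorial-instance --no-deletion-protection
gcloud sql instances delete tutorial-instance
```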

Delete the Cloud Storage bucket

Next, delete the storage bucket and its contents.

  1. In the main Google Cloud console navigation menu, go to Cloud Storage > Buckets.
  2. Select the box next to the name of the bucket that you created earlier.
  3. Click Delete.
  4. In the dialog, confirm deletion by typing DELETE, and then click Delete.
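
From a terminal, one command removes the bucket and everything in it; BUCKET_NAME is the placeholder for your bucket.

```shell
# Delete the bucket and all objects inside it.
gcloud storage rm --recursive gs://BUCKET_NAME
```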

Delete the VPC network

Now that you've deleted the resources in your VPC network, you are ready to delete the network. Before you can delete the VPC network, you must delete the peering connection that was created automatically when you configured private services access.

  1. In the main navigation menu, go to VPC networks > VPC network peering.
  2. Select the box next to the peering connection with the VPC network tutorial-network.
  3. Click Delete.
  4. In the dialog, confirm by clicking Delete.

Now that the peering connection has been removed, you can delete the VPC network.

  1. In the VPC Network menu, click VPC networks.
  2. Click tutorial-network to open the details page.
  3. Click Delete VPC network.
  4. In the dialog, confirm deletion by typing tutorial-network and then clicking Delete.
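
The full network teardown can be sketched from the CLI in the same order: peering first, then the allocated range, subnet, and network. The region us-central1 is an assumption.

```shell
# Remove the private services connection (the peering).
gcloud services vpc-peerings delete \
    --service=servicenetworking.googleapis.com \
    --network=tutorial-network

# Release the allocated IP range.
gcloud compute addresses delete tutorial-range --global

# Delete the subnet, then the network itself.
gcloud compute networks subnets delete tutorial-subnet --region=us-central1
gcloud compute networks delete tutorial-network
```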

What's next

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-17 UTC.