Create an instance in a private network and then import a database
Migrating a workload from another platform to Cloud SQL for SQL Server often involves using the Google Cloud console to import data from a SQL dump file that you export from your previous environment.
This tutorial shows you how to create the Google Cloud resources that you need and then import a SQL database to a Cloud SQL for SQL Server instance. The tutorial demonstrates best practices when migrating to Cloud SQL for SQL Server, including the use of a Virtual Private Cloud (VPC) network with private services access and enabling private IP for your Cloud SQL instance.
As you work through the steps, retain the default values for settings unless otherwise specified.
Objectives
- Download a sample SQL dump file.
- Create a new Virtual Private Cloud network with private services access.
- Create a Cloud Storage bucket and upload a SQL dump file to it.
- Create a Cloud SQL for SQL Server instance configured for private IP.
- Create a destination database.
- Import from the dump file to a new database.
- Verify that the database was successfully imported by viewing the structure and running a query.
Costs
In this document, you use billable components of Google Cloud, including Cloud SQL and Cloud Storage.
To generate a cost estimate based on your projected usage, use the pricing calculator.
When you finish the tasks that are described in this document, you can avoid continued billing by deleting the resources that you created. For more information, see Clean up.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
  Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.
  Roles required to select or create a project:
  - Select a project: Selecting a project doesn't require a specific IAM role; you can select any project that you've been granted a role on.
  - Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
  If you're using an existing project for this guide, verify that you have the permissions required to complete this guide. If you created a new project, then you already have the required permissions.
- Verify that billing is enabled for your Google Cloud project.
- Enable the Cloud SQL, Cloud SQL Admin, Compute Engine, and Cloud Storage APIs.
  Roles required to enable APIs: To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.
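If you prefer the command line, you can typically enable the same APIs with the gcloud CLI. The following is a minimal sketch, assuming you run it in Cloud Shell or with an authenticated gcloud installation; PROJECT_ID is a placeholder, and the service names shown are the commonly used ones for these products, so verify them if a command fails.

```bash
# Sketch: set the active project and enable the APIs used in this tutorial.
# PROJECT_ID is a placeholder for your project ID.
gcloud config set project PROJECT_ID

# sqladmin          = Cloud SQL Admin API
# compute           = Compute Engine API
# storage           = Cloud Storage API
# servicenetworking = Service Networking API (used later for private services access)
gcloud services enable \
    sqladmin.googleapis.com \
    compute.googleapis.com \
    storage.googleapis.com \
    servicenetworking.googleapis.com
```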
Required roles
To get the permissions that you need to complete this tutorial, ask your administrator to grant you the following IAM roles on your project:
- Cloud SQL Admin (roles/cloudsql.admin)
- Storage Admin (roles/storage.admin)
- Compute Network Admin (roles/compute.networkAdmin)
For more information about granting roles, see Manage access to projects, folders, and organizations.
You might also be able to get the required permissions through custom roles or other predefined roles.
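If you administer the project yourself, a hedged sketch of granting these roles with the gcloud CLI follows; PROJECT_ID and USER_EMAIL are placeholders, and you need permission to modify the project's IAM policy to run it.

```bash
# Sketch: grant the three tutorial roles to a user account.
# Replace PROJECT_ID and USER_EMAIL with your own values.
for ROLE in roles/cloudsql.admin roles/storage.admin roles/compute.networkAdmin; do
  gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="user:USER_EMAIL" \
      --role="$ROLE"
done
```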
Obtain a sample database dump file
For this tutorial, you'll use a small sample database that contains country codes and world capitals.
Download the SQL file for the countries database to your local environment: countries-sqlserver.sql.
Create a network with private services access
Before you create the rest of your resources, create a VPC network to run your services on. Private services access lets you restrict access to your Cloud SQL database by establishing a private connection between your external network and Cloud SQL for SQL Server using internal IPv4 addresses.
- In the Google Cloud console, go to the VPC networks page.
- Click Create VPC network.
- In the Name field, enter tutorial-network.
- For Subnet creation mode, select Custom.
- In the subnet Name field, enter tutorial-subnet.
- Select a region near you.
- For IPv4 range, enter 10.0.0.0/24.
- For Private Google Access, select On.
- Click Done.
- At the bottom of the page, click Create.
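If you want to script this step instead of using the console, the following gcloud sketch creates an equivalent custom-mode network and subnet; REGION is a placeholder for the region you chose, and the values mirror the console steps above.

```bash
# Sketch: create the custom-mode VPC network and its subnet.
gcloud compute networks create tutorial-network \
    --subnet-mode=custom

# The subnet uses the 10.0.0.0/24 range and Private Google Access,
# matching the console settings. REGION is a placeholder.
gcloud compute networks subnets create tutorial-subnet \
    --network=tutorial-network \
    --region=REGION \
    --range=10.0.0.0/24 \
    --enable-private-ip-google-access
```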
After the VPC network creation process finishes, you can configure private services access for the network.
- On the VPC networks screen, click tutorial-network.
- In the menu bar for tutorial-network, click Private services access.
- Click Allocated IP ranges for services.
- Click Allocate IP range.
- For Name, enter tutorial-range.
- For IP address range, select Custom.
- In the Range field, enter 192.168.0.0/20.
- Click Allocate.
- In the submenu, click Private connections to services.
- Click Create connection.
- In the Assigned allocation drop-down, select tutorial-range.
- Click Connect. In a minute or two, the Google Cloud console displays a message letting you know that you have successfully created a private connection.
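The same private services access configuration can be sketched with gcloud: allocate the range, then create the peering connection to the service producer network. This mirrors the console steps above and assumes the APIs from the earlier sketch are enabled.

```bash
# Sketch: allocate the 192.168.0.0/20 range for private services access.
gcloud compute addresses create tutorial-range \
    --global \
    --purpose=VPC_PEERING \
    --addresses=192.168.0.0 \
    --prefix-length=20 \
    --network=tutorial-network

# Create the private connection (VPC peering with servicenetworking).
gcloud services vpc-peerings connect \
    --service=servicenetworking.googleapis.com \
    --ranges=tutorial-range \
    --network=tutorial-network
```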
Create a Cloud Storage bucket
Next, create a Cloud Storage bucket to store the SQL dump file that you downloaded earlier. The Cloud SQL import tool expects the dump file to be in a bucket. A Cloud Storage bucket must have a globally unique name.
- In the Google Cloud console, go to the Cloud Storage Buckets page.
- Click Create.
- For the bucket name, create a globally unique name consisting of lowercase letters, numbers, and hyphens. You might want to use a random string generator, such as random.org/strings, to generate the name. Note the name that you choose.
- Click Continue.
- Under Location type, select Region. This is the lowest-cost option.
- Choose the same region you chose for your subnet. Locating your Google Cloud resources in the same region reduces latency, improves speed, lowers data transfer costs, and simplifies networking.
- Click Continue.
- Click Create.
- If prompted with a dialog, leave Enforce public access prevention on this bucket selected, and click Confirm.
The Bucket details page for the new bucket opens with the Objects pane selected.
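As a command-line alternative, a regional bucket can be created with gcloud storage; BUCKET_NAME and REGION are placeholders, and the public access prevention flag mirrors the console behavior described above.

```bash
# Sketch: create a regional bucket with public access prevention enforced.
# BUCKET_NAME must be globally unique; REGION should match your subnet's region.
gcloud storage buckets create gs://BUCKET_NAME \
    --location=REGION \
    --public-access-prevention
```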
Upload your dump file to the bucket
Upload to your bucket the sample SQL dump file that you downloaded earlier.
- On the Objects tab, click Upload and then click Upload files.
- Navigate to and select the countries-sqlserver.sql file.
- Click Open. Cloud Storage uploads the dump file to the bucket.
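If the file is on a machine with the gcloud CLI installed, the upload can also be done from the command line; BUCKET_NAME is a placeholder for the bucket you created.

```bash
# Sketch: copy the downloaded dump file into the bucket.
gcloud storage cp countries-sqlserver.sql gs://BUCKET_NAME/
```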
Create a Cloud SQL instance
Create a Cloud SQL instance in the Google Cloud console using the following settings. For all other settings, keep the default. Although you don't need to enable private IP for the import process, using private IP for a production workload is a best practice.
- Go to the Cloud SQL Instances page in the Google Cloud console.
- Click Create Instance.
- Click Choose SQL Server.
- For Choose a Cloud SQL edition, choose Enterprise.
- For Edition preset, select Sandbox.
- For Instance ID, enter tutorial-instance.
- Choose and enter a password for the default user account and save it for future use.
- Choose the same region that you chose for your subnet and bucket.
- For Zonal availability, select Single zone.
- Expand Show configuration options.
- Expand Connections.
- Clear Public IP.
- Select Private IP.
- From the Network drop-down, select tutorial-network. This places the new Cloud SQL instance in the private network that you created earlier.
- Click Create instance, and then wait until the instance initializes and starts. The initialization process can take more than five minutes.
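For reference, here is a hedged gcloud sketch of creating a comparable private-IP-only instance; the machine size and database version are illustrative rather than an exact match for the Sandbox preset, and PROJECT_ID, REGION, and PASSWORD are placeholders.

```bash
# Sketch: create a SQL Server instance with private IP only, attached to
# tutorial-network. Adjust CPU, memory, and version to suit your needs.
gcloud sql instances create tutorial-instance \
    --database-version=SQLSERVER_2022_STANDARD \
    --cpu=2 \
    --memory=8GiB \
    --region=REGION \
    --root-password=PASSWORD \
    --network=projects/PROJECT_ID/global/networks/tutorial-network \
    --no-assign-ip
```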
Create a destination database
The import workflow requires you to select a destination database to import to, so you need to create an empty database.
Note: The sample SQL file that you're using for this tutorial creates a new database and overrides the destination database that you choose in the console, but not all dump files work this way.
- In the SQL navigation menu, click Databases.
- Click Create database.
- For Database Name, type countries.
- Click Create.
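The same empty database can also be created from the command line; this is a minimal sketch that targets the instance created above.

```bash
# Sketch: create the empty destination database on the tutorial instance.
gcloud sql databases create countries \
    --instance=tutorial-instance
```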
Import from the dump file
Now you're ready to import the countries database using the countries-sqlserver.sql dump file that you uploaded to your Cloud Storage bucket.
- In the SQL navigation menu, click Overview.
- On the overview page, click Import.
- Under File format, select SQL.
- Under Select source file, choose Select file from Google Cloud Storage.
- Click Browse.
- Expand the storage bucket that you created earlier.
- Click countries-sqlserver.sql.
- Click Select.
- In the Destination section, click the Database drop-down and then select countries.
- Click Import.
When the import process is complete and the countries database has been imported to Cloud SQL for SQL Server, a success message is displayed.
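The import can also be run with gcloud. A sketch follows; BUCKET_NAME and SERVICE_ACCOUNT_EMAIL are placeholders, and granting the instance's service account read access to the bucket may be required for a command-line import to succeed.

```bash
# Sketch: look up the instance's service account, give it read access to
# the bucket, then import the dump file into the countries database.
gcloud sql instances describe tutorial-instance \
    --format="value(serviceAccountEmailAddress)"

gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role=roles/storage.objectViewer

gcloud sql import sql tutorial-instance \
    gs://BUCKET_NAME/countries-sqlserver.sql \
    --database=countries
```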
Validate the imported data in Cloud SQL for SQL Server
After the import operation is complete, you can verify that the database was imported by connecting to it using Cloud SQL Studio, inspecting the list of tables, and running a test query against the data.
Authenticate to Cloud SQL Studio
With the user account that you created earlier, connect to the new database using Cloud SQL Studio.
- In the SQL navigation menu, click Cloud SQL Studio. A login dialog is displayed.
- In the Database drop-down, choose countries.
- Select Built-in database authentication.
- In the User drop-down, select sqlserver.
- In the Password field, enter the password you specified when you created the instance.
- Click Authenticate. Cloud SQL Studio opens.
View and query the tables
- In the Explorer pane, examine the countries database and confirm that the database has two tables: capitals and country_codes.
- Click Untitled Query to open the query editor.
- Paste the following code into the query editor:

      SELECT [capitals].[country_capital], [country_codes].[country_name]
      FROM [capitals]
      JOIN [country_codes]
        ON [capitals].[alpha_2_code] = [country_codes].[alpha_2_code]
      ORDER BY [capitals].[country_capital];

- Click Run.
The results pane displays an alphabetical list of world capitals and their countries.
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, do one of the following:
- Delete the project that contains the resources.
- Keep the project and delete the individual resources.
Delete the project
If you created a new project for this tutorial, the easiest way to stop billing is to delete the project:
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
Delete individual resources
If you want to keep the project but avoid incurring charges, delete the Cloud SQL instance, Cloud Storage bucket, and VPC network that you created during the tutorial.
Delete the Cloud SQL instance
First, disable deletion protection and then delete the tutorial Cloud SQL instance:
- In the SQL navigation menu, click Overview.
- Click Edit.
- Expand the Data Protection section.
- In Instance deletion protection, deselect all options.
- Click Save. When the operation is complete, Delete is selectable.
- Click Delete. A dialog appears.
- In the Instance ID field, enter tutorial-instance.
- Click Delete.
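A command-line sketch of the same cleanup, assuming deletion protection was enabled on the instance:

```bash
# Sketch: turn off deletion protection, then delete the instance.
gcloud sql instances patch tutorial-instance --no-deletion-protection
gcloud sql instances delete tutorial-instance
```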
Delete the Cloud Storage bucket
Next, delete the storage bucket and its contents.
- In the main Google Cloud console navigation menu, go to Cloud Storage > Buckets.
- Select the box next to the name of the bucket you created earlier.
- Click Delete.
- In the dialog, confirm deletion by typing DELETE, and then click Delete.
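Or, from the command line, a single recursive removal deletes the bucket and its contents; BUCKET_NAME is a placeholder.

```bash
# Sketch: delete the bucket and everything in it.
gcloud storage rm --recursive gs://BUCKET_NAME
```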
Delete the VPC network
Now that you've deleted the resources in your VPC network, you are ready to delete the network. Before you can delete the VPC network, you must delete the peering connection that was created automatically when you set up private services access.
- In the main navigation menu, go to VPC networks > VPC network peering.
- Select the box next to the peering connection with the VPC network tutorial-network.
- Click Delete.
- In the dialog, confirm by clicking Delete.
Now that the peering connection has been removed, you can delete the VPC network.
- In the VPC Network menu, click VPC networks.
- Click tutorial-network to open the details page.
- Click Delete VPC network.
- In the dialog, confirm deletion by typing tutorial-network, and then click Delete.
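The network cleanup can also be sketched with gcloud: remove the peering and the allocated range, then the subnet and the network. REGION is a placeholder for the subnet's region.

```bash
# Sketch: remove the private services access peering, the allocated range,
# the subnet, and finally the VPC network.
gcloud services vpc-peerings delete \
    --service=servicenetworking.googleapis.com \
    --network=tutorial-network

gcloud compute addresses delete tutorial-range --global

gcloud compute networks subnets delete tutorial-subnet --region=REGION

gcloud compute networks delete tutorial-network
```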
What's next
- Best practices for importing and exporting data.
- Export and import using SQL dump files.
- VPC networks
- Explore reference architectures, diagrams, and best practices about Google Cloud. Take a look at our Cloud Architecture Center.