Getting started with Spanner in Node.js

Objectives

This tutorial walks you through the following steps using the Spanner client library for Node.js:

  • Create a Spanner instance and database.
  • Write, read, and execute SQL queries on data in the database.
  • Update the database schema.
  • Update data using a read-write transaction.
  • Add a secondary index to the database.
  • Use the index to read and execute SQL queries on data.
  • Retrieve data using a read-only transaction.

Costs

This tutorial uses Spanner, which is a billable component of Google Cloud. For information on the cost of using Spanner, see Pricing.

Before you begin

Complete the steps described in Set up, which cover creating and setting a default Google Cloud project, enabling billing, enabling the Cloud Spanner API, and setting up OAuth 2.0 to get authentication credentials to use the Cloud Spanner API.

In particular, make sure that you run gcloud auth application-default login to set up your local development environment with authentication credentials.

Note: If you don't plan to keep the resources that you create in this tutorial, consider creating a new Google Cloud project instead of selecting an existing project. After you finish the tutorial, you can delete the project, removing all resources associated with the project.

Prepare your local Node.js environment

  1. Follow the steps in Set Up a Node.js Development Environment.

  2. Clone the sample app repository to your local machine:

    git clone https://github.com/googleapis/nodejs-spanner

    Alternatively, you can download the sample as a zip file and extract it.

  3. Change to the directory that contains the Spanner sample code:

    cd samples/
  4. Install dependencies using npm:

    npm install

Create an instance

When you first use Spanner, you must create an instance, which is an allocation of resources that are used by Spanner databases. When you create an instance, you choose an instance configuration, which determines where your data is stored, and also the number of nodes to use, which determines the amount of serving and storage resources in your instance.

See Create an instance to learn how to create a Spanner instance using any of the following methods. You can name your instance test-instance to use it with other topics in this document that reference an instance named test-instance.

  • The Google Cloud CLI
  • The Google Cloud console
  • A client library (C++, C#, Go, Java, Node.js, PHP, Python, or Ruby)
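For example, with the gcloud CLI an instance creation command might look like the following sketch. The instance configuration name `regional-us-central1` is an assumption here; pick a configuration that is available to your project.

```shell
# Hypothetical example: create a single-node instance named test-instance.
# --config must name an instance configuration available in your project.
gcloud spanner instances create test-instance \
  --config=regional-us-central1 \
  --description="Test Instance" \
  --nodes=1
```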

Look through sample files

The samples repository contains a sample that shows how to use Spanner with Node.js.

Take a look through the samples/schema.js file, which shows how to create a database and modify a database schema. The data uses the example schema shown in the Schema and data model page.

Create a database

GoogleSQL

node schema.js createDatabase test-instance example-db MY_PROJECT_ID

PostgreSQL

node schema.js createPgDatabase test-instance example-db MY_PROJECT_ID

You should see:

Created database example-db on instance test-instance.

The following code creates a database and two tables in the database.

Note: The subsequent code samples use these two tables. If you don't execute this code, then create the tables by using the Google Cloud console or the gcloud CLI. For more information, see the example schema.

GoogleSQL

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const createSingersTableStatement = `
  CREATE TABLE Singers (
    SingerId    INT64 NOT NULL,
    FirstName   STRING(1024),
    LastName    STRING(1024),
    SingerInfo  BYTES(MAX),
    FullName    STRING(2048) AS (ARRAY_TO_STRING([FirstName, LastName], " ")) STORED,
  ) PRIMARY KEY (SingerId)`;
const createAlbumsTableStatement = `
  CREATE TABLE Albums (
    SingerId    INT64 NOT NULL,
    AlbumId     INT64 NOT NULL,
    AlbumTitle  STRING(MAX)
  ) PRIMARY KEY (SingerId, AlbumId),
  INTERLEAVE IN PARENT Singers ON DELETE CASCADE`;

// Creates a new database
try {
  const [operation] = await databaseAdminClient.createDatabase({
    createStatement: 'CREATE DATABASE `' + databaseId + '`',
    extraStatements: [
      createSingersTableStatement,
      createAlbumsTableStatement,
    ],
    parent: databaseAdminClient.instancePath(projectId, instanceId),
  });

  console.log(`Waiting for creation of ${databaseId} to complete...`);
  await operation.promise();

  console.log(`Created database ${databaseId} on instance ${instanceId}.`);
} catch (err) {
  console.error('ERROR:', err);
}

PostgreSQL

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const instanceId = 'my-instance';
// const databaseId = 'my-database';
// const projectId = 'my-project-id';

// Imports the Google Cloud client library
const {Spanner, protos} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

async function createPgDatabase() {
  // Creates a PostgreSQL database. PostgreSQL create requests may not contain
  // any additional DDL statements. We need to execute these separately after
  // the database has been created.
  const [operationCreate] = await databaseAdminClient.createDatabase({
    createStatement: 'CREATE DATABASE "' + databaseId + '"',
    parent: databaseAdminClient.instancePath(projectId, instanceId),
    databaseDialect:
      protos.google.spanner.admin.database.v1.DatabaseDialect.POSTGRESQL,
  });

  console.log(`Waiting for operation on ${databaseId} to complete...`);
  await operationCreate.promise();

  const [metadata] = await databaseAdminClient.getDatabase({
    name: databaseAdminClient.databasePath(projectId, instanceId, databaseId),
  });
  console.log(
    `Created database ${databaseId} on instance ${instanceId} with dialect ${metadata.databaseDialect}.`,
  );

  // Create a couple of tables using a separate request. We must use PostgreSQL
  // style DDL as the database has been created with the PostgreSQL dialect.
  const statements = [
    `CREATE TABLE Singers
      (SingerId   bigint NOT NULL,
      FirstName   varchar(1024),
      LastName    varchar(1024),
      SingerInfo  bytea,
      FullName    character varying(2048) GENERATED ALWAYS AS (FirstName || ' ' || LastName) STORED,
      PRIMARY KEY (SingerId)
      );
      CREATE TABLE Albums
      (AlbumId    bigint NOT NULL,
      SingerId    bigint NOT NULL REFERENCES Singers (SingerId),
      AlbumTitle  text,
      PRIMARY KEY (AlbumId)
      );`,
  ];
  const [operationUpdateDDL] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: statements,
  });
  await operationUpdateDDL.promise();
  console.log('Updated schema');
}
createPgDatabase();

The next step is to write data to your database.

Create a database client

Before you can do reads or writes, you must create a Database object:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({projectId});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// The query to execute
const query = {
  sql: 'SELECT 1',
};

// Execute a simple SQL statement
const [rows] = await database.run(query);
console.log(`Query: ${rows.length} found.`);
rows.forEach(row => console.log(row));

You can think of a Database as a database connection: all of your interactions with Spanner must go through a Database. Typically, you create a Database when your application starts up, and then you reuse that Database to read, write, and execute transactions. Each client uses resources in Spanner.

If you create multiple clients in the same app, you should call Database.close() to clean up a client's resources, including network connections, as soon as it is no longer needed.

Read more in the Database reference.
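The create-once, reuse, close-when-done pattern can be sketched in plain JavaScript. The `makeDb` factory and `work` callback below are hypothetical stand-ins, not Spanner APIs; the only assumption about the handle is that it has a `close()` method, as a Spanner Database does:

```javascript
// Hypothetical sketch: run `work` against a database handle and always
// release the handle's resources, even if `work` throws or rejects.
async function withDatabase(makeDb, work) {
  const db = makeDb(); // e.g. spanner.instance(instanceId).database(databaseId)
  try {
    return await work(db);
  } finally {
    await db.close(); // frees sessions and network connections
  }
}
```

This mirrors the try/finally plus `database.close()` shape that the samples in this tutorial use.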

Write data with DML

You can insert data using Data Manipulation Language (DML) in a read-writetransaction.

You use the runUpdate() method to execute a DML statement.

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

database.runTransaction(async (err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  try {
    const [rowCount] = await transaction.runUpdate({
      sql: `INSERT Singers (SingerId, FirstName, LastName) VALUES
      (12, 'Melissa', 'Garcia'),
      (13, 'Russell', 'Morales'),
      (14, 'Jacqueline', 'Long'),
      (15, 'Dylan', 'Shaw')`,
    });
    console.log(`${rowCount} records inserted.`);

    await transaction.commit();
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    // Close the database when finished.
    database.close();
  }
});

Run the sample using the writeUsingDml argument.

node dml.js writeUsingDml test-instance example-db MY_PROJECT_ID

You should see:

4 records inserted.

Note: There are limits to commit size. See CRUD limit for more information.

Write data with mutations

You can also insert data using mutations.

You write data using a Table object. The Table.insert() method adds new rows to the table. All inserts in a single batch are applied atomically.

This code shows how to write the data using mutations:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Instantiate Spanner table objects
const singersTable = database.table('Singers');
const albumsTable = database.table('Albums');

// Inserts rows into the Singers table
// Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so
// they must be converted to strings before being inserted as INT64s
try {
  await singersTable.insert([
    {SingerId: '1', FirstName: 'Marc', LastName: 'Richards'},
    {SingerId: '2', FirstName: 'Catalina', LastName: 'Smith'},
    {SingerId: '3', FirstName: 'Alice', LastName: 'Trentor'},
    {SingerId: '4', FirstName: 'Lea', LastName: 'Martin'},
    {SingerId: '5', FirstName: 'David', LastName: 'Lomond'},
  ]);

  await albumsTable.insert([
    {SingerId: '1', AlbumId: '1', AlbumTitle: 'Total Junk'},
    {SingerId: '1', AlbumId: '2', AlbumTitle: 'Go, Go, Go'},
    {SingerId: '2', AlbumId: '1', AlbumTitle: 'Green'},
    {SingerId: '2', AlbumId: '2', AlbumTitle: 'Forever Hold your Peace'},
    {SingerId: '2', AlbumId: '3', AlbumTitle: 'Terrified'},
  ]);
  console.log('Inserted data.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  await database.close();
}
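The string-valued IDs in the sample above are not an accident: JavaScript numbers are IEEE 754 doubles, so integers above Number.MAX_SAFE_INTEGER silently lose precision, while an INT64 column can hold larger values. A quick sketch of the problem:

```javascript
// JavaScript numbers are 64-bit floats: integers are exact only up to 2^53 - 1.
const maxSafe = Number.MAX_SAFE_INTEGER; // 9007199254740991

// Beyond that, distinct integers collapse into the same double value:
const collision = 9007199254740993 === 9007199254740992; // true

// Passing INT64 values as strings sidesteps the problem entirely:
const bigId = '9223372036854775807'; // max INT64, kept exact as a string
```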

Run the sample using the insert argument.

node crud.js insert test-instance example-db MY_PROJECT_ID

You should see:

Inserted data.

Note: There are limits to commit size. See CRUD limit for more information.

Query data using SQL

Spanner supports a SQL interface for reading data, which you can access on the command line using the Google Cloud CLI or programmatically using the Spanner client library for Node.js.

On the command line

Execute the following SQL statement to read the values of all columns from the Albums table:

gcloud spanner databases execute-sql example-db --instance=test-instance \
  --sql='SELECT SingerId, AlbumId, AlbumTitle FROM Albums'

Note: For the GoogleSQL reference, see Query syntax in GoogleSQL, and for the PostgreSQL reference, see PostgreSQL lexical structure and syntax.

The result shows:

SingerId  AlbumId  AlbumTitle
1         1        Total Junk
1         2        Go, Go, Go
2         1        Green
2         2        Forever Hold Your Peace
2         3        Terrified

Use the Spanner client library for Node.js

In addition to executing a SQL statement on the command line, you can issue thesame SQL statement programmatically using the Spanner client library forNode.js.

Use Database.run() to run the SQL query.

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: 'SELECT SingerId, AlbumId, AlbumTitle FROM Albums',
};

// Queries rows from the Albums table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  await database.close();
}

Here's how to issue the query and access the data:

node crud.js query test-instance example-db MY_PROJECT_ID

You should see the following result:

SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified

Query using a SQL parameter

If your application has a frequently executed query, you can improve its performance by parameterizing it. The resulting parametric query can be cached and reused, which reduces compilation costs. For more information, see Use query parameters to speed up frequently executed queries.

Here is an example of using a parameter in the WHERE clause to query records containing a specific value for LastName.

GoogleSQL

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: `SELECT SingerId, FirstName, LastName
        FROM Singers WHERE LastName = @lastName`,
  params: {
    lastName: 'Garcia',
  },
};

// Queries rows from the Singers table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, FirstName: ${json.FirstName}, LastName: ${json.LastName}`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

PostgreSQL

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const instanceId = 'my-instance';
// const databaseId = 'my-database';
// const projectId = 'my-project-id';

// Imports the Google Cloud Spanner client library
const {Spanner} = require('@google-cloud/spanner');

// Instantiates a client
const spanner = new Spanner({
  projectId: projectId,
});

async function queryWithPgParameter() {
  // Gets a reference to a Cloud Spanner instance and database.
  const instance = spanner.instance(instanceId);
  const database = instance.database(databaseId);

  const fieldType = {
    type: 'string',
  };

  const query = {
    sql: `SELECT singerid, firstname, lastname
          FROM singers
          WHERE firstname LIKE $1`,
    params: {
      p1: 'A%',
    },
    types: {
      p1: fieldType,
    },
  };

  // Queries rows from the Singers table.
  try {
    const [rows] = await database.run(query);

    rows.forEach(row => {
      const json = row.toJSON();
      console.log(
        `SingerId: ${json.singerid}, FirstName: ${json.firstname}, LastName: ${json.lastname}`,
      );
    });
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    // Close the database when finished.
    database.close();
  }
}
queryWithPgParameter();

Here's how to issue the query and access the data:

node dml.js queryWithParameter test-instance example-db MY_PROJECT_ID

You should see the following result:

SingerId: 12, FirstName: Melissa, LastName: Garcia

Read data using the read API

In addition to Spanner's SQL interface, Spanner also supports a read interface.

Use Table.read() to read rows from the database. Use a KeySet object to define a collection of keys and key ranges to read.
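Besides reading every row with `{all: true}` as in the sample below, a read request can name individual keys or key ranges. The following is a hedged sketch of the request shapes as I understand the client's read request; verify the exact field names against the KeySet and Table.read() reference before relying on them:

```javascript
// Read every row (the form used in this tutorial's sample).
const readAll = {
  columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
  keySet: {all: true},
};

// Read specific rows; composite primary keys are arrays of key parts.
const readTwoRows = {
  columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
  keys: [['1', '1'], ['2', '2']], // Albums(1, 1) and Albums(2, 2)
};

// Read a contiguous range of primary keys, inclusive at both ends.
const readRange = {
  columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
  ranges: [{startClosed: ['1'], endClosed: ['2']}],
};
```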

Here's how to read the data:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Reads rows from the Albums table
const albumsTable = database.table('Albums');

const query = {
  columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
  keySet: {
    all: true,
  },
};

try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  await database.close();
}

Run the sample using the read argument.

node crud.js read test-instance example-db MY_PROJECT_ID

You should see output similar to:

SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified

Update the database schema

Assume you need to add a new column called MarketingBudget to the Albums table. Adding a new column to an existing table requires an update to your database schema. Spanner supports schema updates to a database while the database continues to serve traffic. Schema updates don't require taking the database offline, and they don't lock entire tables or columns; you can continue writing data to the database during the schema update. Read more about supported schema updates and schema change performance in Make schema updates.

Add a column

You can add a column on the command line using the Google Cloud CLI or programmatically using the Spanner client library for Node.js.

On the command line

Use the following ALTER TABLE command to add the new column to the table:

GoogleSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='ALTER TABLE Albums ADD COLUMN MarketingBudget INT64'

PostgreSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='ALTER TABLE Albums ADD COLUMN MarketingBudget BIGINT'

You should see:

Schema updating...done.

Use the Spanner client library for Node.js

Use Database.updateSchema to modify the schema:

GoogleSQL

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

// Adds a new column to the Albums table
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: ['ALTER TABLE Albums ADD COLUMN MarketingBudget INT64'],
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the MarketingBudget column.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the spanner client when finished.
  // The databaseAdminClient does not require explicit closure. The closure of
  // the Spanner client will automatically close the databaseAdminClient.
  spanner.close();
}

PostgreSQL

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const instanceId = 'my-instance';
// const databaseId = 'my-database';
// const projectId = 'my-project-id';

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

async function pgAddColumn() {
  const request = ['ALTER TABLE Albums ADD COLUMN MarketingBudget BIGINT'];

  // Alter existing table to add a column.
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: request,
  });

  console.log(`Waiting for operation on ${databaseId} to complete...`);
  await operation.promise();

  console.log(
    `Added MarketingBudget column to Albums table in database ${databaseId}.`,
  );
}
pgAddColumn();

Run the sample using the addColumn argument.

node schema.js addColumn test-instance example-db MY_PROJECT_ID

You should see:

Added the MarketingBudget column.

Write data to the new column

The following code writes data to the new column. It sets MarketingBudget to 100000 for the row keyed by Albums(1, 1) and to 500000 for the row keyed by Albums(2, 2).

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Update a row in the Albums table
// Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so they
// must be converted to strings before being inserted as INT64s
const albumsTable = database.table('Albums');

try {
  await albumsTable.update([
    {SingerId: '1', AlbumId: '1', MarketingBudget: '100000'},
    {SingerId: '2', AlbumId: '2', MarketingBudget: '500000'},
  ]);
  console.log('Updated data.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the update argument.

node crud.js update test-instance example-db MY_PROJECT_ID

You should see:

Updated data.

You can also execute a SQL query or a read call to fetch the values that you just wrote.

Here's the code to execute the query:

// This sample uses the `MarketingBudget` column. You can add the column
// by running the `add_column` sample or by running this DDL statement against
// your database:
//    ALTER TABLE Albums ADD COLUMN MarketingBudget INT64

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const query = {
  sql: 'SELECT SingerId, AlbumId, MarketingBudget FROM Albums',
};

// Queries rows from the Albums table
try {
  const [rows] = await database.run(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(
      `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, MarketingBudget: ${json.MarketingBudget ? json.MarketingBudget : null}`,
    );
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

To execute this query, run the sample using the queryNewColumn argument.

node schema.js queryNewColumn test-instance example-db MY_PROJECT_ID

You should see:

SingerId: 1, AlbumId: 1, MarketingBudget: 100000
SingerId: 1, AlbumId: 2, MarketingBudget: null
SingerId: 2, AlbumId: 1, MarketingBudget: null
SingerId: 2, AlbumId: 2, MarketingBudget: 500000
SingerId: 2, AlbumId: 3, MarketingBudget: null

Update data

You can update data using DML in a read-write transaction.

You use the runUpdate() method to execute a DML statement.

GoogleSQL

// This sample transfers 200,000 from the MarketingBudget field
// of the second Album to the first Album, as long as the second
// Album has enough money in its budget. Make sure to run the
// addColumn and updateData samples first (in that order).

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const transferAmount = 200000;

database.runTransaction((err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  let firstBudget, secondBudget;
  const queryOne = `SELECT MarketingBudget FROM Albums
    WHERE SingerId = 2 AND AlbumId = 2`;

  const queryTwo = `SELECT MarketingBudget FROM Albums
  WHERE SingerId = 1 AND AlbumId = 1`;

  Promise.all([
    // Reads the second album's budget
    transaction.run(queryOne).then(results => {
      // Gets second album's budget
      const rows = results[0].map(row => row.toJSON());
      secondBudget = rows[0].MarketingBudget;
      console.log(`The second album's marketing budget: ${secondBudget}`);

      // Makes sure the second album's budget is large enough
      if (secondBudget < transferAmount) {
        throw new Error(
          `The second album's budget (${secondBudget}) is less than the transfer amount (${transferAmount}).`,
        );
      }
    }),

    // Reads the first album's budget
    transaction.run(queryTwo).then(results => {
      // Gets first album's budget
      const rows = results[0].map(row => row.toJSON());
      firstBudget = rows[0].MarketingBudget;
      console.log(`The first album's marketing budget: ${firstBudget}`);
    }),
  ])
    .then(() => {
      // Transfers the budgets between the albums
      firstBudget += transferAmount;
      secondBudget -= transferAmount;

      // Updates the database
      // Note: Cloud Spanner interprets Node.js numbers as FLOAT64s, so they
      // must be converted (back) to strings before being inserted as INT64s.
      return transaction
        .runUpdate({
          sql: `UPDATE Albums SET MarketingBudget = @Budget
              WHERE SingerId = 1 and AlbumId = 1`,
          params: {
            Budget: firstBudget,
          },
        })
        .then(() =>
          transaction.runUpdate({
            sql: `UPDATE Albums SET MarketingBudget = @Budget
                  WHERE SingerId = 2 and AlbumId = 2`,
            params: {
              Budget: secondBudget,
            },
          }),
        );
    })
    .then(() => {
      // Commits the transaction and sends the changes to the database
      return transaction.commit();
    })
    .then(() => {
      console.log(
        `Successfully executed read-write transaction using DML to transfer ${transferAmount} from Album 2 to Album 1.`,
      );
    })
    .then(() => {
      // Closes the database when finished
      database.close();
    });
});

PostgreSQL

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const instanceId = 'my-instance';
// const databaseId = 'my-database';
// const projectId = 'my-project-id';

// Imports the Google Cloud Spanner client library
const {Spanner} = require('@google-cloud/spanner');

// Instantiates a client
const spanner = new Spanner({
  projectId: projectId,
});

function updateUsingDml() {
  // Gets a reference to a Cloud Spanner instance and database
  const instance = spanner.instance(instanceId);
  const database = instance.database(databaseId);

  database.runTransaction(async (err, transaction) => {
    if (err) {
      console.error(err);
      return;
    }
    try {
      const [rowCount] = await transaction.runUpdate({
        sql: 'UPDATE singers SET FirstName = $1 WHERE singerid = 1',
        params: {
          p1: 'Virginia',
        },
      });
      console.log(
        `Successfully updated ${rowCount} record in the Singers table.`,
      );
      await transaction.commit();
    } catch (err) {
      console.error('ERROR:', err);
    } finally {
      // Close the database when finished.
      await database.close();
    }
  });
}
updateUsingDml();

Run the sample using the writeWithTransactionUsingDml argument.

node dml.js writeWithTransactionUsingDml test-instance example-db MY_PROJECT_ID

You should see:

Successfully executed read-write transaction using DML to transfer 200000 from Album 2 to Album 1.

Note: You can also update data using mutations.

Use a secondary index

Suppose you want to fetch all rows of Albums that have AlbumTitle values in a certain range. You could read all values from the AlbumTitle column using a SQL statement or a read call, and then discard the rows that don't meet the criteria, but doing this full table scan is expensive, especially for tables with a lot of rows. Instead, you can speed up the retrieval of rows when searching by non-primary key columns by creating a secondary index on the table.

Adding a secondary index to an existing table requires a schema update. Like other schema updates, Spanner supports adding an index while the database continues to serve traffic. Spanner automatically backfills the index with your existing data. Backfills might take a few minutes to complete, but you don't need to take the database offline or avoid writing to the indexed table during this process. For more details, see Add a secondary index.

After you add a secondary index, Spanner automatically uses it for SQL queries that are likely to run faster with the index. If you use the read interface, you must specify the index that you want to use.
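In GoogleSQL you can also pin a query to a particular index with a FORCE_INDEX table hint. As a sketch (using the AlbumsByAlbumTitle index that this tutorial creates below; confirm hint syntax against the query syntax reference):

```shell
gcloud spanner databases execute-sql example-db --instance=test-instance \
  --sql='SELECT AlbumId, AlbumTitle FROM Albums@{FORCE_INDEX=AlbumsByAlbumTitle} ORDER BY AlbumTitle'
```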

Add a secondary index

You can add an index on the command line using the gcloud CLI or programmatically using the Spanner client library for Node.js.

On the command line

Use the following CREATE INDEX command to add an index to the database:

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='CREATE INDEX AlbumsByAlbumTitle ON Albums(AlbumTitle)'

You should see:

Schema updating...done.

Use the Spanner client library for Node.js

Use Database.updateSchema() to add an index:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

const databaseAdminClient = spanner.getDatabaseAdminClient();

const request = ['CREATE INDEX AlbumsByAlbumTitle ON Albums(AlbumTitle)'];

// Creates a new index in the database
try {
  const [operation] = await databaseAdminClient.updateDatabaseDdl({
    database: databaseAdminClient.databasePath(
      projectId,
      instanceId,
      databaseId,
    ),
    statements: request,
  });

  console.log('Waiting for operation to complete...');
  await operation.promise();

  console.log('Added the AlbumsByAlbumTitle index.');
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the spanner client when finished.
  // The databaseAdminClient does not require explicit closure. The closure of
  // the Spanner client will automatically close the databaseAdminClient.
  spanner.close();
}

Run the sample using the createIndex argument.

node indexing.js createIndex test-instance example-db MY_PROJECT_ID

Adding an index can take a few minutes. After the index is added, you should see:

Added the AlbumsByAlbumTitle index.

Read using the index

For SQL queries, Spanner automatically uses an appropriate index. In theread interface, you must specify the index in your request.

To use the index in the read interface, use the Table.read() method.

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

const albumsTable = database.table('Albums');

const query = {
  columns: ['AlbumId', 'AlbumTitle'],
  keySet: {
    all: true,
  },
  index: 'AlbumsByAlbumTitle',
};

// Reads the Albums table using an index
try {
  const [rows] = await albumsTable.read(query);

  rows.forEach(row => {
    const json = row.toJSON();
    console.log(`AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`);
  });
} catch (err) {
  console.error('ERROR:', err);
} finally {
  // Close the database when finished.
  database.close();
}

Run the sample using the readIndex argument.

node indexing.js readIndex test-instance example-db MY_PROJECT_ID

You should see:

AlbumId: 2, AlbumTitle: Forever Hold your Peace
AlbumId: 2, AlbumTitle: Go, Go, Go
AlbumId: 1, AlbumTitle: Green
AlbumId: 3, AlbumTitle: Terrified
AlbumId: 1, AlbumTitle: Total Junk
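
In SQL you normally let Spanner pick the index, but GoogleSQL also supports a FORCE_INDEX table hint when you want the query to use the same index the read call names explicitly. A minimal sketch: the query object is only constructed here (no connection needed); in practice you would pass it to database.run() against a real database.

```javascript
// Sketch: the SQL equivalent of the indexed read, using a FORCE_INDEX
// table hint. The object is only constructed here; pass it to
// database.run() to execute it against a real database.
const query = {
  sql: `SELECT AlbumId, AlbumTitle
        FROM Albums@{FORCE_INDEX=AlbumsByAlbumTitle}
        ORDER BY AlbumTitle`,
};
```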

Add an index for index-only reads

You might have noticed that the previous read example doesn't include reading the MarketingBudget column. This is because Spanner's read interface doesn't support the ability to join an index with a data table to look up values that are not stored in the index.
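
For illustration, here is what such an unsupported request would look like. This is a hypothetical sketch: the request object is only constructed, never sent, and passing it to albumsTable.read() would fail because AlbumsByAlbumTitle does not store MarketingBudget.

```javascript
// Hypothetical sketch: a read request that names an index but asks for a
// column the index does not store. The read interface cannot join the
// index back to the Albums base table, so executing this read would fail.
const uncoveredRead = {
  columns: ['AlbumId', 'AlbumTitle', 'MarketingBudget'], // not in the index
  keySet: {all: true},
  index: 'AlbumsByAlbumTitle',
};
```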

Create an alternate definition of AlbumsByAlbumTitle that stores a copy of MarketingBudget in the index.

On the command line

GoogleSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) STORING (MarketingBudget)'

PostgreSQL

gcloud spanner databases ddl update example-db --instance=test-instance \
  --ddl='CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) INCLUDE (MarketingBudget)'

Adding an index can take a few minutes. After the index is added, you should see:

Schema updating...done.

Using the Spanner client library for Node.js

Use DatabaseAdminClient.updateDatabaseDdl() to add an index with a STORING clause:

// "Storing" indexes store copies of the columns they index// This speeds up queries, but takes more space compared to normal indexes// See the link below for more information:// https://cloud.google.com/spanner/docs/secondary-indexes#storing_clause// Imports the Google Cloud client libraryconst{Spanner}=require('@google-cloud/spanner');/** * TODO(developer): Uncomment the following lines before running the sample. */// const projectId = 'my-project-id';// const instanceId = 'my-instance';// const databaseId = 'my-database';// Creates a clientconstspanner=newSpanner({projectId:projectId,});constdatabaseAdminClient=spanner.getDatabaseAdminClient();constrequest=['CREATE INDEX AlbumsByAlbumTitle2 ON Albums(AlbumTitle) STORING (MarketingBudget)',];// Creates a new index in the databasetry{const[operation]=awaitdatabaseAdminClient.updateDatabaseDdl({database:databaseAdminClient.databasePath(projectId,instanceId,databaseId,),statements:request,});console.log('Waiting for operation to complete...');awaitoperation.promise();console.log('Added the AlbumsByAlbumTitle2 index.');}catch(err){console.error('ERROR:',err);}finally{// Close the spanner client when finished.// The databaseAdminClient does not require explicit closure. The closure of the Spanner client will automatically close the databaseAdminClient.spanner.close();}

Run the sample using the createStoringIndex argument.

node indexing.js createStoringIndex test-instance example-db MY_PROJECT_ID

You should see:

Added the AlbumsByAlbumTitle2 index.

Now you can execute a read that fetches all AlbumId, AlbumTitle, and MarketingBudget columns from the AlbumsByAlbumTitle2 index:

// "Storing" indexes store copies of the columns they index// This speeds up queries, but takes more space compared to normal indexes// See the link below for more information:// https://cloud.google.com/spanner/docs/secondary-indexes#storing_clause// Imports the Google Cloud client libraryconst{Spanner}=require('@google-cloud/spanner');/** * TODO(developer): Uncomment the following lines before running the sample. */// const projectId = 'my-project-id';// const instanceId = 'my-instance';// const databaseId = 'my-database';// Creates a clientconstspanner=newSpanner({projectId:projectId,});// Gets a reference to a Cloud Spanner instance and databaseconstinstance=spanner.instance(instanceId);constdatabase=instance.database(databaseId);constalbumsTable=database.table('Albums');constquery={columns:['AlbumId','AlbumTitle','MarketingBudget'],keySet:{all:true,},index:'AlbumsByAlbumTitle2',};// Reads the Albums table using a storing indextry{const[rows]=awaitalbumsTable.read(query);rows.forEach(row=>{constjson=row.toJSON();letrowString=`AlbumId:${json.AlbumId}`;rowString+=`, AlbumTitle:${json.AlbumTitle}`;if(json.MarketingBudget){rowString+=`, MarketingBudget:${json.MarketingBudget}`;}console.log(rowString);});}catch(err){console.error('ERROR:',err);}finally{// Close the database when finished.database.close();}

Run the sample using the readStoringIndex argument.

node indexing.js readStoringIndex test-instance example-db MY_PROJECT_ID

You should see output similar to:

AlbumId: 2, AlbumTitle: Forever Hold your Peace, MarketingBudget: 300000
AlbumId: 2, AlbumTitle: Go, Go, Go, MarketingBudget: null
AlbumId: 1, AlbumTitle: Green, MarketingBudget: null
AlbumId: 3, AlbumTitle: Terrified, MarketingBudget: null
AlbumId: 1, AlbumTitle: Total Junk, MarketingBudget: 300000

Retrieve data using read-only transactions

Suppose you want to execute more than one read at the same timestamp. Read-only transactions observe a consistent prefix of the transaction commit history, so your application always gets consistent data. Use Database.getSnapshot() to execute read-only transactions.

The following shows how to run a query and perform a read in the same read-only transaction:

// Imports the Google Cloud client library
const {Spanner} = require('@google-cloud/spanner');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = 'my-project-id';
// const instanceId = 'my-instance';
// const databaseId = 'my-database';

// Creates a client
const spanner = new Spanner({
  projectId: projectId,
});

// Gets a reference to a Cloud Spanner instance and database
const instance = spanner.instance(instanceId);
const database = instance.database(databaseId);

// Gets a transaction object that captures the database state
// at a specific point in time
database.getSnapshot(async (err, transaction) => {
  if (err) {
    console.error(err);
    return;
  }
  const queryOne = 'SELECT SingerId, AlbumId, AlbumTitle FROM Albums';

  try {
    // Read #1, using SQL
    const [qOneRows] = await transaction.run(queryOne);

    qOneRows.forEach(row => {
      const json = row.toJSON();
      console.log(
        `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
      );
    });

    const queryTwo = {
      columns: ['SingerId', 'AlbumId', 'AlbumTitle'],
    };

    // Read #2, using the `read` method. Even if changes occur
    // in-between the reads, the transaction ensures that both
    // return the same data.
    const [qTwoRows] = await transaction.read('Albums', queryTwo);

    qTwoRows.forEach(row => {
      const json = row.toJSON();
      console.log(
        `SingerId: ${json.SingerId}, AlbumId: ${json.AlbumId}, AlbumTitle: ${json.AlbumTitle}`,
      );
    });

    console.log('Successfully executed read-only transaction.');
  } catch (err) {
    console.error('ERROR:', err);
  } finally {
    transaction.end();
    // Close the database when finished.
    await database.close();
  }
});

Run the sample using the readOnly argument.

node transaction.js readOnly test-instance example-db MY_PROJECT_ID

You should see output similar to:

SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 1, AlbumId: 2, AlbumTitle: Go, Go, Go
SingerId: 1, AlbumId: 1, AlbumTitle: Total Junk
SingerId: 2, AlbumId: 1, AlbumTitle: Green
SingerId: 2, AlbumId: 2, AlbumTitle: Forever Hold your Peace
SingerId: 2, AlbumId: 3, AlbumTitle: Terrified
Successfully executed read-only transaction.
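
getSnapshot() can also take timestamp-bound options so that the read-only transaction reads at a chosen staleness rather than the default strong read. The sketch below only builds the options object; the option name and units follow the client library's TimestampBounds type, so treat the exact shape as an assumption and confirm it in the API reference before relying on it.

```javascript
// Hypothetical sketch: timestamp bounds for a read-only transaction.
// The option name and units are assumptions based on the TimestampBounds
// type in @google-cloud/spanner; verify them in the API reference.
const bounds = {
  exactStaleness: 15, // assumed: read data as it was ~15 seconds ago
};
// database.getSnapshot(bounds, (err, transaction) => { /* reads */ });
```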

Cleanup

To avoid incurring additional charges to your Cloud Billing account for the resources used in this tutorial, drop the database and delete the instance that you created.

Delete the database

If you delete an instance, all databases within it are automatically deleted. This step shows how to delete a database without deleting an instance (you would still incur charges for the instance).

On the command line

gcloud spanner databases delete example-db --instance=test-instance

Using the Google Cloud console

  1. Go to the Spanner Instances page in the Google Cloud console.


  2. Click the instance.

  3. Click the database that you want to delete.

  4. In the Database details page, click Delete.

  5. Confirm that you want to delete the database and click Delete.

Delete the instance

Deleting an instance automatically drops all databases created in that instance.

On the command line

gcloud spanner instances delete test-instance

Using the Google Cloud console

  1. Go to the Spanner Instances page in the Google Cloud console.


  2. Click your instance.

  3. Click Delete.

  4. Confirm that you want to delete the instance and click Delete.

What's next


Last updated 2026-02-19 UTC.