BigQuery Storage Node.js client
Node.js idiomatic client for BigQuery Storage.
The BigQuery Storage product is divided into two major APIs: the Write API and the Read API. The BigQuery Storage API does not provide functionality related to managing BigQuery resources such as datasets, jobs, or tables.
The BigQuery Storage Write API is a unified data-ingestion API for BigQuery. It combines streaming ingestion and batch loading into a single high-performance API. You can use the Storage Write API to stream records into BigQuery in real time or to batch process an arbitrarily large number of records and commit them in a single atomic operation.
Read more in our introduction guide.
Using the system-provided default stream, this code sample demonstrates how to use the schema of the destination stream/table to construct a writer, and then send several batches of row data to the table.
```javascript
const {adapt, managedwriter} = require('@google-cloud/bigquery-storage');
const {WriterClient, JSONWriter} = managedwriter;

async function appendJSONRowsDefaultStream() {
  const projectId = 'my_project';
  const datasetId = 'my_dataset';
  const tableId = 'my_table';
  const destinationTable = `projects/${projectId}/datasets/${datasetId}/tables/${tableId}`;
  const writeClient = new WriterClient({projectId});
  try {
    const writeStream = await writeClient.getWriteStream({
      streamId: `${destinationTable}/streams/_default`,
      view: 'FULL',
    });
    const protoDescriptor = adapt.convertStorageSchemaToProto2Descriptor(
      writeStream.tableSchema,
      'root'
    );
    const connection = await writeClient.createStreamConnection({
      streamId: managedwriter.DefaultStream,
      destinationTable,
    });
    const streamId = connection.getStreamId();
    const writer = new JSONWriter({
      streamId,
      connection,
      protoDescriptor,
    });

    let rows = [];
    const pendingWrites = [];

    // Row 1
    let row = {
      row_num: 1,
      customer_name: 'Octavia',
    };
    rows.push(row);

    // Row 2
    row = {
      row_num: 2,
      customer_name: 'Turing',
    };
    rows.push(row);

    // Send batch.
    let pw = writer.appendRows(rows);
    pendingWrites.push(pw);

    rows = [];

    // Row 3
    row = {
      row_num: 3,
      customer_name: 'Bell',
    };
    rows.push(row);

    // Send batch.
    pw = writer.appendRows(rows);
    pendingWrites.push(pw);

    const results = await Promise.all(pendingWrites.map(pw => pw.getResult()));
    console.log('Write results:', results);
  } catch (err) {
    console.log(err);
  } finally {
    writeClient.close();
  }
}
```
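The default stream above makes rows visible as each append succeeds. To batch records and commit them in a single atomic operation, as described in the introduction, you can use an application-created pending stream instead. The sketch below follows the method names used in this library's write samples (`createWriteStreamFullResponse`, `finalize`, `batchCommitWriteStream`); treat it as an outline to verify against the API reference rather than a complete sample, and substitute your own table path and row shape.

```javascript
const {adapt, managedwriter} = require('@google-cloud/bigquery-storage');
const {WriterClient, JSONWriter} = managedwriter;

async function appendJSONRowsPendingStream() {
  // Placeholder identifiers; substitute your own.
  const projectId = 'my_project';
  const destinationTable =
    'projects/my_project/datasets/my_dataset/tables/my_table';
  const writeClient = new WriterClient({projectId});
  try {
    // Create a pending-type stream: appended rows are buffered and only
    // become visible once the stream is finalized and committed.
    const writeStream = await writeClient.createWriteStreamFullResponse({
      streamType: managedwriter.PendingStream,
      destinationTable,
    });
    const streamId = writeStream.name;
    const protoDescriptor = adapt.convertStorageSchemaToProto2Descriptor(
      writeStream.tableSchema,
      'root'
    );
    const connection = await writeClient.createStreamConnection({streamId});
    const writer = new JSONWriter({streamId, connection, protoDescriptor});

    // Append rows with the same shape as the default-stream sample above.
    const pw = writer.appendRows([
      {row_num: 1, customer_name: 'Octavia'},
      {row_num: 2, customer_name: 'Turing'},
    ]);
    await pw.getResult();

    // Finalize the stream (no further appends), then commit it atomically.
    const rowCount = await connection.finalize();
    console.log(`Stream finalized with row count: ${rowCount}`);
    const commitResponse = await writeClient.batchCommitWriteStream({
      parent: destinationTable,
      writeStreams: [streamId],
    });
    console.log(commitResponse);
  } finally {
    writeClient.close();
  }
}
```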
The BigQuery Storage Read API provides fast access to BigQuery-managed storage by using a gRPC-based protocol. When you use the Storage Read API, structured data is sent over the wire in a binary serialization format. This allows for additional parallelism among multiple consumers of a set of results.
Read more about how to use the BigQuery Storage Read API.
See the sample code in the Quickstart section below.
A comprehensive list of changes in each version may be found in the CHANGELOG.
- Google BigQuery Storage Node.js Client API Reference
- Google BigQuery Storage Documentation
- github.com/googleapis/nodejs-bigquery-storage
Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.
Before you begin:
- Select or create a Cloud Platform project.
- Enable billing for your project.
- Enable the Google BigQuery Storage API.
- Set up authentication so you can access the API from your local workstation.
```
npm install @google-cloud/bigquery-storage
```
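After installing, a quick way to confirm that the package loads and that your credentials resolve is to instantiate a client and ask for the project ID (a minimal sketch, assuming Application Default Credentials from the authentication step above):

```javascript
const {BigQueryReadClient} = require('@google-cloud/bigquery-storage');

const client = new BigQueryReadClient();

// getProjectId() resolves the project from your Application Default Credentials.
client
  .getProjectId()
  .then(projectId => console.log(`Credentials resolved for project: ${projectId}`))
  .catch(console.error);
```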
```javascript
// The read stream contains blocks of Avro-encoded bytes. We use the
// 'avsc' library to decode these blocks. Install avsc with the following
// command: npm install avsc
const avro = require('avsc');

// See reference documentation at
// https://cloud.google.com/bigquery/docs/reference/storage
const {BigQueryReadClient} = require('@google-cloud/bigquery-storage');

const client = new BigQueryReadClient();

async function bigqueryStorageQuickstart() {
  // Get current project ID. The read session is created in this project.
  // This project can be different from that which contains the table.
  const myProjectId = await client.getProjectId();

  // This example reads baby name data from the public datasets.
  const projectId = 'bigquery-public-data';
  const datasetId = 'usa_names';
  const tableId = 'usa_1910_current';

  const tableReference = `projects/${projectId}/datasets/${datasetId}/tables/${tableId}`;

  const parent = `projects/${myProjectId}`;

  /* We limit the output columns to a subset of those allowed in the table,
   * and set a simple filter to only report names from the state of
   * Washington (WA).
   */
  const readOptions = {
    selectedFields: ['name', 'number', 'state'],
    rowRestriction: 'state = "WA"',
  };

  let tableModifiers = null;
  const snapshotSeconds = 0;

  // Set a snapshot time if it's been specified.
  if (snapshotSeconds > 0) {
    tableModifiers = {snapshotTime: {seconds: snapshotSeconds}};
  }

  // API request.
  const request = {
    parent,
    readSession: {
      table: tableReference,
      // This API can also deliver data serialized in Apache Arrow format.
      // This example leverages Apache Avro.
      dataFormat: 'AVRO',
      readOptions,
      tableModifiers,
    },
  };

  const [session] = await client.createReadSession(request);

  const schema = JSON.parse(session.avroSchema.schema);

  const avroType = avro.Type.forSchema(schema);

  /* The offset requested must be less than the last
   * row read from ReadRows. Requesting a larger offset is
   * undefined.
   */
  let offset = 0;

  const readRowsRequest = {
    // Required stream name and optional offset. Offset requested must be less than
    // the last row read from readRows(). Requesting a larger offset is undefined.
    readStream: session.streams[0].name,
    offset,
  };

  const names = new Set();
  const states = [];

  /* We'll use only a single stream for reading data from the table. Because
   * of dynamic sharding, this will yield all the rows in the table. However,
   * if you wanted to fan out multiple readers you could do so by having a
   * reader process each individual stream.
   */
  client
    .readRows(readRowsRequest)
    .on('error', console.error)
    .on('data', data => {
      offset = data.avroRows.serializedBinaryRows.offset;

      try {
        // Decode all rows in buffer.
        let pos;
        do {
          const decodedData = avroType.decode(
            data.avroRows.serializedBinaryRows,
            pos
          );

          if (decodedData.value) {
            names.add(decodedData.value.name);

            if (!states.includes(decodedData.value.state)) {
              states.push(decodedData.value.state);
            }
          }

          pos = decodedData.offset;
        } while (pos > 0);
      } catch (error) {
        console.log(error);
      }
    })
    .on('end', () => {
      console.log(`Got ${names.size} unique names in states: ${states}`);
      console.log(`Last offset: ${offset}`);
    });
}
```
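The quickstart reads from a single stream and relies on dynamic sharding to receive every row. As the closing comment notes, you can instead fan out one reader per stream. Below is a minimal sketch of that pattern, assuming the same `client`, `parent`, `tableReference`, and `readOptions` as in the quickstart; the per-row Avro decoding shown above is elided.

```javascript
async function readAllStreamsInParallel() {
  // Ask the service for up to four streams; it may return fewer.
  const [session] = await client.createReadSession({
    parent,
    readSession: {table: tableReference, dataFormat: 'AVRO', readOptions},
    maxStreamCount: 4,
  });

  // Run one reader per stream and wait for all of them to finish.
  await Promise.all(
    session.streams.map(
      stream =>
        new Promise((resolve, reject) => {
          client
            .readRows({readStream: stream.name})
            .on('error', reject)
            .on('data', data => {
              // Decode data.avroRows.serializedBinaryRows here,
              // exactly as in the quickstart above.
            })
            .on('end', resolve);
        })
    )
  );
}
```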
Samples are in the samples/ directory. Each sample's README.md has instructions for running its sample.
| Sample | Source Code |
|---|---|
| Append_rows_buffered | source code |
| Append_rows_json_writer_committed | source code |
| Append_rows_json_writer_default | source code |
| Append_rows_pending | source code |
| Append_rows_proto2 | source code |
| Append_rows_table_to_proto2 | source code |
| Customer_record_pb | source code |
| BigQuery Storage Quickstart | source code |
| Sample_data_pb | source code |
The Google BigQuery Storage Node.js Client API Reference documentation also contains samples.
Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.
Google's client libraries support legacy versions of Node.js runtimes on a best-efforts basis with the following warnings:
- Legacy versions are not tested in continuous integration.
- Some security patches and features cannot be backported.
- Dependencies cannot be kept up-to-date.
Client libraries targeting some end-of-life versions of Node.js are available, and can be installed through npm dist-tags. The dist-tags follow the naming convention `legacy-(version)`. For example, `npm install @google-cloud/bigquery-storage@legacy-8` installs client libraries for versions compatible with Node.js 8.
This library follows Semantic Versioning.
This library is considered to be stable. The code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against stable libraries are addressed with the highest priority.
More Information: Google Cloud Platform Launch Stages
Contributions welcome! See the Contributing Guide.
Please note that this README.md, the samples/README.md, and a variety of configuration files in this repository (including .nycrc and tsconfig.json) are generated from a central template. To edit one of these files, make an edit to its templates in the template directory.
Apache Version 2.0
See LICENSE