Preview table data
Retrieve selected row data from a table.
Explore further
For detailed documentation that includes this code sample, see the following:
Code sample
C#
Before trying this sample, follow the C# setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery C# API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
using Google.Api.Gax;
using Google.Apis.Bigquery.v2.Data;
using Google.Cloud.BigQuery.V2;
using System;
using System.Linq;

public class BigQueryBrowseTable
{
    public void BrowseTable(
        string projectId = "your-project-id"
    )
    {
        BigQueryClient client = BigQueryClient.Create(projectId);
        TableReference tableReference = new TableReference()
        {
            TableId = "shakespeare",
            DatasetId = "samples",
            ProjectId = "bigquery-public-data"
        };
        // Load all rows from a table
        PagedEnumerable<TableDataList, BigQueryRow> result = client.ListRows(
            tableReference: tableReference,
            schema: null
        );
        // Print the first 10 rows
        foreach (BigQueryRow row in result.Take(10))
        {
            Console.WriteLine($"{row["corpus"]}: {row["word_count"]}");
        }
    }
}

Go
Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
import("context""fmt""io""cloud.google.com/go/bigquery""google.golang.org/api/iterator")// browseTable demonstrates reading data from a BigQuery table directly without the use of a query.// For large tables, we also recommend the BigQuery Storage API.funcbrowseTable(wio.Writer,projectID,datasetID,tableIDstring)error{// projectID := "my-project-id"// datasetID := "mydataset"// tableID := "mytable"ctx:=context.Background()client,err:=bigquery.NewClient(ctx,projectID)iferr!=nil{returnfmt.Errorf("bigquery.NewClient: %w",err)}deferclient.Close()table:=client.Dataset(datasetID).Table(tableID)it:=table.Read(ctx)for{varrow[]bigquery.Valueerr:=it.Next(&row)iferr==iterator.Done{break}iferr!=nil{returnerr}fmt.Fprintln(w,row)}returnnil}Java
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQuery.TableDataListOption;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableResult;

// Sample to directly browse a table with optional paging
public class BrowseTable {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String table = "MY_TABLE_NAME";
    String dataset = "MY_DATASET_NAME";
    browseTable(dataset, table);
  }

  public static void browseTable(String dataset, String table) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Identify the table itself
      TableId tableId = TableId.of(dataset, table);

      // Page over 100 records. If you don't need pagination, remove the pageSize parameter.
      TableResult result = bigquery.listTableData(tableId, TableDataListOption.pageSize(100));

      // Print the records
      result
          .iterateAll()
          .forEach(
              row -> {
                row.forEach(fieldValue -> System.out.print(fieldValue.toString() + ", "));
                System.out.println();
              });
      System.out.println("Query ran successfully");
    } catch (BigQueryException e) {
      System.out.println("Query failed to run \n" + e.toString());
    }
  }
}

Node.js
Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
// Import the Google Cloud client library using default credentials
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
async function browseTable() {
  // Retrieve a table's rows using manual pagination.

  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  // const datasetId = 'my_dataset'; // Existing dataset
  // const tableId = 'my_table'; // Table to create

  const query = `SELECT name, SUM(number) as total_people
    FROM \`bigquery-public-data.usa_names.usa_1910_2013\`
    GROUP BY name
    ORDER BY total_people DESC
    LIMIT 100`;

  // Create table reference.
  const dataset = bigquery.dataset(datasetId);
  const destinationTable = dataset.table(tableId);

  // For all options, see https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#jobconfigurationquery
  const queryOptions = {
    query: query,
    destination: destinationTable,
  };

  // Run the query as a job
  const [job] = await bigquery.createQueryJob(queryOptions);

  // For all options, see https://cloud.google.com/bigquery/docs/reference/v2/jobs/getQueryResults
  const queryResultsOptions = {
    // Retrieve zero resulting rows.
    maxResults: 0,
  };

  // Wait for the job to finish.
  await job.getQueryResults(queryResultsOptions);

  function manualPaginationCallback(err, rows, nextQuery) {
    rows.forEach(row => {
      console.log(`name: ${row.name}, ${row.total_people} total people`);
    });

    if (nextQuery) {
      // More results exist.
      destinationTable.getRows(nextQuery, manualPaginationCallback);
    }
  }

  // For all options, see https://cloud.google.com/bigquery/docs/reference/v2/tabledata/list
  const getRowsOptions = {
    autoPaginate: false,
    maxResults: 20,
  };

  // Retrieve all rows.
  destinationTable.getRows(getRowsOptions, manualPaginationCallback);
}
browseTable();

PHP
Before trying this sample, follow the PHP setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery PHP API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
use Google\Cloud\BigQuery\BigQueryClient;

/**
 * Browses the given table for data
 *
 * @param string $projectId The project Id of your Google Cloud Project.
 * @param string $datasetId The BigQuery dataset ID.
 * @param string $tableId The BigQuery table ID.
 * @param int $startIndex Zero-based index of the starting row.
 */
function browse_table(
    string $projectId,
    string $datasetId,
    string $tableId,
    int $startIndex = 0
): void {
    // Query options
    $maxResults = 10;
    $options = [
        'maxResults' => $maxResults,
        'startIndex' => $startIndex
    ];
    $bigQuery = new BigQueryClient([
        'projectId' => $projectId,
    ]);
    $dataset = $bigQuery->dataset($datasetId);
    $table = $dataset->table($tableId);
    $numRows = 0;
    foreach ($table->rows($options) as $row) {
        print('---');
        foreach ($row as $column => $value) {
            printf('%s: %s' . PHP_EOL, $column, $value);
        }
        $numRows++;
    }
}

Python
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set table_id to the ID of the table to browse data rows.
# table_id = "your-project.your_dataset.your_table_name"

# Download all rows from a table.
rows_iter = client.list_rows(table_id)  # Make an API request.

# Iterate over rows to make the API requests to fetch row data.
rows = list(rows_iter)
print("Downloaded {} rows from table {}".format(len(rows), table_id))

# Download at most 10 rows.
rows_iter = client.list_rows(table_id, max_results=10)
rows = list(rows_iter)
print("Downloaded {} rows from table {}".format(len(rows), table_id))

# Specify selected fields to limit the results to certain columns.
table = client.get_table(table_id)  # Make an API request.
fields = table.schema[:2]  # First two columns.
rows_iter = client.list_rows(table_id, selected_fields=fields, max_results=10)
print("Selected {} columns from table {}.".format(len(rows_iter.schema), table_id))
rows = list(rows_iter)
print("Downloaded {} rows from table {}".format(len(rows), table_id))

# Print row data in tabular format.
rows_iter = client.list_rows(table_id, max_results=10)
format_string = "{!s:<16} " * len(rows_iter.schema)
field_names = [field.name for field in rows_iter.schema]
print(format_string.format(*field_names))  # Prints column headers.
for row in rows_iter:
    print(format_string.format(*row))  # Prints row data.
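If you want to step through a large table in fixed-size pages rather than downloading every row at once, the list_rows() method also accepts a start_index argument that you can combine with max_results. The following Python sketch is not part of the official sample; it assumes table_id is set to the same placeholder used above and shows one possible paging loop.

from google.cloud import bigquery

client = bigquery.Client()

# TODO(developer): Set table_id to the ID of the table to browse data rows.
# table_id = "your-project.your_dataset.your_table_name"

page_size = 10
start_index = 0
while True:
    # Each call issues one tabledata.list request for the next window of rows.
    rows = list(client.list_rows(table_id, start_index=start_index, max_results=page_size))
    if not rows:
        break  # No rows left past start_index.
    for row in rows:
        print(dict(row.items()))  # Row values keyed by column name.
    start_index += page_size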
Ruby
Before trying this sample, follow the Ruby setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Ruby API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
require"google/cloud/bigquery"defbrowse_tablebigquery=Google::Cloud::Bigquery.newproject_id:"bigquery-public-data"dataset=bigquery.dataset"samples"table=dataset.table"shakespeare"# Load all rows from a tablerows=table.data# Load the first 10 rowsrows=table.datamax:10# Print row datarows.each{|row|putsrow}endWhat's next
To search and filter code samples for other Google Cloud products, see the Google Cloud sample browser.