Import and export data
You can use the managed export and import service to recover from accidental deletion of data and to export data for offline processing. You can export all documents or just specific collections. Likewise, you can import all data from an export or only specific collections. Data exported from one Cloud Firestore database can be imported into another Cloud Firestore database. You can also load Cloud Firestore exports into BigQuery.
This page describes how to export and import Cloud Firestore documents using the managed export and import service and Cloud Storage. The Cloud Firestore managed export and import service is available through the gcloud command-line tool and the Cloud Firestore API (REST, RPC).
Before you begin
Before you can use the managed export and import service, you must complete the following tasks:
- Enable billing for your Google Cloud project. Only Google Cloud projects with billing enabled can use the export and import functionality.
- Create a Cloud Storage bucket for your project in a location near your Cloud Firestore database location. You cannot use a Requester Pays bucket for export and import operations.
Make sure your account has the necessary permissions for Cloud Firestore and Cloud Storage. If you are the project owner, your account has the required permissions. Otherwise, the following roles grant the necessary permissions for export and import operations and for access to Cloud Storage:
- Cloud Firestore roles: Owner, Cloud Datastore Owner, or Cloud Datastore Import Export Admin. Note: these Datastore roles also grant permissions in Cloud Firestore.
- Cloud Storage roles: Owner or Storage Admin
Service agent permissions
Export and import operations use a Cloud Firestore service agent to authorize Cloud Storage operations. The Cloud Firestore service agent uses the following naming convention:
- Cloud Firestore service agent
service-PROJECT_NUMBER@gcp-sa-firestore.iam.gserviceaccount.com
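This naming convention can be expressed as a short helper. The function below is illustrative only and not part of any Google library:

```python
def firestore_service_agent(project_number):
    """Build the Cloud Firestore service agent email from a numeric
    project number, following the naming convention shown above."""
    return f"service-{project_number}@gcp-sa-firestore.iam.gserviceaccount.com"

print(firestore_service_agent(123456789))
# service-123456789@gcp-sa-firestore.iam.gserviceaccount.com
```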
To learn more about service agents, see Service agents.
Note: Cloud Firestore previously used the App Engine default service account instead of the Cloud Firestore service agent. If your database still uses the App Engine service account to import or export data, we recommend that you migrate to the service-specific Cloud Firestore service agent. You can view which account your import and export operations use in the Google Cloud console. If you use VPC Service Controls, you must use the service-specific Cloud Firestore service agent to fully protect import and export operations. VPC Service Controls are not compatible with the App Engine service account.
The Cloud Firestore service agent requires access to the Cloud Storage bucket used in an export or import operation. If your Cloud Storage bucket is in the same project as your Cloud Firestore database, then the Cloud Firestore service agent can access the bucket by default.

If the Cloud Storage bucket is in another project, then you must give the Cloud Firestore service agent access to the Cloud Storage bucket.
Assign roles to the service agent
You can use the gsutil command-line tool to assign one of the roles below. For example, to assign the Storage Admin role to the Cloud Firestore service agent, run the following:
gsutil iam ch serviceAccount:service-PROJECT_NUMBER@gcp-sa-firestore.iam.gserviceaccount.com:roles/storage.admin \
  gs://[BUCKET_NAME]
Replace PROJECT_NUMBER with your project number, which is used to name your Cloud Firestore service agent. To view the service agent name, see View service agent name.

Alternatively, you can assign this role using the Google Cloud console.
View service agent name
You can view the account that your import and export operations use to authorize requests from the Import/Export page in the Google Cloud console. You can also view whether your database uses the Cloud Firestore service agent or the legacy App Engine service account.
- View the authorization account next to the Import/Export jobs run as label.
The service agent needs the Storage Admin role for the Cloud Storage bucket to be used for the export or import operation.
Set up gcloud for your project
You can initiate import and export operations through the Google Cloud console or the gcloud command-line tool. To use gcloud, set up the command-line tool and connect to your project in one of the following ways:
- Access gcloud from the Google Cloud console using Cloud Shell.
- Make sure gcloud is configured for the correct project:

  gcloud config set project [PROJECT_ID]
Import data
Once you have export files in Cloud Storage, you can import documents in those files back into your project or into another project. Note the following points about import operations:

- When you import data, the required indexes are updated using your database's current index definitions. An export does not contain index definitions.
- Imports don't assign new document IDs. Imports use the IDs captured at the time of the export. As a document is being imported, its ID is reserved to prevent ID collisions. If a document with the same ID already exists, the import overwrites the existing document.
- If a document in your database is not affected by an import, it remains in your database after the import.
- The .overall_export_metadata filename must match the name of its parent folder:

  gs://BUCKET_NAME/OPTIONAL_NAMESPACE_PATH/PARENT_FOLDER_NAME/PARENT_FOLDER_NAME.overall_export_metadata

  If you move or copy the output files of an export, keep the PARENT_FOLDER_NAME and .overall_export_metadata filename the same.
- An import to a Cloud Firestore database from an export with sub-collections fails because sub-collections are not supported in Cloud Firestore.
- An import to a Cloud Firestore Standard edition database from an export with BSON types fails because BSON types are not supported in Cloud Firestore Standard edition.
- An import to a Cloud Firestore database cannot import data from non-default namespaces (Datastore API).
- An import to a Cloud Firestore database from data files that contain non-default namespaces is permitted only if the export operation included a --namespace-ids filter with the default namespace. Only data from the default namespace is imported.
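The parent-folder naming rule can be checked before you start an import. This is a hypothetical helper, not part of any Google tool:

```python
def metadata_path_is_valid(gcs_path):
    """Check that an .overall_export_metadata path follows the rule
    gs://.../PARENT_FOLDER_NAME/PARENT_FOLDER_NAME.overall_export_metadata."""
    suffix = ".overall_export_metadata"
    if not gcs_path.endswith(suffix):
        return False
    parts = gcs_path.rstrip("/").split("/")
    if len(parts) < 2:
        return False
    filename, parent = parts[-1], parts[-2]
    # The filename must be the parent folder name plus the suffix.
    return filename == parent + suffix

print(metadata_path_is_valid(
    "gs://my-bucket/2017-05-25T23:54:39_76544/"
    "2017-05-25T23:54:39_76544.overall_export_metadata"))
# True
```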
Import all documents from an export
Google Cloud Console
In the Google Cloud console, go to the Databases page.

Select a database from the list of databases.

In the navigation menu, click Import/Export.

Click Import.
In the Filename field, enter the filename of an .overall_export_metadata file from a completed export operation. You can use the Browse button to help you select the file.

Click Import.
The console returns to the Import/Export page. If the operation starts successfully, the page adds an entry to the recent imports and exports page. On failure, the page displays an error message.
gcloud
Use the firestore import command to import documents from a previous export operation.
gcloud firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/ --database=[DATABASE]
Replace the following:
- BUCKET_NAME/EXPORT_PREFIX: the location of your export files.
- DATABASE: the name of the database.
For example:
gcloud firestore import gs://my-bucket/2017-05-25T23:54:39_76544/ --database='cymbal'
You can confirm the location of your export files in the Cloud Storage browser in the Google Cloud console:
Once you start an import operation, closing the terminal does not cancel the operation. To stop an operation, see Cancel an operation.
Import specific collections
Note: To import specific collections, you must use the output of an export operation where you exported specific collections.

Google Cloud Console
You cannot select specific collections in the console. Use gcloud instead.
gcloud
To import specific collections from a set of export files, use the --collection-ids flag. The operation imports only the collections with the given collection IDs. Specify the database name using the --database flag.

Only an export of specific collections supports an import of specific collections. You cannot import specific collections from an export of all documents.

gcloud firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/ \
  --collection-ids=[COLLECTION_ID_1],[COLLECTION_ID_2] \
  --database=[DATABASE]
Import from an export with PITR data
Use the same steps as in Import all documents or Import specific collections to import PITR data. If any document already exists in your database, it will be overwritten.
Export data
An export operation copies documents in your database to a set of files in a Cloud Storage bucket. Note that an export is not an exact database snapshot taken at the export start time. An export may include changes made while the operation was running.
Note: You must export specific collections if you plan to load the export files into BigQuery.

Export all documents
Google Cloud Console
In the Google Cloud console, go to the Databases page.

Select the required database from the list of databases.

In the navigation menu, click Import/Export.

Click Export.

Click the Export entire database option.

Select Export current state of database to export current data.

In the Destination section, enter the name of a Cloud Storage bucket or use the Browse button to select a bucket.

Click Export.
The console returns to the Import/Export page. If the operation starts successfully, the page adds an entry to the recent imports and exports page. On failure, the page displays an error message.
gcloud
Use the firestore export command to export all the documents in your database, replacing [BUCKET_NAME] with the name of your Cloud Storage bucket. Add the --async flag to prevent the gcloud tool from waiting for the operation to complete.

gcloud firestore export gs://[BUCKET_NAME] \
  --database=[DATABASE]
Replace the following:
- BUCKET_NAME: organize your exports by adding a file prefix after the bucket name, for example, BUCKET_NAME/my-exports-folder/export-name. If you don't provide a file prefix, the managed export service creates one based on the current timestamp.
- DATABASE: the name of the database from which you want to export the documents.
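If you build prefixes yourself, one option is to derive them from a UTC timestamp, loosely mirroring the example paths on this page. The suffix scheme in this sketch is an assumption, not the service's actual default naming:

```python
from datetime import datetime, timezone

def export_destination(bucket, prefix=None):
    """Build a gs:// destination for a managed export. If no prefix is
    given, derive one from the current UTC time; the "_export" suffix
    is illustrative, not what the managed service itself generates."""
    if prefix is None:
        prefix = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S") + "_export"
    return f"gs://{bucket}/{prefix}/"

print(export_destination("my-bucket", "my-exports-folder/export-name"))
# gs://my-bucket/my-exports-folder/export-name/
```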
Once you start an export operation, closing the terminal does not cancel the operation. To stop an operation, see Cancel an operation.
Export specific collections
Google Cloud Console
In the Google Cloud console, go to the Databases page.

Select the required database from the list of databases.

In the navigation menu, click Import/Export.

Click Export.

Click the Export one or more collection groups option. Use the drop-down menu to select one or more collections.

Select Export current state of database to export current data.

In the Destination section, enter the name of a Cloud Storage bucket or use the Browse button to select a bucket.

Click Export.
The console returns to the Import/Export page. If the operation starts successfully, the page adds an entry to the recent imports and exports page. On failure, the page displays an error message.
gcloud
To export specific collections, use the --collection-ids flag. The operation exports only the collections with the given collection IDs.

gcloud firestore export gs://[BUCKET_NAME] \
  --collection-ids=[COLLECTION_ID_1],[COLLECTION_ID_2] \
  --database=[DATABASE]
For example, you can design a restaurants collection in the cymbal database to include additional collections, such as ratings, reviews, or outlets. To export the specific collections restaurants and reviews, your command looks as follows:

gcloud firestore export gs://[BUCKET_NAME] \
  --collection-ids=restaurants,reviews \
  --database='cymbal'
Export from a PITR timestamp
You can export your database to Cloud Storage from PITR data. You can export PITR data where the timestamp is a whole-minute timestamp within the past seven days, but not earlier than the earliestVersionTime. If data no longer exists at the specified timestamp, the export operation fails.
The PITR export operation supports all filters, including exporting alldocuments and exporting specific collections.
Note the following points before exporting PITR data:
- Specify the timestamp in RFC 3339 format. For example, 2020-09-01T23:59:30.234233Z.
- Make sure that the timestamp you specify is a whole-minute timestamp within the past seven days, but not earlier than the earliestVersionTime. If data no longer exists at the specified timestamp, an error is generated.
- You are not charged for a failed PITR export.
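The timestamp constraints above can be checked locally before running the export. This helper is illustrative and not part of gcloud; the earliestVersionTime check needs a live database, so it is omitted:

```python
from datetime import datetime, timedelta, timezone

def is_valid_pitr_timestamp(ts, now=None):
    """Check the documented constraints on a PITR export timestamp:
    RFC 3339, aligned to a whole minute, and within the past seven days."""
    now = now or datetime.now(timezone.utc)
    try:
        when = datetime.fromisoformat(ts.replace("Z", "+00:00"))
    except ValueError:
        return False  # not a parseable RFC 3339 timestamp
    if when.second != 0 or when.microsecond != 0:
        return False  # not aligned to a whole minute
    return now - timedelta(days=7) <= when <= now

now = datetime(2023, 5, 27, 12, 0, tzinfo=timezone.utc)
print(is_valid_pitr_timestamp("2023-05-26T10:20:00Z", now))  # True
print(is_valid_pitr_timestamp("2023-05-26T10:20:30Z", now))  # False
```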
Console
In the Google Cloud console, go to the Databases page.

- Select a database from the list of databases.
- In the navigation menu, click Import/Export.
- Click Export.
- Configure the export source to export either the entire database or only specific collections.
- In the Choose the state of your database to export section, select Export from an earlier point in time.
- Select a snapshot time to use for the export.
- In the Destination section, enter the name of a Cloud Storage bucket or use the Browse button to select a bucket.
- Click Export.
The console returns to the Import/Export page. If the operation starts successfully, the page adds an entry to the recent imports and exports page. On failure, the page displays an error message.
gcloud
You can export your database to Cloud Storage from PITR data using the gcloud firestore export command.
Export the database by setting the --snapshot-time parameter to a recovery timestamp. Run the following command to export the database to your bucket.
gcloud firestore export gs://[BUCKET_NAME_PATH] \
  --snapshot-time=[PITR_TIMESTAMP]

where PITR_TIMESTAMP is a PITR timestamp at minute granularity, for example, 2023-05-26T10:20:00.00Z.

Add the --collection-ids flag to export specific collections.
Manage export and import operations
After you start an export or import operation, Cloud Firestore assigns the operation a unique name. You can use the operation name to delete, cancel, or check the status of the operation.
Operation names are prefixed with projects/[PROJECT_ID]/databases/[DATABASE_ID]/operations/, for example:
projects/my-project/databases/my-database/operations/ASA1MTAwNDQxNAgadGx1YWZlZAcSeWx0aGdpbi1zYm9qLW5pbWRhEgopEg
However, you can leave out the prefix when specifying an operation name for the describe, cancel, and delete commands.
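Stripping the prefix is a simple string operation; the helper below is illustrative and not part of the gcloud CLI:

```python
def short_operation_id(operation_name):
    """Strip the projects/.../operations/ prefix from a full operation
    name so the bare ID can be passed to describe, cancel, or delete."""
    marker = "/operations/"
    if marker in operation_name:
        return operation_name.split(marker, 1)[1]
    return operation_name  # already a bare operation ID

full = ("projects/my-project/databases/my-database/operations/"
        "ASA1MTAwNDQxNAgadGx1YWZlZAcSeWx0aGdpbi1zYm9qLW5pbWRhEgopEg")
print(short_operation_id(full))
# ASA1MTAwNDQxNAgadGx1YWZlZAcSeWx0aGdpbi1zYm9qLW5pbWRhEgopEg
```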
List all export and import operations
Google Cloud Console
You can view a list of recent export and import operations in the Import/Export page of the Google Cloud console.
In the Google Cloud console, go to the Databases page.

Select the required database from the list of databases.

In the navigation menu, click Import/Export.
gcloud
Use the operations list command to see all running and recently completed export and import operations:
gcloud firestore operations list
Check operation status
Google Cloud Console
You can view the status of a recent export or import operation in the Import/Export page of the Google Cloud console.
In the Google Cloud console, go to the Databases page.

Select the required database from the list of databases.

In the navigation menu, click Import/Export.
gcloud
Use the operations describe command to show the status of an export or import operation.
gcloud firestore operations describe [OPERATION_NAME]
Estimate the completion time
A request for the status of a long-running operation returns the metrics workEstimated and workCompleted. Each of these metrics is returned in both number of bytes and number of entities:

- workEstimated shows the estimated total number of bytes and documents an operation will process. Cloud Firestore might omit this metric if it cannot make an estimate.
- workCompleted shows the number of bytes and documents processed so far. After the operation completes, the value shows the total number of bytes and documents that were actually processed, which might be larger than the value of workEstimated.

Divide workCompleted by workEstimated for a rough progress estimate. This estimate might be inaccurate because it depends on delayed statistics collection.
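That division can be sketched against the operation metadata; the flat field layout below is a simplified assumption (the real API reports bytes and documents separately):

```python
def progress_estimate(metadata):
    """Rough completion fraction from workCompleted / workEstimated.
    Returns None when no estimate is available, and caps the result at
    1.0 because workCompleted can exceed workEstimated."""
    estimated = metadata.get("workEstimated")
    completed = metadata.get("workCompleted", 0)
    if not estimated:
        return None  # Cloud Firestore omitted the estimate
    return min(float(completed) / float(estimated), 1.0)

print(progress_estimate({"workEstimated": 1000, "workCompleted": 250}))  # 0.25
```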
Cancel an operation
Google Cloud Console
You can cancel a running export or import operation in the Import/Export page of the Google Cloud console.
In the Google Cloud console, go to the Databases page.

Select the required database from the list of databases.

In the navigation menu, click Import/Export.

In the Recent imports and exports table, currently running operations include a Cancel button in the Completed column. Click the Cancel button to stop the operation. The button changes to a Cancelling message and then to Cancelled when the operation stops completely.

gcloud
Use the operations cancel command to stop an operation in progress:
gcloud firestore operations cancel [OPERATION_NAME]
Cancelling a running operation does not undo the operation. A cancelled export operation leaves documents already exported in Cloud Storage, and a cancelled import operation leaves in place updates already made to your database. You cannot import a partially completed export.
Delete an operation
Use the gcloud firestore operations delete command to remove an operation from the list of recent operations. This command does not delete export files from Cloud Storage.
gcloud firestore operations delete [OPERATION_NAME]
Billing and pricing for export and import operations
You are required to enable billing for your Google Cloud project before you use the managed export and import service.

Export and import operations are charged for read units and write units at the rates listed in Cloud Firestore pricing.

Output files stored in Cloud Storage count towards your Cloud Storage data storage costs.

Export and import operations won't trigger your Google Cloud budget alerts until after completion. They also won't affect the usage shown in the usage section of the console.
Viewing export and import costs
Export and import operations apply the goog-firestoremanaged:exportimport label to billed operations. In the Cloud Billing reports page, you can use this label to view costs related to import and export operations:

Export to BigQuery
You can load data from a Cloud Firestore export into BigQuery, but only if you specified a collection-ids filter. See Loading data from Cloud Firestore exports.

When loading Cloud Firestore data into BigQuery, BSON data types are represented with the STRING data type.
BigQuery column limit
BigQuery imposes a limit of 10,000 columns per table. Cloud Firestore export operations generate a BigQuery table schema for each collection. In this schema, each unique field name within a collection becomes a schema column.

If a collection's BigQuery schema surpasses 10,000 columns, the Cloud Firestore export operation attempts to stay under the column limit by treating map fields as bytes. If this conversion brings the number of columns below 10,000, you can load the data into BigQuery, but you cannot query the subfields within the map fields. If the number of columns still exceeds 10,000, the export operation does not generate a BigQuery schema for the collection and you cannot load its data into BigQuery.
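You can get a rough sense of how many schema columns a collection would produce by counting unique dotted field paths across sample documents. This is an illustrative approximation, not the exact algorithm the export operation uses:

```python
def count_field_paths(documents):
    """Count unique dotted field paths across sample documents, a rough
    proxy for the BigQuery columns an export schema would need."""
    paths = set()

    def walk(value, prefix):
        if isinstance(value, dict):
            for key, sub in value.items():
                walk(sub, f"{prefix}.{key}" if prefix else key)
        else:
            paths.add(prefix)

    for doc in documents:
        walk(doc, "")
    return len(paths)

docs = [{"name": "a", "address": {"city": "x", "zip": "y"}},
        {"name": "b", "phone": "z"}]
print(count_field_paths(docs))  # 4: name, address.city, address.zip, phone
```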
Export format and metadata files
The output of a managed export uses the LevelDB log format.
Metadata files
An export operation creates a metadata file for each collection you specify. Metadata files are typically named ALL_NAMESPACES_KIND_[COLLECTION_GROUP_ID].export_metadata.

The metadata files are protocol buffers and you can decode them with the protoc protocol compiler. For example, you can decode a metadata file to determine the collections the export files contain:
protoc --decode_raw < export0.export_metadata
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2026-02-18 UTC.