Export metadata from Dataproc Metastore

This page explains how to export metadata from Dataproc Metastore.

The export metadata feature lets you save your metadata in a portable storage format.

After you export your data, you can then import the metadata into another Dataproc Metastore service or a self-managed Hive Metastore (HMS).

About exporting metadata

When you export metadata from Dataproc Metastore, the service stores the data in one of the following file formats:

  • A set of Avro files stored in a folder.
  • A single MySQL dump file stored in a Cloud Storage folder.

Avro

Avro-based exports are supported only for Hive versions 2.3.6 and 3.1.2. When you export Avro files, Dataproc Metastore creates a <table-name>.avro file for each table in your database.

To export Avro files, your Dataproc Metastore service can use the MySQL or Spanner database type.

MySQL

MySQL-based exports are supported for all versions of Hive. When you export MySQL files, Dataproc Metastore creates a single SQL file that contains all your table information.

To export MySQL files, your Dataproc Metastore service must use the MySQL database type. The Spanner database type doesn't support MySQL exports.
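Before choosing an export format, you can check which database type your service uses. The following is a minimal sketch; the service name `my-service` and region `us-central1` are placeholder values, and it assumes the service resource exposes a `databaseType` field:

```shell
# Print the database type (for example, MYSQL or SPANNER) of a
# Dataproc Metastore service. "my-service" and "us-central1" are
# placeholder values for this sketch.
gcloud metastore services describe my-service \
    --location=us-central1 \
    --format="value(databaseType)"
```

If the command prints `SPANNER`, only Avro exports are available for that service.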

Before you begin

Required roles

To get the permissions that you need to export metadata from Dataproc Metastore, ask your administrator to grant you the required IAM roles.

For more information about granting roles, see Manage access to projects, folders, and organizations.

These predefined roles contain the permissions required to export metadata from Dataproc Metastore. To see the exact permissions that are required, expand the Required permissions section:

Required permissions

The following permissions are required to export metadata from Dataproc Metastore:

  • To export metadata: metastore.services.export on the metastore service
  • For MySQL and Avro exports: grant your user account and the Dataproc Metastore service agent storage.objects.create on the destination Cloud Storage bucket

You might also be able to get these permissions with custom roles or other predefined roles.
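One way to grant storage.objects.create on the destination bucket is through the Storage Object Creator role, which includes that permission. This is a sketch with placeholder values: the bucket name, user email, and project number are assumptions, and the service agent address follows the `service-PROJECT_NUMBER@gcp-sa-metastore.iam.gserviceaccount.com` pattern:

```shell
# Grant the Storage Object Creator role (includes storage.objects.create)
# on the destination bucket to your user account. Bucket and email are
# placeholder values.
gcloud storage buckets add-iam-policy-binding gs://my-export-bucket \
    --member="user:alice@example.com" \
    --role="roles/storage.objectCreator"

# Repeat for the Dataproc Metastore service agent; replace PROJECT_NUMBER
# with your project number.
gcloud storage buckets add-iam-policy-binding gs://my-export-bucket \
    --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-metastore.iam.gserviceaccount.com" \
    --role="roles/storage.objectCreator"
```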

For more information about specific Dataproc Metastore roles and permissions, see Dataproc Metastore IAM overview.

Export metadata

Before exporting your metadata, note the following considerations:

  • While an export is running, you can't update a Dataproc Metastore service (for example, you can't change configuration settings). However, you can still use it for normal operations, such as accessing its metadata from attached Dataproc or self-managed clusters.
  • The metadata export feature only exports metadata. Data that's created byApache Hive in internal tables isn't replicated in the export.

To export metadata from a Dataproc Metastore service, perform the following steps.

Console

  1. In the Google Cloud console, open the Dataproc Metastore page:

    Open Dataproc Metastore

  2. On the Dataproc Metastore page, click the name of the service you want to export metadata from.

    The Service detail page opens.

  3. In the navigation bar, click Export.

    The Export metadata page opens.

  4. In the Destination section, choose either MySQL or Avro.

  5. In the Destination URI field, click Browse and select the Cloud Storage URI where you want to export your files.

    You can also enter your bucket location in the provided text field. Use the following format: bucket/object or bucket/folder/object.

    Note: To store your exported metadata, the export job creates a new subfolder in your Cloud Storage URI destination.
  6. To start the export, click Submit.

    When finished, your export appears in a table on the Service detail page, on the Import/Export tab.

    When the export completes, Dataproc Metastore automatically returns to the active state, regardless of whether the export succeeded.

gcloud CLI

  1. To export metadata from a service, run the following gcloud metastore services export gcs command:

    gcloud metastore services export gcs SERVICE \
        --location=LOCATION \
        --destination-folder=gs://bucket-name/path/to/folder \
        --dump-type=DUMP_TYPE

    Replace the following:

    • SERVICE: the name of your Dataproc Metastore service.
    • LOCATION: the Google Cloud region in which your Dataproc Metastore service resides.
    • bucket-name/path/to/folder: the Cloud Storage destination folder where you want to store your export.
    • DUMP_TYPE: the type of database dump generated by the export. Accepted values are mysql and avro. The default value is mysql.
  2. Verify that the export was successful.

    When the export completes, Dataproc Metastore automatically returns to the active state, regardless of whether the export succeeded.
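One way to confirm the outcome is to check the service state and the long-running operations for the location. This is a sketch with placeholder values for the service name and region:

```shell
# Check that the service has returned to the ACTIVE state after the export.
# "my-service" and "us-central1" are placeholder values.
gcloud metastore services describe my-service \
    --location=us-central1 \
    --format="value(state)"

# List recent long-running operations; the export operation and its
# success or failure appear here.
gcloud metastore operations list --location=us-central1
```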

REST

Follow the API instructions to export metadata from a service by using the APIs Explorer.

When the export completes, the service automatically returns to the active state, regardless of whether it succeeded.
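If you prefer to call the REST endpoint directly rather than through the APIs Explorer, a request can be sketched with curl as follows. PROJECT_ID, the region, the service name, and the bucket path are placeholder values, and the request body fields shown here (destinationGcsFolder, databaseDumpType) are based on the v1 services.exportMetadata method:

```shell
# Call the services.exportMetadata method directly. PROJECT_ID,
# us-central1, my-service, and the bucket path are placeholders.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{
          "destinationGcsFolder": "gs://my-export-bucket/exports",
          "databaseDumpType": "MYSQL"
        }' \
    "https://metastore.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/services/my-service:exportMetadata"
```

The call returns a long-running operation that you can poll until the export finishes.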

View export history

To view the export history of a Dataproc Metastore service in the Google Cloud console, complete the following steps:

  1. In the Google Cloud console, open the Dataproc Metastore page.
  2. In the navigation bar, click Import/Export.

    Your export history appears in the Export history table.

    The history displays up to the last 25 exports.

Deleting a Dataproc Metastore service also deletes all associated export history.
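Beyond the console history table, you can also inspect the exported files directly in the Cloud Storage destination folder. The bucket name and folder below are placeholder values:

```shell
# List the files that the export job wrote to the destination folder.
# "my-export-bucket/exports" is a placeholder path.
gcloud storage ls --recursive gs://my-export-bucket/exports/
```

For an Avro export, you should see one <table-name>.avro file per table; for a MySQL export, a single SQL dump file.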

Troubleshoot common issues

For help solving common import and export issues, see Import and export error scenarios.



Last updated 2026-02-19 UTC.