
Google Firestore in Datastore Mode

Firestore in Datastore Mode is a NoSQL document database built for automatic scaling, high performance, and ease of application development. Extend your database application to build AI-powered experiences leveraging Datastore's LangChain integrations.

This notebook goes over how to use Firestore in Datastore Mode to save, load and delete LangChain documents with DatastoreLoader and DatastoreSaver.

Learn more about the package on GitHub.


Before You Begin

To run this notebook, you will need to do the following:

  • Create a Google Cloud project
  • Enable the Datastore API
  • Create a Firestore in Datastore Mode database

After confirming access to the database in the runtime environment of this notebook, fill in the following values and run the cell before running the example scripts.

🦜🔗 Library Installation

The integration lives in its own langchain-google-datastore package, so we need to install it.

%pip install --upgrade --quiet langchain-google-datastore

Colab only: Uncomment the following cell to restart the kernel, or use the button to restart it. For Vertex AI Workbench you can restart the terminal using the button on top.

# # Automatically restart kernel after installs so that your environment can access the new packages
# import IPython

# app = IPython.Application.instance()
# app.kernel.do_shutdown(True)

☁ Set Your Google Cloud Project

Set your Google Cloud project so that you can leverage Google Cloud resources within this notebook.

If you don't know your project ID, try the following:

  • Run gcloud config list.
  • Run gcloud projects list.
  • See the support page: Locate the project ID.

# @markdown Please fill in the value below with your Google Cloud project ID and then run the cell.

PROJECT_ID = "my-project-id"  # @param {type:"string"}

# Set the project id
!gcloud config set project {PROJECT_ID}

🔐 Authentication

Authenticate to Google Cloud as the IAM user logged into this notebook in order to access your Google Cloud Project.

  • If you are using Colab to run this notebook, use the cell below and continue.
  • If you are using Vertex AI Workbench, check out the setup instructions here.
from google.colab import auth

auth.authenticate_user()

Basic Usage

Save documents

Save LangChain documents with DatastoreSaver.upsert_documents(<documents>). By default it will try to extract the entity key from the key in the Document metadata.

from langchain_core.documents import Document
from langchain_google_datastore import DatastoreSaver

saver = DatastoreSaver()

data = [Document(page_content="Hello, World!")]
saver.upsert_documents(data)
API Reference: Document

Save documents without key

If a kind is specified, the documents will be stored with an auto-generated id.

saver= DatastoreSaver("MyKind")

saver.upsert_documents(data)

Load documents via Kind

Load LangChain documents with DatastoreLoader.load() or DatastoreLoader.lazy_load(). lazy_load returns a generator that only queries the database during the iteration. To initialize the DatastoreLoader class you need to provide:

  1. source - The source to load the documents. It can be an instance of Query or the name of the Datastore kind to read from.
from langchain_google_datastore import DatastoreLoader

loader = DatastoreLoader("MyKind")
data = loader.load()
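
The lazy_load variant mentioned above can be used in place of load when you do not want to materialize the whole result list at once. A minimal sketch, reusing the MyKind kind from the example above:

loader = DatastoreLoader("MyKind")

for doc in loader.lazy_load():
    # The generator queries Datastore as the loop iterates instead of upfront.
    print(doc.page_content)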

Load documents via query

Besides loading documents from a kind, we can also load documents from a query. For example:

from google.cloud import datastore

client = datastore.Client(database="non-default-db", namespace="custom_namespace")
query_load = client.query(kind="MyKind")
query_load.add_filter("region", "=", "west_coast")

loader_document = DatastoreLoader(query_load)

data = loader_document.load()

Delete documents

Delete a list of LangChain documents from Datastore with DatastoreSaver.delete_documents(<documents>).

saver = DatastoreSaver()

saver.delete_documents(data)

keys_to_delete = [
    ["Kind1", "identifier"],
    ["Kind2", 123],
    ["Kind3", "identifier", "NestedKind", 456],
]
# The Documents will be ignored and only the document ids will be used.
saver.delete_documents(data, keys_to_delete)

Advanced Usage

Load documents with customized document page content & metadata

The page_content_properties and metadata_properties arguments specify which Entity properties are written into the LangChain Document page_content and metadata.

loader = DatastoreLoader(
    source="MyKind",
    page_content_properties=["data_field"],
    metadata_properties=["metadata_field"],
)

data = loader.load()

Customize Page Content Format

When the page_content contains only one property, the information will be the property value only. Otherwise the page_content will be in JSON format.
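
As an illustration of this behavior, a sketch assuming hypothetical entity properties data_field and extra_field (any property names on your entities work the same way):

# One property: page_content is the raw value of "data_field".
loader_single = DatastoreLoader(
    source="MyKind",
    page_content_properties=["data_field"],
)

# Several properties: page_content is JSON keyed by property name,
# e.g. {"data_field": ..., "extra_field": ...}.
loader_json = DatastoreLoader(
    source="MyKind",
    page_content_properties=["data_field", "extra_field"],
)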

Customize Connection & Authentication

from google.auth import compute_engine
from google.cloud.datastore import Client

client = Client(database="non-default-db", credentials=compute_engine.Credentials())
loader = DatastoreLoader(
    source="foo",
    client=client,
)
