Bigtable for LangChain
Quick Start
In order to use this library, you first need to go through the following steps:
Installation
Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.
With virtualenv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.
Supported Python Versions
Python >= 3.9
Mac/Linux
    pip install virtualenv
    virtualenv <your-env>
    source <your-env>/bin/activate
    <your-env>/bin/pip install langchain-google-bigtable

Windows
    pip install virtualenv
    virtualenv <your-env>
    <your-env>\Scripts\activate
    <your-env>\Scripts\pip.exe install langchain-google-bigtable

Vector Store Usage
Use BigtableVectorStore to store documents and their vector embeddings, allowing you to search for the most similar or relevant documents from your database.
Full VectorStore Implementation: Supports all methods from the LangChain VectorStore abstract class.
Async/Sync Support: All methods are available in both asynchronous and synchronous versions.
Metadata Filtering: Supports filtering on metadata fields, including logical AND/OR combinations and filtering on document IDs with a specific prefix.
Multiple Distance Strategies: Supports both Cosine and Euclidean distance for similarity search.
Customizable Storage: Full control over how content, embeddings, and metadata are stored in Bigtable columns.
    from langchain_google_bigtable import BigtableVectorStore, BigtableEngine

    # Your embedding service and other configurations
    # embedding_service = ...

    engine = await BigtableEngine.async_initialize(project_id="your-project-id")

    vector_store = await BigtableVectorStore.create(
        engine=engine,
        instance_id="your-instance-id",
        table_id="your-table-id",
        embedding_service=embedding_service,
        collection="your_collection_name",
    )

See the full Vector Store tutorial.
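As a quick illustration, the sketch below adds a couple of documents and runs a similarity search against the vector_store created above. It assumes the standard LangChain VectorStore methods (aadd_documents, asimilarity_search); the document contents and metadata values are invented for the example.

    from langchain_core.documents import Document

    # Add a few documents; embeddings are computed by the configured embedding_service.
    await vector_store.aadd_documents(
        [
            Document(
                page_content="Bigtable is a NoSQL wide-column database.",
                metadata={"topic": "storage"},
            ),
            Document(
                page_content="LangChain helps build LLM applications.",
                metadata={"topic": "framework"},
            ),
        ]
    )

    # Retrieve the most similar documents for a query.
    results = await vector_store.asimilarity_search("What is Bigtable?", k=2)
    for doc in results:
        print(doc.page_content)

Metadata filters and document-ID prefix filters can be applied to searches as described above; see the Vector Store tutorial for the filter syntax supported by this library.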
Key-value Store Usage
Use BigtableByteStore for a key-value store in LangChain.
ByteStore Interface: Follows LangChain's ByteStore for string keys and byte values.
Sync/Async: Supports both synchronous and asynchronous operations.
BigtableEngine: Manages execution context.
    from langchain_google_bigtable import BigtableByteStore, BigtableEngine

    engine = await BigtableEngine.async_initialize(project_id="your-project-id")

    store = await BigtableByteStore.create(
        engine=engine,
        instance_id="your-instance-id",
        table_id="your-table-id",
    )

    await store.amset([("key", b"value")])
    retrieved = await store.amget(["key"])

See the full Key-value Store tutorial.
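Because the store follows LangChain's ByteStore interface, the other standard methods should also be available. The sketch below is an assumption based on that interface rather than something verified against this library: it lists keys by prefix and then deletes an entry, continuing from the store created above.

    # Iterate over keys that start with a given prefix.
    async for key in store.ayield_keys(prefix="key"):
        print(key)

    # Remove entries by key.
    await store.amdelete(["key"])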
Document Loader Usage
Use a document loader to load data as LangChain Documents.
    from langchain_google_bigtable import BigtableLoader

    loader = BigtableLoader(
        instance_id="my-instance",
        table_id="my-table-name",
    )

    docs = loader.lazy_load()

See the full Document Loader tutorial.
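A brief usage note: lazy_load() yields documents as you iterate, which keeps memory bounded for large tables, while load() returns the full list at once. The sketch below assumes those standard LangChain BaseLoader methods and the loader defined above.

    # Stream documents one at a time from the table.
    for doc in loader.lazy_load():
        print(doc.page_content, doc.metadata)

    # Or materialize everything at once.
    all_docs = loader.load()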
Chat Message History Usage
Use ChatMessageHistory to store messages and provide conversation history to LLMs.
    from langchain_google_bigtable import BigtableChatMessageHistory

    history = BigtableChatMessageHistory(
        instance_id="my-instance",
        table_id="my-message-store",
        session_id="my-session_id",
    )

See the full Chat Message History tutorial.
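As a sketch of how a conversation might be recorded and read back, the example below assumes the standard LangChain BaseChatMessageHistory methods (add_user_message, add_ai_message, messages, clear) on the history object created above; the message text is invented for the example.

    # Append messages to the session's history.
    history.add_user_message("Hi! What can Bigtable store?")
    history.add_ai_message("Bigtable stores large amounts of structured data.")

    # Read the conversation back, e.g. to build an LLM prompt.
    for message in history.messages:
        print(f"{message.type}: {message.content}")

    # Clear the session's history when the conversation is done.
    history.clear()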
Contributions
Contributions to this library are always welcome and highly encouraged.
See CONTRIBUTING for more information on how to get started.
Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Code of Conduct for more information.
License
Apache 2.0 - See LICENSE for more information.
Disclaimer
This is not an officially supported Google product.