LangChain abstractions backed by a Postgres backend.
The `langchain-postgres` package provides implementations of core LangChain abstractions using Postgres.

The package is released under the MIT license.

Feel free to use the abstractions as provided, or modify and extend them as appropriate for your own application.

The package supports the `asyncpg` and `psycopg3` drivers.
```bash
pip install -U langchain-postgres
```
**Warning:** In v0.0.14+, `PGVector` is deprecated. Please migrate to `PGVectorStore` for improved performance and manageability. See the migration guide for details on how to migrate from `PGVector` to `PGVectorStore`.
```python
from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

# Replace the connection string with your own Postgres connection string
CONNECTION_STRING = "postgresql+psycopg3://langchain:langchain@localhost:6024/langchain"
engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

# Replace the vector size with your own vector size
VECTOR_SIZE = 768
embedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)

TABLE_NAME = "my_doc_collection"

engine.init_vectorstore_table(
    table_name=TABLE_NAME,
    vector_size=VECTOR_SIZE,
)

store = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
)

docs = [
    Document(page_content="Apples and oranges"),
    Document(page_content="Cars and airplanes"),
    Document(page_content="Train"),
]
store.add_documents(docs)

query = "I'd like a fruit."
docs = store.similarity_search(query)
print(docs)
```
**Tip:** All synchronous functions have corresponding asynchronous functions.
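For example, the vector store flow above can run under asyncio. The sketch below is a minimal illustration, assuming the `a`-prefixed counterparts (`ainit_vectorstore_table`, `aadd_documents`, `asimilarity_search`), the `PGVectorStore.create` async factory, and an `asyncpg` driver URL; treat these names as assumptions to verify against the package rather than as shown elsewhere in this README.

```python
import asyncio

from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

# Assumed async-driver URL; adjust credentials and host for your setup
ASYNC_CONNECTION_STRING = "postgresql+asyncpg://langchain:langchain@localhost:6024/langchain"
VECTOR_SIZE = 768
TABLE_NAME = "my_async_doc_collection"


async def main() -> None:
    engine = PGEngine.from_connection_string(url=ASYNC_CONNECTION_STRING)

    # Assumed async counterpart of init_vectorstore_table
    await engine.ainit_vectorstore_table(
        table_name=TABLE_NAME,
        vector_size=VECTOR_SIZE,
    )

    # PGVectorStore.create is assumed to be the async counterpart of create_sync
    store = await PGVectorStore.create(
        engine=engine,
        table_name=TABLE_NAME,
        embedding_service=DeterministicFakeEmbedding(size=VECTOR_SIZE),
    )

    # Standard async vector store methods from langchain_core
    await store.aadd_documents([Document(page_content="Apples and oranges")])
    print(await store.asimilarity_search("I'd like a fruit."))


asyncio.run(main())
```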
The chat message history abstraction helps to persist chat message history in a Postgres table.

`PostgresChatMessageHistory` is parameterized using a `table_name` and a `session_id`.

The `table_name` is the name of the table in the database where the chat messages will be stored.

The `session_id` is a unique identifier for the chat session. It can be assigned by the caller using `uuid.uuid4()`.
```python
import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ...  # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection,
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)
```
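The comment above mentions `psycopg.AsyncConnection` for async use. The sketch below shows what that path might look like; the `acreate_tables`, `aadd_messages`, and `aget_messages` method names and the `async_connection` keyword are assumptions based on the package's sync API, so verify them before relying on this.

```python
import asyncio
import uuid

from langchain_core.messages import HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg


async def main() -> None:
    conn_info = ...  # Fill in with your connection info
    # psycopg's async connection class
    async_connection = await psycopg.AsyncConnection.connect(conn_info)

    table_name = "chat_history"
    # Assumed async counterpart of create_tables
    await PostgresChatMessageHistory.acreate_tables(async_connection, table_name)

    chat_history = PostgresChatMessageHistory(
        table_name,
        str(uuid.uuid4()),
        async_connection=async_connection,  # assumed keyword argument
    )

    # Assumed async counterparts of add_messages / messages
    await chat_history.aadd_messages([HumanMessage(content="hello")])
    print(await chat_history.aget_messages())

    await async_connection.close()


asyncio.run(main())
```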
Google Cloud provides Vector Store, Chat Message History, and Data Loader integrations for AlloyDB and Cloud SQL for PostgreSQL databases via the following PyPI packages:

- `langchain-google-alloydb-pg`
- `langchain-google-cloud-sql-pg`
Using the Google Cloud integrations provides the following benefits:

- Enhanced Security: Securely connect to Google Cloud databases utilizing IAM for authorization and database authentication without needing to manage SSL certificates, configure firewall rules, or enable authorized networks.
- Simplified and Secure Connections: Connect to Google Cloud databases effortlessly using the instance name instead of complex connection strings. The integrations create a secure connection pool that can be easily shared across your application using the `engine` object (see the sketch after the comparison table below).
| Vector Store | Metadata filtering | Async support | Schema Flexibility | Improved metadata handling | Hybrid Search |
|---|---|---|---|---|---|
| Google AlloyDB | ✓ | ✓ | ✓ | ✓ | ✗ |
| Google Cloud SQL Postgres | ✓ | ✓ | ✓ | ✓ | ✗ |
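To illustrate the instance-name connection style, here is a rough sketch using the Cloud SQL package. The class and method names (`PostgresEngine.from_instance`, `init_vectorstore_table`, `PostgresVectorStore.create_sync`) are recalled from that package's documentation and should be treated as assumptions to double-check, and the project, region, instance, and database values are placeholders.

```python
# Requires: pip install langchain-google-cloud-sql-pg
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_google_cloud_sql_pg import PostgresEngine, PostgresVectorStore

# Placeholders: substitute your own project, region, instance, and database
engine = PostgresEngine.from_instance(
    project_id="my-project",
    region="us-central1",
    instance="my-pg-instance",
    database="my-database",
)

VECTOR_SIZE = 768
engine.init_vectorstore_table(table_name="gcp_docs", vector_size=VECTOR_SIZE)

# The engine object holds the secure connection pool; it can be shared across
# the vector store, chat history, and loader integrations in one application.
store = PostgresVectorStore.create_sync(
    engine=engine,
    table_name="gcp_docs",
    embedding_service=DeterministicFakeEmbedding(size=VECTOR_SIZE),
)
```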