
How to use BaseChatMessageHistory with LangGraph

Prerequisites

This guide assumes familiarity with LangChain chat message histories and LangGraph persistence.

We recommend that new LangChain applications take advantage of the built-in LangGraph persistence to implement memory.

In some situations, users may need to keep using an existing persistence solution for chat message history.

Here, we will show how to use LangChain chat message histories (implementations of BaseChatMessageHistory) with LangGraph.

Set up

%%capture --no-stderr
%pip install --upgrade --quiet langchain-anthropic langgraph

import os
from getpass import getpass

if "ANTHROPIC_API_KEY" not in os.environ:
    os.environ["ANTHROPIC_API_KEY"] = getpass()

ChatMessageHistory

A message history needs to be parameterized by a conversation ID or, in some cases, by a 2-tuple of (user ID, conversation ID).

Many of the LangChain chat message histories will have either a session_id or some namespace to allow keeping track of different conversations. Please refer to the specific implementations to check how they are parameterized.

The built-in InMemoryChatMessageHistory does not contain such a parameterization, so we'll create a dictionary to keep track of the message histories.

import uuid

from langchain_core.chat_history import InMemoryChatMessageHistory

chats_by_session_id = {}


def get_chat_history(session_id: str) -> InMemoryChatMessageHistory:
    chat_history = chats_by_session_id.get(session_id)
    if chat_history is None:
        chat_history = InMemoryChatMessageHistory()
        chats_by_session_id[session_id] = chat_history
    return chat_history

Use with LangGraph

Next, we'll set up a basic chat bot using LangGraph. If you're not familiar with LangGraph, you should look at the following Quick Start Tutorial.

We'll create a LangGraph node for the chat model, and manually manage the conversation history, taking into account the conversation ID passed as part of the RunnableConfig.

The conversation ID can be passed either as part of the RunnableConfig (as we'll do here), or as part of the graph state.

import uuid

from langchain_anthropic import ChatAnthropic
from langchain_core.messages import BaseMessage, HumanMessage
from langchain_core.runnables import RunnableConfig
from langgraph.graph import START, MessagesState, StateGraph

# Define a new graph
builder = StateGraph(state_schema=MessagesState)

# Define a chat model
model = ChatAnthropic(model="claude-3-haiku-20240307")


# Define the function that calls the model
def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: {'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}


# Define the single node in the graph
builder.add_edge(START, "model")
builder.add_node("model", call_model)

graph = builder.compile()

# Here, we'll create a unique session ID to identify the conversation
session_id = uuid.uuid4()
config = {"configurable": {"session_id": session_id}}

input_message = HumanMessage(content="hi! I'm bob")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    event["messages"][-1].pretty_print()

# Here, let's confirm that the AI remembers our name!
input_message = HumanMessage(content="what was my name?")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
    event["messages"][-1].pretty_print()
================================ Human Message =================================

hi! I'm bob
================================== Ai Message ==================================

Hello Bob! It's nice to meet you. I'm Claude, an AI assistant created by Anthropic. How are you doing today?
================================ Human Message =================================

what was my name?
================================== Ai Message ==================================

You introduced yourself as Bob when you said "hi! I'm bob".
tip

This also supports streaming LLM content token by token if using langgraph >= 0.2.28.

from langchain_core.messages import AIMessageChunk

for msg, metadata in graph.stream(
    {"messages": input_message}, config, stream_mode="messages"
):
    if msg.content and not isinstance(msg, HumanMessage):
        print(msg.content, end="|", flush=True)
API Reference: AIMessageChunk
You| sai|d your| name was Bob.|

Using With RunnableWithMessageHistory

This how-to guide used the messages and add_messages interface of BaseChatMessageHistory directly.

Alternatively, you can use RunnableWithMessageHistory, as LCEL can be used inside any LangGraph node.

To do that, replace the following code:

def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: {'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}

with the corresponding instance of RunnableWithMessageHistory defined in your current application:

runnable = RunnableWithMessageHistory(...)  # From existing code


def call_model(state: MessagesState, config: RunnableConfig) -> list[BaseMessage]:
    # RunnableWithMessageHistory takes care of reading the message history
    # and updating it with the new human message and ai response.
    ai_message = runnable.invoke(state["messages"], config)
    return {"messages": ai_message}
