How to add values to a chain's state

An alternate way of passing data through steps of a chain is to leave the current values of the chain state unchanged while assigning a new value under a given key. The RunnablePassthrough.assign() static method takes an input value and adds to it the keys computed by the extra arguments passed to the assign function.

This is useful in the common LangChain Expression Language pattern of additively creating a dictionary to use as input to a later step.
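
On its own, RunnablePassthrough.assign() returns its input dict with the newly computed keys merged in. As a quick sketch (the doubled key and example values here are just for illustration):

from langchain_core.runnables import RunnablePassthrough

# assign() passes the original input through and adds a new key computed from it
chain = RunnablePassthrough.assign(doubled=lambda x: x["num"] * 2)

chain.invoke({"num": 5})
# -> {'num': 5, 'doubled': 10}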

Here's a fuller example that combines this with RunnableParallel:

%pip install --upgrade --quiet langchain langchain-openai

import os
from getpass import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass()

from langchain_core.runnables import RunnableParallel, RunnablePassthrough

runnable = RunnableParallel(
    extra=RunnablePassthrough.assign(mult=lambda x: x["num"] * 3),
    modified=lambda x: x["num"] + 1,
)

runnable.invoke({"num": 1})
{'extra': {'num': 1, 'mult': 3}, 'modified': 2}

Let's break down what's happening here.

  • The input to the chain is {"num": 1}. This is passed into a RunnableParallel, which invokes the runnables it is passed in parallel with that input.
  • The value under the extra key is invoked. RunnablePassthrough.assign() keeps the original keys in the input dict ({"num": 1}), and assigns a new key called mult. The value is lambda x: x["num"] * 3, which is 3. Thus, the result is {"num": 1, "mult": 3}.
  • {"num": 1, "mult": 3} is returned to the RunnableParallel call, and is set as the value under the key extra.
  • At the same time, the modified key is called. The result is 2, since the lambda extracts a key called "num" from its input and adds one.

Thus, the result is {'extra': {'num': 1, 'mult': 3}, 'modified': 2}.
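
To make the data flow concrete, here is a rough plain-Python equivalent of what the chain above computes. This is only an illustrative sketch; the actual runnables also provide batching, streaming, and async support:

def runnable_equivalent(inputs: dict) -> dict:
    # RunnableParallel runs each branch on the same input and collects
    # the results under the corresponding keys.
    extra = dict(inputs)               # RunnablePassthrough keeps the original keys...
    extra["mult"] = inputs["num"] * 3  # ...and assign() adds the new "mult" key.
    modified = inputs["num"] + 1       # the plain lambda branch
    return {"extra": extra, "modified": modified}

runnable_equivalent({"num": 1})
# -> {'extra': {'num': 1, 'mult': 3}, 'modified': 2}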

Streaming

One convenient feature of this method is that it allows values to pass through as soon as they are available. To show this off, we'll use RunnablePassthrough.assign() to immediately return source docs in a retrieval chain:

from langchain_community.vectorstores import FAISS
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

vectorstore = FAISS.from_texts(
    ["harrison worked at kensho"], embedding=OpenAIEmbeddings()
)
retriever = vectorstore.as_retriever()
template = """Answer the question based only on the following context:
{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
model = ChatOpenAI()

generation_chain = prompt | model | StrOutputParser()

retrieval_chain = {
    "context": retriever,
    "question": RunnablePassthrough(),
} | RunnablePassthrough.assign(output=generation_chain)

stream = retrieval_chain.stream("where did harrison work?")

for chunk in stream:
    print(chunk)
{'question': 'where did harrison work?'}
{'context': [Document(page_content='harrison worked at kensho')]}
{'output': ''}
{'output': 'H'}
{'output': 'arrison'}
{'output': ' worked'}
{'output': ' at'}
{'output': ' Kens'}
{'output': 'ho'}
{'output': '.'}
{'output': ''}

We can see that the first chunk contains the original "question" since that is immediately available. The second chunk contains "context" since the retriever finishes second. Finally, the output from the generation_chain streams in chunks as soon as it is available.
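
If you need the assembled result as well as the streamed chunks, you can accumulate the chunks yourself. A minimal sketch, assuming (as in the output above) that "output" arrives as incremental string fragments while the other keys each arrive once as complete values:

final = {}
for chunk in retrieval_chain.stream("where did harrison work?"):
    for key, value in chunk.items():
        if key == "output":
            # the generated answer streams in as string fragments; concatenate them
            final[key] = final.get(key, "") + value
        else:
            # "question" and "context" each arrive once, already complete
            final[key] = value

print(final["output"])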

Next steps

Now you've learned how to pass data through a chain while adding new values to its state, which helps format the data flowing into later steps.

To learn more, see the other how-to guides on runnables in this section.

