
Cookbook: Langchain Integration

This is a cookbook with examples of the Langfuse Integration for Langchain (Python).

Follow the integration guide to add this integration to your Langchain project. The integration also supports Langchain JS.

Setup

```python
%pip install langfuse langchain langchain_openai langchain_community --upgrade
```

Initialize the Langfuse client with your API keys from the project settings in the Langfuse UI and add them to your environment.

```python
import os

# Get keys for your project from the project settings page: https://cloud.langfuse.com
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # 🇪🇺 EU region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com"  # 🇺🇸 US region

# Your OpenAI key
os.environ["OPENAI_API_KEY"] = "sk-proj-.."
```
```python
from langfuse.langchain import CallbackHandler

# Initialize Langfuse CallbackHandler for Langchain (tracing)
langfuse_handler = CallbackHandler()
```
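If any of the credentials above is unset, traces simply won't appear, which can be confusing to debug. A minimal guard (plain Python, stdlib only; the `missing_langfuse_config` helper is illustrative, not part of the SDK) can flag that before you run any chains:

```python
import os

# Environment variables the Langfuse handler reads (per the setup above)
REQUIRED_VARS = ["LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY", "LANGFUSE_HOST"]


def missing_langfuse_config(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


# Example: only the public key is set, so the other two are reported
print(missing_langfuse_config({"LANGFUSE_PUBLIC_KEY": "pk-lf-..."}))
```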

Examples

Sequential Chain in Langchain Expression Language (LCEL)

Trace of Langchain LCEL (example trace in Langfuse)

```python
from operator import itemgetter

from langchain_openai import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema import StrOutputParser

langfuse_handler = CallbackHandler()

prompt1 = ChatPromptTemplate.from_template("what is the city {person} is from?")
prompt2 = ChatPromptTemplate.from_template(
    "what country is the city {city} in? respond in {language}"
)
model = ChatOpenAI()

chain1 = prompt1 | model | StrOutputParser()
chain2 = (
    {"city": chain1, "language": itemgetter("language")}
    | prompt2
    | model
    | StrOutputParser()
)

chain2.invoke(
    {"person": "obama", "language": "spanish"},
    config={"callbacks": [langfuse_handler]},
)
```
'Barack Obama es de la ciudad de Chicago, Illinois, en los Estados Unidos.'
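The `|` operator in the chain above is how LCEL composes runnables: each `a | b` yields a new runnable that feeds `a`'s output into `b`. A toy, dependency-free sketch of that idea (the `Step` class is hypothetical, not the Langchain implementation):

```python
class Step:
    """Toy runnable: wraps a function and supports `|` composition like LCEL."""

    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # `self | other`: run self, then feed its output into other
        return Step(lambda value: other.invoke(self.invoke(value)))


prompt = Step(lambda d: f"what is the city {d['person']} is from?")
model = Step(lambda text: {"content": text.upper()})  # stand-in for the LLM call
parser = Step(lambda msg: msg["content"])             # stand-in for StrOutputParser

chain = prompt | model | parser
print(chain.invoke({"person": "obama"}))
# → WHAT IS THE CITY OBAMA IS FROM?
```

Because every composed object exposes the same `invoke` interface, a single callback handler passed via `config` can observe each step of the real pipeline.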

Runnable methods

Runnables are units of work that can be invoked, batched, streamed, transformed and composed.

The examples below show how to use the following methods with Langfuse:

  • invoke/ainvoke: Transforms a single input into an output.
  • batch/abatch: Efficiently transforms multiple inputs into outputs.
  • stream/astream: Streams output from a single input as it’s produced.
```python
# Async Invoke
await chain2.ainvoke(
    {"person": "biden", "language": "german"},
    config={"callbacks": [langfuse_handler]},
)

# Batch
chain2.batch(
    [
        {"person": "elon musk", "language": "english"},
        {"person": "mark zuckerberg", "language": "english"},
    ],
    config={"callbacks": [langfuse_handler]},
)

# Async Batch
await chain2.abatch(
    [
        {"person": "jeff bezos", "language": "english"},
        {"person": "tim cook", "language": "english"},
    ],
    config={"callbacks": [langfuse_handler]},
)

# Stream
for chunk in chain2.stream(
    {"person": "steve jobs", "language": "english"},
    config={"callbacks": [langfuse_handler]},
):
    print("Streaming chunk:", chunk)

# Async Stream
async for chunk in chain2.astream(
    {"person": "bill gates", "language": "english"},
    config={"callbacks": [langfuse_handler]},
):
    print("Async Streaming chunk:", chunk)
```
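How the three method families relate can be shown without Langchain at all: `batch` maps `invoke` over a list, `stream` yields output piece by piece, and the `a`-prefixed variants are the async counterparts. A toy illustration (plain Python; `EchoRunnable` is a hypothetical stand-in, not a Langchain class):

```python
import asyncio


class EchoRunnable:
    """Toy runnable illustrating the invoke / batch / stream contract."""

    def invoke(self, value):
        return f"echo: {value}"

    async def ainvoke(self, value):
        # A real implementation would await I/O here
        return self.invoke(value)

    def batch(self, values):
        # batch = invoke applied to each input
        return [self.invoke(v) for v in values]

    def stream(self, value):
        # Yield the output token by token instead of all at once
        for token in self.invoke(value).split():
            yield token


r = EchoRunnable()
print(r.invoke("hi"))                 # echo: hi
print(r.batch(["a", "b"]))            # ['echo: a', 'echo: b']
print(list(r.stream("hi")))           # ['echo:', 'hi']
print(asyncio.run(r.ainvoke("hi")))   # echo: hi
```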

RetrievalQA

Trace of Langchain QA Retrieval (example trace in Langfuse)

```python
import os

os.environ["SERPAPI_API_KEY"] = "..."
```
```python
%pip install unstructured selenium langchain-chroma --upgrade
```
```python
from langchain_community.document_loaders import SeleniumURLLoader
from langchain_chroma import Chroma
from langchain_text_splitters import CharacterTextSplitter
from langchain_openai import OpenAI, OpenAIEmbeddings
from langchain.chains import RetrievalQA

langfuse_handler = CallbackHandler()

urls = [
    "https://raw.githubusercontent.com/langfuse/langfuse-docs/main/public/state_of_the_union.txt",
]
loader = SeleniumURLLoader(urls=urls)
llm = OpenAI()
documents = loader.load()

text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
texts = text_splitter.split_documents(documents)

embeddings = OpenAIEmbeddings()
docsearch = Chroma.from_documents(texts, embeddings)

query = "What did the president say about Ketanji Brown Jackson"
chain = RetrievalQA.from_chain_type(
    llm,
    retriever=docsearch.as_retriever(search_kwargs={"k": 1}),
)
chain.invoke(query, config={"callbacks": [langfuse_handler]})
```
{'query': 'What did the president say about Ketanji Brown Jackson', 'result': " The president nominated her to serve on the United States Supreme Court and praised her as one of the nation's top legal minds who will continue the legacy of retiring Justice Stephen Breyer."}
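The RetrievalQA chain does two things: fetch the `k` most relevant chunks for the query, then let the LLM answer from them. The retrieval half can be sketched with a naive word-overlap score (stdlib only; `top_k_chunks` is a toy stand-in for the embedding-based vector search used above):

```python
def top_k_chunks(query, chunks, k=1):
    """Rank text chunks by word overlap with the query (toy stand-in for vector search)."""
    query_words = set(query.lower().split())

    def score(chunk):
        return len(query_words & set(chunk.lower().split()))

    return sorted(chunks, key=score, reverse=True)[:k]


chunks = [
    "The president nominated Ketanji Brown Jackson to the Supreme Court.",
    "Gas prices rose sharply this quarter.",
]
print(top_k_chunks("What did the president say about Ketanji Brown Jackson", chunks, k=1))
```

With `search_kwargs={"k": 1}` as in the example, only the single best-matching chunk is passed to the LLM, which keeps the prompt small but risks missing context spread across chunks.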

AzureOpenAI

```python
os.environ["AZURE_OPENAI_ENDPOINT"] = "<Azure OpenAI endpoint>"
os.environ["AZURE_OPENAI_API_KEY"] = "<Azure OpenAI API key>"
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_VERSION"] = "2023-09-01-preview"
```
```python
from langchain_openai import AzureChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langfuse.langchain import CallbackHandler

# Initialize Langfuse CallbackHandler for Langchain (tracing)
langfuse_handler = CallbackHandler()

prompt = ChatPromptTemplate.from_template("what is the city {person} is from?")
model = AzureChatOpenAI(
    deployment_name="gpt-4o",
    model_name="gpt-4o",
)
chain = prompt | model

chain.invoke({"person": "Satya Nadella"}, config={"callbacks": [langfuse_handler]})
```