ChatSeekrFlow
Seekr provides AI-powered solutions for structured, explainable, and transparent AI interactions.
This notebook provides a quick overview for getting started with Seekr chat models. For detailed documentation of all ChatSeekrFlow features and configurations, head to the API reference.
Overview
The ChatSeekrFlow class wraps a chat model endpoint hosted on SeekrFlow, enabling seamless integration with LangChain applications.
Integration Details
| Class | Package | Local | Serializable |
|---|---|---|---|
| ChatSeekrFlow | seekrai | ❌ | beta |
Model Features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ |
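Since the table above marks tool calling as supported, the standard LangChain tool-calling interface should apply. Below is a minimal sketch, assuming `llm` is the instance created in the Instantiation section later in this notebook; the `get_weather` tool is a hypothetical example, not part of the SeekrFlow API:

```python
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city (hypothetical example tool)."""
    return f"Sunny in {city}"

# Bind the tool so the model can decide when to call it
llm_with_tools = llm.bind_tools([get_weather])
msg = llm_with_tools.invoke("What's the weather in Paris?")
print(msg.tool_calls)  # e.g. [{'name': 'get_weather', 'args': {'city': 'Paris'}, ...}]
```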
Supported Methods
`ChatSeekrFlow` supports all methods of `ChatModel`, except async APIs.
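In practice this means the synchronous `Runnable` surface is available: `invoke()` and `stream()` are demonstrated later in this notebook, and `batch()` works the same way. A short sketch, again assuming the `llm` instance from the Instantiation section:

```python
# Run several prompts in one call; one AIMessage is returned per input
results = llm.batch(
    [
        [HumanMessage(content="Name one planet.")],
        [HumanMessage(content="Name one ocean.")],
    ]
)
for r in results:
    print(r.content)
```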
Endpoint Requirements
The serving endpoint `ChatSeekrFlow` wraps must have an OpenAI-compatible chat input/output format. It can be used for:
- Fine-tuned Seekr models
- Custom SeekrFlow models
- RAG-enabled models using Seekr's retrieval system
For async usage, please refer to `AsyncChatSeekrFlow` (coming soon).
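For reference, "OpenAI-compatible" means the endpoint exchanges payloads shaped like the OpenAI chat completions format. The sketch below shows representative request and response bodies, with illustrative values only:

```python
# Shape of an OpenAI-compatible chat request (illustrative values)
request = {
    "model": "meta-llama/Meta-Llama-3-8B-Instruct",
    "messages": [{"role": "user", "content": "Hello, Seekr!"}],
}

# Shape of the corresponding response
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ],
    "usage": {"prompt_tokens": 5, "completion_tokens": 7, "total_tokens": 12},
}
```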
Getting Started with ChatSeekrFlow in LangChain
This notebook covers how to use SeekrFlow as a chat model in LangChain.
Setup
Ensure you have the necessary dependencies installed:
```bash
pip install seekrai langchain langchain-community langchain-seekrflow
```
You must also have an API key from Seekr to authenticate requests.
```python
# Standard library
import getpass
import os

# Third-party
from langchain.prompts import ChatPromptTemplate
from langchain.schema import HumanMessage
from langchain_core.runnables import RunnableSequence

# OSS SeekrFlow integration
from langchain_seekrflow import ChatSeekrFlow
from seekrai import SeekrFlow
```
API Key Setup
You'll need to set your API key as an environment variable to authenticate requests.
Run the cell below to be prompted for your key:

```python
os.environ["SEEKR_API_KEY"] = getpass.getpass("Enter your Seekr API key:")
```

Or manually assign it before running queries:

```python
SEEKR_API_KEY = "your-api-key-here"
```
Instantiation
```python
SEEKR_API_KEY = os.environ["SEEKR_API_KEY"]

seekr_client = SeekrFlow(api_key=SEEKR_API_KEY)

llm = ChatSeekrFlow(
    client=seekr_client, model_name="meta-llama/Meta-Llama-3-8B-Instruct"
)
```
Invocation
```python
response = llm.invoke([HumanMessage(content="Hello, Seekr!")])
print(response.content)
```

```
Hello there! I'm Seekr, nice to meet you! What brings you here today? Do you have a question, or are you looking for some help with something? I'm all ears (or rather, all text)!
```
Chaining
```python
prompt = ChatPromptTemplate.from_template("Translate to French: {text}")
chain: RunnableSequence = prompt | llm

result = chain.invoke({"text": "Good morning"})
print(result)
```

```
content='The translation of "Good morning" in French is:\n\n"Bonne journée"' additional_kwargs={} response_metadata={}
```
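The chain returns a full message object. If you only want the translated text, append LangChain's standard `StrOutputParser`; this is a generic LangChain pattern, not specific to SeekrFlow:

```python
from langchain_core.output_parsers import StrOutputParser

# prompt | llm | parser yields a plain string instead of an AIMessage
text_chain = prompt | llm | StrOutputParser()
print(text_chain.invoke({"text": "Good morning"}))
```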
Streaming

```python
def test_stream():
    """Test synchronous invocation in streaming mode."""
    print("\n🔹 Testing Sync `stream()` (Streaming)...")
    for chunk in llm.stream([HumanMessage(content="Write me a haiku.")]):
        print(chunk.content, end="", flush=True)

# ✅ Ensure streaming is enabled
llm = ChatSeekrFlow(
    client=seekr_client,
    model_name="meta-llama/Meta-Llama-3-8B-Instruct",
    streaming=True,  # ✅ Enable streaming
)

# ✅ Run sync streaming test
test_stream()
```
```
🔹 Testing Sync `stream()` (Streaming)...
Here is a haiku:

Golden sunset fades
Ripples on the quiet lake
Peaceful evening sky
```
Error Handling & Debugging
```python
# Define a minimal mock SeekrFlow client
class MockSeekrClient:
    """Mock SeekrFlow API client that mimics the real API structure."""

    class MockChat:
        """Mock Chat object with a completions method."""

        class MockCompletions:
            """Mock Completions object with a create method."""

            def create(self, *args, **kwargs):
                return {
                    "choices": [{"message": {"content": "Mock response"}}]
                }  # Mimic API response

        completions = MockCompletions()

    chat = MockChat()
```
```python
def test_initialization_errors():
    """Test that invalid ChatSeekrFlow initializations raise expected errors."""
    test_cases = [
        {
            "name": "Missing Client",
            "args": {"client": None, "model_name": "seekrflow-model"},
            "expected_error": "SeekrFlow client cannot be None.",
        },
        {
            "name": "Missing Model Name",
            "args": {"client": MockSeekrClient(), "model_name": ""},
            "expected_error": "A valid model name must be provided.",
        },
    ]

    for test in test_cases:
        try:
            print(f"Running test: {test['name']}")
            faulty_llm = ChatSeekrFlow(**test["args"])

            # If no error is raised, fail the test
            print(f"❌ Test '{test['name']}' failed: No error was raised!")
        except Exception as e:
            error_msg = str(e)
            assert test["expected_error"] in error_msg, f"Unexpected error: {error_msg}"
            print(f"✅ Expected Error: {error_msg}")

# Run test
test_initialization_errors()
```
```
Running test: Missing Client
✅ Expected Error: SeekrFlow client cannot be None.
Running test: Missing Model Name
✅ Expected Error: A valid model name must be provided.
```
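The tests above cover construction-time validation. At call time you may also want to guard `invoke()` against network or authentication failures. A minimal sketch follows; the broad `except Exception` is a placeholder, so narrow it to the exception classes the `seekrai` client actually raises:

```python
def safe_invoke(chat_llm, text):
    """Invoke the model, returning None instead of raising on runtime failures."""
    try:
        return chat_llm.invoke([HumanMessage(content=text)]).content
    except Exception as e:  # placeholder: catch the seekrai client's specific errors
        print(f"❌ Request failed: {e}")
        return None

print(safe_invoke(llm, "Hello again!"))
```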
API reference
- `ChatSeekrFlow` class: `langchain_seekrflow.ChatSeekrFlow`
- PyPI package: `langchain-seekrflow`
Related
- Chat model conceptual guide
- Chat model how-to guides