Nebula (Symbl.ai)
This notebook covers how to get started with Nebula, Symbl.ai's chat model.
Integration details
Head to the API reference for detailed documentation.
Model features: TODO
Setup
Credentials
To get started, request a Nebula API key and set the NEBULA_API_KEY environment variable:
import getpass
import os
os.environ["NEBULA_API_KEY"] = getpass.getpass()
Installation
The integration is set up in the langchain-community package.
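You can install the package with pip (the `-qU` flags keep the output quiet and upgrade any existing install):

```shell
pip install -qU langchain-community
```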
Instantiation
from langchain_community.chat_models.symblai_nebula import ChatNebula
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
chat = ChatNebula(max_tokens=1024, temperature=0.5)
Invocation
messages = [
SystemMessage(
content="You are a helpful assistant that answers general knowledge questions."
),
HumanMessage(content="What is the capital of France?"),
]
chat.invoke(messages)
AIMessage(content=[{'role': 'human', 'text': 'What is the capital of France?'}, {'role': 'assistant', 'text': 'The capital of France is Paris.'}])
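Note that Nebula returns the whole conversation transcript as a list of role/text dicts in `content`, rather than a plain string. A minimal sketch for pulling out just the assistant's reply, assuming the response shape shown above:

```python
# Example content list, mirroring the AIMessage output above.
response_content = [
    {"role": "human", "text": "What is the capital of France?"},
    {"role": "assistant", "text": "The capital of France is Paris."},
]

# Keep only the assistant turns and take the most recent one.
assistant_turns = [
    turn["text"] for turn in response_content if turn["role"] == "assistant"
]
reply = assistant_turns[-1]
print(reply)  # The capital of France is Paris.
```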
Async
await chat.ainvoke(messages)
AIMessage(content=[{'role': 'human', 'text': 'What is the capital of France?'}, {'role': 'assistant', 'text': 'The capital of France is Paris.'}])
Streaming
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
The capital of France is Paris.
Batch
chat.batch([messages])
[AIMessage(content=[{'role': 'human', 'text': 'What is the capital of France?'}, {'role': 'assistant', 'text': 'The capital of France is Paris.'}])]
Chaining
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
chain = prompt | chat
API Reference: ChatPromptTemplate
chain.invoke({"topic": "cows"})
AIMessage(content=[{'role': 'human', 'text': 'Tell me a joke about cows'}, {'role': 'assistant', 'text': "Sure, here's a joke about cows:\n\nWhy did the cow cross the road?\n\nTo get to the udder side!"}])
API reference
Check out the API reference for more detail.
Related
- Chat model conceptual guide
- Chat model how-to guides