ChatMistralAI
This will help you get started with Mistral chat models. For detailed documentation of all ChatMistralAI features and configurations head to the API reference. The ChatMistralAI class is built on top of the Mistral API. For a list of all the models supported by Mistral, check out this page.
Overview
Integration details
| Class | Package | Local | Serializable | JS support |
| --- | --- | --- | --- | --- |
| ChatMistralAI | langchain_mistralai | ❌ | beta | ✅ |
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ |
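Of the features above, JSON mode is worth a quick illustration: the model returns its answer as a JSON string in the message content, which you then parse yourself. A minimal sketch of the parsing side (the content string below is mocked so no API call is made; see the API reference for how to enable JSON mode on a request):

```python
import json

# Mocked message content, standing in for ai_msg.content from a JSON-mode call
content = '{"translation": "J\'aime programmer", "language": "fr"}'

data = json.loads(content)
print(data["translation"])  # J'aime programmer
```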
Setup
To access ChatMistralAI models you'll need to create a Mistral account, get an API key, and install the langchain_mistralai integration package.
Credentials
A valid API key is needed to communicate with the API. Once you have one, set the MISTRAL_API_KEY environment variable:

```python
import getpass
import os

if "MISTRAL_API_KEY" not in os.environ:
    os.environ["MISTRAL_API_KEY"] = getpass.getpass("Enter your Mistral API key: ")
```
To enable automated tracing of your model calls, set your LangSmith API key:

```python
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")
# os.environ["LANGSMITH_TRACING"] = "true"
```
Installation
The LangChain Mistral integration lives in the langchain_mistralai package:

```python
%pip install -qU langchain_mistralai
```
Instantiation
Now we can instantiate our model object and generate chat completions:
```python
from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(
    model="mistral-large-latest",
    temperature=0,
    max_retries=2,
    # other params...
)
```
Invocation
```python
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
ai_msg
```
```
AIMessage(content='Sure, I\'d be happy to help you translate that sentence into French! The English sentence "I love programming" translates to "J\'aime programmer" in French. Let me know if you have any other questions or need further assistance!', response_metadata={'token_usage': {'prompt_tokens': 32, 'total_tokens': 84, 'completion_tokens': 52}, 'model': 'mistral-small', 'finish_reason': 'stop'}, id='run-64bac156-7160-4b68-b67e-4161f63e021f-0', usage_metadata={'input_tokens': 32, 'output_tokens': 52, 'total_tokens': 84})
```
```python
print(ai_msg.content)
```

```
Sure, I'd be happy to help you translate that sentence into French! The English sentence "I love programming" translates to "J'aime programmer" in French. Let me know if you have any other questions or need further assistance!
```
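The AIMessage above also carries a usage_metadata dict with token counts. A small sketch of reading it, with the values copied from the output above (plain dict access, so it runs without an API key):

```python
# Values copied from ai_msg.usage_metadata in the invocation above
usage = {"input_tokens": 32, "output_tokens": 52, "total_tokens": 84}

def summarize_usage(usage: dict) -> str:
    """Format token counts, checking that input + output add up to the total."""
    assert usage["input_tokens"] + usage["output_tokens"] == usage["total_tokens"]
    return f"{usage['input_tokens']} in + {usage['output_tokens']} out = {usage['total_tokens']} total tokens"

print(summarize_usage(usage))  # 32 in + 52 out = 84 total tokens
```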
Chaining
We can chain our model with a prompt template like so:
```python
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
```
```
AIMessage(content='Ich liebe Programmierung. (German translation)', response_metadata={'token_usage': {'prompt_tokens': 26, 'total_tokens': 38, 'completion_tokens': 12}, 'model': 'mistral-small', 'finish_reason': 'stop'}, id='run-dfd4094f-e347-47b0-9056-8ebd7ea35fe7-0', usage_metadata={'input_tokens': 26, 'output_tokens': 12, 'total_tokens': 38})
```
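The model also supports token-level streaming via llm.stream(messages), which yields message chunks as they arrive. A sketch of the accumulation pattern, with the chunk contents mocked so it runs without an API key:

```python
# With a live key: for chunk in llm.stream(messages): print(chunk.content, end="")
# Mocked chunk contents, standing in for the chunk.content values:
chunks = ["Ich", " liebe", " Programmierung", "."]

full = ""
for piece in chunks:
    full += piece  # concatenate streamed pieces into the full reply
print(full)  # Ich liebe Programmierung.
```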
API reference
Head to the API reference for detailed documentation of all attributes and methods.
Related
- Chat model conceptual guide
- Chat model how-to guides