
ChatNetmind

This will help you get started with Netmind chat models. For detailed documentation of all ChatNetmind features and configurations head to the API reference.

Overview

Integration details

Class: ChatNetmind
Package: langchain-netmind
Package downloads / latest version: available on PyPI

Model features

Tool calling, Structured output, JSON mode, Image input, Audio input, Video input, Token-level streaming, Native async, Token usage, Logprobs

Setup

To access Netmind models you'll need to create a Netmind account, get an API key, and install the langchain-netmind integration package.

Credentials

Head to https://www.netmind.ai/ to sign up for Netmind and generate an API key. Once you've done this, set the NETMIND_API_KEY environment variable:

import getpass
import os

if not os.getenv("NETMIND_API_KEY"):
    os.environ["NETMIND_API_KEY"] = getpass.getpass("Enter your Netmind API key: ")

If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:

# os.environ["LANGCHAIN_TRACING_V2"] = "true"
# os.environ["LANGCHAIN_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain Netmind integration lives in the langchain-netmind package:

%pip install -qU langchain-netmind

[notice] A new release of pip is available: 24.0 -> 25.0.1
[notice] To update, run: pip install --upgrade pip
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate our model object and generate chat completions:

from langchain_netmind import ChatNetmind

llm = ChatNetmind(
    model="deepseek-ai/DeepSeek-V3",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # other params...
)

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
ai_msg
AIMessage(content="J'adore programmer.", additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 13, 'prompt_tokens': 31, 'total_tokens': 44, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'deepseek-ai/DeepSeek-V3', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-ca6c2010-844d-4bf6-baac-6e248491b000-0', usage_metadata={'input_tokens': 31, 'output_tokens': 13, 'total_tokens': 44, 'input_token_details': {}, 'output_token_details': {}})
print(ai_msg.content)
J'adore programmer.
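
Since token-level streaming is among the listed model features, you can also consume the response incrementally via the standard LangChain stream() method. A minimal sketch, reusing the llm and messages defined above:

# Stream the reply token by token; each chunk is an AIMessageChunk
# carrying a fragment of the content.
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)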

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: ChatPromptTemplate
AIMessage(content='Ich liebe es zu programmieren.', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 14, 'prompt_tokens': 26, 'total_tokens': 40, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'deepseek-ai/DeepSeek-V3', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None}, id='run-d63adcc6-53ba-4caa-9a79-78d640b39274-0', usage_metadata={'input_tokens': 26, 'output_tokens': 14, 'total_tokens': 40, 'input_token_details': {}, 'output_token_details': {}})
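
Because the chain is a standard LangChain Runnable, it can also be called asynchronously with ainvoke(). A minimal sketch, assuming the chain defined above and an environment that supports top-level await (such as a Jupyter notebook):

# Run the same translation chain asynchronously.
result = await chain.ainvoke(
    {
        "input_language": "English",
        "output_language": "Spanish",
        "input": "I love programming.",
    }
)
print(result.content)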

API reference

For detailed documentation of all ChatNetmind features and configurations, head to the API reference.
