
ChatDeepSeek

This will help you get started with DeepSeek's hosted chat models. For detailed documentation of all ChatDeepSeek features and configurations head to the API reference.

tip

DeepSeek's models are open source and can be run locally (e.g. in Ollama) or on other inference providers (e.g. Fireworks, Together) as well.

Overview

Integration details

| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| :--- | :--- | :---: | :---: | :---: | :---: | :---: |
| ChatDeepSeek | langchain-deepseek | ❌ | beta | ✅ | PyPI - Downloads | PyPI - Version |

Model features

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs
note

DeepSeek-R1, specified via model="deepseek-reasoner", does not support tool calling or structured output. Those features are supported by DeepSeek-V3 (specified via model="deepseek-chat").
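For deepseek-chat, a tool can be described with an OpenAI-compatible JSON schema and bound to the model. A minimal sketch (the get_weather tool, its name, and its parameters are illustrative assumptions, not part of DeepSeek's API):

```python
# Hypothetical tool definition as an OpenAI-compatible JSON schema;
# the tool name and parameter set here are illustrative only.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a given city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# With DEEPSEEK_API_KEY set, the schema could then be bound to the model:
# from langchain_deepseek import ChatDeepSeek
# llm_with_tools = ChatDeepSeek(model="deepseek-chat").bind_tools([get_weather_tool])
# ai_msg = llm_with_tools.invoke("What's the weather in Paris?")
# ai_msg.tool_calls  # list of tool invocations the model requested
```

The same schema format works for structured output via with_structured_output, which deepseek-chat also supports per the note above.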

Setup

To access DeepSeek models you'll need to create a DeepSeek account, get an API key, and install the langchain-deepseek integration package.

Credentials

Head to DeepSeek's API Key page to sign up for DeepSeek and generate an API key. Once you've done this, set the DEEPSEEK_API_KEY environment variable:

import getpass
import os

if not os.getenv("DEEPSEEK_API_KEY"):
    os.environ["DEEPSEEK_API_KEY"] = getpass.getpass("Enter your DeepSeek API key: ")

To enable automated tracing of your model calls, set your LangSmith API key:

# os.environ["LANGSMITH_TRACING"] = "true"
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain DeepSeek integration lives in the langchain-deepseek package:

%pip install -qU langchain-deepseek

Instantiation

Now we can instantiate our model object and generate chat completions:

from langchain_deepseek import ChatDeepSeek

llm = ChatDeepSeek(
    model="deepseek-chat",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # other params...
)
API Reference: ChatDeepSeek

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
ai_msg.content
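The returned ai_msg is an AIMessage: besides .content it carries token-usage information in .usage_metadata. A guarded sketch, safe to run offline because it only contacts the API when DEEPSEEK_API_KEY is set (the printed translation is an assumption, so no output is shown):

```python
import os

ai_msg = None
# Only contact the API when a key is available, so this sketch runs safely offline.
if os.getenv("DEEPSEEK_API_KEY"):
    from langchain_deepseek import ChatDeepSeek

    llm = ChatDeepSeek(model="deepseek-chat", temperature=0)
    ai_msg = llm.invoke("Translate to French: I love programming.")
    print(ai_msg.content)         # the model's translation
    print(ai_msg.usage_metadata)  # input/output/total token counts
```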

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: ChatPromptTemplate
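Conceptually, the chain first fills in the template variables and then hands the resulting messages to the model. The formatting step alone can be mimicked in plain Python (this sketch imitates, rather than calls, ChatPromptTemplate):

```python
# Plain-Python sketch of the prompt-formatting step only (no API call);
# this imitates what ChatPromptTemplate produces before the model is invoked.
system_template = (
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
inputs = {"input_language": "English", "output_language": "German", "input": "I love programming."}

messages = [
    ("system", system_template.format(**inputs)),  # extra keys in inputs are ignored
    ("human", inputs["input"]),
]
print(messages[0][1])
# → You are a helpful assistant that translates English to German.
```

The real chain then passes these messages to llm.invoke, exactly as in the Invocation section above.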

API reference

For detailed documentation of all ChatDeepSeek features and configurations, head to the API Reference.
