ModelScope
ModelScope is a large repository of models and datasets.
This page covers how to use the ModelScope ecosystem within LangChain. It is broken into two parts: installation and setup, and then references to specific ModelScope wrappers.
Installation
pip install -U langchain-modelscope-integration
Head to ModelScope to sign up and generate an SDK token. Once you've done this, set the MODELSCOPE_SDK_TOKEN
environment variable:
export MODELSCOPE_SDK_TOKEN=<your_sdk_token>
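Alternatively, if you are working in a notebook or script, you can set the token from Python before using the integration. This is a minimal sketch; prompting with getpass is just one way to avoid hard-coding the token.

import getpass
import os

# Set the ModelScope SDK token for the current process only.
if "MODELSCOPE_SDK_TOKEN" not in os.environ:
    os.environ["MODELSCOPE_SDK_TOKEN"] = getpass.getpass("Enter your ModelScope SDK token: ")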
Chat Models
The ModelScopeChatEndpoint class exposes chat models from ModelScope. See the available models here.
from langchain_modelscope import ModelScopeChatEndpoint

llm = ModelScopeChatEndpoint(model="Qwen/Qwen2.5-Coder-32B-Instruct")
llm.invoke("Sing a ballad of LangChain.")
Embeddings
The ModelScopeEmbeddings class exposes embeddings from ModelScope.
from langchain_modelscope import ModelScopeEmbeddings

embeddings = ModelScopeEmbeddings(model_id="damo/nlp_corom_sentence-embedding_english-base")
embeddings.embed_query("What is the meaning of life?")
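Like other LangChain embedding models, ModelScopeEmbeddings also supports embedding several documents in one call. A minimal sketch, reusing the embeddings object from above:

# Embed a batch of documents; returns one vector per input text.
documents = [
    "LangChain is a framework for building LLM applications.",
    "ModelScope hosts models and datasets.",
]
vectors = embeddings.embed_documents(documents)
len(vectors), len(vectors[0])  # number of documents, embedding dimension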
LLMs
The ModelScopeLLM class exposes LLMs from ModelScope.
from langchain_modelscope import ModelScopeLLM

llm = ModelScopeLLM(model="Qwen/Qwen2.5-Coder-32B-Instruct")
llm.invoke("The meaning of life is")