DeepSparse

This page covers how to use the DeepSparse inference runtime within LangChain. It is broken into two parts: installation and setup, and then examples of DeepSparse usage.

Installation and Setup
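DeepSparse is distributed as a Python package. A minimal setup, assuming the standard PyPI package name for Neural Magic's runtime:

```shell
# Install the DeepSparse inference runtime
pip install deepsparse
```

After installation, models can be referenced by their SparseZoo stub (as in the examples below) and will be downloaded on first use.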

LLMs

There exists a DeepSparse LLM wrapper, which you can access with:

from langchain_community.llms import DeepSparse
API Reference: DeepSparse

It provides a unified interface for all models:

llm = DeepSparse(model='zoo:nlg/text_generation/codegen_mono-350m/pytorch/huggingface/bigpython_bigquery_thepile/base-none')

print(llm.invoke('def fib():'))

Additional parameters can be passed using the config parameter:

config = {'max_generated_tokens': 256}

llm = DeepSparse(model='zoo:nlg/text_generation/codegen_mono-350m/pytorch/huggingface/bigpython_bigquery_thepile/base-none', config=config)
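The wrapper also exposes a streaming mode, which yields tokens as they are generated rather than returning the full completion at once. A minimal sketch, assuming the deepsparse package is installed and the SparseZoo model from the examples above is available for download:

```python
from langchain_community.llms import DeepSparse

# streaming=True makes llm.stream() yield text chunks incrementally
llm = DeepSparse(
    model='zoo:nlg/text_generation/codegen_mono-350m/pytorch/huggingface/bigpython_bigquery_thepile/base-none',
    streaming=True,
)

# Print each chunk as it arrives instead of waiting for the full output
for chunk in llm.stream('def fib():'):
    print(chunk, end='')
```

This is useful for interactive applications where partial output should be displayed immediately.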
