language_models

Language models.

A language model is a type of model that can generate text or complete text prompts.

LangChain has two main classes to work with language models: Chat Models and "old-fashioned" LLMs.

Chat Models

Language models that use a sequence of messages as inputs and return chat messages as outputs (as opposed to using plain text). These are traditionally newer models (older models are generally LLMs, see below). Chat models support the assignment of distinct roles to conversation messages, helping to distinguish messages from the AI, users, and instructions such as system messages.

The key abstraction for chat models is BaseChatModel. Implementations should inherit from this class. To implement a custom Chat Model, inherit from BaseChatModel; see the following guide for more information:

https://python.langchain.com/docs/how_to/custom_chat_model/

LLMs

Language models that take a string as input and return a string. These are traditionally older models (newer models are generally Chat Models, see above).

Although the underlying models are string in, string out, the LangChain wrappers also allow these models to take messages as input. This gives them the same interface as Chat Models. When messages are passed in as input, they will be formatted into a string under the hood before being passed to the underlying model.

To implement a custom LLM, inherit from BaseLLM or LLM. Please see the following guide for more information on how to implement a custom LLM:

https://python.langchain.com/docs/how_to/custom_llm/

Classes

language_models.base.BaseLanguageModel

Abstract base class for interfacing with language models.

language_models.base.LangSmithParams

LangSmith parameters for tracing.

language_models.chat_models.BaseChatModel

Base class for chat models.

language_models.chat_models.SimpleChatModel

Simplified implementation for a chat model to inherit from.

language_models.fake.FakeListLLM

Fake LLM for testing purposes.

language_models.fake.FakeListLLMError

Fake error for testing purposes.

language_models.fake.FakeStreamingListLLM

Fake streaming list LLM for testing purposes.

language_models.fake_chat_models.FakeChatModel

Fake Chat Model wrapper for testing purposes.

language_models.fake_chat_models.FakeListChatModel

Fake ChatModel for testing purposes.

language_models.fake_chat_models.FakeListChatModelError

Fake error for testing purposes.

language_models.fake_chat_models.FakeMessagesListChatModel

Fake ChatModel for testing purposes.

language_models.fake_chat_models.GenericFakeChatModel

Generic fake chat model that can be used to test the chat model interface.

language_models.fake_chat_models.ParrotFakeChatModel

Generic fake chat model that can be used to test the chat model interface.

language_models.llms.BaseLLM

Base LLM abstract interface.

language_models.llms.LLM

Simple interface for implementing a custom LLM.

Functions

language_models.chat_models.agenerate_from_stream(stream)

Async generate from a stream.

language_models.chat_models.generate_from_stream(stream)

Generate from a stream.

language_models.llms.aget_prompts(params, ...)

Get prompts that are already cached.

language_models.llms.aupdate_cache(cache, ...)

Update the cache and get the LLM output.

language_models.llms.create_base_retry_decorator(...)

Create a retry decorator for a given LLM and a provided list of error types.

language_models.llms.get_prompts(params, prompts)

Get prompts that are already cached.

language_models.llms.update_cache(cache, ...)

Update the cache and get the LLM output.