pydantic-ai 0.0.12
pip install pydantic-ai==0.0.12
Released:
Agent Framework / shim to use Pydantic with LLMs
Project description
Documentation: ai.pydantic.dev
When I first found FastAPI, I got it immediately. I was excited to find something so innovative and ergonomic built on Pydantic.
Virtually every Agent Framework and LLM library in Python uses Pydantic, but when we began to use LLMs in Pydantic Logfire, I couldn't find anything that gave me the same feeling.
PydanticAI is a Python Agent Framework designed to make it less painful to build production-grade applications with Generative AI.
Why use PydanticAI
- Built by the team behind Pydantic (the validation layer of the OpenAI SDK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more)
- Model-agnostic: currently OpenAI, Gemini, and Groq are supported, and there is a simple interface to implement support for other models (see the sketch after this list)
- Type-safe
- Control flow and agent composition is done with vanilla Python, allowing you to make use of the same Python development best practices you'd use in any other (non-AI) project
- Structured response validation with Pydantic
- Streamed responses, including validation of streamed structured responses with Pydantic
- Novel, type-safe dependency injection system, useful for testing and eval-driven iterative development
- Logfire integration for debugging and monitoring the performance and general behavior of your LLM-powered application
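The model-agnostic and structured-response points above combine naturally: the model is picked by a plain string when the agent is constructed, and `result_type` tells PydanticAI to validate the reply into a Pydantic model. A minimal sketch following the patterns used in the examples below (the `CityInfo` model, the prompts, and the shown output are illustrative, not part of the package):

```python
from pydantic import BaseModel
from pydantic_ai import Agent


class CityInfo(BaseModel):
    """Illustrative result model; any Pydantic model works here."""
    city: str
    country: str


agent = Agent(
    'gemini-1.5-flash',  # swapping in 'openai:gpt-4o' is a one-string change
    result_type=CityInfo,  # responses are validated into this model
    system_prompt='Extract the city and country the user mentions.',
)

result = agent.run_sync('I was born in Paris, France.')
print(result.data)
# city='Paris' country='France'  (illustrative output)
```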
In Beta!
PydanticAI is in early beta: the API is still subject to change and there's a lot more to do. Feedback is very welcome!
Hello World Example
Here's a minimal example of PydanticAI:
```python
from pydantic_ai import Agent

# Define a very simple agent including the model to use, you can also set the model when running the agent.
agent = Agent(
    'gemini-1.5-flash',
    # Register a static system prompt using a keyword argument to the agent.
    # For more complex dynamically-generated system prompts, see the example below.
    system_prompt='Be concise, reply with one sentence.',
)

# Run the agent synchronously, conducting a conversation with the LLM.
# Here the exchange should be very short: PydanticAI will send the system prompt and the user query to the LLM,
# the model will return a text response. See below for a more complex run.
result = agent.run_sync('Where does "hello world" come from?')
print(result.data)
"""
The first known use of "hello, world" was in a 1974 textbook about the C programming language.
"""
```
(This example is complete; it can be run "as is".)
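The comment in the example notes that the model can also be set when running the agent rather than at construction time. Assuming the run methods accept a `model` argument for this (the parameter name is an assumption, not confirmed by this page), an override might look like:

```python
# Assumed API: passing `model` at run time overrides the model set on the Agent.
result = agent.run_sync(
    'Where does "hello world" come from?',
    model='openai:gpt-4o',  # hypothetical override; parameter name is an assumption
)
print(result.data)
```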
Not very interesting yet, but we can easily add "tools", dynamic system prompts, and structured responses to build more powerful agents.
Tools & Dependency Injection Example
Here is a concise example using PydanticAI to build a support agent for a bank:
(Better documented example in the docs)
```python
from dataclasses import dataclass

from pydantic import BaseModel, Field
from pydantic_ai import Agent, RunContext

from bank_database import DatabaseConn


# SupportDependencies is used to pass data, connections, and logic into the model that will be needed when running
# system prompt and tool functions. Dependency injection provides a type-safe way to customise the behavior of your agents.
@dataclass
class SupportDependencies:
    customer_id: int
    db: DatabaseConn


# This pydantic model defines the structure of the result returned by the agent.
class SupportResult(BaseModel):
    support_advice: str = Field(description='Advice returned to the customer')
    block_card: bool = Field(description="Whether to block the customer's card")
    risk: int = Field(description='Risk level of query', ge=0, le=10)


# This agent will act as first-tier support in a bank.
# Agents are generic in the type of dependencies they accept and the type of result they return.
# In this case, the support agent has type `Agent[SupportDependencies, SupportResult]`.
support_agent = Agent(
    'openai:gpt-4o',
    deps_type=SupportDependencies,
    # The response from the agent will be guaranteed to be a SupportResult,
    # if validation fails the agent is prompted to try again.
    result_type=SupportResult,
    system_prompt=(
        'You are a support agent in our bank, give the '
        'customer support and judge the risk level of their query.'
    ),
)


# Dynamic system prompts can make use of dependency injection.
# Dependencies are carried via the `RunContext` argument, which is parameterized with the `deps_type` from above.
# If the type annotation here is wrong, static type checkers will catch it.
@support_agent.system_prompt
async def add_customer_name(ctx: RunContext[SupportDependencies]) -> str:
    customer_name = await ctx.deps.db.customer_name(id=ctx.deps.customer_id)
    return f"The customer's name is {customer_name!r}"


# `tool` lets you register functions which the LLM may call while responding to a user.
# Again, dependencies are carried via `RunContext`, any other arguments become the tool schema passed to the LLM.
# Pydantic is used to validate these arguments, and errors are passed back to the LLM so it can retry.
@support_agent.tool
async def customer_balance(ctx: RunContext[SupportDependencies], include_pending: bool) -> float:
    """Returns the customer's current account balance."""
    # The docstring of a tool is also passed to the LLM as the description of the tool.
    # Parameter descriptions are extracted from the docstring and added to the parameter schema sent to the LLM.
    balance = await ctx.deps.db.customer_balance(
        id=ctx.deps.customer_id,
        include_pending=include_pending,
    )
    return balance


...  # In a real use case, you'd add more tools and a longer system prompt


async def main():
    deps = SupportDependencies(customer_id=123, db=DatabaseConn())
    # Run the agent asynchronously, conducting a conversation with the LLM until a final response is reached.
    # Even in this fairly simple case, the agent will exchange multiple messages with the LLM as tools are called to retrieve a result.
    result = await support_agent.run('What is my balance?', deps=deps)
    # The result will be validated with Pydantic to guarantee it is a `SupportResult`, since the agent is generic,
    # it'll also be typed as a `SupportResult` to aid with static type checking.
    print(result.data)
    """
    support_advice='Hello John, your current account balance, including pending transactions, is $123.45.' block_card=False risk=1
    """
    result = await support_agent.run('I just lost my card!', deps=deps)
    print(result.data)
    """
    support_advice="I'm sorry to hear that, John. We are temporarily blocking your card to prevent unauthorized transactions." block_card=True risk=8
    """
```
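To actually execute this, the async `main` above needs an event loop; a minimal entry point (assuming `bank_database.DatabaseConn` is a real, working connection and valid OpenAI credentials are configured) is:

```python
import asyncio

if __name__ == '__main__':
    # Drive the async main() defined in the example above.
    asyncio.run(main())
```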
Next Steps
To try PydanticAI yourself, follow the instructions in the examples.
Read the docs to learn more about building applications with PydanticAI.
Read the API Reference to understand PydanticAI's interface.
Project details
Verified details
These details have been verified by PyPI.
Project links
Owner
GitHub Statistics
Maintainers
Unverified details
These details have not been verified by PyPI.
Project links
Meta
- License: MIT License (MIT)
- Author: Samuel Colvin
- Requires: Python >=3.9
- Provides-Extra: examples, logfire
Classifiers
- Development Status
- Environment
- Framework
- Intended Audience
- License
- Operating System
- Programming Language
- Topic
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: pydantic_ai-0.0.12.tar.gz
Built Distribution: pydantic_ai-0.0.12-py3-none-any.whl
File details
Details for the file pydantic_ai-0.0.12.tar.gz.
File metadata
- Download URL: pydantic_ai-0.0.12.tar.gz
- Upload date:
- Size: 41.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | e86ff5e5137be64c247a9ea1cf39867a96a92de28ed60440ec46fcc32ca8916c |
| MD5 | 2f844fe74416363238aa9f236a3078d3 |
| BLAKE2b-256 | 5d0d846bcf91a22c8b167b3923fa4771d57d79635a32ad20bba3e6e9aa496ce1 |
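To check a downloaded copy of the sdist against the SHA256 digest above, a short standard-library sketch (it assumes the file sits in the current directory; the same approach works for the wheel further down with its own digest):

```python
import hashlib
from pathlib import Path

# SHA256 digest published above for pydantic_ai-0.0.12.tar.gz
EXPECTED = 'e86ff5e5137be64c247a9ea1cf39867a96a92de28ed60440ec46fcc32ca8916c'

digest = hashlib.sha256(Path('pydantic_ai-0.0.12.tar.gz').read_bytes()).hexdigest()
print('OK' if digest == EXPECTED else 'hash mismatch')
```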
Provenance
The following attestation bundles were made for pydantic_ai-0.0.12.tar.gz:
Publisher: ci.yml on pydantic/pydantic-ai
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: pydantic_ai-0.0.12.tar.gz
- Subject digest: e86ff5e5137be64c247a9ea1cf39867a96a92de28ed60440ec46fcc32ca8916c
- Sigstore transparency entry: 154082857
- Sigstore integration time:
- Permalink: pydantic/pydantic-ai@98681d7a8ae82a92a28c9bad343adc96ce410ad1
- Branch / Tag: refs/tags/v0.0.12
- Owner: https://github.com/pydantic
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: ci.yml@98681d7a8ae82a92a28c9bad343adc96ce410ad1
- Trigger Event: push
File details
Details for the file pydantic_ai-0.0.12-py3-none-any.whl.
File metadata
- Download URL: pydantic_ai-0.0.12-py3-none-any.whl
- Upload date:
- Size: 9.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 0dbfec39744a4ed87467bb0ce34c736cf6b3cade0256a7530e8b4a7c7c916afb |
| MD5 | 3f90551758e8b884fe81b44303c71c89 |
| BLAKE2b-256 | d74f19046bd564a21cfc6d65e5053a6385aa170c312dd6b2ed9fa98218f03831 |
Provenance
The following attestation bundles were made for pydantic_ai-0.0.12-py3-none-any.whl:
Publisher: ci.yml on pydantic/pydantic-ai
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: pydantic_ai-0.0.12-py3-none-any.whl
- Subject digest: 0dbfec39744a4ed87467bb0ce34c736cf6b3cade0256a7530e8b4a7c7c916afb
- Sigstore transparency entry: 154082865
- Sigstore integration time:
- Permalink: pydantic/pydantic-ai@98681d7a8ae82a92a28c9bad343adc96ce410ad1
- Branch / Tag: refs/tags/v0.0.12
- Owner: https://github.com/pydantic
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: ci.yml@98681d7a8ae82a92a28c9bad343adc96ce410ad1
- Trigger Event: push