Commit 431ec33

stevenh and DouweM authored

Support AG-UI protocol for frontend-agent communication (#2223)

Co-authored-by: Douwe Maan <douwe@pydantic.dev>

1 parent 3c43c2d commit 431ec33

26 files changed: +2665 -35 lines changed

‎docs/ag-ui.md

Lines changed: 187 additions & 0 deletions
@@ -0,0 +1,187 @@
# Agent User Interaction (AG-UI) Protocol

The [Agent User Interaction (AG-UI) Protocol](https://docs.ag-ui.com/introduction) is an open standard introduced by the
[CopilotKit](https://webflow.copilotkit.ai/blog/introducing-ag-ui-the-protocol-where-agents-meet-users)
team that standardises how frontend applications communicate with AI agents, with support for streaming, frontend tools, shared state, and custom events.

Any Pydantic AI agent can be exposed as an AG-UI server using the [`Agent.to_ag_ui()`][pydantic_ai.Agent.to_ag_ui] convenience method.

!!! note
    The AG-UI integration was originally built by the team at [Rocket Science](https://www.rocketscience.gg/) and contributed in collaboration with the Pydantic AI and CopilotKit teams. Thanks Rocket Science!

## Installation

The only dependencies are:

- [ag-ui-protocol](https://docs.ag-ui.com/introduction): to provide the AG-UI types and encoder
- [starlette](https://www.starlette.io): to expose the AG-UI server as an [ASGI application](https://asgi.readthedocs.io/en/latest/)

You can install Pydantic AI with the `ag-ui` extra to ensure you have all the
required AG-UI dependencies:

```bash
pip/uv-add 'pydantic-ai-slim[ag-ui]'
```

To run the examples you'll also need:

- [uvicorn](https://www.uvicorn.org/) or another ASGI compatible server

```bash
pip/uv-add uvicorn
```
## Quick start

To expose a Pydantic AI agent as an AG-UI server, you can use the [`Agent.to_ag_ui()`][pydantic_ai.Agent.to_ag_ui] method:

```py {title="agent_to_ag_ui.py" py="3.10" hl_lines="4"}
from pydantic_ai import Agent

agent = Agent('openai:gpt-4.1', instructions='Be fun!')
app = agent.to_ag_ui()
```

Since `app` is an ASGI application, it can be used with any ASGI server:

```shell
uvicorn agent_to_ag_ui:app --host 0.0.0.0 --port 9000
```

This will expose the agent as an AG-UI server, and your frontend can start sending requests to it.

The `to_ag_ui()` method accepts the same arguments as the [`Agent.iter()`][pydantic_ai.agent.Agent.iter] method as well as arguments that let you configure the [Starlette](https://www.starlette.io)-based ASGI app.
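As a rough sketch of passing such arguments (hypothetical: the specific keyword arguments shown, `model_settings` and `debug`, are assumptions about what `Agent.iter()` and the Starlette app accept, so check the API reference for the exact signature):

```py {title="agent_to_ag_ui_config.py" py="3.10"}
from pydantic_ai import Agent

agent = Agent('openai:gpt-4.1', instructions='Be fun!')

# Hypothetical configuration: run arguments are forwarded to the underlying agent
# run, and app arguments to the Starlette-based ASGI application.
app = agent.to_ag_ui(
    model_settings={'temperature': 0.2},  # assumed Agent.iter() argument
    debug=True,  # assumed Starlette app argument
)
```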
## Design

The Pydantic AI AG-UI integration supports all features of the spec:

- [Events](https://docs.ag-ui.com/concepts/events)
- [Messages](https://docs.ag-ui.com/concepts/messages)
- [State Management](https://docs.ag-ui.com/concepts/state)
- [Tools](https://docs.ag-ui.com/concepts/tools)

The app receives messages in the form of a
[`RunAgentInput`](https://docs.ag-ui.com/sdk/js/core/types#runagentinput)
which describes the details of a request being passed to the agent, including
messages and state. These are then converted to Pydantic AI types and passed to the
agent, which then processes the request.

Events from the agent, including tool calls, are converted to AG-UI events and
streamed back to the caller as Server-Sent Events (SSE).

A user request may require multiple round trips between the client UI and the Pydantic AI
server, depending on the tools and events needed.
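As a rough illustration of this request/response flow, the sketch below sends a minimal `RunAgentInput` payload and prints the SSE stream of AG-UI events. It is a hypothetical client only: the endpoint path, headers, and exact field names are assumptions based on the AG-UI types rather than details confirmed here, and it assumes `httpx` is available.

```python {title="ag_ui_client_sketch.py" py="3.10"}
import httpx

# Assumed shape of a minimal RunAgentInput payload (camelCase field names follow
# the AG-UI TypeScript types); adjust to match the protocol version you use.
payload = {
    'threadId': 'thread-1',
    'runId': 'run-1',
    'messages': [{'id': 'msg-1', 'role': 'user', 'content': 'Hello!'}],
    'tools': [],
    'context': [],
    'state': {},
    'forwardedProps': {},
}

with httpx.stream(
    'POST',
    'http://localhost:9000/',  # assumed endpoint of the server started above
    json=payload,
    headers={'Accept': 'text/event-stream'},
    timeout=None,
) as response:
    for line in response.iter_lines():
        if line:  # each SSE line carries an AG-UI event, e.g. TEXT_MESSAGE_CONTENT
            print(line)
```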
## Features

### State management

The adapter provides full support for
[AG-UI state management](https://docs.ag-ui.com/concepts/state), which enables
real-time synchronization between agents and frontend applications.

In the example below we have document state which is shared between the UI and
server using [`StateDeps`][pydantic_ai.ag_ui.StateDeps], which implements the
[`StateHandler`][pydantic_ai.ag_ui.StateHandler] protocol and automatically
decodes the state contained in [`RunAgentInput.state`](https://docs.ag-ui.com/sdk/js/core/types#runagentinput)
when processing requests.

```python {title="ag_ui_state.py" py="3.10"}
from pydantic import BaseModel

from pydantic_ai import Agent
from pydantic_ai.ag_ui import StateDeps


class DocumentState(BaseModel):
    """State for the document being written."""

    document: str = ''


agent = Agent(
    'openai:gpt-4.1',
    instructions='Be fun!',
    deps_type=StateDeps[DocumentState],
)
app = agent.to_ag_ui(deps=StateDeps(DocumentState()))
```

Since `app` is an ASGI application, it can be used with any ASGI server:

```bash
uvicorn ag_ui_state:app --host 0.0.0.0 --port 9000
```
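Once decoded, the state is available to the agent as `ctx.deps.state`. The following is a minimal, hypothetical sketch of reading it from a dynamic instructions function (the `@agent.instructions` decorator usage is an assumption about one way to consume the state, not part of the example above):

```python {title="ag_ui_state_instructions.py" py="3.10"}
from pydantic import BaseModel

from pydantic_ai import Agent, RunContext
from pydantic_ai.ag_ui import StateDeps


class DocumentState(BaseModel):
    """State for the document being written."""

    document: str = ''


agent = Agent(
    'openai:gpt-4.1',
    instructions='Be fun!',
    deps_type=StateDeps[DocumentState],
)
app = agent.to_ag_ui(deps=StateDeps(DocumentState()))


@agent.instructions
def document_instructions(ctx: RunContext[StateDeps[DocumentState]]) -> str:
    # ctx.deps.state holds the DocumentState decoded from RunAgentInput.state.
    return f'The current document is:\n\n{ctx.deps.state.document}'
```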
### Tools

AG-UI frontend tools are seamlessly provided to the Pydantic AI agent, enabling rich
user experiences with frontend user interfaces.
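A frontend tool definition arrives with the AG-UI request as a name, a description, and a JSON Schema for its parameters. The sketch below is illustrative only: it assumes a `Tool` type exported by `ag_ui.core` with these fields, and in practice the frontend SDK constructs and sends the definition for you.

```python {title="ag_ui_frontend_tool_sketch.py" py="3.10"}
from ag_ui.core import Tool  # assumed import; normally built by the frontend SDK

# Hypothetical definition matching the 'background' tool used in the examples.
background_tool = Tool(
    name='background',
    description='Set the background color of the client window.',
    parameters={
        'type': 'object',
        'properties': {'color': {'type': 'string', 'description': 'CSS color value'}},
        'required': ['color'],
    },
)
```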
### Events

Pydantic AI tools can send
[AG-UI events](https://docs.ag-ui.com/concepts/events) simply by defining a tool
which returns a (subclass of)
[`BaseEvent`](https://docs.ag-ui.com/sdk/python/core/events#baseevent), which allows
for custom events and state updates.
```python {title="ag_ui_tool_events.py" py="3.10"}
from ag_ui.core import CustomEvent, EventType, StateSnapshotEvent
from pydantic import BaseModel

from pydantic_ai import Agent, RunContext
from pydantic_ai.ag_ui import StateDeps


class DocumentState(BaseModel):
    """State for the document being written."""

    document: str = ''


agent = Agent(
    'openai:gpt-4.1',
    instructions='Be fun!',
    deps_type=StateDeps[DocumentState],
)
app = agent.to_ag_ui(deps=StateDeps(DocumentState()))


@agent.tool
def update_state(ctx: RunContext[StateDeps[DocumentState]]) -> StateSnapshotEvent:
    return StateSnapshotEvent(
        type=EventType.STATE_SNAPSHOT,
        snapshot=ctx.deps.state,
    )


@agent.tool_plain
def custom_events() -> list[CustomEvent]:
    return [
        CustomEvent(
            type=EventType.CUSTOM,
            name='count',
            value=1,
        ),
        CustomEvent(
            type=EventType.CUSTOM,
            name='count',
            value=2,
        ),
    ]
```

Since `app` is an ASGI application, it can be used with any ASGI server:

```bash
uvicorn ag_ui_tool_events:app --host 0.0.0.0 --port 9000
```
## Examples

For more examples of how to use [`to_ag_ui()`][pydantic_ai.Agent.to_ag_ui] see
[`pydantic_ai_examples.ag_ui`](https://github.com/pydantic/pydantic-ai/tree/main/examples/pydantic_ai_examples/ag_ui),
which includes a server for use with the
[AG-UI Dojo](https://docs.ag-ui.com/tutorials/debugging#the-ag-ui-dojo).

‎docs/api/ag_ui.md

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
# `pydantic_ai.ag_ui`

::: pydantic_ai.ag_ui

‎docs/examples/ag-ui.md

Lines changed: 204 additions & 0 deletions
@@ -0,0 +1,204 @@
# Agent User Interaction (AG-UI)

Example of using Pydantic AI agents with the [AG-UI Dojo](https://github.com/ag-ui-protocol/ag-ui/tree/main/typescript-sdk/apps/dojo) example app.

See the [AG-UI docs](../ag-ui.md) for more information about the AG-UI integration.

Demonstrates:

- [AG-UI](../ag-ui.md)
- [Tools](../tools.md)

## Prerequisites

- An [OpenAI API key](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key)

## Running the Example

With [dependencies installed and environment variables set](./index.md#usage),
you will need two command line windows.

### Pydantic AI AG-UI backend

Set up your OpenAI API key:

```bash
export OPENAI_API_KEY=<your api key>
```

Start the Pydantic AI AG-UI example backend:

```bash
python/uv-run -m pydantic_ai_examples.ag_ui
```
### AG-UI Dojo example frontend

Next, run the AG-UI Dojo example frontend.

1. Clone the [AG-UI repository](https://github.com/ag-ui-protocol/ag-ui)

    ```shell
    git clone https://github.com/ag-ui-protocol/ag-ui.git
    ```

2. Change into the `ag-ui/typescript-sdk` directory

    ```shell
    cd ag-ui/typescript-sdk
    ```

3. Run the Dojo app following the [official instructions](https://github.com/ag-ui-protocol/ag-ui/tree/main/typescript-sdk/apps/dojo#development-setup)
4. Visit <http://localhost:3000/pydantic-ai>
5. Select the `Pydantic AI` view from the sidebar
## Feature Examples

### Agentic Chat

This demonstrates a basic agent interaction including Pydantic AI server-side
tools and AG-UI client-side tools.

View the [Agentic Chat example](http://localhost:3000/pydantic-ai/feature/agentic_chat).

#### Agent Tools

- `time` - Pydantic AI tool to check the current time for a time zone (sketched below)
- `background` - AG-UI tool to set the background color of the client window
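A server-side time tool might look roughly like the following. This is a hypothetical sketch for orientation only; the actual implementation is in the `agentic_chat.py` snippet referenced below.

```python {title="time_tool_sketch.py" py="3.10"}
from datetime import datetime
from zoneinfo import ZoneInfo

from pydantic_ai import Agent

agent = Agent('openai:gpt-4.1')


@agent.tool_plain
def time(timezone: str = 'UTC') -> str:
    """Return the current time in the given IANA time zone."""
    return datetime.now(tz=ZoneInfo(timezone)).isoformat()


app = agent.to_ag_ui()
```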
#### Agent Prompts

```text
What is the time in New York?
```

```text
Change the background to blue
```

A complex example which mixes both AG-UI and Pydantic AI tools:

```text
Perform the following steps, waiting for the response of each step before continuing:
1. Get the time
2. Set the background to red
3. Get the time
4. Report how long the background set took by diffing the two times
```
#### Agentic Chat - Code

```snippet {path="/examples/pydantic_ai_examples/ag_ui/api/agentic_chat.py"}```

### Agentic Generative UI

Demonstrates a long running task where the agent sends updates to the frontend
to let the user know what's happening.

View the [Agentic Generative UI example](http://localhost:3000/pydantic-ai/feature/agentic_generative_ui).

#### Plan Prompts

```text
Create a plan for breakfast and execute it
```

#### Agentic Generative UI - Code

```snippet {path="/examples/pydantic_ai_examples/ag_ui/api/agentic_generative_ui.py"}```
### Human in the Loop

Demonstrates a simple human-in-the-loop workflow where the agent comes up with a
plan and the user can approve it using checkboxes.

#### Task Planning Tools

- `generate_task_steps` - AG-UI tool to generate and confirm steps

#### Task Planning Prompt

```text
Generate a list of steps for cleaning a car for me to review
```

#### Human in the Loop - Code

```snippet {path="/examples/pydantic_ai_examples/ag_ui/api/human_in_the_loop.py"}```
### Predictive State Updates

Demonstrates how to use the predictive state updates feature to update the state
of the UI based on agent responses, including user interaction via user
confirmation.

View the [Predictive State Updates example](http://localhost:3000/pydantic-ai/feature/predictive_state_updates).

#### Story Tools

- `write_document` - AG-UI tool to write the document to a window
- `document_predict_state` - Pydantic AI tool that enables document state
  prediction for the `write_document` tool

This also shows how to use custom instructions based on shared state information.

#### Story Example

Starting document text:

```markdown
Bruce was a good dog,
```

Agent prompt:

```text
Help me complete my story about Bruce the dog, it should be no longer than a sentence.
```

#### Predictive State Updates - Code

```snippet {path="/examples/pydantic_ai_examples/ag_ui/api/predictive_state_updates.py"}```
### Shared State

Demonstrates how to use shared state between the UI and the agent.

State sent to the agent is detected by a function-based instruction, which then
validates the data using a custom Pydantic model before using it to create the
instructions for the agent to follow and send to the client using an AG-UI tool.

View the [Shared State example](http://localhost:3000/pydantic-ai/feature/shared_state).

#### Recipe Tools

- `display_recipe` - AG-UI tool to display the recipe in a graphical format

#### Recipe Example

1. Customise the basic settings of your recipe
2. Click `Improve with AI`

#### Shared State - Code

```snippet {path="/examples/pydantic_ai_examples/ag_ui/api/shared_state.py"}```
### Tool Based Generative UI

Demonstrates customised rendering for tool output with user confirmation.

View the [Tool Based Generative UI example](http://localhost:3000/pydantic-ai/feature/tool_based_generative_ui).

#### Haiku Tools

- `generate_haiku` - AG-UI tool to display a haiku in English and Japanese

#### Haiku Prompt

```text
Generate a haiku about formula 1
```

#### Tool Based Generative UI - Code

```snippet {path="/examples/pydantic_ai_examples/ag_ui/api/tool_based_generative_ui.py"}```
