Add provider for Anthropic's Vertexai Client #1392
Conversation
```python
# breaking this in multiple lines breaks pycharm type recognition. However, I was unable to stop ruff from
# doing it - # fmt: skip etc didn't work :(
provider: Literal['anthropic', 'anthropic-vertex']
| Provider[AsyncAnthropicVertex]
| Provider[AsyncAnthropic] = 'anthropic',  # fmt: skip
```
I think you need to do `# fmt: off` before and `# fmt: on` after.
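For reference, ruff's formatter honors paired `# fmt: off` / `# fmt: on` suppression comments around a region. A minimal sketch of the idea, with a plain `object` standing in for the real `Provider[...]` types (the alias name here is hypothetical):

```python
from typing import Literal, Union

# The formatter leaves lines between the two markers untouched,
# so the union can stay split across lines for PyCharm's benefit.
# fmt: off
ProviderArg = Union[
    Literal['anthropic', 'anthropic-vertex'],
    object,  # stand-in for Provider[AsyncAnthropic] | Provider[AsyncAnthropicVertex]
]
# fmt: on

provider: ProviderArg = 'anthropic'
print(provider)
```

Unlike `# fmt: skip`, which applies to a single statement line, the paired markers protect a whole block, which is usually what multi-line annotations need.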
```python
        instance of `Provider[AsyncAnthropic]`. If not provided, the other parameters will be used.
    provider: The provider to use for the Anthropic API. Can be either the string 'anthropic',
        'anthropic-vertex', or an instance of Provider[AsyncAnthropic] or Provider[AsyncAnthropicVertex].
        Defaults to 'anthropic'.
```
Maybe you should also add an entry to the `KnownModelName` literal at `models/__init__.py`.
```python
elif provider == 'anthropic-vertex':
    from .anthropic_vertex import AnthropicVertexProvider

    return AnthropicVertexProvider()
```
This is tricky. If you want `google-vertex:<anthropic-model>`, the agent should be able to infer this provider instead of the other... Maybe the provider should offer different clients depending on the model? Right now, the `GoogleVertexProvider` is generic on the client, which currently is `httpx.AsyncClient`. What if the provider could offer a client via a method?
```python
class GoogleVertexProvider(Provider):
    def get_client(self, tp: type[T]) -> T:
        # tp is the requested client *type*, so check with issubclass, not isinstance
        if issubclass(tp, httpx.AsyncClient):
            return self.httpx_client
        elif issubclass(tp, AsyncAnthropicVertex):
            return self.anthropic_client
        else:
            raise ValueError('not supported')
```
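The `get_client` dispatch idea can be tried out in isolation. In this sketch the client classes are stdlib stand-ins for `httpx.AsyncClient` and `AsyncAnthropicVertex`; all names below are hypothetical, not the library's actual API:

```python
from typing import TypeVar

T = TypeVar('T')


class FakeHttpxClient:  # stand-in for httpx.AsyncClient
    pass


class FakeAnthropicVertexClient:  # stand-in for AsyncAnthropicVertex
    pass


class GoogleVertexProvider:
    """One provider that hands out different clients on request."""

    def __init__(self) -> None:
        self.httpx_client = FakeHttpxClient()
        self.anthropic_client = FakeAnthropicVertexClient()

    def get_client(self, tp: type[T]) -> T:
        # Dispatch on the requested *type*, hence issubclass rather than isinstance.
        if issubclass(tp, FakeHttpxClient):
            return self.httpx_client  # type: ignore[return-value]
        elif issubclass(tp, FakeAnthropicVertexClient):
            return self.anthropic_client  # type: ignore[return-value]
        raise ValueError(f'client type {tp.__name__} not supported')


provider = GoogleVertexProvider()
print(type(provider.get_client(FakeAnthropicVertexClient)).__name__)
# → FakeAnthropicVertexClient
```

The trade-off is that the provider loses its single generic parameter: callers get runtime dispatch instead of a statically-typed client attribute.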
@Kludex I really want to see this happen - I would propose the following options:

1. Kick VertexAI out to its own top-level provider (e.g. like Bedrock) and let the VertexAI model class handle the Anthropic/Mistral/Llama/Gemini VertexAI providers. This would be pretty clean, as the string would be `google-vertex:<model name>`, which should route to the hypothetical VertexAI model class instead of a GeminiModel class.
2. Have routing logic based on `google-vertex:<model>` that would be responsible for routing to Pydantic-supported model types - e.g. it may only support Mistral/Gemini/Anthropic.
3. Change the naming convention for VertexAI Gemini models from `google-vertexai` to `gemini-vertexai` to support a model-provider model name format.

Personally, I like number 1 as I think it would be the cleanest from a top-level abstraction standpoint.

This PR is fine, we need to agree on how to choose this client when passing the model string to the agent.
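The routing in option 2 could look something like the sketch below. The prefix table, function name, and provider strings are assumptions for illustration, not the library's actual API:

```python
# Hypothetical routing table: model-name family -> provider string.
VERTEX_MODEL_PROVIDERS = {
    'claude': 'anthropic-vertex',
    'gemini': 'google-vertex',
    'mistral': 'mistral-vertex',
}


def infer_vertex_provider(model_string: str) -> tuple[str, str]:
    """Split 'google-vertex:<model>' and pick a provider from the model family."""
    prefix, _, model_name = model_string.partition(':')
    if prefix != 'google-vertex' or not model_name:
        raise ValueError(f'not a vertex model string: {model_string!r}')
    for family, provider in VERTEX_MODEL_PROVIDERS.items():
        if model_name.startswith(family):
            return provider, model_name
    raise ValueError(f'unsupported vertex model: {model_name!r}')


print(infer_vertex_provider('google-vertex:claude-3-5-sonnet'))
# → ('anthropic-vertex', 'claude-3-5-sonnet')
```

This keeps the `google-vertex:` string format while still letting an Anthropic model name reach an Anthropic-specific provider, which is the crux of the disagreement above.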
RoyLeviLangware commented May 5, 2025
For those stuck on this too, here is a temporary solution:
acehand commented May 29, 2025

When I do this, I get `cannot import name 'AsyncAnthropicVertex' from 'pydantic_ai.providers.anthropic'` - am I doing something wrong? I am loading the latest (0.2.11) pydantic-ai.
@acehand
andaag commented Jun 2, 2025

Worth noting on this that prompt caching does not work with the Anthropic Python SDK + Vertex... anthropics/anthropic-sdk-python#653 😢
@Kludex Just putting this back on your radar!
This PR is stale, and will be closed in 3 days if no reply is received.
Hello folks,

I had some time and started drafting a possible implementation for this issue: #960.

There's still lots of work to do, but I was feeling kind of insecure about my approach and wanted to get some (both general and concrete) feedback if possible before investing more time on it :)

My main questions are:
- `test_init` in `tests/models/test_anthropic.py` - how should I handle this?

Thank you very much for your feedback and help, and have a nice day :)