
Add provider for Anthropic's Vertexai Client #1392

Draft
ehaca wants to merge 1 commit into pydantic:main from ehaca:anthropic_via_hyperscaler

Conversation

ehaca (Contributor)

Hello folks,
I had some time and started drafting a possible implementation for this issue: #960.
There's still lots of work to do, but I was feeling somewhat unsure about my approach and wanted to get some (both general and concrete) feedback, if possible, before investing more time in it :)

My main questions are:

  1. Is the idea of creating yet another optional extra to your liking?
  2. The PyCharm typing issue with:

     ```python
     def __init__(
         self,
         model_name: AnthropicModelName,
         *,
         # breaking this in multiple lines breaks pycharm type recognition. However, I was unable to stop ruff from
         # doing it - # fmt: skip etc didn't work :(
         provider: Literal['anthropic', 'anthropic-vertex']
         | Provider[AsyncAnthropicVertex]
         | Provider[AsyncAnthropic] =  # fmt: skip
         'anthropic',
     ):
     ```
  3. Now I am getting some static typing errors for test_init in tests/models/test_anthropic.py; how should I handle this?

Thank you very much for your feedback and help, and have a nice day :)

@ehaca mentioned this pull request on Apr 6, 2025
Comment on lines +120 to +125

```python
# breaking this in multiple lines breaks pycharm type recognition. However, I was unable to stop ruff from
# doing it - # fmt: skip etc didn't work :(
provider: Literal['anthropic', 'anthropic-vertex']
| Provider[AsyncAnthropicVertex]
| Provider[AsyncAnthropic] =  # fmt: skip
'anthropic',
```

Member


I think you need to use `# fmt: off` before and `# fmt: on` after.
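For illustration, a minimal self-contained sketch of wrapping a signature in `# fmt: off` / `# fmt: on` so formatters leave it alone (the function name and types below are stand-ins, not the real pydantic-ai signature):

```python
from typing import Literal, Union

# fmt: off
def make_provider(
    provider: Union[Literal['anthropic', 'anthropic-vertex'], object] = 'anthropic',
):
    # Stand-in body; the real __init__ in pydantic-ai does much more.
    return provider
# fmt: on
```

Unlike `# fmt: skip`, which suppresses formatting for a single statement, the off/on pair protects the whole region from both Black and Ruff's formatter.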

```python
        instance of `Provider[AsyncAnthropic]`. If not provided, the other parameters will be used.
    provider: The provider to use for the Anthropic API. Can be either the string 'anthropic',
        'anthropic-vertex', or an instance of Provider[AsyncAnthropic] or Provider[AsyncAnthropicVertex].
        Defaults to 'anthropic'.
```


Maybe you should also add an entry to the `KnownModelName` literal at `models/__init__.py`.
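As a sketch, extending a `Literal` alias like `KnownModelName` just means adding the new string to the union (the entries below are illustrative, not the real list in pydantic-ai):

```python
from typing import Literal, get_args

# Illustrative excerpt; the real KnownModelName in pydantic_ai
# contains many more entries.
KnownModelName = Literal[
    'anthropic:claude-3-7-sonnet-latest',
    'anthropic-vertex:claude-3-7-sonnet@20250219',
]

# get_args recovers the allowed strings at runtime, which is handy in tests.
allowed = get_args(KnownModelName)
```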

Comment on lines +76 to +79

```python
elif provider == 'anthropic-vertex':
    from .anthropic_vertex import AnthropicVertexProvider

    return AnthropicVertexProvider()
```

Member


This is tricky. If you want `google-vertex:<anthropic-model>`, the agent should be able to infer this provider instead of the other... Maybe the provider should offer different clients depending on the model?

Right now, the `GoogleVertexProvider` is generic on the client, which currently is `httpx.AsyncClient`. What if the provider could offer a client via a method?

```python
class GoogleVertexProvider(Provider):
    def get_client(self, tp: type[T]) -> T:
        if isinstance(tp, httpx.AsyncClient):
            return self.httpx_client
        elif isinstance(tp, AsyncAnthropicVertex):
            return self.anthropic_client
        else:
            raise ValueError('not supported')
```
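A runnable sketch of that type-dispatch idea, with stub classes standing in for `httpx.AsyncClient` and `AsyncAnthropicVertex` (and using `issubclass` for the check, since `tp` is a type rather than an instance):

```python
from typing import TypeVar

T = TypeVar('T')


class HttpxClientStub:
    """Stand-in for httpx.AsyncClient."""


class AnthropicVertexStub:
    """Stand-in for anthropic.AsyncAnthropicVertex."""


class GoogleVertexProviderSketch:
    """Hypothetical provider that hands out different clients by requested type."""

    def __init__(self) -> None:
        self.httpx_client = HttpxClientStub()
        self.anthropic_client = AnthropicVertexStub()

    def get_client(self, tp: type[T]) -> T:
        # Dispatch on the requested client type.
        if issubclass(tp, HttpxClientStub):
            return self.httpx_client  # type: ignore[return-value]
        elif issubclass(tp, AnthropicVertexStub):
            return self.anthropic_client  # type: ignore[return-value]
        raise ValueError(f'{tp!r} not supported')
```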


@Kludex I really want to see this happen - I would propose the following options:

  1. Kick out VertexAI to its own top-level provider - e.g. like Bedrock - and let the VertexAI model class handle Anthropic/Mistral/Llama/Gemini VertexAI providers. This would be pretty clean, as the string would be `google-vertex:<model name>`, which should route to the hypothetical VertexAI model class instead of a GeminiModel class.
  2. Have routing logic based on `google-vertex:<model>` that would be responsible for routing to Pydantic-supported model types - e.g. it may only support Mistral/Gemini/Anthropic.
  3. Change the naming convention for VertexAI Gemini models from `google-vertexai` to `gemini-vertexai` to support a model-provider model name format.

Personally, I like number 1, as I think it would be the cleanest from a top-level abstraction standpoint.
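The routing logic in option 2 could be sketched as a small prefix-dispatch function (the function name and the mapping below are hypothetical, not pydantic-ai API):

```python
def route_vertex_model(model_string: str) -> str:
    """Map a 'google-vertex:<model>' string to a hypothetical provider key."""
    provider, sep, model = model_string.partition(':')
    if not sep or provider != 'google-vertex':
        raise ValueError(f'not a google-vertex model string: {model_string!r}')
    model = model.strip()
    # Illustrative prefix table for the model families mentioned above.
    for prefix, target in (
        ('claude', 'anthropic-vertex'),
        ('gemini', 'gemini-vertex'),
        ('mistral', 'mistral-vertex'),
    ):
        if model.startswith(prefix):
            return target
    raise ValueError(f'unsupported vertex model: {model!r}')
```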

@DouweM closed this Apr 30, 2025
@Kludex reopened this Apr 30, 2025
@Kludex (Member)

This PR is fine; we still need to work out how to choose this client when passing the model string to the agent.

@RoyLeviLangware

For those stuck on this too, here is a temporary solution:

```python
model_obj = AnthropicModel(
    "claude-3-7-sonnet@20250219",
    provider=AnthropicProvider(
        anthropic_client=AsyncAnthropicVertex(
            project_id="PROJECT_ID",
            region="us-east5",
        )  # type: ignore
    ),
)
```

@acehand

> For those stuck on this too, here is a temporary solution: [...]

When I do this, I get `cannot import name 'AsyncAnthropicVertex' from 'pydantic_ai.providers.anthropic'`. Am I doing something wrong? I am loading the latest (0.2.11) pydantic-ai.

@DouweM (Contributor)

@acehand `AsyncAnthropicVertex` is not provided by `pydantic_ai`; you'll have to import it from `anthropic`.

@andaag

Worth noting that prompt caching does not work with the Anthropic Python SDK + Vertex... anthropics/anthropic-sdk-python#653 😢


@DouweM (Contributor)

@Kludex Just putting this back on your radar!

@github-actions (GitHub Actions)

This PR is stale, and will be closed in 3 days if no reply is received.

Reviewers

@acostajohn left review comments
@Kludex left review comments
@Jflick58 left review comments

Assignees

@Kludex

Projects
None yet

Milestone
No milestone

Development

Successfully merging this pull request may close these issues.

8 participants
@ehaca @Kludex @RoyLeviLangware @acehand @DouweM @andaag @acostajohn @Jflick58
