Add Hugging Face as a provider #1911
Conversation
Great! :) How can I help here?
Hi @Kludex,
Amazing! :)
hyperlint-aibot commented Jun 9, 2025 (edited)
PR Change Summary: Added support for Hugging Face as a model provider, enhancing the library's capabilities for AI inference.
Modified Files
Added Files
'huggingface:Qwen/QwQ-32B',
'huggingface:Qwen/Qwen2.5-72B-Instruct',
'huggingface:Qwen/Qwen3-235B-A22B',
'huggingface:Qwen/Qwen3-32B',
'huggingface:deepseek-ai/DeepSeek-R1',
'huggingface:meta-llama/Llama-3.3-70B-Instruct',
'huggingface:meta-llama/Llama-4-Maverick-17B-128E-Instruct',
'huggingface:meta-llama/Llama-4-Scout-17B-16E-Instruct',
How can we keep the list of those models up-to-date? Do you folks have an endpoint that we can call to list a lot of them, or something?
yes! we've just shipped https://router.huggingface.co/v1/models, which returns the list of models (sorted by trending score on the Hugging Face Hub). Not sure how I can update `KnownModelName` dynamically? I can do that for [LatestHuggingFaceModelNames](https://github.com/hanouticelina/pydantic-ai/blob/873f090197b6c7b8c2ae1c0760db17ef54814f86/pydantic_ai_slim/pydantic_ai/models/huggingface.py#L69), but `KnownModelName` needs to be updated as well.
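As a rough sketch of how that endpoint could be consumed (assuming the payload follows the OpenAI-style `/v1/models` shape, i.e. `{"data": [{"id": ...}, ...]}` — the helper names below are illustrative, not from the PR):

```python
# Sketch: list provider-prefixed model names from the router endpoint.
# Assumes an OpenAI-style /v1/models payload: {"data": [{"id": "..."}, ...]}.
import json
from urllib.request import urlopen

ROUTER_MODELS_URL = "https://router.huggingface.co/v1/models"

def extract_model_ids(payload: dict) -> list[str]:
    """Pull model ids out of an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in payload.get("data", [])]

def fetch_model_ids(url: str = ROUTER_MODELS_URL) -> list[str]:
    with urlopen(url) as resp:
        return extract_model_ids(json.load(resp))

if __name__ == "__main__":
    try:
        # First entries come back sorted by trending score on the Hub.
        for model_id in fetch_model_ids()[:10]:
            print(f"huggingface:{model_id}")
    except OSError as exc:  # offline / endpoint unreachable
        print(f"could not reach endpoint: {exc}")
```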
We can create a test that, when run locally, calls that endpoint and checks the response against this list of models.
In CI it should use the cassette.
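A minimal sketch of such a check (`KNOWN_HF_MODELS` and `stale_entries` are hypothetical stand-ins, not names from the PR; `live_ids` would come from the endpoint locally and from a recorded VCR cassette in CI):

```python
# Hypothetical stand-in for the hardcoded KnownModelName entries.
KNOWN_HF_MODELS = {
    "huggingface:Qwen/Qwen3-32B",
    "huggingface:deepseek-ai/DeepSeek-R1",
}

def stale_entries(known: set[str], live_ids: set[str]) -> set[str]:
    """Known model names whose Hub repo id the endpoint no longer reports."""
    return {m for m in known if m.removeprefix("huggingface:") not in live_ids}

# A pytest test could then assert, with live_ids fetched live or replayed
# from a cassette:
#     assert not stale_entries(KNOWN_HF_MODELS, live_ids)
```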
I wrote some comments here.
We need to create `ThinkingPart`s (does the HuggingFace client handle them?) and we need to add code coverage.
I prefer tests strictly with VCR, if possible.
Kludex commented Jul 8, 2025 (edited)
@hanouticelina Are you still working on this? Let me know when I should take over.
Hey @Kludex,
[DOCS] add more context to hugging face models page
…antic-ai into hf-inference-providers
Pull Request Overview
Add Hugging Face as a first-class inference provider, mirroring the existing OpenAI support.
- Introduce `HuggingFaceProvider` and register it in provider selection.
- Implement `HuggingFaceModel` with full streaming and request/response handling.
- Update tests, CLI, examples, dependencies, and documentation to include Hugging Face.
Reviewed Changes
Copilot reviewed 27 out of 28 changed files in this pull request and generated 1 comment.
Show a summary per file
File | Description
---|---
pydantic_ai_slim/pydantic_ai/providers/huggingface.py | New `HuggingFaceProvider` implementation
pydantic_ai_slim/pydantic_ai/models/huggingface.py | New `HuggingFaceModel` implementation
tests/providers/test_huggingface.py | Unit tests for provider initialization and errors
docs/models/huggingface.md | User documentation for Hugging Face support
pyproject.toml & pydantic_ai_slim/pyproject.toml | Add `huggingface` optional dependency
Comments suppressed due to low confidence (1)
docs/models/huggingface.md:10
- [nitpick] The installation command appears incorrect (`pip/uv-add`). It should be `pip install "pydantic-ai-slim[huggingface]"` for clarity and accuracy.
  Flagged line: `pip/uv-add "pydantic-ai-slim[huggingface]"`
I think we are good here.
Thanks for handling all the coverage! :)
Merged commit 4d755d2 into pydantic:main
Closes #1085.
Hi there, maintainer of the `huggingface_hub` library 🤗 here,

This PR introduces support for Hugging Face's Inference Providers (documentation here) as a Model Provider.

Our API is fully compatible with the OpenAI REST API spec, and the implementation closely mirrors the existing `OpenAIProvider`/`OpenAIModel` pair. Under the hood, we use the `huggingface_hub.AsyncInferenceClient` client, which is a drop-in replacement for the async OpenAI client but includes provider-specific (de)serialization logic that cannot be reproduced reliably with the OpenAI client alone (see @Wauplin's detailed explanation here).

Note that `huggingface_hub` is a stable and widely used library that was already listed as a dependency in the lockfile.

TODO:
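For context, a hedged usage sketch of what this PR enables: the `huggingface:` prefix matches the model names registered in the diff, `Agent` usage follows pydantic-ai's general pattern, and `result.output` is the accessor on recent releases (older ones used `result.data`); the `hf_model_name` helper is hypothetical.

```python
# Hypothetical usage sketch; requires `pip install "pydantic-ai-slim[huggingface]"`
# and an HF_TOKEN environment variable. Not taken verbatim from the PR.
import asyncio
import os

def hf_model_name(repo_id: str) -> str:
    """Prefix a Hub repo id with the provider, e.g. 'Qwen/Qwen3-32B' ->
    'huggingface:Qwen/Qwen3-32B' (the form used by KnownModelName)."""
    return f"huggingface:{repo_id}"

async def main() -> None:
    # Imported lazily so the helper above works without pydantic-ai installed.
    from pydantic_ai import Agent

    agent = Agent(hf_model_name("Qwen/Qwen3-32B"))
    result = await agent.run("What is the capital of France?")
    print(result.output)  # `.data` on older pydantic-ai releases

if __name__ == "__main__" and os.environ.get("HF_TOKEN"):
    asyncio.run(main())
```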