# huggingface.js

Use Hugging Face with JavaScript
This is a collection of JS libraries to interact with the Hugging Face API, with TS types included.
- @huggingface/inference: Use the Inference API to make calls to 100,000+ Machine Learning models, or your own inference endpoints!
- @huggingface/hub: Interact with huggingface.co to create or delete repos and commit / download files
With more to come, like @huggingface/endpoints to manage your HF Endpoints!
We use modern features to avoid polyfills and dependencies, so the libraries will only work on modern browsers / Node.js >= 18 / Bun / Deno.
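Because the libraries rely on modern runtime built-ins (such as the global `fetch` available in Node.js 18+), it can be useful to fail fast on an unsupported Node.js version. This guard is an illustrative sketch, not something the libraries provide:

```javascript
// Illustrative guard: refuse to run on Node.js older than 18,
// since the libraries depend on built-ins like global fetch.
const major = Number(process.versions.node.split(".")[0]);
if (major < 18) {
  throw new Error(`Node.js >= 18 required, found ${process.versions.node}`);
}
console.log(`Node.js ${process.versions.node} is supported`);
```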
The libraries are still very young; please help us by opening issues!
To install via NPM, you can download the libraries as needed:

```shell
npm install @huggingface/inference
npm install @huggingface/hub
```
Then import the libraries in your code:

```ts
import { HfInference } from "@huggingface/inference";
import { createRepo, commit, deleteRepo, listFiles } from "@huggingface/hub";
import type { RepoId, Credentials } from "@huggingface/hub";
```
You can run our packages with vanilla JS, without any bundler, by using a CDN or static hosting. Using ES modules, i.e. `<script type="module">`, you can import the libraries in your code:

```html
<script type="module">
  import { HfInference } from 'https://cdn.jsdelivr.net/npm/@huggingface/inference@2.0.0/+esm';
  import { createRepo, commit, deleteRepo, listFiles } from "https://cdn.jsdelivr.net/npm/@huggingface/hub@0.5.0/+esm";
</script>
```
Get your HF access token in your account settings.
```ts
import { HfInference } from "@huggingface/inference";

const HF_ACCESS_TOKEN = "hf_...";

const inference = new HfInference(HF_ACCESS_TOKEN);

await inference.translation({
  model: 't5-base',
  inputs: 'My name is Wolfgang and I live in Berlin'
})

await inference.textToImage({
  model: 'stabilityai/stable-diffusion-2',
  inputs: 'award winning high resolution photo of a giant tortoise/((ladybird)) hybrid, [trending on artstation]',
  parameters: {
    negative_prompt: 'blurry',
  }
})

await inference.imageToText({
  data: await (await fetch('https://picsum.photos/300/300')).blob(),
  model: 'nlpconnect/vit-gpt2-image-captioning',
})

// Using your own inference endpoint: https://hf.co/docs/inference-endpoints/
const gpt2 = inference.endpoint('https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2');
const { generated_text } = await gpt2.textGeneration({ inputs: 'The answer to the universe is' });
```
```ts
import { createRepo, uploadFile, deleteFiles } from "@huggingface/hub";

const HF_ACCESS_TOKEN = "hf_...";

await createRepo({
  repo: "my-user/nlp-model", // or { type: "model", name: "my-user/nlp-test" }
  credentials: { accessToken: HF_ACCESS_TOKEN }
});

await uploadFile({
  repo: "my-user/nlp-model",
  credentials: { accessToken: HF_ACCESS_TOKEN },
  // Can work with native File in browsers
  file: {
    path: "pytorch_model.bin",
    content: new Blob(...)
  }
});

await deleteFiles({
  repo: { type: "space", name: "my-user/my-space" }, // or "spaces/my-user/my-space"
  credentials: { accessToken: HF_ACCESS_TOKEN },
  paths: ["README.md", ".gitattributes"]
});
```
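As the comments in the example above note, a repo can be referenced either as a `RepoId` object like `{ type: "space", name: "my-user/my-space" }` or as a compact string like `"spaces/my-user/my-space"` (plain `"user/name"` strings refer to models). The helper below is an illustrative sketch of that equivalence, not part of the library:

```typescript
type RepoType = "model" | "dataset" | "space";
interface RepoId { type: RepoType; name: string; }

// Illustrative helper: expand a compact designator such as
// "spaces/my-user/my-space" into a RepoId object. Plain "user/name"
// strings default to the "model" type.
function parseRepo(designator: string): RepoId {
  const prefixes: Record<string, RepoType> = {
    "models/": "model",
    "datasets/": "dataset",
    "spaces/": "space",
  };
  for (const [prefix, type] of Object.entries(prefixes)) {
    if (designator.startsWith(prefix)) {
      return { type, name: designator.slice(prefix.length) };
    }
  }
  return { type: "model", name: designator };
}

console.log(parseRepo("spaces/my-user/my-space")); // { type: "space", name: "my-user/my-space" }
```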
There are more features, of course; check each library's README!
```shell
pnpm install
pnpm -r format:check
pnpm -r lint:check
pnpm -r test
```

```shell
pnpm -r build
```

This will generate ESM and CJS JavaScript files in `packages/*/dist`, e.g. `packages/inference/dist/index.mjs`.