Use Hugging Face with JavaScript

huggingface/huggingface.js


```js
await inference.translation({
  model: 't5-base',
  inputs: 'My name is Wolfgang and I live in Berlin'
})

await inference.textToImage({
  model: 'stabilityai/stable-diffusion-2',
  inputs: 'award winning high resolution photo of a giant tortoise/((ladybird)) hybrid, [trending on artstation]',
  parameters: {
    negative_prompt: 'blurry',
  }
})
```

This is a collection of JS libraries to interact with the Hugging Face API, with TS types included.

With more to come, like `@huggingface/endpoints` to manage your HF Endpoints!

We use modern features to avoid polyfills and dependencies, so the libraries will only work on modern browsers / Node.js >= 18 / Bun / Deno.
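That runtime requirement can be verified up front. A minimal sketch (this check is our own illustration, not part of the libraries): it tests for the global `fetch` API that modern browsers, Node.js >= 18, Bun, and Deno all provide.

```javascript
// The libraries avoid polyfills by relying on modern platform features such as
// the global fetch API. This startup check is our own illustration, not part
// of the libraries themselves.
const hasModernRuntime = typeof fetch === "function";
if (!hasModernRuntime) {
  throw new Error("Requires a modern browser, Node.js >= 18, Bun, or Deno.");
}
```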

The libraries are still very young; please help us by opening issues!

Installation

From NPM

To install via NPM, you can download the libraries as needed:

```shell
npm install @huggingface/inference
npm install @huggingface/hub
```

Then import the libraries in your code:

```ts
import { HfInference } from "@huggingface/inference";
import { createRepo, commit, deleteRepo, listFiles } from "@huggingface/hub";
import type { RepoId, Credentials } from "@huggingface/hub";
```

From CDN or Static hosting

You can run our packages with vanilla JS, without any bundler, by using a CDN or static hosting. Using ES modules, i.e. `<script type="module">`, you can import the libraries in your code:

```html
<script type="module">
  import { HfInference } from 'https://cdn.jsdelivr.net/npm/@huggingface/inference@2.0.0/+esm';
  import { createRepo, commit, deleteRepo, listFiles } from "https://cdn.jsdelivr.net/npm/@huggingface/hub@0.5.0/+esm";
</script>
```

Usage examples

Get your HF access token in your account settings.
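Rather than hardcoding the token in source files, a common pattern is to read it from an environment variable. A hedged sketch (the `HF_TOKEN` variable name is our own convention, not mandated by the libraries):

```javascript
// Read the Hugging Face access token from an environment variable instead of
// committing it to source control. The variable name HF_TOKEN is our own
// choice; user access tokens from hf.co start with "hf_".
const HF_ACCESS_TOKEN = process.env.HF_TOKEN ?? "";
const looksLikeToken = HF_ACCESS_TOKEN.startsWith("hf_");
if (!looksLikeToken) {
  console.warn("Set HF_TOKEN to a Hugging Face access token (it starts with 'hf_').");
}
```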

@huggingface/inference examples

```ts
import { HfInference } from "@huggingface/inference";

const HF_ACCESS_TOKEN = "hf_...";
const inference = new HfInference(HF_ACCESS_TOKEN);

await inference.translation({
  model: 't5-base',
  inputs: 'My name is Wolfgang and I live in Berlin'
})

await inference.textToImage({
  model: 'stabilityai/stable-diffusion-2',
  inputs: 'award winning high resolution photo of a giant tortoise/((ladybird)) hybrid, [trending on artstation]',
  parameters: {
    negative_prompt: 'blurry',
  }
})

await inference.imageToText({
  data: await (await fetch('https://picsum.photos/300/300')).blob(),
  model: 'nlpconnect/vit-gpt2-image-captioning',
})

// Using your own inference endpoint: https://hf.co/docs/inference-endpoints/
const gpt2 = inference.endpoint('https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2');
const { generated_text } = await gpt2.textGeneration({ inputs: 'The answer to the universe is' });
```

@huggingface/hub examples

```ts
import { createRepo, uploadFile, deleteFiles } from "@huggingface/hub";

const HF_ACCESS_TOKEN = "hf_...";

await createRepo({
  repo: "my-user/nlp-model", // or {type: "model", name: "my-user/nlp-test"},
  credentials: { accessToken: HF_ACCESS_TOKEN }
});

await uploadFile({
  repo: "my-user/nlp-model",
  credentials: { accessToken: HF_ACCESS_TOKEN },
  // Can work with native File in browsers
  file: {
    path: "pytorch_model.bin",
    content: new Blob(...)
  }
});

await deleteFiles({
  repo: { type: "space", name: "my-user/my-space" }, // or "spaces/my-user/my-space"
  credentials: { accessToken: HF_ACCESS_TOKEN },
  paths: ["README.md", ".gitattributes"]
});
```

There are of course more features; check each library's README!

Formatting & testing

```shell
pnpm install
pnpm -r format:check
pnpm -r lint:check
pnpm -r test
```

Building

```shell
pnpm -r build
```

This will generate ESM and CJS JavaScript files in `packages/*/dist`, e.g. `packages/inference/dist/index.mjs`.

