
Swift + Transformers


swift-transformers is a collection of utilities to help adopt language models in Swift apps.

It tries to follow the Python transformers API and abstractions whenever possible, but it also aims to provide an idiomatic Swift interface and does not assume prior familiarity with transformers or tokenizers.

Rationale & Overview

Check out our announcement post.

Modules

  • Tokenizers: Utilities to convert text to tokens and back, with support for Chat Templates and Tools. Follows the abstractions in tokenizers. Usage example:
import Tokenizers

func testTokenizer() async throws {
    let tokenizer = try await AutoTokenizer.from(pretrained: "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B")
    let messages = [["role": "user", "content": "Describe the Swift programming language."]]
    let encoded = try tokenizer.applyChatTemplate(messages: messages)
    let decoded = tokenizer.decode(tokens: encoded)
}
  • Hub: Utilities for interacting with the Hugging Face Hub! Download models, tokenizers and other config files. Usage example:
import Hub

func testHub() async throws {
    let repo = Hub.Repo(id: "mlx-community/Qwen2.5-0.5B-Instruct-2bit-mlx")
    let filesToDownload = ["config.json", "*.safetensors"]
    let modelDirectory: URL = try await Hub.snapshot(
        from: repo,
        matching: filesToDownload,
        progressHandler: { progress in
            print("Download progress: \(progress.fractionCompleted * 100)%")
        }
    )
    print("Files downloaded to: \(modelDirectory.path)")
}
  • Generation: Algorithms for text generation. Handles tokenization internally. Currently supported: greedy search, top-k sampling, and top-p sampling (see the sketch after this list).
  • Models: Language model abstraction over a Core ML package.
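
As a rough illustration of the sampling strategies named above (this is not the library's API, just a minimal, self-contained sketch), the code below shows how top-k sampling picks the next token id from a plain vector of logits, with greedy search as the k == 1 special case. The function names and the [Float] logits representation are assumptions made for this example.

import Foundation

// Given next-token logits (indexed by token id), keep the k highest-scoring
// candidates, turn their scores into probabilities with a softmax, and sample one.
func sampleTopK(logits: [Float], k: Int, temperature: Float = 1.0) -> Int {
    // Rank token ids by score and keep the top k.
    let topK = logits.enumerated()
        .sorted { $0.element > $1.element }
        .prefix(k)

    // Softmax over the retained scores (subtract the max for numerical stability).
    let maxLogit = topK.first!.element
    let expScores = topK.map { exp(Double(($0.element - maxLogit) / temperature)) }
    let total = expScores.reduce(0, +)
    let probabilities = expScores.map { $0 / total }

    // Sample from the categorical distribution defined by `probabilities`.
    var threshold = Double.random(in: 0..<1)
    for (candidate, p) in zip(topK, probabilities) {
        threshold -= p
        if threshold <= 0 { return candidate.offset }
    }
    return topK.last!.offset
}

// Greedy search is the special case k == 1: always pick the argmax.
func sampleGreedy(logits: [Float]) -> Int {
    sampleTopK(logits: logits, k: 1)
}

In the library itself, the Generation module applies this kind of strategy to the model's logits for you at each decoding step, so you normally only choose a strategy and its parameters rather than implementing the loop.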

Usage via SwiftPM

To use swift-transformers with SwiftPM, you can add this to your Package.swift:

dependencies: [
    .package(url: "https://github.com/huggingface/swift-transformers", from: "0.1.17")
]

And then, add the Transformers library as a dependency to your target:

targets: [
    .target(
        name: "YourTargetName",
        dependencies: [
            .product(name: "Transformers", package: "swift-transformers")
        ]
    )
]
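
For reference, the two snippets above fit into a complete Package.swift roughly as in the sketch below. The package and target names are placeholders, and the platform versions are assumptions; match them against the minimum platforms declared in swift-transformers' own manifest.

// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "YourPackageName",               // placeholder
    platforms: [.macOS(.v13), .iOS(.v16)], // assumption: align with swift-transformers' requirements
    dependencies: [
        .package(url: "https://github.com/huggingface/swift-transformers", from: "0.1.17")
    ],
    targets: [
        .target(
            name: "YourTargetName",        // placeholder
            dependencies: [
                .product(name: "Transformers", package: "swift-transformers")
            ]
        )
    ]
)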

Projects that use swift-transformers ❤️

Using swift-transformers in your project? Let us know and we'll add you to the list!

Supported Models

You can run inference on Core ML models with swift-transformers. Note that Core ML is not required to use the Tokenizers or Hub modules.

This package has been tested with autoregressive language models such as:

  • GPT, GPT-NeoX, GPT-J.
  • SantaCoder.
  • StarCoder.
  • Falcon.
  • Llama 2.

Encoder-decoder models such as T5 and Flan are currently not supported.

Other Tools

Contributing

Swift Transformers is a community project, and we welcome contributions. Please check out Issues tagged with good first issue if you are looking for a place to start!

Please ensure your code passes the build and test suite before submitting a pull request. You can run the tests with swift test.

License

Apache 2.0.
