# swift-transformers

Swift Package to implement a `transformers`-like API in Swift.
This is a collection of utilities to help adopt language models in Swift apps. It tries to follow the Python `transformers` API and abstractions whenever possible, but it also aims to provide an idiomatic Swift interface and does not assume prior familiarity with `transformers` or `tokenizers`.

Please check our post.
## Modules

- `Tokenizers`: Utilities to convert text to tokens and back. Follows the abstractions in `tokenizers` and `transformers.js`. Usage example:
```swift
import Tokenizers

func testTokenizer() async throws {
    let tokenizer = try await AutoTokenizer.from(pretrained: "pcuenq/Llama-2-7b-chat-coreml")
    let inputIds = tokenizer("Today she took a train to the West")
    assert(inputIds == [1, 20628, 1183, 3614, 263, 7945, 304, 278, 3122])
}
```
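Going the other direction, token ids can be decoded back into text. A minimal sketch, assuming the tokenizer exposes a `decode(tokens:)` method mirroring the `tokenizers` abstractions (check the `Tokenizers` module for the exact API):

```swift
import Tokenizers

func testDetokenizer() async throws {
    let tokenizer = try await AutoTokenizer.from(pretrained: "pcuenq/Llama-2-7b-chat-coreml")
    // decode(tokens:) is assumed here to mirror the round-trip API of `tokenizers`.
    let text = tokenizer.decode(tokens: [1, 20628, 1183, 3614, 263, 7945, 304, 278, 3122])
    print(text) // expected to contain "Today she took a train to the West"
}
```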
However, you don't usually need to tokenize the input text yourself; the `Generation` code will take care of it.
- `Hub`: Utilities to download configuration files from the Hub, used to instantiate tokenizers and learn about language model characteristics.
- `Generation`: Algorithms for text generation. Currently supported ones are greedy search and top-k sampling.
- `Models`: Language model abstraction over a Core ML package (see the sketch below).
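Putting the modules together, the intended flow is: resolve the tokenizer and configuration via the Hub, load the Core ML model, and let `Generation` drive decoding. The sketch below is a rough illustration only; `LanguageModel.loadCompiled(url:computeUnits:)`, `GenerationConfig`, and `generate(config:prompt:)` are assumptions based on the module descriptions above, not a verified API, so check the `Models` and `Generation` sources for the exact entry points.

```swift
import CoreML
import Generation
import Models

func generateText() async throws {
    // Load a compiled Core ML language model (Models module).
    // The file path and entry point names are assumptions for illustration.
    let modelURL = URL(fileURLWithPath: "/path/to/Llama-2-7b-chat.mlmodelc")
    let model = try LanguageModel.loadCompiled(url: modelURL, computeUnits: .cpuAndGPU)

    // Configure generation (Generation module): greedy search or top-k sampling.
    var config = GenerationConfig(maxNewTokens: 64)
    config.doSample = true
    config.topK = 50

    // Tokenization happens inside generate(), as noted above.
    let output = try await model.generate(config: config, prompt: "Write a haiku about trains.")
    print(output)
}
```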
This package has been tested with autoregressive language models such as:
- GPT, GPT-NeoX, GPT-J.
- SantaCoder.
- StarCoder.
- Falcon.
- Llama 2.
Encoder-decoder models such as T5 and Flan are currently not supported. They are high up in our priority list.
## Other Tools

- `swift-chat`, a simple app demonstrating how to use this package.
- `exporters`, a Core ML conversion package for transformers models, based on Apple's `coremltools`.
- `transformers-to-coreml`, a no-code Core ML conversion tool built on `exporters`.
To use `swift-transformers` with SwiftPM, you can add this to your `Package.swift`:
```swift
dependencies: [
    .package(url: "https://github.com/huggingface/swift-transformers", from: "0.1.5")
]
```
And then, add the `Transformers` library as a dependency to your target:
```swift
targets: [
    .target(
        name: "YourTargetName",
        dependencies: [
            .product(name: "Transformers", package: "swift-transformers")
        ]
    )
]
```
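For reference, here is a minimal complete `Package.swift` combining both snippets. The package and target names are placeholders, and the platform versions are assumptions to adjust to your own deployment targets:

```swift
// swift-tools-version: 5.8
import PackageDescription

let package = Package(
    name: "MyTransformersApp",           // placeholder package name
    platforms: [
        .macOS(.v13), .iOS(.v16)         // assumed minimums; adjust to your deployment targets
    ],
    dependencies: [
        .package(url: "https://github.com/huggingface/swift-transformers", from: "0.1.5")
    ],
    targets: [
        .executableTarget(
            name: "MyTransformersApp",   // placeholder target name
            dependencies: [
                .product(name: "Transformers", package: "swift-transformers")
            ]
        )
    ]
)
```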
## Roadmap

- Tokenizers: download from the Hub, port from `tokenizers`
  - BPE family
  - Fix Falcon, broken while porting BPE
  - Improve tests, add edge cases, see https://github.com/xenova/transformers.js/blob/27920d84831e323275b38f0b5186644b7936e1a2/tests/generate_tests.py#L24
  - Include fallback `tokenizer_config.json` for known architectures whose models don't have a configuration in the Hub (GPT2)
  - Port other tokenizer types: Unigram, WordPiece
- `exporters`, the Core ML conversion tool
  - Allow max sequence length to be specified
  - Allow discrete shapes
  - Return `logits` from converted Core ML model
  - Use `coremltools` `@main` for latest fixes. In particular, this merged PR makes it easier to use recent versions of transformers.
- Generation
  - Nucleus sampling (we currently have greedy and top-k sampling; see the standalone sketch after this list)
  - Use new `top-k` implementation in `Accelerate`
  - Support discrete shapes in the underlying Core ML model by selecting the smallest sequence length larger than the input
  - Optimization: cache past key-values
  - Encoder-decoder models (T5)
- Demo app
  - Allow system prompt to be specified
  - How to define a system prompt template?
  - Test a code model (to stretch system prompt definition)
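For context on the nucleus sampling roadmap item: top-p (nucleus) sampling keeps the smallest set of highest-probability tokens whose cumulative probability reaches `p`, then samples from that set. The following standalone sketch is not part of the package; it just illustrates the technique over an already-normalized probability vector:

```swift
import Foundation

/// Nucleus (top-p) sampling over a normalized probability distribution.
/// Returns the index (token id) of the sampled entry.
func sampleNucleus(probs: [Float], topP: Float = 0.9) -> Int {
    // Sort token ids by descending probability.
    let sorted = probs.enumerated().sorted { $0.element > $1.element }

    // Keep the smallest prefix whose cumulative probability reaches topP.
    var nucleus: [(offset: Int, element: Float)] = []
    var cumulative: Float = 0
    for entry in sorted {
        nucleus.append(entry)
        cumulative += entry.element
        if cumulative >= topP { break }
    }

    // Sample within the accumulated mass (equivalent to renormalizing the nucleus).
    let r = Float.random(in: 0..<cumulative)
    var acc: Float = 0
    for entry in nucleus {
        acc += entry.element
        if r < acc { return entry.offset }
    }
    return nucleus[nucleus.count - 1].offset
}

// Toy 5-token distribution: with topP = 0.8, only the three most likely tokens can be sampled.
let tokenId = sampleNucleus(probs: [0.5, 0.2, 0.15, 0.1, 0.05], topP: 0.8)
print("Sampled token id:", tokenId)
```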