
A collection of tricks to simplify and speed up transformer models.

These transformer tricks extend a recent trend in neural network design toward architectural parsimony, in which unnecessary components are removed to create more efficient models. Notable examples include RMSNorm's simplification of LayerNorm by removing mean centering, PaLM's elimination of bias parameters, and the decoder-only transformer's omission of the encoder stack. This trend began with the original transformer model's removal of recurrence and convolutions.

For example, our FlashNorm removes the weights from RMSNorm and merges them into the next linear layer, and slim attention removes the entire V-cache from the context memory of multi-head attention (MHA) transformers.
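To make the weight-merging idea concrete, here is a minimal PyTorch sketch of the FlashNorm trick. It is not the package's API; tensor names and shapes are illustrative. The RMSNorm weight vector is folded into the columns of the following linear layer once, offline, so inference only needs a weight-free normalization:

```python
import torch
import torch.nn.functional as F

def rmsnorm(x, eps=1e-6):
    """Weight-free RMS normalization."""
    return x * torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)

d_in, d_out = 8, 16
x = torch.randn(2, d_in)          # a batch of activations (illustrative shapes)
g = torch.randn(d_in)             # RMSNorm weight vector
W = torch.randn(d_out, d_in)      # weights of the next linear layer

# Baseline: weighted RMSNorm followed by the linear layer.
y_ref = F.linear(rmsnorm(x) * g, W)

# FlashNorm-style merge: scale the linear layer's input columns by g once,
# offline, so the normalization at inference time carries no weights.
W_merged = W * g                  # broadcasts g along the input dimension
y_merged = F.linear(rmsnorm(x), W_merged)

print(torch.allclose(y_ref, y_merged, atol=1e-5))  # both paths agree
```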


Explainer videos



Installation

Install the transformer tricks package:

pip install transformer-tricks

Alternatively, to run from the latest version of the repo:

git clone https://github.com/OpenMachine-ai/transformer-tricks.git
cd transformer-tricks
python3 -m venv .venv
source .venv/bin/activate
pip3 install --quiet -r requirements.txt

Documentation

Follow the links below for documentation of the Python code in this directory:


Notebooks

The papers are accompanied by the following Jupyter notebooks:

  • Slim attention: Colab
  • Flash normalization: Colab, Colab
  • Removing weights from skipless transformers: Colab

Newsletter

Please subscribe to our newsletter on Substack to get the latest news about this project. We will never send you more than one email per month.



Contributing

We pay cash for high-impact contributions. Please check out CONTRIBUTING for how to get involved.


Sponsors

The Transformer Tricks project is currently sponsored by OpenMachine. We'd love to hear from you if you'd like to join us in supporting this project.


Please give us a ⭐ if you like this repo, and check out TinyFive.


Star History Chart
