google/flaxformer

Flaxformer is a transformer library, primarily for NLP and multimodal research at Google. It is used for many NLP research use cases, providing both off-the-shelf BERT and T5 models, as well as several research projects built on shared components.

General library goals

The Flaxformer library aims to provide transformer models that are:

  • High performance: Models are annotated for use with the PJIT API, enabling them to be used for training the largest models.
  • Reusable: Components have self-contained configuration, and high-level modules like encoders, decoders, etc. don't make too many assumptions about what their sub-modules look like.
  • Tested: We aim to employ a reasonable amount of unit testing, and write tests whenever bugs are encountered. However, no guarantees are provided.
  • Maintainable: We have created a versioning strategy for our modules so code refactors can take place which alter the module structure. This is tricky in Flax, because Flax generates a tree of parameters based on the exact module structure. Our approach lets us maintain compatibility with previously trained model checkpoints; see the sketch after this list.
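To make the checkpointing concern concrete, here is a minimal sketch in plain Flax (not Flaxformer code) of how the parameter tree mirrors module structure: submodule names become tree keys, so a refactor that renames or re-nests a submodule changes the checkpoint layout.

```python
import jax
import jax.numpy as jnp
from flax import linen as nn

class Block(nn.Module):
    @nn.compact
    def __call__(self, x):
        # Flax names this submodule 'Dense_0'; that name becomes a key in the
        # parameter tree, so renaming it would invalidate old checkpoints.
        return nn.Dense(features=4)(x)

params = Block().init(jax.random.PRNGKey(0), jnp.ones((1, 8)))
print(jax.tree_util.tree_map(jnp.shape, params))
# {'params': {'Dense_0': {'bias': (4,), 'kernel': (8, 4)}}}
```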

Code locations

Modeling components such as dense attention, layer norms, and MLP blocks can be found in the components/ directory.

Higher-level classes which combine these components can be found in the architectures/ directory. The current architecture file for the T5 family of models is architectures/t5/t5_architecture.py; this is a mid-level API requiring sub-components to be configured. A high-level starting point, exposing fewer parameters, is architectures/t5/t5_1_1.py.
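As a rough illustration of this mid-level style, the sketch below (plain Flax; EncoderLayer and its factory parameters are invented for illustration, not Flaxformer's actual classes) shows how a higher-level module can take factories for its sub-modules, so it assumes little about their internals:

```python
from typing import Callable
import jax
import jax.numpy as jnp
from flax import linen as nn

class EncoderLayer(nn.Module):
    # Factories keep this module agnostic to its sub-modules' internals.
    attention_factory: Callable[[], nn.Module]
    mlp_factory: Callable[[], nn.Module]

    @nn.compact
    def __call__(self, x):
        x = x + self.attention_factory()(x)  # residual around attention
        return x + self.mlp_factory()(x)     # residual around the MLP block

layer = EncoderLayer(
    attention_factory=lambda: nn.SelfAttention(num_heads=2),
    mlp_factory=lambda: nn.Dense(features=16),
)
out, _ = layer.init_with_output(jax.random.PRNGKey(0), jnp.ones((1, 3, 16)))
print(out.shape)  # (1, 3, 16)
```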

Relationship to other codebases

Flaxformer is primarily used by other research projects, in particular T5X. We hope to release examples demonstrating the integration of these codebases soon.

If you would like to use Flaxformer independently of T5X, please see the unit tests for examples instantiating the models. In the medium-term future, we hope to provide more stand-alone examples of Flaxformer use.
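In the meantime, instantiation follows the standard Flax init/apply pattern, since Flaxformer models are flax.linen modules; the sketch below uses a stand-in nn.Dense rather than a real Flaxformer model:

```python
import jax
import jax.numpy as jnp
from flax import linen as nn

# Stand-in model; a real Flaxformer encoder or decoder is driven the same way,
# because Flaxformer models are flax.linen modules.
model = nn.Dense(features=32)

rng = jax.random.PRNGKey(0)
dummy_batch = jnp.ones((2, 16))                # dummy inputs fix the shapes
variables = model.init(rng, dummy_batch)       # builds the parameter tree
outputs = model.apply(variables, dummy_batch)  # forward pass
print(outputs.shape)  # (2, 32)
```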

Contributions

Unfortunately, we cannot accept contributions to the Flaxformer repo at this time, so any pull requests will be automatically closed - but please file issues as needed!

Installing dependencies and running tests

First, we recommend installing a few dependencies manually,

pip3 install numpy sentencepiece 'tensorflow>=2.14.0'

This is a workaround to prevent pip backtracking on package versions; we believe there is either a version conflict in upstream packages, or pip's constraint-solving process is imperfect.

Then, check out this repository. In its root directory, you can install it along with test dependencies by running,

pip3 install '.[testing]'

If you like, you can run the tests with pytest using the following invocation,

python3 -m pytest

Uninstalling

If you need to uninstall Flaxformer, please run,

pip3 uninstall flaxformer

Troubleshooting

Flax deps

Flaxformer is developed in close collaboration with the Flax team. There may be bugs if your Flax version is not up to date. To install the latest version from GitHub, please run,

pip3 uninstall flax
pip3 install git+https://github.com/google/flax
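To confirm which Flax version you have installed, you can run,

python3 -c "import flax; print(flax.__version__)"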

Note

Flaxformer is a project maintained by a team in Google Research. It is not an official Google product.
