# google-research/FLAN
This repository contains code to generate instruction tuning dataset collections. The first is the original Flan 2021, documented in Finetuned Language Models are Zero-Shot Learners, and the second is the expanded version, called the Flan Collection, described in The Flan Collection: Designing Data and Methods for Effective Instruction Tuning and used to produce Flan-T5 and Flan-PaLM.
To generate the Flan 2021 data as SeqIO mixtures, first install the packages in requirements.txt, then use mixtures.py.
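As a sketch of how this fits together: mixtures.py registers the Flan tasks and mixtures with SeqIO, after which they can be loaded by name through the standard SeqIO API. The mixture name and sequence lengths below are assumptions for illustration; check mixtures.py for the names actually registered.

```python
# Sketch: loading a registered Flan 2021 mixture through SeqIO.
# Assumes `pip install -r requirements.txt` has been run and that this
# script runs from the repository root so mixtures.py is importable.
import seqio
import mixtures  # noqa: F401 -- importing registers the Flan mixtures

# "flan_zero_shot" is a placeholder; substitute a mixture name
# registered in mixtures.py.
dataset = seqio.get_mixture_or_task("flan_zero_shot").get_dataset(
    sequence_length={"inputs": 1024, "targets": 256},
    split="train",
)

for example in dataset.take(1):
    print(example)
```

Since the mixtures are ordinary SeqIO registrations, they can also be consumed directly by T5X or any other SeqIO-based training pipeline.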
Please cite the following if you find Flan 2021 useful in your research.
```bibtex
@inproceedings{weifinetuned,
  title={Finetuned Language Models are Zero-Shot Learners},
  author={Wei, Jason and Bosma, Maarten and Zhao, Vincent and Guu, Kelvin and Yu, Adams Wei and Lester, Brian and Du, Nan and Dai, Andrew M and Le, Quoc V},
  booktitle={International Conference on Learning Representations}
}
```

The code in this repository is licensed according to the LICENSE file.
To contact us, feel free to create an Issue in this repository, or email the respective authors who contributed to this code base: Jason Wei for the Flan 2021 paper, Le Hou for the Scaling Flan paper, and Shayne Longpre for the Flan Collection.