lanwuwei/GigaBERT
This repo contains pre-trained models and the code-switched data generation script for GigaBERT:

@inproceedings{lan2020gigabert,
  author    = {Lan, Wuwei and Chen, Yang and Xu, Wei and Ritter, Alan},
  title     = {An Empirical Study of Pre-trained Transformers for Arabic Information Extraction},
  booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year      = {2020}
}

Please check Yang Chen's GitHub for code and data.
The pre-trained models can be found here: GigaBERT-v3 and GigaBERT-v4.
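A minimal usage sketch for the pre-trained models, assuming they are hosted on the Hugging Face Hub under a BERT-base-style checkpoint; the model identifier `lanwuwei/GigaBERT-v4-Arabic-and-English` is an assumption and may differ from the actual checkpoint name linked above:

```python
# Hypothetical sketch: loading a GigaBERT checkpoint with Hugging Face transformers.
# The Hub ID below is assumed, not confirmed by this README.
from transformers import AutoModel, AutoTokenizer

model_name = "lanwuwei/GigaBERT-v4-Arabic-and-English"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a code-switched Arabic/English sentence and extract contextual embeddings.
inputs = tokenizer("GigaBERT supports العربية and English", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

The resulting `last_hidden_state` tensor can then be fed to a task-specific head, e.g. for zero-shot transfer from English to Arabic information-extraction tasks.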
Please contact Wuwei Lan for code-switched GigaBERT with different configurations.
Apache License 2.0
This material is based in part on research sponsored by IARPA via the BETTER program (2019-19051600004).
About
Zero-shot Transfer Learning from English to Arabic