
Zero-shot Transfer Learning from English to Arabic


lanwuwei/GigaBERT


This repo contains pre-trained models and the code-switched data generation script for GigaBERT:

@inproceedings{lan2020gigabert,
  author    = {Lan, Wuwei and Chen, Yang and Xu, Wei and Ritter, Alan},
  title     = {An Empirical Study of Pre-trained Transformers for Arabic Information Extraction},
  booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year      = {2020}
}

Fine-tuning Experiments

Please check Yang Chen's GitHub for code and data.


Checkpoints

The pre-trained models can be found here: GigaBERT-v3 and GigaBERT-v4.

Please contact Wuwei Lan for code-switched GigaBERT with different configurations.
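As a minimal sketch, the checkpoints can typically be loaded with the HuggingFace transformers library. The model identifier below is an assumption for illustration; substitute the actual checkpoint name or a local path downloaded from the links above.

```python
# Hedged sketch: loading a GigaBERT checkpoint via HuggingFace transformers.
# The model id is an assumed placeholder -- replace it with the real
# checkpoint name or a local directory containing the downloaded weights.
from transformers import AutoTokenizer, AutoModel

model_id = "lanwuwei/GigaBERT-v4-Arabic-and-English"  # assumed identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a mixed English/Arabic sentence (GigaBERT is bilingual).
inputs = tokenizer("GigaBERT handles English and العربية text.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

The same pattern works for fine-tuning: swap `AutoModel` for a task head such as `AutoModelForTokenClassification` when training on an information-extraction dataset.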

License

Apache License 2.0

Acknowledgement

This material is based in part on research sponsored by IARPA via the BETTER program (2019-19051600004).
