NLP Researcher. Mainly interested in Pre-trained Language Models, Machine Reading Comprehension, Question Answering, etc.
- Joint Laboratory of HIT and iFLYTEK Research (HFL)
- Beijing, China
- http://ymcui.github.io
- https://orcid.org/0000-0002-2452-375X
- @KCrosner
- https://scholar.google.com/citations?user=Xl53m0QAAAAJ&hl=en
Pinned
- Chinese-LLaMA-Alpaca: Chinese LLaMA & Alpaca LLMs, with local CPU/GPU training and deployment
- Chinese-LLaMA-Alpaca-2: Chinese LLaMA-2 & Alpaca-2 LLMs (phase 2), with 64K long-context models
- Chinese-LLaMA-Alpaca-3: Chinese Llama-3 LLMs (phase 3), developed from Meta Llama 3
- Chinese-BERT-wwm: Pre-training with whole word masking for Chinese BERT (Chinese BERT-wwm series models)