LINE-DistilBERT-Japanese

This is a DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE. The model was trained by LINE Corporation.

https://huggingface.co/line-corporation/line-distilbert-base-japanese

For Japanese

README_ja.md is written in Japanese.

How to use

    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("line-corporation/line-distilbert-base-japanese", trust_remote_code=True)
    model = AutoModel.from_pretrained("line-corporation/line-distilbert-base-japanese")

    sentence = "LINE株式会社で[MASK]の研究・開発をしている。"
    print(model(**tokenizer(sentence, return_tensors="pt")))
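
The call above prints the raw model outputs. As a follow-up sketch (not part of the official example; the sentences and the mean-pooling strategy are illustrative assumptions), the returned hidden states can be pooled into fixed-size sentence vectors:

    import torch
    from transformers import AutoTokenizer, AutoModel

    MODEL = "line-corporation/line-distilbert-base-japanese"
    tokenizer = AutoTokenizer.from_pretrained(MODEL, trust_remote_code=True)
    model = AutoModel.from_pretrained(MODEL)

    # Illustrative sentences; any Japanese text works here.
    sentences = ["LINE株式会社で自然言語処理の研究・開発をしている。", "今日はいい天気だ。"]
    batch = tokenizer(sentences, padding=True, return_tensors="pt")

    with torch.no_grad():
        hidden = model(**batch).last_hidden_state   # (batch, seq_len, 768)

    # Mean-pool over non-padding tokens to get one 768-dimensional vector per sentence.
    mask = batch["attention_mask"].unsqueeze(-1).float()
    embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    print(embeddings.shape)                          # torch.Size([2, 768])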

Requirements

fugashi
sentencepiece
unidic-lite

Model architecture

The model architecture is the DistilBERT base model: 6 layers, 768 dimensions of hidden states, 12 attention heads, 66M parameters.
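
As a quick sanity check, the configuration and parameter count can be inspected programmatically. This is a sketch that assumes the checkpoint uses the standard DistilBertConfig attribute names (n_layers, dim, n_heads):

    from transformers import AutoConfig, AutoModel

    MODEL = "line-corporation/line-distilbert-base-japanese"

    # Standard DistilBERT configs expose n_layers / dim / n_heads.
    config = AutoConfig.from_pretrained(MODEL)
    print(config.n_layers, config.dim, config.n_heads)   # expected: 6 768 12

    model = AutoModel.from_pretrained(MODEL)
    print(sum(p.numel() for p in model.parameters()))    # on the order of 66M parameters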

Evaluation

The evaluation by JGLUE is as follows:

| model name | #Params | Marc_ja (acc) | JNLI (acc) | JSTS (Pearson/Spearman) | JSQuAD (EM/F1) | JCommonSenseQA (acc) |
|---|---|---|---|---|---|---|
| LINE-DistilBERT | 68M | 95.6 | 88.9 | 89.2/85.1 | 87.3/93.3 | 76.1 |
| Laboro-DistilBERT | 68M | 94.7 | 82.0 | 87.4/82.7 | 70.2/87.3 | 73.2 |
| BandaiNamco-DistilBERT | 68M | 94.6 | 81.6 | 86.8/82.1 | 80.0/88.0 | 66.5 |

Tokenization

The texts are first tokenized by MeCab with the Unidic dictionary and then split into subwords by the SentencePiece algorithm. The vocabulary size is 32768.
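
To see this pipeline in action, the tokenizer loaded in the usage example above can be inspected directly. The sentence here is illustrative, and the exact subwords depend on the SentencePiece model shipped with the checkpoint:

    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(
        "line-corporation/line-distilbert-base-japanese", trust_remote_code=True
    )

    # MeCab (Unidic) word segmentation followed by SentencePiece subword splitting.
    print(tokenizer.tokenize("LINE株式会社で自然言語処理の研究・開発をしている。"))
    print(tokenizer.vocab_size)  # 32768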

Licenses

The pretrained models are distributed under the terms of the Apache License, Version 2.0.

To cite this work

We haven't published any paper on this work. Please cite this GitHub repository:

    @article{LINE DistilBERT Japanese,
      title = {LINE DistilBERT Japanese},
      author = {"Koga, Kobayashi and Li, Shengzhe and Nakamachi, Akifumi and Sato, Toshinori"},
      year = {2023},
      howpublished = {\url{http://github.com/line/LINE-DistilBERT-Japanese}}
    }
