ou-medinfo/medbertjp

Trials of pre-trained BERT models for the medical domain in Japanese.

These pre-trained BERT models are designed to be adapted to the Japanese medical domain.
The medical corpora were scraped for academic use from Today's diagnosis and treatment: premium, which consists of 15 digital references for clinicians in Japanese published by IGAKU-SHOIN Ltd.
The general corpora were extracted from a Wikipedia dump file (jawiki-20190901) from https://dumps.wikimedia.org/jawiki/.

Our demonstration models

Requirements

For just using the models:
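As a hedged sketch, assuming the checkpoints are used through the Hugging Face Transformers stack (the notebooks below are the authoritative reference for exact packages and versions), the import stack looks roughly like this:

```python
# Hedged sketch of the dependency stack assumed by the examples in this README:
# PyTorch, Hugging Face Transformers, and a MeCab-backed Japanese tokenizer.
# Instantiating BertJapaneseTokenizer with MeCab word segmentation additionally
# needs the fugashi bindings and a MeCab dictionary package (e.g. ipadic).
import torch
import transformers
from transformers import BertJapaneseTokenizer

print(torch.__version__, transformers.__version__)
```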

Usage

Please check the code examples in tokenization_example.ipynb, or try example_google_colab.ipynb on Google Colab.
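For reference, here is a minimal sketch of encoding a sentence with one of the models, assuming they are standard Hugging Face BERT checkpoints with a MeCab-based Japanese tokenizer; the local checkpoint path and the example sentence are hypothetical, and the notebooks above remain the definitive usage examples:

```python
# Minimal sketch: encode a Japanese clinical sentence with a downloaded
# checkpoint. "./medbertjp-checkpoint" is a hypothetical local path; replace it
# with the directory of the model you obtained from this repository.
import torch
from transformers import BertJapaneseTokenizer, BertModel

model_dir = "./medbertjp-checkpoint"

tokenizer = BertJapaneseTokenizer.from_pretrained(model_dir)
model = BertModel.from_pretrained(model_dir)
model.eval()

text = "急性心筋梗塞の既往がある。"  # "History of acute myocardial infarction."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```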

Funding

This work was supported by the Council for Science, Technology and Innovation (CSTI), Cross-ministerial Strategic Innovation Promotion Program (SIP), "Innovative AI Hospital System" (Funding Agency: National Institute of Biomedical Innovation, Health and Nutrition (NIBIOHN)).

Licenses

The pretrained models are distributed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0).
They are freely available for academic purposes or individual research, but restricted for commercial use.

The code in this repository is licensed under the Apache License, Version 2.0.
