|
24 | 24 | └── samples : the retrieval results by TSUBAKI, BERT, and Joint model
25 | 25 |
|
26 | 26 | ```
27 | | -\*1 We modified the original code of BERT so that it can tokenize Japanese sentences and read localgovFAQ or other FAQ datasets. See[ku-nlp/bert](https://github.com/ku-nlp/bert/tree/FAQretrieval). |
28 | | -\*2 The detail about localgovFAQ is on[localgovFAQ.md](localgovFAQ.md). |
| 27 | +**\*1 We modified the original BERT code so that it can tokenize Japanese sentences and read the localgovFAQ dataset or other FAQ datasets. See [ku-nlp/bert](https://github.com/ku-nlp/bert/tree/FAQretrieval).**
| 28 | + |
| 29 | +**\*2 Details about localgovFAQ are given in [localgovFAQ.md](localgovFAQ.md).**
29 | 30 |
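Footnote \*1 refers to word-segmenting Japanese text before it reaches BERT's WordPiece tokenizer. As a rough illustration only (this is not the code in ku-nlp/bert), a Juman++-based segmentation step via pyknp might look like the sketch below; the helper name and the sample question are assumptions.

```python
# Hypothetical sketch: segment Japanese FAQ text into morphemes with Juman++
# (via pyknp) so that BERT's WordPiece tokenizer receives whitespace-separated
# words. This is not the actual preprocessing code from ku-nlp/bert.
from pyknp import Juman

jumanpp = Juman()  # requires the Juman++ binary to be installed locally


def segment(text: str) -> str:
    """Return the input text as whitespace-separated morphemes."""
    result = jumanpp.analysis(text)
    return " ".join(m.midasi for m in result.mrph_list())


if __name__ == "__main__":
    # Sample question in the spirit of localgovFAQ (illustrative only).
    question = "ゴミの収集日はいつですか？"
    print(segment(question))  # prints the question split into morphemes
```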
|
30 | 31 | ### BERT application for FAQ retrieval
31 | 32 |
|
|