
Commit 3932244

change default param in pretraining bert tutorial

1 parent 5939b85

File tree

2 files changed: +2 −2 lines changed


machine-learning/nlp/pretraining-bert/PretrainingBERT.ipynb

Lines changed: 1 addition & 1 deletion

@@ -202,7 +202,7 @@
 "# maximum sequence length, lowering will result to faster training (when increasing batch size)\n",
 "max_length = 512\n",
 "# whether to truncate\n",
-"truncate_longer_samples = False"
+"truncate_longer_samples = True"
 ]
 },
 {

machine-learning/nlp/pretraining-bert/pretrainingbert.py

Lines changed: 1 addition & 1 deletion

@@ -66,7 +66,7 @@ def dataset_to_text(dataset, output_filename="data.txt"):
 # maximum sequence length, lowering will result to faster training (when increasing batch size)
 max_length = 512
 # whether to truncate
-truncate_longer_samples = False
+truncate_longer_samples = True

 # initialize the WordPiece tokenizer
 tokenizer = BertWordPieceTokenizer()
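In tutorials of this kind, `truncate_longer_samples` typically switches between two preprocessing strategies: when True, each example is simply cut off at `max_length` tokens; when False, all examples are concatenated and re-split into contiguous `max_length` chunks so no tokens inside a sample are thrown away (only the final remainder). The sketch below is illustrative only, not the tutorial's exact code; token IDs are plain integer lists and `max_length` is shrunk to 8 for readability.

```python
# Illustrative sketch of the two tokenization strategies the
# `truncate_longer_samples` flag usually toggles (assumption: the
# tutorial follows the common truncate-vs-group-texts pattern).

max_length = 8  # the tutorial uses 512; 8 keeps the example readable

samples = [
    list(range(1, 6)),   # 5 tokens: shorter than max_length
    list(range(1, 12)),  # 11 tokens: longer than max_length
]

def encode_truncated(samples, max_length):
    """truncate_longer_samples = True: cut each sample at max_length."""
    return [s[:max_length] for s in samples]

def encode_grouped(samples, max_length):
    """truncate_longer_samples = False: concatenate all samples, then
    re-split into max_length chunks, dropping only the tail remainder."""
    flat = [tok for s in samples for tok in s]
    usable = (len(flat) // max_length) * max_length
    return [flat[i:i + max_length] for i in range(0, usable, max_length)]

print(encode_truncated(samples, max_length))  # 2nd sample loses its last 3 tokens
print(encode_grouped(samples, max_length))    # all 16 tokens kept, repacked into 2 chunks
```

With truncation the long sample silently loses tokens past position 8, while grouping repacks every token into fixed-size chunks, which is why the grouped path is slower to preprocess but wastes no training data.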

0 commit comments
