
Commit 935ac90

update the tutorial.

1 parent 6e19b73 · commit 935ac90

File tree

1 file changed: +18 −7 lines changed


notebooks/en/optuna_hpo_with_transformers.ipynb

Lines changed: 18 additions & 7 deletions
@@ -9,14 +9,17 @@
     "\n",
     "_Authored by: [Parag Ekbote](https://github.com/ParagEkbote)_\n",
     "\n",
-    "In this notebook, we are going\n",
-    "\n"
+    "In this notebook, we are going to use the [optuna](https://github.com/optuna/optuna) library to perform hyperparameter optimization on a lightweight BERT model on a small subset of the IMDB dataset. To learn more about transformers' hyperparameter search, you can check the documentation [here](https://huggingface.co/docs/transformers/en/hpo_train).\n",
+    "\n",
+    "First, we will install the following dependencies:"
    ]
   },
   {
-   "cell_type":"markdown",
-   "id":"3612d3e9",
+   "cell_type":"code",
+   "execution_count":null,
+   "id":"a309e1a0",
    "metadata": {},
+   "outputs": [],
    "source": [
     "!pip install -q datasets evaluate transformers"
    ]
@@ -72,7 +75,8 @@
    "id":"5a46fac4",
    "metadata": {},
    "source": [
-    "# Set the Metrics and define the model"
+    "# Set the Metrics and define the Trainer class\n",
+    "\n"
    ]
   },
   {
@@ -116,7 +120,14 @@
    "id":"b10c26c6",
    "metadata": {},
    "source": [
-    "# Define the Search Space and Start the Trials"
+    "# Define the Search Space and Start the Trials\n",
+    "\n",
+    "We will now define the optuna hyperparameter search space to find the best learning rate and batch size. We can then launch the hyperparameter search by passing the following arguments:\n",
+    "\n",
+    "1. direction: We aim to maximize the evaluation metric\n",
+    "2. backend: We will use optuna as the search backend\n",
+    "3. n_trials: The number of trials optuna will run\n",
+    "4. compute_objective: The objective to minimize or maximize, computed from the metrics returned by `evaluate`"
    ]
   },
   {
@@ -139,7 +150,7 @@
     " direction=\"maximize\",\n",
     " backend=\"optuna\",\n",
     " hp_space=optuna_hp_space,\n",
-    " n_trials=5,\n",
+    " n_trials=20,\n",
     " compute_objective=compute_objective,\n",
     ")\n",
     "\n",

0 commit comments
