ExponentialLR
- class torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma, last_epoch=-1)[source]
Decays the learning rate of each parameter group by gamma every epoch.
When last_epoch=-1, the initial learning rate is set to the optimizer's lr.
- Parameters
optimizer (Optimizer) – Wrapped optimizer.
gamma (float) – Multiplicative factor of learning rate decay.
last_epoch (int) – The index of the last epoch. Default: -1.
Example
>>> scheduler = ExponentialLR(optimizer, gamma=0.95)
>>> for epoch in range(100):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
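Since each call to step() multiplies the learning rate by gamma, the schedule has the closed form lr_t = lr_0 * gamma ** t. A minimal pure-Python sketch of that rule (the initial_lr and gamma values are chosen for illustration, not taken from this page):

```python
initial_lr = 0.1
gamma = 0.95

# Learning rate after t epochs: lr_t = initial_lr * gamma ** t
lrs = [initial_lr * gamma ** t for t in range(4)]
print(lrs)  # each entry is 0.95x the previous one
```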

- load_state_dict(state_dict)[source]
Load the scheduler’s state.
- Parameters
state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
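A minimal sketch of saving and restoring scheduler state with state_dict() and load_state_dict() (the toy SGD optimizer, parameter, and values are illustrative, not from this page):

```python
import torch
from torch.optim.lr_scheduler import ExponentialLR

# Toy parameter and optimizer for illustration.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = ExponentialLR(optimizer, gamma=0.5)

# Advance the schedule two epochs, then capture its state.
for _ in range(2):
    optimizer.step()
    scheduler.step()
state = scheduler.state_dict()

# A fresh scheduler resumes from the saved state.
optimizer2 = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=0.1)
scheduler2 = ExponentialLR(optimizer2, gamma=0.5)
scheduler2.load_state_dict(state)
print(scheduler2.last_epoch)  # restored epoch counter
```

Note that load_state_dict() restores only the scheduler's own state; in practice the optimizer's state_dict() is saved and loaded alongside it so the current learning rate is restored as well.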