MultiplicativeLR

class torch.optim.lr_scheduler.MultiplicativeLR(optimizer, lr_lambda, last_epoch=-1)[source]

Multiply the learning rate of each parameter group by the factor given in the specified function.

When last_epoch=-1, the schedule starts from the optimizer's initial lr.

Parameters
  • optimizer (Optimizer) – Wrapped optimizer.

  • lr_lambda (function or list) – A function which computes a multiplicative factor given an integer parameter epoch, or a list of such functions, one for each group in optimizer.param_groups.

  • last_epoch (int) – The index of last epoch. Default: -1.

Example

>>> lmbda = lambda epoch: 0.95
>>> scheduler = MultiplicativeLR(optimizer, lr_lambda=lmbda)
>>> for epoch in range(100):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
(Figure: learning rate over epochs under MultiplicativeLR.)
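The update rule behind this schedule can be simulated in plain Python, without touching an optimizer: at each step, every parameter group's learning rate is multiplied by its lambda evaluated at the current epoch index. The sketch below is an illustration of that rule (the `simulate_multiplicative_lr` helper is hypothetical, not part of torch), including the list-of-lambdas form for multiple parameter groups:

```python
# Illustrative sketch of MultiplicativeLR's update rule, not torch's
# own implementation: lr[t] = lr[t-1] * lr_lambda(t) for each group.
def simulate_multiplicative_lr(initial_lrs, lr_lambdas, num_epochs):
    lrs = list(initial_lrs)
    history = [list(lrs)]  # epoch 0: the initial learning rates
    for epoch in range(1, num_epochs + 1):
        # One lambda per parameter group, as with a list lr_lambda.
        lrs = [lr * fn(epoch) for lr, fn in zip(lrs, lr_lambdas)]
        history.append(list(lrs))
    return history

# Two parameter groups decaying at different rates.
history = simulate_multiplicative_lr(
    initial_lrs=[0.1, 0.01],
    lr_lambdas=[lambda e: 0.95, lambda e: 0.5],
    num_epochs=3,
)
```

With constant lambdas this reduces to exponential decay, `lr * factor**epoch`; the lambda form is more general because the factor may depend on the epoch index.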
get_last_lr()[source]

Return the last learning rate computed by the current scheduler.

Return type

list[float]

get_lr()[source]

Compute the learning rate of each parameter group.

Return type

list[float]

load_state_dict(state_dict)[source]

Load the scheduler’s state.

Parameters

state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().

state_dict()[source]

Return the state of the scheduler as a dict.

It contains an entry for every variable in self.__dict__ which is not the optimizer. The learning rate lambda functions will only be saved if they are callable objects and not if they are functions or lambdas.

Return type

dict[str, Any]
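The callable-object caveat above matters if you want the decay factor itself to survive a save/restore cycle. The pattern below is a hypothetical sketch (the `DecayFactor` class is not part of torch) of why a callable object is restorable where a lambda is not: its state lives in `__dict__`, which a state_dict-style save can capture.

```python
# A callable object holding its factor as instance state; unlike a
# lambda, that state can be serialized and restored.
class DecayFactor:
    def __init__(self, gamma):
        self.gamma = gamma

    def __call__(self, epoch):
        # Constant factor here; could depend on epoch.
        return self.gamma

factor = DecayFactor(0.95)
saved = dict(factor.__dict__)     # what a state_dict-style save keeps
restored = DecayFactor(1.0)       # fresh instance with a placeholder
restored.__dict__.update(saved)   # state restored on load
```

A plain `lambda epoch: 0.95` carries its behavior in code rather than in `__dict__`, which is why state_dict() skips it.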

step(epoch=None)[source]

Perform a scheduler step, updating the learning rate of each parameter group.