Polynomial Learning Rate Decay Scheduler for PyTorch


This scheduler is frequently used in deep learning papers, but PyTorch provides no official implementation, so this repository offers one.

Install

```
$ pip install git+https://github.com/cmpark0126/pytorch-polynomial-lr-decay.git
```

Usage

```python
from torch_poly_lr_decay import PolynomialLRDecay

scheduler_poly_lr_decay = PolynomialLRDecay(optim, max_decay_steps=100, end_learning_rate=0.0001, power=2.0)

for epoch in range(train_epoch):
    scheduler_poly_lr_decay.step()  # you can handle step as epoch number
    ...
```

or

```python
from torch_poly_lr_decay import PolynomialLRDecay

scheduler_poly_lr_decay = PolynomialLRDecay(optim, max_decay_steps=100, end_learning_rate=0.0001, power=2.0)
...
for batch_idx, (inputs, targets) in enumerate(trainloader):
    scheduler_poly_lr_decay.step()  # also, you can handle step as each iter number
```
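For intuition, the rule a polynomial decay scheduler typically applies is: interpolate from the optimizer's base learning rate down to `end_learning_rate` over `max_decay_steps` steps, raising the remaining fraction to `power`. The sketch below is a plain-Python illustration of that formula, not this package's internals; the function name and signature are hypothetical.

```python
def poly_lr(base_lr, step, max_decay_steps=100, end_learning_rate=0.0001, power=2.0):
    """Illustrative polynomial decay from base_lr down to end_learning_rate."""
    step = min(step, max_decay_steps)  # hold at the floor once decay is done
    decay = (1 - step / max_decay_steps) ** power
    return (base_lr - end_learning_rate) * decay + end_learning_rate

# At step 0 this returns base_lr; at max_decay_steps (and beyond) it
# returns end_learning_rate, decaying quadratically in between for power=2.0.
```

With `power=1.0` this reduces to linear decay; larger powers front-load the decay toward the early steps.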
