lr-scheduling
Here are 9 public repositories matching this topic...
A flat-and-anneal learning rate scheduler for PyTorch, with optional warmup and cyclic modes (see the sketch below).
- Updated
Jan 8, 2021 - Python
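The flat-and-anneal shape is easy to reproduce with PyTorch's built-in `LambdaLR`. A minimal sketch, not this repository's code; the phase lengths (`warmup_steps`, `flat_steps`, `anneal_steps`) are illustrative assumptions:

```python
import math
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

warmup_steps, flat_steps, anneal_steps = 100, 500, 400

def lr_lambda(step):
    if step < warmup_steps:
        # Linear warmup from 0 up to the base LR.
        return step / warmup_steps
    if step < warmup_steps + flat_steps:
        # Hold the LR flat at its base value.
        return 1.0
    # Cosine-anneal down to 0 over the remaining steps.
    t = (step - warmup_steps - flat_steps) / anneal_steps
    return 0.5 * (1.0 + math.cos(math.pi * min(t, 1.0)))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)
```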
A method for assigning separate learning rate schedulers to different parameter groups in a model (see the sketch below).
- Updated
Jul 20, 2023 - Python
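PyTorch supports a version of this pattern natively: `LambdaLR` accepts one lambda per parameter group. A minimal sketch of that built-in route (not necessarily this repository's API); the group split and decay rules are illustrative:

```python
import torch

model = torch.nn.Sequential(torch.nn.Linear(10, 10), torch.nn.Linear(10, 2))
optimizer = torch.optim.SGD([
    {"params": model[0].parameters(), "lr": 0.1},   # e.g. backbone
    {"params": model[1].parameters(), "lr": 0.01},  # e.g. head
])

# One schedule function per parameter group, in order.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=[
        lambda step: 0.95 ** step,               # exponential decay for group 0
        lambda step: 1.0 / (1.0 + 0.1 * step),   # inverse decay for group 1
    ],
)
```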
Examples covering incremental training, learning-rate schedulers, and custom objective and evaluation functions for LightGBM/XGBoost models (a scheduling sketch follows below).
- Updated
Dec 13, 2020 - Jupyter Notebook
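For the LightGBM case, per-iteration learning-rate scheduling goes through the `reset_parameter` callback. A minimal sketch, assuming random placeholder data and an illustrative geometric decay:

```python
import numpy as np
import lightgbm as lgb

X = np.random.rand(200, 5)
y = np.random.rand(200)
train_set = lgb.Dataset(X, label=y)

booster = lgb.train(
    params={"objective": "regression", "learning_rate": 0.1},
    train_set=train_set,
    num_boost_round=100,
    callbacks=[
        # Decay the learning rate geometrically each boosting round;
        # the callback overrides the base rate set in params.
        lgb.reset_parameter(learning_rate=lambda round_idx: 0.1 * (0.99 ** round_idx))
    ],
)
```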
Class activation maps, weight updates, optimizers & LR schedulers (see the update-loop sketch below).
- Updated
Oct 14, 2022 - Jupyter Notebook
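For the optimizer/scheduler portion, the canonical PyTorch update loop looks like the sketch below (since PyTorch 1.1, `optimizer.step()` must precede `scheduler.step()`); the model, loss, and data are placeholders:

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)
loss_fn = torch.nn.MSELoss()

for epoch in range(30):
    x, y = torch.randn(32, 10), torch.randn(32, 1)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()   # apply the weight update first
    scheduler.step()   # then advance the LR schedule
```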
Cosine Annealed 1cycle Policy for PyTorch (see the usage sketch below).
- Updated
Jun 27, 2020 - Python
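PyTorch also ships a cosine-annealed 1cycle schedule as `OneCycleLR`, which may differ from this repository's own implementation. A minimal usage sketch with illustrative hyperparameters:

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=0.1,             # peak LR at the top of the cycle
    total_steps=1000,       # length of the full cycle in optimizer steps
    pct_start=0.3,          # fraction of the cycle spent ramping up
    anneal_strategy="cos",  # cosine annealing on the ramps
)
```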
TinyYoloV2 ImageNet-1K results.
- Updated
Mar 10, 2019 - Python
High-performance PyTorch LR schedulers with cosine annealing, flexible waypoints, plateau steps, and LR scaling. Unified API with pre-computed segments for zero runtime overhead (see the sketch below).
- Updated
Oct 13, 2025 - Python
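The pre-computed-segments idea can be sketched independently of this repository's actual API: build the full LR curve once up front, so each training step is an O(1) array lookup rather than a recomputed formula. A minimal sketch with assumed segment lengths (warmup, plateau, cosine anneal):

```python
import math
import numpy as np
import torch

total_steps, warmup, plateau = 1000, 100, 200
base_lr = 0.1

# Pre-compute the entire LR curve once, segment by segment.
lrs = np.empty(total_steps)
lrs[:warmup] = np.linspace(base_lr / warmup, base_lr, warmup)   # linear warmup
lrs[warmup:warmup + plateau] = base_lr                          # plateau step
t = np.linspace(0.0, math.pi, total_steps - warmup - plateau)
lrs[warmup + plateau:] = 0.5 * base_lr * (1.0 + np.cos(t))      # cosine anneal to 0

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=base_lr)

# At runtime each step is just an array lookup, scaled relative to base_lr.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: lrs[min(step, total_steps - 1)] / base_lr
)
```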
A custom Adam training setup with gradient clipping, LR scheduling, and momentum updates, evaluated with two different loss functions (see the sketch below).
- Updated
Jan 20, 2024 - Python
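A minimal sketch of that combination using standard PyTorch pieces rather than the repository's code; the clip norm, decay schedule, and loss weighting are assumptions:

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.99)
mse, mae = torch.nn.MSELoss(), torch.nn.L1Loss()

for step in range(100):
    x, y = torch.randn(32, 10), torch.randn(32, 1)
    optimizer.zero_grad()
    pred = model(x)
    # Two loss functions, as in the description; the 0.5 weight is an assumption.
    loss = mse(pred, y) + 0.5 * mae(pred, y)
    loss.backward()
    # Clip the global gradient norm before Adam's momentum/weight update.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    scheduler.step()
```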