adam-optimizer
Here are 401 public repositories matching this topic...
On the Variance of the Adaptive Learning Rate and Beyond
- Updated
Jul 31, 2021 - Python
Deep learning library in plain NumPy.
- Updated
Jun 21, 2022 - Python
This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers"
- Updated
Jul 17, 2021
CS F425 Deep Learning course at BITS Pilani (Goa Campus)
- Updated
Feb 5, 2025 - Jupyter Notebook
ADAS (short for Adaptive Step Size) is an optimizer that, unlike optimizers that merely normalize the derivative, fine-tunes the step size itself, aiming to make step-size scheduling obsolete and achieve state-of-the-art training performance.
- Updated
Jan 19, 2021 - C++
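The exact ADAS update rule is not spelled out here; purely as an illustration of the general idea of adapting per-parameter step sizes rather than only normalizing the gradient, the sketch below uses a generic Rprop-style sign-agreement rule (grow a weight's step when successive gradients agree in sign, shrink it when they flip). Function names and constants are illustrative and not taken from the repository.

```python
import numpy as np

def sign_agreement_step(w, grad, prev_grad, step, grow=1.2, shrink=0.5,
                        step_min=1e-6, step_max=1.0):
    """Generic per-parameter step-size adaptation (Rprop-style sketch,
    not the repository's actual ADAS rule)."""
    agree = np.sign(grad) == np.sign(prev_grad)
    step = np.where(agree,
                    np.minimum(step * grow, step_max),
                    np.maximum(step * shrink, step_min))
    w = w - step * np.sign(grad)   # each weight moves by its own step size
    return w, step

# toy usage on f(w) = ||w||^2, whose gradient is 2w
w = np.array([3.0, -2.0])
step = np.full_like(w, 0.1)
prev_grad = np.zeros_like(w)
for _ in range(50):
    grad = 2 * w
    w, step = sign_agreement_step(w, grad, prev_grad, step)
    prev_grad = grad
print(w)  # should approach [0, 0]
```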
Lion and Adam optimization comparison
- Updated
Feb 23, 2023 - Jupyter Notebook
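For reference, the two update rules compared above differ mainly in how they use the first moment: Adam rescales the gradient by a second-moment estimate with bias correction, while Lion takes only the sign of an interpolated momentum. A minimal NumPy sketch of both single-tensor updates (hyperparameter values are common defaults, not the notebook's settings):

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: exponential moving averages of the gradient and its square,
    # with bias correction for the early steps
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

def lion_step(w, g, m, lr=1e-4, b1=0.9, b2=0.99, wd=0.0):
    # Lion: the update direction is only the sign of an interpolated momentum
    update = np.sign(b1 * m + (1 - b1) * g)
    w = w - lr * (update + wd * w)    # decoupled weight decay
    m = b2 * m + (1 - b2) * g         # momentum is refreshed after the step
    return w, m
```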
Classifying the Google Street View House Numbers (SVHN) dataset with a CNN
- Updated
Mar 4, 2018 - Jupyter Notebook
Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
- Updated
Apr 13, 2019 - Python
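Roughly, Padam interpolates between SGD with momentum and Adam by raising the second-moment term to a partial power p in [0, 1/2] rather than always taking the square root: p = 1/2 gives an Adam/AMSGrad-style step and p = 0 an SGD-with-momentum-style step. A hedged NumPy sketch of that idea (an AMSGrad-style running max is used, as in the paper; names and defaults are illustrative, not the reproduction's code):

```python
import numpy as np

def padam_step(w, g, m, v, v_max, lr=1e-1, b1=0.9, b2=0.999, p=0.125, eps=1e-8):
    """One partially adaptive (Padam-style) update; p lies in [0, 0.5]."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    v_max = np.maximum(v_max, v)          # AMSGrad-style non-decreasing denominator
    w = w - lr * m / (v_max ** p + eps)   # p = 0.5 ~ Adam, p = 0 ~ SGD with momentum
    return w, m, v, v_max
```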
PyTorch/TensorFlow solutions for Stanford's CS231n: "CNNs for Visual Recognition"
- Updated
Jan 27, 2021 - Jupyter Notebook
Toy implementations of some popular ML optimizers using Python/JAX
- Updated
Jun 20, 2021 - Python
A collection of various gradient descent algorithms implemented in Python from scratch
- Updated
Feb 28, 2023 - Python
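As a baseline for such from-scratch collections, vanilla gradient descent and its momentum variant differ only in whether a velocity buffer accumulates past gradients. A minimal sketch (the test function and constants are illustrative, not taken from the repository):

```python
import numpy as np

def gd_step(w, g, lr=0.1):
    # plain gradient descent
    return w - lr * g

def momentum_step(w, g, velocity, lr=0.1, mu=0.9):
    # classical momentum: accumulate a velocity, then move along it
    velocity = mu * velocity - lr * g
    return w + velocity, velocity

# toy usage on f(w) = ||w||^2, whose gradient is 2w
w = np.array([5.0, -3.0])
velocity = np.zeros_like(w)
for _ in range(100):
    w, velocity = momentum_step(w, 2 * w, velocity)
print(w)  # close to the minimum at [0, 0]
```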
This library provides a set of functionalities for different types of deep learning (and ML) algorithms in C
- Updated
Sep 29, 2023 - C
From linear regression towards neural networks...
- Updated
Nov 23, 2025 - C++
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
- Updated
Nov 26, 2019 - Python
Modified XGBoost implementation from scratch with NumPy using Adam and RMSProp optimizers.
- Updated
Jul 24, 2020 - Jupyter Notebook
The project aimed to implement a deep NN / RNN based solution in order to develop flexible methods that can adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.
- Updated
Aug 16, 2016 - Python
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for TensorFlow
- Updated
Sep 3, 2019 - Python
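The "k steps forward, 1 step back" scheme wraps any inner optimizer: fast weights take k ordinary steps, then the slow weights are pulled a fraction alpha toward them and the fast weights are reset. A framework-agnostic NumPy sketch of that wrapping (the inner optimizer here is plain SGD; names are illustrative and not the repository's TensorFlow API):

```python
import numpy as np

def lookahead(w0, grad_fn, inner_lr=0.1, k=5, alpha=0.5, outer_steps=40):
    """Lookahead sketch: k fast (inner SGD) steps, then interpolate the slow weights."""
    slow = w0.copy()
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(k):                 # k steps forward with the inner optimizer
            fast -= inner_lr * grad_fn(fast)
        slow += alpha * (fast - slow)      # 1 step back: pull slow weights toward fast
    return slow

# toy usage on f(w) = ||w||^2, whose gradient is 2w
w = lookahead(np.array([4.0, -1.0]), grad_fn=lambda w: 2 * w)
print(w)  # near [0, 0]
```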
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
- Updated
Mar 23, 2019 - Python
📈 Implementing the Adam optimizer from the ground up with PyTorch and comparing its performance on six 3-D objective functions (each progressively more difficult to optimize) against SGD, AdaGrad, and RMSProp.
- Updated
Jul 2, 2022 - Jupyter Notebook
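The comparison above pits Adam against the simpler adaptive rules it builds on; for reference, AdaGrad accumulates all past squared gradients while RMSProp replaces that running sum with an exponential moving average. A minimal NumPy sketch of those two baselines (defaults are common choices, not the notebook's settings):

```python
import numpy as np

def adagrad_step(w, g, accum, lr=0.1, eps=1e-8):
    # AdaGrad: accumulate the sum of squared gradients over all steps
    accum = accum + g * g
    w = w - lr * g / (np.sqrt(accum) + eps)
    return w, accum

def rmsprop_step(w, g, avg, lr=0.01, rho=0.9, eps=1e-8):
    # RMSProp: exponential moving average of squared gradients instead of a running sum
    avg = rho * avg + (1 - rho) * g * g
    w = w - lr * g / (np.sqrt(avg) + eps)
    return w, avg
```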