Pre-trained NFNets reaching 99% of the accuracy reported in the official paper "High-Performance Large-Scale Image Recognition Without Normalization".
Paper: https://arxiv.org/abs/2102.06171
Original code: https://github.com/deepmind/deepmind-research/tree/master/nfnets
I recommend using Docker to run the code:
docker build -t nfnets/imagenet:latest --build-arg USER_ID=$(id -u) --build-arg GROUP_ID=$(id -g) .
To train NFNets on the ImageNet dataset:
docker run --rm -it --gpus all -v $(pwd):/tf -p 8889:8888 -p 6006:6006 nfnets/imagenet:latest python train.py --variant F0 --batch_size 4096 --num_epochs 360
Please see the train.py module for more arguments.
Pre-trained weights have been converted to be compatible with my models' implementation. You can download them from here.
To evaluate NFNets on the ImageNet test set:
docker run --rm -it --gpus all -v $(pwd):/tf -p 8889:8888 -p 6006:6006 nfnets/imagenet:latest python evaluate_imagenet.py --variant F0 --batch_size 50
You can also check the notebook in the repo showing how to run an NFNet to classify an image.
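If you just want a quick sanity check outside the notebook, the snippet below is a minimal inference sketch. The import path, class name, weights path, input resolution, and preprocessing are assumptions made for illustration; the notebook and evaluate_imagenet.py in the repo are the authoritative sources.

```python
import numpy as np
import tensorflow as tf

# Hypothetical names: adjust the module, class, and weights path to the repo's
# actual code and to wherever you saved the downloaded checkpoint.
from nfnets import NFNet

IMAGE_SIZE = 224             # placeholder resolution; use the variant's evaluation size
WEIGHTS_PATH = "weights/F0"  # placeholder path to the converted pre-trained weights

# Build an F0 model and load the converted weights.
model = NFNet(variant="F0", num_classes=1000)
model.build((None, IMAGE_SIZE, IMAGE_SIZE, 3))
model.load_weights(WEIGHTS_PATH)

# Load and preprocess one image (here simply resized and scaled to [0, 1];
# the notebook and evaluate_imagenet.py define the exact preprocessing).
img = tf.io.decode_image(tf.io.read_file("example.jpg"), channels=3, expand_animations=False)
img = tf.image.resize(img, (IMAGE_SIZE, IMAGE_SIZE))
img = tf.cast(img, tf.float32) / 255.0

# Predict the top-1 ImageNet class index.
logits = model(img[None, ...], training=False)
print("Predicted class:", int(np.argmax(logits, axis=-1)[0]))
```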
TODO:
- WSConv2d (see the sketch after this list)
- Clipping Gradient module (see the sketch after this list)
- Documentation
- NFNets
- NF-ResNets
- Update pretrained weights
- How to fine-tune
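The WSConv2d and gradient-clipping items above correspond to the paper's Scaled Weight Standardization and Adaptive Gradient Clipping (AGC). The sketches below are illustrative TensorFlow 2 implementations under simplifying assumptions, not the modules from this repo; in particular, the AGC shown here clips each gradient tensor as a whole, whereas the paper clips unit-wise.

```python
import tensorflow as tf


class WSConv2D(tf.keras.layers.Conv2D):
    """Conv2D with Scaled Weight Standardization (illustrative sketch only;
    the repo's WSConv2d may differ, e.g. in the epsilon or gain handling)."""

    def build(self, input_shape):
        super().build(input_shape)
        # Learnable per-output-channel gain applied after standardization.
        self.gain = self.add_weight(
            name="gain", shape=(self.filters,), initializer="ones", trainable=True)

    def standardized_kernel(self):
        # Standardize each filter over its fan-in dimensions (H, W, C_in).
        mean, var = tf.nn.moments(self.kernel, axes=[0, 1, 2], keepdims=True)
        fan_in = tf.cast(tf.reduce_prod(tf.shape(self.kernel)[:-1]), self.kernel.dtype)
        # Scale so activations are variance-preserving at initialization.
        scale = tf.math.rsqrt(tf.maximum(var * fan_in, 1e-4)) * self.gain
        return (self.kernel - mean) * scale

    def call(self, inputs):
        outputs = tf.nn.conv2d(
            inputs, self.standardized_kernel(),
            strides=list(self.strides), padding=self.padding.upper(),
            dilations=list(self.dilation_rate))
        if self.use_bias:
            outputs = tf.nn.bias_add(outputs, self.bias)
        return outputs


def adaptive_gradient_clip(grads, params, clip_factor=0.01, eps=1e-3):
    """Simplified per-tensor AGC: rescale g so that ||g|| <= clip_factor * max(||p||, eps).
    The paper applies the same rule unit-wise rather than per tensor."""
    clipped = []
    for g, p in zip(grads, params):
        if g is None:
            clipped.append(None)
            continue
        p_norm = tf.maximum(tf.norm(tf.reshape(p, [-1])), eps)
        g_norm = tf.norm(tf.reshape(g, [-1]))
        max_norm = clip_factor * p_norm
        clipped.append(tf.where(g_norm > max_norm, g * (max_norm / (g_norm + 1e-6)), g))
    return clipped
```

In a custom training step, adaptive_gradient_clip would be applied to the gradient list before optimizer.apply_gradients.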
To cite the original paper, use:
@article{brock2021high,
  author  = {Andrew Brock and Soham De and Samuel L. Smith and Karen Simonyan},
  title   = {High-Performance Large-Scale Image Recognition Without Normalization},
  journal = {arXiv preprint arXiv:2102.06171},
  year    = {2021}
}