TODO
Approaches to addressing class imbalance:

- Reweight samples so that the classes self-balance (see sklearn; a minimal sketch follows this list).
- First train a network on the full imbalanced data, then fine-tune it on a resampled, class-balanced dataset.
- Use a GAN to generate additional data for augmentation; see "Handwriting Recognition in Low-resource Scripts Using Adversarial Learning".
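The reweighting idea in the first item can be sketched as follows. This is a minimal, hypothetical example (not from the source): it assumes a PyTorch classifier and uses sklearn's `compute_class_weight` to build a class-weighted cross-entropy loss, so minority classes contribute more per sample.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.utils.class_weight import compute_class_weight

# Hypothetical label array for an imbalanced 3-class problem.
labels = np.array([0] * 900 + [1] * 90 + [2] * 10)

# 'balanced' uses n_samples / (n_classes * count_per_class),
# so rarer classes receive larger weights.
weights = compute_class_weight(class_weight="balanced",
                               classes=np.unique(labels),
                               y=labels)

# Weighted cross-entropy: rare classes get proportionally larger gradients.
criterion = nn.CrossEntropyLoss(weight=torch.tensor(weights, dtype=torch.float32))

# Dummy forward pass to show usage.
logits = torch.randn(8, 3)
targets = torch.randint(0, 3, (8,))
loss = criterion(logits, targets)
```

The same weights could instead be used for balanced resampling (e.g., a weighted sampler) during the fine-tuning stage described in the second item.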