A lightweight project for image classification; a bag of tricks is employed for better performance.


wwn1233/A-Lightwe-Classification-Project-based-on-PyTorch


News

2019.07.15 add OHEM and Mixup methods

2019.06.06 add learning-rate warmup

2019.05.28 add reduced-resnet and different optimizers for the CIFAR and FashionMNIST datasets

2019.05.11 add support for the CIFAR and FashionMNIST datasets

Requirements

Training and testing are done with Python 3.5.

  • pytorch = 0.4.0
  • torchvision >= 0.2.0
  • matplotlib
  • numpy
  • scipy
  • opencv
  • pyyaml
  • packaging
  • PIL
  • tqdm
  • time

Main Results

  • MINC-2500 is a patch classification dataset with 2500 samples per category. It is a subset of MINC in which samples have been resized to 362 x 362 and each category is sampled evenly. Error rate with five-fold cross-validation is used for evaluation. Based on ResNet-50, we achieve a result comparable to the state of the art.
|          | train1-vali1 | train1-test1 | train2-vali2 | train2-test2 | train3-vali3 | train3-test3 | train4-vali4 | train4-test4 | train5-vali5 | train5-test5 | Average |
| -------- | ------------ | ------------ | ------------ | ------------ | ------------ | ------------ | ------------ | ------------ | ------------ | ------------ | ------- |
| Deep-TEN | -            | -            | -            | -            | -            | -            | -            | -            | -            | -            | 19.4%   |
| ours     | 19.0%        | 19.0%        | 19.0%        | 19.0%        | 19.0%        | 18.0%        | 19.0%        | 19.0%        | 20.0%        | 19.0%        | 19.0%   |
  • CIFAR100. In this experiment, we choose the reduced-resnet as our backbone network (you can choose your own).
| Models           | Base   | +RE    | +Mixup |
| ---------------- | ------ | ------ | ------ |
| RE (ResNet-20)   | 30.84% | 29.87% | -      |
| ours (ResNet-20) | 29.85% | 28.61% | 27.7%  |
  • More datasets coming soon ...

Characteristics

  1. Basic data augmentation methods
    • horizontal/vertical flip
    • random rotation (90°)
    • color jitter
    • random erasing
    • test augmentation
    • lighting noise
    • mixup
  2. Multiple backbones
    • ResNet
    • DenseNet
    • Reduced-resnet
  3. Other methods
    • Focal loss
    • Label smoothing
    • Combining global max pooling and global average pooling
    • Orthogonal center loss based on subspace masking
    • Learning rate warmup
    • OHEM (online hard example mining)
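As a concrete illustration of one of the tricks above, here is a minimal NumPy sketch of mixup (this is not code from this repo; the function name, the one-hot label convention, and the single shared mixing coefficient per batch are assumptions made for the example):

```python
import numpy as np

def mixup_batch(x, y, alpha=1.0, rng=None):
    """Mixup: convex-combine random pairs of examples and their labels.

    x: (N, ...) float array of inputs; y: (N, C) one-hot float labels.
    A single lambda drawn from Beta(alpha, alpha) is shared by the batch.
    """
    rng = np.random.default_rng(rng)
    lam = rng.beta(alpha, alpha)          # mixing coefficient in [0, 1]
    perm = rng.permutation(len(x))        # random pairing of examples
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return x_mix, y_mix, lam
```

Because the labels are mixed with the same coefficient as the inputs, the loss is effectively a lambda-weighted sum of the cross-entropies against the two original targets.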

Data Preparation

  • MINC-2500. The data structure follows that of the Materials in Context Database (MINC):
  • data/minc-2500
    • images
    • labels
  • CIFAR100. The data will be automatically downloaded to the folder "./data".

Train

  • MINC-2500

python experiments/recognition/main.py --dataset minc --loss CrossEntropyLoss --nclass 23 --backbone resnet50 --checkname test --ocsm

  • CIFAR100

python experiments/recognition/main.py --backbone resnet_reduce --res_reduce_depth 20 --solver_type SGD --lr-step 200,300 --dataset cifar100 --lr 0.1 --epochs 375 --batch-size 384 --mixup

Note: (--lr-step 200,300) indicates that the learning rate is decayed by a factor of 10 at the 200th and 300th epochs; (--lr-step 200,) indicates that the learning rate is decayed by a factor of 10 every 200 epochs. (--batch-size 384 --ohem 192) indicates choosing the 192 hardest examples from 384 instances.
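The two decay rules described in the note can be sketched as small helpers (illustrative only; these function names are not from the repo, and the repo's actual scheduler may differ):

```python
def step_lr(base_lr, epoch, milestones=(200, 300), gamma=0.1):
    # --lr-step 200,300: multiply by gamma once at each milestone epoch passed
    return base_lr * gamma ** sum(epoch >= m for m in milestones)

def periodic_step_lr(base_lr, epoch, period=200, gamma=0.1):
    # --lr-step 200,: multiply by gamma once every `period` epochs
    return base_lr * gamma ** (epoch // period)
```

With base_lr = 0.1 and milestones (200, 300), the learning rate is 0.1 for epochs 0-199, 0.01 for 200-299, and 0.001 from epoch 300 on.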

Test

  • MINC-2500. For example:

python experiments/recognition/main.py --dataset minc --nclass 23 --backbone resnet18 --test-batch-size 128 --eval --resume experiments/recognition/runs/minc/deepten/09-3/*.pth

  • CIFAR100. For example:

python experiments/recognition/main.py --backbone resnet_reduce --res_reduce_depth 20 --dataset cifar100 --test-batch-size 128 --eval --resume experiments/recognition/runs/cifar100/deepten/0/*.pth

Related Repos

PyTorch Encoding

Random Erasing
