Code for Fast as CHITA: Neural Network Pruning with Combinatorial Optimization


mazumder-lab/CHITA


This is the official repo of the ICML 2023 paper Fast as CHITA: Neural Network Pruning with Combinatorial Optimization.

Requirements

This code has been tested with Python 3.7 and the following packages:

numba==0.56.4
numpy==1.21.6
scikit_learn==1.0.2
torch==1.12.1+cu113
torchvision==0.13.1+cu113

Pruned models

We provide checkpoints for our best pruned models, obtained with the gradual pruning procedure described in the paper.

MobileNetV1

Sparsity (%) | Checkpoint
75.28        | link
89.00        | link

ResNet50

Sparsity (%) | Checkpoint
90.00        | link
95.00        | link
98.00        | link
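As a minimal, hedged sketch (not part of the repo), a downloaded checkpoint can be loaded with the standard PyTorch API. The file name, the possible "state_dict" key, and the use of torchvision's resnet50 constructor below are assumptions; adjust them to match the actual checkpoint format.

```python
import torch
from torchvision.models import resnet50

# Assumed file name and layout; adapt to the checkpoint downloaded from the links above.
checkpoint = torch.load("resnet50_sparsity90.pth", map_location="cpu")
# Some checkpoints nest the weights under a "state_dict" key; otherwise use the object directly.
state_dict = checkpoint.get("state_dict", checkpoint) if isinstance(checkpoint, dict) else checkpoint

model = resnet50()
model.load_state_dict(state_dict, strict=False)  # strict=False tolerates minor key-name differences
model.eval()

# Sanity check: the fraction of exactly-zero weights should roughly match the reported sparsity.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"Measured sparsity: {zeros / total:.2%}")
```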

Structure of the repo

Scripts to run the algorithms are located in scripts/. The current code supports the following architectures (datasets): MLPNet (MNIST), ResNet20 (CIFAR-10), MobileNetV1 (ImageNet) and ResNet50 (ImageNet). New models can be added through the model_factory function in utils/main_utils.py, as sketched below.
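The following is a hypothetical sketch of what extending such a factory could look like; the actual model_factory in utils/main_utils.py may take different arguments (e.g. the dataset name) and return additional objects.

```python
# Hypothetical sketch only; not the repo's actual implementation.
import torchvision.models as tv_models

def model_factory(arch: str):
    """Return a freshly constructed model for the given architecture name."""
    if arch == "resnet50":
        return tv_models.resnet50()
    if arch == "resnet18":  # example of how a new architecture could be registered
        return tv_models.resnet18()
    raise ValueError(f"Unsupported architecture: {arch}")

model = model_factory("resnet50")
```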

Citing CHITA

If you find CHITA useful in your research, please consider citing the following paper.

@InProceedings{pmlr-v202-benbaki23a,
  title     = {Fast as {CHITA}: Neural Network Pruning with Combinatorial Optimization},
  author    = {Benbaki, Riade and Chen, Wenyu and Meng, Xiang and Hazimeh, Hussein and Ponomareva, Natalia and Zhao, Zhe and Mazumder, Rahul},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {2031--2049},
  year      = {2023},
}
