SCOPS: Self-Supervised Co-Part Segmentation (CVPR'19)


NVlabs/SCOPS

 
 


PyTorch implementation for self-supervised co-part segmentation.
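As a rough illustration of what co-part segmentation produces (a sketch with assumed shapes and names, not the repository's actual API): the model emits a (K+1)-channel part-response map per image, with K parts plus a background channel, and a hard part mask is obtained by a per-pixel argmax. A minimal NumPy sketch for K = 8, matching the SCOPS_K8 experiments:

```python
import numpy as np

# Hypothetical illustration (names and shapes are assumptions, not taken
# from the repository's code): a (K+1, H, W) response map for one image,
# with K = 8 parts and channel 0 as background.
rng = np.random.default_rng(0)
responses = rng.standard_normal((8 + 1, 64, 64))

# Softmax over the channel axis gives per-pixel part probabilities;
# argmax over the same axis gives a hard segmentation with labels 0..8.
shifted = responses - responses.max(axis=0, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=0, keepdims=True)
part_mask = probs.argmax(axis=0)

print(part_mask.shape)  # (64, 64)
```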

License

Copyright (C) 2019 NVIDIA Corporation. All rights reserved. Licensed under the CC BY-NC-SA 4.0 license (https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode).

Paper

paper

supplementary

Installation

The code is developed with PyTorch v0.4 and uses TensorboardX for visualization. We recommend using virtualenv to run our code:

$ virtualenv -p python3 scops_env
$ source scops_env/bin/activate
(scops_env)$ pip install -r requirements.txt

To deactivate the virtual environment, run $ deactivate. To activate the environment again, run $ source scops_env/bin/activate.

SCOPS on Unaligned CelebA

Download data (Saliency, labels, pretrained model)

$ ./download_CelebA.sh

Download CelebA unaligned from here.

Test the pretrained model

Run $ ./evaluate_celebAWild.sh and accept all default options. The results are stored in a single webpage at results_CelebA/SCOPS_K8/ITER_100000/web_html/index.html.

Train the model

$ CUDA_VISIBLE_DEVICES={GPU} python train.py -f exps/SCOPS_K8_retrain.json, where {GPU} is the GPU device number.
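Training is driven by a JSON experiment file passed via -f. As a hedged sketch of how such a file can be consumed (the keys below are illustrative assumptions, not the repository's actual schema from exps/SCOPS_K8_retrain.json):

```python
import json

# Hypothetical experiment config; the key names here are assumptions,
# not the schema used by the repository's train.py.
config_text = """
{
  "exp_name": "SCOPS_K8_retrain",
  "num_parts": 8,
  "num_steps": 100000
}
"""

config = json.loads(config_text)

# Downstream code would read hyperparameters from the parsed dict.
print(config["exp_name"], config["num_parts"])  # SCOPS_K8_retrain 8
```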

SCOPS on CUB

Test the pretrained model

Note: This model differs from the one in the master branch in two main ways: 1) it is trained with ground-truth silhouettes rather than saliency maps; 2) it crops birds w.r.t. bounding boxes rather than using the original image.

First set the image and annotation paths in line 35 and line 37 of dataset/cub.py. Then run:

sh eval_cub.sh

Results as well as visualizations can be found in the results/cub/ITER_60000/train/ folder.
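For reference, the two path lines to edit in dataset/cub.py might look like the following (the variable names and paths here are purely hypothetical; use the actual names found at lines 35 and 37):

```python
# Hypothetical illustration only -- check the real variable names at
# lines 35 and 37 of dataset/cub.py and point them at your local data:
IMG_PATH = '/path/to/CUB_200_2011/images'
ANNO_PATH = '/path/to/CUB_200_2011/annotations'
```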

Citation

Please consider citing our paper if you find this code useful for your research.

@inproceedings{hung:CVPR:2019,
  title = {SCOPS: Self-Supervised Co-Part Segmentation},
  author = {Hung, Wei-Chih and Jampani, Varun and Liu, Sifei and Molchanov, Pavlo and Yang, Ming-Hsuan and Kautz, Jan},
  booktitle = {IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)},
  month = jun,
  year = {2019}
}
