UPSNet: A Unified Panoptic Segmentation Network


Introduction

UPSNet is initially described in a CVPR 2019 oral paper.

Disclaimer

This repository is tested under Python 3.6 and PyTorch 0.4.1, and model training is done with 16 GPUs using horovod. It should also work under Python 2.7 / PyTorch 1.0 and with 4 GPUs.
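The README does not show the actual multi-GPU launch command. As a rough, hypothetical sketch only (not confirmed by this repo; the training script may spawn workers itself), a 16-GPU horovod job is typically launched along these lines, where the host list is a placeholder for your own machines:

horovodrun -np 16 -H localhost:8,worker1:8 python upsnet/upsnet_end2end_train.py --cfg upsnet/experiments/$EXP.yaml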

License

© Uber, 2018-2019. Licensed under the Uber Non-Commercial License.

Citing UPSNet

If you find UPSNet useful in your research, please consider citing:

@inproceedings{xiong19upsnet,
    Author = {Yuwen Xiong, Renjie Liao, Hengshuang Zhao, Rui Hu, Min Bai, Ersin Yumer, Raquel Urtasun},
    Title = {UPSNet: A Unified Panoptic Segmentation Network},
    Conference = {CVPR},
    Year = {2019}
}

Main Results

COCO 2017 (trained on train-2017 set)

| Model | test split | PQ | SQ | RQ | PQTh | PQSt |
| --- | --- | --- | --- | --- | --- | --- |
| UPSNet-50 | val | 42.5 | 78.0 | 52.4 | 48.5 | 33.4 |
| UPSNet-101-DCN | test-dev | 46.6 | 80.5 | 56.9 | 53.2 | 36.7 |

Cityscapes

| Model | PQ | SQ | RQ | PQTh | PQSt |
| --- | --- | --- | --- | --- | --- |
| UPSNet-50 | 59.3 | 79.7 | 73.0 | 54.6 | 62.7 |
| UPSNet-101-COCO (ms test) | 61.8 | 81.3 | 74.8 | 57.6 | 64.8 |
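A note on the column headers (these are the standard panoptic segmentation metrics, not defined in this README): PQ is panoptic quality, SQ segmentation quality, and RQ recognition quality, related by

PQ = SQ × RQ, where RQ = |TP| / (|TP| + 0.5|FP| + 0.5|FN|) and SQ is the mean IoU over matched (true-positive) segment pairs.

PQTh and PQSt are PQ restricted to "thing" and "stuff" classes respectively, and "ms test" indicates multi-scale testing.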

Requirements: Software

We recommend using Anaconda3 as it already includes many common packages.

Requirements: Hardware

We recommend using 4 to 16 GPUs, each with at least 11 GB of memory, to train our model.

Installation

Clone this repo to $UPSNet_ROOT.

Run init.sh to build essential C++/CUDA modules and download the pretrained model.
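Putting these two steps together, a minimal sketch (the clone URL is inferred from the uber-research/UPSNet repository name; $UPSNet_ROOT is any directory you choose):

git clone https://github.com/uber-research/UPSNet.git $UPSNet_ROOT
cd $UPSNet_ROOT
bash init.sh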

For Cityscapes:

Assuming you have already downloaded the Cityscapes dataset at $CITYSCAPES_ROOT and generated the TrainIds label images, create a soft link with ln -s $CITYSCAPES_ROOT data/cityscapes under $UPSNet_ROOT, then run init_cityscapes.sh to prepare the Cityscapes dataset for UPSNet, as shown below.
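Concretely, the Cityscapes preparation amounts to:

cd $UPSNet_ROOT
ln -s $CITYSCAPES_ROOT data/cityscapes
bash init_cityscapes.sh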

For COCO:

Assuming you have already downloaded the COCO dataset at $COCO_ROOT with annotations and images folders under it, create a soft link with ln -s $COCO_ROOT data/coco under $UPSNet_ROOT, then run init_coco.sh to prepare the COCO dataset for UPSNet, as shown below.
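Likewise, the COCO preparation amounts to:

cd $UPSNet_ROOT
ln -s $COCO_ROOT data/coco
bash init_coco.sh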

Training:

python upsnet/upsnet_end2end_train.py --cfg upsnet/experiments/$EXP.yaml

Test:

python upsnet/upsnet_end2end_test.py --cfg upsnet/experiments/$EXP.yaml

We provide several config files under the upsnet/experiments folder (16-GPU and 4-GPU settings for both the Cityscapes and COCO datasets); see the example below.
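For example, using the 16-GPU Cityscapes ResNet-50 config that also appears in the Model Weights section below (training with this config is an assumption; the repo only shows it used for testing), the train/test pair becomes:

python upsnet/upsnet_end2end_train.py --cfg upsnet/experiments/upsnet_resnet50_cityscapes_16gpu.yaml
python upsnet/upsnet_end2end_test.py --cfg upsnet/experiments/upsnet_resnet50_cityscapes_16gpu.yaml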

Model Weights

Model weights that reproduce the numbers in our paper are now available. Please follow these steps to use them:

Run download_weights.sh to get the trained model weights for Cityscapes and COCO.

For Cityscapes:

python upsnet/upsnet_end2end_test.py --cfg upsnet/experiments/upsnet_resnet50_cityscapes_16gpu.yaml --weight_path ./model/upsnet_resnet_50_cityscapes_12000.pth
python upsnet/upsnet_end2end_test.py --cfg upsnet/experiments/upsnet_resnet101_cityscapes_w_coco_16gpu.yaml --weight_path ./model/upsnet_resnet_101_cityscapes_w_coco_3000.pth

For COCO:

python upsnet/upsnet_end2end_test.py --cfg upsnet/experiments/upsnet_resnet50_coco_16gpu.yaml --weight_path model/upsnet_resnet_50_coco_90000.pth
python upsnet/upsnet_end2end_test.py --cfg upsnet/experiments/upsnet_resnet101_dcn_coco_3x_16gpu.yaml --weight_path model/upsnet_resnet_101_dcn_coco_270000.pth
