detrex is a research platform for Transformer-based Object Detection algorithms including DETR (ECCV 2020), Deformable-DETR (ICLR 2021), Conditional-DETR (ICCV 2021), DAB-DETR (ICLR 2022), DN-DETR (CVPR 2022), DINO (arXiv 2022), H-DETR (arXiv 2022), MaskDINO (arXiv 2022), etc.


📘Documentation |🛠️Installation |👀Model Zoo |🚀Awesome DETR |🆕News |🤔Reporting Issues

Introduction

detrex is an open-source toolbox that provides state-of-the-art Transformer-based detection algorithms. It is built on top of Detectron2, and its module design is partially borrowed from MMDetection and DETR. Many thanks for their nicely organized code. The main branch works with PyTorch 1.10 or higher (we recommend PyTorch 1.12).

Major Features
  • Modular Design. detrex decomposes the Transformer-based detection framework into reusable components, which helps users easily build their own customized models (see the config sketch after this list).

  • State-of-the-art Methods. detrex provides a series of Transformer-based detection algorithms, including DINO, which reached state-of-the-art performance among DETR-like models with 63.3 AP!

  • Easy to Use. detrex is designed to be lightweight and easy to use.
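
As a rough illustration of the modular, config-driven workflow, the sketch below loads a project config and builds a model with Detectron2's LazyConfig utilities, which detrex is built on. The config path and the num_queries override are assumptions taken from the DINO project layout and may differ in your checkout; treat this as a sketch rather than the canonical API.

```python
# Minimal sketch of the config-driven workflow. Assumes detrex and Detectron2
# are installed and that the DINO project config path below exists in your
# checkout; adjust the path to any project under projects/.
from detectron2.config import LazyConfig, instantiate

# Load a Python-based "lazy" config describing the model, dataloader, optimizer, etc.
cfg = LazyConfig.load("projects/dino/configs/dino_r50_4scale_12ep.py")

# Optionally override individual components before building them.
# (Hypothetical tweak; the available fields depend on the project config.)
cfg.model.num_queries = 300

# Build the actual detector from the config node.
model = instantiate(cfg.model)
print(type(model))
```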

Apart from detrex, we also released the repo Awesome Detection Transformer to present papers on Transformers for detection and segmentation.

Fun Facts

The repo name detrex has several interpretations:

  • detr-ex : We take our hats off to DETR and regard this repo as an extension of Transformer-based detection algorithms.

  • det-rex : rex literally means 'king' in Latin. We hope this repo can help advance the state of the art in object detection by providing the best Transformer-based detection algorithms from the research community.

  • de-t.rex : de means 'the' in Dutch. T.rex, also called Tyrannosaurus Rex, means 'king of the tyrant lizards' and connects to our research work 'DINO', which is short for Dinosaur.

What's New

v0.2.0 was released on 13/11/2022:

  • Released new baselines for DINO-R50-12ep, DINO-Swin-Large-36ep, DAB-Deformable-DETR-R50-50ep, and DAB-Deformable-DETR-R50-Two-Stage; please check the Model Zoo.
  • Rebuilt clearer config files for projects.
  • Added support for H-Deformable-DETR.
  • Released H-Deformable-DETR pretrained weights, including H-Deformable-DETR-R50, H-Deformable-DETR-Swin-Tiny, and H-Deformable-DETR-Swin-Large, in H-Deformable-DETR.
  • Added a demo for visualizing customized input images or videos with pretrained weights in demo (see the command sketch after this list).
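
A rough sketch of how the demo might be invoked from the command line is shown below. The script path, flags, and checkpoint name follow Detectron2-style demo conventions and the detrex project layout; they are assumptions, so check demo/demo.py --help in your checkout for the exact interface.

```bash
# Hedged sketch: visualize predictions on an image with a pretrained DINO checkpoint.
python demo/demo.py \
    --config-file projects/dino/configs/dino_r50_4scale_12ep.py \
    --input path/to/your_image.jpg \
    --output visualized_result.jpg \
    --opts train.init_checkpoint=path/to/dino_r50_4scale_12ep.pth
```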

Please see changelog.md for details and release history.

Installation

Please refer to Installation Instructions for installation details.
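
For orientation, a minimal from-source install might look like the sketch below. It assumes detrex bundles detectron2 as a git submodule and supports editable pip installs; defer to the linked Installation Instructions for the authoritative steps.

```bash
# Hedged sketch of a from-source install; assumes a working CUDA-enabled PyTorch env.
git clone https://github.com/IDEA-Research/detrex.git
cd detrex
git submodule init && git submodule update   # pulls the bundled detectron2 (assumption)
python -m pip install -e detectron2
python -m pip install -e .
```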

Getting Started

Please refer to Getting Started with detrex for the basic usage of detrex. We also provide other tutorials; a quick training sketch follows below.
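
As a quick orientation, training a project model typically goes through tools/train_net.py with a Python config file, as sketched below. The config path and GPU count are placeholders, so follow the Getting Started tutorial for the exact commands.

```bash
# Hedged sketch: launch training with a project config on 8 GPUs.
python tools/train_net.py \
    --config-file projects/dino/configs/dino_r50_4scale_12ep.py \
    --num-gpus 8
```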

Documentation

Please see the documentation for full API documentation and tutorials.

Model Zoo

Results and models are available in the model zoo.
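
To try a model-zoo checkpoint, evaluation can usually be run through the same training script in eval-only mode, as sketched below. The --eval-only flag and the train.init_checkpoint override mirror Detectron2 LazyConfig conventions and are assumptions to verify against the model zoo instructions.

```bash
# Hedged sketch: evaluate a downloaded checkpoint without training.
python tools/train_net.py \
    --config-file projects/dino/configs/dino_r50_4scale_12ep.py \
    --eval-only \
    train.init_checkpoint=path/to/downloaded_checkpoint.pth
```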

Supported methods

Please see projects for details about the methods built on detrex.

License

This project is released under the Apache 2.0 license.

Acknowledgement

  • detrex is an open-source toolbox for Transformer-based detection algorithms created by researchers of IDEACVR. We appreciate all contributions to detrex!
  • detrex is built on Detectron2, and part of its module design is borrowed from MMDetection, DETR, and Deformable-DETR.

Citation

If you find the projects held by detrex useful in your research, please consider citing:

Citation List
  • Cite detrex

@misc{ideacvr2022detrex,
  author       = {detrex contributors},
  title        = {detrex: An Research Platform for Transformer-based Object Detection Algorithms},
  howpublished = {\url{https://github.com/IDEA-Research/detrex}},
  year         = {2022}
}

  • Cite DETR

@inproceedings{carion2020end,
  title        = {End-to-end object detection with transformers},
  author       = {Carion, Nicolas and Massa, Francisco and Synnaeve, Gabriel and Usunier, Nicolas and Kirillov, Alexander and Zagoruyko, Sergey},
  booktitle    = {European conference on computer vision},
  pages        = {213--229},
  year         = {2020},
  organization = {Springer}
}

  • Cite Deformable-DETR

@article{zhu2020deformable,
  title   = {Deformable DETR: Deformable Transformers for End-to-End Object Detection},
  author  = {Zhu, Xizhou and Su, Weijie and Lu, Lewei and Li, Bin and Wang, Xiaogang and Dai, Jifeng},
  journal = {arXiv preprint arXiv:2010.04159},
  year    = {2020}
}

  • Cite Conditional-DETR

@inproceedings{meng2021-CondDETR,
  title     = {Conditional DETR for Fast Training Convergence},
  author    = {Meng, Depu and Chen, Xiaokang and Fan, Zejia and Zeng, Gang and Li, Houqiang and Yuan, Yuhui and Sun, Lei and Wang, Jingdong},
  booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  year      = {2021}
}

  • Cite DAB-DETR

@inproceedings{liu2022dabdetr,
  title     = {{DAB}-{DETR}: Dynamic Anchor Boxes are Better Queries for {DETR}},
  author    = {Shilong Liu and Feng Li and Hao Zhang and Xiao Yang and Xianbiao Qi and Hang Su and Jun Zhu and Lei Zhang},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://openreview.net/forum?id=oMI9PjOb9Jl}
}

  • Cite DN-DETR

@inproceedings{li2022dn,
  title     = {Dn-detr: Accelerate detr training by introducing query denoising},
  author    = {Li, Feng and Zhang, Hao and Liu, Shilong and Guo, Jian and Ni, Lionel M and Zhang, Lei},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages     = {13619--13627},
  year      = {2022}
}

  • Cite DINO

@misc{zhang2022dino,
  title         = {DINO: DETR with Improved DeNoising Anchor Boxes for End-to-End Object Detection},
  author        = {Hao Zhang and Feng Li and Shilong Liu and Lei Zhang and Hang Su and Jun Zhu and Lionel M. Ni and Heung-Yeung Shum},
  year          = {2022},
  eprint        = {2203.03605},
  archivePrefix = {arXiv},
  primaryClass  = {cs.CV}
}

  • Cite Group-DETR

@article{chen2022group,
  title   = {Group DETR: Fast DETR Training with Group-Wise One-to-Many Assignment},
  author  = {Chen, Qiang and Chen, Xiaokang and Wang, Jian and Feng, Haocheng and Han, Junyu and Ding, Errui and Zeng, Gang and Wang, Jingdong},
  journal = {arXiv preprint arXiv:2207.13085},
  year    = {2022}
}

  • Cite H-DETR

@article{jia2022detrs,
  title   = {DETRs with Hybrid Matching},
  author  = {Jia, Ding and Yuan, Yuhui and He, Haodi and Wu, Xiaopei and Yu, Haojun and Lin, Weihong and Sun, Lei and Zhang, Chao and Hu, Han},
  journal = {arXiv preprint arXiv:2207.13080},
  year    = {2022}
}
