RalphMao/VMetrics
This repo provides the evaluation code used in our ICCV 2019 paper "A Delay Metric for Video Object Detection: What Average Precision Fails to Tell", including:
- Mean Average Precision (mAP)
- Average Delay (AD)
- A redesigned NAB metric for the video object detection problem.
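Among these, Average Delay measures how long a detector takes to fire on a newly appearing object. As a rough illustration of that idea only (not the paper's exact AD definition, which also controls for confidence thresholds and false alarms), the mean delay over detected instances can be sketched as:

```python
# Toy sketch of the average-delay idea: for each ground-truth instance,
# count the frames from its first appearance until the detector first
# fires on it. Instance ids and frame indices below are made-up examples.

def average_delay(first_appearance, first_detection):
    """first_appearance / first_detection: dicts mapping an instance id
    to a frame index. Instances never detected are skipped in this toy
    version; the real metric penalizes them."""
    delays = [first_detection[i] - first_appearance[i]
              for i in first_appearance if i in first_detection]
    return sum(delays) / len(delays) if delays else float("inf")

# Instance "a" is detected 3 frames after it appears, "b" immediately.
print(average_delay({"a": 0, "b": 10}, {"a": 3, "b": 10}))  # 1.5
```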
Download the groundtruth annotations and the sample detector outputs from Google Drive.
The groundtruth annotations of VIDT are stored in KITTI format because of its simplicity and I/O efficiency.
We provide the outputs of the following methods. The GitHub repos that generated those outputs are also listed.
All evaluation scripts are under the `./experiments` folder. For instance, to measure the mAP and AD of FGFA, run:
python experiments/eval_map_ad.py examples/rfcn_fgfa_7 data/ILSVRC2015_KITTI_FORMAT
For every video sequence, output a file named `<sequence_name>.txt`. Each line in the file should describe a single object in the format `<frame_id> <class_id> <confidence> <xmin> <ymin> <xmax> <ymax>`.
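The per-sequence output format above can be produced with a small helper. This is a minimal sketch; the field order and file naming follow the spec above, while the detections themselves are made-up example values:

```python
# Sketch: write detector outputs in the required per-sequence format,
# one line per detected object:
#   <frame_id> <class_id> <confidence> <xmin> <ymin> <xmax> <ymax>

def write_sequence_output(path, detections):
    """detections: iterable of
    (frame_id, class_id, confidence, xmin, ymin, xmax, ymax) tuples."""
    with open(path, "w") as f:
        for frame_id, class_id, conf, xmin, ymin, xmax, ymax in detections:
            f.write(f"{frame_id} {class_id} {conf:.4f} "
                    f"{xmin:.1f} {ymin:.1f} {xmax:.1f} {ymax:.1f}\n")

# Hypothetical sequence name and detections, for illustration only.
write_sequence_output("ILSVRC2015_val_00000000.txt", [
    (0, 1, 0.93, 10.0, 20.0, 110.0, 220.0),
    (1, 1, 0.88, 12.0, 21.0, 112.0, 221.0),
])
```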
This pure-Python mAP evaluation code is refactored from Cartucho/mAP. It has been tested against the original MATLAB version.
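For context, per-class Average Precision is the area under the precision-recall curve of score-ranked detections, and mAP is its mean over classes. The following is a toy sketch of that area computation only, not the actual Cartucho/mAP code (which also performs IoU matching against ground truth and follows the PASCAL VOC conventions):

```python
# Toy AP sketch: accumulate precision * recall-increment over detections
# sorted by descending confidence. Inputs are pre-matched detections.

def average_precision(scored_matches, num_gt):
    """scored_matches: list of (confidence, is_true_positive) pairs.
    num_gt: number of ground-truth objects for this class."""
    scored = sorted(scored_matches, key=lambda x: -x[0])
    tp = fp = 0
    ap = prev_recall = 0.0
    for _, is_tp in scored:
        if is_tp:
            tp += 1
        else:
            fp += 1
        recall = tp / num_gt
        precision = tp / (tp + fp)
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap

# Two ground-truth objects; one false positive ranked in between.
print(average_precision([(0.9, True), (0.8, False), (0.7, True)], 2))
```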
About
A Python library to evaluate mean Average Precision (mAP) for object detection. Provides the same output as PASCAL VOC's MATLAB code.