
ai-forever/dynamic_gestures


HaGRID main repo

Overview

This repository demonstrates using the HaGRID dataset for dynamic gesture recognition. The dataset is available here.

Project

├── ocsort/                  # source code for Observation-Centric SORT (OC-SORT)
│   ├── kalmanfilter.py      # Kalman filter
│   ├── kalmanboxtracker.py  # Kalman box tracker
│   ├── association.py       # Association of boxes with trackers
├── utils/                   # useful utils
│   ├── action_controller.py # Action controller for dynamic gestures
│   ├── box_utils_numpy.py   # Box utils for numpy
│   ├── enums.py             # Enums for dynamic gestures and actions
│   ├── hand.py              # Hand class for dynamic gesture recognition
│   ├── drawer.py            # Debug drawer
├── onnx_models.py           # ONNX models for gesture recognition
├── main_controller.py       # Main controller for dynamic gesture recognition; uses the ONNX models, ocsort and utils
├── run_demo.py              # Demo script for dynamic gesture recognition

Installation

Clone the repository and install the required Python packages:

git clone https://github.com/ai-forever/dynamic_gestures.git
# or mirror link:
cd dynamic_gestures

# Create a virtual env with conda or venv
conda create -n dynamic_gestures python=3.9 -y
conda activate dynamic_gestures

# Install requirements
pip install -r requirements.txt

Demo

To run the demo, simply run the run_demo.py script.

python run_demo.py --detector <path_to_detector> --classifier <path_to_classifier> --debug

--detector (optional) Path to the hand detector model. Default: models/hand_detector.onnx

--classifier (optional) Path to the crops classifier model. Default: models/crops_classifier.onnx

--debug (optional) Enables debug mode to see bounding boxes and class labels.
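Since both options point at ONNX files, a quick way to check that custom model paths are valid is to load them with onnxruntime. The snippet below is only a hedged sketch, not part of the repository: it assumes float32 inputs and substitutes 1 for any dynamic input dimensions; the project's own onnx_models.py defines the real pre- and post-processing.

# Hedged sketch: verify that the detector / classifier ONNX files load and run.
# Assumes float32 inputs; dynamic dimensions in the input shape are replaced with 1.
import numpy as np
import onnxruntime as ort


def smoke_test(model_path: str) -> None:
    session = ort.InferenceSession(model_path)
    inp = session.get_inputs()[0]
    shape = [d if isinstance(d, int) else 1 for d in inp.shape]  # fill dynamic dims
    dummy = np.random.rand(*shape).astype(np.float32)
    outputs = session.run(None, {inp.name: dummy})
    print(model_path, "->", [tuple(o.shape) for o in outputs])


smoke_test("models/hand_detector.onnx")
smoke_test("models/crops_classifier.onnx")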

Dynamic gestures

Below, we show the dynamic gestures in user mode and debug mode. In user mode, we show only the final result of dynamic gesture recognition. In debug mode, we show the result of each step of dynamic gesture recognition (a minimal code sketch of these steps follows the list):

  1. hand detection
  2. hand tracking
  3. gesture recognition
  4. action recognition
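The following is a minimal, self-contained sketch of the four debug-mode stages above, using assumed placeholder functions rather than the repository's actual API. In the real project, onnx_models.py, ocsort/ and utils/action_controller.py implement these stages and main_controller.py wires them together; the stubs here only illustrate the data flow.

# Hedged sketch of the pipeline: detection -> tracking -> gesture -> action.
# All functions below are simplified placeholders, not the repository's code.
from collections import deque
from dataclasses import dataclass, field


@dataclass
class Track:
    track_id: int
    box: tuple                                   # (x1, y1, x2, y2)
    gestures: deque = field(default_factory=lambda: deque(maxlen=30))


def detect_hands(frame):
    """1. Hand detection: return hand bounding boxes for one frame (stub)."""
    return [(100, 100, 200, 200)]


def update_tracks(tracks, boxes):
    """2. Hand tracking: naive box-to-track matching (stands in for OC-SORT)."""
    for i, box in enumerate(boxes):
        if i < len(tracks):
            tracks[i].box = box
        else:
            tracks.append(Track(track_id=i, box=box))
    return tracks


def classify_gesture(frame, track):
    """3. Gesture recognition on the tracked hand crop (stub)."""
    return "palm"


def recognize_action(track):
    """4. Action recognition: derive a dynamic gesture from the gesture history (stub)."""
    if len(track.gestures) == track.gestures.maxlen and len(set(track.gestures)) == 1:
        return f"HOLD_{track.gestures[0].upper()}"
    return None


tracks = []
for frame_idx in range(60):                      # stands in for frames from cv2.VideoCapture
    frame = None
    tracks = update_tracks(tracks, detect_hands(frame))
    for track in tracks:
        track.gestures.append(classify_gesture(frame, track))
        action = recognize_action(track)
        if action:
            print(frame_idx, track.track_id, action)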

At the moment, the code supports six groups of dynamic gestures; a short code sketch enumerating them follows the list:

ZOOM

Demo GIFs: Zoom In/Out, Zoom

DRAG AND DROP

Demo GIFs: Drag and Drop 1, Drag and Drop 2, Drag and Drop 3

FAST SWIPE UP / DOWN

Demo GIF: Fast Swipe Up/Down

CLICK

Demo GIF: Clicks

SWIPES LEFT / RIGHT

Demo GIFs: Swipe Left/Right, Swipe 2 Left/Right, Swipe 3 Left/Right

SWIPES UP / DOWN

Demo GIFs: Swipe Up/Down, Swipe 2 Up/Down, Swipe 3 Up/Down
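Purely as an illustration, the six groups above could be enumerated along the following lines. The real definitions live in utils/enums.py; the member names below are assumptions, not the repository's actual ones.

# Illustrative only: an enum of the six dynamic-gesture groups described above.
# The repository's actual enums live in utils/enums.py and may use different names.
from enum import Enum, auto


class DynamicGestureGroup(Enum):
    ZOOM = auto()                # zoom in / zoom out
    DRAG_AND_DROP = auto()       # three drag-and-drop variants
    FAST_SWIPE_UP_DOWN = auto()
    CLICK = auto()
    SWIPE_LEFT_RIGHT = auto()    # three left/right swipe variants
    SWIPE_UP_DOWN = auto()       # three up/down swipe variants


def on_gesture(group: DynamicGestureGroup) -> str:
    """Example dispatch from a recognized group to an application-level action."""
    return {
        DynamicGestureGroup.ZOOM: "adjust zoom level",
        DynamicGestureGroup.CLICK: "send click event",
    }.get(group, "no-op")


print(on_gesture(DynamicGestureGroup.ZOOM))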

License

This work is licensed under a variant of the Apache License, Version 2.0.

Please see the specific license.

Citation

You can cite the paper using the following BibTeX entry:

@misc{nuzhdin2024hagridv21mimagesstatic,
      title={HaGRIDv2: 1M Images for Static and Dynamic Hand Gesture Recognition},
      author={Anton Nuzhdin and Alexander Nagaev and Alexander Sautin and Alexander Kapitanov and Karina Kvanchiani},
      year={2024},
      eprint={2412.01508},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2412.01508},
}
