A self-contained example demonstrating how to use the MediaPipe Hand Gesture Recognizer with Max's jweb, connected either to a live webcam stream or to still images. This example does both hand tracking and gesture recognition.
lysdexic-audio/jweb-hands-gesture-recognizer
If you only require hand tracking, try jweb-hands-landmarker.
The gesture classification model bundle can recognize these common hand gestures:

0 - Unrecognized gesture, label: Unknown
1 - Closed fist, label: Closed_Fist
2 - Open palm, label: Open_Palm
3 - Pointing up, label: Pointing_Up
4 - Thumbs down, label: Thumb_Down
5 - Thumbs up, label: Thumb_Up
6 - Victory, label: Victory
7 - Love, label: ILoveYou

The hand landmark model bundle detects the keypoint localization of 21 hand-knuckle coordinates within the detected hand regions. The model was trained on approximately 30K real-world images, as well as several rendered synthetic hand models imposed over various backgrounds.
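As a minimal sketch of how the labels above might be handled in jweb's JavaScript: MediaPipe's GestureRecognizerResult exposes a `gestures` array with one list of categories per detected hand, each category carrying a `categoryName` (one of the labels above) and a `score`. The helper name, message format, and mocked result below are illustrative assumptions, not part of this repository.

```javascript
// Turn a GestureRecognizerResult-shaped object into flat messages,
// one per detected hand: [handIndex, label, score].
// Assumes the MediaPipe Tasks JS result shape: gestures: [[{categoryName, score}]].
function gestureMessages(result) {
  return result.gestures.map((categories, hand) => {
    // The first entry in each per-hand array is the top-scoring gesture.
    const top = categories[0];
    return [hand, top.categoryName, Number(top.score.toFixed(3))];
  });
}

// Mocked result standing in for a real recognizer call (two hands detected):
const mockResult = {
  gestures: [
    [{ categoryName: "Open_Palm", score: 0.912 }],
    [{ categoryName: "Thumb_Up", score: 0.874 }],
  ],
};

const messages = gestureMessages(mockResult);
console.log(messages);
```

Inside jweb, each message could then be forwarded to the Max patch, e.g. `messages.forEach((m) => window.max.outlet("gesture", ...m));`.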
This example is inspired by an example by Rob Ramirez, which is in turn inspired by MediaPipe in JavaScript.