teobabic/Simo
An iPhone app enabling hand, head, eye and body motion tracking.
Simo is the first approach that transforms a single off-the-shelf smartphone into a user motion tracking device and controller. Both the front and back cameras of the smartphone are used simultaneously to track the user’s hand, head, eye-gaze and body movements in real-world space and scale.
Simo is an ARKit iOS application made in Unity.
Device/hand motion + touch inputs: Users interact by performing 3D hand movements in 6DOF (translation + rotation) and can reliably segment and further refine their gestures with touchscreen inputs.
Head pose tracking: 6DOF head tracking. Example: This can be used for head-pointing.
Eye-gaze tracking: 6DOF eye-gaze tracking. Example: This can be used for eye-pointing.
Body pose tracking: 6DOF tracking of the torso (position + orientation). Example: This can be used for body-position or ego-centric interactions.
No specialized hardware required: No external hardware, external trackers, markers or cameras are required. Everything relies only on a single iPhone.
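The touch-based gesture segmentation described above ("press to start a gesture, release to end it") can be sketched as follows. This is a hypothetical Python illustration, not code from the Simo app (the app itself is a Unity project), and the sample format is an assumption:

```python
# Hypothetical sketch (not from the Simo codebase): segmenting a stream of
# 6DOF device/hand pose samples into discrete gestures using touch state.

def segment_by_touch(samples):
    """samples: iterable of (pose, touch_down) tuples.
    Returns a list of gestures; each gesture is the list of poses
    recorded while the finger was on the touchscreen."""
    gestures, current = [], None
    for pose, touch_down in samples:
        if touch_down:
            if current is None:
                current = []          # touch went down: start a new gesture
            current.append(pose)
        elif current is not None:
            gestures.append(current)  # touch lifted: close the gesture
            current = None
    if current is not None:           # stream ended mid-touch
        gestures.append(current)
    return gestures

stream = [("p0", False), ("p1", True), ("p2", True), ("p3", False), ("p4", True)]
print(segment_by_touch(stream))  # -> [['p1', 'p2'], ['p4']]
```

Keying the segmentation to touch state gives an unambiguous start/end signal, which is harder to obtain from the motion data alone.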
The Simo app tracks all of the following user motions simultaneously, in real time and at world scale:
(Figure: tracking areas of the front and back iPhone cameras.)
- Device/hand tracking
- Head pose tracking
- Eye-gaze tracking
- Body tracking
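To illustrate how a tracked 6DOF head or eye-gaze pose can drive head-pointing or eye-pointing on a distant display, here is a minimal ray/plane intersection sketch in Python. It is a hypothetical illustration; none of the names or values come from the Simo codebase:

```python
# Hypothetical sketch (not from the Simo codebase): a 6DOF pose such as the
# tracked head or eye-gaze pose yields an origin and a forward direction;
# intersecting that ray with the display plane gives a pointing cursor.

def intersect_ray_plane(origin, direction, plane_point, plane_normal):
    """Return the point where the ray hits the plane, or None if it misses."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the display plane
    t = sum((p - o) * n for p, o, n in zip(plane_point, origin, plane_normal)) / denom
    if t < 0:
        return None  # display is behind the user
    return tuple(o + t * d for o, d in zip(origin, direction))

# Head 2 m from a wall-mounted display, looking straight at it.
head_position  = (0.0, 1.6, 2.0)   # metres, world space
head_forward   = (0.0, 0.0, -1.0)  # forward axis of the 6DOF head pose
display_point  = (0.0, 1.0, 0.0)   # any point on the display plane
display_normal = (0.0, 0.0, 1.0)   # plane faces the user

cursor = intersect_ray_plane(head_position, head_forward, display_point, display_normal)
print(cursor)  # -> (0.0, 1.6, 0.0): the pointing cursor lands on the display
```

The same computation applies to eye-pointing by substituting the eye-gaze origin and direction for the head pose.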
- Open the Simo project in Unity.
- Select iOS as the target platform under File > Build Settings > Platform > iOS > click Switch Platform.
- Build the iOS app under File > Build Settings > Build > Create folder > Choose a name > click Save.
- In the created folder, select Unity-iPhone.xcodeproj and open it in Xcode.
- Connect your iPhone to your Mac and select it as the build Device.
- In the Xcode project settings under Signing & Capabilities, select your Apple Developer Signing Team ID and Bundle Identifier, and press Play.
- Open the Simo app on your iPhone.
- Place an AR Anchor on the floor of your room and use the Change camera and Change view buttons to toggle between different tracking views.
The project was last tested and run with Unity 2021.3+, Xcode 14.3, and an iPhone 13 Pro on iOS 16.4.
System requirements:
- iPhone with ARKit and Face ID capabilities
- Unity
- Xcode
- macOS
This work is based on the publication "Simo: Interactions with Distant Displays by Smartphones with Simultaneous Face and World Tracking" in CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. The publication also covers related work, user studies, applications, and future work directions.
Copyright (C) 2023 Teo Babic