teobabic/Simo

An iPhone app enabling hand, head, eye and body motion tracking.

Simo is the first approach that transforms a single off-the-shelf smartphone into a user motion tracking device and controller. Both the front and back cameras of the smartphone are used simultaneously to track the user's hand, head, eye-gaze and body movements in real-world space and scale.

Simo is an ARKit iOS application made in Unity.
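The core idea — fusing front-camera face tracking with back-camera world tracking — can be sketched as a transform composition: the device's world pose (from world tracking) is chained with the head's device-relative pose (from face tracking) to express the head in world coordinates. The names and example poses below are illustrative, not the app's actual API:

```python
import numpy as np

def translation(x, y, z):
    """Homogeneous 4x4 translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical poses, both 4x4 homogeneous transforms:
#   world_T_device: phone pose in world space, from back-camera
#                   world tracking.
#   device_T_head:  head pose relative to the phone, from
#                   front-camera face tracking.
world_T_device = translation(0.0, 1.2, 0.5)  # phone held 1.2 m up
device_T_head = translation(0.0, 0.1, 0.4)   # head 0.4 m from the screen

# Composing the two yields the head pose in world space and scale --
# the fusion that lets a single phone track the user in the room.
world_T_head = world_T_device @ device_T_head

print(world_T_head[:3, 3])  # head position in world coordinates
```

The same composition applies to the eye-gaze and body poses: anything measured relative to the phone becomes world-referenced once multiplied by the phone's world pose.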

Features

  • Device/hand motion + touch inputs: Users interact by performing 3D hand movements in 6DOF (translation + rotation) and can reliably segment and enhance their gestures with touchscreen inputs.

  • Head pose tracking: 6DOF head tracking, usable for example for head-pointing.

  • Eye-gaze tracking: 6DOF eye-gaze tracking, usable for example for eye-pointing.

  • Body pose tracking: 6DOF tracking of the torso (position + orientation), usable for example for body-position or ego-centric interactions.

  • No specialized hardware required: No external trackers, markers, or cameras are needed. Everything relies on a single iPhone.
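Head-pointing and eye-pointing both reduce to casting the forward ray of a 6DOF pose against a target surface, such as a distant display. A minimal ray-plane intersection sketch (the geometry and names are illustrative, not taken from the app):

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect a pointing ray with a plane (e.g. a distant display).

    Returns the hit point, or None if the ray is parallel to the
    plane or the plane lies behind the ray origin.
    """
    direction = direction / np.linalg.norm(direction)
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None              # ray parallel to the plane
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:
        return None              # plane is behind the user
    return origin + t * direction

# Eyes at 1.6 m height, gazing straight ahead (+z); display plane at z = 3 m.
eye_origin = np.array([0.0, 1.6, 0.0])
gaze_dir = np.array([0.0, 0.0, 1.0])
hit = ray_plane_intersection(eye_origin, gaze_dir,
                             plane_point=np.array([0.0, 0.0, 3.0]),
                             plane_normal=np.array([0.0, 0.0, -1.0]))
print(hit)  # hit point on the display plane
```

Swapping in the head pose's forward vector instead of the gaze direction gives head-pointing with the same routine.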

The Simo app tracks all of the following user motions simultaneously, in real time and at world scale:

  • Device/hand tracking
  • Head pose tracking
  • Eye-gaze tracking
  • Body tracking

(Figure: tracking areas of the front and back iPhone cameras.)
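Conceptually, every frame yields one synchronized sample of all four signals. A hypothetical container for such a sample, in plain Python rather than the app's actual Unity data model:

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]
Quat = Tuple[float, float, float, float]  # (x, y, z, w)

@dataclass(frozen=True)
class Pose6DOF:
    position: Vec3       # metres, world space
    orientation: Quat    # unit quaternion, world space

@dataclass(frozen=True)
class TrackingFrame:
    """One synchronized sample of every signal Simo tracks."""
    timestamp: float     # seconds
    device: Pose6DOF     # hand-held phone (back camera, world tracking)
    head: Pose6DOF       # head pose (front camera, face tracking)
    gaze: Pose6DOF       # eye-gaze origin + direction, as a pose
    body: Pose6DOF       # torso position + orientation

identity = Pose6DOF((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))
frame = TrackingFrame(0.0, identity, identity, identity, identity)
print(frame.head.position)
```

Grouping the signals per frame keeps them time-aligned, which matters when an interaction combines streams (e.g. gating gaze input on a touch event).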

Quickstart guide

  • Open the Simo project in Unity.
  • Select iOS as the target platform under File > Build Settings > Platform > iOS and click Switch Platform.
  • Build the iOS app under File > Build Settings > Build > create a folder > choose a name > click Save.
  • In the created folder, select Unity-iPhone.xcodeproj and open it in Xcode.
  • Connect your iPhone to your Mac and select it as the build Device.
  • In the Xcode project settings under Signing & Capabilities, select your Apple Developer signing Team ID and Bundle Identifier, then press Play.
  • Open the Simo app on your iPhone.
  • Place an AR anchor on the floor of your room and use the Change camera and Change view buttons to toggle between different tracking views.

Compatibility

The project was last tested with Unity 2021.3+ and Xcode 14.3, on an iPhone 13 Pro running iOS 16.4.

System requirements:

  • iPhone with ARKit and Face ID capabilities
  • Unity
  • Xcode
  • macOS

More information

This work is based on the publication "Simo: Interactions with Distant Displays by Smartphones with Simultaneous Face and World Tracking" in CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. The publication also covers related work, user studies, applications, and future work directions.


Copyright (C) 2023 Teo Babic
