
Systems and Methods for Interacting with a Projected User Interface

Info

Publication number
US20150089453A1
Authority
US
United States
Prior art keywords
user interface
interface display
image data
projected
camera
Prior art date
2013-09-25
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/497,090
Inventor
Carlo Dal Mutto
Abbas Rafii
Britta Hummel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aquifi Inc
Original Assignee
Aquifi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2013-09-25
Filing date
2014-09-25
Publication date
2015-03-26
Application filed by Aquifi Inc
Priority to US14/497,090
Assigned to AQUIFI, INC. (assignment of assignors interest; assignors: DAL MUTTO, CARLO; HUMMEL, BRITTA; RAFII, ABBAS)
Publication of US20150089453A1
Status: Abandoned


Abstract

A system and method for 3D gesture based interaction with a projected user interface is disclosed. A user interface display is projected onto a projection surface. Image data of the user interface display and an interaction medium are captured. The image data includes visible light data and infrared (IR) data. The visible light data is used to register the user interface display on the projection surface with the field of view (FOV) of at least one camera capturing the image data. The IR data is used to determine gesture recognition information for the interaction medium. The registration information and the gesture recognition information are then used to identify interactions.
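As a concrete illustration of the pipeline the abstract describes, the following Python fragment processes one frame from an assumed RGB-IR camera. This is a minimal sketch, not the patent's implementation: find_display_corners and detect_fingertip are crude hypothetical placeholders for the registration and gesture detectors the abstract leaves unspecified.

```python
# Minimal per-frame sketch of the abstract's pipeline, assuming an RGB-IR
# camera yielding a grayscale visible image and a single-channel IR image.
# find_display_corners and detect_fingertip are hypothetical placeholders.
import numpy as np
import cv2

def find_display_corners(visible_gray):
    """Placeholder registration: find the bright projected quad's corners."""
    _, mask = cv2.threshold(visible_gray, 128, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    quad = max(contours, key=cv2.contourArea)
    approx = cv2.approxPolyDP(quad, 0.02 * cv2.arcLength(quad, True), True)
    # Corner ordering relative to the UI frame is glossed over here.
    return approx.reshape(-1, 2).astype(np.float32) if len(approx) == 4 else None

def detect_fingertip(ir_img):
    """Placeholder gesture step: brightest IR pixel as the IR-lit fingertip."""
    _, max_val, _, max_loc = cv2.minMaxLoc(ir_img)  # single-channel input
    return max_loc if max_val > 200 else None

def process_frame(visible_gray, ir_img, ui_corners):
    """Map an IR-detected fingertip into the UI display's coordinate frame."""
    corners_cam = find_display_corners(visible_gray)  # visible data: registration
    tip_cam = detect_fingertip(ir_img)                # IR data: gesture info
    if corners_cam is None or tip_cam is None:
        return None
    # Registration information: homography from camera pixels to UI coordinates.
    H, _ = cv2.findHomography(corners_cam, np.float32(ui_corners))
    pt = np.array([[tip_cam]], dtype=np.float32)
    return cv2.perspectiveTransform(pt, H)[0, 0]      # hit-test against widgets
```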


Claims (30)

What is claimed is:
1. A processing system configured to conduct Three Dimensional (3D) gesture based interactive sessions for a projected user interface display comprising:
a memory containing an image processing application; and
a processor directed by the image processing application read from the memory to:
receive image data that includes visible light image data and Infrared (IR) image data,
obtain visible light image data from the image data,
generate registration information for a user interface display on a projection surface with a field of view of one or more image capture devices using the visible light image data,
obtain IR image data from the image data,
generate gesture information for an interaction medium using the IR image data, and
identify an interaction with an interactive object within the user interface display using the gesture information and the registration information.
2. The processing system of claim 1 wherein the generating of the registration information includes determining geometric relationship information that relates the FOV of the at least one camera to the user interface display on the projection surface.
3. The processing system of claim 2 wherein the geometric relationship is the homography between the FOV of the at least one camera and the user interface display on the projection surface.
4. The processing system of claim 3 wherein the geometric relationship information is determined based upon AR tags in the projected user interface display.
5. The processing system of claim 4 wherein the projected user interface display includes at least four AR tags.
6. The processing system of claim 4 wherein the AR tags are interactive objects in the user interface display.
7. The processing system of claim 1 wherein the generating of the registration information includes determining 3D location information for the projection surface indicating a position of the projection surface in 3D space.
8. The processing system of claim 7 wherein the 3D location information is determined based upon fiducials within the user interface display.
9. The processing system of claim 8 wherein the user interface display includes at least 3 fiducials.
10. The processing system of claim 8 wherein each fiducial in the user interface display is an interactive object in the user interface display.
11. The processing system of claim 1 wherein the interaction medium is illuminated with an IR illumination source.
12. The processing system of claim 1 wherein the visible light image data is obtained from images captured by the at least one camera that include only the projected user interface display on the projection surface.
13. The processing system of claim 1 wherein the visible light image data is obtained from images captured by the at least one camera that include the interaction medium and the projected user interface display on the projection surface.
14. The processing system of claim 1 wherein the IR image data is obtained from images captured by the at least one camera that include the interaction medium and the projected user interface display on the projection surface.
15. The processing system of claim 1 wherein the image data is captured using at least one depth camera.
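Claims 3-5 call for estimating the homography between the camera's FOV and the projected display from at least four AR tags. Below is a hedged sketch of that idea using OpenCV's ArUco markers (from opencv-contrib) as a stand-in for the patent's unspecified tag scheme; the tag IDs, UI resolution, and corner layout are assumptions for illustration. Since claim 6 lets the tags double as interactive objects, nothing here precludes drawing them as buttons.

```python
# Sketch of claims 3-5: homography from four AR tags at known UI positions.
# Uses OpenCV's ArUco module; the layout below is an assumed example.
import numpy as np
import cv2

# Hypothetical layout: tags 0-3 rendered at the corners of a 1280x800 UI.
UI_POS = {0: (0, 0), 1: (1280, 0), 2: (1280, 800), 3: (0, 800)}

def homography_from_ar_tags(visible_img):
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(visible_img, aruco_dict)
    if ids is None or len(ids) < 4:
        return None  # claim 5: at least four tags in the display

    cam_pts, ui_pts = [], []
    for tag_corners, tag_id in zip(corners, ids.flatten()):
        if int(tag_id) in UI_POS:
            cam_pts.append(tag_corners[0].mean(axis=0))  # tag center, camera px
            ui_pts.append(UI_POS[int(tag_id)])           # known UI-space position
    if len(cam_pts) < 4:
        return None
    H, _ = cv2.findHomography(np.float32(cam_pts), np.float32(ui_pts))
    return H  # geometric relationship: camera FOV -> UI display (claim 3)
```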
16. A method for providing Three Dimensional (3D) gesture based interactive sessions for a projected user interface display comprising:
generating a user interface display including interactive objects using a processing system;
projecting the user interface display onto a projection surface using a projector;
capturing image data of the projected user interface display on the projection surface using at least one camera;
obtaining visible light image data from the image data using the processing system;
generating registration information for the user interface display on the projection surface with the field of view of one or more image capture devices providing the image data from the visible light image data using the processing system;
obtaining IR image data from the image data using the processing system;
generating gesture information for an interaction medium in the image data from the IR image data using the processing system; and
identifying an interaction with an interactive object within the user interface display using the gesture information and the registration information.
17. The method of claim 16 wherein the generating of the registration information includes determining geometric relationship information that relates the FOV of the at least one camera to the user interface display on the projection surface using the processing system.
18. The method of claim 17 wherein the geometric relationship is the homography between the FOV of the at least one camera and the user interface display on the projection surface.
19. The method of claim 18 wherein the geometric relationship information is determined based upon AR tags in the projected user interface display.
20. The method of claim 19 wherein the projected user interface display includes at least four AR tags.
21. The method of claim 19 wherein the AR tags are interactive objects in the user interface display.
22. The method of claim 16 wherein the generating of the registration information includes determining 3D location information for the projection surface indicating a location of the projection surface in 3D space using the processing system.
23. The method of claim 22 wherein the 3D location information is determined based upon fiducials within the user interface display.
24. The method of claim 23 wherein the user interface display includes at least 3 fiducials.
25. The method of claim 23 wherein each fiducial in the user interface display is an interactive object in the user interface display.
26. The method of claim 16 further comprising:
emitting IR light towards the projection surface using at least one IR emitter to illuminate the interaction medium.
27. The method of claim 16 wherein the visible light image data is obtained from images captured by the at least one camera that include only the projected user interface display on the projection surface.
28. The method of claim 16 wherein the visible light image data is obtained from images captured by the at least one camera that include the interaction medium and the projected user interface display on the projection surface.
29. The method of claim 16 wherein the IR image data is obtained from images captured by the at least one camera that include the interaction medium and the projected user interface display on the projection surface.
30. The method of claim 16 wherein the image data is captured using at least one depth camera.
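Claims 22-24 (mirroring claims 7-9) derive the projection surface's location in 3D space from at least three fiducials. Assuming a depth camera (claim 30) reports each fiducial's 3D position, a least-squares plane fit is one plausible realization; the coordinates in the example below are made up.

```python
# Sketch of claims 22-24: fit the projection surface plane n.x = d from
# three or more fiducials with known 3D positions (e.g. from a depth camera).
import numpy as np

def fit_projection_surface(fiducials_3d):
    """Least-squares plane through >= 3 fiducial points (N x 3 array)."""
    pts = np.asarray(fiducials_3d, dtype=np.float64)
    centroid = pts.mean(axis=0)
    # The plane normal is the singular vector of the centered point set
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return normal, normal @ centroid  # orientation and offset (3D location info)

# Hypothetical fiducials on a tilted tabletop, in meters, camera coordinates.
n, d = fit_projection_surface([(0.0, 0.0, 1.00),
                               (0.5, 0.0, 1.10),
                               (0.0, 0.4, 1.05)])
```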
US14/497,090 | priority 2013-09-25 | filed 2014-09-25 | Systems and Methods for Interacting with a Projected User Interface | Abandoned | US20150089453A1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US14/497,090 | 2013-09-25 | 2014-09-25 | Systems and Methods for Interacting with a Projected User Interface

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
US201361960783P | 2013-09-25 | 2013-09-25 |
US201462009844P | 2014-06-09 | 2014-06-09 |
US14/497,090 | 2013-09-25 | 2014-09-25 | Systems and Methods for Interacting with a Projected User Interface

Publications (1)

Publication Number | Publication Date
US20150089453A1 (en) | 2015-03-26

Family

ID=52692212

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US14/497,090 (Abandoned, US20150089453A1 (en)) | Systems and Methods for Interacting with a Projected User Interface | 2013-09-25 | 2014-09-25

Country Status (1)

Country | Link
US (1) | US20150089453A1 (en)


Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20100231522A1 (en)* | 2005-02-23 | 2010-09-16 | Zienon, Llc | Method and apparatus for data entry input
US20070098234A1 (en)* | 2005-10-31 | 2007-05-03 | Mark Fiala | Marker and method for detecting said marker
US20080143975A1 (en)* | 2006-10-25 | 2008-06-19 | International Business Machines Corporation | System and method for interacting with a display
US20130002539A1 (en)* | 2006-10-25 | 2013-01-03 | International Business Machines Corporation | System and method for interacting with a display
US20110285633A1 (en)* | 2007-01-25 | 2011-11-24 | Microsoft Corporation | Dynamic projected user interface
US20100079481A1 (en)* | 2007-01-25 | 2010-04-01 | Li Zhang | Method and system for marking scenes and images of scenes with optical tags
US8471868B1 (en)* | 2007-11-28 | 2013-06-25 | Sprint Communications Company L.P. | Projector and ultrasonic gesture-controlled communicator
US20100045701A1 (en)* | 2008-08-22 | 2010-02-25 | Cybernet Systems Corporation | Automatic mapping of augmented reality fiducials
US20100079369A1 (en)* | 2008-09-30 | 2010-04-01 | Microsoft Corporation | Using Physical Objects in Conjunction with an Interactive Surface
US20100103101A1 (en)* | 2008-10-27 | 2010-04-29 | Song Hyunyoung | Spatially-aware projection pen interface
US20140078052A1 (en)* | 2010-04-16 | 2014-03-20 | Seiko Epson Corporation | Detecting User Input Provided to a Projected User Interface
US20110254939A1 (en)* | 2010-04-16 | 2011-10-20 | Tatiana Pavlovna Kadantseva | Detecting User Input Provided To A Projected User Interface
US20130060146A1 (en)* | 2010-04-28 | 2013-03-07 | Ryerson University | System and methods for intraoperative guidance feedback
US20110288964A1 (en)* | 2010-05-24 | 2011-11-24 | Massachusetts Institute Of Technology | Kinetic Input/Output
US20120017147A1 (en)* | 2010-07-16 | 2012-01-19 | John Liam Mark | Methods and systems for interacting with projected user interface
US20120042288A1 (en)* | 2010-08-16 | 2012-02-16 | Fuji Xerox Co., Ltd. | Systems and methods for interactions with documents across paper and computers
US20120299876A1 (en)* | 2010-08-18 | 2012-11-29 | Sony Ericsson Mobile Communications AB | Adaptable projection on occluding object in a projected user interface
US20120166993A1 (en)* | 2010-12-24 | 2012-06-28 | Anderson Glen J | Projection interface techniques
US20140006997A1 (en)* | 2011-03-16 | 2014-01-02 | Lg Electronics Inc. | Method and electronic device for gesture-based key input
US20120249416A1 (en)* | 2011-03-29 | 2012-10-04 | Giuliano Maciocci | Modular mobile connected pico projectors for a local multi-user collaboration
US20120249409A1 (en)* | 2011-03-31 | 2012-10-04 | Nokia Corporation | Method and apparatus for providing user interfaces
US20120315965A1 (en)* | 2011-06-08 | 2012-12-13 | Microsoft Corporation | Locational Node Device
US20130086531A1 (en)* | 2011-09-29 | 2013-04-04 | Kabushiki Kaisha Toshiba | Command issuing device, method and computer program product
US8887043B1 (en)* | 2012-01-17 | 2014-11-11 | Rawles Llc | Providing user feedback in projection environments
US9565394B2 (en)* | 2012-04-24 | 2017-02-07 | Hewlett-Packard Development Company, L.P. | System for displaying an image
US20150262426A1 (en)* | 2012-08-28 | 2015-09-17 | University Of South Australia | Spatial Augmented Reality (SAR) Application Development System
US20140125668A1 (en)* | 2012-11-05 | 2014-05-08 | Jonathan Steed | Constructing augmented reality environment with pre-computed lighting
US9081418B1 (en)* | 2013-03-11 | 2015-07-14 | Rawles Llc | Obtaining input from a virtual user interface
US20140267007A1 (en)* | 2013-03-15 | 2014-09-18 | Texas Instruments Incorporated | Interaction Detection Using Structured Light Images

Non-Patent Citations (22)

* Cited by examiner, † Cited by third party
Title
Barone et al., "Three-dimensional point cloud alignment detecting fiducial markers by structured light stereo imaging", Machine Vision and Applications, v. 23, pp. 217–229, 2012.*
Bimber et al., "Spatial Augmented Reality Merging Real and Virtual Worlds", 2005.*
Chan et al., "Enabling Beyond-Surface Interactions for Interactive Surface with An Invisible Projection", Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (UIST '10),pp. 263-272, October 2010.*
Chan et al., "MagicPad: the projection based 3D user interface", International Journal on Interactive Design and Manufacturing, v. 6, n. 2, pp. 75–81, 2012.*
Chou et al., "BlueSpace: Creating a Personalized and Context-Aware Workspace", IBM Research Report RC22281, 11 December 2001.*
DePiero et al., "3-D Computer Vision Using Structured Light: Design, Calibration and Implementation Issues", Advances in Computers(43), Academic Press, pp. 243-278, 1996.*
Fitriani et al., "Interacting with Projected Media on Deformable Surfaces", IEEE 11th International Conference on Computer Vision, October 2007.*
Follmer et al., "deForm: An Interactive Malleable Surface For Capturing 2.5D Arbitrary Objects, Tools and Touch", Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology (UIST '11), pp. 527-536, October 2011.*
Franzen et al., "Magic Wako -- User Interaction in a Projector-based Augmented Reality Game", Third International Conference on Creative Content Technologies, pp. 24-28, 2011.*
Gupta et al., "Active Pursuit Tracking in a Projector-Camera System with Application to Augmented Reality", Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), Sept. 2005.*
Ishii et al., "The Tangible User Interface and Its Evolution", Communications of the ACM, v. 51, n. 6, pp. 32-36, June 2008.*
Kato et al., "Marker Tracking and HMD Calibration for a Video-based Augmented Reality Conferencing System", Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR '99), pp. 85-94, October 1999.*
Kjeldsen et al., "Interacting with Steerable Projected Displays", Proc. of 5th International Conference on Automatic Face and Gesture Recognition (FG’02), May 2002.*
Ko et al., "Public Issues on Projected User Interface", CHI'10, pp. 2873-2881, April 2010.*
Lee et al., "Foldable Interactive Displays", Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology (UIST '08), pp.287-290, October 2008.*
Lee et al., "Hybrid Infrared and Visible Light Projection for Location Tracking", Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology (UIST '07), pp.57-60, October 2007*
Raskar et al., "The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays", SIGGRAPH '98, pp. 1-10, 1998.*
Seo et al., "Direct-Projected AR Based Interactive User Interface for Medical Surgery", 17th International Conference on Artificial Reality and Telexistence, pp. 105-112, 2007.*
Sukaviriya et al., "A Portable System for Anywhere Interactions", CHI'04, pp. 789-790, April 2004.*
Torres et al., "Augmented Reality Using Spatially Multiplexed Structured Light", 19th International Conference on Mechatronics and Machine Vision in Practice (M2VIP), pp. 385-390, 2012.*
Voida et al., "A Study on the Manipulation of 2D Objects in a Projector/Camera-Based Augmented Reality Environment", CHI'05, 2005.*
Willis et al., "HideOut: Mobile Projector Interaction with Tangible Objects and Surfaces", Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction (TEI '13), pp. 331-338, February 2013.*

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9857868B2 (en) | 2011-03-19 | 2018-01-02 | The Board Of Trustees Of The Leland Stanford Junior University | Method and system for ergonomic touch-free interface
US20160239154A1 (en)* | 2013-09-30 | 2016-08-18 | Hewlett-Packard Development Company, L.P. | Projection system manager
US10114512B2 (en)* | 2013-09-30 | 2018-10-30 | Hewlett-Packard Development Company, L.P. | Projection system manager
US10528192B2 (en) | 2014-10-03 | 2020-01-07 | Sony Corporation | Projection display unit
US10303306B2 (en)* | 2014-10-03 | 2019-05-28 | Sony Corporation | Projection display unit
US20170285875A1 (en)* | 2014-10-03 | 2017-10-05 | Sony Corporation | Projection display unit
CN107636567A (en)* | 2015-05-21 | 2018-01-26 | 奥迪股份公司 | Method for operating an operating device and operating device for a motor vehicle
WO2016184971A1 (en)* | 2015-05-21 | 2016-11-24 | Audi Ag | Method for operating an operating device, and operating device for a motor vehicle
WO2017093779A1 (en)* | 2015-12-01 | 2017-06-08 | Vinci Construction | Method and system for assisting installation of elements in a construction work
US10824312B2 (en) | 2015-12-01 | 2020-11-03 | Vinci Construction | Method and system for assisting installation of elements in a construction work
US10163198B2 (en) | 2016-02-26 | 2018-12-25 | Samsung Electronics Co., Ltd. | Portable image device for simulating interaction with electronic device
CN106775415A (en)* | 2016-12-28 | 2017-05-31 | 珠海市魅族科技有限公司 | Application program launching method and system
WO2019067799A1 (en)* | 2017-09-29 | 2019-04-04 | Sony Interactive Entertainment Inc. | Robot utility and interface device
US11219837B2 (en) | 2017-09-29 | 2022-01-11 | Sony Interactive Entertainment Inc. | Robot utility and interface device
US10849532B1 (en)* | 2017-12-08 | 2020-12-01 | Arizona Board Of Regents On Behalf Of Arizona State University | Computer-vision-based clinical assessment of upper extremity function
US10241588B1 (en) | 2018-01-31 | 2019-03-26 | Piccolo Labs Inc. | System for localizing devices in a room
US10296102B1 (en)* | 2018-01-31 | 2019-05-21 | Piccolo Labs Inc. | Gesture and motion recognition using skeleton tracking
US10678342B2 (en)* | 2018-10-21 | 2020-06-09 | XRSpace CO., LTD. | Method of virtual user interface interaction based on gesture recognition and related device
US20200125176A1 (en)* | 2018-10-21 | 2020-04-23 | XRSpace CO., LTD. | Method of Virtual User Interface Interaction Based on Gesture Recognition and Related Device
CN113797525A (en)* | 2020-12-23 | 2021-12-17 | 广州富港生活智能科技有限公司 | Novel game system
US20220253148A1 (en)* | 2021-02-05 | 2022-08-11 | Pepsico, Inc. | Devices, Systems, and Methods for Contactless Interfacing
CN112732094A (en)* | 2021-02-22 | 2021-04-30 | 福建中青手拉手教育科技有限公司 | Projection interaction method and device based on motion sensing
US20230181061A1 (en)* | 2021-12-15 | 2023-06-15 | Qr8 Health Inc. | Peg sensing apparatus and methods of use
CN116974369A (en)* | 2023-06-21 | 2023-10-31 | 广东工业大学 | Method, system, equipment and storage medium for operating medical image in operation

Similar Documents

Publication | Title
US20150089453A1 (en) | Systems and Methods for Interacting with a Projected User Interface
US10732725B2 | Method and apparatus of interactive display based on gesture recognition
US10019074B2 | Touchless input
US9507437B2 | Algorithms, software and an interaction system that support the operation of an on the fly mouse
US9454260B2 | System and method for enabling multi-display input
EP2898399B1 (en) | Display integrated camera array
US8290210B2 | Method and system for gesture recognition
US9600078B2 | Method and system enabling natural user interface gestures with an electronic system
US20150220150A1 | Virtual touch user interface system and methods
US20150089436A1 | Gesture Enabled Keyboard
TWI559174B | Gesture based manipulation of three-dimensional images
US20050249386A1 | Pointing device having fingerprint image recognition function, fingerprint image recognition and pointing method, and method for providing portable terminal service using thereof
US10528145B1 | Systems and methods involving gesture based user interaction, user interface and/or other features
Dai et al. | Touchscreen everywhere: On transferring a normal planar surface to a touch-sensitive display
US20120319945A1 | System and method for reporting data in a computer vision system
CN101673161A | Visual, operable and non-solid touch screen system
US20150009119A1 | Built-in design of camera system for imaging and gesture processing applications
US9811916B1 | Approaches for head tracking
US20180260033A1 | Input apparatus, input method, and program
US20180075294A1 | Determining a pointing vector for gestures performed before a depth camera
US9201519B2 | Three-dimensional pointing using one camera and three aligned lights
US9377866B1 | Depth-based position mapping
Liang et al. | Shadowtouch: Enabling free-form touch-based hand-to-surface interaction with wrist-mounted illuminant by shadow projection
US9678583B2 | 2D and 3D pointing device based on a passive lights detection operation method using one camera
US20140085264A1 | Optical touch panel system, optical sensing module, and operation method thereof

Legal Events

AS (Assignment)
Owner name: AQUIFI, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: DAL MUTTO, CARLO; RAFII, ABBAS; HUMMEL, BRITTA. REEL/FRAME: 033906/0576
Effective date: 2014-09-30

STCB (Information on status: application discontinuation)
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

