US20220382379A1 - Touch Free User Interface - Google Patents

Touch Free User Interface

Info

Publication number
US20220382379A1
US20220382379A1 (Application US17/659,349)
Authority
US
United States
Prior art keywords
location
pointing
user
space
pointing element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/659,349
Inventor
Itay Katz
Amnon Shenfeld
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eyesight Mobile Technologies Ltd
Original Assignee
Eyesight Mobile Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eyesight Mobile Technologies Ltd
Priority to US17/659,349 (US20220382379A1)
Publication of US20220382379A1
Legal status: Abandoned (current)

Abstract

The presently disclosed subject matter includes a method of recognizing an aimed point on a plane. Images captured by one or more image sensors are processed to obtain data indicative of the location of at least one pointing element in the viewing space and data indicative of the location of at least one predefined body part of the user in the viewing space; using the obtained data, an aimed point on the plane is identified. If it is determined that a predefined condition is met, a predefined command and/or message is executed.
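The aimed point described in the abstract can be computed by extending a viewing ray from a predefined body part (for example, the user's eye) through the pointing element (for example, a fingertip) until it intersects the display plane. The following sketch illustrates that geometry under simplifying assumptions (an axis-aligned plane at depth `plane_z`; the function name is hypothetical); it is not the patented implementation:

```python
def aimed_point_on_plane(body_part, pointing_element, plane_z):
    """Extend a viewing ray from a body-part location through a
    pointing-element location and intersect it with the display
    plane z = plane_z. Returns the (x, y, z) hit point, or None
    if the ray is parallel to the plane or points away from it."""
    bx, by, bz = body_part
    px, py, pz = pointing_element
    dz = pz - bz
    if abs(dz) < 1e-9:        # ray runs parallel to the plane
        return None
    t = (plane_z - bz) / dz
    if t < 0:                 # plane lies behind the user
        return None
    return (bx + t * (px - bx), by + t * (py - by), plane_z)

# Eye at the origin, fingertip 40 cm ahead and slightly up-right,
# display plane 100 cm in front of the user (units are arbitrary).
print(aimed_point_on_plane((0.0, 0.0, 0.0), (4.0, 2.0, 40.0), 100.0))
# → (10.0, 5.0, 100.0)
```

The same ray test degrades gracefully: a fingertip moving parallel to the screen, or pointing away from it, produces no aimed point.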

Claims (20)

1. A method of recognizing an aimed point or area in a space, the method comprising:
obtaining, using an image sensor, at least one image of the space;
processing the at least one image by at least one processor operatively connected to the image sensor and obtaining data indicative of a location of at least one pointing element in the space;
obtaining data indicative of a location of at least one predefined body part of a user in the space;
determining by the at least one processor, a location of the aimed point or area in the space, using a combination of:
the data indicative of the location of the at least one pointing element,
the data indicative of the location of the at least one predefined body part;
determining a change in a motion vector of the at least one pointing element during a pointing gesture, wherein the change in the motion vector relates to a change in a direction of the motion vector in at least one of a vertical axis or a horizontal axis;
determining a user selection of an icon or a graphical element, based on the change in the motion vector and the determined location of the aimed point or area; and
executing a predefined command or message associated with the selected icon or graphical element.
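Claim 1 triggers selection on a change in the direction of the pointing element's motion vector along a vertical or horizontal axis. A minimal sketch of such a detector, assuming a sampled 2-D path and a hypothetical jitter threshold `eps` (neither is specified by the claim):

```python
def motion_direction_changed(path, axis, eps=1.0):
    """Return True if consecutive displacements of a tracked pointing
    element flip sign along the given axis (0 = horizontal,
    1 = vertical), ignoring displacements smaller than eps."""
    prev_sign = None
    for a, b in zip(path, path[1:]):
        d = b[axis] - a[axis]
        if abs(d) < eps:          # treat tiny displacements as noise
            continue
        sign = d > 0
        if prev_sign is not None and sign != prev_sign:
            return True           # direction reversed: selection cue
        prev_sign = sign
    return False

# A fingertip that rises and then pulls back along the vertical axis:
print(motion_direction_changed([(0, 0), (0, 6), (0, 12), (0, 7)], axis=1))
# → True
```

A steady sweep with no reversal (for example `[(0, 0), (5, 0), (10, 0)]` on the horizontal axis) yields False, so ordinary pointing motion does not trigger a selection.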
2.-61. (canceled)
62. The method of claim 1, wherein the space comprises a three-dimensional space, and the data indicative of the location of the at least one pointing element includes x, y, and z coordinates.
63. The method of claim 1, further comprising detecting a change in a speed of the at least one pointing element along a motion path.
64. The method of claim 1, further comprising detecting a deceleration of the at least one pointing element along a motion path.
65. The method of claim 1, further comprising:
displaying at least one graphical element in the space; and
identifying, by the at least one processor, a given graphical element from the at least one displayed graphical element, using data indicative of the location of the aimed point or area in the space.
66. The method of claim 1, further comprising:
determining the location of the at least one pointing element using location features extracted from a motion path of the at least one pointing element.
67. The method of claim 66, wherein:
the features extracted from the motion path of the at least one pointing element include a plurality of selected position data components of the at least one pointing element, which comply with a predefined criterion, and
the method further comprises:
for each given position data component in the plurality of selected position data components, determining a respective viewing ray, the respective viewing ray extending from the location of the at least one predefined body part, through the location of the at least one pointing element, and intersecting a plane on which at least one graphical element is displayed, thus yielding a respective candidate plane, wherein the location of the at least one pointing element corresponds to the given position data component;
determining an overlapping area between the respective candidate planes; and
determining the aimed point or area using data indicative of the overlapping area.
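The candidate-plane procedure of claim 67 can be approximated as follows: each selected position sample yields a viewing-ray hit point on the display plane, each hit point is padded into a candidate area, and the aimed area is taken as their common overlap. The axis-aligned box representation, the fixed uncertainty radius, and all names below are assumptions for illustration, not the claimed method itself:

```python
def candidate_box(body_part, pointing_element, plane_z, radius):
    """Intersect the viewing ray (body part -> pointing element) with
    the plane z = plane_z and pad the hit point into an axis-aligned
    candidate box (xmin, ymin, xmax, ymax)."""
    bx, by, bz = body_part
    px, py, pz = pointing_element
    t = (plane_z - bz) / (pz - bz)
    x, y = bx + t * (px - bx), by + t * (py - by)
    return (x - radius, y - radius, x + radius, y + radius)

def overlapping_area(boxes):
    """Intersect all candidate boxes; the result approximates the
    aimed point or area, or None if the samples do not agree."""
    xmin = max(b[0] for b in boxes)
    ymin = max(b[1] for b in boxes)
    xmax = min(b[2] for b in boxes)
    ymax = min(b[3] for b in boxes)
    if xmin >= xmax or ymin >= ymax:
        return None
    return (xmin, ymin, xmax, ymax)

# Three fingertip samples pointing at roughly the same spot:
eye = (0.0, 0.0, 0.0)
tips = [(4.0, 2.0, 40.0), (4.5, 2.25, 40.0), (3.5, 1.75, 40.0)]
boxes = [candidate_box(eye, tip, 100.0, 2.0) for tip in tips]
print(overlapping_area(boxes))
# → (9.25, 3.625, 10.75, 6.375)
```

Shrinking the overlap as samples accumulate is one way to turn a noisy pointing gesture into a stable aimed area.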
68. The method of claim 1, wherein the image sensor is a proximity sensor.
69. The method of claim 1, further comprising determining a clicking gesture.
70. A non-transitory computer readable medium storing instructions which, when executed, cause at least one processor to perform operations for recognizing an aimed point or area in a space, the operations comprising:
obtaining, using an image sensor, at least one image of the space;
processing the at least one image by at least one processor operatively connected to the image sensor and obtaining data indicative of a location of at least one pointing element in the space;
obtaining data indicative of a location of at least one predefined body part of a user in the space;
determining by the at least one processor, a location of the aimed point or area in the space using a combination of:
the data indicative of the location of the at least one pointing element, and
the data indicative of the location of the at least one predefined body part;
detecting a change in a motion vector of the at least one pointing element during a pointing gesture, wherein the change in the motion vector relates to a change in a direction of the motion vector in at least one of a vertical axis or a horizontal axis;
determining a line of sight extending from a location of the at least one predefined body part of a user in the space;
determining, based on the determined line of sight, a user selection of an icon or a graphical element, while obtaining the data indicative of the location of the at least one pointing element in the space; and
executing a predefined command or message associated with the selected icon or graphical element.
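Claim 70 resolves the selection by relating a determined line of sight to a displayed icon or graphical element. One simple realization (purely illustrative; the icon table, the tolerance, and all names are invented for the example) is to map the on-plane aimed point to the nearest displayed icon within a tolerance:

```python
import math

def select_icon(aimed_xy, icons, tolerance):
    """Return the name of the icon whose on-plane position is closest
    to the aimed point, provided it lies within tolerance; otherwise
    None (no selection)."""
    best, best_dist = None, tolerance
    for name, (ix, iy) in icons.items():
        dist = math.hypot(aimed_xy[0] - ix, aimed_xy[1] - iy)
        if dist <= best_dist:
            best, best_dist = name, dist
    return best

# Hypothetical icon layout on the display plane:
icons = {"play": (10.0, 5.0), "stop": (30.0, 5.0)}
print(select_icon((11.0, 6.0), icons, tolerance=5.0))
# → play
```

Returning None when nothing falls inside the tolerance keeps stray glances from firing the predefined command.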
71. The computer readable medium of claim 70, wherein the space comprises a three-dimensional space, and the data indicative of the location of the at least one pointing element includes x, y, and z coordinates.
72. The computer readable medium of claim 70, the operations further comprising detecting a change in a speed of the at least one pointing element along a motion path.
73. The computer readable medium of claim 70, the operations further comprising detecting a deceleration of the at least one pointing element along a motion path.
74. The computer readable medium of claim 70, the operations further comprising:
displaying at least one graphical element in the space; and
identifying, by the at least one processor, a given graphical element from the at least one displayed graphical element, using data indicative of the location of the aimed point or area in the space.
75. The computer readable medium of claim 70, the operations further comprising:
determining the location of the at least one pointing element using location features extracted from a motion path of the at least one pointing element.
76. The computer readable medium of claim 75, wherein:
the features extracted from the motion path of the at least one pointing element further include a plurality of selected position data components of the at least one pointing element, which comply with a predefined criterion, and
the operations further comprise:
for each given position data component in the plurality of selected position data components, determining a respective viewing ray, the respective viewing ray extending from the location of the at least one predefined body part, through the location of the at least one pointing element, and intersecting a plane on which at least one graphical element is displayed, thus yielding a respective candidate plane, wherein the location of the at least one pointing element corresponds to the given position data component;
determining an overlapping area between the respective candidate planes; and
determining the aimed point or area using data indicative of the overlapping area.
77. The computer readable medium of claim 70, wherein the image sensor is a proximity sensor.
78. The computer readable medium of claim 70, the operations further comprising determining a clicking gesture.
79. A method of recognizing an aimed point or area in a space, the method comprising:
obtaining, using an image sensor, at least one image of the space;
processing the at least one image by at least one processor operatively connected to the image sensor and obtaining data indicative of a location of at least one pointing element in the space;
obtaining data indicative of a location of at least one predefined body part of a user in the space;
determining by the at least one processor, a location of the aimed point or area in the space using a combination of:
the data indicative of the location of the at least one pointing element,
the data indicative of the location of the at least one predefined body part;
determining a change in a motion vector of the at least one pointing element during a pointing gesture, wherein the change in the motion vector relates to a change in a direction of the motion vector in at least one of a vertical axis or a horizontal axis;
determining a user selection of an icon or a graphical element, based on the change in the motion vector and the determined location of the aimed point or area; and
executing a predefined command or message associated with the determined location of the aimed point or area in the space.
US17/659,349 | 2012-03-13 | 2022-04-15 | Touch Free User Interface | Abandoned | US20220382379A1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US17/659,349 (US20220382379A1) | 2012-03-13 | 2022-04-15 | Touch Free User Interface

Applications Claiming Priority (6)

Application Number | Priority Date | Filing Date | Title
US201261610116P | 2012-03-13 | 2012-03-13 |
PCT/IL2013/050230 (WO2013136333A1) | 2012-03-13 | 2013-03-12 | Touch free user interface
US14/130,359 (US9671869B2) | 2012-03-13 | 2013-03-12 | Systems and methods of direct pointing detection for interaction with a digital device
US15/583,958 (US10248218B2) | 2012-03-13 | 2017-05-01 | Systems and methods of direct pointing detection for interaction with a digital device
US16/371,565 (US11307666B2) | 2012-03-13 | 2019-04-01 | Systems and methods of direct pointing detection for interaction with a digital device
US17/659,349 (US20220382379A1) | 2012-03-13 | 2022-04-15 | Touch Free User Interface

Related Parent Applications (1)

Application Number | Priority Date | Filing Date | Title
US16/371,565 (Continuation, US11307666B2) | 2012-03-13 | 2019-04-01 | Systems and methods of direct pointing detection for interaction with a digital device

Publications (1)

Publication Number | Publication Date
US20220382379A1 (en) | 2022-12-01

Family

ID=49160334

Family Applications (4)

Application Number | Priority Date | Filing Date | Title
US14/130,359 (US9671869B2, Active) | 2012-03-13 | 2013-03-12 | Systems and methods of direct pointing detection for interaction with a digital device
US15/583,958 (US10248218B2, Active) | 2012-03-13 | 2017-05-01 | Systems and methods of direct pointing detection for interaction with a digital device
US16/371,565 (US11307666B2, Active) | 2012-03-13 | 2019-04-01 | Systems and methods of direct pointing detection for interaction with a digital device
US17/659,349 (US20220382379A1, Abandoned) | 2012-03-13 | 2022-04-15 | Touch Free User Interface

Family Applications Before (3)

Application Number | Priority Date | Filing Date | Title
US14/130,359 (US9671869B2, Active) | 2012-03-13 | 2013-03-12 | Systems and methods of direct pointing detection for interaction with a digital device
US15/583,958 (US10248218B2, Active) | 2012-03-13 | 2017-05-01 | Systems and methods of direct pointing detection for interaction with a digital device
US16/371,565 (US11307666B2, Active) | 2012-03-13 | 2019-04-01 | Systems and methods of direct pointing detection for interaction with a digital device

Country Status (3)

Country | Link
US (4) | US9671869B2 (en)
CN (2) | CN108469899B (en)
WO (1) | WO2013136333A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US12182333B2 (en) * | 2020-03-13 | 2024-12-31 | Interdigital Madison Patent Holdings, Sas | Display user interface method and system

Families Citing this family (61)

Publication number | Priority date | Publication date | Assignee | Title
US9414051B2 (en) | 2010-07-20 | 2016-08-09 | Memory Engine, Incorporated | Extensible authoring and playback platform for complex virtual reality interactions and immersive applications
KR20140069124A (en) | 2011-09-19 | 2014-06-09 | Eyesight Mobile Technologies Ltd | Touch free interface for augmented reality systems
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control
US8693731B2 (en) | 2012-01-17 | 2014-04-08 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging
US11493998B2 (en) | 2012-01-17 | 2022-11-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control
US8638989B2 (en) | 2012-01-17 | 2014-01-28 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space
US12260023B2 (en) | 2012-01-17 | 2025-03-25 | Ultrahaptics IP Two Limited | Systems and methods for machine control
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control
US9070019B2 (en) | 2012-01-17 | 2015-06-30 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space
US20150253428A1 (en) | 2013-03-15 | 2015-09-10 | Leap Motion, Inc. | Determining positional information for an object in space
US9501152B2 (en) | 2013-01-15 | 2016-11-22 | Leap Motion, Inc. | Free-space user interface and control using virtual constructs
CN108469899B (en) * | 2012-03-13 | 2021-08-10 | Eyesight Mobile Technologies Ltd | Method of identifying an aiming point or area in a viewing space of a wearable display device
US8938124B2 (en) | 2012-05-10 | 2015-01-20 | Pointgrab Ltd. | Computer vision based tracking of a hand
US9285893B2 (en) | 2012-11-08 | 2016-03-15 | Leap Motion, Inc. | Object detection and tracking with variable-field illumination devices
US10609285B2 (en) | 2013-01-07 | 2020-03-31 | Ultrahaptics IP Two Limited | Power consumption in motion-capture systems
US9465461B2 (en) | 2013-01-08 | 2016-10-11 | Leap Motion, Inc. | Object detection and tracking with audio and optical signals
US9459697B2 (en) | 2013-01-15 | 2016-10-04 | Leap Motion, Inc. | Dynamic, free-space user interactions for machine control
US9734582B2 (en) * | 2013-02-21 | 2017-08-15 | Lg Electronics Inc. | Remote pointing method
US9122916B2 (en) * | 2013-03-14 | 2015-09-01 | Honda Motor Co., Ltd. | Three dimensional fingertip tracking
US9916009B2 (en) | 2013-04-26 | 2018-03-13 | Leap Motion, Inc. | Non-tactile interface systems and methods
US10620775B2 (en) | 2013-05-17 | 2020-04-14 | Ultrahaptics IP Two Limited | Dynamic interactive objects
US9436288B2 (en) | 2013-05-17 | 2016-09-06 | Leap Motion, Inc. | Cursor mode switching
US10281987B1 (en) | 2013-08-09 | 2019-05-07 | Leap Motion, Inc. | Systems and methods of free-space gestural interaction
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication
US9632572B2 (en) | 2013-10-03 | 2017-04-25 | Leap Motion, Inc. | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US9996638B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Predictive information for free space gesture control and communication
US9622322B2 (en) | 2013-12-23 | 2017-04-11 | Sharp Laboratories Of America, Inc. | Task light based system and gesture control
US9613262B2 (en) | 2014-01-15 | 2017-04-04 | Leap Motion, Inc. | Object detection and tracking for providing a virtual device experience
US9785247B1 (en) | 2014-05-14 | 2017-10-10 | Leap Motion, Inc. | Systems and methods of tracking moving hands and recognizing gestural interactions
KR102265143B1 (en) * | 2014-05-16 | 2021-06-15 | Samsung Electronics Co., Ltd. | Apparatus and method for processing input
US9741169B1 (en) | 2014-05-20 | 2017-08-22 | Leap Motion, Inc. | Wearable augmented reality devices with object detection and tracking
CN204480228U (en) | 2014-08-08 | 2015-07-15 | Leap Motion, Inc. | Motion sensing and imaging device
JP6520918B2 (en) * | 2014-12-26 | 2019-05-29 | Nikon Corporation | Control device, electronic device, control method and program
WO2016103520A1 (en) * | 2014-12-26 | 2016-06-30 | Nikon Corporation | Detection device, electronic instrument, detection method, and program
US10656720B1 (en) | 2015-01-16 | 2020-05-19 | Ultrahaptics IP Two Limited | Mode switching for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
PL411339A1 (en) * | 2015-02-23 | 2016-08-29 | Samsung Electronics Polska Spółka Z Ograniczoną Odpowiedzialnością | Method for controlling a device by means of gestures and the system for controlling a device by means of gestures
US9791917B2 (en) * | 2015-03-24 | 2017-10-17 | Intel Corporation | Augmentation modification based on user interaction with augmented reality scene
CN104717365B (en) * | 2015-03-30 | 2017-05-10 | BOE Technology Group Co., Ltd. | Brilliance regulating system and method and mobile terminal
CN107787497B (en) * | 2015-06-10 | 2021-06-22 | VTouch Co., Ltd. | Method and apparatus for detecting gestures in a user-based spatial coordinate system
CN105159539B (en) * | 2015-09-10 | 2018-06-01 | BOE Technology Group Co., Ltd. | Touch-control response method, device and the wearable device of wearable device
CN105843054A (en) | 2016-03-22 | 2016-08-10 | Midea Group Co., Ltd. | Method for controlling household devices, intelligent household system and mobile device
CN108008811A (en) | 2016-10-27 | 2018-05-08 | ZTE Corporation | A kind of method and terminal using non-touch screen mode operating terminal
CN106682468A (en) | 2016-12-30 | 2017-05-17 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method of unlocking electronic device and electronic device
EP3678126A4 (en) | 2017-08-31 | 2020-08-26 | Sony Corporation | Information processing device, information processing method, and program
KR102511522B1 (en) * | 2017-10-18 | 2023-03-17 | Samsung Electronics Co., Ltd. | Data learning server, method for generating and using thereof
US11568273B2 (en) | 2017-11-14 | 2023-01-31 | International Business Machines Corporation | Multi-dimensional cognition for unified cognition in cognitive assistance
US11544576B2 (en) | 2017-11-14 | 2023-01-03 | International Business Machines Corporation | Unified cognition for a virtual personal cognitive assistant of an entity when consuming multiple, distinct domains at different points in time
US11443196B2 (en) * | 2017-11-14 | 2022-09-13 | International Business Machines Corporation | Unified cognition for a virtual personal cognitive assistant when cognition is embodied across multiple embodied cognition object instances
JP2019152984A (en) * | 2018-03-01 | 2019-09-12 | Fuji Xerox Co., Ltd. | Information processing device and program
IT201800003723A1 (en) * | 2018-03-19 | 2019-09-19 | Candy Spa | Home appliance with user interface
TWI734024B (en) | 2018-08-28 | 2021-07-21 | Industrial Technology Research Institute | Direction determination system and direction determination method
LU100922B1 (en) * | 2018-09-10 | 2020-03-10 | Hella Saturnus Slovenija D O O | A system and a method for entertaining players outside of a vehicle
US20220031158A1 (en) * | 2018-09-26 | 2022-02-03 | Essilor International | Method for determining at least one geometrico-morphological parameter of a subject
JP2020052681A (en) * | 2018-09-26 | 2020-04-02 | Schneider Electric Holdings Ltd. | Operation processing device
US20200125175A1 (en) * | 2018-10-17 | 2020-04-23 | WiSilica Inc. | System using location, video-processing, and voice as user interface for controlling devices
US10894198B1 (en) * | 2019-10-01 | 2021-01-19 | Strikezone Technologies, LLC | Systems and methods for dynamic and accurate pitch detection
US11195259B2 (en) * | 2019-12-04 | 2021-12-07 | Samsung Electronics Co., Ltd. | Apparatus and method for dynamic multi-camera rectification using depth camera
CN112213850A (en) * | 2020-08-03 | 2021-01-12 | 深圳市莫廷影像技术有限公司 | Digital microscopic intelligent switching system and control method thereof, and slit lamp microscope
CN112835484B (en) * | 2021-02-02 | 2022-11-08 | Beijing Horizon Robotics Technology R&D Co., Ltd. | Dynamic display method and device based on operation body, storage medium and electronic equipment
US11792506B2 (en) * | 2022-02-09 | 2023-10-17 | Motorola Mobility Llc | Electronic devices and corresponding methods for defining an image orientation of captured images
US20250076969A1 (en) * | 2023-08-30 | 2025-03-06 | Samsung Electronics Co., Ltd. | Dynamically-adaptive planar transformations for video see-through (VST) extended reality (XR)

Citations (1)

Publication number | Priority date | Publication date | Assignee | Title
US11307666B2 (en) * | 2012-03-13 | 2022-04-19 | Eyesight Mobile Technologies Ltd. | Systems and methods of direct pointing detection for interaction with a digital device

Family Cites Families (32)

Publication number | Priority date | Publication date | Assignee | Title
US6176782B1 (en) * | 1997-12-22 | 2001-01-23 | Philips Electronics North America Corp. | Motion-based command generation technology
JP3795647B2 (en) * | 1997-10-29 | 2006-07-12 | Takenaka Corporation | Hand pointing device
US6147678A (en) * | 1998-12-09 | 2000-11-14 | Lucent Technologies Inc. | Video hand image-three-dimensional computer interface with multiple degrees of freedom
US6531999B1 (en) * | 2000-07-13 | 2003-03-11 | Koninklijke Philips Electronics N.V. | Pointing direction calibration in video conferencing and other camera-based system applications
US6600475B2 (en) * | 2001-01-22 | 2003-07-29 | Koninklijke Philips Electronics N.V. | Single camera system for gesture-based input and target indication
US7893920B2 (en) * | 2004-05-06 | 2011-02-22 | Alpine Electronics, Inc. | Operation input device and method of operation input
CN100432897C (en) * | 2006-07-28 | 2008-11-12 | Shanghai University | System and method of contactless position input by hand and eye relation guiding
EP1950957A2 (en) * | 2007-01-23 | 2008-07-30 | Funai Electric Co., Ltd. | Image display system
US8073198B2 (en) * | 2007-10-26 | 2011-12-06 | Samsung Electronics Co., Ltd. | System and method for selection of an object of interest during physical browsing by finger framing
US8149210B2 (en) * | 2007-12-31 | 2012-04-03 | Microsoft International Holdings B.V. | Pointing device and method
DE102008010990A1 (en) * | 2008-02-25 | 2009-09-03 | Siemens Aktiengesellschaft | Device comprising an object that can be moved in space, to be observed, in particular medical examination or treatment device with a display device that can be moved in space
US7991896B2 (en) * | 2008-04-21 | 2011-08-02 | Microsoft Corporation | Gesturing to select and configure device communication
US20100088637A1 (en) * | 2008-10-07 | 2010-04-08 | Himax Media Solutions, Inc. | Display Control Device and Display Control Method
JP2011028366A (en) * | 2009-07-22 | 2011-02-10 | Sony Corp | Operation control device and operation control method
US8291322B2 (en) * | 2009-09-30 | 2012-10-16 | United Video Properties, Inc. | Systems and methods for navigating a three-dimensional media guidance application
US20110137727A1 (en) * | 2009-12-07 | 2011-06-09 | Rovi Technologies Corporation | Systems and methods for determining proximity of media objects in a 3d media environment
US8659658B2 (en) * | 2010-02-09 | 2014-02-25 | Microsoft Corporation | Physical interaction zone for gesture-based user interfaces
US8522308B2 (en) * | 2010-02-11 | 2013-08-27 | Verizon Patent And Licensing Inc. | Systems and methods for providing a spatial-input-based multi-user shared display experience
KR101334107B1 (en) * | 2010-04-22 | 2013-12-16 | Good Software Lab Co., Ltd. | Apparatus and Method of User Interface for Manipulating Multimedia Contents in Vehicle
US9715275B2 (en) | 2010-04-26 | 2017-07-25 | Nokia Technologies Oy | Apparatus, method, computer program and user interface
US8593402B2 (en) * | 2010-04-30 | 2013-11-26 | Verizon Patent And Licensing Inc. | Spatial-input-based cursor projection systems and methods
US20110304649A1 (en) * | 2010-06-10 | 2011-12-15 | Microsoft Corporation | Character selection
EP2455841A3 (en) * | 2010-11-22 | 2015-07-15 | Samsung Electronics Co., Ltd. | Apparatus and method for selecting item using movement of object
US20130154913A1 (en) * | 2010-12-16 | 2013-06-20 | Siemens Corporation | Systems and methods for a gaze and gesture interface
JP2012141930A (en) * | 2011-01-06 | 2012-07-26 | Sony Corp | Information processor, information processing system and information processing method
CN201926822U (en) * | 2011-01-18 | 2011-08-10 | 上海麦启数码科技有限公司 | 3D (three dimensional) glasses system capable of identifying spatial location
EP2672880B1 (en) * | 2011-02-09 | 2019-05-22 | Apple Inc. | Gaze detection in a 3d mapping environment
KR101151962B1 (en) * | 2011-02-16 | 2012-06-01 | Kim Seok-jung | Virtual touch apparatus and method without pointer on the screen
US9104239B2 (en) * | 2011-03-09 | 2015-08-11 | Lg Electronics Inc. | Display device and method for controlling gesture functions using different depth ranges
US9218063B2 (en) * | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface
US8947351B1 (en) * | 2011-09-27 | 2015-02-03 | Amazon Technologies, Inc. | Point of view determinations for finger tracking
KR20220136517A (en) * | 2013-06-27 | 2022-10-07 | Eyesight Mobile Technologies Ltd | Systems and methods of direct pointing detection for interaction with a digital device


Also Published As

Publication number | Publication date
CN104471511B (en) | 2018-04-20
US20140375547A1 (en) | 2014-12-25
US20190324552A1 (en) | 2019-10-24
CN108469899A (en) | 2018-08-31
US9671869B2 (en) | 2017-06-06
US11307666B2 (en) | 2022-04-19
CN108469899B (en) | 2021-08-10
US10248218B2 (en) | 2019-04-02
CN104471511A (en) | 2015-03-25
US20170235376A1 (en) | 2017-08-17
WO2013136333A1 (en) | 2013-09-19

Similar Documents

Publication | Title
US20220382379A1 (en) | Touch Free User Interface
US11314335B2 (en) | Systems and methods of direct pointing detection for interaction with a digital device
US20210096651A1 (en) | Vehicle systems and methods for interaction detection
US20190250714A1 (en) | Systems and methods for triggering actions based on touch-free gesture detection
US9600078B2 (en) | Method and system enabling natural user interface gestures with an electronic system
US9619105B1 (en) | Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US20180292907A1 (en) | Gesture control system and method for smart home
KR101890459B1 (en) | Method and system for responding to user's selection gesture of object displayed in three dimensions
US20140240225A1 (en) | Method for touchless control of a device
US20130343607A1 (en) | Method for touchless control of a device
MX2009000305A (en) | Virtual controller for visual displays
Haubner et al. | Recognition of dynamic hand gestures with time-of-flight cameras
Zhang et al. | Free-hand gesture control with "touchable" virtual interface for human-3DTV interaction

Legal Events

Code | Description
STPP | Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB | Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

