
Methods circuits apparatus and systems for human machine interfacing with an electronic appliance

Info

Publication number
WO2011033519A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensors
finger
limb
elements
implement
Prior art date
Application number
PCT/IL2010/000791
Other languages
French (fr)
Inventor
Dor Givon
Original Assignee
Extreme Reality Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Extreme Reality Ltd.
Priority to JP2012529401A (published as JP2013505493A)
Priority to EP10816800.6A (published as EP2480951A4)
Priority to US13/497,061 (published as US9218126B2)
Priority to CA2774867A (published as CA2774867A1)
Publication of WO2011033519A1
Priority to IL218765A (published as IL218765A0)
Priority to US13/468,282 (published as US8878779B2)
Priority to US13/737,345 (published as US9046962B2)


Abstract

Disclosed are methods, circuits, apparatus and systems for human machine interfacing with a computational platform or any other electronic device such as a cell-phone, smart-phone, e-book, notebook computer, tablet computer, etc. According to some embodiments, there may be provided an adaptive touch-screen input arrangement such as a keyboard, keypad or any other touch-screen input arrangement including one or more input elements such as rendered or projected keys or buttons which may be projected onto or rendered on a touch screen display. The adaptive touch-screen input arrangement may be adapted to alter the size, shape or location of input elements within proximity of a finger, limb or implement used by a user to touch the screen.

Description

METHODS CIRCUITS APPARATUS AND SYSTEMS FOR HUMAN
MACHINE INTERFACING WITH AN ELECTRONIC APPLIANCE
FIELD OF THE INVENTION
[001] The present invention generally relates to the field of electronics. More specifically, the present invention relates to methods, circuits, apparatus and systems for facilitating human interface with electronic devices such as mobile devices, cell phones, Personal Digital Assistants ("PDA"), digital cameras, or any integrated combination of electronic devices.
BACKGROUND
[002] In recent decades, electronic technology, including communication technology, has revolutionized our everyday lives. Electronic devices such as PDAs, cell phones, e-books, notebook computers, mobile media players and digital cameras have permeated the lives of almost every person living in the developed world - and quite a number of people living in undeveloped countries. Mobile communication and computing devices, especially, have become the means by which countless millions conduct their personal and professional interactions with the world. It has become almost impossible for many people, especially those in the business world who use these devices as a means to improve productivity, to function without access to their electronic devices.
[003] However, with this tremendous proliferation in the use of electronic devices, a tradeoff has developed between enhanced productivity and simplicity or convenience. As handheld devices have evolved to perform more and more tasks, the complexity of the interfaces required to interact with these devices has likewise increased. Many of today's handheld devices come equipped with some variation of a full typewriter keyboard. Some devices have fixed keyboards which are electromechanical in nature, while others project a keyboard, a keypad or some variation of either onto a display associated with a touch screen sensor array. Because of the need to keep mobile or handheld devices compact enough to carry around, many of the physical and virtual (i.e. projected) keyboards and keypads implemented on these devices have keys or other interface components which are quite small relative to an average human finger, and thus difficult to operate.
[004] Thus, there is a need for improved methods, circuits, apparatus and systems for interfacing with an electronic device.
SUMMARY OF THE INVENTION
[005] According to embodiments of the present invention, there are provided methods, circuits, apparatus and systems for human machine interfacing with a computational platform or any other electronic device such as a cell-phone, smart-phone, e-book, notebook computer, tablet computer, etc. According to some embodiments of the present invention, there may be provided an adaptive touch-screen input arrangement such as a keyboard, keypad or any other touch-screen input arrangement including one or more input elements such as rendered or projected keys or buttons which may be projected onto or rendered on a touch screen display. The adaptive touch-screen input arrangement may be adapted to alter the size, shape or location of input elements within proximity of a finger, limb or implement used by a user to touch the screen.
[006] According to some embodiments of the present invention, one or more sensors such as: (1) image sensors, (2) image sensor arrays, (3) electrostatic sensors, (4) capacitive sensors, or (5) any other functionally suited sensor may sense a location and/or motion vector of a finger, limb or implement approaching the touch screen. The sensor(s) may provide to the adaptive touch-screen input arrangement an indication of the sensed position or motion vector of the finger/limb/implement relative to the input elements or keys - thereby indicating which input elements or keys are being approached. In response to the indication, the touch-screen input arrangement may alter the size, shape or location of input elements within proximity of the sensed finger, limb or implement in order to make them more prominent (e.g. larger or in a better location) and more easily engageable.
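The adaptation described in paragraph [006] can be illustrated in a few lines of code. The sketch below is purely illustrative and is not part of the disclosure: the Key class, the nearest_key and enlarge_near helpers, and the 1.5x scale factor are all hypothetical choices; a real implementation would run inside the device's display controller.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Key:
    """A rendered input element: a label plus its on-screen rectangle."""
    label: str
    x: float
    y: float
    w: float
    h: float

def center(key):
    return (key.x + key.w / 2, key.y + key.h / 2)

def nearest_key(keys, finger_xy):
    """Pick the key whose center is closest to the sensed finger position."""
    fx, fy = finger_xy
    return min(keys, key=lambda k: (center(k)[0] - fx) ** 2 + (center(k)[1] - fy) ** 2)

def enlarge_near(keys, finger_xy, scale=1.5):
    """Return a new layout in which the approached key is made more prominent."""
    target = nearest_key(keys, finger_xy)
    out = []
    for k in keys:
        if k is target:
            dw, dh = k.w * (scale - 1), k.h * (scale - 1)
            # Grow the approached key around its own center, leaving the others as-is.
            out.append(replace(k, x=k.x - dw / 2, y=k.y - dh / 2,
                               w=k.w * scale, h=k.h * scale))
        else:
            out.append(k)
    return out
```

Under these assumptions, a finger sensed near the right-hand key enlarges only that key, while the rest of the layout is unchanged.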
[007] According to further embodiments of the present invention, there may be provided a human interface surface (e.g. touch screen display) comprising presentation and sensing elements. The presentation elements and the sensing elements may be integrated into a single substrate material or may be part of separate substrates which are mechanically attached to one another in an overlapping manner. According to further embodiments of the present invention, there may be provided a controller (e.g. display drive circuit) adapted to send one or more presentation signals to the presentation elements of the human interface surface based at least partially on data stored in a presentation configuration table (e.g. virtual keyboard layout including location and size of keys) and based on a current state of the device. The current state of the device may be determined based on one or more signals received from the sensing elements and/or based on one or more signals received from the device.
[008] According to further embodiments of the present invention, the controller may associate a function or device command signal with each of one or more signals received from the sensing elements (e.g. when the sensing element is touched), wherein the association of a command or function may be at least partially based on data from a first data set in the sensing element configuration table. The data selected from the sensing element configuration table may be correlated to data from the presentation configuration table used by the controller to send one or more signals to the presentation elements.
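A minimal sketch of the two-table scheme of paragraphs [007] and [008] follows. It is an assumed illustration, not the disclosed implementation: the state names, table contents, and the Controller class are hypothetical, and rectangles are taken to be (x, y, width, height) in screen coordinates.

```python
# Presentation configuration table: where each virtual element is drawn
# for each device state (hypothetical states and layouts).
PRESENTATION_CONFIG = {
    "text_entry":   {"A": (0, 0, 40, 40), "B": (40, 0, 40, 40)},
    "number_entry": {"1": (0, 0, 60, 60), "2": (60, 0, 60, 60)},
}

# Sensing element configuration table: the command each element triggers,
# correlated with the presentation table by element name.
SENSING_CONFIG = {
    "text_entry":   {"A": "TYPE_A", "B": "TYPE_B"},
    "number_entry": {"1": "TYPE_1", "2": "TYPE_2"},
}

class Controller:
    def __init__(self, state="text_entry"):
        self.state = state  # current state of the device

    def presentation_signals(self):
        """Signals sent to the presentation elements: one rectangle per element."""
        return PRESENTATION_CONFIG[self.state]

    def command_for_touch(self, xy):
        """Map a touch at (x, y) to a device command via the two tables."""
        x, y = xy
        for name, (kx, ky, kw, kh) in PRESENTATION_CONFIG[self.state].items():
            if kx <= x < kx + kw and ky <= y < ky + kh:
                return SENSING_CONFIG[self.state][name]
        return None  # touch fell outside every input element
```

Because both tables are keyed by element name, changing the layout for a given state automatically keeps touch decoding consistent with what is displayed.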
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings in which:
[0010] FIG. 1 shows a block diagram of an exemplary mobile device according to some embodiments of the present invention, including an interface surface and various electrical functional blocks to drive and interact with the interface surface.
[0011] It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
DETAILED DESCRIPTION
[0012] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention.
[0013] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "calculating", "determining", or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
[0014] Embodiments of the present invention may include apparatuses for performing the operations herein. This apparatus may be specially constructed for the desired purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions, and capable of being coupled to a computer system bus.
[0015] The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the desired method. The desired structure for a variety of these systems will appear from the description below. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
[0016] According to embodiments, there may be provided an interface apparatus for an electronic device including an adaptive touch-screen input arrangement adapted to alter a size, position or shape of an input element based on a signal or an indication from a touchless sensor regarding a position or motion vector of a finger, limb or implement. The adaptive touch-screen input arrangement may include a display functionally associated with a graphics processing circuit adapted to render one or more input elements and to project the one or more elements on the display. The apparatus may include a touchless sensor adapted to sense a position or motion vector of a finger, limb or implement in proximity with said display. A signal derived from an output of the touchless sensor may be provided to the graphics processing circuit and may cause the graphics processing circuit to alter a feature of one or more projected interface elements - for example, the size of an input element (e.g. a keyboard key projected on the display and its associated touch-screen sensor area) in proximity with a position of a finger, limb or implement may be enlarged. The touchless sensor may be selected from a group of sensors consisting of (1) proximity sensors, (2) image sensors, (3) image sensor arrays, (4) electrostatic sensors, and (5) capacitive sensors. The interface apparatus may be part of a computing device, communication device or any other electronic device known today or to be developed in the future.
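Paragraph [0016] does not specify how a motion vector is turned into an indication of which element is being approached. One plausible approach - an assumption for illustration, not taken from the disclosure - is to extrapolate the sensed 3D motion vector to the plane of the screen:

```python
def predicted_contact(pos, velocity, screen_z=0.0):
    """Extrapolate where an approaching finger will meet the screen plane.

    pos      = (x, y, z): sensed finger position, z = height above the screen
    velocity = (vx, vy, vz): sensed motion vector (vz < 0 means approaching)
    Returns the (x, y) point the finger is heading toward, or None if it
    is not approaching the screen.
    """
    x, y, z = pos
    vx, vy, vz = velocity
    if vz >= 0 or z <= screen_z:
        return None  # moving away from (or already on) the surface
    t = (screen_z - z) / vz  # time until the finger reaches the surface
    return (x + vx * t, y + vy * t)
```

The predicted contact point can then be fed to the graphics processing circuit to decide which rendered element to enlarge before the finger actually touches down.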
[0017] Turning now to FIG. 1, there is shown a block diagram of an exemplary mobile device according to some embodiments of the present invention, including an interface surface and various electrical functional blocks to drive and interact with the interface surface or touch-screen assembly. The exemplary device may include a controller 100 adapted to regulate signals to a presentation element driver 300, which presentation element driver 300 may be functionally associated with presentation elements (e.g. Light Emitting Diodes, LCD, etc.) of an interface surface 10. The controller may also be adapted to receive signals from a touch sensing element decoder 400, which decoder 400 is functionally associated with touch sensing elements (e.g. touch sensors, etc.) of the interface surface. The controller may also be adapted to receive finger/limb/implement location or motion indications or information from a touchless sensor 600.
[0018] It should be understood that the controller 100 may be a processor, a graphics processor, dedicated control logic or any combination thereof.
[0019] A configuration database 200 may include information used by the controller 100 in regulating signal flow to the presentation element driver. As shown in FIG. 1, the configuration database 200 may include such information as interface element (e.g. buttons or display area) shape and location, display properties, etc., for each of the applications 500A through 500N installed on the device. It should be understood by one of ordinary skill in the art that interface elements such as the buttons and displays mentioned above are not physical buttons or displays, but rather virtual elements projected through the interface surface 10. For each given application 500A through 500N, the configuration database 200 may also include sensing element mapping information correlating presentation information/elements associated with the given application to specific functions. The controller 100 may use the mapping information to determine which interface element is interacted with (when the screen is touched) by the user and which function/command that interaction is meant to trigger.
[0020] The controller may be adapted to alter the size, shape, location or any other feature of any element projected/rendered by the display elements based on a signal or indication provided by the touchless sensor 600 regarding finger/limb/implement location or motion relative to the sensing surface or any of the elements projected onto the sensing surface. The controller may make an input element towards which the finger/limb/implement is approaching more prominent. The controller may also adjust its touch-sensor element to function mapping to correlate with the adjusted projected/displayed input element size, shape or location.
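Paragraph [0020] requires the touch-sensor mapping to track any adjustment of the displayed element. A minimal sketch of that synchronization - with an assumed rectangle representation and a hypothetical resize_element helper, not the disclosed design - might look like this:

```python
def resize_element(layout, hitmap, name, scale):
    """Scale one input element and keep its touch hit-area in sync.

    layout: element name -> (x, y, w, h) rectangle used for rendering
    hitmap: element name -> (x, y, w, h) rectangle used by the touch decoder
    Both rectangles are updated together, so a touch anywhere on the
    enlarged element still resolves to the same function.
    """
    x, y, w, h = layout[name]
    nw, nh = w * scale, h * scale
    # Grow (or shrink) the element around its center.
    nx, ny = x - (nw - w) / 2, y - (nh - h) / 2
    layout[name] = (nx, ny, nw, nh)
    hitmap[name] = (nx, ny, nw, nh)  # hit area tracks the rendered shape
```

Updating both dictionaries in one step avoids the failure mode in which the display shows an enlarged key but the decoder still tests touches against the old, smaller rectangle.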
[0021] According to some embodiments of the present invention, the touchless sensor 600 may be adapted to determine the position and/or motion of a finger/limb/implement relative to the touch-screen or relative to elements projected/rendered/displayed thereon. The touchless sensor may be part of an image based human machine interface. The image based human machine interface may include one or more image sensors and software running on the controller, on another general purpose processor or on a dedicated processor. According to further embodiments of the present invention, the sensor 600 may be part of an electrostatic sensing arrangement. It should be understood by one of skill in the art that any functionally correlated (i.e. able to serve the same function) touchless sensor may be used as part of some embodiments of the present invention.
[0022] While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

Claims

CLAIMS
What is claimed:
1. An interface apparatus for an electronic device, comprising: an adaptive touch-screen input arrangement adapted to alter a size, position or shape of an input element based on an indication from a touchless sensor regarding a position or motion vector of a finger, limb or implement.
2. The apparatus according to claim 1, wherein said adaptive touch-screen input arrangement includes a display functionally associated with a graphics processing circuit adapted to render one or more input elements and to project the one or more elements on said display.
3. The apparatus according to claim 2, further comprising a touchless sensor adapted to sense a position or motion vector of a finger, limb or implement in proximity with said display.
4. The apparatus according to claim 3, wherein a signal derived from an output of said touchless sensor is provided to said graphics processing circuit.
5. The apparatus according to claim 3, wherein said touchless sensor is selected from the group of sensors consisting of (1) proximity sensors, (2) image sensors, (3) image sensor arrays, (4) electrostatic sensors, and (5) capacitive sensors.
6. The apparatus according to claim 1, wherein the size of an input element in proximity with a position of a finger, limb or implement is enlarged.
7. An electronic device, comprising:
a processor; and
an adaptive touch-screen input arrangement adapted to alter a size, position or shape of an input element based on an indication from a touchless sensor regarding a position or motion vector of a finger, limb or implement.
8. The device according to claim 7, wherein said adaptive touch-screen input arrangement includes a display functionally associated with a graphics processing circuit adapted to render one or more input elements and to project the one or more elements on said display.
9. The device according to claim 8, further comprising a touchless sensor adapted to sense a position or motion vector of a finger, limb or implement in proximity with said display.
10. The device according to claim 9, wherein a signal derived from an output of said touchless sensor is provided to said graphics processing circuit.
11. The device according to claim 9, wherein said touchless sensor is selected from the group of sensors consisting of (1) proximity sensors, (2) image sensors, (3) image sensor arrays, (4) electrostatic sensors, and (5) capacitive sensors.
12. The device according to claim 7, wherein the size of an input element in proximity with a position of a finger, limb or implement is enlarged.
PCT/IL2010/000791 | Priority date: 2005-10-31 | Filing date: 2010-09-21 | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance | WO2011033519A1 (en)

Priority Applications (7)

Application Number | Publication | Priority Date | Filing Date | Title
JP2012529401A | JP2013505493A (en) | 2009-09-21 | 2010-09-21 | Method, circuit, apparatus and system for human-machine interfacing with electronic equipment
EP10816800.6A | EP2480951A4 (en) | 2009-09-21 | 2010-09-21 | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US13/497,061 | US9218126B2 (en) | 2009-09-21 | 2010-09-21 | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
CA2774867A | CA2774867A1 (en) | 2009-09-21 | 2010-09-21 | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
IL218765A | IL218765A0 (en) | 2009-09-21 | 2012-03-20 | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US13/468,282 | US8878779B2 (en) | 2009-09-21 | 2012-05-10 | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
US13/737,345 | US9046962B2 (en) | 2005-10-31 | 2013-01-09 | Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date
US24413609P | 2009-09-21 | 2009-09-21
US 61/244,136 | 2009-09-21 |

Related Parent Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US13/734,987 | Continuation-In-Part | US9131220B2 (en) | 2005-10-31 | 2013-01-06 | Apparatus method and system for imaging

Related Child Applications (3)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US13/497,061 | A-371-Of-International | US9218126B2 (en) | 2009-09-21 | 2010-09-21 | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US13/468,282 | Continuation-In-Part | US8878779B2 (en) | 2009-09-21 | 2012-05-10 | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
US13/737,345 | Continuation-In-Part | US9046962B2 (en) | 2005-10-31 | 2013-01-09 | Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region

Publications (1)

Publication Number | Publication Date
WO2011033519A1 (en) | 2011-03-24

Family

ID=43758175

Family Applications (1)

Application Number | Publication | Priority Date | Filing Date | Title
PCT/IL2010/000791 | WO2011033519A1 (en) | 2005-10-31 | 2010-09-21 | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance

Country Status (7)

Country | Document
US (1) | US9218126B2 (en)
EP (1) | EP2480951A4 (en)
JP (1) | JP2013505493A (en)
KR (1) | KR101577106B1 (en)
CA (1) | CA2774867A1 (en)
IL (1) | IL218765A0 (en)
WO (1) | WO2011033519A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20130097550A1 (en)* | 2011-10-14 | 2013-04-18 | Tovi Grossman | Enhanced target selection for a touch-based input enabled user interface
US8462199B2 (en) | 2005-10-31 | 2013-06-11 | Extreme Reality Ltd. | Apparatus method and system for imaging
US8548258B2 (en) | 2008-10-24 | 2013-10-01 | Extreme Reality Ltd. | Method system and associated modules and software components for providing image sensor based human machine interfacing
US8615108B1 (en) | 2013-01-30 | 2013-12-24 | Imimtek, Inc. | Systems and methods for initializing motion tracking of human hands
US8655021B2 (en) | 2012-06-25 | 2014-02-18 | Imimtek, Inc. | Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US8830312B2 (en) | 2012-06-25 | 2014-09-09 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching within bounded regions
US8872899B2 (en) | 2004-07-30 | 2014-10-28 | Extreme Reality Ltd. | Method circuit and system for human to machine interfacing by hand gestures
US8878779B2 (en) | 2009-09-21 | 2014-11-04 | Extreme Reality Ltd. | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
US8928654B2 (en) | 2004-07-30 | 2015-01-06 | Extreme Reality Ltd. | Methods, systems, devices and associated processing logic for generating stereoscopic images and video
US9046962B2 (en) | 2005-10-31 | 2015-06-02 | Extreme Reality Ltd. | Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region
US9092665B2 (en) | 2013-01-30 | 2015-07-28 | Aquifi, Inc. | Systems and methods for initializing motion tracking of human hands
US9177220B2 (en) | 2004-07-30 | 2015-11-03 | Extreme Reality Ltd. | System and method for 3D space-dimension based image processing
US9218126B2 (en) | 2009-09-21 | 2015-12-22 | Extreme Reality Ltd. | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US9298266B2 (en) | 2013-04-02 | 2016-03-29 | Aquifi, Inc. | Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9310891B2 (en) | 2012-09-04 | 2016-04-12 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses
US9504920B2 (en) | 2011-04-25 | 2016-11-29 | Aquifi, Inc. | Method and system to create three-dimensional mapping in a two-dimensional game
US9507417B2 (en) | 2014-01-07 | 2016-11-29 | Aquifi, Inc. | Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9600078B2 (en) | 2012-02-03 | 2017-03-21 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system
US9619105B1 (en) | 2014-01-30 | 2017-04-11 | Aquifi, Inc. | Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US9798388B1 (en) | 2013-07-31 | 2017-10-24 | Aquifi, Inc. | Vibrotactile system to augment 3D input systems
US9857868B2 (en) | 2011-03-19 | 2018-01-02 | The Board Of Trustees Of The Leland Stanford Junior University | Method and system for ergonomic touch-free interface
US10394442B2 (en) | 2013-11-13 | 2019-08-27 | International Business Machines Corporation | Adjustment of user interface elements based on user accuracy and content consumption
US10706217B2 (en) | 2012-03-28 | 2020-07-07 | International Business Machines Corporation | Quick access panel for displaying a web page on a mobile device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8681100B2 (en) | 2004-07-30 | 2014-03-25 | Extreme Reality Ltd. | Apparatus system and method for human-machine-interface
US20130104039A1 (en)* | 2011-10-21 | 2013-04-25 | Sony Ericsson Mobile Communications AB | System and Method for Operating a User Interface on an Electronic Device
CN114168236A (en)* | 2020-09-10 | 2022-03-11 | Huawei Technologies Co., Ltd. | Application access method and related device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20060164230A1 (en)* | 2000-03-02 | 2006-07-27 | Dewind, Darryl P. | Interior mirror assembly with display
US20070236475A1 (en)* | 2006-04-05 | 2007-10-11 | Synaptics Incorporated | Graphical scroll wheel
US20080007533A1 (en)* | 2006-07-06 | 2008-01-10 | Apple Computer, Inc., A California Corporation | Capacitance sensing electrode with integrated I/O mechanism
US20080111710A1 (en)* | 2006-11-09 | 2008-05-15 | Marc Boillot | Method and Device to Control Touchless Recognition

WO2005036456A2 (en)2003-05-122005-04-21Princeton UniversityMethod and apparatus for foreground segmentation of video sequences
US7831088B2 (en)2003-06-132010-11-09Georgia Tech Research CorporationData reconstruction using directional interpolation techniques
JP2005020227A (en)2003-06-252005-01-20Pfu Ltd Image compression device
JP2005025415A (en)2003-06-302005-01-27Sony CorpPosition detector
US7755608B2 (en)2004-01-232010-07-13Hewlett-Packard Development Company, L.P.Systems and methods of interfacing with a machine
WO2005103863A2 (en)2004-03-232005-11-03Fujitsu LimitedDistinguishing tilt and translation motion components in handheld devices
US20070183633A1 (en)2004-03-242007-08-09Andre HoffmannIdentification, verification, and recognition method and system
US8036494B2 (en)2004-04-152011-10-11Hewlett-Packard Development Company, L.P.Enhancing image resolution
US7308112B2 (en)2004-05-142007-12-11Honda Motor Co., Ltd.Sign based human-machine interaction
US7519223B2 (en)2004-06-282009-04-14Microsoft CorporationRecognizing gestures and using gestures for interacting with software applications
US7366278B2 (en)2004-06-302008-04-29Accuray, Inc.DRR generation using a non-linear attenuation model
US8872899B2 (en)2004-07-302014-10-28Extreme Reality Ltd.Method circuit and system for human to machine interfacing by hand gestures
WO2008126069A2 (en)2007-04-152008-10-23Xtr - Extreme RealityAn apparatus system and method for human-machine-interface
US8432390B2 (en)2004-07-302013-04-30Extreme Reality LtdApparatus system and method for human-machine interface
US8381135B2 (en)*2004-07-302013-02-19Apple Inc.Proximity detector in handheld device
US8114172B2 (en)2004-07-302012-02-14Extreme Reality Ltd.System and method for 3D space-dimension based image processing
GB0424030D0 (en)2004-10-282004-12-01British TelecommA method and system for processing video data
US7386150B2 (en)2004-11-122008-06-10Safeview, Inc.Active subject imaging with body identification
US7903141B1 (en)2005-02-152011-03-08Videomining CorporationMethod and system for event detection by multi-scale image invariant analysis
WO2006099597A2 (en)2005-03-172006-09-21Honda Motor Co., Ltd.Pose estimation based on critical point analysis
US7774713B2 (en)2005-06-282010-08-10Microsoft CorporationDynamic user experience with semantic rich objects
US20070285554A1 (en)2005-10-312007-12-13Dor GivonApparatus method and system for imaging
US9046962B2 (en)2005-10-312015-06-02Extreme Reality Ltd.Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region
US8265349B2 (en)2006-02-072012-09-11Qualcomm IncorporatedIntra-mode region-of-interest video object segmentation
KR20080106265A (en)*2006-02-162008-12-04에프티케이 테크놀로지스 리미티드 System and method for entering data into a computing system
JP2007302223A (en)2006-04-122007-11-22Hitachi Ltd Non-contact input operation device for in-vehicle devices
CN102685533B (en)2006-06-232015-03-18图象公司Methods and systems for converting 2d motion pictures into stereoscopic 3d exhibition
US7701439B2 (en)2006-07-132010-04-20Northrop Grumman CorporationGesture recognition simulation system and method
US7783118B2 (en)2006-07-132010-08-24Seiko Epson CorporationMethod and apparatus for determining motion in images
US7907117B2 (en)2006-08-082011-03-15Microsoft CorporationVirtual controller for visual displays
US7936932B2 (en)2006-08-242011-05-03Dell Products L.P.Methods and apparatus for reducing storage size
US20080055259A1 (en)*2006-08-312008-03-06Honeywell International, Inc.Method for dynamically adapting button size on touch screens to compensate for hand tremor
US8356254B2 (en)2006-10-252013-01-15International Business Machines CorporationSystem and method for interacting with a display
US20080104547A1 (en)2006-10-252008-05-01General Electric CompanyGesture-based communications
US8756516B2 (en)2006-10-312014-06-17Scenera Technologies, LlcMethods, systems, and computer program products for interacting simultaneously with multiple application programs
US8354997B2 (en)*2006-10-312013-01-15NavisenseTouchless user interface for a mobile device
US7885480B2 (en)2006-10-312011-02-08Mitutoyo CorporationCorrelation peak finding method for image correlation displacement sensing
US8075499B2 (en)2007-05-182011-12-13Vaidhi NathanAbnormal motion detector and monitor
US7916944B2 (en)2007-01-312011-03-29Fuji Xerox Co., Ltd.System and method for feature level foreground segmentation
WO2008134745A1 (en)2007-04-302008-11-06Gesturetek, Inc.Mobile video-based therapy
US20080284726A1 (en)*2007-05-172008-11-20Marc BoillotSystem and Method for Sensory Based Media Control
US8219936B2 (en)*2007-08-302012-07-10Lg Electronics Inc.User interface for a mobile device using a user's gesture in the proximity of an electronic device
WO2009029767A1 (en)2007-08-302009-03-05Next Holdings, Inc.Optical touchscreen with improved illumination
US8064695B2 (en)2007-09-272011-11-22Behavioral Recognition Systems, Inc.Dark scene compensation in a background-foreground module of a video analysis system
US8005263B2 (en)2007-10-262011-08-23Honda Motor Co., Ltd.Hand sign recognition using label assignment
JP2009116769A (en)*2007-11-092009-05-28Sony CorpInput device, control method for input device and program
WO2009069392A1 (en)*2007-11-282009-06-04Nec CorporationInput device, server, display management method, and recording medium
US9451142B2 (en)2007-11-302016-09-20Cognex CorporationVision sensors, systems, and methods
KR101352994B1 (en)*2007-12-102014-01-21삼성전자 주식회사Apparatus and method for providing an adaptive on-screen keyboard
EP2085865A1 (en)*2008-01-302009-08-05Research In Motion LimitedElectronic device and method of controlling the same
CN101533320B (en)*2008-03-102012-04-25神基科技股份有限公司 Proximity magnification display method and device for area image of touch display device
US8107726B2 (en)2008-06-182012-01-31Samsung Electronics Co., Ltd.System and method for class-specific object segmentation of image data
US8443302B2 (en)*2008-07-012013-05-14Honeywell International Inc.Systems and methods of touchless interaction
BRPI0917864A2 (en)2008-08-152015-11-24Univ Brown apparatus and method for estimating body shape
WO2010026587A1 (en)2008-09-042010-03-11Extreme Reality Ltd.Method system and software for providing image sensor based human machine interfacing
EP2350925A4 (en)2008-10-242012-03-21Extreme Reality LtdA method system and associated modules and software components for providing image sensor based human machine interfacing
US8289440B2 (en)2008-12-082012-10-16Lytro, Inc.Light field data acquisition devices, and methods of using and manufacturing same
WO2010096279A2 (en)2009-02-172010-08-26Omek Interactive , Ltd.Method and system for gesture recognition
US10705692B2 (en)*2009-05-212020-07-07Sony Interactive Entertainment Inc.Continuous and dynamic scene decomposition for user interface
US8320619B2 (en)2009-05-292012-11-27Microsoft CorporationSystems and methods for tracking a model
US8466934B2 (en)2009-06-292013-06-18Min Liang TanTouchscreen interface
KR20110020082A (en)*2009-08-212011-03-02엘지전자 주식회사 Control device and method for mobile terminal
US8270733B2 (en)2009-08-312012-09-18Behavioral Recognition Systems, Inc.Identifying anomalous object types during classification
US8878779B2 (en)2009-09-212014-11-04Extreme Reality Ltd.Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
CA2774867A1 (en)2009-09-212011-03-24Extreme Reality Ltd.Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US8659592B2 (en)2009-09-242014-02-25Shenzhen Tcl New Technology Ltd2D to 3D video conversion
US20110292036A1 (en)2010-05-312011-12-01Primesense Ltd.Depth sensor with application interface
WO2012098534A1 (en)2011-01-232012-07-26Extreme Reality Ltd.Methods, systems, devices and associated processing logic for generating stereoscopic images and video
US9251422B2 (en)2011-11-132016-02-02Extreme Reality Ltd.Methods systems apparatuses circuits and associated computer executable code for video based subject characterization, categorization, identification and/or presence response

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20060164230A1 (en)* | 2000-03-02 | 2006-07-27 | Dewind Darryl P | Interior mirror assembly with display
US20070236475A1 (en)* | 2006-04-05 | 2007-10-11 | Synaptics Incorporated | Graphical scroll wheel
US20080007533A1 (en)* | 2006-07-06 | 2008-01-10 | Apple Computer, Inc., A California Corporation | Capacitance sensing electrode with integrated I/O mechanism
US20080111710A1 (en)* | 2006-11-09 | 2008-05-15 | Marc Boillot | Method and Device to Control Touchless Recognition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2480951A4 *

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8872899B2 (en) | 2004-07-30 | 2014-10-28 | Extreme Reality Ltd. | Method circuit and system for human to machine interfacing by hand gestures
US9177220B2 (en) | 2004-07-30 | 2015-11-03 | Extreme Reality Ltd. | System and method for 3D space-dimension based image processing
US8928654B2 (en) | 2004-07-30 | 2015-01-06 | Extreme Reality Ltd. | Methods, systems, devices and associated processing logic for generating stereoscopic images and video
US9046962B2 (en) | 2005-10-31 | 2015-06-02 | Extreme Reality Ltd. | Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region
US9131220B2 (en) | 2005-10-31 | 2015-09-08 | Extreme Reality Ltd. | Apparatus method and system for imaging
US8462199B2 (en) | 2005-10-31 | 2013-06-11 | Extreme Reality Ltd. | Apparatus method and system for imaging
US8878896B2 (en) | 2005-10-31 | 2014-11-04 | Extreme Reality Ltd. | Apparatus method and system for imaging
US8548258B2 (en) | 2008-10-24 | 2013-10-01 | Extreme Reality Ltd. | Method system and associated modules and software components for providing image sensor based human machine interfacing
US9218126B2 (en) | 2009-09-21 | 2015-12-22 | Extreme Reality Ltd. | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US8878779B2 (en) | 2009-09-21 | 2014-11-04 | Extreme Reality Ltd. | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
US9857868B2 (en) | 2011-03-19 | 2018-01-02 | The Board Of Trustees Of The Leland Stanford Junior University | Method and system for ergonomic touch-free interface
US9504920B2 (en) | 2011-04-25 | 2016-11-29 | Aquifi, Inc. | Method and system to create three-dimensional mapping in a two-dimensional game
US10684768B2 (en)* | 2011-10-14 | 2020-06-16 | Autodesk, Inc. | Enhanced target selection for a touch-based input enabled user interface
US20130097550A1 (en)* | 2011-10-14 | 2013-04-18 | Tovi Grossman | Enhanced target selection for a touch-based input enabled user interface
CN104011629A (en)* | 2011-10-14 | 2014-08-27 | Autodesk, Inc. | Enhanced target selection for a touch-based input enabled user interface
US9600078B2 (en) | 2012-02-03 | 2017-03-21 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system
US10706217B2 (en) | 2012-03-28 | 2020-07-07 | International Business Machines Corporation | Quick access panel for displaying a web page on a mobile device
US9098739B2 (en) | 2012-06-25 | 2015-08-04 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching
US8830312B2 (en) | 2012-06-25 | 2014-09-09 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching within bounded regions
US9111135B2 (en) | 2012-06-25 | 2015-08-18 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US8655021B2 (en) | 2012-06-25 | 2014-02-18 | Imimtek, Inc. | Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US8934675B2 (en) | 2012-06-25 | 2015-01-13 | Aquifi, Inc. | Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US9310891B2 (en) | 2012-09-04 | 2016-04-12 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses
US9092665B2 (en) | 2013-01-30 | 2015-07-28 | Aquifi, Inc. | Systems and methods for initializing motion tracking of human hands
US9129155B2 (en) | 2013-01-30 | 2015-09-08 | Aquifi, Inc. | Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US8615108B1 (en) | 2013-01-30 | 2013-12-24 | Imimtek, Inc. | Systems and methods for initializing motion tracking of human hands
US9298266B2 (en) | 2013-04-02 | 2016-03-29 | Aquifi, Inc. | Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9798388B1 (en) | 2013-07-31 | 2017-10-24 | Aquifi, Inc. | Vibrotactile system to augment 3D input systems
US10394442B2 (en) | 2013-11-13 | 2019-08-27 | International Business Machines Corporation | Adjustment of user interface elements based on user accuracy and content consumption
US9507417B2 (en) | 2014-01-07 | 2016-11-29 | Aquifi, Inc. | Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9619105B1 (en) | 2014-01-30 | 2017-04-11 | Aquifi, Inc. | Systems and methods for gesture based interaction with viewpoint dependent user interfaces

Also Published As

Publication number | Publication date
KR20120093224A (en) | 2012-08-22
US20120176414A1 (en) | 2012-07-12
JP2013505493A (en) | 2013-02-14
KR101577106B1 (en) | 2015-12-11
CA2774867A1 (en) | 2011-03-24
IL218765A0 (en) | 2012-06-28
EP2480951A4 (en) | 2014-04-30
EP2480951A1 (en) | 2012-08-01
US9218126B2 (en) | 2015-12-22

Similar Documents

Publication | Publication Date | Title
US9218126B2 (en) | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US11656711B2 (en) | Method and apparatus for configuring a plurality of virtual buttons on a device
US8614666B2 (en) | Sensing user input at display area edge
US10162444B2 (en) | Force sensor incorporated into display
TWI393045B (en) | Method, system, and graphical user interface for viewing multiple application windows
US8098235B2 (en) | Multi-touch device having dynamic haptic effects
US20090213081A1 (en) | Portable Electronic Device Touchpad Input Controller
US20140152585A1 (en) | Scroll jump interface for touchscreen input/output device
US20140062875A1 (en) | Mobile device with an inertial measurement unit to adjust state of graphical user interface or a natural language processing unit, and including a hover sensing function
US20110285653A1 (en) | Information Processing Apparatus and Input Method
CN1402116A (en) | Device with touch screen using connected external apparatus for displaying information, and method thereof
GB2472339A (en) | A Method for interpreting contacts on a clickable touch sensor panel
US20160147310A1 (en) | Gesture Multi-Function on a Physical Keyboard
WO2009071123A1 (en) | Power reduction for touch screens
US20200081551A1 (en) | Talking multi-surface keyboard
WO2015047358A1 (en) | Multi-function key in a keyboard for an electronic device
US20110285625A1 (en) | Information processing apparatus and input method
US20220283641A1 (en) | Keyboards with haptic outputs
US9176631B2 (en) | Touch-and-play input device and operating method thereof
US10042440B2 (en) | Apparatus, system, and method for touch input
US20140085340A1 (en) | Method and electronic device for manipulating scale or rotation of graphic on display
US11099664B2 (en) | Talking multi-surface keyboard
US20110242016A1 (en) | Touch screen
JP2011204092A (en) | Input device
US20200241743A1 (en) | Electronic device with flick operation and scrolling

Legal Events

Date | Code | Title | Description

121 | EP: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 10816800 | Country of ref document: EP | Kind code of ref document: A1

WWE | WIPO information: entry into national phase
    Ref document number: 2012529401 | Country of ref document: JP

WWE | WIPO information: entry into national phase
    Ref document number: 13497061 | Country of ref document: US
    Ref document number: 218765 | Country of ref document: IL

NENP | Non-entry into the national phase
    Ref country code: DE

WWE | WIPO information: entry into national phase
    Ref document number: 2774867 | Country of ref document: CA

ENP | Entry into the national phase
    Ref document number: 20127010067 | Country of ref document: KR | Kind code of ref document: A

WWE | WIPO information: entry into national phase
    Ref document number: 2010816800 | Country of ref document: EP

