US7046232B2 - Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program storage medium - Google Patents

Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program storage medium

Info

Publication number
US7046232B2
Authority
US
United States
Prior art keywords
movement
user
recognizable
image
predetermined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US09/838,644
Other versions
US20020006222A1 (en)
Inventor
Takeo Inagaki
Junko Saito
Keigo Ihara
Takahiko Sueyoshi
Yoshihiro Yamaguchi
Shinichiro Gomi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2000126344A (external priority; patent JP2001306049A)
Priority claimed from JP2000126343A (external priority; patent JP4415227B2)
Priority claimed from JP2000126342A (external priority; patent JP2001307108A)
Application filed by Sony Corp
Assigned to SONY CORPORATION. Assignment of assignors' interest (see document for details). Assignors: GOMI, SHINICHIRO; SUEYOSHI, TAKAHIKO; IHARA, KEIGO; YAMAGUCHI, YOSHIHIRO; SAITO, JUNKO; INAGAKI, TAKEO
Publication of US20020006222A1
Application granted
Publication of US7046232B2
Adjusted expiration
Legal status: Expired - Lifetime

Abstract

The information processing apparatus recognizes the movement direction of a user based on an image obtained by photographing the user with the CCD camera 8, and generates a command corresponding to the recognized movement direction. While it is searching for the user's movement, it creates a gesture recognition screen 100 in a search state and displays it on a liquid crystal display 10, so as to reliably notify the user that the user's movement is recognizable and that the apparatus is in a standby state. Further, before recognizing the movement direction of the user's action based on the image obtained by photographing the user with the CCD camera 8, the apparatus generates the gesture recognition screen 100 in advance so that the user can visualize a recognizable movement direction, and displays a target division 107 arranged in a horizontal line on the gesture recognition screen 100 so as to notify the user in advance, using the target division 107, that right-and-left movement is recognizable. Furthermore, based on the image obtained by photographing the user's hand with the CCD camera 8, the apparatus recognizes the movement direction of the hand in the image, generates a visual feedback screen representing the trail of the recognized hand movement, and displays it on the liquid crystal display 10, so as to let the user learn by feedback how the movement of the hand was recognized.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an information processing apparatus, a method of displaying movement recognizable standby state, a method of showing recognizable movement, a method of displaying movement recognizing process, and a program storage medium, and more particularly, is preferably applicable to a notebook personal computer (hereafter referred to as a notebook PC).
2. Description of the Related Art
A notebook PC is composed of a display means, such as a liquid crystal display, and input means for inputting commands and characters, such as a keyboard and a mouse, so as to execute predetermined processes according to commands entered by key operations and to display the execution results on the display means.
In addition, as an input means other than a keyboard and a mouse, some recent notebook PCs each have a rotating controller of a predetermined shape, that is, a jog dial, sticking out a bit from a side of the notebook PC, so that instructions such as selection of a menu item and determination of a command can be inputted by turning and pressing the jog dial.
By the way, in such a notebook PC, while a command is inputted by directly operating an input means such as the keyboard, the mouse or the jog dial on a predetermined active window screen, the active window screen does not necessarily show the user which input means is effective and whether that input means is on standby for the user's input operation. As a result, the apparatus is unfriendly and not easy to use for a user who is unaccustomed to computers.
Further, such a notebook PC has a problem in that, in the case where a menu item is selected by rotating the jog dial, the user cannot recognize which direction is effective, the right-and-left direction or the up-and-down direction, until the user actually manipulates the jog dial.
Furthermore, in such a notebook PC, while it has been proposed to photograph the user with an externally connected camera and automatically input a command according to the user's movement, in addition to the above-mentioned keyboard, mouse and jog dial, this is also unfriendly and not easy to use because, when the user inputs a different command by mistake, he/she cannot know what movement brought about the wrong recognition.
SUMMARY OF THE INVENTION
In view of the foregoing, an object of this invention is to provide an information processing apparatus, a method of displaying movement recognizable standby state, and a program storage medium which are much easier to use even for the unaccustomed user.
The foregoing object and other objects of the invention have been achieved by the provision of an information processing apparatus, a method of displaying a movement recognizable standby state and a program storage medium which are capable of reliably informing a user that the apparatus can recognize the movement direction of a recognition subject and is in a standby state, by recognizing the movement direction of the recognition subject based on a picture obtained by photographing the recognition subject with an image pickup means, and, when generating a predetermined command corresponding to the recognized movement direction, displaying on a predetermined display means a predetermined standby state picture indicating that the apparatus is searching for the recognition subject in the picture if the apparatus does not recognize the movement direction of the recognition subject.
Another object of the invention is to provide an information processing apparatus, a method of showing recognizable movement, and a program storage medium which are capable of notifying a user in advance of how his/her input operation will be recognized.
The foregoing object and other objects of the invention have been achieved by the provision of an information processing apparatus, a method of showing recognizable movement and a program storage medium which are capable of informing a user of a recognizable movement direction using a predetermined recognizable-movement-direction image, by creating in advance the recognizable-movement-direction image which makes the user visualize the recognizable movement direction and displaying the image on a predetermined display means, before recognizing the movement direction of a recognition subject based on a picture obtained by photographing the recognition subject with an image pickup means.
Another object of the invention is to provide an information processing apparatus, a method of displaying movement recognizing process and a program storage medium which are capable of making a user learn by feedback on the recognizing process until the movement of the recognition subject is recognized.
The foregoing object and other objects of the invention have been achieved by the provision of an information processing apparatus, a method of displaying a movement recognizing process and a program storage medium which are capable of making a user learn by feedback how the movement direction of a recognition subject is recognized, by recognizing the movement direction of the recognition subject in a picture obtained by photographing the recognition subject with an image pickup means, creating a recognizing-process image indicating a trail of the recognized movement of the recognition subject, and displaying the image on a predetermined display means.
The nature, principle and utility of the invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings in which like parts are designated by like reference numerals or characters.
BRIEF DESCRIPTION OF THE DRAWINGS
In the accompanying drawings:
FIG. 1 is a schematic perspective view showing an overall configuration of a notebook personal computer according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram showing a configuration of the left side of the main body;
FIG. 3 is a schematic diagram showing the configurations of the backside and the bottom of the main body;
FIG. 4 is a block diagram showing a circuit configuration of the notebook personal computer;
FIG. 5 is a flowchart showing a gesture recognizing processing procedure;
FIG. 6 is a schematic diagram showing a gesture recognition screen displayed overlapping on the active window screen;
FIG. 7 is a schematic diagram showing a configuration of the gesture recognition screen;
FIG. 8 is a schematic diagram showing a configuration of a target;
FIG. 9 is a flowchart showing a processing procedure for obtaining information about a hand position;
FIG. 10 is a schematic diagram showing a color area displayed in YUV chromatic space;
FIG. 11 is a schematic diagram showing the gesture recognition screen under a search state;
FIG. 12 is a schematic diagram showing the gesture recognition screen with a pointer and a palm area recognition frame overlapped thereon;
FIG. 13 is a flowchart showing a processing procedure for judging gesture movement;
FIG. 14 is a schematic diagram explaining calculation of a distance of fingertip movement;
FIG. 15 is a schematic diagram explaining a processing flow as software;
FIGS. 16A to 16C are schematic diagrams showing visual feedback screens;
FIG. 17 is a schematic diagram showing an overall configuration of a network system in a second embodiment;
FIG. 18 is a schematic perspective view showing exterior features of a digital portable telephone with a camera;
FIG. 19 is a schematic perspective view showing a display division when the camera division is rotated; and
FIG. 20 is a block diagram showing a circuit configuration of the digital portable telephone with a camera.
DETAILED DESCRIPTION OF THE EMBODIMENT
Preferred embodiments of this invention will be described with reference to the accompanying drawings:
(1) First Embodiment
(1-1) Exterior Features of a Notebook Personal Computer
In FIG. 1, a reference numeral 1 shows, as a whole, a notebook personal computer (hereafter referred to as a notebook PC) as an information processing apparatus to which the present invention is applied, and it is configured by a main unit 2 and a display division 3 attached to the main unit 2 so as to be closed or opened.
The main unit 2 has on its topside a plurality of control keys 4 for inputting various characters, codes and numbers, a stick pointing device (hereafter simply referred to as a stick) 5 used for moving a mouse cursor, left and right click buttons 5A and 5B that are equivalent to the left and right buttons of an ordinary mouse, a center button 5C for controlling a scroll bar without putting the mouse cursor on a scroll button, built-in speakers 6A and 6B, a push-type power switch 7, a shutter button 9 for a charge coupled device (CCD) camera 8 provided in the display division 3, a power lamp PL, a battery lamp BL and a message lamp ML which are light emitting diodes (LEDs), and so on.
In the display division 3, a liquid crystal display 10, being, for instance, a thin film transistor (TFT) color liquid crystal of 8.9 inches (1,024×480 pixels), is provided on its front, and in addition, at the upper center of its front, an image pickup division 11 equipped with the CCD camera 8 serving as an image pickup means is rotatably provided.
In this image pickup division 11, the CCD camera 8 can be rotated and positioned within an angle of 180°, from the front direction to the back direction of the display division 3, and focus adjustment in photographing a desired image pickup subject with the CCD camera 8 can be performed easily by rotating an adjustment ring 12 provided at the top of the image pickup division 11.
The display division 3 also has a microphone 13 provided on the front and on the back of the image pickup division 11, close to its left, so that sounds can be collected via the microphone 13 over a wide range from the front to the back of the display division 3.
In addition, the display division 3 has nails 14 and 15 provided close to the left and right of the liquid crystal display 10 respectively, and openings 16 and 17 are provided at the predetermined positions of the main unit 2 corresponding to the nails 14 and 15, so that the nails 14 and 15 are fitted into the corresponding openings 16 and 17 when the front surface of the display division 3 is in contact with the top surface of the main unit 2.
In this connection, when the display division 3 in contact with the main unit 2 is tilted, the fitting states between the openings 16 and 17 and the nails 14 and 15 are released and the display division 3 can be opened up from the main unit 2.
Moreover, the main unit 2 has on its right side an infrared port 18 based on the Infrared Data Association (IrDA) standard, a headphone terminal 19, a microphone input terminal 20, a universal serial bus (USB) terminal 21, an external power connector 22, an external display output connector 23, a jog dial 24 capable of inputting an instruction to execute a predetermined process by rotating and pushing the rotating controller, and a modem terminal 25 for a modular jack.
On the other hand, as shown in FIG. 2, the main unit 2 has on its left side an exhaust hole 26, a personal computer (PC) card slot 27 for a PC card based on the Personal Computer Memory Card International Association (PCMCIA) standard, and a four-pin Institute of Electrical and Electronics Engineers (IEEE) 1394 terminal 28.
Furthermore, as shown in FIG. 3, the main unit 2 has on its backside a battery connector 29, and it has at its bottom a sliding removing lever 31 for removing a battery pack 30 (FIG. 1), a lock lever 32 for locking the sliding removing lever 31, and a reset switch 33 for interrupting operation of the main unit 2 and reconstructing its environment on power-up. The battery pack 30 is detachably connected to the battery connector 29.
(1-2) Circuit Configuration of the Notebook Personal Computer
Next, the circuit configuration of the notebook PC 1 will be described in detail with reference to FIG. 4. In the main unit 2 of the notebook PC 1, a central processing unit (CPU) 50 for controlling the whole functions of the main unit 2 is connected to a host bus 52. The CPU 50 executes processes according to programs and application software loaded into a random access memory (RAM) 53, at a predetermined operating speed based on a system clock given from a clock generator 60, so as to realize various functions.
In addition, the host bus 52 is connected to a cache memory 51 in order to cache data to be used by the CPU 50 and to realize high-speed access.
This host bus 52 is connected to a peripheral component interconnect (PCI) bus 55 via a host-PCI bridge 54, and the PCI bus 55 is connected to a video controller 56, an IEEE 1394 interface 57, a video capture processing chip 83, and a PC card interface 58.
Here, the host-PCI bridge 54 controls sending and receiving of various data between the CPU 50 and the video controller 56, the video capture processing chip 83, the IEEE 1394 interface 57 and the PC card interface 58, and also performs memory control of the RAM 53 connected to it via a memory bus 59.
In addition, the host-PCI bridge 54 is connected to the video controller 56 with a signal wire based on the accelerated graphics port (AGP) standard, so that image data can be transferred at high speed between the video controller 56 and the host-PCI bridge 54.
The video capture processing chip 83 is connected to an I2C bus 82 (generally also referred to as a system management (SM) bus), which is a serial bus. When image data photographed by the CCD camera 8 is supplied through the I2C bus 82, the chip stores it once in a built-in frame memory (not shown), generates JPEG image data by performing an image compression process based on the Joint Photographic Experts Group (JPEG) standards, and then stores the JPEG image data in the frame memory again.
Then, the video capture processing chip 83 transfers the JPEG image data stored in the frame memory to the RAM 53 in response to a request from the CPU 50 by using a bus master function, and the JPEG image data is then transferred to a hard disk drive (HDD) 67 as JPEG image (still picture) data or Motion JPEG (motion picture) data.
In addition, the video controller 56 outputs image data based on various application software supplied at appropriate times, as well as image data photographed by the CCD camera 8, to the liquid crystal display 10 of the display division 3 so as to display a plurality of window screens.
The IEEE 1394 interface 57 is directly connected to the IEEE 1394 terminal 28, and can be connected to another computer apparatus or an external device such as a digital video camera via the IEEE 1394 terminal 28.
The PC card interface 58 is to be connected to a PC card (not shown) which is loaded in the PC card slot 27 when an optional function is to be added, and can be connected to an external device such as a compact disc-read only memory (CD-ROM) drive or a digital versatile disc (DVD) drive via the PC card.
The PCI bus 55 is connected to an industrial standard architecture (ISA) bus 65 via a PCI-ISA bridge 66, and the PCI-ISA bridge 66 is connected to the HDD 67 and the USB terminal 21.
Here, the PCI-ISA bridge 66 comprises an integrated drive electronics (IDE) interface, a configuration register, a real-time clock (RTC) circuit, a USB interface and so on, and it controls the HDD 67 via the IDE interface based on the system clock given from the clock generator 60.
The hard disk of the HDD 67 stores an operating system (OS) such as Windows 98 (trademark), an electronic mail program, an auto pilot program, a jog dial server program, a jog dial driver, capture software, digital map software and other various application software, which are transferred and loaded into the RAM 53 at appropriate times in a starting process.
Moreover, the PCI-ISA bridge 66 controls, via the USB interface (not shown), external devices such as a floppy disk drive, a printer and a USB mouse connected to the USB terminal 21, and also controls a modem 69 and a sound controller 70 connected to the ISA bus 65.
The modem 69 is connected through the modem terminal 25 and the public telephone circuit (not shown) to an Internet service provider (hereafter referred to as a provider), and performs dial-up IP connection to the Internet via the provider.
The sound controller 70 generates audio data by converting an audio signal collected by the microphone 13 into digital form and outputs the data to the CPU 50, and also converts audio data supplied from the CPU 50 into analog form to generate an audio signal which is outputted to the outside via the built-in speaker 6.
In addition, the ISA bus 65 is connected to an in/out (I/O) controller 73, which receives electric power supplied from an external power source via the external power connector 22 and a power supply and charging control circuit 85, and supplies power to each circuit when the power switch 7 is turned on. Note that the I/O controller 73 also operates based on the system clock given from the clock generator 60.
Moreover, the power supply and charging control circuit 85 controls charging of the battery pack 30 connected to the battery connector 29 (FIG. 3), under the control of the I/O controller 73.
The I/O controller 73 is composed of a microcontroller, an I/O interface, a CPU, a ROM, a RAM and so on, and controls input/output of data between the OS and application software and various peripherals such as the liquid crystal display 10 and the HDD 67, based on a basic input/output system (BIOS) stored in a flash memory 79.
In addition, the I/O controller 73 is connected to the infrared port 18 and is capable of performing infrared communication with another computer apparatus, for instance.
Furthermore, the I/O controller 73 is connected to a reversing switch 77 which is turned on when the image pickup division 11 is rotated by 180 degrees toward the backside of the liquid crystal display 10. When this happens, the I/O controller 73 informs the CPU 50 of this situation via the PCI-ISA bridge 66 and the host-PCI bridge 54.
In addition, the I/O controller 73 is connected to a full/half push switch 78. The full/half push switch 78 enters a half push state when the shutter button 9 provided on the topside of the main unit 2 is half pushed, and the I/O controller 73 then informs the CPU 50 of this situation. On the other hand, the switch 78 enters a full push state when the button 9 is fully pushed, and the I/O controller 73 again informs the CPU 50 of this situation.
To be more specific, the CPU 50 enters a still picture mode when the shutter button 9 is half pushed by the user in a state where the capture software has been started up from the hard disk of the HDD 67 onto the RAM 53, and it controls the CCD camera 8 to freeze the still picture; then, when the shutter button 9 is fully pushed, it captures the frozen still picture data and sends it to the video controller 56.
On the contrary, the CPU 50 enters a motion picture mode when the shutter button 9 is fully pushed by the user without starting up the capture software, and it captures a motion picture of up to 60 seconds and sends it to the video controller 56.
Incidentally, the ROM of the I/O controller 73 stores a wakeup program, a key input monitoring program, an LED control program, a jog dial state monitoring program and other various control programs.
Here, the jog dial state monitoring program is a program used in conjunction with the jog dial server program stored on the hard disk of the HDD 67, and is intended to monitor whether or not the jog dial 24 is rotated or pushed.
The wakeup program is a program which executes a given process when the current time from an RTC circuit in the PCI-ISA bridge 66 coincides with a preset starting time, under the control of the CPU 50. The key input monitoring program is a program for monitoring input from the control keys 4 and other various key switches. The LED control program is a program for controlling lighting of various lamps such as the power lamp PL, the battery lamp BL and the message lamp ML (FIG. 1).
In addition, the RAM of the I/O controller 73 has an I/O register for the jog dial state monitoring program, a set time register for the wakeup program, a key input monitoring register for the key input monitoring program, an LED control register for the LED control program and registers for other various programs.
The set time register stores time information on a starting time preset by the user for use in the wakeup program. Therefore, the I/O controller 73 determines, based on the wakeup program, whether or not the current time from the RTC circuit coincides with the preset starting time, and when they coincide with each other, it informs the CPU 50 of this situation.
Then, the CPU 50 starts up predetermined application software at the preset starting time, and executes a predetermined process according to the application software.
Moreover, the key input monitoring register stores control key flags corresponding to input operations of the control keys 4, the stick 5, the left click button 5A, the right click button 5B, the center button 5C and so on.
Therefore, the I/O controller 73 determines, according to the key input monitoring program, whether or not the stick 5 is pointed or the left click button 5A, the right click button 5B or the center button 5C has been clicked, based on the control key flags, and when the pointing or click operation has been performed, the I/O controller 73 informs the CPU 50 of this situation.
The pointing operation here is an operation of manipulating the stick 5 up and down and right and left with a finger to move the mouse cursor to a desired position on the screen, and the click operation is an operation of swiftly pushing and releasing the left click button 5A or the right click button 5B with a finger.
Then, the CPU 50 executes a predetermined process depending on the movement of the cursor by the pointing operation or on the click operation.
In addition, the LED control register stores lighting flags for indicating a lighting state of various lamps such as the power lamp PL, the battery lamp BL and the message lamp ML.
Therefore, the I/O controller 73 stores the lighting flags and controls the LEDs 81 based on them; for example, it lights the message lamp ML when the CPU 50 starts up the electronic mail program from the hard disk of the HDD 67 in response to a pushing operation of the jog dial 24 and receives electronic mail according to the electronic mail program.
Moreover, the I/O register for the jog dial state monitoring program stores a rotation flag and a push flag corresponding to rotating and pushing operations of the jog dial 24.
Therefore, when a desired menu item is selected by the user out of a plurality of menu items by rotating and pushing the jog dial 24 connected to a rotation detecting division 88, the I/O controller 73 sets the rotation flag and the push flag stored in the I/O register and informs the CPU 50 of this situation.
Thus, the CPU 50 starts up the application software corresponding to the menu item determined by rotating and pushing the jog dial 24 and executes the predetermined process, according to the jog dial server program read from the HDD 67 and started up on the RAM 53.
Here, the I/O controller 73 is always in operation under the control of the power supply and charging control circuit 85 even if the power switch 7 is off and the OS is not active, so that application software or a script file desired by the user can be started up by pushing the jog dial 24 even in a power-saving state or a power-off state, without providing a special key.
Moreover, the I/O controller 73 is also connected to the I2C bus 82, and adjusts brightness and contrast in the CCD camera 8 by supplying, through the I2C bus 82, various setting parameters for the CCD camera 8 set with the control keys 4 or the jog dial 24.
(1-3) Gesture Recognizing Process
In addition to this configuration, the notebook PC 1 starts up, from the hard disk of the HDD 67, application software called a cyber gesture program for recognizing the movement of the user's hand (a gesture) photographed by the CCD camera 8; it recognizes the movement of the user's hand photographed by the CCD camera 8 in accordance with the cyber gesture program, and executes a predetermined process depending on the recognition result on the active window screen based on the application software.
To be more specific, for instance, in the case of starting up an image editing program capable of processing photographed still pictures and sequentially displaying a plurality of still pictures stored on the hard disk of the HDD 67 on the liquid crystal display 10 in order to select still pictures to be processed, the notebook PC 1 performs an image forwarding operation such as forwarding or playing back the still pictures displayed on the liquid crystal display 10 one by one in response to the rotating operation of the jog dial 24 by the user. The present invention, however, allows the above-mentioned image forwarding operation to be executed under the control of the CPU 50 without touching the jog dial 24, by having the CPU 50 recognize the movement of the user's hand photographed by the CCD camera 8.
In this connection, the notebook PC 1 forwards the still pictures by one on the liquid crystal display 10 when the jog dial 24 is rotated away from the user by more than a predetermined angle and, on the contrary, plays back the still pictures by one on the liquid crystal display 10 when the jog dial 24 is rotated toward the user by more than a predetermined angle.
In practice, the CPU 50 of the notebook PC 1 enters a routine RT1 at the starting step in FIG. 5 and moves to the next step SP1, where it starts up the cyber gesture program from the hard disk of the HDD 67 in response to the user's operation and creates a gesture recognition screen 100 as shown in FIG. 6 according to the cyber gesture program; it then moves on to the next step SP2 after overlapping and displaying it on the still picture of the active window screen according to the image editing program.
Here, as shown in FIG. 7, the gesture recognition screen 100 has a screen size of 164×136 pixels, in which a title character division 101 of “CYBERGESTURE” (trademark of Sony Corp.) indicating the cyber gesture program, an option button 102 for optionally selecting a function, a help button 103, a minimizing button 104 and a closing button 105 are provided at the top.
This gesture recognition screen 100 is formed in a screen size much smaller than that of the liquid crystal display 10 (1,024×480 pixels) so that the area concealing the still picture displayed behind the gesture recognition screen 100 on the active window screen becomes as small as possible.
Moreover, when the mouse cursor is put on any of the option button 102, the help button 103, the minimizing button 104 and the closing button 105 on the gesture recognition screen 100, the CPU 50 of the notebook PC 1 displays that button in a convex state, and displays it in a concave state after it is selected by clicking, which allows the selecting and determining operations of the buttons to be easily and visually recognized.
In addition, the CPU 50 of the notebook PC 1 displays the gesture recognition display area 106 on the gesture recognition screen 100 on a 256-level gray scale, and also displays a target division 107 comprised of five square targets 107A to 107E arranged in a horizontal line approximately at the center of the gesture recognition display area 106.
Thus, the CPU 50 can easily make the user visualize that the notebook PC 1 can recognize right-and-left movement of the user's hand, by using the target division 107 displayed in the gesture recognition display area 106 on the gesture recognition screen 100.
In addition, each of the targets 107A to 107E has a size of 8×8 pixels, with a frame 107AF to 107EF of 1-pixel width. The frames 107AF to 107EF are colored in red so as to make the targets 107A to 107E more visually remarkable against the gray scale display behind them.
Moreover, the gesture recognition display area 106 on the gesture recognition screen 100 has a black line (not shown) for every two horizontal scan lines, so that the user can easily recognize this area as the gesture recognition screen 100, which is different from a screen showing an ordinary image.
At step SP2, the CPU 50 photographs the user existing in front of the display division 3 with the CCD camera 8, displays the resultant input image in the gesture recognition display area 106 on the gesture recognition screen 100, and then moves on to the next subroutine SRT2.
As shown in FIG. 9, at step SP21 of the subroutine SRT2, the CPU 50 divides the input image displayed in the gesture recognition display area 106 on the gesture recognition screen 100 into a plurality of color areas based on the color components, and then moves on to the next step SP22.
In this connection, a color area is expressed in predetermined YUV chromatic space, as shown in FIG. 10; for example, a predetermined area of the quadrants +Y, −U and −V indicated by oblique lines in the YUV chromatic space is regarded as a color area R (hereafter referred to as a skin-color area R) equivalent to the color of a user's palm.
At step SP22, the CPU 50 compares a predetermined skin-color table corresponding to the skin-color area R in the YUV (brightness, color difference) chromatic space with the corresponding color areas of the input image, and then moves on to the next step SP23.
In this case, the color areas of the input image are roughly divided into the skin-color area R, such as the user's face area and palm area, and a non-skin-color area such as clothing.
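The comparison at step SP22 amounts to testing, for every pixel of the input image, whether its YUV components fall inside the skin-color area R. A minimal Python sketch of such a test is shown below; the numeric bounds are illustrative assumptions only, since the patent does not disclose the actual contents of the skin-color table.

```python
import numpy as np

# Hypothetical bounds for the skin-color area R in the +Y, -U, -V quadrants
# (illustrative values; the patent does not give the real skin-color table).
SKIN_Y = (80, 255)
SKIN_U = (-48, 0)
SKIN_V = (-64, 0)

def skin_color_mask(yuv_image: np.ndarray) -> np.ndarray:
    """Return a boolean mask marking pixels that fall inside area R.

    yuv_image is an (H, W, 3) array whose last axis holds (Y, U, V),
    with U and V expressed as signed offsets around zero.
    """
    y, u, v = yuv_image[..., 0], yuv_image[..., 1], yuv_image[..., 2]
    return ((SKIN_Y[0] <= y) & (y <= SKIN_Y[1]) &
            (SKIN_U[0] <= u) & (u <= SKIN_U[1]) &
            (SKIN_V[0] <= v) & (v <= SKIN_V[1]))
```

Pixels for which the mask is True belong to candidate skin-color areas such as the face and the palm; everything else (clothing, background) is treated as non-skin-color.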
At step SP23, the CPU 50 determines whether or not a skin-color area R recognized as skin color exists in the input image, by comparing the skin-color table with the color areas of the input image.
If a negative result is obtained here, it represents that a skin-color area R matching the skin-color table does not exist in the input image, and the CPU 50 then moves on to the next step SP29.
At step SP29, the CPU 50 moves on to the input image of the next frame, since the skin-color area R does not exist in the input image and the movement of the user's hand cannot be recognized, and returns to the above-mentioned step SP21.
On the contrary, if a positive result is obtained at step SP23, it represents that a skin-color area R matching the skin-color table exists, and the CPU 50 then moves on to the next step SP24.
At step SP24, the CPU 50 detects movement of the skin-color area R in the input image of the current frame based on the change in coordinate values between the current frame and the previous frame, and then moves on to the next step SP25.
At step SP25, the CPU 50 determines whether or not a moving skin-color area R exists in the input image. If a negative result is obtained here, it represents that a moving skin-color area R does not exist in the input image, and the CPU 50 then moves on to the next step SP29, proceeds to the input image of the next frame and returns to the above-mentioned step SP21.
On the contrary, if a positive result is obtained at step SP25, it represents that a moving skin-color area R exists in the input image, and the CPU 50 then moves on to the next step SP26.
At step SP26, the CPU 50 detects the largest skin-color area R out of the moving skin-color areas R, regards it as the palm area, and then moves on to the next step SP27.
At step SP27, the CPU 50 acquires the coordinate values of the entire palm area determined at step SP26, and then moves on to the next step SP28.
At step SP28, the CPU 50 calculates the center of gravity of the palm area based on the coordinate values of the entire palm area obtained at step SP27, detects the coordinates of the uppermost point of the palm area in the vertical direction with respect to the center of gravity as the upper-part-of-center-of-gravity data equivalent to a fingertip of the hand, finishes the process of obtaining information about the hand position in the subroutine SRT2, and then returns to step SP3 of the routine RT1 (FIG. 5).
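Steps SP26 to SP28 thus reduce the detected palm area to a single fingertip coordinate: the center of gravity of the palm pixels is computed, and the uppermost palm pixel above it is taken as the upper-part-of-center-of-gravity data. The short sketch below shows one possible reading of that computation, assuming the palm area is given as a boolean mask; the patent does not give the exact formula.

```python
import numpy as np

def fingertip_from_palm_mask(palm_mask: np.ndarray):
    """Estimate the fingertip as the topmost palm pixel in the column of the
    palm's center of gravity (the upper-part-of-center-of-gravity point).

    Returns (row, col), or None if the mask contains no palm pixels.
    """
    rows, cols = np.nonzero(palm_mask)
    if rows.size == 0:
        return None                        # no palm area in this frame
    center_col = int(round(cols.mean()))   # horizontal center of gravity
    rows_in_col = rows[cols == center_col]
    if rows_in_col.size == 0:              # the mask can be hollow at that column
        rows_in_col = rows
    return int(rows_in_col.min()), center_col   # smallest row index = topmost pixel
```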
At step SP3, the CPU 50 determines whether or not the user's hand exists in the gesture recognition display area 106 on the gesture recognition screen 100, based on the upper-part-of-center-of-gravity data obtained in the subroutine SRT2.
If a negative result is obtained here, it represents that the upper-part-of-center-of-gravity data has not been obtained in the subroutine SRT2, that is, the user's hand does not exist in the gesture recognition display area 106 on the gesture recognition screen 100, and the CPU 50 then moves on to the next step SP4.
At step SP4, the CPU 50 displays an animation indicating that it is searching for the user's hand, because the hand is not displayed in the gesture recognition display area 106 on the gesture recognition screen 100, and then returns to the above-mentioned step SP2.
In this case, as shown in FIG. 11, the CPU 50 can display an animation using the target division 107 to make the user easily recognize that it is searching for the skin-color area R, since the user's skin-color portion is hardly displayed in the gesture recognition display area 106 on the gesture recognition screen 100 and the user's hand has not been recognized yet.
To be more specific, the CPU 50 produces a gradation effect by alternately coloring the areas (shown by broken lines) inside the frames 107AF to 107EF of the targets 107A to 107E in red, in sequence, in the left and right directions indicated by the arrows A and B, so that the user can easily visualize that the cyber gesture program has started up and the user's hand is being searched for.
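The search-state display is therefore simply a highlight sweeping back and forth across the five targets while no hand is detected. A rough sketch of that sequencing is shown below, with console output standing in for the red fill actually drawn inside each target frame:

```python
import itertools
import time

TARGETS = ["107A", "107B", "107C", "107D", "107E"]

def search_highlight_sequence():
    """Yield target indices sweeping left-to-right (arrow A) and back (arrow B)."""
    forward = list(range(len(TARGETS)))
    backward = forward[-2:0:-1]            # back sweep without repeating the ends
    yield from itertools.cycle(forward + backward)

def run_search_animation(n_frames=12, delay=0.1):
    for frame, idx in zip(range(n_frames), search_highlight_sequence()):
        print(f"frame {frame}: fill target {TARGETS[idx]} in red")
        time.sleep(delay)
```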
On the contrary, if a positive result is obtained at step SP3, it represents that the upper-part-of-center-of-gravity data has been obtained in the subroutine SRT2, that is, the user's hand exists in the gesture recognition display area 106 on the gesture recognition screen 100, and the CPU 50 then moves on to the next step SP5.
At step SP5, as shown in FIG. 12, the CPU 50 displays a pointer 108 of a predetermined shape at the position corresponding to the obtained upper-part-of-center-of-gravity data, overlaps and displays a palm area recognition frame 109 including the pointer 108 and covering the user's entire palm area on the input image in the gesture recognition display area 106, and moves on to the next subroutine SRT3.
Here, the CPU 50 colors the palm area recognition frame 109 of 1-pixel width in white, colors the pointer frame 108F of 1-pixel width of the pointer 108, which is the same in shape and size as the targets 107A to 107E of the target division 107, in white, and also colors its inside in red.
As a result, the CPU 50 can make the user clearly distinguish between the targets 107A to 107E and the pointer 108, by coloring the frames 107AF to 107EF of the targets 107A to 107E in red and coloring the pointer frame 108F of the pointer 108 in white.
Moreover, the CPU 50 displays the palm area recognition frame 109 and the pointer 108 while moving them together with the movement of the user's hand.
As subsequently shown in FIG. 13, at step SP31 of the subroutine SRT3, the CPU 50 obtains a distance of fingertip movement based on the differences in the coordinate values of the upper-part-of-center-of-gravity data between adjacent frames, that is, the current frame stored in ring buffer form in the RAM 53 and the previous frame adjacent to the current frame, and then moves on to the next step SP32.
At step SP32, the CPU 50 determines whether or not the distance of fingertip movement calculated at step SP31 is a predetermined maximum threshold or less. If a negative result is obtained here, it represents that the distance of fingertip movement is inadequate as data for recognizing movement of the hand, because the position of the fingertip in the previous frame is extremely distant from the position of the fingertip in the current frame, and the CPU 50 then moves on to the next step SP33.
At step SP33, the CPU 50 stops the calculation of the distance of fingertip movement between adjacent frames in step SP34 and thereafter, since the distance of fingertip movement is inadequate to use as data, and returns to step SP2 of the routine RT1 (FIG. 5) to repeat the above-mentioned process.
On the contrary, if a positive result is obtained at step SP32, it represents that the distance of fingertip movement is adequate as data for recognizing movement of the hand, because the position of the fingertip in the previous frame is not extremely distant from the position of the fingertip in the current frame, and the CPU 50 then moves on to the next step SP34.
At step SP34, as shown in FIG. 14, the CPU 50 obtains, as the longest distance of fingertip movement, the longest distance in coordinate values between the upper-part-of-center-of-gravity data indicating the fingertip in the current frame and the upper-part-of-center-of-gravity data indicating the fingertip in an arbitrary past frame selected out of the several past frames within a predetermined time, which are sequentially stored in ring buffer form, and determines whether or not this longest distance of fingertip movement is larger than a predetermined minimum threshold.
If a negative result is obtained here, it represents that the longest distance of fingertip movement based on the state transition of the input image over a plurality of frames is smaller than the predetermined minimum threshold, that is, the movement of the hand is not large enough to be recognized, and the CPU 50 then excludes this longest distance of fingertip movement from the recognizing process and returns to step SP31 to repeat the above-mentioned process.
On the contrary, if a positive result is obtained at step SP34, it represents that the longest distance of fingertip movement is larger than the predetermined minimum threshold, that is, the fingertip has certainly moved to the right or left, and the CPU 50 then moves on to the next step SP35.
At step SP35, the CPU 50 detects the direction (rightward or leftward) of the fingertip movement based on a movement vector between the upper-part-of-center-of-gravity data indicating the fingertip in the current frame and the upper-part-of-center-of-gravity data indicating the fingertip in the past frame used when calculating the longest distance of fingertip movement, and then returns to step SP6 of the routine RT1 (FIG. 5).
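Taken together, steps SP31 to SP35 keep a short ring buffer of fingertip positions, reject jumps that are implausibly large between adjacent frames, require the overall travel within the buffer to exceed a minimum before a gesture counts, and finally read the direction from the movement vector. The following sketch captures that judgment; the two thresholds are illustrative assumptions, as the patent gives no numeric values.

```python
from collections import deque
import math

MAX_STEP = 40    # assumed max fingertip travel between adjacent frames (pixels)
MIN_SWIPE = 60   # assumed min overall travel needed to count as a gesture (pixels)

class GestureJudge:
    """Keep recent fingertip positions in a ring buffer and decide whether a
    right or left swipe occurred (a sketch of the subroutine SRT3 judgment)."""

    def __init__(self, history=8):
        self.points = deque(maxlen=history)   # serves as the ring buffer

    def feed(self, fingertip):
        """fingertip is the (x, y) upper-part-of-center-of-gravity point for
        the current frame. Returns 'left', 'right' or None."""
        if self.points and math.dist(self.points[-1], fingertip) > MAX_STEP:
            # Adjacent frames are too far apart: unreliable data, start over.
            self.points.clear()
            self.points.append(fingertip)
            return None
        self.points.append(fingertip)

        # Longest travel between the current fingertip and any stored past frame.
        farthest = max(self.points, key=lambda p: math.dist(p, fingertip))
        if math.dist(farthest, fingertip) <= MIN_SWIPE:
            return None                       # movement too small to recognize
        return 'right' if fingertip[0] > farthest[0] else 'left'
```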
Having detected the longest distance of fingertip movement and its direction, the CPU 50 determines at step SP6 whether or not the speed of movement of the detected entire palm area exceeds a predetermined speed, based on the change per unit time in coordinate values between the pixel data of the current frame and the pixel data of the previous frame for the detected entire palm area.
If a negative result is obtained here, it determines that the speed of movement of the detected entire palm area does not exceed the predetermined speed, that is, the area may in reality be the face area rather than the palm area since it is moving rather slowly, and the CPU 50 returns to step SP2 again to repeat the above-mentioned process.
On the contrary, if a positive result is obtained at step SP6, it determines that the speed of movement of the detected entire palm area exceeds the predetermined speed, that is, the area is much more likely to be the palm area since it is moving rather fast, and the CPU 50 moves on to the next step SP7.
In the above process, the CPU 50 can distinguish the palm area from the face area much more correctly in the case where there are two or more candidates that seem to be the palm area.
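Step SP6 is thus a simple speed filter: a skin-color area that moves slowly is probably the face, and one that moves quickly is probably the hand. A minimal sketch, with an assumed threshold (the patent does not state the actual value):

```python
MIN_SPEED = 120.0   # assumed threshold, in pixels per second

def is_probably_palm(prev_centroid, curr_centroid, dt):
    """Return True if the area moved fast enough to be the palm rather than
    the face. Centroids are (x, y) tuples; dt is the frame interval in seconds."""
    dx = curr_centroid[0] - prev_centroid[0]
    dy = curr_centroid[1] - prev_centroid[1]
    speed = (dx * dx + dy * dy) ** 0.5 / dt
    return speed > MIN_SPEED
```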
At step SP7, as shown in FIG. 15, the CPU 50 supplies to a jog dial server program 182 the recognition result of the gesture movement of the palm area recognized by a cyber gesture program 180, via an application programming interface (API) for a jog dial 181; it also performs a visual feedback display, on the gesture recognition screen 100, of a trail indicating the movement of the hand (gesture) made by the user and of the recognizing process indicating how the notebook PC 1 recognized the gesture, and moves on to the next step SP8.
Here, the API is a program interface provided for application software by the OS, and the application software basically executes all processes via the API. In this connection, the API of a typical current OS takes the form of functions, and an appropriate argument (parameter) is specified by the application software to invoke the API's functions.
Incidentally, the CPU 50 takes in an operation of the jog dial 24 and the recognition result by the cyber gesture program 180 in the same input form, and supplies them to the jog dial server program 182 via the common API for a jog dial 181, so as to allow the software process to be simplified.
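The benefit of the common API is that the jog dial server program sees a single kind of input event regardless of whether it originated from the physical jog dial 24 or from the cyber gesture program 180. The sketch below illustrates that idea with hypothetical names; the patent does not expose the actual signature of the API for the jog dial 181.

```python
from dataclasses import dataclass

@dataclass
class JogEvent:
    """Common input form shared by the jog dial and the gesture recognizer."""
    direction: str   # 'forward' or 'back'
    source: str      # 'jog_dial' or 'cyber_gesture'

def jog_dial_api(event: JogEvent, server) -> None:
    """Stand-in for the API for a jog dial: hand any event to the server."""
    server(event)

def jog_dial_server(event: JogEvent) -> None:
    # The server maps the common event onto the application command,
    # e.g. forwarding or playing back still pictures.
    command = {"forward": "next_picture", "back": "previous_picture"}[event.direction]
    print(f"dispatch {command} (source: {event.source})")

# Both input paths end up at the same server:
jog_dial_api(JogEvent("forward", "jog_dial"), jog_dial_server)
jog_dial_api(JogEvent("back", "cyber_gesture"), jog_dial_server)
```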
In practice, the CPU 50 generates a visual feedback screen 191 as shown in FIG. 16A, and displays the pointer 108 overlapped on the targets 107A to 107E, which are obliquely placed in advance in a trail display frame 120, while moving it in the direction of an arrow C according to the trail indicating the movement of the hand (gesture) actually made by the user, so that the user can visually check the actual recognizing process of the movement of the user's hand on the visual feedback screen 191.
Subsequently, the CPU 50 creates a visual feedback screen 192 as shown in FIG. 16B, and displays it by replacing the visual feedback screen 191 with it.
This visual feedback screen 192 deforms the trail display frame 120 of the visual feedback screen 191 to form a direction display frame 121 in which the targets 107A to 107E are arranged in a horizontal line, and also displays the target 107E at the right end of the direction display frame 121 and the pointer 108 at the left end, so as to sketchily indicate, using the direction display frame 121, that the user's hand was moved in the direction of an arrow D (right to left).
Lastly, the CPU 50 generates a visual feedback screen 193 as shown in FIG. 16C, and displays it by replacing the visual feedback screen 192 with it.
This visual feedback screen 193 erases the direction display frame 121 of the visual feedback screen 192 and sequentially moves and displays the pointer 108 over the targets 107A to 107E, which are arranged in a horizontal line, in the direction shown by the arrow D, so that the user can easily recognize that the notebook PC 1 has recognized the user's hand being moved from the right side to the left side (the direction shown by the arrow D).
Moreover, in the case of moving and displaying the pointer 108 over the targets 107A to 107E in the direction shown by the arrow D, the CPU 50 moves the pointer 108 at the same speed as the movement of the recognized user's hand, so that the user can know the speed of hand movement that the notebook PC 1 can handle.
At step SP8, the CPU 50 recognizes the movement of the user's hand and then supplies a predetermined command corresponding to the movement of the hand from the jog dial server program 182 (FIG. 15) to application software 183, to execute a predetermined process. While the CPU 50 is executing a predetermined process according to the recognition result, however, it does not execute the gesture recognizing process for the input images of the several frames immediately after recognizing the movement of the hand, and it then returns to step SP2 again to repeat the above-mentioned process.
In this way, the CPU 50 can execute a process according to the movement of the user's hand on the active window without any malfunction, and then execute a process according to the next movement of the user's hand.
As described above, the CPU 50 of the notebook PC 1 recognizes the movement of the user's hand with the cyber gesture program 180, and then supplies a predetermined command corresponding to the recognition result to the application software 183 through the jog dial server program 182, so as to execute the predetermined image forwarding operation according to the command on the active window screen based on the application software 183.
In practice, the CPU 50 of the notebook PC 1 forwards the still pictures by one on the active window screen displayed as the background of the gesture recognition screen 100 in the case of recognizing that the gesture of the hand is a movement from the left side to the right side (the direction opposite to the arrow D), and, on the other hand, plays back the still pictures by one on the active window screen displayed as the background of the gesture recognition screen 100 in the case of recognizing that the gesture of the hand is a movement from the right side to the left side (the direction shown by the arrow D).
In this way, the user can forward or play back the still pictures on the active window screen displayed as the background of the gesture recognition screen 100 just by holding his/her hand over the CCD camera 8 of the image pickup division 11 and moving it either to the right or to the left, even if he/she does not directly manipulate the jog dial 24.
(1-4) Operations and Effects in the First Embodiment
In the above configuration, the CPU 50 of the notebook PC 1 displays a still picture on which the gesture recognition screen 100 is overlapped on the active window screen, by starting the cyber gesture program 180 in a state where the image editing program has started and the active window screen has been displayed on the liquid crystal display 10.
Then the CPU 50 of the notebook PC 1 photographs the user existing in front of the display division 3 with the CCD camera 8 of the image pickup division 11, and displays the resultant input image in the gesture recognition display area 106 on the gesture recognition screen 100.
At this time, in order to recognize the movement of the user's hand, the CPU 50 of the notebook PC 1 first divides the input image in the gesture recognition display area 106 into a plurality of color areas to search for the skin-color area R therein. If the CPU 50 cannot detect the skin-color area R, it produces a gradation effect by alternately coloring the areas inside the frames 107AF to 107EF of the targets 107A to 107E in the target division 107 (FIG. 11) in red, in sequence, in the left and right directions indicated by the arrows A and B, so as to reliably notify the user that it is searching for the user's hand.
In this way, as the CPU 50 of the notebook PC 1 can make the user recognize that it has not recognized the movement of the hand yet, it can immediately prompt the user to perform an input operation such as holding his/her hand over the CCD camera 8 and moving it either to the right or to the left.
Thus, the CPU 50 of the notebook PC 1 can recognize the movement of the user's hand in a short time, generate a command according to the recognized movement of the hand, and forward or play back the still pictures displayed on the active window screen in response to the command.
According to the above configuration, in the case where the notebook PC 1 does not recognize the movement of the user's hand photographed by the CCD camera 8, it can reliably notify the user that it is searching for the user's hand and is in a recognizable standby state, by alternately coloring the areas inside the frames 107AF to 107EF of the targets 107A to 107E in red, in sequence, in the left and right directions indicated by the arrows A and B.
Further, in the above configuration, the CPU 50 of the notebook PC 1 displays the still picture on which the gesture recognition screen 100 is overlapped on the active window screen, by starting the cyber gesture program 180 in a state where the image editing program has started and the active window screen has been displayed on the liquid crystal display 10.
At this time, the CPU 50 of the notebook PC 1 displays the target division 107, comprised of the five square targets 107A to 107E sequentially arranged in a horizontal line, approximately at the center of the gesture recognition display area 106 on the gesture recognition screen 100.
Thereby, the CPU 50 of the notebook PC 1 makes the user easily visualize that the notebook PC 1 can recognize the right-and-left movement out of the movement directions of the user's hand, so that the CPU 50 can reliably notify the user of the recognizable movement direction in advance.
In addition, the CPU 50 of the notebook PC 1 colors the frames 107AF to 107EF of the targets 107A to 107E in red so as to make the user visually distinguish the targets 107A to 107E from the gray scale display in the background.
According to the above configuration, the notebook PC 1 can reliably notify the user in advance that the notebook PC 1 can recognize the right-and-left movement out of the movement directions of the user's hand, by displaying, with the cyber gesture program 180, the target division 107 comprised of the five square targets 107A to 107E sequentially arranged in a horizontal line approximately at the center of the gesture recognition display area 106 on the gesture recognition screen 100.
Furthermore, in the above configuration, the CPU 50 of the notebook PC 1 displays the still picture on which the gesture recognition screen 100 is overlapped on the active window screen, by starting the cyber gesture program 180 in a state where the image editing program has started and the active window screen has been displayed on the liquid crystal display 10.
Then the CPU 50 of the notebook PC 1 photographs the user existing in front of the display division 3 with the CCD camera 8 of the image pickup division 11, displays the resultant input image in the gesture recognition display area 106 on the gesture recognition screen 100, and also recognizes the movement of the user's hand to generate a command according to the recognized movement of the hand.
At this time, the CPU 50 of the notebook PC 1 obliquely places the targets 107A to 107E in a trail display frame 120 in advance, as shown in FIG. 16A, corresponding to the trail showing the movement of the hand actually made by the user (gesture), in order to allow the user to visually check the trail of the recognizing process of the movement of the user's hand by moving and displaying the pointer 108 over the targets 107A to 107E in sequence in the direction shown by the arrow C.
After that, the CPU 50 deforms the trail display frame 120, as shown in FIG. 16B, to form a direction display frame 121 in which the targets 107A to 107E are arranged in a horizontal line, and also displays the target 107E at the right end of the direction display frame 121 and the pointer 108 at the left end, so as to sketchily indicate that the user's hand was moved in the direction shown by an arrow D (leftward).
Lastly, as shown in FIG. 16C, the CPU 50 erases the direction display frame 121 and moves and displays the pointer 108 over the targets 107A to 107E in the direction shown by the arrow D at the speed of movement of the recognized user's hand, so as to reliably notify the user that it recognized the movement of the user's hand from the right side to the left side (the direction shown by the arrow D).
According to the above configuration, the notebook PC 1 can display the trail of the recognizing process and the recognition result of the movement of the user's hand photographed by the CCD camera 8 as an animation using the targets 107A to 107E and the pointer 108, so as to make the user learn by feedback how and in what recognizing process the movement of the user's hand was recognized.
Thus, the user makes the notebook PC 1 recognize the movement of his/her hand while taking the direction and speed of the movement of the hand into consideration, so that an input operation of a command for forwarding images, for instance, can be performed in a short time.
(2) Second Embodiment
(2-1) Overall Configuration of Network System
In FIG. 17, a reference numeral 200 shows, as a whole, a network system including a portable telephone MS3 to which the present invention is applied, in which base stations CS1 to CS4, which are fixed radio stations, are set up in areas obtained by dividing a communication service area into areas of a desired size.
These base stations CS1 to CS4 are connected by radio to personal digital assistants MS1 and MS2 and digital portable telephones with a camera MS3 and MS4, which are mobile radio stations, by the code division multiple access method called wideband code division multiple access (W-CDMA), for instance, and can perform high-speed data communication of mass data at a maximum data transfer rate of 2 Mbps using the 2 GHz frequency band.
Thus, the personal digital assistants MS1 and MS2 and the digital portable telephones with a camera MS3 and MS4 are capable of high-speed data communication of mass data by the W-CDMA method, so that they can realize various data communications such as sending and receiving electronic mail, viewing simple home pages, sending and receiving images, in addition to the audio communication.
In addition, the base stations CS1 to CS4 are connected to a public line network INW with a wire circuit, and the public line network INW is connected with the Internet ITN and a number of unillustrated subscriber's wire terminals, computer networks, local area networks and so on.
The public line network INW is also connected to an access server AS of an Internet service provider, and the access server AS is connected to a contents server TS owned by the Internet service provider.
This contents server TS provides contents such as a simple home page as a file in a compact hyper text markup language (HTML) format in response to a request from the subscriber's wire terminals, the personal digital assistants MS1 and MS2 and the digital portable telephones with a camera MS3 and MS4.
Incidentally, the Internet ITN is connected to a number of WWW servers WS1 to WSn, so that the WWW servers WS1 to WSn can be accessed from the subscriber's wire terminals, the personal digital assistants MS1 and MS2 and the digital portable telephones with a camera MS3 and MS4 in the TCP/IP protocol.
In this connection, when the personal digital assistants MS1 and MS2 and the digital portable telephones with a camera MS3 and MS4 perform communication, the communication between the devices MS1 to MS4 and the base stations CS1 to CS4 is performed in a 2 Mbps simple transport protocol, and the communication between the base stations CS1 to CS4 and the WWW servers WS1 to WSn via the Internet ITN is performed in the TCP/IP protocol.
Moreover, a management control unit MCU is connected to the subscriber's wire terminals, the personal digital assistants MS1 and MS2 and the digital portable telephones with a camera MS3 and MS4 via the public line network INW so as to perform identification and accounting processes for the subscriber's wire terminals, the personal digital assistants MS1 and MS2 and the digital portable telephones with a camera MS3 and MS4.
(2-2) Exterior Features of a Digital Portable Telephone with a Camera
Next, the exterior features of the digital portable telephone with a camera MS3 to which the present invention is applied will be described. As shown in FIG. 18, the digital portable telephone with a camera MS3 is divided into a display division 212 and a main body 213 bordered by a hinge division 211 at the center, and is formed so that it can be folded at the hinge division 211.
The display division 212 has an antenna 214 for transmission and reception mounted at the upper left so that it can be drawn out or retracted, and radio waves can be sent to and received from the base station CS3 via the antenna 214.
Thedisplay division212 also has acamera division215 mounted at the upper-center which is rotatable within an angular range of approximately 180°, so that a desired subject to be photographed can be photographed by theCCD camera216 of thecamera division215.
Here, in the case where thecamera division215 is rotated and positioned by approximately 180° by the user, aspeaker217 of thedisplay division212 provided at the center of the backside as shown inFIG. 19 is positioned in front so as to switch to an ordinary audio communication state.
Furthermore, thedisplay division212 has aliquid crystal display218 provided in front, which can display a radio wave receiving state, a remaining battery level, contact names, telephone numbers, an outgoing history and so on registered as a telephone book, and besides, contents of electronic mail, a simple home page, an image photographed by theCCD camera216 of thecamera division215.
On the other hand, the main body 213 has control keys 219 on its surface, including numeric keys "0" to "9," a call key, a redial key, a clearing and power key, a clear key and an electronic mail key, so that various instructions can be inputted by using the control keys 219.
Themain body213 also has amemo button220 and amicrophone221 under thecontrol keys219, so that voice of the other party can be recorded with thememo button220 and the user's voice during a call is collected with themicrophone221.
In addition, the main body 213 has a rotatable jog dial 222 above the control keys 219, protruding slightly from the main body 213; by rotating the jog dial 222, the user can perform various operations including scrolling the telephone book list and electronic mail displayed on the liquid crystal display 218, turning pages of the simple home page and forwarding images.
For instance, if the user selects a desired telephone number out of the plurality of telephone numbers on the telephone book list displayed on the liquid crystal display 218 by rotating the jog dial 222 and then pushes the jog dial 222 into the main body 213, the main body 213 confirms the selected telephone number and automatically places a call to it.
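Purely as an illustration (the patent gives no code for this), the jog dial behaviour described above can be pictured as the following event loop, in which dial() stands in for whatever call-setup routine the handset actually uses.

```python
# Illustrative sketch only: the jog-dial behaviour described above, where
# rotation scrolls the telephone book list and a push confirms the selected
# entry and places the call. dial() is a hypothetical call-setup routine.
def handle_jog_dial(events, phone_book, dial):
    """events: iterable of 'up', 'down' or 'push' jog-dial events.
    phone_book: list of (name, number) tuples shown on the display."""
    index = 0
    for event in events:
        if event == "up":
            index = max(index - 1, 0)
        elif event == "down":
            index = min(index + 1, len(phone_book) - 1)
        elif event == "push":
            name, number = phone_book[index]
            dial(number)                      # automatically call the selected number
            return name, number
    return None

# handle_jog_dial(["down", "down", "push"],
#                 [("Suzuki", "090-xxxx"), ("Sato", "080-xxxx"), ("Tanaka", "070-xxxx")],
#                 dial=lambda number: print("calling", number))
```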
Moreover, themain body213 has a battery pack placed on its backside (not shown), and if the clearing and power key is put in an ON state, power is supplied from the battery pack to each circuit division and it starts up in an operable state.
Incidentally, the main body 213 has a memory stick slot 224 for inserting a detachable Memory Stick (trademark of Sony Corp.) 223 at the upper part of the left side so that, if the memo button 220 is pressed, it can record the voice of the other party on a call, as well as electronic mail, simple home pages and images photographed by the CCD camera 216, on the Memory Stick 223 by the user's operation.
Here, the Memory Stick 223 is a kind of flash memory card developed by Sony Corp., the applicant hereof. This Memory Stick 223 is a small and slim plastic case, 21.5 mm high × 50 mm wide × 2.8 mm thick, containing a flash memory element that is a kind of electrically erasable and programmable read only memory (EEPROM), namely a nonvolatile memory capable of being electrically rewritten and erased, and it allows various data such as images, audio and music to be written and read via a 10-pin terminal.
In addition, the Memory Stick 223 adopts a unique serial protocol capable of maintaining compatibility with apparatuses already in use even if the built-in flash memory changes in specification, such as an enlarged capacity; it thus achieves a maximum writing speed of 1.5 MB/s and a maximum reading speed of 2.45 MB/s, and also secures high reliability by providing an erroneous erasure prevention switch.
As described above, the digital portable telephone with a camera MS3 can insert such aMemory Stick223 therein, so that it can share data with other electronic apparatuses via theMemory Stick223.
(2-3) Circuit Configuration of the Digital Portable Telephone with a Camera
As shown inFIG. 20, the digital portable telephone with a camera MS3 has themain control division250 for centrally controlling thedisplay division212 and themain body213 connected to apower circuit division251, an operationinput control division252, animage encoder253, acamera interface division254, a liquid crystal display (LCD)control division255, animage decoder256, ademultiplexing division257, a recording and reproducingdivision262, a modulator anddemodulator circuit division258 and anaudio CODEC259 with amain bus260, and also to theimage encoder253, theimage decoder256, thedemultiplexing division257, the modulator anddemodulator circuit division258 and theaudio CODEC259 with asynchronous bus261.
When the clearing and power key is turned on by the user, thepower circuit division251 starts up the digital portable telephone with a camera MS3 in an operable state by providing power from the battery pack to each division.
Under the control of themain control division250 comprised of the CPU, ROM, RAM and so on, the digital portable telephone with a camera MS3 converts an audio signal collected with themicrophone221 in an audio communication mode, into digital audio data through theaudio CODEC259 and performs a spread spectrum process on it at the modulator anddemodulator circuit division258, and performs digital-to-analog conversion and frequency conversion processes at the sending and receivingcircuit division263, and then sends it via theantenna214.
In addition, the digital portable telephone with a camera MS3 amplifies a received signal received by theantenna214 in the audio communication mode and performs frequency conversion and analog-to-digital conversion processes, performs a de-spread spectrum process at the modulator anddemodulator circuit division258 and converts it into an analog audio signal by theaudio CODEC259, and then outputs it via thespeaker217.
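The transmit and receive chains just described can be summarized schematically as below; this is a hedged sketch in which the codec, modem and rf stages are placeholder objects, not a model of the actual circuitry.

```python
# Schematic sketch only: the transmit and receive chains described above,
# written as plain function composition. Each stage mirrors a block in
# FIG. 20, but the codec, modem and rf objects are placeholders, not models
# of the actual circuitry.
def transmit_audio(analog_in, codec, modem, rf):
    data = codec.encode(analog_in)        # audio CODEC 259: analog signal -> digital audio data
    spread = modem.spread_spectrum(data)  # modulator and demodulator circuit division 258
    return rf.send(spread)                # D/A conversion, frequency conversion, antenna 214

def receive_audio(rf_in, rf, modem, codec):
    baseband = rf.receive(rf_in)          # amplification, frequency conversion, A/D conversion
    data = modem.despread(baseband)       # de-spread spectrum process
    return codec.decode(data)             # digital audio data -> analog signal for speaker 217
```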
Furthermore, in the case of sending an electronic mail in a data communication mode, the digital portable telephone with a camera MS3 sends text data of the electronic mail inputted by operating thecontrol keys219 and thejog dial222 to themain control division250 via the operationinput control division252.
Themain control division250 performs the spread spectrum process on the text data at the modulator anddemodulator circuit division258 and performs digital-to-analog conversion and frequency conversion processes on it at the sending and receivingcircuit division263, and then sends it to the base station CS3 (FIG. 17) via theantenna214.
On the other hand, in the case of receiving an electronic mail in the data communication mode, the digital portable telephone with a camera MS3 performs a de-spread spectrum process on the received signal, received from the base station CS3 via theantenna214, at the modulator anddemodulator circuit division258 to restore the original text data, and then displays it as the electronic mail on theliquid crystal display218 via theLCD control division255.
It is also possible thereafter for the digital portable telephone with a camera MS3 to record the received electronic mail on theMemory Stick223 via the recording and reproducingdivision262 by the user's operation.
On the other hand, in the case of sending image data in the data communication mode, the digital portable telephone with a camera MS3 supplies the image data photographed by theCCD camera216 to theimage encoder253 via thecamera interface division254.
In this connection, in the case of not sending the image data, it is also possible for the digital portable telephone with a camera MS3 to directly display the image data photographed by theCCD camera216 on theliquid crystal display218 via thecamera interface division254 and theLCD control division255.
Theimage encoder253 converts the image data supplied from theCCD camera216 into coded image data by compression-coding it by a predetermined encoding method such as the moving picture experts group (MPEG) 2 or MPEG4, and sends the resultant to thedemultiplexing division257.
At this time, the digital portable telephone with a camera MS3 simultaneously sends the audio collected with themicrophone221 during photographing by theCCD camera216 as digital audio data to thedemultiplexing division257 via theaudio CODEC259.
Thedemultiplexing division257 multiplexes the coded image data supplied from theimage encoder253 and the audio data supplied from theaudio CODEC259 by a predetermined method, and performs the spread spectrum process on the resultant multiplexed data at the modulator anddemodulator circuit division258 and performs digital-to-analog conversion and frequency conversion processes on it at the sending and receivingcircuit division263, and then sends the resultant via theantenna214.
On the other hand, in the case of receiving image data such as a simple home page in the data communication mode, the digital portable telephone with a camera MS3 performs the de-spread spectrum process on the received signal received from the base station CS3 via theantenna214 at the modulator anddemodulator circuit division258 and sends the resultant multiplexed data to thedemultiplexing division257.
Thedemultiplexing division257 demultiplexes the multiplexed data to divide it into coded image data and audio data, and supplies the coded image data to theimage decoder256 and also supplies the audio data to theaudio CODEC259, through thesynchronous bus261.
Theimage decoder256 generates reproduction image data by decoding the coded image data by the decoding method corresponding to the predetermined encoding method such as MPEG2 or MPEG4, and displays it as, for instance, an image linked to the simple home page on theliquid crystal display218 via theLCD control division255.
At this time, theaudio CODEC259 converts the audio data into analog audio data, and then outputs it as, for instance, sounds linked to the simple home page via thespeaker217.
Also in this case, just as in the case of the electronic mail, the digital portable telephone with a camera MS3 can record the received image data of the simple home page on theMemory Stick223 via the recording and reproducingdivision262 by the user's operation.
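To illustrate the multiplexing and demultiplexing roles of the demultiplexing division 257 described above, here is a toy sketch using a simple length-prefixed framing; the actual container format is not specified in the patent, so this framing is an assumption made only for illustration.

```python
# Toy sketch only: the multiplexing and demultiplexing roles of the
# demultiplexing division 257, using a trivial length-prefixed framing.
# The patent does not specify the real container format; this framing is an
# assumption made purely for illustration.
import struct

def multiplex(coded_image: bytes, audio: bytes) -> bytes:
    # send side: combine the coded image data and the audio data into one payload
    return struct.pack(">II", len(coded_image), len(audio)) + coded_image + audio

def demultiplex(payload: bytes):
    # receive side: split the payload back into coded image data and audio data
    image_len, audio_len = struct.unpack(">II", payload[:8])
    coded_image = payload[8:8 + image_len]                    # to the image decoder 256
    audio = payload[8 + image_len:8 + image_len + audio_len]  # to the audio CODEC 259
    return coded_image, audio

# coded, aud = demultiplex(multiplex(b"mpeg4-frame", b"audio-chunk"))
```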
In addition to such configuration, the digital portable telephone with a camera MS3 has the cyber gesture program180 (FIG. 15) and the jogdial server program182 which are the same as those in the first embodiment, in the ROM of themain control division250, and it can overlap and display the gesture recognition screen100 (FIG. 6) on the active window screen by thecyber gesture program180 and also display an image of the user photographed by theCCD camera216 in the gesturerecognition display area106 on thegesture recognition screen100 while displaying the active window screen based on thepredetermined application software183 on theliquid crystal display218.
Next, just as in the first embodiment and as shown inFIGS. 5 to 16, the digital portable telephone with a camera MS3 detects the skin-color area R in the user's image displayed in the gesturerecognition display area106 on thegesture recognition screen100 under the control of themain control division250, and recognizes the moving skin-color area R therein as the palm area, and then it supplies a predetermined command corresponding to the gesture movement of the palm area to theapplication software183 by the jogdial server program182.
Thus, just like thenotebook PC1 in the first embodiment, the digital portable telephone with a camera MS3 can forward or play back still pictures on the active window screen displayed as the background on thegesture recognition screen100 by theapplication software183 under the control of themain control division250, in response to the command.
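The skin-colour based recognition referred to above can be pictured, very roughly, as in the following sketch. It is a simplified illustration under assumed parameters (the HSV skin range, the displacement threshold and the command names), not the algorithm actually disclosed.

```python
# Rough sketch only, not the disclosed algorithm: detect the skin-colour
# area R in each frame, track the horizontal movement of its centroid, and
# map a sufficiently large displacement to a command for the application
# software. The HSV skin range, the displacement threshold and the command
# names are illustrative assumptions.
import cv2

SKIN_LOW, SKIN_HIGH = (0, 40, 60), (25, 180, 255)    # assumed HSV skin-colour range

def palm_centroid_x(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)      # candidate skin-colour area R
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None                                   # no hand found: remain in the search state
    return m["m10"] / m["m00"]                        # horizontal centroid of the palm area

def gesture_command(prev_x, cur_x, min_shift=40):
    if prev_x is None or cur_x is None:
        return None
    if cur_x - prev_x > min_shift:
        return "FORWARD_IMAGE"        # rightward hand movement
    if prev_x - cur_x > min_shift:
        return "BACKWARD_IMAGE"       # leftward hand movement
    return None
```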
(2-4) Operations and Effects in the Second Embodiment
In the above configuration, themain control division250 starts thecyber gesture program180, and the digital portable telephone with a camera MS3 displays thetarget division107 comprised of the fivesquare targets107A to107E arranged in a horizontal line approximately at the center of the gesturerecognition display area106 on thegesture recognition screen100.
At this time, just as in the first embodiment, in the case where the digital portable telephone with a camera MS3 cannot recognize the right and left movement of the user's hand, it produces a gradation effect by alternately coloring the areas inside the frames 107AF to 107EF of the targets 107A to 107E in red in sequence in the left and right directions indicated by the arrows A and B so as to certainly notify the user that it is searching for the user's hand.
In this way, the digital portable telephone with a camera MS3 can prompt the user to perform an input operation for recognizing the movement of his/her hand, so that it can recognize the movement of the user's hand in a short time, and can generate a command according to the recognized movement of the hand, and then forward or play back the still pictures displayed on the active window screen in response to the command.
According to the above configuration, in the case where the digital portable telephone with a camera MS3 does not recognize the movement of the user's hand photographed by theCCD camera216, it can certainly notify the user that it is searching for the user's hand and is in a recognizable standby state by alternately coloring the areas inside the frames107AF to107EF of thetargets107A to107E in red in sequence in the left and right directions indicated by the arrows A and B.
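A minimal sketch of that standby animation, assuming hypothetical drawing and detection callbacks, might look like the following; it merely conveys the sweeping red colouring of the target frames described above.

```python
# Minimal sketch only: the standby ("searching") animation described above.
# While no moving hand is recognized, the interiors of the frames 107AF to
# 107EF are coloured red one after another, sweeping right and then left.
# colour_target() and hand_found() are hypothetical callbacks.
import itertools
import time

def search_state_animation(colour_target, hand_found, frame_delay=0.1):
    """colour_target(i): colour the i-th target's frame interior red.
    hand_found(): returns True once a moving skin-colour area is detected."""
    sweep = [0, 1, 2, 3, 4, 3, 2, 1]           # left to right, then back (arrows A and B)
    for i in itertools.cycle(sweep):
        if hand_found():
            return                             # leave the standby state and start recognizing
        colour_target(i)
        time.sleep(frame_delay)
```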
Further, in the above configuration, themain control division250 starts thecyber gesture program180, and the digital portable telephone with a camera MS3 displays thetarget division107 comprised of the fivesquare targets107A to107E arranged in a horizontal line approximately at the center of the gesturerecognition display area106 on thegesture recognition screen100.
Therefore, just as in the first embodiment, the digital portable telephone with a camera MS3 can make the user visualize that the digital portable telephone with a camera MS3 can recognize the right-and-left movement of the user's hand, so that the user can certainly recognize the recognizable movement direction in advance.
In addition, the digital portable telephone with a camera MS3 can color the frames107AF to107EF of thetargets107A to107E in red so as to make thetargets107A to107E more visually remarkable against the gray scale display as the background.
According to the above configuration, the digital portable telephone with a camera MS3 can certainly notify the user in advance that the digital portable telephone with a camera MS3 can recognize the right-and-left movement of the user's hand by displaying thetarget division107 comprised of the fivesquare targets107A to107E sequentially arranged in a horizontal line approximately at the center of the gesturerecognition display area106 on thegesture recognition screen100, by thecyber gesture program180.
Furthermore, in the above configuration, themain control division250 starts thecyber gesture program180, and the digital portable telephone with a camera MS3 displays thetarget division107 comprised of the fivesquare targets107A to107E arranged in a horizontal line approximately at the center of the gesturerecognition display area106 on thegesture recognition screen100.
Then, just as in the first embodiment, the digital portable telephone with a camera MS3 can recognize the right-and-left movement of the user's hand and also allow the user to check the trail and the recognition result of the movement of the hand by moving and displaying the pointer 108 on the targets 107A to 107E in sequence according to the movement of the hand.
Thus, the digital portable telephone with a camera MS3 can make the user learn, through feedback on the trail and the recognition result of the recognized hand movement, how to move his/her hand so that it can be easily recognized by the digital portable telephone with a camera MS3; as a result, the time until a command is inputted by the user can be shortened.
According to the above configuration, the digital portable telephone with a camera MS3 can display the trail and the recognition result of the movement of the user's hand, obtained by recognizing the movement of the hand photographed by the CCD camera 216, as an animation using the targets 107A to 107E and the pointer 108, so as to make the user learn the trail and the recognition result of the movement of the hand by feedback.
(3) Other Embodiments
Moreover, while the above first and second embodiments described the cases where, when thenotebook PC1 and the digital portable telephone with a camera MS3 do not recognize the movement of the user's hand in the left and right directions, they notify the user that they do not recognize the movement of the hand by alternately coloring the areas inside the frames107AF to107EF of thetargets107A to107E in red and in sequence in the left and right directions indicated by the arrows A and B on the gesture recognition screen100 (FIG. 11) in the search state as a standby state image, the present invention is not limited thereto and it is also feasible to blink thetargets107A to107E in sequence in the left and right directions indicated by the arrows A and B or to directly display “Cannot recognize the movement of your hand” on theliquid crystal display10.
Moreover, while the above first and second embodiments described the cases where thenotebook PC1 and the digital portable telephone with a camera MS3 visually notify the user that they can recognize the right-and-left movement of the user's hand on thegesture recognition screen100 as a recognizable movement direction image picture, the present invention is not limited thereto and it is also feasible to visually notify the user that the up-and-down movement can be recognized by the gesture recognition screen including thetarget division107 arranged in a vertical line.
In addition, while the above first and second embodiments described the cases where thesquare targets107A to107E are used as marks of a predetermined shape displayed on thegesture recognition screen100, the present invention is not limited thereto and it is also feasible to use various other shapes of targets such as circles, or arbitrary animation images.
In addition, while the above first and second embodiments described the cases where the user's movement is recognized as a recognition subject, the present invention is not limited thereto and it is also feasible to recognize movement of various recognition subjects such as a robot and an animal other than the user.
Furthermore, while the above first and second embodiments described the cases where theCPUs50 and250 as movement direction recognizing means and control means notify the user that they are in a recognizable standby state using thegesture recognition screen100 based on thecyber gesture program180 stored in advance on the hard disk of theHDD67 or the ROM, the present invention is not limited thereto and it is also feasible, by inserting a program storage medium storing thecyber gesture program180 into thenotebook PC1 and the digital portable telephone with a camera MS3, to notify the user that they are in a recognizable standby state.
Furthermore, while the above first and second embodiments described the cases where theCPUs50 and250 as movement direction recognizing means and control means notify the user of the recognizable movement direction in advance by displaying thegesture recognition screen100 based on thecyber gesture program180 stored in advance in the hard disk of theHDD67 or the ROM, the present invention is not limited thereto and it is also feasible to display the above-mentionedgesture recognition screen100 by inserting a program storage medium storing thecyber gesture program180 into thenotebook PC1 and the digital portable telephone with a camera MS3.
Furthermore, while the above first and second embodiments described the cases where the CPUs 50 and 250 as movement direction recognizing means and control means notify the user of the trail and the recognition result of the movement of the user's hand by displaying an animation in which the pointer 108 is moved and overlapped on the targets 107A to 107E according to the movement of his/her hand by the cyber gesture program 180 stored in advance in the hard disk of the HDD 67 or the ROM, the present invention is not limited thereto and it is also feasible to display an animation to notify the user of the movement and the recognition result of the user's hand by inserting a program storage medium storing the cyber gesture program 180 into the notebook PC 1 and the digital portable telephone with a camera MS3.
As the program storage medium used for installing the cyber gesture program 180 that performs the above-mentioned series of processes in the notebook PC 1 and the digital portable telephone with a camera MS3 and for executing it therein, not only package media such as a floppy disk, a compact disc-read only memory (CD-ROM) and a digital versatile disc (DVD) but also a semiconductor memory and a magnetic disk that store the cyber gesture program 180 temporarily or permanently can be used. In addition, as means for storing the cyber gesture program 180 on these program storage media, wire and radio communication media such as a local area network, the Internet and digital satellite broadcasting can be utilized, and the program can also be stored via various other communication interfaces such as a router or a modem.
Moreover, while the above first and second embodiments described the cases where the information processing apparatus of the present invention is applied to the notebook PC 1 and the digital portable telephone with a camera MS3, the present invention can also be applied to various other information processing apparatuses such as the personal digital assistants MS1 and MS2.
As mentioned above, the present invention allows, based on an image obtained by photographing a recognition subject by an image pickup means, to recognize the movement direction of the recognition subject and, when generating a predetermined command corresponding to the recognized movement direction of the recognition subject, to create, while the movement direction of the recognition subject is not being recognized, a predetermined standby state image indicating that the apparatus is searching for the recognition subject in the image, and to display it on a predetermined display means so as to certainly notify the user that the movement direction of the recognition subject is recognizable and the apparatus is in a standby state.
Further, as mentioned above, the present invention allows, before recognizing the movement direction of a recognition subject based on an image obtained by photographing the recognition subject by an image pickup means, to generate a predetermined recognizable movement direction image picture for making the user imagine a recognizable movement direction in advance and to display it on a predetermined display means so as to notify the user of the recognizable movement direction in advance using the recognizable movement direction image picture.
Furthermore, as mentioned above, the present invention allows, based on an image obtained by photographing a recognition subject by an image pickup means, to recognize the movement direction of the recognition subject in the image and generate a recognizing process image representing the trail of the movement direction of the recognized recognition subject and then display it on predetermined display means so as to make the user learn on how the movement of the recognition subject was recognized, by feedback.
While the invention has been described in connection with the preferred embodiments thereof, it will be obvious to those skilled in the art that various changes and modifications may be made therein; it is aimed, therefore, to cover in the appended claims all such changes and modifications as fall within the true spirit and scope of the invention.

Claims (7)

4. A method of showing a recognizable movement comprising:
a movement trail recognizing step of, based on an image obtained by photographing a recognition subject by an image pickup unit, recognizing the movement trail of said recognition subject;
an image generation processing step of, generating a predetermined recognizable movement trail image picture in advance for making a user visualize said recognizable movement direction which can be recognized by said movement direction recognizing processing step;
a display processing step of displaying said recognizable movement direction image picture generated by said image generation processing step on predetermined display means; and
transforming said movement trail recognized by said movement trail recognizing step to correspond with said displayed predetermined recognizable movement direction.
5. A program storage medium causing an information processing apparatus to execute a program, wherein said program comprises:
a movement trail recognizing processing step of, based on an image obtained by photographing a recognition subject by an image pickup unit, recognizing the movement trail of said recognition subject;
an image generation processing step of, generating a predetermined recognizable movement direction image picture in advance for making a user visualize said movement recognizable direction which can be recognized by said movement direction recognizing processing step; and
a display processing step of displaying said recognizable movement direction image picture generated by said image generation processing step on a predetermined display unit; and
transforming said movement trail recognized by said movement trail recognizing step to correspond with said displayed predetermined recognizable movement direction.
US09/838,6442000-04-212001-04-19Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program storage mediumExpired - LifetimeUS7046232B2 (en)

Applications Claiming Priority (6)

Application NumberPriority DateFiling DateTitle
JP2000126344AJP2001306049A (en)2000-04-212000-04-21Information processor, movement recognition process displaying method and program storage medium
JP2000-1263422000-04-21
JP2000-1263432000-04-21
JP2000126343AJP4415227B2 (en)2000-04-212000-04-21 Information processing apparatus, information processing method, and recording medium
JP2000-1263442000-04-21
JP2000126342AJP2001307108A (en)2000-04-212000-04-21Information processor, method for displaying operation recognizing waiting state and program storage medium

Publications (2)

Publication NumberPublication Date
US20020006222A1 US20020006222A1 (en)2002-01-17
US7046232B2true US7046232B2 (en)2006-05-16

Family

ID=27343207

Family Applications (1)

Application NumberTitlePriority DateFiling Date
US09/838,644Expired - LifetimeUS7046232B2 (en)2000-04-212001-04-19Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program storage medium

Country Status (4)

CountryLink
US (1)US7046232B2 (en)
EP (1)EP1148411A3 (en)
KR (1)KR100843811B1 (en)
CN (1)CN100487633C (en)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20040258292A1 (en)*2003-06-192004-12-23Hiroyuki MatsunoImaging method and apparatus
US20050116911A1 (en)*2003-10-072005-06-02Tomohiro MukaiInformation display
US20060267925A1 (en)*2005-05-302006-11-30Renesas Technology Corp.Liquid crystal display drive and control device, morbile terminal system, and data processing system
US20070124694A1 (en)*2003-09-302007-05-31Koninklijke Philips Electronics N.V.Gesture to define location, size, and/or content of content window on a display
US20080089587A1 (en)*2006-10-112008-04-17Samsung Electronics Co.; LtdHand gesture recognition input system and method for a mobile phone
US20080155481A1 (en)*2006-12-012008-06-26Samsung Electronics Co., Ltd.Idle screen arrangement structure and idle screen display method for mobile terminal
US20090172606A1 (en)*2007-12-312009-07-02Motorola, Inc.Method and apparatus for two-handed computer user interface with gesture recognition
US20100045469A1 (en)*2006-08-072010-02-25Koninklijke Philips Electronics N.V.Method and apparatus for monitoring user activity at a computer screen to stimulate motility
US20100295783A1 (en)*2009-05-212010-11-25Edge3 Technologies LlcGesture recognition systems and related methods
US20100306713A1 (en)*2009-05-292010-12-02Microsoft CorporationGesture Tool
US20110134064A1 (en)*2001-11-022011-06-09Neonode, Inc.On a substrate formed or resting display arrangement
US20120094723A1 (en)*2002-12-102012-04-19Neonode, Inc.User interface
US20120194692A1 (en)*2011-01-312012-08-02Hand Held Products, Inc.Terminal operative for display of electronic record
US8396252B2 (en)2010-05-202013-03-12Edge 3 TechnologiesSystems and related methods for three dimensional gesture recognition in vehicles
US8467599B2 (en)2010-09-022013-06-18Edge 3 Technologies, Inc.Method and apparatus for confusion learning
US8582866B2 (en)2011-02-102013-11-12Edge 3 Technologies, Inc.Method and apparatus for disparity computation in stereo images
US8620024B2 (en)2010-09-172013-12-31Sony CorporationSystem and method for dynamic gesture recognition using geometric classification
US8643628B1 (en)2012-10-142014-02-04Neonode Inc.Light-based proximity detection system and user interface
US8655093B2 (en)2010-09-022014-02-18Edge 3 Technologies, Inc.Method and apparatus for performing segmentation of an image
US8666144B2 (en)2010-09-022014-03-04Edge 3 Technologies, Inc.Method and apparatus for determining disparity of texture
US8705877B1 (en)2011-11-112014-04-22Edge 3 Technologies, Inc.Method and apparatus for fast computational stereo
US8775023B2 (en)2009-02-152014-07-08Neanode Inc.Light-based touch controls on a steering wheel and dashboard
US8917239B2 (en)2012-10-142014-12-23Neonode Inc.Removable protective cover with embedded proximity sensors
US8970589B2 (en)2011-02-102015-03-03Edge 3 Technologies, Inc.Near-touch interaction with a stereo camera grid structured tessellations
US9092093B2 (en)2012-11-272015-07-28Neonode Inc.Steering wheel user interface
US9134800B2 (en)2010-07-202015-09-15Panasonic Intellectual Property Corporation Of AmericaGesture input device and gesture input method
US9164625B2 (en)2012-10-142015-10-20Neonode Inc.Proximity sensor for determining two-dimensional coordinates of a proximal object
US9213890B2 (en)2010-09-172015-12-15Sony CorporationGesture recognition system for TV control
US9336456B2 (en)2012-01-252016-05-10Bruno DeleanSystems, methods and computer program products for identifying objects in video data
US9741184B2 (en)2012-10-142017-08-22Neonode Inc.Door handle with optical proximity sensors
US9921661B2 (en)2012-10-142018-03-20Neonode Inc.Optical proximity sensor and associated user interface
US10282034B2 (en)2012-10-142019-05-07Neonode Inc.Touch sensitive curved and flexible displays
US10324565B2 (en)2013-05-302019-06-18Neonode Inc.Optical proximity sensor
US10585530B2 (en)2014-09-232020-03-10Neonode Inc.Optical proximity sensor
US10721448B2 (en)2013-03-152020-07-21Edge 3 Technologies, Inc.Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
US11153472B2 (en)2005-10-172021-10-19Cutting Edge Vision, LLCAutomatic upload of pictures from a camera
US11429230B2 (en)2018-11-282022-08-30Neonode IncMotorist user interface sensor
US11842014B2 (en)2019-12-312023-12-12Neonode Inc.Contactless touch input system
US12032817B2 (en)2012-11-272024-07-09Neonode Inc.Vehicle user interface

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US6901561B1 (en)*1999-10-192005-05-31International Business Machines CorporationApparatus and method for using a target based computer vision system for user interaction
JP3948986B2 (en)*2002-03-142007-07-25三洋電機株式会社 Captured image display device and captured image display method
US8225224B1 (en)*2003-02-252012-07-17Microsoft CorporationComputer desktop use via scaling of displayed objects with shifts to the periphery
JP4048435B2 (en)2003-10-232008-02-20ソニー株式会社 Electronics
US7646896B2 (en)*2005-08-022010-01-12A4VisionApparatus and method for performing enrollment of user biometric information
WO2006031143A1 (en)*2004-08-122006-03-23A4 Vision S.A.Device for contactlessly controlling the surface profile of objects
WO2006031147A1 (en)*2004-08-122006-03-23A4 Vision S.A.Device for biometrically controlling a face surface
US7583819B2 (en)2004-11-052009-09-01Kyprianos PapademetriouDigital signal processing methods, systems and computer program products that identify threshold positions and values
US20070047774A1 (en)*2005-08-312007-03-01Artiom YukhinMethod and apparatus for performing enrollment of 2D and 3D face biometrics
US9317124B2 (en)*2006-09-282016-04-19Nokia Technologies OyCommand input by hand gestures captured from camera
KR101304461B1 (en)*2006-12-042013-09-04삼성전자주식회사Method and apparatus of gesture-based user interface
US8726194B2 (en)*2007-07-272014-05-13Qualcomm IncorporatedItem selection using enhanced control
US8555207B2 (en)2008-02-272013-10-08Qualcomm IncorporatedEnhanced input using recognized gestures
US9772689B2 (en)*2008-03-042017-09-26Qualcomm IncorporatedEnhanced gesture-based image manipulation
US8514251B2 (en)*2008-06-232013-08-20Qualcomm IncorporatedEnhanced character input using recognized gestures
JP5024632B2 (en)*2008-09-192012-09-12ソニー株式会社 Image processing apparatus and method, and program
FR2936546B1 (en)*2008-10-012017-03-10Valeo Securite Habitacle DEVICE FOR AUTOMATICALLY UNLOCKING AN AUTOMATIC VEHICLE OPENING.
JP5500812B2 (en)*2008-10-202014-05-21株式会社ソニー・コンピュータエンタテインメント Captured image storage control device, captured image storage control method, captured image storage control program, and storage medium storing captured image storage control program
EP2441254A4 (en)*2009-04-162016-12-21Hewlett Packard Development CoCommunicating visual representations in virtual collaboration systems
US8719714B2 (en)2009-07-082014-05-06Steelseries ApsApparatus and method for managing operations of accessories
TW201106200A (en)*2009-08-122011-02-16Inventec Appliances CorpElectronic device, operating method thereof, and computer program product thereof
US8988190B2 (en)*2009-09-032015-03-24Dell Products, LpGesture based electronic latch for laptop computers
US8843857B2 (en)*2009-11-192014-09-23Microsoft CorporationDistance scalable no touch computing
WO2012142323A1 (en)*2011-04-122012-10-18Captimo, Inc.Method and system for gesture based searching
US20110158605A1 (en)*2009-12-182011-06-30Bliss John StuartMethod and system for associating an object to a moment in time in a digital video
WO2011075740A2 (en)*2009-12-182011-06-23Blipsnips, Inc.Method and system for associating an object to a moment in time in a digital video
JP2011253292A (en)*2010-06-012011-12-15Sony CorpInformation processing system, method and program
US9195345B2 (en)2010-10-282015-11-24Microsoft Technology Licensing, LlcPosition aware gestures with visual feedback as input method
KR102086495B1 (en)*2011-05-272020-03-10엘지디스플레이 주식회사Method and device of recognizing user's movement, and electric-using apparatus using the device
EP2817693B1 (en)*2012-02-242023-04-05Moscarillo, Thomas, J.Gesture recognition device
US9575652B2 (en)2012-03-312017-02-21Microsoft Technology Licensing, LlcInstantiable gesture objects
US9619036B2 (en)*2012-05-112017-04-11Comcast Cable Communications, LlcSystem and methods for controlling a user experience
US9423874B2 (en)2013-03-152016-08-23Steelseries ApsGaming accessory with sensory feedback device
US9687730B2 (en)*2013-03-152017-06-27Steelseries ApsGaming device with independent gesture-sensitive areas
CN104460959B (en)*2013-09-182018-03-27联想(北京)有限公司Information processing method and electronic equipment
CN104914982B (en)*2014-03-122017-12-26联想(北京)有限公司 Method and device for controlling electronic equipment
KR102171817B1 (en)*2014-03-142020-10-29삼성전자주식회사Display apparatus and method for controlling display apparatus thereof
US10146318B2 (en)2014-06-132018-12-04Thomas MalzbenderTechniques for using gesture recognition to effectuate character selection
US10409443B2 (en)*2015-06-242019-09-10Microsoft Technology Licensing, LlcContextual cursor display based on hand tracking
CN109558000B (en)*2017-09-262021-01-22京东方科技集团股份有限公司Man-machine interaction method and electronic equipment
CN112861641B (en)*2021-01-152022-05-20复旦大学 A Dynamic Gesture Recognition Hardware Accelerator for Human-Computer Interaction

Citations (10)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5018082A (en)*1987-05-251991-05-21Fujitsu LimitedGuidance message display timing control using time intervals
US5214414A (en)*1991-04-121993-05-25International Business Machines Corp.Cursor for lcd displays
US5864334A (en)1997-06-271999-01-26Compaq Computer CorporationComputer keyboard with switchable typing/cursor control modes
US6009210A (en)*1997-03-051999-12-28Digital Equipment CorporationHands-free interface to a virtual reality environment using head tracking
US6043805A (en)1998-03-242000-03-28Hsieh; Kuan-HongControlling method for inputting messages to a computer
WO2000021023A1 (en)1998-10-072000-04-13Intel CorporationControlling a pointer using digital video
US6222538B1 (en)*1998-02-272001-04-24Flashpoint Technology, Inc.Directing image capture sequences in a digital imaging device using scripts
US6388665B1 (en)*1994-07-082002-05-14Microsoft CorporationSoftware platform having a real world interface with animated characters
US6388181B2 (en)*1999-12-062002-05-14Michael K. MoeComputer graphic animation, live video interactive method for playing keyboard music
US6392675B1 (en)*1999-02-242002-05-21International Business Machines CorporationVariable speed cursor movement

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
JP3355708B2 (en)1993-06-292002-12-09カシオ計算機株式会社 Command processing device
JPH08315118A (en)1995-05-191996-11-29Sony CorpMan-machine interface
US6144366A (en)*1996-10-182000-11-07Kabushiki Kaisha ToshibaMethod and apparatus for generating information input using reflected light image of target object

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5018082A (en)*1987-05-251991-05-21Fujitsu LimitedGuidance message display timing control using time intervals
US5214414A (en)*1991-04-121993-05-25International Business Machines Corp.Cursor for lcd displays
US6388665B1 (en)*1994-07-082002-05-14Microsoft CorporationSoftware platform having a real world interface with animated characters
US6009210A (en)*1997-03-051999-12-28Digital Equipment CorporationHands-free interface to a virtual reality environment using head tracking
US5864334A (en)1997-06-271999-01-26Compaq Computer CorporationComputer keyboard with switchable typing/cursor control modes
US6222538B1 (en)*1998-02-272001-04-24Flashpoint Technology, Inc.Directing image capture sequences in a digital imaging device using scripts
US6043805A (en)1998-03-242000-03-28Hsieh; Kuan-HongControlling method for inputting messages to a computer
WO2000021023A1 (en)1998-10-072000-04-13Intel CorporationControlling a pointer using digital video
US6690357B1 (en)*1998-10-072004-02-10Intel CorporationInput device using scanning sensors
US6392675B1 (en)*1999-02-242002-05-21International Business Machines CorporationVariable speed cursor movement
US6388181B2 (en)*1999-12-062002-05-14Michael K. MoeComputer graphic animation, live video interactive method for playing keyboard music

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Alberto Tomita, Jr. et al., "Extraction of a Person's Handshape for Application in a Human Interface", 2334a IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E78-A, No. 8, XP-000536050, Aug. 1995, pp. 951-956.
Byong K. KO, et al., "Finger Mouse and Gesture Recognition System As a New Human Computer Interface", Computers and Graphics, vol. 21, No. 5, XP-004101109, Sep. 10, 1997, pp. 555-561.

Cited By (101)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20110134064A1 (en)*2001-11-022011-06-09Neonode, Inc.On a substrate formed or resting display arrangement
US8692806B2 (en)2001-11-022014-04-08Neonode Inc.On a substrate formed or resting display arrangement
US8812993B2 (en)*2002-12-102014-08-19Neonode Inc.User interface
US20120094723A1 (en)*2002-12-102012-04-19Neonode, Inc.User interface
US20040258292A1 (en)*2003-06-192004-12-23Hiroyuki MatsunoImaging method and apparatus
US7606406B2 (en)*2003-06-192009-10-20Canon Kabushiki KaishaImaging method and apparatus
US20070124694A1 (en)*2003-09-302007-05-31Koninklijke Philips Electronics N.V.Gesture to define location, size, and/or content of content window on a display
US20050116911A1 (en)*2003-10-072005-06-02Tomohiro MukaiInformation display
US7385599B2 (en)*2003-10-072008-06-10Seiko Epson CorporationInformation display
US8253683B2 (en)*2005-05-302012-08-28Renesas Electronics CorporationLiquid crystal display drive and control device, mobile terminal system, and data processing system
US20060267925A1 (en)*2005-05-302006-11-30Renesas Technology Corp.Liquid crystal display drive and control device, morbile terminal system, and data processing system
US11818458B2 (en)2005-10-172023-11-14Cutting Edge Vision, LLCCamera touchpad
US11153472B2 (en)2005-10-172021-10-19Cutting Edge Vision, LLCAutomatic upload of pictures from a camera
US20100045469A1 (en)*2006-08-072010-02-25Koninklijke Philips Electronics N.V.Method and apparatus for monitoring user activity at a computer screen to stimulate motility
US8487750B2 (en)*2006-08-072013-07-16Koninklijke Philips Electronics N.V.Method and apparatus for monitoring user activity at a computer screen to stimulate motility
US20080089587A1 (en)*2006-10-112008-04-17Samsung Electronics Co.; LtdHand gesture recognition input system and method for a mobile phone
US8064704B2 (en)*2006-10-112011-11-22Samsung Electronics Co., Ltd.Hand gesture recognition input system and method for a mobile phone
US8612897B2 (en)2006-12-012013-12-17Samsung Electronics Co., LtdIdle screen arrangement structure and idle screen display method for mobile terminal
US20080155481A1 (en)*2006-12-012008-06-26Samsung Electronics Co., Ltd.Idle screen arrangement structure and idle screen display method for mobile terminal
US20090172606A1 (en)*2007-12-312009-07-02Motorola, Inc.Method and apparatus for two-handed computer user interface with gesture recognition
US10007422B2 (en)2009-02-152018-06-26Neonode Inc.Light-based controls in a toroidal steering wheel
US8918252B2 (en)2009-02-152014-12-23Neonode Inc.Light-based touch controls on a steering wheel
US8775023B2 (en)2009-02-152014-07-08Neanode Inc.Light-based touch controls on a steering wheel and dashboard
US9389710B2 (en)2009-02-152016-07-12Neonode Inc.Light-based controls on a toroidal steering wheel
US11703951B1 (en)2009-05-212023-07-18Edge 3 TechnologiesGesture recognition systems
US9417700B2 (en)2009-05-212016-08-16Edge3 TechnologiesGesture recognition systems and related methods
US20100295783A1 (en)*2009-05-212010-11-25Edge3 Technologies LlcGesture recognition systems and related methods
US12105887B1 (en)2009-05-212024-10-01Golden Edge Holding CorporationGesture recognition systems
US8856691B2 (en)2009-05-292014-10-07Microsoft CorporationGesture tool
US20100306713A1 (en)*2009-05-292010-12-02Microsoft CorporationGesture Tool
US9891716B2 (en)2010-05-202018-02-13Microsoft Technology Licensing, LlcGesture recognition in vehicles
US8625855B2 (en)2010-05-202014-01-07Edge 3 Technologies LlcThree dimensional gesture recognition in vehicles
US9152853B2 (en)2010-05-202015-10-06Edge 3Technologies, Inc.Gesture recognition in vehicles
US8396252B2 (en)2010-05-202013-03-12Edge 3 TechnologiesSystems and related methods for three dimensional gesture recognition in vehicles
US9134800B2 (en)2010-07-202015-09-15Panasonic Intellectual Property Corporation Of AmericaGesture input device and gesture input method
US8655093B2 (en)2010-09-022014-02-18Edge 3 Technologies, Inc.Method and apparatus for performing segmentation of an image
US11398037B2 (en)2010-09-022022-07-26Edge 3 TechnologiesMethod and apparatus for performing segmentation of an image
US8891859B2 (en)2010-09-022014-11-18Edge 3 Technologies, Inc.Method and apparatus for spawning specialist belief propagation networks based upon data classification
US8798358B2 (en)2010-09-022014-08-05Edge 3 Technologies, Inc.Apparatus and method for disparity map generation
US12087044B2 (en)2010-09-022024-09-10Golden Edge Holding CorporationMethod and apparatus for employing specialist belief propagation networks
US8983178B2 (en)2010-09-022015-03-17Edge 3 Technologies, Inc.Apparatus and method for performing segment-based disparity decomposition
US11967083B1 (en)2010-09-022024-04-23Golden Edge Holding CorporationMethod and apparatus for performing segmentation of an image
US8467599B2 (en)2010-09-022013-06-18Edge 3 Technologies, Inc.Method and apparatus for confusion learning
US11710299B2 (en)2010-09-022023-07-25Edge 3 TechnologiesMethod and apparatus for employing specialist belief propagation networks
US9723296B2 (en)2010-09-022017-08-01Edge 3 Technologies, Inc.Apparatus and method for determining disparity of textured regions
US8644599B2 (en)2010-09-022014-02-04Edge 3 Technologies, Inc.Method and apparatus for spawning specialist belief propagation networks
US8666144B2 (en)2010-09-022014-03-04Edge 3 Technologies, Inc.Method and apparatus for determining disparity of texture
US11023784B2 (en)2010-09-022021-06-01Edge 3 Technologies, Inc.Method and apparatus for employing specialist belief propagation networks
US10909426B2 (en)2010-09-022021-02-02Edge 3 Technologies, Inc.Method and apparatus for spawning specialist belief propagation networks for adjusting exposure settings
US10586334B2 (en)2010-09-022020-03-10Edge 3 Technologies, Inc.Apparatus and method for segmenting an image
US9990567B2 (en)2010-09-022018-06-05Edge 3 Technologies, Inc.Method and apparatus for spawning specialist belief propagation networks for adjusting exposure settings
US9213890B2 (en)2010-09-172015-12-15Sony CorporationGesture recognition system for TV control
US8620024B2 (en)2010-09-172013-12-31Sony CorporationSystem and method for dynamic gesture recognition using geometric classification
US20120194692A1 (en)*2011-01-312012-08-02Hand Held Products, Inc.Terminal operative for display of electronic record
US9652084B2 (en)2011-02-102017-05-16Edge 3 Technologies, Inc.Near touch interaction
US10061442B2 (en)2011-02-102018-08-28Edge 3 Technologies, Inc.Near touch interaction
US8970589B2 (en)2011-02-102015-03-03Edge 3 Technologies, Inc.Near-touch interaction with a stereo camera grid structured tessellations
US10599269B2 (en)2011-02-102020-03-24Edge 3 Technologies, Inc.Near touch interaction
US8582866B2 (en)2011-02-102013-11-12Edge 3 Technologies, Inc.Method and apparatus for disparity computation in stereo images
US9323395B2 (en)2011-02-102016-04-26Edge 3 TechnologiesNear touch interaction with structured light
US8705877B1 (en)2011-11-112014-04-22Edge 3 Technologies, Inc.Method and apparatus for fast computational stereo
US8718387B1 (en)2011-11-112014-05-06Edge 3 Technologies, Inc.Method and apparatus for enhanced stereo vision
US9324154B2 (en)2011-11-112016-04-26Edge 3 TechnologiesMethod and apparatus for enhancing stereo vision through image segmentation
US10037602B2 (en)2011-11-112018-07-31Edge 3 Technologies, Inc.Method and apparatus for enhancing stereo vision
US11455712B2 (en)2011-11-112022-09-27Edge 3 TechnologiesMethod and apparatus for enhancing stereo vision
US10825159B2 (en)2011-11-112020-11-03Edge 3 Technologies, Inc.Method and apparatus for enhancing stereo vision
US12131452B1 (en)2011-11-112024-10-29Golden Edge Holding CorporationMethod and apparatus for enhancing stereo vision
US8761509B1 (en)2011-11-112014-06-24Edge 3 Technologies, Inc.Method and apparatus for fast computational stereo
US9672609B1 (en)2011-11-112017-06-06Edge 3 Technologies, Inc.Method and apparatus for improved depth-map estimation
US9336456B2 (en)2012-01-252016-05-10Bruno DeleanSystems, methods and computer program products for identifying objects in video data
US8643628B1 (en)2012-10-142014-02-04Neonode Inc.Light-based proximity detection system and user interface
US10949027B2 (en)2012-10-142021-03-16Neonode Inc.Interactive virtual display
US8917239B2 (en)2012-10-142014-12-23Neonode Inc.Removable protective cover with embedded proximity sensors
US10928957B2 (en)2012-10-142021-02-23Neonode Inc.Optical proximity sensor
US9001087B2 (en)2012-10-142015-04-07Neonode Inc.Light-based proximity detection system and user interface
US11733808B2 (en)2012-10-142023-08-22Neonode, Inc.Object detector based on reflected light
US10802601B2 (en)2012-10-142020-10-13Neonode Inc.Optical proximity sensor and associated user interface
US10140791B2 (en)2012-10-142018-11-27Neonode Inc.Door lock user interface
US10496180B2 (en)2012-10-142019-12-03Neonode, Inc.Optical proximity sensor and associated user interface
US10534479B2 (en)2012-10-142020-01-14Neonode Inc.Optical proximity sensors
US10282034B2 (en)2012-10-142019-05-07Neonode Inc.Touch sensitive curved and flexible displays
US10004985B2 (en)2012-10-142018-06-26Neonode Inc.Handheld electronic device and associated distributed multi-display system
US11073948B2 (en)2012-10-142021-07-27Neonode Inc.Optical proximity sensors
US9164625B2 (en)2012-10-142015-10-20Neonode Inc.Proximity sensor for determining two-dimensional coordinates of a proximal object
US11379048B2 (en)2012-10-142022-07-05Neonode Inc.Contactless control panel
US9921661B2 (en)2012-10-142018-03-20Neonode Inc.Optical proximity sensor and associated user interface
US11714509B2 (en)2012-10-142023-08-01Neonode Inc.Multi-plane reflective sensor
US9569095B2 (en)2012-10-142017-02-14Neonode Inc.Removable protective cover with embedded proximity sensors
US9741184B2 (en)2012-10-142017-08-22Neonode Inc.Door handle with optical proximity sensors
US10254943B2 (en)2012-11-272019-04-09Neonode Inc.Autonomous drive user interface
US11650727B2 (en)2012-11-272023-05-16Neonode Inc.Vehicle user interface
US9092093B2 (en)2012-11-272015-07-28Neonode Inc.Steering wheel user interface
US10719218B2 (en)2012-11-272020-07-21Neonode Inc.Vehicle user interface
US12032817B2 (en)2012-11-272024-07-09Neonode Inc.Vehicle user interface
US9710144B2 (en)2012-11-272017-07-18Neonode Inc.User interface for curved input device
US10721448B2 (en)2013-03-152020-07-21Edge 3 Technologies, Inc.Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
US10324565B2 (en)2013-05-302019-06-18Neonode Inc.Optical proximity sensor
US10585530B2 (en)2014-09-232020-03-10Neonode Inc.Optical proximity sensor
US11429230B2 (en)2018-11-282022-08-30Neonode IncMotorist user interface sensor
US11842014B2 (en)2019-12-312023-12-12Neonode Inc.Contactless touch input system
US12299238B2 (en)2019-12-312025-05-13Neonode Inc.Contactless touch input system

Also Published As

Publication numberPublication date
EP1148411A3 (en)2005-09-14
CN100487633C (en)2009-05-13
KR100843811B1 (en)2008-07-04
US20020006222A1 (en)2002-01-17
EP1148411A2 (en)2001-10-24
CN1320854A (en)2001-11-07
KR20010098781A (en)2001-11-08

Similar Documents

PublicationPublication DateTitle
US7046232B2 (en)Information processing apparatus, method of displaying movement recognizable standby state, method of showing recognizable movement, method of displaying movement recognizing process, and program storage medium
CN113132618B (en) Auxiliary photographing method, device, terminal device and storage medium
CN110572716B (en)Multimedia data playing method, device and storage medium
CN111556278A (en)Video processing method, video display device and storage medium
CN112954046B (en)Information transmission method, information transmission device and electronic equipment
US7064742B2 (en)Input devices using infrared trackers
WO2021093583A1 (en)Video stream processing method and apparatus, terminal device, and computer readable storage medium
CN110559645B (en) An application operation method and electronic device
CN108108214A (en)A kind of guiding method of operating, device and mobile terminal
CN110601959B (en)Session message display method, device, terminal and storage medium
CN110795007A (en)Method and device for acquiring screenshot information
CN111400552B (en)Note creation method and electronic equipment
CN107835366A (en) Multimedia playing method, device, storage medium and electronic equipment
CN109947988A (en)A kind of information processing method, device, terminal device and server
CN114489422A (en)Display method of sidebar and electronic equipment
JP2002083302A (en)Information processing device, action recognition processing method, and program storage medium
KR20230061519A (en) Screen capture methods, devices and electronics
CN109618218A (en) A video processing method and mobile terminal
WO2020045909A1 (en)Apparatus and method for user interface framework for multi-selection and operation of non-consecutive segmented information
CN107396178A (en)A kind of method and apparatus for editing video
CN113095163B (en)Video processing method, device, electronic equipment and storage medium
KR100466855B1 (en)Method and System for Providing User Interface by Using Image Signal for Mobile Communication Terminal Equipped with Camera Function
JP4415227B2 (en) Information processing apparatus, information processing method, and recording medium
US20130076622A1 (en)Method and apparatus for determining input
CN108984677B (en) An image stitching method and terminal

Legal Events

DateCodeTitleDescription
ASAssignment

Owner name:SONY CORPORATION, JAPAN

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:INAGAKI, TAKEO;SAITO, JUNKO;IHARA, KEIGO;AND OTHERS;REEL/FRAME:012063/0891;SIGNING DATES FROM 20010725 TO 20010731

STCFInformation on status: patent grant

Free format text:PATENTED CASE

FEPPFee payment procedure

Free format text:PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text:PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAYFee payment

Year of fee payment:4

FPAYFee payment

Year of fee payment:8

MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553)

Year of fee payment:12

