
Image processing system, storage medium storing image processing program, image processing apparatus and image processing method

Info

Publication number
US8427506B2
Authority
US
United States
Prior art keywords
image
displayer
image processing
marker
imaged
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/870,158
Other versions
US20110304646A1 (en)
Inventor
Shunsaku Kato
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nintendo Co Ltd
Assigned to NINTENDO CO., LTD. Assignment of assignors interest (see document for details). Assignors: KATO, SHUNSAKU
Publication of US20110304646A1
Priority to US13/859,769 (granted as US9058790B2)
Application granted
Publication of US8427506B2
Status: Active
Adjusted expiration


Abstract

A first image processing apparatus displays markers on a monitor to thereby make a second image processing apparatus perform a display control of a second object on an imaged image of an LCD while the second image processing apparatus transmits a marker recognizing signal when the display control is performed based on the markers to thereby make the first image processing apparatus perform a display control of a first object on the monitor.

Description

CROSS REFERENCE OF RELATED APPLICATION
The disclosure of Japanese Patent Application No. 2010-134062 is incorporated herein by reference.
BACKGROUND AND SUMMARY
1. Technical Field
The technology presented herein relates to an image processing system, a storage medium storing an image processing program, an image processing apparatus and an image processing method. More specifically, the present technology relates to an image processing system, a storage medium storing an image processing program, an image processing apparatus and an image processing method which display a composite image obtained by combining a photographed image and a CG image.
2. Description of the Related Art
As an example of this kind of conventional image processing apparatus, the one disclosed in Japanese Patent Application Laid-Open No. 2000-322602 is known. In this related art, the image processing apparatus images a card bearing a two-dimensional bar code with a CCD camera, searches for the two-dimensional bar code in the imaged image, detects the position of the two-dimensional bar code in the imaged image, identifies the pattern of the two-dimensional bar code, and then displays a three-dimensional image according to this pattern so as to be superposed at the position of the two-dimensional bar code within the imaged image.
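To make the related-art flow concrete, the following is a minimal, self-contained sketch of the same steps: searching an imaged frame for a marker, identifying its pattern, and superimposing an object at the detected position. The brightest-block "marker", the function names, and the block size are illustrative assumptions, not the method of the cited document.

```python
import numpy as np

# Toy stand-in for the related-art flow: search an imaged frame for a marker,
# detect its position, identify its pattern, and superimpose an object there.
# A real system would decode a two-dimensional bar code; here the "marker" is
# simply the brightest block, used only to illustrate the sequence of steps.

def detect_marker(frame, block=8):
    """Return (pattern_id, (row, col)) of the brightest block, standing in
    for locating and identifying a 2D bar code in the imaged image."""
    h, w = frame.shape
    best, pos = -1.0, (0, 0)
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            s = frame[r:r + block, c:c + block].mean()
            if s > best:
                best, pos = s, (r, c)
    pattern_id = int(best) % 4          # toy "pattern" identification
    return pattern_id, pos

def composite(frame, position, block=8):
    """Superimpose a CG object (here: a filled square) at the marker position."""
    out = frame.copy()
    r, c = position
    out[r:r + block, c:c + block] = 255
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    imaged = rng.integers(0, 100, size=(64, 64)).astype(np.uint8)
    imaged[24:32, 40:48] = 200                     # the "marker"
    pid, pos = detect_marker(imaged)
    result = composite(imaged, pos)
    print("pattern", pid, "at", pos)
```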
In the aforementioned related art, the two-dimensional bar code is printed on a card, but it is conceivable to display it on a monitor of another image processing apparatus, such as the monitor of a PC, for example. In this case, by changing the two-dimensional bar code on the monitor via the PC, it becomes possible to make the image processing apparatus combine various three-dimensional images. However, this is merely a one-way display control from the PC to the image processing apparatus, which limits how far the interest of the game can be enhanced.
Therefore, it is a primary feature of the present technology to provide a novel image processing system, a novel storage medium storing an image processing program, a novel image processing apparatus and a novel image processing method.
Another feature of the present technology is to provide an image processing system, a storage medium storing an image processing program, an image processing method, a first image processing apparatus, and a second image processing apparatus which can bring the first image processing apparatus and the second image processing apparatus into association with each other to thereby operatively connect a first object and a second object between a first displayer and an imaged image of a second displayer.
The present technology employs the following features in order to solve the above-described problems.
A first aspect is an image processing system including a first image processing apparatus utilizing a first displayer and a second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen, wherein the first image processing apparatus comprises: a first display processor which displays a predetermined marker image on the first displayer; and a first object display controller which performs on the first displayer a display control of at least a part of a first object image being a predetermined CG object; and the second image processing apparatus comprises: an imaging processor which performs imaging by the imager; and a second object display controller which performs a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on the second displayer at a position with reference to the marker image within the imaged image by recognizing the marker image within the imaged image.
In the first aspect, a first image processing apparatus displays a predetermined marker image on the first displayer to thereby perform a display control of at least a part of first object image being a predetermined CG object, and a second image processing apparatus performs imaging by the imager, and recognizes a marker image within the imaged image to thereby perform a display control of at least a part of second object image being a predetermined CG object at a position with reference to the marker image within the imaged image on the real space capable of being viewed on the second displayer.
According to the first aspect, the first image processing apparatus and the second image processing apparatus are brought into association with each other via the marker image to thereby operatively connect the first object and the second object between the first displayer and the imaged image of the second displayer.
Here, the first object and the second object may be a common object (that is, the common object becomes the first object on the first displayer, and becomes the second object on the second displayer), and may be a part of the common object (head and arms, for example) and other parts (body, for example). Thus, the first object and the second object may preferably be objects that are brought into association with each other, but may be objects independent of each other.
For example, in a case that the first object and the second object are the common object (a part thereof and another part thereof), the object can look as if it moves between the first displayer and the second displayer (perform a display control). In one embodiment, the object pops out of the first displayer to the imaged image of the second displayer, or returns to the first displayer therefrom. In another modified example, the objects appear on the first displayer and the imaged image of the second displayer at the same time and disappear therefrom at the same time.
Additionally, the marker image is preferably always displayed, but it may be displayed only as required. The marker image is always displayed in one embodiment, and it is displayed only as required in a modified example.
Furthermore, the composition is typically a display in a superimposed manner, but the imaged image itself may instead be changed.
A second aspect is according to the first aspect, wherein the first image processing apparatus and the second image processing apparatus are able to communicate with each other, the first object display controller performs a display control of the first object image by being operatively connected through the communication, and the second object display controller performs a composition control of the second object image by being operatively connected through the communication.
According to the second aspect, the first image processing apparatus and the second image processing apparatus are brought into association with each other via the marker image and the communications, capable of enhancing the continuity between the first object and the second object.
A third aspect is according to the second aspect, wherein the second object display controller combines at least a part of the second object image with the imaged image when the marker image is recognized within the imaged image, and the first object display controller performs a control on the first displayer such that the first object image disappears when the marker image is recognized within the imaged image in the second object display controller.
According to the third aspect, in accordance with the first object image disappearing from the first displayer, the second object image can be made to appear within the imaged image of the second displayer.
A fourth aspect is according to the second aspect, wherein the second object display controller combines at least a part of the second object image with the imaged image when the marker image is recognized within the imaged image, and the first object display controller performs a control such that the first object image is displayed on the first displayer when the marker image is recognized within the imaged image in the second object display controller.
According to the fourth aspect, in accordance with the second object image appearing within the imaged image on the second displayer, the first object image can also be made to appear in the first displayer.
A fifth aspect is according to the first aspect, wherein the marker image includes identification information; and the first object image and the second object image are images corresponding to the identification information included in the marker image.
According to the fifth aspect, through the identification information included in the marker image, various first object images and second object images can be displayed.
A sixth aspect is according to the first aspect, wherein the first display processor displays a plurality of marker images on the first displayer.
According to the sixth aspect, by displaying the plurality of marker images, it is possible to expand a recognizable range.
A seventh aspect is according to the sixth aspect, wherein the first display processor displays four marker images at four corners of the first displayer.
According to the seventh aspect, it is possible to expand the recognizable range while keeping the visibility of the first displayer as high as possible.
An eighth aspect is according to the sixth aspect, wherein the first object display controller performs a control such that the first object image is displayed at a predetermined position surrounded by the plurality of marker images, and by recognizing at least one of the plurality of marker images within the imaged image, the second object display controller performs a control of a composition of the second object image on a position surrounded by the plurality of marker images recognized within the imaged image.
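As a rough illustration of the eighth aspect, the composition position can be taken as a point surrounded by whichever of the plural marker images were recognized in the imaged image. The centroid used below is one simple choice and is an assumption made for illustration; the function name is hypothetical.

```python
# Sketch of the eighth aspect: place the object at a position surrounded by
# the recognized markers (e.g. the four corner markers of the first displayer).
# Falling back to the centroid of whichever markers were recognized is an
# assumption; the aspect only requires a position surrounded by the recognized
# marker images.

def surrounded_position(recognized_markers):
    """recognized_markers: list of (x, y) marker positions found in the
    imaged image. Returns the centroid, used as the composition position."""
    if not recognized_markers:
        return None                     # no marker recognized: nothing to composite
    xs = [p[0] for p in recognized_markers]
    ys = [p[1] for p in recognized_markers]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# e.g. only two of the four corner markers were recognized:
print(surrounded_position([(10, 10), (310, 230)]))   # -> (160.0, 120.0)
```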
According to the eighth aspect, it is possible to display the object image at a predetermined position surrounded by the markers on each of the first displayer and the second displayer.
A ninth aspect is according to the first aspect, wherein the second object display controller performs a composition control of the second object at a position and an orientation corresponding to a position and an orientation of the marker image within the imaged image by performing an AR recognition on the marker image within the imaged image.
According to the ninth aspect, it is possible to accurately perform a composition control of the second object through the AR recognition.
A tenth aspect is according to the ninth aspect, wherein the second object display controller includes: a position and orientation calculator which calculates a correlative relation of a position and an orientation between the marker image on the first displayer and the imager by recognizing the marker image within the imaged image; a virtual camera setter which arranges the second object in the virtual space and decides a position and an orientation of the virtual camera such that a correlative relation of a position and an orientation between the second object and the virtual camera matches the position and the orientation calculated by the position and orientation calculator; and a virtual space imager which images the virtual space including the second object by the virtual camera, wherein a composition control is performed between the imaged image and the virtual space imaged by the virtual space imager.
According to the tenth aspect, a correlative relation of a position and an orientation between the marker image on the first displayer and the imager is calculated, the second object is arranged in the virtual space accordingly, and the position and orientation of the virtual camera are decided such that the correlative relation of the position and the orientation between the second object and the virtual camera matches the position and orientation calculated by the position and orientation calculator, making it possible to combine the second object with the imaged image with high accuracy.
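A minimal numeric sketch of the tenth aspect, assuming the AR recognition step has already yielded a 4x4 transform describing the marker's position and orientation relative to the imager. The second object is placed at the marker's location in the virtual space and the virtual camera is given the same relative pose, so rendering the virtual space lines up with the imaged image. The function names, the toy pinhole projection, and the example transform are assumptions.

```python
import numpy as np

# Minimal sketch: T_marker_to_camera is the correlative relation of position and
# orientation between the marker image on the first displayer and the imager,
# assumed to be produced by a preceding marker-recognition step.

def set_virtual_camera(T_marker_to_camera):
    """Place the second object at the virtual-space origin and return the
    virtual camera's view matrix so that object<->camera matches marker<->imager."""
    object_pose = np.eye(4)              # second object at the marker position
    view_matrix = T_marker_to_camera     # camera sees the object as the imager saw the marker
    return object_pose, view_matrix

def project(point_world, view_matrix, focal=300.0):
    """Toy pinhole projection used as the 'virtual space imager'."""
    p = view_matrix @ np.append(point_world, 1.0)
    x, y, z = p[:3]
    return (focal * x / z, focal * y / z)

if __name__ == "__main__":
    # marker 0.5 m in front of the imager, no rotation (assumed example values)
    T = np.eye(4)
    T[2, 3] = 0.5
    _, view = set_virtual_camera(T)
    print(project(np.array([0.0, 0.0, 0.0]), view))   # object center on the optical axis
```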
An eleventh aspect is according to the first aspect, wherein the second image processing apparatus further comprises a second display processor which displays the imaged image imaged by the imaging processor on the second displayer, and the second object display controller performs a composition control of the second object on the imaged image displayed by the second displayer.
According to the eleventh aspect, it is possible to combine the second object with the imaged image.
A twelfth aspect is according to the first aspect, wherein the second image processing apparatus further comprises: a first signal transmitter which transmits a first signal to the first image processing apparatus on the basis of a recognition result of the marker image, and the first image processing apparatus further comprises: a first signal receiver which receives the first signal transmitted by the first signal transmitter, wherein the first object display controller controls a display of the first object on the basis of the first signal received by the first signal receiver.
According to the twelfth aspect, the first image processing apparatus displays a predetermined marker image on the first displayer to thereby make the second image processing apparatus perform a display control of the second object on the imaged image of the second displayer, while the second image processing apparatus transmits a first signal when the display control is performed on the basis of the marker image to thereby make the first image processing apparatus perform a display control of the first object on the first displayer. Thus, the first image processing apparatus and the second image processing apparatus are associated with each other through the marker image and the first signal to thereby operatively connect the first object and the second object between the first displayer and the imaged image of the second displayer.
Here, the first signal is preferably a determination result signal indicating a determination result (YES) that the marker image is included, and it is repeatedly transmitted; however, it may instead be a timing signal indicating that the determination result changes from “NO” to “YES”, and it may be transmitted only once.
A thirteenth aspect is according to the twelfth aspect, wherein the first image processing apparatus further comprises: a second signal transmitter which transmits a second signal to the second image processing apparatus in a case that the first signal is received by the first signal receiver, and the second image processing apparatus further comprises: a second signal receiver which receives the second signal, wherein the second object display controller performs a display control on the basis of the second signal received by the second signal receiver.
In the thirteenth aspect, the first image processing apparatus transmits the second signal to the second image processing apparatus when the first signal is received, and the second image processing apparatus performs a display control on the basis of the second signal.
According to the thirteenth aspect, the first image processing apparatus and the second image processing apparatus are associated with each other through the second signal in addition to the marker image and the first signal, making it possible to enhance the continuity between the first object and the second object.
Here, the second signal is preferably a control signal indicating the content and a timing of the display control, but it may merely be a timing signal indicating a timing of the display control, or an acknowledgment signal (ACK, for example) confirming reception of the first signal. Furthermore, the second signal is preferably transmitted repeatedly, but may be transmitted only once.
A fourteenth aspect is according to the thirteenth aspect, wherein the second signal transmitter transmits the second signal to the second image processing apparatus after a lapse of a first predetermined time since the first signal is received by the first signal receiver.
According to the fourteenth aspect, the first image processing apparatus transmits the second signal on the basis of the elapsed time from the reception of the first signal, and therefore, it is possible to perform an active control in relation to the composition of the second object.
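The exchange described in the twelfth to fourteenth aspects can be sketched as follows, assuming an in-process stand-in for the wireless link: the first apparatus notes when the first signal (marker recognized) arrives and sends the second signal once the first predetermined time has elapsed. The class name, the send callback, and the 0.5-second value are illustrative assumptions.

```python
import time

# Sketch of the first-signal / second-signal exchange. In the actual system the
# two apparatuses communicate wirelessly; names and timings here are assumed.

FIRST_PREDETERMINED_TIME = 0.5   # seconds (assumed value)

class FirstApparatus:
    def __init__(self):
        self.first_signal_at = None

    def on_first_signal(self, now):
        # first signal: the marker image was recognized on the second apparatus
        self.first_signal_at = now

    def maybe_send_second_signal(self, now, send):
        if self.first_signal_at is not None and now - self.first_signal_at >= FIRST_PREDETERMINED_TIME:
            send("second_signal")        # tells the second apparatus to start its display control
            self.first_signal_at = None

if __name__ == "__main__":
    sent = []
    first = FirstApparatus()
    first.on_first_signal(time.monotonic())
    while not sent:
        first.maybe_send_second_signal(time.monotonic(), sent.append)
    print(sent)   # ['second_signal'] after roughly the first predetermined time
```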
Alternatively, the transmission of the second signal by the first image processing apparatus may passively be performed on the basis of a command from the first controlling device.
A fifteenth aspect is according to the thirteenth aspect, wherein the first object display controller performs a display control after the second signal is transmitted by the second signal transmitter.
According to the fifteenth aspect, the first image processing apparatus performs the display control of the first object after transmitting the second signal. In response to the second signal, the second image processing apparatus performs the display control of the second object. Thus, it is possible to synchronize the display control by the first image processing apparatus itself and the display control by the second image processing apparatus. For example, it becomes easy to match a starting timing of the display control between the first image processing apparatus and the second image processing apparatus.
A sixteenth aspect is according to the fourteenth aspect, wherein the first image processing apparatus further comprises a third signal transmitter which transmits a third signal to the second image processing apparatus after a lapse of a second predetermined time since the first object display controller performs a display control, the second image processing apparatus further comprises a third signal receiver which receives the third signal, and the second object display controller erases the second object from the imaged image after the third signal is received by the third signal receiver.
According to the sixteenth aspect, the first image processing apparatus transmits the third signal on the basis of the elapsed time since the display control of the first object is performed, and therefore, it is possible to actively control the composition and, moreover, the erasure of the second object.
Alternatively, the transmission of the third signal by the first image processing apparatus may passively be performed on the basis of a command from the first controlling device.
Here, the third signal is preferably a control signal indicating the content and a timing of the display control, but it may merely be a timing signal indicating a timing of the display control. Furthermore, the third signal is preferably transmitted repeatedly, but may be transmitted only once.
A seventeenth aspect is according to the sixteenth aspect, wherein the first object display controller returns to a state before the display control is performed after the third signal is transmitted by the third signal transmitter.
According to the seventeenth aspect, when the second object is erased on the second displayer, the first displayer can be returned to the original state, that is, the state before the display control of the first object is performed.
An eighteenth aspect is according to the twelfth aspect, wherein the first display processor displays at least a part of the first object together with the predetermined identification information, and the first object display controller erases at least a part of the first object on the basis of the first signal.
According to the eighteenth aspect, it becomes possible to make an expression as if the object moves between the first displayer and the second displayer.
A nineteenth aspect is according to the twelfth aspect, wherein the first object display controller displays at least a part of the first object on the basis of the first signal.
According to the nineteenth aspect, it becomes possible to make an expression as if the objects appear and/or disappear between the first displayer and the second displayer at the same time.
A twentieth aspect is according to the first aspect, wherein the first object display controller includes a first object size changer which changes a size of the first object on the basis of a size of the marker image, and the second object display controller includes a second object size changer which changes a size of the second object on the basis of the size of the marker image.
According to the twentieth aspect, it becomes possible to control the size of the object through the size of the marker.
A twenty-first aspect is according to the first aspect, wherein the first object display controller includes a first object direction changer which changes a display direction of the first object on the basis of a shape of the marker image, and the second object display controller includes a second object direction changer which changes a display direction of the second object on the basis of the shape of the marker image.
According to the twenty-first aspect, it becomes possible to control the direction of the object through the shape of the marker.
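A small sketch of the twentieth and twenty-first aspects: the object's display size is derived from the apparent size of the marker image, and its display direction from the marker's apparent shape. Using the marker's corner quadrilateral and a reference edge length is an assumption made here for illustration.

```python
import math

# Sketch: derive the object's scale from the apparent size of the marker image
# and its display direction from the marker's apparent shape. The corner
# representation and the reference edge length are assumed values.

REFERENCE_EDGE = 40.0   # marker edge length (pixels) at which scale == 1.0 (assumed)

def object_scale(corners):
    """corners: four (x, y) marker corners in the imaged image, in order."""
    (x0, y0), (x1, y1) = corners[0], corners[1]
    edge = math.hypot(x1 - x0, y1 - y0)
    return edge / REFERENCE_EDGE

def object_direction(corners):
    """Display direction (radians) taken from the orientation of the top edge."""
    (x0, y0), (x1, y1) = corners[0], corners[1]
    return math.atan2(y1 - y0, x1 - x0)

corners = [(100, 100), (180, 120), (175, 200), (95, 180)]
print(round(object_scale(corners), 2), round(object_direction(corners), 2))
```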
A twenty-second aspect is according to the twelfth aspect, wherein the first signal includes coordinate information, and the first object display controller performs a display control of at least a part of the first object on the basis of the coordinate information included in the first signal.
According to the twenty-second aspect, the second image processing apparatus transmits the first signal inclusive of the coordinate information, and whereby, the first image processing apparatus can make a display in association with the second image processing apparatus.
A twenty-third aspect is according to the eleventh aspect, wherein the second display processor includes a frame displayer which displays a frame the same in shape as the marker image on the second displayer, and the second object display controller performs recognition in a state that the marker image is displayed along the frame displayed by the frame displayer.
According to the twenty-third aspect, a frame the same in shape as the marker image is displayed on the second displayer, and recognition is performed with the marker image displayed along the frame, whereby the processing of a direction detection based on the position of the marker image and a further coordinate transformation can be omitted, reducing the processing load.
A twenty-fourth aspect is an image processing program performing image processing between a first image processing apparatus utilizing a first displayer and a second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen, wherein the image processing program causes a computer of the first image processing apparatus to function as: a first display processor which displays a predetermined marker image on the first displayer; and a first object display controller which performs on the first displayer a display control of at least a part of a first object image being a predetermined CG object, and the image processing program causes a computer of the second image processing apparatus to function as: an imaging processor which performs imaging by the imager; and a second object display controller which performs a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on the second displayer at a position with reference to the marker image within the imaged image by recognizing the marker image within the imaged image.
A twenty-fifth aspect is a first image processing apparatus being brought into association with a second image processing apparatus that utilizes an imager and a second displayer capable of viewing a real space on a screen by utilizing a first displayer, comprising: a first display processor which displays a predetermined marker image on the first displayer; and a first object display controller which performs a display control of at least a part of first object image being a predetermined CG object; wherein the second image processing apparatus comprises an imaging processor which performs imaging by the imager; and a second object display controller which performs a composition control of at least a part of second object image being a predetermined CG object on a real space capable of being viewed on the second displayer at a position with reference to the marker image within the imaged image by recognizing the marker image within the imaged image.
A twenty-sixth aspect is a second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen in association with a first image processing apparatus utilizing a first displayer, wherein the first image processing apparatus comprises: a first display processor which displays a predetermined marker image on the first displayer; and a first object display controller which performs on the first displayer a display control of at least a part of a first object image being a predetermined CG object, the second image processing apparatus comprising: an imaging processor which performs imaging by the imager; and a second object display controller which performs a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on the second displayer at a position with reference to the marker image within the imaged image by recognizing the marker image within the imaged image.
A twenty-seventh aspect is an image processing method performed by a first image processing apparatus utilizing a first displayer and a second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen, including the following steps to be executed by a computer of the first image processing apparatus: a first display processing step for displaying a predetermined marker image on the first displayer; and a first object display controlling step for performing on the first displayer a display control of at least a part of a first object image being a predetermined CG object, and including the following steps to be executed by a computer of the second image processing apparatus: an imaging processing step for performing imaging by the imager; and a second object display controlling step for performing a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on the second displayer at a position with reference to the marker image within the imaged image by recognizing the marker image within the imaged image.
In the twenty-fourth to twenty-seventh aspects as well, similar to the first aspect, the first image processing apparatus and the second image processing apparatus are brought into association via the marker image to thereby operatively connect the first object and the second object between the first displayer and the imaged image of the second displayer.
According to the technology presented herein, it is possible to implement an image processing system, an image processing program and an image processing method, and a first image processing apparatus and a second image processing apparatus therefor capable of operatively connecting the first object and the second object between the first displayer and the imaged image of the second displayer by bringing the first image processing apparatus and the second image processing apparatus into association with each other.
The above described features, aspects and advantages of the present technology will become more apparent from the following detailed description of the present technology when taken in conjunction with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an illustrative view showing one embodiment of a game system;
FIG. 2 is a block diagram showing an electric configuration of the game system;
FIG. 3 is an illustrative view illustrating an appearance of a first controller;
FIG. 4 is an illustrative view illustrating an appearance of a second controller;
FIG. 5 is a block diagram showing an electric configuration of controllers (the first controller and the second controller are connected with each other);
FIG. 6 is an illustrative view summarizing a situation that a virtual game is played by utilizing the controllers;
FIG. 7 is an illustrative view showing viewing angles of a marker and the controller;
FIG. 8 is an illustrative view showing one example of an imaged image by the controller;
FIG. 9 is an external view of a hand-held-type game apparatus, and shows a top surface in an open state;
FIG. 10 is an external view of the hand-held-type game apparatus, and shows a side surface in the open state;
FIG. 11 is an external view of the hand-held-type game apparatus, FIG. 11(A) shows one side surface thereof in a closed state, FIG. 11(B) shows a top surface thereof in the closed state, FIG. 11(C) shows the other side surface in the closed state, and FIG. 11(D) shows a bottom surface thereof in the closed state;
FIG. 12 is an illustrative view showing a situation in which the hand-held-type game apparatus is held by the user;
FIG. 13 is a block diagram showing one example of an electric configuration of the hand-held-type game apparatus;
FIG. 14 is an illustrative view explaining a situation when a “pop-up” virtual game is played;
FIG. 15 is an illustrative view showing a part of a memory map, FIG. 15(A) shows a memory map of a console-type game apparatus, and FIG. 15(B) shows a memory map of the hand-held type game apparatus;
FIG. 16 is a flowchart showing a part of an operation by CPUs, FIG. 16(A) shows an operation by the CPU of the console-type game apparatus, and FIG. 16(B) is a flowchart showing an operation by the CPU of the hand-held type game apparatus;
FIG. 17 is a flowchart showing another part of the operation by the CPU of the console-type game apparatus;
FIG. 18 is a flowchart showing another part of the operation by the CPU of the hand-held type game apparatus;
FIG. 19 is a flowchart showing a still another part of the operation by the CPU of the hand-held type game apparatus;
FIG. 20 is an illustrative view illustrating a display control when a pattern is stored, FIG. 20(A) shows a monitor screen, and FIG. 20(B) shows an LCD screen;
FIG. 21 is an illustrative view illustrating a change in size when a pattern is stored;
FIG. 22 is an illustrative view explaining a part of a display control when a game is played, FIG. 22(A) shows the monitor screen, and FIG. 22(B) shows the LCD screen;
FIG. 23 is an illustrative view sequel to FIG. 22, FIG. 23(A) shows the monitor screen, and FIG. 23(B) shows the LCD screen;
FIG. 24 is an illustrative view sequel to FIG. 23, FIG. 24(A) shows the monitor screen, and FIG. 24(B) shows the LCD screen;
FIG. 25 is an illustrative view sequel to FIG. 24, FIG. 25(A) shows the monitor screen, and FIG. 25(B) shows the LCD screen;
FIG. 26 is an illustrative view explaining another part of the display control when a game is played, FIG. 26(A) shows a positional relationship between the monitor screen and the LCD screen, and FIG. 26(B) shows the LCD screen after change;
FIG. 27 is a flowchart showing a part of an operation by the CPU of the console-type game apparatus in a first modified example;
FIG. 28 is a flowchart showing a part of an operation by the CPU of the hand-held type game apparatus in the first modified example;
FIG. 29 is an illustrative view explaining a display control in the first modified example, FIG. 29(A) shows the monitor screen, and FIG. 29(B) shows the LCD screen;
FIG. 30 is an illustrative view sequel to FIG. 29, FIG. 30(A) shows the monitor screen, and FIG. 30(B) shows the LCD screen;
FIG. 31 is a flowchart showing a part of an operation by the CPUs in a second modified example, FIG. 31(A) shows an operation by the CPU of the console-type game apparatus, and FIG. 31(B) shows an operation by the CPU of the hand-held type game apparatus;
FIG. 32 is an illustrative view explaining a display control in the second modified example, FIG. 32(A) shows the monitor screen, and FIG. 32(B) shows the LCD screen; and
FIG. 33 is an illustrative view sequel to FIG. 32, FIG. 33(A) shows the monitor screen, and FIG. 33(B) shows the LCD screen.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring to FIG. 1, a game system 10 of one embodiment includes a game apparatus 12 (console-type game apparatus) and a controller 14. Although illustration is omitted, the game apparatus 12 of this embodiment is designed such that it can be connected to four controllers 14 at the maximum. Furthermore, the game apparatus 12 and each of the controllers 14 are wirelessly connected. For example, the wireless communication is executed according to an MP (Multilink Protocol) or Bluetooth (registered trademark) standard, but may be executed by other standards such as infrared rays, a wireless LAN, etc. Alternatively, they may be connected by wire.
The game apparatus 12 includes a roughly rectangular parallelepiped housing 16, and the housing 16 is furnished with a disk slot 18 on a front surface. An optical disk 24 as one example of an information storage medium storing a game program, etc. is inserted from the disk slot 18 to be loaded into a disk drive 54 (see FIG. 2) within the housing 16. Although omitted in the illustration, around the disk slot 18, an LED and a light guide plate are arranged so as to make the disk slot 18 light up and off or flash in response to various processing.
Furthermore, on a front surface of the housing 16 of the game apparatus 12, a power button 20a and a reset button 20b are provided at the upper part thereof, and an eject button 20c is provided below them. In addition, a connector cover for external memory card 22 is provided between the reset button 20b and the eject button 20c, and in the vicinity of the disk slot 18. Inside the connector cover for external memory card 22, a connector for external memory card 62 (see FIG. 2) is provided, through which an external memory card 38 (hereinafter simply referred to as a “memory card 38”) not shown is inserted. The memory card is employed for loading the game program, etc. read from the optical disk 24 to temporarily store it, storing (saving) game data (result data, proceeding data of the game, or replay data described later) of the game played by means of the game system 10, and so forth. Here, storing the game data described above may be performed on an internal memory, such as a flash memory 44 (see FIG. 2) provided inside the game apparatus 12, in place of the memory card 38. Also, the memory card 38 may be utilized as a backup memory of the internal memory. In addition, in the game apparatus 12, an application other than the game can be executed, and in such a case, data of the other application can be saved in the memory card 38.
It should be noted that a general-purpose SD card can be employed as the memory card 38, but other general-purpose memory cards, such as memory sticks and multimedia cards (registered trademark), can also be employed. The memory card 38 can be utilized in another game apparatus 12A having a construction similar to the game apparatus 12, and thus, it is possible to offer the game data to other players via the memory card 38.
Although omitted in FIG. 1, the game apparatus 12 has an AV cable connector 58 (see FIG. 2) on the rear surface of the housing 16, and by utilizing the AV cable connector 58, a monitor 28 and a speaker 30 are connected to the game apparatus 12 through an AV cable 26. The monitor 28 and the speaker 30 are typically a color television receiver, and through the AV cable 26, a video signal from the game apparatus 12 is input to a video input terminal of the color television, and a sound signal from the game apparatus 12 is input to a sound input terminal thereof. Accordingly, a virtual three-dimensional game image of a three-dimensional (3D) video game, for example, is displayed on the screen of the color television (monitor) 28, and stereo game sound, such as game music, sound effects, etc., is output from right and left speakers 30. Around the monitor 28 (on the top side of the monitor 28, in this embodiment), a marker unit 32 including two infrared ray LEDs (markers) 32A and 32B is provided. The marker unit 32 is connected to the game apparatus 12 through a power source cable 32c. Accordingly, the marker unit 32 is supplied with power from the game apparatus 12. Thus, the markers 32A and 32B emit lights so as to output infrared rays ahead of the monitor 28.
Furthermore, the power of the game apparatus 12 is applied by means of a general AC adapter (not illustrated). The AC adapter is inserted into a standard wall socket for home use, and the game apparatus 12 transforms the house current (commercial power supply) to a low DC voltage signal suitable for driving. In another embodiment, a battery may be utilized as a power supply.
The controller 14, which is described in detail later, includes a first controller 34 and a second controller 36 each capable of being held with one hand, as a first operation unit and a second operation unit, respectively. A cable 36a has one end extending from the rear end of the second controller 36 and the other end provided with a connector 36b. The connector 36b is connected to a connector 34a (FIG. 3, FIG. 5) provided on a rear end surface of the first controller 34. Input data obtained by the second controller 36 is applied to the first controller 34 through the cable 36a. The first controller 34 transmits controller data including the input data of the first controller 34 itself and the input data of the second controller 36.
In the game system 10, a user or a player turns the power of the game apparatus 12 on for playing the game (or applications other than the game) by a power switch 20a. Then, the user selects an appropriate optical disk 24 recording a program of a video game (or other applications the player wants to play), and loads the optical disk 24 into the disk drive 54 of the game apparatus 12. In response thereto, the game apparatus 12 starts to execute a video game or other applications on the basis of the program recorded in the optical disk 24. The user operates the controller 14 in order to apply an input to the game apparatus 12. For example, by operating any one of the operating buttons of the operating portion 82, a game or other application is started. Besides the operation performed on the operating portion 82, by moving the controller 14 itself, it is possible to move a moving image object (player object) in different directions or change the perspective of the user (camera position of the virtual game) in a three-dimensional game world.
It should be noted that the video game and other application programs may be stored (installed) in an internal memory (flash memory 44 (see FIG. 2)) of the game apparatus 12 and executed from the internal memory. In such a case, a program stored in a storage medium like the optical disk 24 may be installed in the internal memory, or a downloaded program may be installed in the internal memory.
FIG. 2 is a block diagram showing an electric configuration of the game system 10 shown in the FIG. 1 embodiment. Although illustration is omitted, respective components within the housing 16 are mounted on a printed board. As shown in FIG. 2, the game apparatus 12 has a CPU 40. The CPU 40 functions as a game processor. The CPU 40 is connected with a system LSI 42. The system LSI 42 is connected with an external main memory 46, a ROM/RTC 48, the disk drive 54, and an AV IC 56.
The external main memory 46 is utilized as a work area and a buffer area of the CPU 40 by storing programs like a game program, etc. and various data. The ROM/RTC 48, which is a so-called boot ROM, is incorporated with a program for activating the game apparatus 12, and is provided with a time circuit for counting a time. The disk drive 54 reads program, texture data, etc. from the optical disk 24, and writes them in an internal main memory 42e described later or the external main memory 46 under the control of the CPU 40.
The system LSI 42 is provided with an input-output processor 42a, a GPU (Graphics Processor Unit) 42b, a DSP (Digital Signal Processor) 42c, a VRAM 42d and an internal main memory 42e, and these are connected with one another by internal buses although illustration is omitted. The input-output processor (I/O processor) 42a executes transmission and reception of data and executes download of the data. The GPU 42b is made up of a part of a depicting means, and receives a graphics command (construction command) from the CPU 40 to generate game image data according to the command. Additionally, the CPU 40 applies an image generating program required for generating game image data to the GPU 42b in addition to the graphics command.
Although illustration is omitted, the GPU 42b is connected with the VRAM 42d as described above. The GPU 42b accesses the VRAM 42d to acquire data (image data: data such as polygon data, texture data, etc.) required to execute the construction command. Here, the CPU 40 writes image data required for depicting to the VRAM 42d via the GPU 42b. The GPU 42b accesses the VRAM 42d to create game image data for depicting.
In this embodiment, a case that the GPU 42b generates game image data is explained, but in a case that an arbitrary application except for the game application is executed, the GPU 42b generates image data as to the arbitrary application.
Furthermore, the DSP 42c functions as an audio processor, and generates audio data corresponding to a sound, a voice, music, or the like to be output from the speaker 30 by means of the sound data and the sound wave (tone) data stored in the internal main memory 42e and the external main memory 46.
The game image data and audio data generated as described above are read by the AV IC 56, and output to the monitor 28 and the speaker 30 via the AV connector 58. Accordingly, a game screen is displayed on the monitor 28, and a sound (music) necessary for the game is output from the speaker 30.
Furthermore, the input-output processor 42a is connected with a flash memory 44, a wireless communication module 50 and a wireless controller module 52, and is also connected with an expansion connector 60 and a connector for external memory card 62. The wireless communication module 50 is connected with an antenna 50a, and the wireless controller module 52 is connected with an antenna 52a.
The input-output processor 42a can communicate with other game apparatuses and various servers (both not shown) connected to a network via the wireless communication module 50. The input-output processor 42a periodically accesses the flash memory 44 to detect the presence or absence of data (referred to as data to be transmitted) required to be transmitted to the network, and transmits it to the network via the wireless communication module 50 and the antenna 50a in a case that data to be transmitted is present. Furthermore, the input-output processor 42a receives data (referred to as received data) transmitted from other game apparatuses via the network, the antenna 50a and the wireless communication module 50, and stores the received data in the flash memory 44. In a case that the received data does not satisfy a predetermined condition, the received data is abandoned as it is. In addition, the input-output processor 42a receives data (download data) downloaded from the server connected to the network via the network, the antenna 50a and the wireless communication module 50, and stores the download data in the flash memory 44.
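The periodic transfer check performed by the input-output processor 42a can be sketched as below, with simple in-memory stand-ins for the flash memory 44 and the wireless communication module 50; the names and the validity condition are illustrative assumptions.

```python
from collections import deque

# Sketch of the periodic transmit/receive step, using in-memory stand-ins for
# the flash memory and the wireless communication module. Purely illustrative.

flash_outbox = deque(["save_data_v1"])      # data required to be transmitted
flash_inbox = []

def network_send(payload):
    print("sent over wireless module:", payload)

def periodic_io_step(received=None):
    # transmit pending data if present
    if flash_outbox:
        network_send(flash_outbox.popleft())
    # store received data only if it satisfies a (stand-in) condition
    if received is not None:
        if received.get("valid"):
            flash_inbox.append(received)
        # otherwise the received data is abandoned as it is

periodic_io_step(received={"valid": True, "body": "ghost data"})
print(flash_inbox)
```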
Furthermore, the input-output processor 42a receives input data transmitted from the controller 14 via the antenna 52a and the wireless controller module 52, and (temporarily) stores it in the buffer area of the internal main memory 42e or the external main memory 46. The input data is erased from the buffer area after being utilized in processing (game processing, for example) by the CPU 40.
Here, the input-output processor 42a can communicate directly with another game apparatus of the same kind and with a hand-held-type game apparatus 100 (described later) via the wireless communication module 50, without passing through the network.
In addition, the input-output processor 42a is connected with the expansion connector 60 and the connector for external memory card 62. The expansion connector 60 is a connector for interfaces such as USB, SCSI, etc., and can be connected with a medium such as an external storage and with peripheral devices such as another controller different from the controller 14. Furthermore, the expansion connector 60 can be connected with a cable LAN adaptor, and the cable LAN can be utilized in place of the wireless communication module 50. The connector for external memory card 62 can be connected with an external storage like the memory card 38. Thus, the input-output processor 42a, for example, accesses the external storage via the expansion connector 60 and the connector for external memory card 62 to store and read data.
Although a detailed description is omitted, as shown in FIG. 1 as well, the game apparatus 12 (housing 16) is furnished with the power button 20a, the reset button 20b, and the eject button 20c. The power button 20a is connected to the system LSI 42. When the power button 20a is turned on, the system LSI 42 is set to a mode of a normal energized state (referred to as “normal mode”) in which the respective components of the game apparatus 12 are supplied with power through an AC adapter not shown. On the other hand, when the power button 20a is turned off, the system LSI 42 is set to a mode (hereinafter referred to as “standby mode”) in which a part of the components of the game apparatus 12 is supplied with power, and the power consumption is reduced to a minimum.
In this embodiment, in a case that the standby mode is set, the system LSI 42 issues an instruction to stop supplying power to the components except for the input-output processor 42a, the flash memory 44, the external main memory 46, the ROM/RTC 48, the wireless communication module 50, and the wireless controller module 52. Accordingly, in this embodiment, in the standby mode, the CPU 40 never executes an application.
The reset button 20b is also connected to the system LSI 42. When the reset button 20b is pushed, the system LSI 42 restarts a start-up program of the game apparatus 12. The eject button 20c is connected to the disk drive 54. When the eject button 20c is pushed, the optical disk 24 is ejected from the disk drive 54.
FIG. 3 shows one example of an external appearance of the first controller 34. FIG. 3(A) is a perspective view of the first controller 34 as seeing it from above rear, and FIG. 3(B) is a perspective view of the first controller 34 as seeing it from below front. The first controller 34 has a housing 80 formed by plastic molding, for example. The housing 80 is formed into an approximately rectangular parallelepiped shape regarding a back and forth direction (Z-axis direction shown in FIG. 3) as a longitudinal direction, and has a size small enough to be held by one hand of a child and an adult. As one example, the housing 80 has a length or a width approximately the same as that of the palm of the person. A player can perform a game operation by means of the first controller 34, that is, by pushing buttons provided on it and by changing a position and a direction of the first controller 34 itself.
The housing 80 is provided with a plurality of operation buttons (operation keys). That is, on the top surface of the housing 80, a cross key 82a, a 1 button 82b, a 2 button 82c, an A button 82d, a − button 82e, a menu button 82f, and a + button 82g are provided. Meanwhile, on the bottom surface of the housing 80, a concave portion is formed, and on the rearward inclined surface of the concave portion, a B button 82h is provided. Each of the buttons (switches) 82a-82h is assigned an appropriate function according to a game program to be executed by the game apparatus 12. Furthermore, the housing 80 has, on its top surface, a power switch 82i for turning on/off the power of the main body of the game apparatus 12 from a remote place. The respective buttons (switches) provided on the first controller 34 may inclusively be indicated with the use of the reference numeral 82.
At the back surface of the housing 80, the above-described connector 34a is provided. The connector 34a is a 32-pin edge connector, for example, and is utilized for connecting other devices to the first controller 34. In this embodiment, the connector 34a is connected with the connector 36b of the second controller 36. At the back end of the top surface of the housing 80, a plurality of LEDs 84 are provided, and the plurality of LEDs 84 show a controller number (identification number of the controller) of the controller 14. The game apparatus 12 can be connected with a maximum of four controllers 14, for example. If a plurality of controllers 14 are connected to the game apparatus 12, a controller number is applied to the respective controllers 14 in the connecting order, for example. Each LED 84 corresponds to a controller number, and the LED 84 corresponding to the controller number lights up.
Furthermore, inside the housing 80 of the first controller 34, an acceleration sensor 86 (FIG. 5) is provided. As the acceleration sensor 86, acceleration sensors of an electrostatic capacity type can typically be utilized. The acceleration sensor 86 detects accelerations of a linear component for each sensing axis and gravitational acceleration out of the accelerations applied to a detection portion of the acceleration sensor. More specifically, in this embodiment, a three-axis acceleration sensor is applied to detect the respective accelerations in directions of three axes: an up and down direction (Y-axial direction shown in FIG. 3), a right and left direction (X-axial direction shown in FIG. 3), and a forward and rearward direction (Z-axial direction shown in FIG. 3) of the first controller 34.
It should be noted that as the acceleration sensor 86, two-axis acceleration sensors may be utilized for detecting any two of the directions of the accelerations out of the up and down direction, the right and left direction and the back and forth direction according to the shape of the housing 80, the limitation on how to hold the first controller 34, or the like. Under certain circumstances, a one-axis acceleration sensor may be used.
In addition, the first controller 34 has an imaged information arithmetic section 88 (see FIG. 5). As shown in FIG. 3(B), on the front end surface of the housing 80, a light incident opening 90 of the imaged information arithmetic section 88 is provided, and from the light incident opening 90, infrared rays emitted by the markers 44m and 44n of the sensor bar 44 are captured.
FIG. 4 shows one example of an appearance of the second controller 36. FIG. 4(A) is a perspective view of the second controller 36 as seeing it from above rear, and FIG. 4(B) is a perspective view of the second controller 36 as seeing it from below front. Additionally, in FIG. 4, the cable 36a of the second controller 36 is omitted.
The second controller 36 has a housing 92 formed by plastic molding, for example. The housing 92 is formed into an approximately thin long elliptical shape in the forward and backward directions (Z-axis direction in FIG. 4) when viewed from plan, and the width of the right and left direction (X-axis direction in FIG. 4) at the back end is narrower than that of the front end. Furthermore, the housing 92 has a curved shape as a whole when viewed from a side, and is downwardly curved from a horizontal portion at the front end to the back end. The housing 92 has a size small enough to be held by one hand of a child and an adult similar to the first controller 34 as a whole, and has a longitudinal length (in the Z-axis direction) slightly shorter than that of the housing 80 of the first controller 34. Even with the second controller 36, the player can perform a game operation by operating buttons and a stick, and by changing a position and a direction of the controller by moving itself.
At the end of the top surface of the housing 92, an analog joystick 94a is provided. At the end of the housing 92, a front edge slightly inclined backward is provided, and on the front edge are provided a C button 94b and a Z button 94c vertically arranged (Y-axis direction in FIG. 4). The analog joystick 94a and the respective buttons 94b and 94c are assigned appropriate functions according to a game program to be executed by the game apparatus 12. The analog joystick 94a and the respective buttons 94b and 94c provided to the second controller 36 may be inclusively denoted by means of the reference numeral 94.
Inside the housing 92 of the second controller 36, an acceleration sensor 96 (FIG. 5) is provided. As the acceleration sensor 96, an acceleration sensor similar to the acceleration sensor 86 in the first controller 34 is applied. More specifically, the three-axis acceleration sensor is applied in this embodiment, and detects accelerations in the respective three axis directions like an up and down direction (Y-axial direction shown in FIG. 4), a right and left direction (X-axial direction shown in FIG. 4), and a forward and backward direction (Z-axial direction shown in FIG. 4) of the second controller 36.
Additionally, the shapes of the first controller 34 shown in FIG. 3 and the second controller 36 shown in FIG. 4 and the shape, the number and the setting position of the buttons (switches, stick, or the like), etc. are merely one example, and can be changed to other shapes, numbers and setting positions, etc. as needed.
Furthermore, the controller 14 is powered by a battery (not illustrated) detachably housed in the first controller 34. The second controller 36 is supplied with the power through the connector 34a, the connector 40, and the cable 36a.
FIG. 5 shows one example of an electric configuration of the controller 14 when the first controller 34 and the second controller 36 are connected with each other. The first controller 34 has a communication unit 98, and the communication unit 98 is connected with the operating portion 82, the acceleration sensor 86, the imaged information arithmetic section 88 and the connector 34a. The operating portion 82 indicates the above-described operation buttons or operating switches 82a-82i. When the operating portion 82 is operated, an operation signal (key information) is applied to the communication unit 98. The data indicative of acceleration detected by the acceleration sensor 86 is output to the communication unit 98. The acceleration sensor 86 has a maximum sampling period on the order of 200 frames per second.
The data taken in by the imaged information arithmetic section 88 is also output to the communication unit 98. The imaged information arithmetic section 88 is constituted by an infrared filter 100, a lens 102, an imager 104 and an image processing circuit 106. The infrared filter 100 passes only infrared rays from the light incident from the light incident opening 90 at the front of the first controller 34. As described above, the markers 44m and 44n of the sensor bar 44 placed near (around) the display screen of the monitor 30 are infrared LEDs for outputting infrared light forward of the monitor 30. Accordingly, by providing the infrared filter 100, it is possible to image the markers 44m and 44n more accurately. The lens 102 condenses the infrared rays passing through the infrared filter 100 to emit them to the imager 104. The imager 104 is a solid imager, such as a CMOS sensor or a CCD, for example, and images the infrared rays condensed by the lens 102. Accordingly, the imager 104 images only the infrared rays passing through the infrared filter 100 to generate image data. Hereafter, the image imaged by the imager 104 is called an “imaged image”. The image data generated by the imager 104 is processed by the image processing circuit 106. The image processing circuit 106 calculates positions of objects to be imaged (markers 44m and 44n) within the imaged image, and outputs marker coordinates data including each coordinate value indicative of the positions to the communication unit 98 for each predetermined time (one frame, for example). It should be noted that a description of the image processing circuit 106 is made later.
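A toy version of the computation attributed to the image processing circuit 106: finding the coordinates of the two imaged markers within an infrared frame once per frame. The threshold-and-centroid method and the column-gap split into two clusters are assumptions for illustration, not the circuit's actual algorithm.

```python
import numpy as np

# Sketch: compute the positions of the two imaged markers within an IR frame
# and output them as marker coordinates. The detection method is illustrative.

def marker_coordinates(ir_frame, threshold=200):
    """Return a list of (x, y) centroids of bright regions in the IR frame."""
    ys, xs = np.nonzero(ir_frame >= threshold)
    if xs.size == 0:
        return []
    order = np.argsort(xs)
    xs, ys = xs[order], ys[order]
    # split into two clusters at the largest horizontal gap (two markers expected)
    gaps = np.diff(xs)
    if xs.size > 1 and gaps.max() > 5:
        cut = int(np.argmax(gaps)) + 1
        groups = [(xs[:cut], ys[:cut]), (xs[cut:], ys[cut:])]
    else:
        groups = [(xs, ys)]
    return [(float(gx.mean()), float(gy.mean())) for gx, gy in groups]

frame = np.zeros((120, 160), dtype=np.uint8)
frame[60:63, 40:43] = 255     # marker 44m
frame[60:63, 110:113] = 255   # marker 44n
print(marker_coordinates(frame))   # approximately [(41.0, 61.0), (111.0, 61.0)]
```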
Theconnector34ais connected with theconnector36bof thecable36aextending from thesecond controller36. Theconnector36bis connected with the operatingportion94 and theacceleration sensor96 of thesecond controller36. The operatingportion94 denotes the above-describedanalog joystick94aandoperation buttons94band94c. When the operatingportion94 is operated, an operation signal is applied to thecommunication unit98 via thecable36a, theconnector36b, theconnector34a, etc. Theacceleration sensor96 also has a sampling period similar to that of theacceleration sensor86, and applies the data indicative of the detected acceleration to thecommunication unit98.
Thecommunication unit98 includes a microcomputer (micon)108, amemory110, awireless module78 and anantenna112. Themicon108 transmits the obtained data to thegame apparatus12 and receives data from thegame apparatus12 by controlling thewireless module78 while using thememory110 as a memory area (working area and buffer area) in processing.
The data output from the operatingportion82, theacceleration sensor86 and the imaged informationarithmetic section88 of thefirst controller34, and the operatingportion94 andacceleration sensor96 of thesecond controller36 to themicon108 is temporarily stored in thememory110. The wireless transmission from thecommunication unit98 to theBluetooth communication unit76 of thegame apparatus12 is performed every predetermined cycle. The game processing is generally performed by regarding 1/60 seconds as a unit, and therefore, it is necessary to perform the transmission from thefirst controller34 at a cycle equal to or shorter than it. Themicon108 outputs data including the operation data of the operatingportions82 and94 and the acceleration data of theacceleration sensors86 and96, and marker coordinates data from the imaged informationarithmetic section88 stored in thememory110 to thewireless module78 as controller data when transmission timing to thegame apparatus12 has come. Thewireless module78 modulates a carrier of a predetermined frequency by the controller data, and emits its weak radio wave signal from theantenna112 by using a short-range wireless communication technique, such as Bluetooth. Namely, the controller data is modulated to the weak radio wave signal by thewireless module78 and transmitted from thefirst controller34. The weak radio wave signal is received by theBluetooth communication unit76 of thegame apparatus12. The weak radio wave thus received is subjected to demodulating and decoding processing, thus making it possible for thegame apparatus12 to obtain the controller data. TheCPU46 of thegame apparatus12 performs the game processing on the basis of the controller data obtained from thecontroller14.
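By way of illustration only, the following Python sketch shows how such a controller data report might be packed before transmission; the field layout, sizes, and ordering are assumptions made for this example and are not the actual format used on the Bluetooth link.

import struct

def pack_controller_data(key_bits, accel1, accel2, marker_coords):
    """Pack one controller report (illustrative layout only)."""
    data = struct.pack("<H", key_bits)            # operating portions 82/94 button bits
    data += struct.pack("<3h", *accel1)           # acceleration sensor 86 (x, y, z)
    data += struct.pack("<3h", *accel2)           # acceleration sensor 96 (x, y, z)
    for x, y in marker_coords[:2]:                # up to two marker coordinates
        data += struct.pack("<2H", x, y)
    return data

# Example: two buttons pressed, both controllers level, two markers detected.
report = pack_controller_data(0b11, (0, 512, 0), (0, 512, 0), [(400, 380), (620, 380)])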
It will be appreciated by those skilled in the art from the description of this specification that a computer, such as a processor (CPU46, for example) of thegame apparatus12 or the processor (micon108, for example) of thecontroller14 executes processing on the basis of an acceleration signal output from theacceleration sensors86 and96, and whereby, more information relating to thecontroller14 can be estimated or calculated (determined). In a case that processing is executed on the side of the computer assuming that thefirst controller34 andsecond controller36 respectively incorporated with theacceleration sensors86 and96 are in a static state (that is, processing is executed considering that accelerations detected by theacceleration sensors86 and96 are only gravitational accelerations), if thefirst controller34 and thesecond controller36 are actually in a static state, it is possible to know whether or not the orientations of thefirst controller34 and thesecond controller36 are inclined with respect to the direction of gravity or to what extent they are inclined on the basis of the detected acceleration. More specifically, when a state in which the detection axes of theacceleration sensors86 and96 are directed to a vertically downward direction is taken as a reference, merely whether or not 1G (gravitational acceleration) is imposed on can show whether or not each of thefirst controller34 and thesecond controller36 is inclined, and the size can show to what extent each of them is inclined. Furthermore, if a multi-axes acceleration sensor is applied, by further performing processing on an acceleration signal of each axis, it is possible to more precisely know to what extent thefirst controller34 and thesecond controller36 are inclined with respect to the direction of gravity. In this case, on the basis of outputs from theacceleration sensors86 and96, the computer may perform processing of calculating data of inclined angles of thefirst controller34 andsecond controller36, but perform processing of estimating an approximate inclination on the basis of the outputs from theacceleration sensors86 and96 without performing the processing of calculating the data of the inclined angle. Thus, by using theacceleration sensors86 and96 in conjunction with the computer, it is possible to determine an inclination, an orientation or a position of each of thefirst controller34 andsecond controller36.
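As a hedged illustration of the static-state computation described above, the following Python sketch estimates whether a controller is static and, if so, its inclination from a three-axis acceleration reading; the axis convention, normalization to units of g, and the tolerance are assumptions made for this example.

import math

GRAVITY = 1.0            # readings assumed normalized to units of g
STATIC_TOLERANCE = 0.05  # assumed threshold for treating a reading as "static"

def is_static(ax, ay, az):
    """The controller is roughly static if only gravity (about 1 g) is measured."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - GRAVITY) < STATIC_TOLERANCE

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (radians) relative to the direction of gravity."""
    pitch = math.atan2(-az, math.sqrt(ax * ax + ay * ay))  # forward/backward lean
    roll = math.atan2(ax, ay)                               # left/right lean
    return pitch, roll

# Example: in this assumed axis convention, a level controller reads roughly (0, 1, 0).
if is_static(0.0, 1.0, 0.0):
    print(tilt_angles(0.0, 1.0, 0.0))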
On the other hand, assuming that theacceleration sensors86 and96 are in a dynamic state, accelerations according to the movement of theacceleration sensors86 and96 are detected in addition to the gravitational acceleration component, and therefore, if the gravitational acceleration component is removed by predetermined processing, it is possible to know a moving direction, etc. More specifically, in a case that thefirst controller34 and thesecond controller36 respectively being furnished with theacceleration sensors86 and96 are accelerated and moved by the hands of the user, acceleration signals generated by theacceleration sensors86 and96 are processed by the above-described computer, and whereby, it is possible to calculate various movements and/or positions of thefirst controller34 and thesecond controller36. Additionally, even when assuming that theacceleration sensors86 and96 are in a dynamic state, if an acceleration in correspondence with the movement of each of theacceleration sensors86 and96 is removed by the predetermined processing, it is possible to know the inclination with respect to the direction of gravity. In another embodiment, each of theacceleration sensors86 and96 may contain a built-in signal processing apparatus or other kinds of dedicated processing apparatuses for performing desired processing on the acceleration signal output from the incorporated acceleration detecting means before outputting the signal to themicon108. For example, in a case that theacceleration sensors86 and96 are ones for detecting a static acceleration (gravitational acceleration, for example), the built-in or dedicated processing apparatuses may be ones for transforming the detected acceleration signal into the inclined angle (or other preferable parameters) corresponding thereto.
In thisgame system10, a user can make an operation or input to the game by moving thecontroller14. In playing the game, the user holds thefirst controller34 with the right hand and thesecond controller36 with the left hand as shown inFIG. 6. As described above, in this embodiment, thefirst controller34 contains theacceleration sensor86 for detecting accelerations in the three-axis directions, and thesecond controller36 also contains thesame acceleration sensor96. When thefirst controller34 and thesecond controller36 are moved by the user, acceleration values respectively indicating the movements of the controllers are detected by theacceleration sensor86 and theacceleration sensor96. In thegame apparatus12, game processing can be executed according to the detected acceleration values.
Furthermore, thefirst controller34 is provided with the imaged informationarithmetic section88, and this makes it possible for the user to utilize thefirst controller34 as a pointing device. In this case, the user holds thefirst controller34 with the edge surface (light incident opening90) of thefirst controller34 directed to themarkers44mand44n. It should be noted that as understood fromFIG. 1, themarkers44mand44nare placed around a predetermined side (top or bottom) of themonitor30 in parallel with a predetermined side. In this state, the user can perform a game operation by changing a position on the screen designated by thefirst controller34 by moving thefirst controller34 itself, and by changing distances between thefirst controller34 and each of themarkers44mand44n.
FIG. 7 is a view explaining viewing angles between therespective markers44mand44n, and thefirst controller34. As shown inFIG. 7, each of themarkers44mand44nemits an infrared ray within a range of a viewing angle α. Also, theimager104 of the imaged informationarithmetic section88 can receive incident light within the range of the viewing angle β taking the line of sight of the first controller34 (Z axis direction inFIG. 3) as a center. For example, the viewing angle α of each of themarkers44mand44nis 34° (half-value angle) while the viewing angle β of theimager104 is 42°. The user holds thefirst controller34 such that theimager104 is directed and positioned so as to receive the infrared rays from themarkers44mand44n. More specifically, the user holds thefirst controller34 such that at least one of themarkers44mand44nexists in the viewing angle β of theimager104, and thefirst controller34 exists in at least one of the viewing angles α of themarker44mor44n. In this state, thefirst controller34 can detect at least one of themarkers44mand44n. The user can perform a game operation by changing the position and the orientation of thefirst controller34 in the range satisfying this state. Also, in a case that only one of themarkers44mand44nis detected, a designated position by thefirst controller34 can still be calculated by setting temporary marker coordinates in place of the undetected marker on the basis of data from when both of themarkers44mand44nwere previously detected.
If the position and the orientation of thefirst controller34 are out of the range, the game operation based on the position and the orientation of thefirst controller34 cannot be performed. Hereafter, the above-described range is called an “operable range.”
If thefirst controller34 is held within the operable range, an image of each of themarkers44mand44nis imaged by the imaged informationarithmetic section88. That is, the imaged image obtained by theimager104 includes an image (object image) of each of themarkers44mand44nas an object to be imaged.FIG. 8 is an illustrative view showing one example of the imaged image including object images. Theimage processing circuit106 calculates coordinates (marker coordinates) indicative of the position of each of themarkers44mand44nin the imaged image by utilizing the image data of the imaged image including theobject images44m′ and44n′.
Since theobject images44m′ and44n′ appear as high-intensity parts in the image data of the imaged image, theimage processing circuit106 first detects the high-intensity parts as candidates of the object images. Next, theimage processing circuit106 determines whether or not each of the high-intensity parts is an object image on the basis of the size of the detected high-intensity part. The imaged image may include images other than the object images due to sunlight through a window and light of a fluorescent lamp in the room as well as the twoobject images44m′ and44n′ (marker images). The determination processing as to whether or not a high-intensity part is an object image is executed for discriminating theimages44m′ and44n′ of the twomarkers44mand44nas object images from the other images, and accurately detecting the object images. In order to discriminate theobject images44m′ and44n′ in the imaged image from other images, the imaging objects44mand44nneed to be known, and in this embodiment, their size is decided in advance, and therefore, it is possible to estimate the size of themarker images44m′ and44n′. Thus, on the basis of the size of a high-intensity part, it is possible to make a determination of themarker images44m′ and44n′. More specifically, in the determination processing, it is determined whether or not the size of each detected high-intensity part falls within a preset predetermined range. Then, if the size is within the predetermined range, it is determined that the high-intensity part represents an object image. On the contrary, if the size is not within the predetermined range, it is determined that the high-intensity part represents an image other than the object images.
In addition, as to the high-intensity part which is determined to represent the object image as a result of the above-described determination processing, theimage processing circuit106 calculates the position of the high-intensity part. More specifically, the barycenter position of the high-intensity part is calculated. Here, the coordinates of the barycenter position are called “marker coordinates”. Also, the barycenter position can be calculated with a more detailed scale than the resolution of theimager104. Now, assume that the resolution of the imaged image imaged by theimager104 is 126×96, and that the barycenter position is calculated at a scale of 1024×768. That is, the marker coordinates are represented by integer values from (0, 0) to (1024, 768).
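For illustration, the following Python sketch approximates the processing of theimage processing circuit106 described above: it thresholds the imaged image, groups connected high-intensity parts, filters them by size, and rescales each barycenter to the 1024×768 coordinate grid. The threshold and size limits are illustrative assumptions.

def find_marker_coordinates(image, threshold=200, min_size=2, max_size=40):
    """Detect bright blobs (candidate marker images) and return their barycenters.

    `image` is a 2-D list of 8-bit luminance values; thresholds and blob-size
    limits are illustrative, and barycenters are rescaled to a 1024x768 grid.
    """
    height, width = len(image), len(image[0])
    visited = [[False] * width for _ in range(height)]
    markers = []
    for y in range(height):
        for x in range(width):
            if visited[y][x] or image[y][x] < threshold:
                continue
            # flood-fill one connected high-intensity part
            stack, pixels = [(x, y)], []
            visited[y][x] = True
            while stack:
                cx, cy = stack.pop()
                pixels.append((cx, cy))
                for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                    if 0 <= nx < width and 0 <= ny < height \
                            and not visited[ny][nx] and image[ny][nx] >= threshold:
                        visited[ny][nx] = True
                        stack.append((nx, ny))
            # keep only blobs whose size is plausible for a marker image
            if min_size <= len(pixels) <= max_size:
                bx = sum(p[0] for p in pixels) / len(pixels)
                by = sum(p[1] for p in pixels) / len(pixels)
                markers.append((bx * 1024 / width, by * 768 / height))
    return markers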
Additionally, as shown inFIG. 8, the position in the imaged image is represented in a coordinate system (X-Y coordinate system of the imaged image) by taking the upper left of the imaged image as the origin O, the downward direction as the Y-axis positive direction, and the right direction as the X-axis positive direction.
Furthermore, in a case that theobject images44m′ and44n′ are accurately detected, two high-intensity parts are determined as object images by the determination processing, and therefore, it is possible to calculate two marker coordinates. Theimage processing circuit106 outputs data indicative of the calculated two marker coordinates, that is, imaging object data indicative of positions of the imaging objects to thecommunication unit98. The output imaging object data (marker coordinate data) is included in the controller data by themicon108 as described above, and transmitted to thegame apparatus12.
When taking in the marker coordinate data from the received controller data, the game apparatus12 (CPU46) can calculate a designated position (designated coordinates) of thefirst controller34 on the screen of themonitor30 and the distance from thefirst controller34 to each of themarkers44mand44non the basis of the marker coordinate data. For example, when thefirst controller34 designates the left end of themonitor30, theobject images44m′ and44n′ are detected at the right of the imaged image, and when thefirst controller34 designates the lower end of the screen, theobject images44m′ and44n′ are detected at the upper portion of the imaged image. In other words, the marker coordinates on the imaged image are detected at positions reverse to the designated position of thefirst controller34 on the screen. Accordingly, when the coordinates of the designated position of thefirst controller34 are calculated from the marker coordinates, the coordinate system is appropriately transformed from the coordinate system of the imaged image inFIG. 8 to a coordinate system for representing positions on the screen.
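A minimal sketch of such a coordinate transformation is shown below, assuming a simple linear mapping from the 1024×768 imaged-image coordinate system to a hypothetical 640×480 screen, with both axes mirrored as explained above; the screen size and the linear form of the mapping are assumptions for this example.

def designated_position(marker1, marker2, screen_w=640, screen_h=480):
    """Map the midpoint of the two marker coordinates (1024x768 imaged-image
    space, origin at the upper left) to a screen position.  Because the marker
    coordinates move opposite to the pointing direction, both axes are mirrored."""
    mx = (marker1[0] + marker2[0]) / 2
    my = (marker1[1] + marker2[1]) / 2
    sx = (1.0 - mx / 1024) * screen_w   # mirror horizontally
    sy = (1.0 - my / 768) * screen_h    # mirror vertically
    return sx, sy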
Additionally, in this embodiment, thefirst controller34 performs predetermined arithmetic processing on the imaged data to detect the marker coordinates, and transmits the marker coordinate data to thegame apparatus12. However, in another embodiment, imaged data is transmitted as controller data from thefirst controller34 to thegame apparatus12, and theCPU46 of thegame apparatus12 performs predetermined arithmetic processing on the imaged data to detect the marker coordinates and the coordinates of the designated position.
Furthermore, the distance between the object images in the imaged image is changed depending on the distance between thefirst controller34 and each of themarkers44mand44n. Since the distance between themarkers44mand44n, the width of the imaged image, and the viewing angle β of theimager104 are decided in advance, by calculating the distance between the two marker coordinates, thegame apparatus12 can calculate the current distance between thefirst controller34, and each of themarkers44mand44n.
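For example, assuming a marker spacing of 20 cm, an imaged-image width of 1024, and the viewing angle β of 42° mentioned above (all constants here are illustrative), the distance can be estimated as in the following sketch: the width visible at the sensor-bar distance is recovered from the pixel gap between the two marker coordinates, and is then converted to a distance using the half viewing angle of theimager104.

import math

def controller_to_marker_distance(marker1, marker2,
                                  real_gap_cm=20.0, image_width=1024,
                                  viewing_angle_deg=42.0):
    """Estimate the distance from the first controller 34 to the markers."""
    pixel_gap = math.hypot(marker2[0] - marker1[0], marker2[1] - marker1[1])
    if pixel_gap == 0:
        return None
    # width of the scene visible at the marker distance, from the pixel gap
    visible_width_cm = real_gap_cm * image_width / pixel_gap
    # distance d satisfies visible_width = 2 * d * tan(beta / 2)
    return (visible_width_cm / 2) / math.tan(math.radians(viewing_angle_deg) / 2)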
As described above, in thegame system10, the player generally operates thecontroller14, and thegame apparatus12 executes game processing based on controller data from thecontroller14. Here, since thegame apparatus12 can perform wireless communications with thegame apparatus100 as described before, there are some games utilizing thegame apparatus100 as a controller (the “pop-up” game, for example, described later). In a case that such a kind of game is played, thegame system10 further includes a game apparatus100 (hand-held type game apparatus).
InFIG. 9 toFIG. 11, an external view of agame apparatus100 is shown. Thegame apparatus100 is a foldable game apparatus, and each ofFIG. 9 andFIG. 10 shows thegame apparatus100 in an opened state (open state), andFIG. 11 shows thegame apparatus100 in a closed state (close state). Furthermore,FIG. 9 is a front view of thegame apparatus100 in the open state, andFIG. 10 is a side view of the game apparatus in the open state. Thegame apparatus100 has two displays (LCDs112 and114) and two cameras (cameras116 and118), and can image an image with a camera, display the imaged image and store the data of the imaged image.
Thegame apparatus100 is constructed small enough to be held by the user with both of the hands or one hand even in the open state.
Thegame apparatus100 has two housings of alower housing120 and anupper housing122. Thelower housing120 and theupper housing122 are connected with each other so as to be opened or closed (foldable). In this embodiment, therespective housings120 and122 are formed in the form of plate of a horizontally long rectangular, and are rotatably connected with each other at the long sides of both of the housings.
Theupper housing122 is supported pivotally at a part of the upper side of thelower housing120. This allows thegame apparatus100 to take a close state (the angle formed by thelower housing120 and theupper housing122 is about 0° (seeFIG. 11)) and an open state (the angle formed by thelower housing120 and theupper housing122 is about 180° (seeFIG. 10)). The user generally uses thegame apparatus100 in the open state, and keeps thegame apparatus100 in the close state when not using thegame apparatus100. Furthermore, thegame apparatus100 can maintain the angle formed by thelower housing120 and theupper housing122 at an arbitrary angle between the close state and the open state by friction, etc. exerted on the hinge, in addition to the close state and the open state described above. That is, theupper housing122 can be fixed with respect to thelower housing120 at an arbitrary angle.
First, the configuration of thelower housing120 is explained. As shown inFIG. 9, thegame apparatus100 has the lower LCD (liquid crystal display)112. Thelower LCD112 takes a horizontally-long shape, and is arranged such that the direction of the long side is coincident with the long side of thelower housing120. Thelower LCD112 is provided on an inner surface of thelower housing120. Accordingly, if thegame apparatus100 is not to be used, thegame apparatus100 is set to the close state to thereby prevent the screen of thelower LCD112 from being soiled, damaged, and so forth. Additionally, in this embodiment, an LCD is used as a display, but other arbitrary displays, such as a display utilizing EL (Electro Luminescence), for example, may be used. Furthermore, thegame apparatus100 can employ a display of an arbitrary resolution. Additionally, in a case that thegame apparatus100 is used as an imaging device, thelower LCD112 is used for displaying, in real time, images (through image) imaged by thecamera116 or118.
The inner surface of thelower housing120 is formed to be approximately planar. At the center of the inner surface, anopening120bfor exposing thelower LCD112 is formed. At the left of theopening120b(in the negative direction of the y axis in the drawing), anopening120cis formed, and at the right of theopening120b, anopening120dis formed. Theopenings120cand120dare for exposing the respective keytops (the top surfaces of the respective buttons124ato124e). Then, the screen of thelower LCD112 provided inside thelower housing120 is exposed from theopening120b, and the respective keytops are exposed from theopenings120cand120d. Thus, on the inner surface of thelower housing120, on both sides of theopening120bfor thelower LCD112 set at the center, non-screen areas (dotted line areas A1 and A2 shown inFIG. 9; more specifically, areas for arranging the respective buttons124ato124e, i.e., button arranging areas) are provided.
On thelower housing120, the respective buttons124ato124iand atouch panel128 are provided as input devices. As shown inFIG. 9, the direction input button124a, thebutton124b, thebutton124c, thebutton124d, the button124e, and thepower button124fout of the respective buttons124ato124iare provided on the inner surface of thelower housing120. The direction input button124ais utilized for a selecting operation, for example, and therespective buttons124bto124eare utilized for a decision operation and a cancel operation, for example. Thepower button124fis utilized for turning on/off the power of thegame apparatus100. Here, the direction input button124aand thepower button124fare provided on one side (left side inFIG. 9) of thelower LCD112 provided at substantially the center of thelower housing120, and thebuttons124bto124eare provided at the other side (right side inFIG. 9) of thelower LCD112. The direction input button124aand thebuttons124bto124eare utilized for performing various operations to thegame apparatus100.
FIG. 11(A) is a left side view of thegame apparatus100 in the close state,FIG. 11(B) is a front view of thegame apparatus100,FIG. 11(C) is a right side view of thegame apparatus100, andFIG. 11(D) is a rear view of thegame apparatus100. As shown inFIG. 11(A), thevolume button124iis provided on the left side surface of thelower housing120. Thevolume button124iis utilized for adjusting a volume of aspeaker134 furnished in thegame apparatus100. Furthermore, as shown in FIG.11(B), thebutton124his provided at the right corner of the upper side surface of thelower housing120. Thebutton124gis provided at the left corner of the upper side surface of thelower housing120. The both of thebuttons124gand124hare utilized for performing an imaging instructing operation (shutter operation) on thegame apparatus100, for example. Alternatively, both of thebuttons124gand124hmay be made to work as shutter buttons. In this case, a right-handed user can use thebutton124h, and a left-handed user can use thebutton124g, capable of improving usability for both of the users. Additionally, thegame apparatus100 can constantly make both of thebuttons124gand124hvalid as shutter buttons, or thegame apparatus100 is set to be a right-handed use or a left-handed use (the setting is input by the user according to a menu program, etc. and the set data is stored), and when the right-handed use is set, only thebutton124his made valid, and when the left-handed use is set, only thebutton124gmay be made valid.
As shown inFIG. 9, thegame apparatus100 is further provided with thetouch panel128 as an input device other than the respective operation buttons124ato124i. Thetouch panel128 is set on the screen of thelower LCD112. In this embodiment, thetouch panel128 is a touch panel of a resistance film system. Here, an arbitrary press-type touch panel may be employed in place of the resistance film system. In this embodiment, as thetouch panel128, a touch panel having the same resolution (detection accuracy) as that of thelower LCD112 is utilized. The resolution of thetouch panel128 and the resolution of thelower LCD112 are not necessarily coincident with each other. Furthermore, at the right side surface of thelower housing120, an inserting portion130 (shown by a dotted line inFIG. 9 andFIG. 11(D)) is provided. The insertingportion130 can accommodate atouch pen136 utilized for performing an operation on thetouch panel128. It should be noted that an input to thetouch panel128 is generally performed by means of thetouch pen136, but can also be performed with a finger of the user besides thetouch pen136.
As shown inFIG. 10 andFIG. 11 (C), on the right side surface of thelower housing120, an openable and closeable cover portion11bis provided. Inside the cover portion11b, a loading slot (dashed line) for loading amemory card138aand a connector (not illustrated) for electrically connecting thegame apparatus100 and thememory card138aare provided. Thememory card138ais detachably attached to a connector. Thememory card138ais used for storing (saving) image data imaged by thegame apparatus100, for example.
Furthermore, on the top surface of thelower housing120, a loading slot (chain double-dashed line)138bfor loading a memory card and a connector (not shown) for electrically connecting thegame apparatus100 and thememory card138bare provided. Thememory card138bis utilized for storing a program operated in thegame apparatus100, for example.
As shown inFIG. 9, at the left of theshaft portion120aof thelower housing120, threeLEDs126ato126care attached. Here, thegame apparatus100 can perform a wireless communication with the same kind of another game apparatus and theaforementioned game apparatus12, and thefirst LED126alights up when a wireless communication is established. Thesecond LED126blights up while thegame apparatus100 is recharged. Thethird LED126clights up when the main power supply of thegame apparatus100 is turned on. Accordingly, by the threeLEDs126ato126c, it is possible to inform the user of a communication-established state, a charge state, and a main power supply on/off state of thegame apparatus100.
As described above, thelower housing120 is provided with the input device (touch panel128 and respective buttons124ato124i) for performing an operation input to thegame apparatus100. Accordingly, when utilizing thegame apparatus100, the user can perform an operation on thegame apparatus100 while holding thelower housing120.FIG. 12 shows a situation in which the user holds thegame apparatus100 with both hands. As shown inFIG. 12, the user holds the side surface and the outer surface (surface opposite to the inner surface) of thelower housing120 with the palms, the middle fingers, the ring fingers and the little fingers of both of the hands in a state that therespective LCDs112 and114 are directed to the user. By holding thegame apparatus100 in such a manner, the user can perform operations as to the respective buttons124ato124ewith the thumbs, and perform operations as to thebuttons124gand124hwith the index fingers while holding thelower housing120.
On the other hand, theupper housing122 has a configuration for imaging an image (camera), and a configuration for displaying the imaged image (display). The configuration of theupper housing122 is explained below.
As shown inFIG. 9, thegame apparatus100 has theupper LCD114. Theupper LCD114 is set to theupper housing122. Theupper LCD114 takes a horizontally-long shape, and is arranged such that the direction of the long side is coincident with the long side of theupper housing122. Theupper LCD114 is provided on the inner surface of the upper housing122 (the inner surface when thegame apparatus100 is in the close state). Accordingly, if thegame apparatus100 is not to be used, thegame apparatus100 is set to the close state to thereby prevent the screen of theupper LCD114 from being soiled, damaged, and so forth. Here, similar to thelower LCD112, in place of theupper LCD114, a display with an arbitrary form and an arbitrary resolution may be utilized. It should be noted that in another embodiment, a touch panel may be provided on theupper LCD114 as well.
Furthermore, thegame apparatus100 has the twocameras116 and118. Therespective cameras116 and118 are housed in theupper housing122. As shown inFIG. 9, theinward camera116 is attached to the inner surface of theupper housing122. On the other hand, as shown inFIG. 11(B), theoutward camera118 is attached to the surface being opposed to the surface to which theinward camera116 is provided, that is, the outer surface of the upper housing122 (outer surface when thegame apparatus100 is in the close state). Thus, theinward camera116 can image a direction to which the inner surface of theupper housing122 is turned, and theoutward camera118 can image a direction opposite to the imaging direction of theinward camera116, that is, a direction to which the outer surface of theupper housing122 is turned. As described above, in this embodiment, the twocameras116 and118 are provided so as to make the imaging directions opposite to each other. Accordingly, the user can image the two different directions without shifting thegame apparatus100 inside out. For example, the user can image a landscape as the user is seen from thegame apparatus100 with theinward camera116, and can image a landscape as the direction opposite to the user is seen from thegame apparatus100 with theoutward camera118.
Furthermore, theinward camera116 is attached to the center of theshaft portion122aformed at the center of the bottom of theupper housing122. That is, theinward camera116 is attached at the center of the part where the twohousings120 and122 are connected. Accordingly, in a case that thegame apparatus100 is in the open state, theinward camera116 is arranged between the twoLCDs112 and114 (seeFIG. 9). In other words, theinward camera116 is positioned in the vicinity of the center of thegame apparatus100. Here, “the center of thegame apparatus100” means the center of the operation surface of the game apparatus100 (surface being made up of the inner surfaces of therespective housings120 and122 in the open state). Here, it may be said that theinward camera116 is arranged in the vicinity of the center of theLCDs112 and114 in the horizontal direction. In this embodiment, when thegame apparatus100 is set to the open state, theinward camera116 is arranged in the vicinity of the center of thegame apparatus100, and therefore, in a case that the user images the user himself or herself by theinward camera116, the user merely has to hold thegame apparatus100 at a position directly facing it. That is, if the user holds the game apparatus at a normal holding position, the user is positioned at approximately the center of the imaging range, and the user himself or herself can easily be within the imaging range.
Furthermore, as shown inFIG. 11(B), theoutward camera118 is arranged at the upper end of the upper housing122 (portion far away from the lower housing120) in a case that thegame apparatus100 is set to the open state. Here, since theoutward camera118 is not for imaging the user holding thegame apparatus100, there is less need for being provided at the center of thegame apparatus100.
Furthermore, as shown inFIG. 9 orFIG. 11(B), amicrophone132 is housed in theupper housing122. More specifically, themicrophone132 is attached to theshaft portion122aof theupper housing122. In this embodiment, themicrophone132 is attached around the inward camera116 (next to theinward camera116 along the y axis), and specifically attached next to theinward camera116 in the positive direction of the y axis. Furthermore, a through hole formicrophone122cis mounted to theshaft portion122aat a position corresponding to the microphone132 (next to the inward camera116) such that themicrophone132 can detect a sound outside thegame apparatus100. Alternatively, themicrophone132 may be housed in thelower housing120. For example, the through hole formicrophone122cis provided on the inner surface of thelower housing120, specifically, at the lower left (button arranging area A1) of the inner surface of thelower housing120, and themicrophone132 may be arranged in the vicinity of the through hole formicrophone122cwithin thelower housing120. In addition, themicrophone132 is attached in such a direction that its sound collecting direction (direction in which the sensitivity becomes maximum) is approximately in parallel with the imaging direction (optical axis) of the inward camera116 (in other words, the sound collecting direction and the imaging direction are approximately in parallel with the z axis). Thus, a sound generated within the imaging range of theinward camera116 is suitably acquired by themicrophone132. That is, detection of a sound input through themicrophone132 and detection of the user by the imaged image by theinward camera116 can be simultaneously performed, and accuracy of the detections can be improved, at the same time.
As shown inFIG. 11(B), on the outer surface of theupper housing122, afourth LED126dis attached. Thefourth LED126dis attached around the outward camera118 (at the right side of theoutward camera118 in this embodiment). Thefourth LED126dlights up at a time when an imaging is made with theinward camera116 or the outward camera118 (shutter button is pushed). Furthermore, thefourth LED126dcontinues to light up while a motion image is imaged by theinward camera116 or theoutward camera118. By making thefourth LED126dlight up, it is possible to inform an object to be imaged that an imaging with thegame apparatus100 is made (is being made).
Furthermore, the inner surface of theupper housing122 is formed to be approximately planar. As shown inFIG. 9, at the center of the inner surface, anopening122bfor exposing theupper LCD114 is formed. The screen of theupper LCD114 housed inside theupper housing122 is exposed from theopening122b. Furthermore, on both sides of theaforementioned opening122b, asound release hole122dis formed, one on each side. Inside eachsound release hole122dof theupper housing122, aspeaker134 is housed. Thesound release hole122dis a through hole for releasing a sound from thespeaker134.
Thus, on the inner surface of theupper housing122, non-display areas (areas B1 and B2 represented by dotted lines inFIG. 9; more specifically, areas for arranging thespeaker134, i.e., speaker arranging areas) are provided on both sides of theopening122bset at the center of theupper LCD114. The two sound release holes122dare arranged at approximately the center of each speaker arranging area with respect to the horizontal direction, and at the lower portion of each speaker arranging area with respect to the vertical direction (the area close to the lower housing120).
Here, as described above, by providing the non-display areas on thelower housing120 and theupper housing122 at the same positions in the horizontal direction, thegame apparatus100 is configured to help user's holding not only when it is held horizontally as shown inFIG. 12, but also when it is held vertically (a state rotated to left or right by 90° from the state shown inFIG. 12).
As described above, theupper housing122 is provided with thecameras116 and118 which are configured to image an image and theupper LCD114 as a display means for displaying the imaged image. On the other hand, thelower housing120 is provided with the input device (touch panel128 and respective buttons124ato124i) for performing an operation input to thegame apparatus100. Accordingly, when utilizing thegame apparatus100 as an imaging device, the user can perform an input to the input device while holding thelower housing120 and viewing the imaged image (image imaged by the camera) displayed on theupper LCD114.
Furthermore, in the vicinity of thecamera116 of theupper housing122, themicrophone132 configured to input a sound is provided, and thegame apparatus100 can also be used as a recording device. In addition, the user performs a sound input over themicrophone132, and thegame apparatus100 can execute the game processing and application processing other than the game on the basis of the microphone input information as well.
FIG. 13 is a block diagram showing an internal configuration (electronic configuration) of thegame apparatus100. As shown inFIG. 13, thegame apparatus100 includes electronic components, such as aCPU142, amain memory148, amemory controlling circuit150, a memory for saveddata152, a memory forpreset data154, a memory card interface (memory card I/F)144, awireless communication module156, alocal communication module158, a real-time clock (RTC)160, apower supply circuit146, and an interface circuit (I/F circuit)140, etc. These electronic components are mounted on an electronic circuit board, and housed in the lower housing120 (or theupper housing122 may also be appropriate).
TheCPU142 is an information processor to execute various programs. In a case that thegame apparatus100 is utilized as an imaging device, the program for it is stored in the memory (memory for saveddata152, for example) within thegame apparatus100. TheCPU142 executes the program to allow thegame apparatus100 to function as an imaging device. Here, the programs to be executed by theCPU142 may previously be stored in the memory within thegame apparatus100, may be acquired from thememory card138b, and may be acquired from another appliance, for example, thegame apparatus12 through communications.
TheCPU142 is connected with themain memory148, thememory controlling circuit150, and the memory forpreset data154. Furthermore, thememory controlling circuit150 is connected with the memory for saveddata152. Themain memory148 is a memory means to be utilized as a work area and a buffer area of theCPU142. That is, themain memory148 stores various data to be utilized in the game processing and the application processing, and stores a program obtained from the outside (memory cards138b, thegame apparatus12, etc.) In this embodiment, a PSRAM (Pseudo-SRAM) is used, for example, as amain memory148. The memory for saveddata152 is a memory means for storing (saving) a program to be executed by theCPU142, data of an image imaged by therespective cameras116 and118, etc. The memory for saveddata152 is configured by a NAND type flash memory, for example. Thememory controlling circuit150 is a circuit for controlling reading and writing from and to the memory for saveddata152 according to an instruction from theCPU142. The memory forpreset data154 is a memory means for storing data (preset data), such as various parameters, etc. which are previously set in thegame apparatus100. As a memory forpreset data154, a flash memory to be connected to theCPU142 through an SPI (Serial Peripheral Interface) bus can be used.
The memory card I/F144 is connected to theCPU142. The memory card I/F144 performs reading and writing data from and to thememory cards138aand138battached to the connector according to an instruction from theCPU142. In this embodiment, the image data imaged by therespective cameras116 and118 is written to thememory card138a, and the image data stored in thememory card138ais read from thememory card138aand stored in the memory for saveddata152. Furthermore, the program and the data stored in thememory card138bare read and transferred to themain memory148.
Thewireless communication module156 has a function of connecting to a wireless LAN compliant with IEEE802.11b/g standards, for example. Furthermore, thelocal communication module158 has a function of performing a wireless communication with the same types of the game apparatuses by a predetermined communication system, such as an infrared communication system. Thewireless communication module156 andlocal communication module158 are connected to theCPU142. TheCPU142 can send and receive data by means of thewireless communication module156, over the Internet or directly without passing through the Internet with the same kind of other game apparatuses and thegame apparatus12, and can send and receive data with the same kind of other game apparatuses by means of thelocal communication module158.
It should be noted that thelocal communication module158 is contained in thegame apparatus100 in this embodiment, but may be provided to thememory card138bfor example. In this case, theCPU142 performs a control of communications via the memory card I/F144.
Additionally, theCPU142 is connected with theRTC160 and thepower supply circuit146. TheRTC160 counts a time to output the same to theCPU142. TheCPU142 can calculate a current time (date) on the basis of the time counted by theRTC160, and detects an operation timing as to when an image is to be acquired, etc. Thepower supply circuit146 controls power supplied from the power supply (a battery accommodated in the lower housing) included in thegame apparatus100, and supplies the power to the respective circuit components within thegame apparatus100.
Moreover, thegame apparatus100 is provided with themicrophone132 and thespeaker134. Themicrophone132 and thespeaker134 are connected to the I/F circuit140. Themicrophone132 detects a sound of the user to output a sound signal to the I/F circuit140. Thespeaker134 outputs a sound corresponding to the sound signal from the I/F circuit140. The I/F circuit140 is connected to theCPU142. Furthermore, thetouch panel128 is connected to the I/F circuit140. The I/F circuit140 includes a sound controlling circuit for controlling themicrophone132 and thespeaker134, and a touch panel controlling circuit for controlling thetouch panel128. The sound controlling circuit performs an A/D conversion and a D/A conversion on a sound signal, or converts a sound signal into sound data in a predetermined format. The converted audio data is written to a sound area (not shown) of themain memory148. If thegame apparatus100 is utilized as a recording device, the audio data stored in the sound area is written to the memory for saveddata152 via thememory controlling circuit150 thereafter (recorded in thememory card138avia the memory card I/F144 as required). Furthermore, the sound data (microphone input information) stored in the sound area is also utilized for various game processing. The touch panel controlling circuit generates touch position data in a predetermined format on the basis of the signal from thetouch panel128 and outputs the same to theCPU142. The touch position data indicates coordinates of a position where an input is performed on an input surface of thetouch panel128. Also, the touch panel controlling circuit performs reading of a signal from thetouch panel128 and generation of the touch position data per each predetermined time. TheCPU142 acquires the touch position data to thereby know the position where the input is made on thetouch panel128.
The operatingportion124 is made up of the aforementioned respective buttons124ato124i, and connected to theCPU142. The operation data indicating an input state (whether or not to be pushed) with respect to each of the operation buttons124ato124iis output from the operatingportion124 to theCPU142. TheCPU142 executes processing according to an input to the operatingportion124 by acquiring the operation data from the operatingportion124.
Therespective cameras116 and118 are connected to theCPU142. Therespective cameras116 and118 image images according to an instruction from theCPU142, and output imaged image data to theCPU142. TheCPU142 writes the image data from each of thecameras116 and118 to an image area (not shown) of themain memory148. In a case that thegame apparatus100 is utilized as an imaging device, the image data stored in the image area is written to the memory for saveddata152 via the memory controlling circuit150 (and moreover recorded in thememory card138avia the memory card I/F144 as required). Furthermore, the image data stored in the image area can also be utilized for various game processing.
In addition, each of theLCDs112 and114 is connected to theCPU142. Each of theLCDs112 and114 displays an image according to an instruction from theCPU142. In a case that thegame apparatus100 is utilized as an imaging device, theCPU142 displays an image acquired from any one of thecameras116 and118 on theupper LCD114, and displays an operation screen generated according to predetermined processing on thelower LCD112. If a game is played with thegame apparatus100, a game image is displayed on one or both of theLCDs112 and114.
When the “pop-up” game is played in thegame system10 including thegame apparatus100 configured as described above, the player images themonitor28 with the outward camera (hereinafter referred to as camera)118 of thegame apparatus100 while standing in front of themonitor28 connected to thegame apparatus12 as shown inFIG. 14. On the lower LCD (hereinafter, simply referred to as LCD)112 of thegame apparatus100, an imaged image including themonitor screen28ais displayed.
In themain memory42eand/or46 of thegame apparatus12, as shown inFIG. 15(A), aprogram memory area70 and adata memory area74 are formed, and in theprogram memory area70, agame program72, etc. is stored. Thegame program72 is software for implementing the “pop-up” game by controlling the entire hardware (seeFIG. 2) of thegame apparatus12 via theCPU40, and includes adisplay controlling program72acorresponding to flowcharts inFIG. 16(A) andFIG. 17. It should be noted that thedisplay controlling program72ain the first and second modified examples described later respectively correspond to flowcharts inFIG. 27 andFIG. 31(A).
Although illustration is omitted, in theprogram memory area70, various programs necessary for the “pop-up” game, such as, an output and communication controlling program (not illustrated) are stored other than thegame program72. The output and communication controlling program mainly controls an output to themonitor28 via the input-output processor42a, and mainly controls wireless communications with thegame apparatus100 via thewireless communication module50.
In thedata memory area74,time data76,image data78, etc. are stored. Thetime data76 is data indicating a time when an object (see PO, EO:FIG. 22(A), etc.) is made to go out of themonitor28 and return within the virtual space, and theimage data78 is data for displaying the object and a marker (M: seeFIG. 20(A),FIG. 22(A), etc.) on themonitor28.
On the other hand, in themain memory148 of thegame apparatus100, aprogram memory area170 and adata memory area174 are formed, and in theprogram memory area170, agame program172, etc. are stored as shown inFIG. 15(B). Thegame program172 is software for implementing the “pop-up” game by controlling the entire hardware (seeFIG. 13) of thegame apparatus100 via theCPU142, and includes adisplay controlling program172acorresponding to flowcharts inFIG. 16(B),FIG. 18, andFIG. 19, and arecognition program172bfor recognizing (by pattern verification) markers (Ma: seeFIG. 20(B),FIG. 22(B), etc.) included in the imaged image. It should be noted that thedisplay controlling program172ain the first and second modified examples described later respectively corresponds to the flowcharts inFIG. 28 andFIG. 31(B).
Although illustration is omitted, in theprogram memory area170, various programs necessary for the “pop-up” game, such as an input-output and communication controlling program (not illustrated), are stored in addition to thegame program172. The input-output and communication controlling program mainly controls an input from thecamera118 and an output to theLCD112, and controls a wireless communication with thegame apparatus12 mainly via thewireless communication module156.
Thedata memory area174 includes an imagedimage area176, a patterntemporary storing area178, apattern storing area180, etc. In the imagedimage area176, an imaged image input from thecamera118 at a predetermined frame rate (for example 60 fps) is written through theCPU142. The imaged image thus stored in the imagedimage area176 is read at a predetermined frame rate (rate the same as that in writing, for example) by theLCD112 under the control of theCPU142. Due to this, the imaged image from thecamera118 is displayed on theLCD112.
On the imaged image stored in the imagedimage area176, recognition processing (seeFIG. 19) of recognizing a predetermined marker is performed. The patterntemporary storing area178 is an area for temporarily saving a pattern of a marker detected in real time from the imaged image. In thepattern storing area180, through the previous pattern storing processing (seeFIG. 16), a pattern of a predetermined marker (seeFIG. 20) is stored. The pattern temporarily stored in the patterntemporary storing area178 is successively compared with the pattern stored in thepattern storing area180, and if the temporarily stored pattern matches the stored pattern, a marker recognizing signal indicating recognition of the predetermined marker is transmitted to thegame apparatus12.
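As an illustrative sketch of this comparison (not the actual recognition program172b), the temporarily stored pattern and the stored pattern may be compared sample by sample; the per-channel tolerance and the required matching ratio below are assumptions for this example.

def patterns_match(temporary, stored, tolerance=30, min_ratio=0.9):
    """Compare the temporarily stored pattern with the stored pattern.

    Both are dicts mapping (x, y) within the normalized rectangle to an
    (R, G, B) tuple; tolerance and min_ratio are illustrative values."""
    if not stored:
        return False
    matches = 0
    for point, rgb in stored.items():
        sample = temporary.get(point)
        if sample is not None and all(abs(a - b) <= tolerance for a, b in zip(sample, rgb)):
            matches += 1
    return matches / len(stored) >= min_ratio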
Here, the marker to be recognized is of one kind in this embodiment, but a plurality of kinds may be used. In this case, a plurality of patterns are stored in thepattern storing area180, and the temporarily stored pattern is compared with each of the stored patterns. Information indicating which marker is recognized is added to the marker recognizing signal.
Furthermore, in thedata memory area174,image data182, etc. are stored. Theimage data182 is data for displaying the object (POb, EOb: seeFIG. 24(B),FIG. 25(B), etc.) going out of themonitor28 within the virtual space such that it is superimposed on the imaged image of theLCD112.
When the pattern is stored in advance, theCPU40 of the game apparatus12 (console-type game apparatus side) and theCPU142 of the game apparatus100 (hand-held-type game apparatus side) respectively execute pattern storing processing shown inFIG. 16(A) andFIG. 16(B). First, in relation to the console-type game apparatus side, with first reference toFIG. 16(A), theCPU40 instructs theGPU42bto display a marker on themonitor28 in a step S1. In response thereto, theGPU42bdisplays a rectangular marker M on themonitor28 as shown inFIG. 20(A), for example. It should be noted that on the side of the hand-held-type game apparatus, at this time, an imaged image as shown inFIG. 20(B) is displayed on theLCD112.
The marker M has a rectangular outline within which a predetermined pattern (here, a rectangular parallelepiped whose faces differ in color) is depicted on a white background. The background of the marker M is painted black in order to make the luminance difference with the white marker extremely high (which makes the outline easy to detect). Here, the shape of the marker M may be a quadrangle other than a rectangle, such as a square, a rhombus, or a trapezoid, for example. Alternatively, a polygon, such as a triangle, a hexagon, etc., a circle, or a complex shape combining them may be used. Furthermore, the white of the marker M may be colored, and the background may be painted with an arbitrary color as long as the outline remains easy to detect. The marker (pattern) may be displayed in color or in monochrome.
Then, theCPU40 determines whether or not a completion notification (described later) is received by thewireless communication module50 in a step S3. If “NO” here, the process returns to the step S1 to repeat the similar processing for each frame ( 1/60 seconds cycle, for example). If “YES” in the step S3, the process is ended.
Next, in relation to the hand-held-type game apparatus side, with reference toFIG. 16(B), theCPU142 executes imaging processing with thecamera118 in a step S11 and repetitively writes the acquired imaged image in the imagedimage area176. Then, theCPU142 instructs theLCD112 to display the imaged image on theLCD112 in a step S13, and in response thereto, theLCD112 reads the imaged image stored in the imagedimage area176 in order and displays the same. Accordingly, on theLCD112, an imaged image as shown inFIG. 20(B) is displayed. This imaged image includes themonitor image28a. Themonitor image28ais an imaged image of themonitor28 shown inFIG. 20(A), and includes a marker image Ma in which a pattern Pa is depicted in each face of a white rectangle and a black background.
Next, theCPU142 performs outline detection based on the luminance difference on the imaged image stored in the imagedimage area176 in a step S15. From the imaged image ofFIG. 20(B), based on the luminance difference between the white of the marker Ma and the background, the outline of the white is detected. Then, in a step S17, it is determined whether or not the detected outline has a predetermined shape (quadrangle, here), and if “NO” here, the process returns to the step S11 to repeat the similar processing for each frame.
Then, theCPU142 changes the quadrangle to a rectangle of a predetermined size in a step S19. That is, depending on the positional relationship between themonitor28 and thegame apparatus100, the marker Ma becomes a quadrangle other than a rectangle, such as a trapezoid, for example, and varies in size; such a quadrangle is therefore changed to a rectangle of A×B as shown inFIG. 21.
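A minimal sketch of such a normalization is shown below; it resamples the detected quadrangle into an A×B rectangle using a simple bilinear interpolation of the corner positions rather than a full projective (homography) warp, which is an assumption made for brevity.

def warp_quad_to_rect(image, corners, out_w, out_h):
    """Resample the quadrangle given by `corners` (upper-left, upper-right,
    lower-right, lower-left, in imaged-image pixels) into an out_w x out_h
    rectangle (A x B)."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    rect = [[None] * out_w for _ in range(out_h)]
    for v in range(out_h):
        for u in range(out_w):
            s = u / max(out_w - 1, 1)
            t = v / max(out_h - 1, 1)
            # interpolate along the top and bottom edges, then between them
            top_x, top_y = x0 + s * (x1 - x0), y0 + s * (y1 - y0)
            bot_x, bot_y = x3 + s * (x2 - x3), y3 + s * (y2 - y3)
            src_x = int(round(top_x + t * (bot_x - top_x)))
            src_y = int(round(top_y + t * (bot_y - top_y)))
            rect[v][u] = image[src_y][src_x]
    return rect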
Here, if the marker Ma is a rectangle of the predetermined size to begin with, such a transformation is not required. For example, theCPU142 displays a frame image (not illustrated) corresponding to the rectangle of the predetermined size such that it is superimposed on the imaged image on theLCD112, and the player may adjust the position such that the marker Ma fits into the frame image.
Next, theCPU142 stores a color distribution within the rectangle as a pattern of the marker in thepattern storing area180 in a step S21. The color distribution within the rectangle is represented by RGB(x, y), an RGB value at an arbitrary point (x, y) within the rectangle, as shown inFIG. 21, for example. In this case, RGB(x, y) is stored for every pixel within the rectangle or for a part of the pixels appropriately sampled from them.
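For illustration, sampling a part of the pixels could be done as follows; the sampling step is an assumed parameter, not one specified in this embodiment.

def sample_pattern(rect_image, step=4):
    """Store the color distribution of the normalized A x B rectangle as the
    marker pattern, keeping only every `step`-th pixel in each direction."""
    pattern = {}
    for y in range(0, len(rect_image), step):
        for x in range(0, len(rect_image[0]), step):
            pattern[(x, y)] = rect_image[y][x]   # (R, G, B) value at (x, y)
    return pattern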
It should be noted that the pattern to be stored may be a binarized one (gray scale, for example). At a time of recognition, the pattern extracted from the imaged marker is compared with the binarized pattern.
Furthermore, in a case that less precise recognition suffices, the pattern may be reduced in size before being stored. This makes it possible to save capacity and reduce the possibility of erroneous recognition.
After completion of storing the pattern, theCPU142 transmits a completion notification to thegame apparatus12 via thewireless module156 in a step S23. Then, the processing is ended.
When the “pop-up” game is played thereafter, theCPU40 of the game apparatus12 (console-type game apparatus side) and theCPU142 of the game apparatus100 (hand-held-type game apparatus side) respectively execute display control processing shown inFIG. 17 andFIG. 18. First, in relation to the console-type game apparatus side, with reference toFIG. 17, theCPU40 instructs theGPU42bto display all or a part of the object on themonitor28 in a step S31. In response thereto, theGPU42bdisplays a player object PO and an enemy object EO on themonitor28 as shown inFIG. 22(A), for example. The background is colored other than white, gray for example (the coloring is omitted in the illustration here).
Then, theCPU40 instructs theGPU42bto display markers on themonitor28 in a step S33. In response thereto, theGPU42bdisplays markers M at the four corners of themonitor28 as shown inFIG. 22(A), for example. Each marker M is identical to the marker M shown inFIG. 20(A) except for its display size. Here, at this time, on the side of the hand-held type game apparatus, an imaged image as shown inFIG. 22(B) is displayed on theLCD112.
Next, theCPU40 determines whether or not a marker recognizing signal (described later) is received by thewireless communication module50 in a step S35. If “NO” here, the process returns to the step S31 to repeat the similar processing for each frame. Through the loop processing of the steps S31 to S35, the display of themonitor28 changes fromFIG. 22(A) toFIG. 23(A). Here, during this time, on the side of the hand-held-type game apparatus as well, the display of theLCD112 changes fromFIG. 22(B) toFIG. 23(B).
It should be noted that in the flowchart, only the screen display (display controlling program72a) is described, but the position and the movement of the object (PO, EO) within the screen are controlled by thegame program72, and under that control, the display of themonitor28 first changes fromFIG. 22(A) toFIG. 23(A), and the display of theLCD112 acquired by imaging themonitor28 also changes fromFIG. 22(B) toFIG. 23(B).
If “YES” in the step S35, that is, a marker recognizing signal is received, the process proceeds to a step S37 to determine whether or not the object (PO, EO) is made to be out of themonitor28 on the basis of the time data76 (seeFIG. 15(A)).
The time data 76 describes a timing (first predetermined time) at which the object (PO, EO) is made to go out of the monitor 28 within the virtual space and a timing (second predetermined time) at which the object (PO, EO) is returned to the monitor 28 within the virtual space. For example, when the time elapsed from the appearance of the object (PO, EO) on the monitor 28 reaches the first predetermined time, “YES” is determined in the step S37, and the process proceeds to a step S39. Until the first predetermined time elapses from the appearance of the object, the loop processing of the steps S31 to S37 and S43 is executed.
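A minimal sketch of this time-based determination is given below; the class and its method names are hypothetical, and a wall-clock timer merely stands in for the time data 76.

```python
import time

class PopOutTimer:
    """Time-based determination of steps S37/S43: when should the object leave
    the monitor, and when should it return?"""

    def __init__(self, first_predetermined, second_predetermined):
        self.first = first_predetermined      # seconds until the object pops out
        self.second = second_predetermined    # seconds until it returns
        self.appeared_at = time.time()
        self.left_at = None

    def should_leave(self):                   # corresponds to "YES" in step S37
        return self.left_at is None and time.time() - self.appeared_at >= self.first

    def mark_left(self):
        self.left_at = time.time()

    def should_return(self):                  # corresponds to "YES" in step S43
        return self.left_at is not None and time.time() - self.left_at >= self.second
```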
In the step S39, the CPU 40 transmits a first control signal toward the game apparatus 100 via the wireless module 50. The first control signal includes position information indicating the position, within the monitor 28, of the object (PO, EO) that is about to go out of the monitor 28 in the next step S41 (in other words, the position of the object with respect to the marker M).
In the successive step S41, the CPU 40 instructs the GPU 42b to erase all or a part of the object from the monitor 28. In response thereto, the GPU 42b erases the player object PO from the monitor 28 as shown in FIG. 24(A), for example. In accordance therewith, on the hand-held-type game apparatus side, the player object POa within the monitor 28a disappears as shown in FIG. 24(B), while the equivalent player object POb is displayed so as to be superimposed on the imaged image. The position where the player object POb is superimposed is decided outside the monitor 28a on the basis of the position information included in the first control signal. This makes it look as if the object PO has popped out of the monitor 28 within the virtual space.
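How the popped-out object could be placed just outside the monitor image 28a is sketched below; the rectangle representation, the normalized exit position and the pop_distance margin are illustrative assumptions rather than the embodiment's actual data format.

```python
def decide_superimpose_position(monitor_rect, exit_pos_norm, pop_distance=40):
    """Place the popped-out object just outside the monitor image 28a.

    monitor_rect  -- (x, y, w, h) of the monitor image within the imaged image
    exit_pos_norm -- (u, v) in [0, 1]: where on the monitor the object was heading out
    """
    x, y, w, h = monitor_rect
    u, v = exit_pos_norm
    # Push the object out of the nearer vertical edge of the monitor image.
    px = x - pop_distance if u < 0.5 else x + w + pop_distance
    py = y + v * h
    return px, py
```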
Then, the process returns to the step S33 to repeat similar processing for each frame. Through the loop processing of the steps S33 to S41, the display of the monitor 28 gradually changes from FIG. 23(A) through FIG. 24(A) to FIG. 25(A). During this time, on the hand-held-type game apparatus side, the display of the LCD 112 changes from FIG. 23(B) through FIG. 24(B) to FIG. 25(B).
When the object (PO, EO) thus goes away from the monitor 28, the determination in the step S37 changes from “YES” to “NO”, and the process shifts to a step S43. In the step S43, the CPU 40 determines whether or not the object (PO, EO) is to be returned to the monitor 28 on the basis of the time data 76. When the time elapsed from when the object (PO, EO) went away from the monitor 28 reaches the second predetermined time, “YES” is determined in the step S43, and the process proceeds to a step S45. Until the second predetermined time elapses from the disappearance of the object, the loop processing of the steps S31 to S37 and S43 is executed.
In the step S45, the CPU 40 transmits a second control signal toward the game apparatus 100 via the wireless module 50. Thereafter, the process returns to the step S31 to repeat similar processing for each frame. Through the loop processing of the steps S31 to S37, S43 and S45, the display of the monitor 28 gradually changes from FIG. 25(A) through FIG. 24(A) to FIG. 23(A). During this time, on the hand-held-type game apparatus side, the display of the LCD 112 gradually changes from FIG. 25(B) through FIG. 24(B) to FIG. 23(B). This makes it look as if the player object PO that popped out of the monitor 28 returns to the monitor 28 within the virtual space.
When the object (PO, EO) thus appears again within the monitor 28, the determination in the step S43 changes from “YES” to “NO”, and the processing returns to the loop of the steps S31 to S37 and S43.
Here, the determinations in the steps S37 and S43 may be performed based on a command from the game apparatus 100 in place of the time data 76. For example, when a predetermined operation is performed via the operating portion 124 in the game apparatus 100, a command corresponding to the operation is transmitted from the wireless module 156. On the other hand, when the command is received by the wireless module 50 in the game apparatus 12, the determinations in the steps S37 and S43 are performed on the basis of the command. Thus, the player can arbitrarily decide the timing at which the object goes out and returns.
Next, in relation to the hand-held-type game apparatus side, with reference to FIG. 18, the CPU 142 executes imaging processing via the camera 118 in a step S51, and repetitively writes the acquired imaged image to the imaged image area 176. Next, the CPU 142 instructs the LCD 112 to display the imaged image in a step S53, and in response thereto, the LCD 112 reads the imaged image stored in the imaged image area 176 in order and displays it. Accordingly, on the LCD 112, the imaged image as shown in FIG. 22(B) is displayed. The imaged image includes the monitor image 28a. The monitor image 28a is the imaged image of the monitor 28 as shown in FIG. 22(A), and specifically includes the marker image Ma, in which a white rectangle depicted with the pattern Pa and the gray background are included as shown in FIG. 20(B) (the colored illustration is omitted here).
Next, the CPU 142 performs marker recognizing processing on the imaged image stored in the imaged image area 176 in a step S55. Here, it suffices to recognize at least one marker out of the four markers M displayed at the four corners of the monitor 28. That is, by displaying the plurality of markers M spaced apart, the recognizable range is expanded compared with the case where a single marker is provided.
The marker recognizing processing in the step S55 includes processing similar to the marker detecting processing (steps S15-S21) performed at the time of storing the pattern, and is executed according to a subroutine in FIG. 19. The CPU 142 first performs outline detection based on the luminance difference in a step S71, and then determines whether or not the detected outline is a quadrangle in a step S73. If “NO” in the step S73, the process is restored to the main routine (FIG. 18). On the other hand, if “YES” in the step S73, the process proceeds to a step S75 to deform the quadrangle into the rectangle of A×B in the manner shown in FIG. 21. In a step S77, a color distribution within the rectangle (RGB(x, y) of each pixel, for example: see FIG. 21) is temporarily stored in the pattern temporary storing area 178 as a pattern of the marker. Then, the CPU 142 compares the pattern temporarily stored in the pattern temporary storing area 178 with the pattern stored in the pattern storing area 180 in a step S79, and then the process is restored to the main routine (FIG. 18).
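A rough sketch of this subroutine is given below. It leans on OpenCV as a stand-in for the embodiment's own outline detection and deformation, compares gray-scale patterns by mean absolute difference, and uses illustrative names and thresholds throughout; corner ordering of the detected quadrangle is not handled.

```python
import cv2
import numpy as np

def recognize_marker(imaged_image, stored_pattern, size=(64, 64), max_diff=30.0):
    """Return the four marker vertices if a region matching the stored pattern is found."""
    gray = cv2.cvtColor(imaged_image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    found = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    contours = found[0] if len(found) == 2 else found[1]
    for contour in contours:                                   # S71: outline detection
        approx = cv2.approxPolyDP(contour, 0.03 * cv2.arcLength(contour, True), True)
        if len(approx) != 4:                                   # S73: quadrangle?
            continue
        quad = approx.reshape(4, 2).astype(np.float32)
        target = np.array([[0, 0], [size[0], 0], [size[0], size[1]], [0, size[1]]],
                          dtype=np.float32)
        warp = cv2.getPerspectiveTransform(quad, target)       # S75: deform to A x B
        candidate = cv2.warpPerspective(gray, warp, size)      # S77: temporary pattern
        diff = np.abs(candidate.astype(np.float32) - stored_pattern).mean()
        if diff < max_diff:                                    # S79/S57: compare patterns
            return quad                                        # marker recognized
    return None
```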
Next, in a step S57, the CPU 142 determines whether or not a marker is recognized on the basis of the comparison result in the step S79. If the temporarily-stored pattern does not match the stored pattern, “NO” is determined in the step S57, and the process returns to the step S51 to repeat similar processing for each frame. If the temporarily-stored pattern matches the stored pattern, “YES” is determined in the step S57, and the process proceeds to a step S59. Here, the determination in the step S57 need not necessarily be repeated; a single determination may suffice. That is, either successive recognitions or a single recognition may be appropriate.
In the step S59, the CPU 142 transmits a marker recognizing signal to the game apparatus 12 via the wireless communication module 156. Next, the CPU 142 determines whether or not a first control signal is received by the wireless communication module 156 in a step S61, and if “NO” here, the process shifts to a step S67 to further determine whether or not a second control signal is received by the wireless communication module 156. If “NO” in the step S67 as well, the process returns to the step S51 to repeat similar processing for each frame. Accordingly, while no control signal is transmitted from the game apparatus 12, through the loop processing of the steps S51 to S61 and S67, the display of the LCD 112 changes from FIG. 22(B) to FIG. 23(B) as the display of the monitor 28 changes from FIG. 22(A) to FIG. 23(A).
If “YES” in the step S61, the CPU 142 proceeds to a step S63 to correct the size and/or angle of the object to be displayed in a superimposed manner in the following step S65, in correspondence with the size and/or shape of the marker recognized in the step S55. The correction processing changes the size and/or shape of the object popping out of the monitor 28 in correspondence with the positional relationship between the monitor 28 and the game apparatus 100 (see FIG. 26: described later).
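One possible form of such a correction, assuming the recognized marker vertices are available, is sketched below; deriving the scale from one apparent edge length and the angle from its tilt is a simplification, and the names are illustrative.

```python
import math

def correct_object(marker_corners, reference_side_px, base_scale=1.0):
    """Derive a (scale, angle) correction for the superimposed object from the
    apparent size and tilt of the recognized marker image.

    marker_corners    -- four (x, y) vertices of the marker in the imaged image
    reference_side_px -- side length of the marker when stored (frontal view)
    """
    (x0, y0), (x1, y1) = marker_corners[0], marker_corners[1]
    side = math.hypot(x1 - x0, y1 - y0)                  # apparent top-edge length
    scale = base_scale * side / reference_side_px        # closer to the monitor -> larger
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))   # in-plane tilt of the marker
    return scale, angle
```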
Next, the CPU 142 instructs the LCD 112 to display all or a part of the object such that it is superimposed on the imaged image in a step S65. In response thereto, the LCD 112 displays the player object POb such that it is superimposed on the imaged image as shown in FIG. 24(B), for example. The position where the player object POb is superimposed is decided outside the monitor 28a on the basis of the position information included in the first control signal.
When the player object POb is thus displayed in a superimposed manner, on the console-type game apparatus side, the aforementioned step S41 is executed to make the player object PO within the monitor 28 disappear as shown in FIG. 24(A). In accordance therewith, on the LCD 112 as well, the player object POa within the monitor 28a disappears, so that it looks as if the player object PO has popped out of the monitor 28 within the virtual space.
Then, the process returns to the step S51 to repeat similar processing for each frame. Accordingly, while the first control signal is transmitted from the game apparatus 12, through the loop processing of the steps S51 to S65, the display of the LCD 112 changes from FIG. 23(B) through FIG. 24(B) to FIG. 25(B) as the display of the monitor 28 changes from FIG. 23(A) through FIG. 24(A) to FIG. 25(A).
Here, in FIG. 25(A), a part of the enemy object EO (the head and the arms, in this case) is erased from the monitor 28, and in FIG. 25(B), the erased part, that is, the head and the arms of the enemy object EOb, is displayed so as to be superimposed on the imaged image.
Furthermore, FIG. 25(B) shows the display of the LCD 112 in a case where the game apparatus 100 is at the left front of the monitor 28; if the game apparatus 100 is at the right front of the monitor 28 as shown in FIG. 26(A), the display of the LCD 112 is as shown in FIG. 26(B). The heart, which is hidden behind the enemy object EOb and invisible in FIG. 25(B), appears in FIG. 26(B), and therefore a play such as the player object POb shooting an arrow at the heart of the enemy object EOb becomes possible.
If “YES” in the step S67, the CPU 142 advances the process to a step S69 to instruct the LCD 112 to erase all or a part of the object. In response thereto, the LCD 112 erases all or a part of the object displayed in a superimposed manner on the imaged image. Thereafter, the process returns to the step S51 to repeat similar processing for each frame. Accordingly, while the second control signal is transmitted from the game apparatus 12, through the loop processing of the steps S51-S61, S67 and S69, the display of the LCD 112 changes from FIG. 25(B) through FIG. 24(B) to FIG. 23(B) as the display of the monitor 28 changes from FIG. 25(A) through FIG. 24(A) to FIG. 23(A). This makes it possible to show the player object PO that popped out of the monitor 28 returning to the monitor 28 within the virtual space.
In addition, if “YES” is determined in the step S57, the pattern stored in the pattern storing area 180 may be updated to the latest pattern acquired from the recognition processing in the step S55 as required. For example, between the step S59 and the step S61, a step S60 of performing a pattern update is inserted. Thus, even if the brightness of the surrounding environment and the image quality of the monitor 28 change, the accuracy of the marker recognition can be maintained at a high level.
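A possible form of such a pattern update is sketched below; the blending approach and the alpha value are assumptions rather than the embodiment's method.

```python
def update_stored_pattern(stored, latest, alpha=0.2):
    """Blend the latest recognized pattern into the stored reference (step S60)
    so the reference follows changes in ambient brightness or monitor quality."""
    return (1.0 - alpha) * stored + alpha * latest
```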
Although a detailed explanation is omitted in the above description, for the marker recognition and the control of the display position in relation to FIG. 21 and FIG. 26, the AR (Augmented Reality) processing described in “augmented reality system based on a trace of a marker and its calibration” (Journal of the Virtual Reality Society of Japan Vol. 4, No. 4, 1999) can be used. The basic flow of the AR processing is as follows. First, marker recognition is performed on the imaged image. If there is a marker, a position and an orientation of a virtual camera in a marker coordinate system are calculated on the basis of the position and the orientation of the marker on the imaged image. Next, a transformation matrix from the marker coordinate system to a camera coordinate system is calculated. More specifically, as to the marker recognition, the coordinate values of the four vertices of the marker (the four vertices of the marker image Ma shown in FIG. 21 in this embodiment) are evaluated, and on the basis of the coordinate values and information on the size of the marker, the transformation matrix from the marker coordinate system to the camera coordinate system (the transformation matrix from the marker image Ma to the marker image Mb shown in FIG. 21 in this embodiment) is calculated. Then, the transformation matrix is set as the view matrix of the virtual camera. With the virtual camera, the virtual space is imaged to generate a CG image, with which the imaged image is combined.
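As an illustrative sketch of this flow (not the embodiment's implementation), the marker-to-camera transform can be estimated from the four marker vertices and the physical marker size; cv2.solvePnP is used here as a stand-in for the calibration method cited above, and the names, the corner ordering and the availability of a calibrated camera matrix are assumptions.

```python
import cv2
import numpy as np

def view_matrix_from_marker(corners_2d, marker_size, camera_matrix, dist_coeffs=None):
    """Estimate the marker-to-camera transform and return it as a 4x4 view matrix."""
    s = marker_size / 2.0
    corners_3d = np.array([[-s,  s, 0], [ s,  s, 0],
                           [ s, -s, 0], [-s, -s, 0]], dtype=np.float32)  # marker coordinates
    ok, rvec, tvec = cv2.solvePnP(corners_3d,
                                  np.asarray(corners_2d, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)          # 3x3 rotation matrix
    view = np.eye(4, dtype=np.float32)
    view[:3, :3] = rotation
    view[:3, 3] = tvec.ravel()
    return view                                # set as the virtual camera's view matrix
```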
In this embodiment, an image (composite image) obtained by superimposing the virtual image on the imaged image can be generated according to the following processing, for example. That is, the CPU 142 of the game apparatus 100 executes (image) recognition processing of a marker included in the imaged image in the step S55. Here, as a processing result of the recognition processing, information indicating the position and the orientation of the marker is calculated. If a marker is recognized (S57: YES), a positional relationship between the game apparatus 100 itself (camera) and the marker is calculated from the shape and the orientation of the recognized marker, and the CPU 142 calculates the position and the orientation of the virtual camera in the virtual space on the basis of this positional relationship. If the recognition processing fails, the CPU 142 uses the position and the orientation of the virtual camera calculated when the recognition processing last succeeded, without newly calculating them. The position and the orientation of the virtual camera are calculated such that the positional relationship between the virtual camera and the virtual object in the virtual space matches the positional relationship between the game apparatus 100 itself and the marker in the real space. When the position of the virtual camera is decided, the CPU 142 generates a virtual image of the virtual object viewed from the position of the virtual camera, and superimposes the virtual image on the imaged image. By the processing described above, the game apparatus 100 can generate and display a composite image. Here, the processing of calculating the position of the virtual camera from the aforementioned positional relationship may be similar to that in the conventional AR processing.
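The final superimposition of the virtual image on the imaged image can be pictured as a per-pixel alpha blend, as sketched below; this assumes the CG image is rendered with an alpha channel and is only an illustration, not the embodiment's actual rendering path.

```python
import numpy as np

def composite(imaged_image, virtual_rgba):
    """Superimpose an RGBA CG image (rendered by the virtual camera) onto the
    RGB imaged image, pixel by pixel."""
    alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (imaged_image.astype(np.float32) * (1.0 - alpha)
               + virtual_rgba[..., :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)
```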
As understood from the above description, the game system 10 of this embodiment includes the game apparatus 12 connected with the monitor 28 and the game apparatus 100 including the camera 118 and the LCD 112. Here, the monitor 28 may be integrally provided with the game apparatus 12, and the camera 118 and/or the LCD 112 may be provided separately from the game apparatus 100.
The CPU 40 of the game apparatus 12 displays the markers M on the monitor 28 (S33), and the CPU 142 of the game apparatus 100 images an image with the camera 118 (S51), displays the imaged image on the LCD 112 (S53), determines whether or not any marker M is included in the imaged image (S55, S57), performs a display control such that at least a part of a second object (POb, EOb) corresponding to the marker M is superimposed on the imaged image displayed on the LCD 112 on the basis of the determination result (S65), and transmits a marker recognizing signal to the game apparatus 12 on the basis of the determination result (S59).
The game apparatus 12 further receives the marker recognizing signal transmitted as described above (S35), and performs a display control of at least a first object image (PO, EO) corresponding to the markers M on the monitor 28 on the basis of the received marker recognizing signal (S41).
That is, the game apparatus 12 displays the markers M on the monitor 28, and the monitor 28 displaying the markers M is imaged by the camera 118. The game apparatus 100 displays the image imaged by the camera 118 on the LCD 112, and determines whether or not the markers M are included in the imaged image. If the determination result indicates that the markers M are included in the imaged image, the game apparatus 100 performs a display control of superimposing (displaying in a superimposed manner, in this embodiment) at least a part of the second object on the imaged image displayed on the LCD 112, and transmits a marker recognizing signal to the game apparatus 12. The game apparatus 12 receives the marker recognizing signal and performs a display control of at least a part of the first object.
Thus, the game apparatus 12 displays the markers M on the monitor 28 to thereby make the game apparatus 100 perform a display control on the second object of the imaged image on the LCD 112, while the game apparatus 100 transmits a marker recognizing signal to the game apparatus 12 when the display control is performed based on the markers M, to make the game apparatus 12 perform a display control on the first object of the monitor 28. Therefore, the game apparatus 12 and the game apparatus 100 are associated with each other through the markers M and the marker recognizing signal, thereby operatively connecting the first object and the second object on the monitor 28 and on the imaged image of the LCD 112.
Specifically, in this embodiment, the first object and the second object are a common object (that is, the common object is the first object on the monitor 28 and the second object on the LCD 112), and therefore it is possible to make it look as if the common object moves between the monitor 28 and the LCD 112 (display control).
More specifically, the game apparatus 12 displays at least a part of the first object together with the marker M (S31), and erases at least that part of the first object on the basis of the marker recognizing signal (S41). Thus, the player object PO and the enemy object EO pop out of the monitor 28 onto the imaged image on the LCD 112, or are returned from it to the monitor 28.
Here, the first object and the second object may be one part of a common object (the head and arms, for example) and another part (the body, for example), or may be objects independent of each other.
It should be noted that in the above-described embodiment, the object moves in and out between the monitor 28 and the imaged image of the LCD 112 in a state where the marker is always displayed. Alternatively, the marker may be displayed only as required, so as to make the object appear on the monitor 28 and the LCD 112 at the same time and disappear from the monitor 28 and the LCD 112 at the same time. The display control in this case is explained as a first modified example below.
In the first modified example, the CPU 40 of the game apparatus 12 (console-type game apparatus side) and the CPU 142 of the game apparatus 100 (hand-held-type game apparatus side) respectively execute display control processing as shown in FIG. 27 and FIG. 28. First, in relation to the console-type game apparatus side, with reference to FIG. 27, the CPU 40 instructs the GPU 42b to display a shadow of the object on the monitor 28 in a step S81. In response thereto, the GPU 42b displays a shadow SH of the object O on the monitor 28 as shown in FIG. 29(A). At this time, on the hand-held-type game apparatus side, an imaged image as shown in FIG. 29(B) is displayed on the LCD 112 (described later).
Next, theCPU40 determines whether or not the current time is within a period for making the object appear on the basis of thetime data76 in a step S83, and if “YES” here, the process proceeds to a step S85 to instruct theGPU42bto display the marker on themonitor28. In response thereto, theGPU42bdisplays the markers M at the four corners of themonitor28.
Then, the CPU 40 determines whether or not a marker recognizing signal is received by the wireless communication module 50 in a step S87. If “NO” here, the process returns to the step S81 to repeat similar processing for each frame. If “YES” in the step S87, the CPU 40 instructs the GPU 42b to display the body of the object O on the monitor 28 in a step S89, and in response thereto, the GPU 42b displays the body on the monitor 28 as shown in FIG. 30(A). At this time, on the hand-held-type game apparatus side, an imaged image as shown in FIG. 30(B) is displayed on the LCD 112 (described later). Thereafter, the process returns to the step S81 to repeat similar processing for each frame.
If “NO” in the step S83, theCPU40 shifts to a step S91 to instruct theGPU42bto erase the marker, and further instructs theGPU42bto erase the body in a following step S93. In response thereto, theGPU42berases the marker M and further the body from themonitor28. Thereafter, the process returns to the step S81 to repeat the similar processing for each frame.
Next, in relation to the hand-held-type game apparatus side, with reference to FIG. 28, the CPU 142 executes imaging processing via the camera 118 in a step S101, and repetitively writes the acquired imaged image to the imaged image area 176. Next, the CPU 142 instructs the LCD 112 to display the imaged image in a step S103, and in response thereto, the LCD 112 reads the imaged image stored in the imaged image area 176 in order and displays it. Accordingly, on the LCD 112, an imaged image as shown in FIG. 29(B) is displayed. The imaged image includes the monitor image 28a. The monitor image 28a is an imaged image of the monitor 28 shown in FIG. 29(A), and includes a shadow image SHa.
Next, theCPU142 performs marker recognizing processing on the imaged image stored in the imagedimage area176 in a step S105. The marker recognizing processing is also executed according to the subroutine inFIG. 19. In a next step S107, it is determined whether or not a marker is recognized on the basis of the comparison result in the step S79 (seeFIG. 19). If “YES” here, the process shifts to a step S109 to transmit a marker recognizing signal to thegame apparatus12 via thewireless communication module156. Then, in a step S111, theLCD112 is instructed to display the head and arms of the object Ob such that they are superimposed on the imaged image. In response thereto, theLCD112 displays the head and arms of the object Ob such that they are superimposed on the imaged image as shown inFIG. 30(B). Thereafter, the process returns to the step S101 to repeat the similar processing for each frame.
If “NO” in the step S107, the CPU 142 shifts to a step S113 to instruct the LCD 112 to erase the head and arms of the object Ob, and in response thereto, the LCD 112 erases the head and arms of the object Ob that are displayed in a superimposed manner. Thereafter, the process returns to the step S101 to repeat similar processing for each frame. Thus, the display of the LCD 112 changes from FIG. 30(B) to FIG. 29(B).
As described above, in the first modified example, the CPU 40 of the game apparatus 12 displays the marker M on the monitor 28 (S85), and the CPU 142 of the game apparatus 100 performs imaging with the camera 118 (S101), displays the imaged image on the LCD 112 (S103), determines whether or not the marker M is included in the imaged image (S105, S107), performs a display control of at least the second object Ob corresponding to the marker M displayed on the imaged image of the LCD 112 on the basis of the determination result (S111), and transmits a marker recognizing signal to the game apparatus 12 on the basis of the determination result (S109).
The game apparatus 12 receives the marker recognizing signal transmitted as described above (S87), and displays at least a part of the first object image O corresponding to the marker M on the monitor 28 on the basis of the received marker recognizing signal (S89). This makes it possible to create the impression that the objects appear and disappear at the same time on the monitor 28 and on the imaged image of the LCD 112.
It should be noted that in the aforementioned embodiment and first modified example, various signals are transmitted and received via the wireless communication module when a display control is performed, but it may be possible to perform a display control without using any signal. A display control in this case is explained as a second modified example below.
In the second modified example, theCPU40 of the game apparatus12 (console-type game apparatus side) and theCPU142 of the game apparatus100 (hand-held-type game apparatus side) respectively execute display control processing as shown inFIG. 31(A) andFIG. 31(B). First, in relation to the console-type game apparatus side, with reference toFIG. 31(A), theCPU40 instructs theGPU42bto display an object O on themonitor28 in a step S121. In response thereto, theGPU42bdisplays the object O on themonitor28 as shown inFIG. 32(A). Here, at this time, on the side of the hand-held type game apparatus, an imaged image as shown inFIG. 32(B) is displayed on the LCD112 (described later).
Next, the CPU 40 determines whether or not the current time is within the period for making the object O go out of the monitor 28 on the basis of the time data 76 in a step S123. If “YES” here, the process shifts to a step S125 to instruct the GPU 42b to erase the object O from the monitor 28, and instructs the GPU 42b to display the marker M on the monitor 28 in a step S127. In response thereto, the GPU 42b erases the object O from the monitor 28, and displays the marker M at a predetermined position of the monitor 28 (here, the position where the object O was erased). Thereafter, the process returns to the step S123 to repeat similar processing for each frame.
If “NO” in the step S123, theCPU40 instructs theGPU42bto erase the marker M from themonitor28 in a step S129. In response thereto, theGPU42berases the marker M from themonitor28. Then, the process returns to the step S121 to repeat the similar processing for each frame.
Accordingly, if the determination result in the step S123 changes from “NO” to “YES”, the display of themonitor28 changes fromFIG. 32(A) toFIG. 33(A). Here, at this time, the display of theLCD112 changes fromFIG. 32(B) toFIG. 33(B) (described later). On the other hand, if the determination result in the step S123 changes from “YES” to “NO”, the display of themonitor28 changes fromFIG. 33(A) toFIG. 32(A). Here, at this time, the display of theLCD112 changes fromFIG. 33(B) toFIG. 32(B) (described later).
Next, in relation to the hand-held-type game apparatus side, with reference to FIG. 31(B), the CPU 142 performs imaging processing via the camera 118 in a step S131, and repetitively writes the acquired imaged image to the imaged image area 176. Then, the CPU 142 instructs the LCD 112 to display the imaged image in a step S133, and in response thereto, the LCD 112 reads and displays the imaged image stored in the imaged image area 176 in order. Accordingly, on the LCD 112, the imaged image as shown in FIG. 32(B) is displayed. The imaged image includes the monitor image 28a. The monitor image 28a is the imaged image of the monitor 28 shown in FIG. 32(A), and includes an object image Oa.
Next, the CPU 142 performs marker recognizing processing on the imaged image stored in the imaged image area 176 in a step S135. The marker recognizing processing is also executed according to the subroutine in FIG. 19. In a next step S137, it is determined whether or not a marker is recognized on the basis of the comparison result in the step S79 (see FIG. 19). If “YES” here, the LCD 112 is instructed to display the object image Ob such that it is superimposed on the imaged image in a step S139. In response thereto, the LCD 112 displays the object image Ob such that it is superimposed on the imaged image as shown in FIG. 33(B). Thereafter, the process returns to the step S131 to repeat similar processing for each frame.
If “NO” in the step S137, theCPU142 shifts to step S141 to instruct theLCD112 to erase the object image Ob, and in response thereto, theLCD112 erases the object image Ob which is displayed in a superimposed manner. Then, the process returns to the step S131 to repeat the similar processing for each frame.
Accordingly, when the determination result in the step S137 changes from “NO” to “YES”, the display of theLCD112 changes fromFIG. 32(B) toFIG. 33(B). Here, at this time, the display of themonitor28 changes fromFIG. 32(A) toFIG. 33(A). On the other hand, when the determination result in the step S137 changes from “YES” to “NO”, the display of theLCD112 changes fromFIG. 33(B) toFIG. 32(B). Here, at this time, the display of themonitor28 changes fromFIG. 33(A) toFIG. 32(A).
It should be noted that in the aforementioned embodiment and the modified examples, the markers are displayed on the monitor 28 before recognition, the markers displayed on the monitor 28 are imaged and stored, and at the time of recognition, the markers thus imaged and stored are compared with a newly imaged marker. However, by storing the markers in advance, such saving processing may be omitted, and when recognition is performed, the markers stored in advance and the newly imaged markers may be compared with each other. Here, if the saving processing is performed as in this embodiment, it is possible to adapt to the characteristics of the monitor 28 and to changes in the environment, and therefore an improvement in recognition accuracy can be expected.
Furthermore, in the aforementioned embodiment and modified example, the marker is static but may be changed dynamically. For example, depending on the size of the marker, the size of the appearing object may be changed. Alternatively, depending on the shape of the marker, the display direction of the appearing object may be changed.
In addition, in the embodiment of FIG. 17 to FIG. 19, when the hand-held-type game apparatus transmits a signal, the hand-held-type game apparatus may execute a display control at that time, and the console-type game apparatus may perform a display control when it receives the signal.
Additionally, in this embodiment, the connection between the hand-held-type game apparatus and the console-type game apparatus is made in a wireless manner, but a wired connection may be used instead.
Furthermore, in this embodiment, the direction of the hand-held-type game apparatus with respect to the display of the console-type game apparatus is calculated on the basis of the positions of the markers included in the imaged image of the hand-held-type game apparatus. However, a sensor for detecting a motion (an acceleration sensor, a gyro sensor, etc.) may be attached to the hand-held-type game apparatus; the position may then be calculated on the basis of the sensor value, and the direction of the hand-held-type game apparatus with respect to the display of the console-type game apparatus may be calculated on the basis of the calculated position.
In this embodiment and the modified examples, it is difficult to view the display of the console-type game apparatus and the screen of the hand-held-type game apparatus at the same time (without shifting one's gaze). However, if a head mount display allowing for optical see-through (translucent display) is used as the screen of the hand-held-type game apparatus, viewing without shifting one's gaze is made possible. More specifically, a camera is attached to the head mount display (here, the camera may be provided separately). With the camera, the display of the console-type game apparatus is imaged, and when a marker is detected from the imaged image, an object is displayed on the head mount display. On the head mount display, the image imaged by the camera is not displayed. Thus, behind the translucent object displayed on the head mount display, an image (object) displayed on the display of the console-type game apparatus can be viewed, which can enhance visual continuity between the console-type game apparatus and the hand-held-type game apparatus.
In the above description, the game system 10 is explained, but the present technology can be applied to an image processing system including a first image processing apparatus utilizing a first displayer (a console-type game apparatus connected to a monitor, a PC, etc.; the monitor may be provided externally or internally) and a second image processing apparatus utilizing an imager and a second displayer (a hand-held-type game apparatus containing an LCD and a camera, a portable communication terminal, a PDA, etc.; the LCD and the camera may be provided externally).
It should be noted that if the LCD and the camera are of a 3D-compatible type, the popping out of the object, etc. can be represented more realistically, as shown in FIG. 25.
Although the technology presented herein has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the scope of the present technology being limited only by the terms of the appended claims.
Furthermore, it should be understood that throughout the embodiments of the present technology, a representation in the singular form also includes the concept of the plural form unless otherwise stated. Accordingly, an article or an adjective in the singular (for example, "a", "an", "the", etc. in English) also includes the concept of the plural form unless otherwise stated.

Claims (29)

What is claimed is:
1. An image processing system including a first image processing apparatus utilizing a first displayer and a second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen, wherein
said first image processing apparatus comprises:
a first display processor which displays a predetermined marker image on said first displayer; and
a first object display controller which performs on said first displayer a display control of at least a part of a first object image being a predetermined CG object; and
said second image processing apparatus comprises:
an imaging processor which performs imaging by said imager; and
a second object display controller which performs a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on said second displayer at a position with reference to said marker image within said imaged image by recognizing said marker image within said imaged image, wherein
said at least part of a second image is controlled within the imaged image in accordance with the state of said at least a part of a first object on the first displayer, and wherein
said first object display controller, if the first object is made to be out of said first displayer, it erases the first object from said first displayer, and
said second object display controller, correspondingly to the erase of the first object from said first displayer, it displays the second object in said second displayer so that it is superimposed on the imaged image, and the position where the second object is superimposed is decided at the outside of the imaged image of said first displayer.
2. The image processing system according toclaim 1, wherein
said first image processor and said second image processor are able to communicate with each other,
said first object display controller performs a display control of said first object image by being operatively connected through the communication, and
said second object display controller performs a composition control of said second object image by being operatively connected through the communication.
3. The image processing system according toclaim 2, wherein
said second object display controller combines at least a part of said second object image with said imaged image when said marker image is recognized within said imaged image, and
said first object display controller performs a control on said first displayer such that said first object image disappears when said marker image is recognized within said imaged image in said second object display controller.
4. The image processing system according toclaim 2, wherein
said second object display controller combines at least a part of said second object image with said imaged image when said marker image is recognized within said imaged image, and
said first object display controller performs a control such that said first object image is displayed on said first displayer when said marker image is recognized within said imaged image in said second object display controller.
5. The image processing system according toclaim 1, wherein
said marker image includes identification information; and
said first object image and said second object image are images corresponding to identification information included in said marker image.
6. The image processing system according toclaim 1, wherein
said first display processor displays a plurality of marker images on said first displayer.
7. The image processing system according toclaim 6, wherein
said first display processor displays four marker images at four corners of said first displayer.
8. The image processing system according toclaim 6, wherein
said first object display controller performs a control such that said first object image is displayed at a predetermined position surrounded by said plurality of marker images, and by recognizing at least one of said plurality of marker images within said imaged image, said second object display controller performs a control of a composition of said second object image on a position surrounded by said plurality of marker images recognized within said imaged image.
9. The image processing system according toclaim 1, wherein
said second object display controller which performs a composition control of said second object at a position and an orientation corresponding to a position and an orientation of said marker image within said imaged image by performing an AR recognition on said marker image within said imaged image.
10. The image processing system according toclaim 9, wherein
said second object display controller includes:
a position and orientation calculator which calculates a correlative relation of a position and an orientation between said marker image on said first displayer and said imager by recognizing said marker image within said imaged image;
a virtual camera setter which arranges said second object in said virtual space and decides a position and an orientation of said virtual camera such that a correlative relation of a position and an orientation between said second object and said virtual camera match the position and the orientation that are calculated by said position and orientation calculator; and
a virtual space imager which images said virtual space including said second object by said virtual camera, wherein
a composition control is performed between said imaged image and the virtual space imaged by said virtual space imager.
11. The image processing system according toclaim 1, wherein
said second image processing apparatus further comprises a second display processor which displays the imaged image imaged by said imaging processor on said second displayer, and
said second object display controller performs a composition control of said second object on said imaged image displayed by said second displayer.
12. The image processing system according toclaim 1, wherein
said second image processing apparatus further comprises:
a first signal transmitter which transmits a first signal to said first image processing apparatus on the basis of a recognition result of said marker image, and
said first image processing apparatus further comprises:
a first signal receiver which receives the first signal transmitted by said first signal transmitter, wherein
said first object display controller controls a display of said first object on the basis of the first signal received by said first signal receiver.
13. The image processing system according toclaim 12, wherein
said first image processing apparatus further comprises:
a second signal transmitter which transmits a second signal to said second image processing apparatus in a case that said first signal is received by said first signal receiver, and
said second image processing apparatus further comprises:
a second signal receiver which receives said second signal, wherein
said second object display controller performs a display control on the basis of the second signal received by said second signal receiver.
14. The image processing system according toclaim 13, wherein
said second signal transmitter transmits the second signal to said second image processing apparatus after a lapse of a first predetermined time since the first signal is received by said first signal receiver.
15. The image processing system according toclaim 13, wherein
said first object display controller performs a display control after the second signal is transmitted by said second signal transmitter.
16. The image processing system according toclaim 14, wherein
said first image processing apparatus further comprises a third signal transmitter which transmits a third signal to said second image processing apparatus after a lapse of a second predetermined time since said first object display controller performs a display control,
said second image processing apparatus further comprises a third signal receiver which receives the third signal, and
said second object display controller erases said second object from said imaged image after said third signal is received by said third signal receiver.
17. The image processing system according toclaim 16, wherein
said first object display controller returns to a state before said display control is performed after said third signal transmitter transmits the third signal.
18. The image processing system according toclaim 12, wherein
said first display processor displays at least a part of said first object together with predetermined identification information, and
said first object display controller erases at least a part of said first object on the basis of said first signal.
19. The image processing system according toclaim 12, wherein
said first object display controller displays at least a part of said first object on the basis of said first signal.
20. The image processing system according toclaim 1, wherein
said first object display controller includes a first object size changer which changes a size of said first object on the basis of a size of said marker image, and
said second object display controller includes a second object size changer which changes a size of said second object on the basis of the size of said marker image.
21. The image processing system according toclaim 1, wherein
said first object display controller includes a first object direction changer which changes a display direction of said first object on the basis of a shape of said marker image, and
said second object display controller includes a second object direction changer which changes a display direction of said second object on the basis of the shape of said marker image.
22. The image processing system according toclaim 12, wherein
said first signal includes coordinate information, and
said first object display controller performs a display control of at least a part of said first object on the basis of the coordinate information included in said first signal.
23. The image processing system according toclaim 11, wherein
said second display processor includes a frame displayer which displays a frame the same in shape as said marker image on said second displayer, and
said second object display controller performs recognition in a state that said marker image is displayed along the frame displayed by said frame displayer.
24. The image processing system according toclaim 1, wherein
said at least part of a second image moves within the imaged image in accordance with the movement of said at least a part of a first object on the first displayer.
25. The image processing system according toclaim 1, wherein
said at least part of a second image is positioned within the imaged image in accordance with the position of said at least a part of a first object on the first displayer.
26. A non-transitory storage medium storing an image processing program performing image processing between a first image processing apparatus utilizing a first displayer and a second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen, wherein
said image processing program causes a computer to function as:
a first display processor which displays a predetermined marker image on said first displayer; and
a first object display controller which performs on said first displayer a display control of at least a part of a first object image being a predetermined CG object,
said image processing program causes said second image processing apparatus to function as:
an imaging processor which performs imaging by said imager; and
a second object display controller which performs a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on said second displayer at a position with reference to said marker image within said imaged image by recognizing said marker image within said imaged image, wherein
said at least part of a second image is controlled within the imaged image in accordance with the state of said at least a part of a first object on the first displayer, and wherein
said first object display controller, if the first object is made to be out of said first displayer, it erases the first object from said first displayer, and
said second object display controller, correspondingly to the erase of the first object from said first displayer, it displays the second object in said second displayer so that it is superimposed on the imaged image, and the position where the second object is superimposed is decided at the outside of the imaged image of said first displayer.
27. A first image processing apparatus being brought into association with a second image processing apparatus that utilizes an imager and a second displayer capable of viewing a real space on a screen by using a first displayer, comprising:
a first display processor which displays a predetermined marker image on said first displayer; and
a first object display controller which performs on said first displayer a display control of at least a part of a first object image being a predetermined CG object, wherein
said second image processing apparatus comprises:
an imaging processor which performs imaging by said imager; and
a second object display controller which performs a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on said second displayer at a position with reference to said marker image within said imaged image by recognizing said marker image within said imaged image, wherein
said at least part of a second image is controlled within the imaged image in accordance with the state of said at least a part of a first object on the first displayer, and wherein
said first object display controller, if the first object is made to be out of said first displayer, it erases the first object from said first displayer, and
said second object display controller, correspondingly to the erase of the first object from said first displayer, it displays the second object in said second displayer so that it is superimposed on the imaged image, and the position where the second object is superimposed is decided at the outside of the imaged image of said first displayer.
28. A second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen in associating with a first image processing apparatus utilizing a first displayer, wherein
said first image processing apparatus comprises:
a first display processor which displays a predetermined marker image on said first displayer; and
a first object display controller which performs on said first displayer a display control of at least a part of a first object image being a predetermined CG object,
comprising:
an imaging processor which performs imaging by said imager; and
a second object display controller which performs a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on said second displayer at a position with reference to said marker image within said imaged image by recognizing said marker image within said imaged image, wherein
said at least part of a second image is controlled within the imaged image in accordance with the state of said at least a part of a first object on the first displayer, and wherein
said first object display controller, if the first object is made to be out of said first displayer, it erases the first object from said first displayer, and
said second object display controller, correspondingly to the erase of the first object from said first displayer, it displays the second object in said second displayer so that it is superimposed on the imaged image, and the position where the second object is superimposed is decided at the outside of the imaged image of said first displayer.
29. An image processing method performed by a first image processing apparatus utilizing a first displayer and a second image processing apparatus utilizing an imager and a second displayer capable of viewing a real space on a screen, including the following steps to be executed by a computer of said first image processing apparatus of:
a first display processing step for displaying a predetermined marker image on said first displayer; and
a first object display controlling step for performing on said first displayer a display control of at least a part of a first object image being a predetermined CG object, and
including the following steps to be executed by a computer of said second image processing apparatus of:
an imaging processing step for performing imaging by said imager; and
a second object display controlling step for performing a composition control of at least a part of a second object image being a predetermined CG object on a real space capable of being viewed on said second displayer at a position with reference to said marker image within said imaged image by recognizing said marker image within said imaged image, wherein
said at least part of a second image is controlled within the imaged image in accordance with the state of said at least a part of a first object on the first displayer, and wherein
said first object display controlling step, if the first object is made to be out of said first displayer, it erases the first object from said first displayer, and
said second object display controlling step, correspondingly to the erase of the first object from said first displayer, it displays the second object in said second displayer so that it is superimposed on the imaged image, and the position where the second object is superimposed is decided at the outside of the imaged image of said first displayer.
US12/870,1582010-06-112010-08-27Image processing system, storage medium storing image processing program, image processing apparatus and image processing methodActive2031-08-16US8427506B2 (en)

Priority Applications (1)

Application NumberPriority DateFiling DateTitle
US13/859,769US9058790B2 (en)2010-06-112013-04-10Image processing system, storage medium storing image processing program, image processing apparatus and image processing method

Applications Claiming Priority (2)

Application NumberPriority DateFiling DateTitle
JP2010-1340622010-06-11
JP2010134062AJP5643549B2 (en)2010-06-112010-06-11 Image processing system, image processing program, image processing apparatus, and image processing method

Related Child Applications (1)

Application NumberTitlePriority DateFiling Date
US13/859,769ContinuationUS9058790B2 (en)2010-06-112013-04-10Image processing system, storage medium storing image processing program, image processing apparatus and image processing method

Publications (2)

Publication NumberPublication Date
US20110304646A1 US20110304646A1 (en)2011-12-15
US8427506B2true US8427506B2 (en)2013-04-23

Family

ID=44650486

Family Applications (2)

Application NumberTitlePriority DateFiling Date
US12/870,158Active2031-08-16US8427506B2 (en)2010-06-112010-08-27Image processing system, storage medium storing image processing program, image processing apparatus and image processing method
US13/859,769ActiveUS9058790B2 (en)2010-06-112013-04-10Image processing system, storage medium storing image processing program, image processing apparatus and image processing method

Family Applications After (1)

Application NumberTitlePriority DateFiling Date
US13/859,769ActiveUS9058790B2 (en)2010-06-112013-04-10Image processing system, storage medium storing image processing program, image processing apparatus and image processing method

Country Status (3)

CountryLink
US (2)US8427506B2 (en)
EP (1)EP2394713B1 (en)
JP (1)JP5643549B2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20120308078A1 (en)*2011-06-022012-12-06Nintendo Co., Ltd.Storage medium storing image processing program, image processing apparatus, image processing method and image processing system
US20130249944A1 (en)*2012-03-212013-09-26Sony Computer Entertainment Europe LimitedApparatus and method of augmented reality interaction
US8708818B2 (en)*2012-04-042014-04-29Nintendo Co., Ltd.Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus
US9086724B2 (en)2012-04-042015-07-21Nintendo Co., Ltd.Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus
US9324298B2 (en)2013-06-132016-04-26Nintendo Co., Ltd.Image processing system, image processing apparatus, storage medium having stored therein image processing program, and image processing method

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8312033B1 (en) | 2008-06-26 | 2012-11-13 | Experian Marketing Solutions, Inc. | Systems and methods for providing an integrated identifier
US8803803B2 (en) | 2011-01-25 | 2014-08-12 | Sony Corporation | Operation member provided in electronic device, and electronic device
JP5704962B2 (en)* | 2011-02-25 | 2015-04-22 | 任天堂株式会社 | Information processing system, information processing method, information processing apparatus, and information processing program
KR101853660B1 (en)* | 2011-06-10 | 2018-05-02 | 엘지전자 주식회사 | 3d graphic contents reproducing method and device
JP5849490B2 (en)* | 2011-07-21 | 2016-01-27 | ブラザー工業株式会社 | Data input device, control method and program for data input device
US8738516B1 (en) | 2011-10-13 | 2014-05-27 | Consumerinfo.Com, Inc. | Debt services candidate locator
US9916621B1 (en) | 2012-11-30 | 2018-03-13 | Consumerinfo.Com, Inc. | Presentation of credit score factors
US9323057B2 (en)* | 2012-12-07 | 2016-04-26 | Blackberry Limited | Mobile device, system and method for controlling a heads-up display
JP5991423B2 (en) | 2013-02-21 | 2016-09-14 | 富士通株式会社 | Display device, display method, display program, and position setting system
US10102570B1 (en) | 2013-03-14 | 2018-10-16 | Consumerinfo.Com, Inc. | Account vulnerability alerts
JP6138566B2 (en)* | 2013-04-24 | 2017-05-31 | 川崎重工業株式会社 | Component mounting work support system and component mounting method
JP6192454B2 (en)* | 2013-09-17 | 2017-09-06 | ウエストユニティス株式会社 | Display system
JP6143958B2 (en)* | 2014-06-13 | 2017-06-07 | 三菱電機株式会社 | Information processing apparatus, information superimposed image display apparatus, marker display program, information superimposed image display program, marker display method, and information superimposed image display method
US20170061700A1 (en)* | 2015-02-13 | 2017-03-02 | Julian Michael Urbach | Intercommunication between a head mounted display and a real world object
EP3340188A4 (en)* | 2015-08-20 | 2019-05-22 | Sony Corporation | Information processing device, information processing method, and program
US10685490B2 (en)* | 2016-03-10 | 2020-06-16 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium
JP6735147B2 (en)* | 2016-05-11 | 2020-08-05 | 日本放送協会 | Display device and program
EP4201340A1 (en) | 2016-06-20 | 2023-06-28 | BFLY Operations, Inc. | Automated image acquisition for assisting a user to operate an ultrasound device
US10506221B2 (en) | 2016-08-03 | 2019-12-10 | Adobe Inc. | Field of view rendering control of digital content
US11461820B2 (en) | 2016-08-16 | 2022-10-04 | Adobe Inc. | Navigation and rewards involving physical goods and services
US20180061128A1 (en)* | 2016-08-23 | 2018-03-01 | Adobe Systems Incorporated | Digital Content Rendering Coordination in Augmented Reality
US10068378B2 (en) | 2016-09-12 | 2018-09-04 | Adobe Systems Incorporated | Digital content interaction and navigation in virtual and augmented reality
US10430559B2 (en) | 2016-10-18 | 2019-10-01 | Adobe Inc. | Digital rights management in virtual and augmented reality
US10878591B2 (en)* | 2016-11-07 | 2020-12-29 | Lincoln Global, Inc. | Welding trainer utilizing a head up display to display simulated and real-world objects
KR102798196B1 (en)* | 2017-01-12 | 2025-04-22 | 삼성전자주식회사 | Method for detecting marker and an electronic device thereof
JP6885179B2 (en)* | 2017-04-20 | 2021-06-09 | 富士通株式会社 | Shooting device, shooting control program, and shooting management system
CN108090561B (en)* | 2017-11-09 | 2021-12-07 | 腾讯科技(成都)有限公司 | Storage medium, electronic device, and method and device for executing game operation
US10916220B2 (en) | 2018-08-07 | 2021-02-09 | Apple Inc. | Detection and display of mixed 2D/3D content
US11265324B2 (en) | 2018-09-05 | 2022-03-01 | Consumerinfo.Com, Inc. | User permissions for access to secure data at third-party
US10818089B2 (en)* | 2018-09-25 | 2020-10-27 | Disney Enterprises, Inc. | Systems and methods to provide a shared interactive experience across multiple presentation devices
US10776954B2 (en) | 2018-10-08 | 2020-09-15 | Microsoft Technology Licensing, Llc | Real-world anchor in a virtual-reality environment
US11238656B1 (en)* | 2019-02-22 | 2022-02-01 | Consumerinfo.Com, Inc. | System and method for an augmented reality experience via an artificial intelligence bot
US11055918B2 (en)* | 2019-03-15 | 2021-07-06 | Sony Interactive Entertainment Inc. | Virtual character inter-reality crossover
US10918949B2 (en) | 2019-07-01 | 2021-02-16 | Disney Enterprises, Inc. | Systems and methods to provide a sports-based interactive experience
US11941065B1 (en) | 2019-09-13 | 2024-03-26 | Experian Information Solutions, Inc. | Single identifier platform for storing entity data
JP7619771B2 (en)* | 2020-06-26 | 2025-01-22 | 株式会社バンダイナムコエンターテインメント | Entertainment Systems and Programs
US11277658B1 (en) | 2020-08-21 | 2022-03-15 | Beam, Inc. | Integrating overlaid digital content into displayed data via graphics processing circuitry
US11601276B2 (en) | 2021-04-30 | 2023-03-07 | Mobeus Industries, Inc. | Integrating and detecting visual data security token in displayed data via graphics processing circuitry using a frame buffer
US11586835B2 (en) | 2021-04-30 | 2023-02-21 | Mobeus Industries, Inc. | Integrating overlaid textual digital content into displayed data via graphics processing circuitry using a frame buffer
US11475610B1 (en) | 2021-04-30 | 2022-10-18 | Mobeus Industries, Inc. | Controlling interactivity of digital content overlaid onto displayed data via graphics processing circuitry using a frame buffer
US11477020B1 (en) | 2021-04-30 | 2022-10-18 | Mobeus Industries, Inc. | Generating a secure random number by determining a change in parameters of digital content in subsequent frames via graphics processing circuitry
US11682101B2 (en)* | 2021-04-30 | 2023-06-20 | Mobeus Industries, Inc. | Overlaying displayed digital content transmitted over a communication network via graphics processing circuitry using a frame buffer
EP4361770A4 (en)* | 2021-06-22 | 2025-04-30 | Maxell, Ltd. | INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING DEVICE AND IMAGE DISPLAY DEVICE
US11562153B1 (en) | 2021-07-16 | 2023-01-24 | Mobeus Industries, Inc. | Systems and methods for recognizability of objects in a multi-layer display

Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP2000322602A (en) | 1999-05-12 | 2000-11-24 | Sony Corp | Device and method for processing image and medium
US20020103824A1 (en)* | 1996-12-06 | 2002-08-01 | Microsoft Corporation | Object-oriented framework for hyperlink navigation
US6611242B1 (en)* | 1999-02-12 | 2003-08-26 | Sanyo Electric Co., Ltd. | Information transmission system to transmit work instruction information
US20050018066A1 (en)* | 2003-07-25 | 2005-01-27 | Hofer Gregory V. | Method and apparatus for setting a marker on an object and tracking the position of the object
US20050231529A1 (en)* | 2001-12-20 | 2005-10-20 | Volker Skwarek | Method and system for displaying information and vehicle infotainment system
US20050289590A1 (en)* | 2004-05-28 | 2005-12-29 | Cheok Adrian D | Marketing platform
US7263207B2 (en)* | 2002-03-07 | 2007-08-28 | Samsung Electronics Co., Ltd. | Method and apparatus for video object tracking
US20080134013A1 (en)* | 2001-10-15 | 2008-06-05 | Mathieu Audet | Multimedia interface
US20100185529A1 (en)* | 2009-01-21 | 2010-07-22 | Casey Chesnut | Augmented reality method and system for designing environments and buying/selling goods
US7793219B1 (en)* | 2006-12-07 | 2010-09-07 | Adobe Systems Inc. | Construction of multimedia compositions
US8261209B2 (en)* | 2007-08-06 | 2012-09-04 | Apple Inc. | Updating content display based on cursor position

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP2000222116A (en)* | 1999-01-29 | 2000-08-11 | Sony Corp | Position recognition method for display image, position recognition device therefor and virtual image stereoscopic synthesis device
US20030062675A1 (en)* | 2001-09-28 | 2003-04-03 | Canon Kabushiki Kaisha | Image experiencing system and information processing method
JP3734815B2 (en)* | 2003-12-10 | 2006-01-11 | 任天堂株式会社 | Portable game device and game program
JP2005250950A (en)* | 2004-03-05 | 2005-09-15 | Nippon Telegr & Teleph Corp <Ntt> | Marker-presenting portable terminal, augmented reality system, and operation method thereof
US8547401B2 (en) | 2004-08-19 | 2013-10-01 | Sony Computer Entertainment Inc. | Portable augmented reality device and method
JP4774346B2 (en)* | 2006-08-14 | 2011-09-14 | 日本電信電話株式会社 | Image processing method, image processing apparatus, and program
TWI354220B (en)* | 2007-12-17 | 2011-12-11 | Pixart Imaging Inc | Positioning apparatus and related method of orient
JP5428436B2 (en)* | 2009-03-25 | 2014-02-26 | ソニー株式会社 | Electronic device, display control method and program

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20020103824A1 (en)* | 1996-12-06 | 2002-08-01 | Microsoft Corporation | Object-oriented framework for hyperlink navigation
US6611242B1 (en)* | 1999-02-12 | 2003-08-26 | Sanyo Electric Co., Ltd. | Information transmission system to transmit work instruction information
JP2000322602A (en) | 1999-05-12 | 2000-11-24 | Sony Corp | Device and method for processing image and medium
US20080134013A1 (en)* | 2001-10-15 | 2008-06-05 | Mathieu Audet | Multimedia interface
US20050231529A1 (en)* | 2001-12-20 | 2005-10-20 | Volker Skwarek | Method and system for displaying information and vehicle infotainment system
US7263207B2 (en)* | 2002-03-07 | 2007-08-28 | Samsung Electronics Co., Ltd. | Method and apparatus for video object tracking
US20050018066A1 (en)* | 2003-07-25 | 2005-01-27 | Hofer Gregory V. | Method and apparatus for setting a marker on an object and tracking the position of the object
US20050289590A1 (en)* | 2004-05-28 | 2005-12-29 | Cheok Adrian D | Marketing platform
US7793219B1 (en)* | 2006-12-07 | 2010-09-07 | Adobe Systems Inc. | Construction of multimedia compositions
US8261209B2 (en)* | 2007-08-06 | 2012-09-04 | Apple Inc. | Updating content display based on cursor position
US20100185529A1 (en)* | 2009-01-21 | 2010-07-22 | Casey Chesnut | Augmented reality method and system for designing environments and buying/selling goods

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20120308078A1 (en)* | 2011-06-02 | 2012-12-06 | Nintendo Co., Ltd. | Storage medium storing image processing program, image processing apparatus, image processing method and image processing system
US9417761B2 (en)* | 2011-06-02 | 2016-08-16 | Nintendo Co., Ltd. | Storage medium storing image processing program, image processing apparatus, image processing method and image processing system for displaying a virtual space in which objects are arranged with a virtual camera
US20130249944A1 (en)* | 2012-03-21 | 2013-09-26 | Sony Computer Entertainment Europe Limited | Apparatus and method of augmented reality interaction
US9135753B2 (en)* | 2012-03-21 | 2015-09-15 | Sony Computer Entertainment Europe Limited | Apparatus and method of augmented reality interaction
US8708818B2 (en)* | 2012-04-04 | 2014-04-29 | Nintendo Co., Ltd. | Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus
US9086724B2 (en) | 2012-04-04 | 2015-07-21 | Nintendo Co., Ltd. | Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus
US9324298B2 (en) | 2013-06-13 | 2016-04-26 | Nintendo Co., Ltd. | Image processing system, image processing apparatus, storage medium having stored therein image processing program, and image processing method

Also Published As

Publication number | Publication date
EP2394713A3 (en) | 2014-05-14
JP5643549B2 (en) | 2014-12-17
US9058790B2 (en) | 2015-06-16
EP2394713B1 (en) | 2018-05-23
EP2394713A2 (en) | 2011-12-14
US20110304646A1 (en) | 2011-12-15
JP2011258120A (en) | 2011-12-22
US20130222425A1 (en) | 2013-08-29

Similar Documents

Publication | Publication Date | Title
US8427506B2 (en) | Image processing system, storage medium storing image processing program, image processing apparatus and image processing method
US8403761B2 (en) | Game program, game apparatus and game system configured to use replay data to create a highlight scene
US8690675B2 (en) | Game system, game device, storage medium storing game program, and game process method
US8493382B2 (en) | Storage medium storing three-dimensional image processing program, three-dimensional image processing apparatus and three-dimensional image processing method
US8535132B2 (en) | Game apparatus for setting a moving direction of an object in a game space according to an attitude of an input device and game program
TWI440496B (en) | Controller device and controller system
JP5692904B2 (en) | Input system, information processing apparatus, information processing program, and pointing position calculation method
US8090887B2 (en) | Input system enabling connection of even expansion equipment for expanding function, that transmits relatively large amount of data, to peripheral equipment and information processing system
JP5869236B2 (en) | Information processing program, information processing apparatus, information processing system, and information processing method
EP2446946A1 (en) | Device support system and support device
US20100060575A1 (en) | Computer readable recording medium recording image processing program and image processing apparatus
US9005035B2 (en) | Storage medium storing game program, game apparatus, and game controlling method for changing game parameters based on game progress
US8870650B2 (en) | Game system, game apparatus, storage medium having game program stored therein, and game process method
US9149715B2 (en) | Game system, game apparatus, storage medium having game program stored therein, and image generation method
US8784202B2 (en) | Apparatus and method for repositioning a virtual camera based on a changed game state
US8350830B2 (en) | Input device and information processing system
US8292738B2 (en) | Information processing system and attachment device
US9417761B2 (en) | Storage medium storing image processing program, image processing apparatus, image processing method and image processing system for displaying a virtual space in which objects are arranged with a virtual camera
JP5184036B2 (en) | Game program and game device
JP2012249868A (en) | Game program, game apparatus, game system, and game processing method
KR20130020715A (en) | Operating apparatus and operating system

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name: NINTENDO CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KATO, SHUNSAKU;REEL/FRAME:024899/0402

Effective date: 20100818

STCF | Information on status: patent grant

Free format text: PATENTED CASE

FEPP | Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY | Fee payment

Year of fee payment: 4

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12

