TECHNICAL FIELD

The present invention relates to a boxing game processing method making use of glove type input articles to be imaged by a stroboscope, and the related arts.
BACKGROUND ART

The boxing game system disclosed in Japanese Patent Published Application No. Hei 2004-49436 filed by the present applicant comprises a game unit and glove type input devices, such that a game player grips the main bodies of the input devices and swings them as if he were actually boxing. A piezoelectric buzzer is provided in the main body in order to detect the acceleration of the input device when it is swung. The acceleration information is output as an infrared light signal, which is received and decoded by an infrared light signal receiving decoding unit provided in the game unit. The acceleration information as decoded is received by a game processor of the game unit, which then calculates the force of the punch corresponding to the swinging of the input device. A damage value is determined by the force of the punch and used to recalculate the physical energy value of an opposing boxer displayed on a monitor. When the physical energy value is exhausted, the opposing boxer is knocked down.
In this game system, since the positions of right and left punches are fixed on the monitor, the player can throw a punch at the opposing boxer, who moves about and guards, by swinging the input devices at an appropriate timing. Namely, the only thing the player can control is the timing of throwing a punch.
In the competition game system disclosed in Japanese Patent Published Application No. Hei 2003-79945, not only the acceleration of a glove type input device but also its position relative to a monitor is calculated. Accordingly, a player can control the position on the monitor from which a punch is thrown.
However, the input devices of the boxing game system and the competition game system as described above incorporate a sensor, a microcomputer, and other electronic circuits, which increases the production cost and can be a cause of trouble. In addition, since the weight tends to increase, the operability is not necessarily good. This is particularly important because the player has to move the input devices violently in the air.
Also, in the case of the above boxing game system and so forth, only one type of punch can be thrown (for example, only a straight punch).
Furthermore, in the case of the above boxing game system, the distinction between right and left is made on the basis of values set in particular ports provided in the right and left input devices. However, if MCUs are not provided in the input devices, the distinction between right and left cannot be made. In the case of the competition game system, when the glove type input device gripped by the left hand and the glove type input device gripped by the right hand are crossed to switch the relative positions thereof between left and right, the game unit recognizes the glove type input device gripped by the right hand as the glove type input device gripped by the left hand, and the glove type input device gripped by the left hand as the glove type input device gripped by the right hand. Namely, in such a case, it is impossible to distinguish between the right hand and the left hand.
SUMMARY OF THE INVENTION

Thus, it is an object of the present invention to provide a boxing game processing method and the related arts that make it possible to distinguish between left and right and to increase the types of punches which can be thrown, while improving the reliability and manipulability, such that it is possible to enjoy a boxing game by swinging glove type input articles of simple design in the air.
In accordance with a first aspect of the present invention, a boxing game processing method comprises: an illumination step of emitting infrared light in a predetermined cycle to illuminate a left-handed glove type input article and a right-handed glove type input article which are provided respectively with retroreflective surfaces; an image generation step of imaging the left-handed glove type input article and the right-handed glove type input article both when the infrared light is emitted and when the infrared light is not emitted, and generating image data obtained with illumination and image data obtained without illumination; a differential data generation step of generating differential data between the image data obtained with illumination and the image data obtained without illumination; a position calculation step of calculating positional information of the left-handed glove type input article and the right-handed glove type input article on the basis of the differential data; an area determination step of determining in which area the position of the left-handed glove type input article is located in a first virtual screen which is divided into a straight area, a cross area and an immovable area, wherein said position of the left-handed glove type input article is a relative position which is indicated by current positional information of the left-handed glove type input article and converted into a coordinate system, the origin of which is located in the position indicated by past positional information obtained by tracing back for a predetermined number of times; an area determination step of determining in which area the position of the right-handed glove type input article is located in a second virtual screen which is divided into a straight area, a cross area and an immovable area, wherein said position of the right-handed glove type input article is a relative position which is indicated by current positional information of the right-handed glove 
type input article and converted into a coordinate system, the origin of which is located in the position indicated by past positional information obtained by tracing back for a predetermined number of times; a display step of displaying a left glove image which represents the left-handed glove type input article and a right glove image which represents the right-handed glove type input article in accordance with the result of determination in said area determination steps for the left-handed glove type input article and the right-handed glove type input article, wherein the first virtual screen and the second virtual screen are provided as mirror images of each other in the right and left direction, and wherein in said area determination step for the left-handed glove type input article, an image showing that a left straight punch is thrown is displayed as the left glove image when the relative position indicated by the current positional information of the left-handed glove type input article is located in the straight area which does not include the origin, an image showing that a left cross punch is thrown is displayed as the left glove image when the relative position indicated by the current positional information of the left-handed glove type input article is located in the cross area which does not include the origin, an image showing that no left punch is thrown is displayed as the left glove image when the relative position indicated by the current positional information of the left-handed glove type input article is located in the immovable area which includes the origin, in said area determination step for the right-handed glove type input article, an image showing that a right straight punch is thrown is displayed as the right glove image when the relative position indicated by the current positional information of the right-handed glove type input article is located in the straight area which does not include the origin, an image showing that a right 
cross punch is thrown is displayed as the right glove image when the relative position indicated by the current positional information of the right-handed glove type input article is located in the cross area which does not include the origin, an image showing that no right punch is thrown is displayed as the right glove image when the relative position indicated by the current positional information of the right-handed glove type input article is located in the immovable area which includes the origin.
In accordance with this configuration, the area determination of the current position of the glove type input article is performed in the coordinate system, the origin of which is located in the position indicated by past positional information obtained by tracing back for the predetermined number of times. In other words, the origin is always located at the position determined by tracing back for the predetermined number of times from the position to be currently determined, and thereby the motion determination is based on the relative position of the glove type input article. Because of this, even if there are disparities in the body height of the player and in the distance between the player and an imaging device for performing the image generation step, it is possible to display a consistent glove image.
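The relative-position area determination described above can be sketched as follows. The trace-back depth, the area boundaries, and the orientation of the cross area are illustrative assumptions for a left-handed glove, not values taken from the specification:

```python
# Sketch of the relative-position area determination. All thresholds
# and the function name are illustrative assumptions.

TRACE_BACK = 3  # assumed "predetermined number of times" to trace back

def classify_left_glove(history):
    """Classify the current position of the left-handed glove type
    input article on the first virtual screen.

    history: list of (x, y) positions, oldest first; the last entry
    is the current position from the position calculation step.
    """
    if len(history) <= TRACE_BACK:
        return "immovable"                   # not enough history yet
    cx, cy = history[-1]                     # current position
    ox, oy = history[-1 - TRACE_BACK]        # origin: position traced back
    rx, ry = cx - ox, cy - oy                # relative position
    # Immovable area: a region around the origin, so that small
    # motions are not determined as punches.
    if abs(rx) < 10 and abs(ry) < 10:
        return "immovable"
    # Assumed layout: a large horizontal displacement is a cross,
    # anything else outside the immovable area is a straight punch.
    if rx > 10:
        return "cross"
    return "straight"
```

Because the origin moves with the player, the same gesture by a short player and a tall player yields the same relative position, which is the point of the configuration.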
In order to facilitate understanding of this feature, an area determination process which is performed on the basis of the absolute position of the glove type input article in the differential image will be considered. In this case, the differential image corresponds to the virtual screen. For example, when comparing a short player and a tall player playing with the glove type input article in the same posture, needless to say, there is a difference between the positions of the glove type input article gripped by the short and tall players in the differential image.
Accordingly, even if the short and tall players perform a similar action, the area where the glove type input article of one is located may be different from the area where the glove type input article of the other is located.
For example, while the glove type input article is located in the “straight area” of the virtual screen when a tall player such as an adult throws a straight punch, the glove type input article may be located in the “immovable area” of the virtual screen when a short player such as a child throws a straight punch. In such a case, although a similar action is taken, the glove image as displayed is different between a tall player and a short player. This shortcoming results also from the disparity in the distance between the imaging unit and the player. It is not desirable that, in spite of a similar action, a different glove image is displayed depending upon the body height of the player or the distance between the imaging device and the player. In accordance with the present invention, this shortcoming can be avoided.
Incidentally, it is not a realistic idea to provide different virtual screens, each of which is divided into a plurality of areas, respectively for different heights of players. Also, in the case of the present invention, there are the two virtual screens respectively for the two glove type input articles, while the straight area, the cross area and the immovable area are defined for each of the glove type input articles. Accordingly, a variety of glove images can be displayed respectively for the glove type input articles in response to motions.
In order to facilitate understanding of this feature, it is assumed that only one virtual screen is provided for the two glove type input articles. In such a case, a punch thrown with the left-handed glove type input article is either a straight punch or a left cross punch (a punch toward the right), and a punch thrown with the right-handed glove type input article is either a straight punch or a right cross punch (a punch toward the left).
Accordingly, the left-handed glove type input article when throwing a straight punch and the right-handed glove type input article when throwing a right cross punch can be located in the same area. Needless to say, the opposite is true. In such a case, in spite of the different types of motions for left and right, the glove image corresponding to the left-handed glove type input article and the glove image corresponding to the right-handed glove type input article become similar, so that the glove image as displayed may not correspond to the actual motion by the player. For example, in the case where the left-handed glove type input article when throwing a straight punch and the right-handed glove type input article when throwing a right cross punch are located in the same “straight area” of the virtual screen, the same image of a straight punch is displayed and therefore it is not appropriate as the glove image corresponding to the right-handed glove type input article.
In this situation, eventually, glove images must be provided with no distinction between the types of punches thrown with the glove type input articles. Accordingly, there is no point in distinctively defining the “straight area” and the “cross area”. In other words, the respective motions of the left-handed glove type input article and the right-handed glove type input article cannot be reflected in the glove images. In this regard, in accordance with the present invention, it is possible to display a variety of glove images (the animations of a straight punch and a cross punch) reflecting the motions of the glove type input articles respectively.
Furthermore, in accordance with the present invention, it is possible to display glove images reflecting the intention of the player. This point will be explained in detail. In accordance with the present invention, the glove image is displayed depending upon the area in which the current position is located in the coordinate system, the origin of which is located in the position obtained by tracing back for a predetermined number of times. In this case, if the current position is located in the “immovable area” including the origin, the image as displayed is indicative of the figure in which no punch is thrown. Accordingly, when the motion of the glove type input article is small, the current position is often located in the “immovable area”, and thereby it is avoided as much as possible to determine, as a punch, a small motion of the player which is not intended as a punch.
The boxing game processing method as recited in the above further comprises: a step of obtaining a first extraction point indicative of the position of the left-handed glove type input article or the right-handed glove type input article on the basis of the differential data; a step of obtaining a second extraction point indicative of the position of the left-handed glove type input article or the right-handed glove type input article on the basis of the differential data; a step of predicting the current position of the left-handed glove type input article on the basis of past positional information of the left-handed glove type input article; a step of predicting the current position of the right-handed glove type input article on the basis of past positional information of the right-handed glove type input article; a step of calculating a first distance which is a distance between the first extraction point and the current position as predicted of the left-handed glove type input article; a step of calculating a second distance which is a distance between the first extraction point and the current position as predicted of the right-handed glove type input article; a step of setting the current position of the right-handed glove type input article to the first extraction point if the first distance is larger than the second distance, and setting the current position of the left-handed glove type input article to the first extraction point if the second distance is larger than the first distance; a step of calculating a third distance which is a distance between the second extraction point and the current position as predicted of the left-handed glove type input article; a step of calculating a fourth distance which is a distance between the second extraction point and the current position as predicted of the right-handed glove type input article; a step of setting the current position of the right-handed glove type input article to the second extraction point if the third distance is larger than the fourth distance, and setting the current position of the left-handed glove type input article to the second extraction point if the fourth distance is larger than the third distance.
Furthermore, since the current positions of the left-handed glove type input article and the right-handed glove type input article are determined on the basis of the currently predicted positions of the left-handed glove type input article and the right-handed glove type input article, even when the player moves such that the left-handed glove type input article and the right-handed glove type input article are crossed to switch the relative positions thereof between left and right, the positions thereof can be determined correctly as much as possible (that is, left and right can be distinguished from each other).
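The left/right assignment described above can be sketched as follows. The linear prediction model and the function names are illustrative assumptions; the specification only requires that each extraction point be assigned to the glove whose predicted current position is nearer:

```python
# Sketch of the left/right assignment of extraction points by distance
# to predicted positions. Prediction model is an assumption.

import math

def predict(history):
    """Assumed linear prediction of the current position from the
    last two known positions."""
    (x1, y1), (x2, y2) = history[-2], history[-1]
    return (2 * x2 - x1, 2 * y2 - y1)

def assign(p1, p2, left_hist, right_hist):
    """Assign extraction points p1 and p2 to the left-handed and
    right-handed glove type input articles.
    Returns (left_position, right_position)."""
    pl = predict(left_hist)    # predicted left position
    pr = predict(right_hist)   # predicted right position
    d = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    left_pos = right_pos = None
    # First extraction point: if it is farther from the predicted left
    # position than from the predicted right one, it is the right glove.
    if d(p1, pl) > d(p1, pr):
        right_pos = p1
    else:
        left_pos = p1
    # Second extraction point, likewise.
    if d(p2, pl) > d(p2, pr):
        right_pos = p2
    else:
        left_pos = p2
    return left_pos, right_pos
```

Because the assignment follows the predicted trajectories rather than fixed screen sides, it keeps tracking each glove even while the player's arms are crossed.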
The boxing game processing method as recited in the above further comprises: a step of obtaining a maximum horizontal coordinate of pixels that have luminance values larger than a predetermined threshold value in an image on the basis of the differential data; a step of obtaining a minimum horizontal coordinate of pixels that have luminance values larger than the predetermined threshold value in the image on the basis of the differential data; a step of obtaining a maximum vertical coordinate of pixels that have luminance values larger than the predetermined threshold value in the image on the basis of the differential data; a step of obtaining a minimum vertical coordinate of pixels that have luminance values larger than the predetermined threshold value in the image on the basis of the differential data, wherein said step of obtaining the first extraction point comprising: a step of obtaining a first horizontal distance which is a horizontal distance from a starting position of the minimum horizontal coordinate and the minimum vertical coordinate to the position of the pixel having the luminance value of which first exceeds the predetermined threshold value in the image on the basis of the differential data; a step of obtaining a second horizontal distance which is a horizontal distance from a starting position of the maximum horizontal coordinate and the minimum vertical coordinate to the position of the pixel having the luminance value of which first exceeds the predetermined threshold value in the image on the basis of the differential data; a step of setting the maximum horizontal coordinate to the horizontal coordinate of the first extraction point and the minimum vertical coordinate to the vertical coordinate of the first extraction point if the first horizontal distance is larger than the second horizontal distance, and setting the minimum horizontal coordinate to the horizontal coordinate of the first extraction point and the minimum vertical coordinate 
to the vertical coordinate of the first extraction point if the second horizontal distance is larger than the first horizontal distance; said step of obtaining the second extraction point comprising: a step of obtaining a third horizontal distance which is a horizontal distance from a starting position of the minimum horizontal coordinate and the maximum vertical coordinate to the position of the pixel having the luminance value of which first exceeds the predetermined threshold value in the image on the basis of the differential data; a step of obtaining a fourth horizontal distance which is a horizontal distance from a starting position of the maximum horizontal coordinate and the maximum vertical coordinate to the position of the pixel having the luminance value of which first exceeds the predetermined threshold value in the image on the basis of the differential data; a step of setting the maximum horizontal coordinate to the horizontal coordinate of the second extraction point and the maximum vertical coordinate to the vertical coordinate of the second extraction point if the third horizontal distance is larger than the fourth horizontal distance, and setting the minimum horizontal coordinate to the horizontal coordinate of the second extraction point and the maximum vertical coordinate to the vertical coordinate of the second extraction point if the fourth horizontal distance is larger than the third horizontal distance.
In accordance with this configuration, since two points are extracted (i.e., the coordinates of the first and second extraction points are determined) on the assumption that both the left-handed glove type input article and the right-handed glove type input article are imaged, it is possible to simplify the calculation for extracting the two points. This point will be explained in detail. If it is not assumed that both of the two glove type input articles are imaged, one shape or two shapes must be detected in the differential image. This is because it is possible both that the two glove type input articles are imaged and that only one article is imaged. Furthermore, it is required to calculate the center coordinates of the one or two shapes as detected. Particularly, in the case where two shapes are located close to each other, it is difficult to determine whether one or two glove type input articles are imaged, and thereby the calculation of the center coordinates becomes quite difficult. In accordance with the present invention, since it is not necessary to perform the detection of the respective shapes and the calculation of the center coordinates, the above difficulties do not arise and the amount of calculation is small.
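The two-point extraction described above can be sketched as follows. The threshold value, the assumption that the scans run along the top and bottom rows of the bounding box, and the tie behaviour are illustrative assumptions, not details fixed by the specification:

```python
# Sketch of extracting two points from the bounding box of bright
# pixels in the differential image. Threshold and scan order are
# assumptions.

THRESHOLD = 128  # assumed luminance threshold

def extraction_points(diff):
    """diff: 2-D list of luminance values (the differential image).
    Returns the first and second extraction points as (x, y)."""
    bright = [(x, y) for y, row in enumerate(diff)
                     for x, v in enumerate(row) if v > THRESHOLD]
    xs = [x for x, _ in bright]
    ys = [y for _, y in bright]
    min_x, max_x = min(xs), max(xs)   # min/max horizontal coordinates
    min_y, max_y = min(ys), max(ys)   # min/max vertical coordinates

    def dist_to_first_bright(row_y, from_left):
        """Horizontal distance from a bounding-box corner to the first
        pixel on that row whose luminance exceeds the threshold."""
        span = range(min_x, max_x + 1)
        if not from_left:
            span = reversed(span)
        for d, x in enumerate(span):
            if diff[row_y][x] > THRESHOLD:
                return d
        return max_x - min_x + 1       # no bright pixel on this row

    d1 = dist_to_first_bright(min_y, True)    # from (min_x, min_y)
    d2 = dist_to_first_bright(min_y, False)   # from (max_x, min_y)
    first = (max_x, min_y) if d1 > d2 else (min_x, min_y)
    d3 = dist_to_first_bright(max_y, True)    # from (min_x, max_y)
    d4 = dist_to_first_bright(max_y, False)   # from (max_x, max_y)
    second = (max_x, max_y) if d3 > d4 else (min_x, max_y)
    return first, second
```

Only four coordinate extrema and four row scans are needed, which illustrates why no shape detection or center-coordinate calculation is required.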
The boxing game processing method as recited in the above further comprises: a step of moving a cursor on a screen to follow the variation of the position of the left-handed glove type input article and/or the right-handed glove type input article; a step of displaying an input area on the screen for receiving an input from an operator; a step of moving the position of the cursor to a predetermined position in the input area if the cursor is located in a predetermined area including the input area irrespective of the positions of the left-handed glove type input article and the right-handed glove type input article; a step of displaying an image indicative of the elapsed time after the cursor is located in the predetermined position and/or the remaining time until a predetermined time elapses on the screen; and a step of performing a predetermined process when the cursor is located in the predetermined area for at least the predetermined time.
In accordance with this configuration, when the cursor is located in the predetermined area including the input area, the cursor is moved to the predetermined position in the input area irrespective of the positions of the glove type input articles, so that the player can easily move the cursor to the input area simply by bringing the cursor close to the input area. In other words, when the cursor is moved close to the input area, it is predicted that the player intends to move the cursor to the input area, and thereby the cursor is automatically moved to the input area for the purpose of lessening the operational burden on the player. In addition to this, since the elapsed time after the cursor reaches the input area and/or the remaining time until a predetermined time elapses are displayed, the player can easily know the remaining time until the predetermined time at which the predetermined process is performed, and thereby the user-friendliness for the player can be improved.
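The snap-and-dwell cursor behaviour described above can be sketched as follows. The frame rate, the geometry of the predetermined area, and the dwell time are illustrative assumptions:

```python
# Sketch of the cursor snap-and-dwell input. All constants are
# illustrative assumptions.

FPS = 60
DWELL_FRAMES = 2 * FPS   # assumed "predetermined time": 2 seconds

class InputAreaCursor:
    def __init__(self, snap_rect, snap_pos):
        self.snap_rect = snap_rect   # (x0, y0, x1, y1): the predetermined
                                     # area including the input area
        self.snap_pos = snap_pos     # predetermined position inside it
        self.dwell = 0               # frames spent snapped

    def update(self, glove_pos):
        """Called once per frame with the cursor position that follows
        the glove type input article. Returns (cursor_pos, fired),
        where fired indicates the predetermined process should run."""
        x0, y0, x1, y1 = self.snap_rect
        x, y = glove_pos
        if x0 <= x <= x1 and y0 <= y <= y1:
            # Inside the predetermined area: snap to the input area
            # regardless of the glove position, and count dwell time.
            self.dwell += 1
            return self.snap_pos, self.dwell >= DWELL_FRAMES
        self.dwell = 0               # left the area: reset the timer
        return glove_pos, False
```

A game loop would also render the dwell counter as the elapsed-time / remaining-time image mentioned in the step above.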
In accordance with a second aspect of the present invention, a display control method comprises: an illumination step of emitting infrared light in a predetermined cycle to illuminate a plurality of input articles which are provided respectively with retroreflective portions; an image generation step of imaging the plurality of input articles both when the infrared light is emitted and when the infrared light is not emitted, and generating image data obtained with illumination and image data obtained without illumination; a differential data generation step of generating differential data between the image data obtained with illumination and the image data obtained without illumination; and a position calculation step of calculating positional information of the plurality of input articles respectively in the image on the basis of the differential data, wherein a plurality of virtual screens are provided respectively corresponding to the plurality of input articles, said display control method further comprises: an area determination step of determining in which area the position of the input article is located in the corresponding virtual screen which is divided into a plurality of areas, wherein said position of the input article is a relative position which is indicated by current positional information of the input article and converted into a corresponding coordinate system, the origin of which is located in the position indicated by past positional information of the input article obtained by tracing back for a predetermined number of times, wherein an image corresponding to each of the plurality of input articles is displayed in accordance with the result of area determination of each of the plurality of input articles by said area determination step.
In accordance with this configuration, even if there are disparities in the body height of the player and in the distance between the player and an imaging device for performing the image generation step, it is possible to display a consistent image corresponding to the input article.
This is the same as in the boxing game processing method in accordance with the first aspect. Also, in the case of the present invention, since the plurality of virtual screens are provided respectively corresponding to the plurality of input articles, the plurality of areas can be defined in the virtual screen separately for each of the input articles. Accordingly, a variety of glove images can be displayed respectively for the input articles in response to the motions.
This is also the same as in the boxing game processing method in accordance with the first aspect. In the display control method as described above, the predetermined number of times is a plural number.
In accordance with this configuration, in comparison with the case where the predetermined number of times is one, the displacement of the input article for a longer period can be used for determining the area, and thereby when the input article is continuously moved, appropriate area determination is possible along the motion thereof. Also, it is possible to enhance the difference between a small motion and a large motion of the input article.
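A small numeric sketch, with made-up positions, illustrates why a plural trace-back depth enhances the difference between a small motion and a large motion:

```python
# Sketch: the relative position (displacement from the origin traced
# back k frames) grows with k for a continuous motion, while jitter
# stays bounded. Positions are illustrative assumptions.

def relative(history, k):
    """Relative position of the newest sample with respect to the
    sample k frames back."""
    (ox, oy), (cx, cy) = history[-1 - k], history[-1]
    return (cx - ox, cy - oy)

punch  = [(0, 0), (8, 0), (16, 0), (24, 0)]   # steady large motion
jitter = [(0, 0), (2, 0), (0, 0), (2, 0)]     # small unintended motion
```

With k = 1 the punch and the jitter differ by a factor of four (8 versus 2), while with k = 3 they differ by a factor of twelve (24 versus 2), so the continuous motion separates much more clearly from jitter.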
In the display control method as recited in the above, the virtual screen is divided into at least two areas including a first area and a second area, and wherein in said area determination step, an image is displayed as an image corresponding to the input article for showing that an input is made when the relative position indicated by the current positional information of the input article is located in the first area which does not include the origin, and an image is displayed as an image corresponding to the input article for showing that no input is made when the relative position indicated by the current positional information of the input article is located in the second area which includes the origin.
In accordance with this configuration, it is possible to display images of the input articles reflecting the intention of the player. In other words, when the motion of the input article is small, the current position is often located in the second area, and thereby it is avoided as much as possible to determine, as an input, a small motion of the player which is not intended as an input.
This is also the same as in the boxing game processing method in accordance with the first aspect. In this description, “the image showing that no input is made” is an image showing a basic figure, and “the image showing that an input is made” is an image changing from the basic figure.
In the display control method as recited in the above, the virtual screen is divided into at least three areas including a first area, a second area and a third area, and wherein in said area determination step, an image is displayed as an image corresponding to the input article for showing that a first input is made when the relative position indicated by the current positional information of the input article is located in the first area which does not include the origin, an image which is different from the image showing that the first input is made is displayed as an image corresponding to the input article for showing that a second input is made when the relative position indicated by the current positional information of the input article is located in the second area which does not include the origin, and an image is displayed as an image corresponding to the input article for showing that no input is made when the relative position indicated by the current positional information of the input article is located in the third area which includes the origin.
In accordance with this configuration, since the virtual screen is divided into at least three areas for the area determination process, a variety of images can be displayed for the input article in accordance with the current position of the input article. Also, as has been discussed above, it is avoided as much as possible to determine, as an input, a small motion of the player which is not intended as an input. The meanings of “the image showing that no input is made” and “the image showing that an input is made” are the same as explained above.
In accordance with a third aspect of the present invention, a position detection method comprises: a step of emitting infrared light in a predetermined cycle to illuminate a first input article and a second input article which are provided respectively with retroreflective portions; a step of imaging the first input article and the second input article both when the infrared light is emitted and when the infrared light is not emitted, and generating image data obtained with illumination and image data obtained without illumination; a step of generating differential data between the image data obtained with illumination and the image data obtained without illumination; a step of obtaining a first extraction point indicative of the position of the first input article or the second input article on the basis of the differential data; a step of obtaining a second extraction point indicative of the position of the first input article or the second input article on the basis of the differential data; a step of predicting the current position of the first input article on the basis of past positional information of the first input article; a step of predicting the current position of the second input article on the basis of past positional information of the second input article; a step of calculating a first distance which is a distance between the first extraction point and the current position as predicted of the first input article; a step of calculating a second distance which is a distance between the first extraction point and the current position as predicted of the second input article; a step of calculating a third distance which is a distance between the second extraction point and the current position as predicted of the first input article; a step of calculating a fourth distance which is a distance between the second extraction point and the current position as predicted of the second input article; a step of setting the current position of the second input 
article to the first extraction point if the first distance is larger than the second distance, and setting the current position of the first input article to the first extraction point if the second distance is larger than the first distance; a step of setting the current position of the second input article to the second extraction point if the third distance is larger than the fourth distance, and setting the current position of the first input article to the second extraction point if the fourth distance is larger than the third distance.
Furthermore, since the current positions of the first and second input articles are determined on the basis of the predicted current positions of the first and second input articles, even when the player moves such that the first input article and the second input article cross each other and switch their relative left and right positions, their positions can be determined as correctly as possible.
This is also the same as in the boxing game processing method in accordance with the first aspect. The position detection method as recited in the above further comprises: a step of obtaining a maximum horizontal coordinate of pixels that have luminance values larger than a predetermined threshold value in an image on the basis of the differential data; a step of obtaining a minimum horizontal coordinate of pixels that have luminance values larger than the predetermined threshold value in the image on the basis of the differential data; a step of obtaining a maximum vertical coordinate of pixels that have luminance values larger than the predetermined threshold value in the image on the basis of the differential data; and a step of obtaining a minimum vertical coordinate of pixels that have luminance values larger than the predetermined threshold value in the image on the basis of the differential data, wherein said step of obtaining the first extraction point comprises: a step of obtaining a first horizontal distance which is a horizontal distance from a starting position defined by the minimum horizontal coordinate and the minimum vertical coordinate to the position of the pixel whose luminance value first exceeds the predetermined threshold value in the image on the basis of the differential data; a step of obtaining a second horizontal distance which is a horizontal distance from a starting position defined by the maximum horizontal coordinate and the minimum vertical coordinate to the position of the pixel whose luminance value first exceeds the predetermined threshold value in the image on the basis of the differential data; and a step of setting the maximum horizontal coordinate to the horizontal coordinate of the first extraction point and the minimum vertical coordinate to the vertical coordinate of the first extraction point if the first horizontal distance is larger than the second horizontal distance, and setting the minimum horizontal coordinate to
the horizontal coordinate of the first extraction point and the minimum vertical coordinate to the vertical coordinate of the first extraction point if the second horizontal distance is larger than the first horizontal distance; and said step of obtaining the second extraction point comprises: a step of obtaining a third horizontal distance which is a horizontal distance from a starting position defined by the minimum horizontal coordinate and the maximum vertical coordinate to the position of the pixel whose luminance value first exceeds the predetermined threshold value in the image on the basis of the differential data; a step of obtaining a fourth horizontal distance which is a horizontal distance from a starting position defined by the maximum horizontal coordinate and the maximum vertical coordinate to the position of the pixel whose luminance value first exceeds the predetermined threshold value in the image on the basis of the differential data; and a step of setting the maximum horizontal coordinate to the horizontal coordinate of the second extraction point and the maximum vertical coordinate to the vertical coordinate of the second extraction point if the third horizontal distance is larger than the fourth horizontal distance, and setting the minimum horizontal coordinate to the horizontal coordinate of the second extraction point and the maximum vertical coordinate to the vertical coordinate of the second extraction point if the fourth horizontal distance is larger than the third horizontal distance.
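The bounding-box and corner-scan procedure recited above can be sketched as follows, under the assumption that the differential image is a list of pixel rows containing at least one pixel above the threshold; all function names are illustrative, not taken from the specification.

```python
# Minimal sketch of the two-point extraction step; names are assumptions.

def first_hit_distance(img, thresh, start_x, y, step):
    """Horizontal distance from (start_x, y) to the first pixel above thresh."""
    d, x = 0, start_x
    while 0 <= x < len(img[0]):
        if img[y][x] > thresh:
            return d
        x += step
        d += 1
    return d

def extraction_points(img, thresh):
    """Derive the two extraction points from the bounding box of bright pixels.

    Assumes img contains at least one pixel above thresh.
    """
    coords = [(x, y) for y, row in enumerate(img)
              for x, v in enumerate(row) if v > thresh]
    xs, ys = [c[0] for c in coords], [c[1] for c in coords]
    min_x, max_x, min_y, max_y = min(xs), max(xs), min(ys), max(ys)

    def pick(y):
        d_left = first_hit_distance(img, thresh, min_x, y, +1)
        d_right = first_hit_distance(img, thresh, max_x, y, -1)
        # The nearer corner on this row becomes the extraction point.
        return (max_x, y) if d_left > d_right else (min_x, y)

    # First extraction point on the top row, second on the bottom row.
    return pick(min_y), pick(max_y)
```

The design choice here mirrors the claim: only the four bounding-box coordinates and two short horizontal scans per row are needed, which keeps the extraction cheap.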
In accordance with this configuration, since two points are determined (i.e., the coordinates of the first and second extraction points are determined) on the assumption that both of the two input articles are imaged, it is possible to simplify the calculation for extracting the two points. This is also the same as in the boxing game processing method in accordance with the first aspect. In accordance with a fourth aspect of the present invention, a cursor control method comprises: a step of emitting infrared light in a predetermined cycle to illuminate an input article which is provided with a retroreflective portion; a step of imaging the input article both when the infrared light is emitted and when the infrared light is not emitted, and generating image data obtained with illumination and image data obtained without illumination; a step of generating differential data between the image data obtained with illumination and the image data obtained without illumination; a step of calculating the position of the input article on the basis of the differential data; a step of moving a cursor on a screen to follow the variation of the position of the input article; a step of displaying an input area on the screen for receiving an input from an operator; a step of moving the position of the cursor to a predetermined position in the input area, irrespective of the position of the input article, if the cursor is located in a predetermined area including the input area; a step of displaying on the screen an image indicative of the elapsed time after the cursor is located in the predetermined position and/or the remaining time until a predetermined time elapses; and a step of performing a predetermined process when the cursor is located in the input area at least for the predetermined time.
In accordance with this configuration, when the cursor is located in the predetermined area including the input area, the cursor is moved to the predetermined position in the input area irrespective of the position of the input article, so that the player can easily move the cursor to the input area only by bringing the cursor close to the input area. In addition to this, since the elapsed time after the cursor reaches the input area and/or the remaining time until the predetermined time elapses are displayed, the player can easily know the remaining time until the predetermined process is performed, and thereby the user-friendliness for the player can be improved.
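The cursor-snap and dwell-timer behaviour of the fourth aspect can be sketched as follows. The frame rate, dwell time, and the rectangular geometry of the input area and its surrounding snap zone are assumed values chosen for the example, not figures from the specification.

```python
# Illustrative sketch of cursor snapping and dwell-based selection.

FRAMES_PER_SECOND = 60
DWELL_FRAMES = 2 * FRAMES_PER_SECOND      # assumed 2-second dwell time

class InputArea:
    def __init__(self, x, y, w, h, margin):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.margin = margin              # snap zone surrounding the input area

    def snap_target(self):
        """Predetermined position: here, the center of the input area."""
        return (self.x + self.w // 2, self.y + self.h // 2)

    def in_snap_zone(self, cx, cy):
        return (self.x - self.margin <= cx <= self.x + self.w + self.margin and
                self.y - self.margin <= cy <= self.y + self.h + self.margin)

def update_cursor(area, cursor, dwell, on_select):
    """One frame of cursor processing: snap, count dwell, fire the action."""
    if area.in_snap_zone(*cursor):
        cursor = area.snap_target()       # snap regardless of article position
        dwell += 1
        if dwell >= DWELL_FRAMES:
            on_select()                   # the predetermined process
            dwell = 0
    else:
        dwell = 0                         # left the area: reset the timer
    return cursor, dwell
```

The remaining-time display described in the claim would simply render `DWELL_FRAMES - dwell` each frame.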
This is also the same as in the boxing game processing method in accordance with the first aspect. In accordance with a fifth aspect of the present invention, an energy consumption calculating method comprises: a step of emitting infrared light in a predetermined cycle to illuminate an operation article operated by a user; a step of imaging the operation article both when the infrared light is emitted and when the infrared light is not emitted, and generating image data obtained with illumination and image data obtained without illumination; a step of generating differential data between the image data obtained with illumination and the image data obtained without illumination; a step of calculating state information of the operation article on the basis of the differential data; and a step of calculating energy consumption when the user operates the operation article on the basis of the state information.
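The differential-data step shared by the aspects above can be illustrated as follows: subtracting the frame imaged without illumination from the frame imaged with illumination leaves chiefly the infrared light retroreflected by the input article, suppressing ambient light sources. Representing the image data as nested lists of luminance values is an assumption made for this sketch.

```python
# Sketch of the stroboscopic differential-data step; the nested-list image
# representation is an assumption for illustration.

def differential_data(lit, unlit):
    """Per-pixel difference, clamped at zero, between lit and unlit frames."""
    return [[max(a - b, 0) for a, b in zip(row_l, row_u)]
            for row_l, row_u in zip(lit, unlit)]
```

Clamping at zero discards pixels that were brighter without illumination, which can only be ambient light and never the retroreflector.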
As described above, the energy consumption of the player can be easily calculated by the use of the result of stroboscopic imaging.
In the energy consumption calculating method as recited in the above, the state information is one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
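As an example of combining such state information, energy consumption can be accumulated from per-frame displacement (moving distance information) weighted by speed information. The conversion coefficient below is an assumed placeholder, not a value taken from the specification, and the function name is illustrative.

```python
# Hedged sketch: estimating energy consumption from speed and distance
# information. KCAL_PER_UNIT is an assumed, illustrative coefficient.

KCAL_PER_UNIT = 0.0005

def energy_consumption(positions, frame_time):
    """Accumulate an energy estimate from per-frame glove displacement."""
    total = 0.0
    for (x1, y1), (x2, y2) in zip(positions, positions[1:]):
        distance = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        speed = distance / frame_time
        # Kinetic-style weighting: faster swings burn more per unit distance.
        total += KCAL_PER_UNIT * speed * distance
    return total
```

A stationary glove contributes nothing, so only actual motion of the retroreflective portion adds to the total.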
In accordance with a sixth aspect of the present invention, an exercise system comprises: an infrared light emission unit operable to periodically emit infrared light to a retroreflective portion which an exerciser puts on; an infrared light image sensor operable to detect the infrared light as reflected by the retroreflective portion to obtain a series of image data; and a signal processing unit connected to said infrared light image sensor, and operable to generate a first image indicative of an exercise that the exerciser is to do, receive the series of image data of the retroreflective portion from said infrared light image sensor while the exerciser does the exercise, calculate the estimated calorie consumption of the exerciser, and generate a second image indicative of the calorie consumption in numerical form, wherein the calorie consumption is calculated on the basis of the motion of the retroreflective portion corresponding to the exercise that the exerciser has done, with reference to the series of image data obtained by said infrared light image sensor.
By this configuration, it is possible to exercise effectively while enjoying it.
Also, the player is informed of the amount of exercise he actually does in terms of calorie consumption, which helps him maintain his health.
BRIEF DESCRIPTION OF DRAWINGS
The aforementioned and other features and objects of the present invention and the manner of attaining them will become more apparent, and the invention itself will be best understood, by reference to the following description of a preferred embodiment taken in conjunction with the accompanying drawings, wherein:
FIG. 1 is a block diagram showing the entire configuration of a boxing game system in accordance with Embodiment 1 of the present invention.
FIG. 2A is a perspective view showing a glove type input article 7L as seen from the front right direction in accordance with Embodiment 1 of the present invention.
FIG. 2B is a perspective view showing a glove type input article 7L as seen from the front left direction in accordance with Embodiment 1 of the present invention.
FIG. 2C is a perspective view showing a glove type input article 7L as seen from the bottom left direction in accordance with Embodiment 1 of the present invention.
FIG. 3 is a perspective view showing an adapter 1 and a cartridge 3 of FIG. 1.
FIG. 4 is a perspective view showing the adapter 1 of FIG. 1 as seen from the back side.
FIG. 5 is a view showing the electric configuration of the adapter 1 of FIG. 1.
FIG. 6 is a schematic diagram showing the electric configuration of the cartridge 3 of FIG. 1.
FIG. 7 is a cross sectional view showing the cartridge 3 of FIG. 1.
FIG. 8A is a view showing an example of a game mode selection screen displayed on a television monitor 5 of FIG. 1.
FIG. 8B is a view for explaining the selection operation in the game mode selection screen.
FIG. 9 is an explanatory view for showing the determination operation in the game mode selection screen displayed on the television monitor 5 of FIG. 1.
FIG. 10 is a view showing an example of a game screen as displayed on the television monitor 5.
FIG. 11 is an explanatory view for showing the glove detection process by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 12 is an explanatory view for showing the right/left determination process by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 13A is a view for explaining the process of calculating velocity vectors by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 13B is a view for explaining the process of determining the glove motion by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 13C is a view for explaining the process of calculating velocity vectors by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 13D is a view for explaining the process of determining the glove motion by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 13E is a view for explaining the process of calculating velocity vectors by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 13F is a view for explaining the process of determining the glove motion by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 14 is a flowchart showing an example of the overall process flow by the high speed processor 91 incorporated in the cartridge 3 of FIG. 1.
FIG. 15 is a flowchart showing an example of the imaging process of step S2 of FIG. 14.
FIG. 16 is a flowchart showing an example of the glove detection process of step S3 of FIG. 14.
FIG. 17 is a flowchart showing an example of the process of detecting the left, right, upper and lower ends in step S32 of FIG. 16.
FIG. 18 is a flowchart showing an example of the process of determining two points in step S33 of FIG. 16.
FIG. 19 is a flowchart showing an example of the selection process in step S5 of FIG. 14.
FIG. 20 is a flowchart showing an example of the process flow during fighting in step S6 of FIG. 14.
FIG. 21 is a flowchart showing an example of the right/left determination process in step S120 of FIG. 20.
FIG. 22 is a flowchart showing an example of the process of determining the glove motion in step S121 of FIG. 20.
FIG. 23 is a flowchart showing an example of the process of updating the positions of the gloves of the player's boxer in step S122 of FIG. 20.
FIG. 24 is a flowchart showing an example of the calorie consumption calculation process in step S8 of FIG. 14.
FIG. 25 is a view showing an exemplary screen in which intermediate results are displayed on the basis of the processing result in step S9 of FIG. 14.
FIG. 26 is a view showing an exemplary screen in which the outcome of the fight is displayed on the basis of the processing result in step S11 of FIG. 14.
FIG. 27 is a view showing an exemplary screen in which the total outcome is displayed after the outcome of the fight is displayed in FIG. 26.
FIG. 28 is a view showing an exemplary screen in which comments are displayed after the total outcome is displayed in FIG. 27.
FIG. 29 is a view showing an example of an exercise screen displayed on the basis of the exercise process "A" performed by the boxing game system in accordance with Embodiment 2 of the present invention.
FIG. 30 is a view showing an example of an exercise screen displayed on the basis of the exercise process "B" performed by the boxing game system in accordance with Embodiment 2 of the present invention.
FIG. 31 is a view showing an example of an exercise screen displayed on the basis of the exercise process "C" performed by the boxing game system in accordance with Embodiment 2 of the present invention.
FIG. 32 is a view showing another example of the exercise screen of FIG. 31.
FIG. 33 is a view showing an example of an exercise screen displayed on the basis of the exercise process "D" performed by the boxing game system in accordance with Embodiment 2 of the present invention.
FIG. 34 is a schematic diagram showing the process transition among the routines performed by the boxing game system in accordance with Embodiment 2 of the present invention.
FIG. 35 is a view showing an example of the save slot selection screen displayed in step S501 of FIG. 34.
FIG. 36 is a view showing an example of the exercise selection screen displayed in step S513 of FIG. 34.
FIG. 37 is a view showing an example of the level selection screen displayed in step S514 of FIG. 34.
FIG. 38 is a view showing an example of the contents of saved data displayed in step S518 of FIG. 34.
FIG. 39 is a view showing another example of the contents of saved data displayed in step S518 of FIG. 34.
FIG. 40 is a view showing a further example of the contents of saved data displayed in step S518 of FIG. 34.
BEST MODE FOR CARRYING OUT THE INVENTION
In what follows, several embodiments of the present invention will be explained in conjunction with the accompanying drawings. Meanwhile, like references indicate the same or functionally similar elements throughout the respective drawings, and therefore redundant explanation is not repeated.
Embodiment 1
FIG. 1 is a block diagram showing the entire configuration of a boxing game system in accordance with Embodiment 1 of the present invention. As shown in FIG. 1, this boxing game system comprises an adapter 1, a cartridge 3, a glove type input article 7L (not shown in the figure), a glove type input article 7R and a television monitor 5. The adapter 1, the cartridge 3 and the glove type input articles 7L and 7R make up a boxing game system.
The cartridge 3 is inserted into the adapter 1. The adapter 1, in turn, is connected to the television monitor 5 through an AV cable 9. The glove type input article 7L and the glove type input article 7R are gripped by the left hand and the right hand of a player 11 respectively.
FIG. 2A is a perspective view showing the glove type input article 7L of FIG. 1 as seen from the front right direction; FIG. 2B is a perspective view showing the glove type input article 7L as seen from the front left direction; FIG. 2C is a perspective view showing the glove type input article 7R as seen from the bottom left direction. In this case, "front", "left" and "right" indicate the directions as viewed from the player 11.
As shown in FIGS. 2B and 2C, retroreflective sheets 21a and 21b are attached respectively to the front bottom portion and the left side portion of the glove type input article 7L. Also, as shown in FIG. 2C, a grip 23L is fixed between the right side inner surface and the left side inner surface of the glove type input article 7L. The player 11 grips this grip 23L with the left hand. The glove type input article 7R to be gripped by the player 11 with the right hand is designed as a mirror image of the glove type input article 7L.
In this case, the retroreflective sheets 21a and 21b are functionally equivalent to each other, and since they cannot be distinguished at the resolution of an image sensor 161 to be described below and used in the present embodiment, the retroreflective sheets 21a and 21b can be recognized as one retroreflective sheet. In other words, while these sheets are provided as separate parts, this configuration is not requisite; they can be integrated. Incidentally, the term "retroreflective sheet 21" is used to generally represent the retroreflective sheets 21a and 21b.
FIG. 3 is a perspective view showing the adapter 1 and the cartridge 3 of FIG. 1. FIG. 4 is a perspective view showing the adapter 1 as seen from the back side.
As shown in FIG. 3, the adapter 1 has a flat rectangular parallelepiped shape with an upper face, a lower face, right and left side faces, and front and back faces. The adapter 1 is provided with a power supply switch 45, a reset switch 43 and a power lamp 41 in the left hand side of the front face, and an infrared filter 33 in the right hand side of the front face. This infrared filter 33 removes incident light other than infrared light so that only infrared light passes, and an infrared sensor (not shown in the figure) is located behind this infrared filter 33. In addition, arrow keys 37a to 37d are provided on the upper face of the adapter 1 in the vicinity of the front edge thereof. Furthermore, there are provided a cancel key 39 in the left hand side of the arrow key 37a and an enter key 35 in the right hand side of the arrow key 37d.
As shown in FIG. 4, an AV jack 83, a power jack 85, a video jack 81V, an L channel audio jack 81L and an R channel audio jack 81R are provided in the back face of the adapter 1. Incidentally, the term "AV jack 81" is used to generally represent the video jack 81V, the L channel audio jack 81L and the R channel audio jack 81R. The AV jack 83 is an external output terminal, and is connected to an external input terminal of the television monitor 5. On the other hand, the AV jack 81 is an input terminal which can be connected to the output terminal of a variety of external equipment (for example, a DVD (digital versatile disc) player).
An opening is formed in the middle position of the upper surface of the adapter 1, and a top plate 31 is disposed therein so that its upper face is approximately flush with the upper face of the adapter 1. Inside the adapter 1, there is an elevator mechanism which urges the top plate 31 upward and supports it so that the upper face of the top plate 31 is located at the height described above. The top plate 31 is supported so as to move up and down in the opening by this elevator mechanism. The cartridge 3 can be connected to a connector 32 by placing and pushing down the cartridge 3 on this top plate 31, and sliding the cartridge 3 toward the front face (refer to FIG. 1). This cartridge 3 contains a high speed processor 91, a memory 93 and the like to be described below. Also, needless to say, when the cartridge 3 is pushed down on the top plate 31, the downward movement distance of the top plate 31 is restricted by the elevator mechanism so that the cartridge 3 stops at a predetermined height.
Returning to FIG. 3, the cartridge 3 comprises a flat rectangular parallelepiped main body and an imaging unit 51. The front face of the main body of the cartridge 3 is provided with a connector section 57 having terminals t1 to t24 to be described below, with which it is connected to the connector 32 of the adapter 1. The imaging unit 51 is mounted on the upper face of the main body of the cartridge 3. In this case, the imaging unit 51 is fitted so that the surface thereof is inclined at a predetermined angle (for example, 40 degrees) relative to the surface of the cartridge 3. The imaging unit 51 is provided with a circular infrared filter 55 in the center portion of the surface thereof, around which infrared light emitting diodes 53a to 53d are arranged. Meanwhile, the term "infrared light emitting diode 53" is used to generally represent each of the infrared light emitting diodes 53a to 53d.
FIG. 5 is a view showing the electric configuration of the adapter 1. As shown in FIG. 5, this adapter 1 includes the connector 32, an extension connector 63, an extension connector peripheral circuit 65, the reset switch 43, a crystal oscillator circuit 67, a key block 69, an infrared signal receiver circuit (IR receiver circuit) 71, an audio amplifier 73, an internal power supply voltage generation circuit 75, a power supply circuit 79 comprising an AC/DC converter and the like, the power supply switch 45, a switching regulator 77, the power jack 85, the AV jack 83, the video jack 81V, the L channel audio jack 81L, and the R channel audio jack 81R. The connector 32 has 24 terminals T1 to T24 and is covered by a shield member 61 which is grounded. The terminals T1, T2, T22 and T24 of the connector 32 are grounded.
The AC voltage supplied from a power cable (not shown in the figure) is given to the power supply circuit 79 through the power jack 85. The power supply circuit 79 converts the given AC voltage to a DC voltage, which is then output to a line w20 as a power supply voltage Vcc0. When turned on, the power supply switch 45 connects the line w20 and a line w54 to supply the switching regulator 77 with the power supply voltage Vcc0, and gives the AV jack 83 a video signal "VD" from a line w9 and audio signals "AL2" and "AR2" from the lines w12 and w13 respectively, through lines w14, w15 and w16. Accordingly, the video signal "VD" and the audio signals "AL2" and "AR2" are given to the television monitor 5 through the AV cable 9, and the television monitor 5 displays an image of the video signal "VD" with sounds of the audio signals "AL2" and "AR2" output from speakers (not shown in the figure).
On the other hand, when turned off, the power supply switch 45 connects lines w17, w18 and w19 to the lines w14, w15 and w16. By this configuration, a video signal as input from the video jack 81V, an L channel audio signal as input from the L channel audio jack 81L and an R channel audio signal as input from the R channel audio jack 81R are given to the AV jack 83. Accordingly, the video signal and the audio signals as input from the jacks 81V, 81L and 81R are transferred to the television monitor 5 from the AV jack 83 through the AV cable 9. As thus described, when the power supply switch 45 is turned off, it is possible to output the video signal and the audio signals as input from an external device through the jacks 81V, 81L and 81R to the television monitor 5.
The switching regulator 77 receives the power supply voltage Vcc0 from the power supply circuit 79 through the line w54 when the power supply switch 45 is turned on, and generates a ground potential GND and a power supply voltage Vcc1 on the lines w50 and w22 respectively. On the other hand, when the power supply switch 45 is turned off, the switching regulator 77 does not receive the power supply voltage Vcc0, and thereby does not generate the power supply voltage Vcc1.
The internal power supply voltage generation circuit 75 generates power supply voltages Vcc2, Vcc3 and Vcc4 respectively on the lines w23, w24 and w25 from the ground potential GND and the power supply voltage Vcc1 as supplied from the switching regulator 77. The line w22 is connected to the terminals T7 and T8 of the connector 32; the line w23 is connected to the terminals T11 and T12 of the connector 32; the line w24 is connected to the terminals T15 and T16 of the connector 32; and the line w25 is connected to the terminals T18 and T19 of the connector 32. It is assumed that Vcc0>Vcc1>Vcc2>Vcc3>Vcc4. Incidentally, when the power supply switch 45 is turned off, the power supply voltage Vcc1 is not generated, and thereby the power supply voltages Vcc1, Vcc2, Vcc3 and Vcc4 are not supplied to the cartridge 3 through the connector 32.
The audio amplifier 73 amplifies the R channel audio signal "AR1" as input through the line w11 which is connected to the terminal T21, and the L channel audio signal "AL1" as input through the line w10 which is connected to the terminal T20, and outputs the R channel audio signal "AR2" and the L channel audio signal "AL2" as amplified to the lines w13 and w12. The line w9 for inputting the video signal "VD" to the power supply switch 45 is connected to the terminal T23 of the connector 32.
The lines w9, w12 and w13 are covered by a cylindrical ferrite 87 in order not to radiate electromagnetic waves therefrom.
The IR receiver circuit 71 demodulates the digitally modulated infrared signal as received, and outputs the demodulated digital signal to the line w8. The line w8 is connected to the terminal T17 of the connector 32.
The key block 69 includes the cancel key 39, the arrow keys 37a to 37d and the enter key 35, and is provided with a shift register (not shown in the figure). This shift register serves to convert parallel signals, which are input from the respective keys 39, 37a to 37d and 35 and from a terminal TE7 described below, into serial signals, and to output the serial signals to the line w3. This line w3 is connected to the terminal T6 of the connector 32. In addition, the key block 69 is given a clock signal through the line w5 which is connected to the terminal T10, and a control signal through the line w4 which is connected to the terminal T9.
The crystal oscillator circuit 67 generates by oscillation a clock signal at a predetermined frequency (for example, 3.579545 MHz), and supplies the clock signal to the line w2. The line w2 is connected to the terminal T3 of the connector 32.
The reset switch 43 outputs a reset signal, which is used for resetting the system, to the line w1. The line w1 is connected to the terminal T4 of the connector 32.
The extension connector 63 is provided with first to ninth terminals (referred to as terminals TE1 to TE9 in the following description). The terminals TE2, TE4 and TE6 are connected to the terminals T13, T14 and T5 of the connector 32 respectively through the extension connector peripheral circuit 65. Accordingly, signals can be input from and output to the external device connected to the extension connector 63 through the terminals TE2, TE4 and TE6. The lines w4 and w5 are connected to the terminals TE9 and TE8 respectively. Accordingly, an external device connected to the extension connector 63 can receive the same clock signal as the key block 69 through the terminal TE8, and the same control signal as the key block 69 through the terminal TE9.
The terminals TE3 and TE5 are supplied respectively with the power supply voltages Vcc1 and Vcc2 through the extension connector peripheral circuit 65. Accordingly, the power supply voltages Vcc1 and Vcc2 can be supplied to the external device connected to the extension connector 63 through the terminals TE3 and TE5. The terminal TE1 is grounded. The terminal TE7 is connected to a predetermined input terminal of the above shift register included in the key block 69 through the extension connector peripheral circuit 65.
FIG. 6 is a schematic diagram showing the electric configuration of the cartridge 3. As shown in FIG. 6, the cartridge 3 includes a high speed processor 91, a memory 93, the imaging unit 51, terminals t1 to t24, an address bus 95, a data bus 97, and an amplitude setting circuit 99. The amplitude setting circuit 99 includes resistors 101 and 103.
The high speed processor 91 includes a reset input port /RESET for inputting a reset signal, a clock input port XT for inputting a clock signal "SCLK2", input/output ports (I/O ports) IO0 to IOn ("n" is a natural number, for example, n=24) for inputting/outputting data, analog input ports AIN0 to AINk ("k" is a natural number, for example, k=6) for inputting analog signals, audio output ports AL and AR for outputting audio signals "AL1" and "AR1", a video output port VO for outputting a video signal "VD", control signal output ports for outputting control signals (for example, a chip enable signal, an output enable signal, a write enable signal and so on), a data bus, and an address bus. The memory 93 includes an address bus, a data bus, and control signal input ports for inputting control signals (for example, a chip enable signal, an output enable signal, a write enable signal and so forth). The memory 93 may be, for example, a ROM (read only memory), a flash memory, or any other appropriate memory.
The control signal output ports of the high speed processor 91 are connected to the control signal input ports of the memory 93. The address bus of the high speed processor 91 and the address bus of the memory 93 are connected to the address bus 95. The data bus of the high speed processor 91 and the data bus of the memory 93 are connected to the data bus 97. In this case, the control signal output ports of the high speed processor 91 include an OE output port for outputting an output enable signal, a CE output port for outputting a chip enable signal, a WE output port for outputting a write enable signal, and so forth. Also, the control signal input ports of the memory 93 include an OE input port connected to the OE output port of the high speed processor 91, a CE input port connected to the CE output port of the high speed processor 91, a WE input port connected to the WE output port of the high speed processor 91, and so forth.
When receiving the chip enable signal, the memory 93, being the destination of that signal, outputs a data signal in accordance with an address signal and the output enable signal which are given substantially at the same time as the chip enable signal. The address signal is input to the memory 93 through the address bus 95 while the data signal is input to the high speed processor 91 through the data bus 97. Also, when receiving the chip enable signal, the memory 93, being the destination of that signal, receives and writes a data signal in accordance with an address signal and the write enable signal which are given substantially at the same time as the chip enable signal. The address signal is input to the memory 93 through the address bus 95 while the data signal is input to the memory 93 from the high speed processor 91 through the data bus 97.
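The chip-enable/output-enable/write-enable handshake described above can be modelled in software as follows. This is a behavioural sketch only; a real bus cycle is of course driven by hardware signal levels and timing, and the class and method names are assumptions for illustration.

```python
# Hypothetical behavioural model of the CE/OE/WE memory handshake.

class Memory:
    def __init__(self, size):
        self.cells = [0] * size

    def cycle(self, ce, oe, we, address, data=None):
        """One bus transaction: returns read data, or None for a write/no-op."""
        if not ce:
            return None                    # chip not selected: ignore the bus
        if we and data is not None:
            self.cells[address] = data     # write enable: latch the data bus
            return None
        if oe:
            return self.cells[address]     # output enable: drive the data bus
        return None
```

Note that with the chip enable signal deasserted the model ignores the address and data buses entirely, mirroring the role of CE as the destination selector.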
When the cartridge 3 is installed into the adapter 1, the terminals t1 to t24 are connected to the terminals T1 to T24 of the connector 32 of the adapter 1 in a one-to-one correspondence. The terminals t1, t2, t22 and t24 are grounded. The terminal t3 is connected to the amplitude setting circuit 99. Namely, the resistor 101 of the amplitude setting circuit 99 is connected to the terminal t3 at one terminal thereof, and connected to the clock input port XT of the high speed processor 91 and one terminal of the resistor 103 at the other terminal thereof. The other terminal of the resistor 103 is grounded. Namely, the amplitude setting circuit 99 is a resistive potential divider.
The clock signal “SCLK1” generated by oscillation of the crystal oscillator circuit 67 of the adapter 1 is input through the terminal t3 to the amplitude setting circuit 99, which then generates a clock signal “SCLK2” having an amplitude smaller than that of the clock signal “SCLK1” and outputs the clock signal “SCLK2” to the clock input port XT. In other words, the amplitude of the clock signal “SCLK2” is set to a value which is determined by the ratio between the resistor 101 and the resistor 103.
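The divider relationship described above can be sketched numerically. The resistor values used in the example are hypothetical; the specification gives only the divider topology (resistor 101 in series, resistor 103 to ground), not component values:

```python
def divided_amplitude(v_in, r101, r103):
    """Output amplitude of the resistive potential divider of the
    amplitude setting circuit 99: v_out = v_in * R103 / (R101 + R103).

    r101 is the series resistor and r103 the resistor to ground, as in
    the circuit described above; the values passed in are illustrative.
    """
    return v_in * r103 / (r101 + r103)

# e.g. a 3.3 V clock swing divided with hypothetical 10 kΩ / 20 kΩ resistors
smaller_amplitude = divided_amplitude(3.3, 10_000, 20_000)
```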
The terminal t4 is connected to the reset input port /RESET of the high speed processor 91. Also, one terminal of the resistor 105 and one terminal of the capacitor 107 are connected to the line through which the reset input port /RESET is connected to the terminal t4. The other terminal of the resistor 105 is supplied with the power supply voltage Vcc3, and the other terminal of the capacitor 107 is grounded.
The terminals t5, t13 and t14 are connected respectively to the I/O ports IO12, IO13 and IO14 of the high speed processor 91. Accordingly, the high speed processor 91 can input signals from and output signals to an external device connected to the extension connector 63 of FIG. 5 through the terminals t5, t13 and t14.
The power supply voltage Vcc1 is supplied from the terminals t7 and t8. The power supply voltage Vcc2 is supplied from the terminals t11 and t12. The power supply voltage Vcc3 is supplied from the terminals t15 and t16. The power supply voltage Vcc4 is supplied from the terminals t18 and t19. The power supply voltage Vcc2 is supplied to the analog circuitry of the high speed processor 91 while the power supply voltage Vcc3 is supplied to the digital circuitry of the high speed processor 91.
The terminals t6, t9, t10 and t17 are connected respectively to the I/O ports IO15, IO16, IO17 and IO18 of the high speed processor 91. Accordingly, the high speed processor 91 can receive a signal output from the key block 69 through the terminal t6. Also, the high speed processor 91 can output a control signal to an external device connected to the extension connector 63 and to the key block 69 through the terminal t9. Furthermore, the high speed processor 91 can supply a clock signal to an external device connected to the extension connector 63 and to the key block 69 through the terminal t10. Still further, the high speed processor 91 can receive the output signal of the IR receiver circuit 71 through the terminal t17.
The terminals t20 and t21 are connected to the audio output ports AL and AR of the high speed processor 91. The terminal t23 is connected to the video output port VO of the high speed processor 91. Accordingly, the high speed processor 91 can output the audio signals “AL1” and “AR1” to the audio amplifier 73 of the adapter 1 through the terminals t20 and t21, and output the video signal “VD” to the power supply switch 45 of the adapter 1 through the terminal t23.
Incidentally, the cartridge 3 is provided with a shield member 113. By virtue of the shield member 113, electromagnetic waves can be prevented, as much as possible, from leaking from the high speed processor 91 and the like as external radiation.
The imaging unit 51 includes the infrared light emitting diode 53, an image sensor 161, an LED drive circuit 92 and the infrared filter 55. The output terminal of the image sensor 161 is connected to the analog input AIN0 of the high speed processor 91.
The image sensor 161 operates in response to the clock signal “SCLK1” from the terminal t3. The frame output flag signal “FS” and the image data output trigger signal “STR” as output from the image sensor 161 are given respectively to the I/O ports IO9 and IO10 of the high speed processor 91. The signal “FS” takes a high level during an exposure period and a low level during the period for transferring pixel data. The high speed processor 91 receives the pixel data from the image sensor 161 in response to the rising edges of the signal “STR”.
The I/O ports IO0 to IO6 of the high speed processor 91 are connected respectively to the control terminals IP0 to IP6 of the image sensor 161. The high speed processor 91 gives the image sensor 161 commands through the I/O ports IO0 to IO6, and supplies the image sensor 161 with data to be set in control registers thereof through the I/O ports IO0 to IO6.
The high speed processor 91 outputs a clock signal “RCLK” to the image sensor 161 through the I/O port IO7 for storing data in the control registers. Also, the high speed processor 91 outputs a reset signal to the image sensor 161 through the I/O port IO8.
Furthermore, the high speed processor 91 outputs an LED control signal to the LED drive circuit 92 through the I/O port IO11. The LED drive circuit 92 drives the infrared light emitting diode 53 in accordance with the LED control signal and the signal “FS”. By this process, the infrared light emitting diode 53 repeatedly turns on and off and serves as a stroboscope.
Next, the internal configuration of the high speed processor 91 will be briefly explained. Although not shown in the figure, the high speed processor 91 includes a CPU (central processing unit), a graphics processor, a sound processor, a DMA controller and so forth, and in addition includes an A/D converter for receiving analog signals, and an input/output control circuit for receiving input signals such as key manipulation signals and infrared signals and for giving output signals to external devices.
The CPU takes control of the entire system and performs various types of arithmetic operations in accordance with a program stored in the memory 93.
The graphics processor constructs graphics data on the basis of data stored in the memory 93, and outputs a video signal “VD” which is generated on the basis of the graphics data for display on the television monitor 5.
The graphics processor constructs the graphics data by the use of a background screen, sprites and a bitmap screen. The background screen, which covers the entirety of the screen of the television monitor 5, comprises a two-dimensional block array, and each block comprises a rectangular set of pixels. A first background screen and a second background screen are respectively prepared as background screens for showing depths in the background. A sprite consists of a rectangular set of pixels which can be relocated to any position of the screen of the television monitor 5. The bitmap screen consists of a two-dimensional pixel array, the size and position of which as displayed can be freely designated.
In addition, the high speed processor 91 includes a pixel plotter, not shown in the figure, which can perform drawing operations with individual pixels. The sound processor converts data stored in the memory 93 into sound data, and generates and outputs the audio signals “AL1” and “AR1” on the basis of the sound data. The sound data is synthesized by pitch conversion and amplitude modulation of PCM (pulse code modulation) data serving as the base data of tone quality. For the amplitude modulation, an envelope control function for reproducing the waveforms of musical instruments is provided in addition to a volume control function performed in response to an instruction from the CPU.
In addition, the high speed processor 91 is provided with an internal memory (not shown in the figure) which is used as a working area, a counter area, a register area, a temporary data area, a flag area and/or the like.
FIG. 7 is a cross sectional view showing the cartridge 3 of FIG. 1. As shown in FIG. 7, a lens unit 164 is located on the rear side of the infrared filter 55 and fitted on the substrate 167. The lens unit 164 includes a unit base 159, a lens holder 151, a concave lens 153, and a convex lens 157. The concave lens 153 is attached to the lens holder 151, which is fixed to the unit base 159, on the side facing the infrared filter 55 in parallel with the image sensor 161 mounted on the substrate 167. Also, the convex lens 157 is attached to the lens holder 151 on the side facing the image sensor 161 in parallel with the image sensor 161. Furthermore, there is a cavity (optical path) 155 between the concave lens 153 and the convex lens 157. The infrared light as transmitted through the infrared filter 55 is detected by the image sensor 161 after passing through the concave lens 153, the cavity 155 and the convex lens 157.
Although not shown in the figure, the infrared light emitting diodes 53a and 53d are fixed to the LED holding member 165 and inserted into the holes of the cylindrical sections 163a and 163d respectively. The holes of these cylindrical sections 163a and 163d are formed completely through the surface so that the emitting portions of the infrared light emitting diodes 53a and 53d are exposed at the surface of the imaging unit 51. This is true also for the infrared light emitting diodes 53b and 53c.
As shown in FIG. 7, a substrate 169 is fitted inside the main body of the cartridge 3 for mounting the high speed processor 91, the memory 93 and so forth. The substrate 169 is rectangular in plan view and has the terminals t1 to t24 along the front edge thereof, which is a part of the connector section 57. The substrate 169 is covered by the shield member 171. The shield member 113 of FIG. 6 is made up of the shield member 171 and a shield member which is additionally provided on the bottom surface inside the main body of the cartridge 3.
Next, the general outline of the process performed by the boxing game system will be explained. With reference to FIG. 6, the infrared light emitting diode 53 is driven by the LED drive circuit 92 in order to intermittently emit infrared light. The infrared light as emitted then intermittently illuminates the retroreflective sheets 21 of the glove type input articles 7L and 7R which are gripped by the player 11. The image sensor 161 then images the retroreflective sheets 21 which are intermittently illuminated with infrared light. Accordingly, the image sensor 161 alternately outputs, to the high speed processor 91, the image data of the retroreflective sheets 21 which are illuminated with infrared light and the image data of the retroreflective sheets 21 which are not illuminated with infrared light. In the case of the present embodiment, an image sensor of 32 pixels × 32 pixels is used as the image sensor 161. Accordingly, 32 pixels × 32 pixels of pixel data (luminance data for each pixel) is output as image data from the image sensor 161. The high speed processor 91 calculates differential image data between the image data obtained when infrared light is emitted and the image data obtained when infrared light is not emitted. Then, the high speed processor 91 calculates the positional information (determines the positions) of the respective glove type input articles 7L and 7R on the basis of this differential image data. The high speed processor 91 displays a game mode selection screen and a game screen on the television monitor 5 by performing various processes to be described below on the basis of the positional information of the respective glove type input articles 7L and 7R as calculated.
FIG. 8A is a view showing an example of the game mode selection screen displayed on the television monitor 5 of FIG. 1. FIG. 8B is a view for explaining the selection operation in the game mode selection screen. As illustrated in FIG. 8A, when the cartridge 3 is inserted into the adapter 1 and then the power supply switch 45 of the adapter 1 is turned on, the game mode selection screen is displayed on the television monitor 5 by the high speed processor 91.
The game mode selection screen includes a game mode display section 200, selection buttons 203U and 203D, an OK button 207 and a cursor 201. The selection buttons 203U and 203D include arrow-shaped indicators 202U and 202D respectively. The OK button 207 includes a circular indicator 209.
The high speed processor 91 synchronizes the motion of the cursor 201 with the motion of the glove type input articles 7L and/or 7R as imaged by the image sensor 161. Accordingly, the player 11 can manipulate the cursor 201 by moving the glove type input articles 7L and/or 7R. In this case, when only one of the glove type input articles 7L and 7R is imaged, the motion of the cursor 201 is synchronized with the motion of the glove type input article as imaged, but when both of the glove type input articles 7L and 7R are imaged, the motion of the cursor 201 is synchronized with the motion of the center position of these glove type input articles.
As illustrated in FIG. 8B, when the cursor 201 enters a selection acceptable area 211 including the selection button 203D, the high speed processor 91 moves the cursor 201 to the center position of the selection button 203D irrespective of the motion of the glove type input articles 7L and 7R. For the sake of clarity in explanation, the selection acceptable area 211 is illustrated in the figure, while it is not displayed on the television monitor 5 in practice.
After moving the cursor 201 to the center position of the selection button 203D, the high speed processor 91 gradually fills the indicator 202D with a predetermined color as time passes in order to indicate the elapsed time. The indicator 202D is completely filled with the predetermined color after a predetermined time elapses. The high speed processor 91 fixes the selection operation after the predetermined time elapses, and displays the next game mode selection screen (the game mode selection screen of FIG. 9) on the television monitor 5 in accordance with the direction of the arrow of the indicator 202D.
However, if the player 11 greatly moves the glove type input articles 7L and/or 7R to move the cursor 201 out of the selection acceptable area 211 before the predetermined time elapses, the selection operation is not fixed, and the color of the indicator 202D returns to the initial color.
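The dwell-based selection described above can be sketched as a small per-frame update. The frame rate and dwell time used here are assumptions for illustration; the specification says only that the indicator fills over "a predetermined time":

```python
DWELL_FRAMES = 120  # hypothetical: 2 seconds at 60 video frames per second


def update_selection(cursor_in_area, fill_count):
    """Advance the indicator fill by one video frame.

    Returns (new_fill_count, fixed).  The fill resets to zero when the
    cursor leaves the selection acceptable area (the indicator returns
    to its initial color), and the selection operation is fixed once
    the indicator is completely filled.
    """
    if not cursor_in_area:
        return 0, False            # cursor left the area: cancel and reset
    fill_count += 1                # gradually fill the indicator
    return fill_count, fill_count >= DWELL_FRAMES
```

The same update would serve for the OK button 207, with the determination acceptable area 213 in place of the selection acceptable area 211.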
The selection operation by the use of the selection button 203U is the same as the selection operation by the use of the selection button 203D, and therefore no redundant description is repeated.
FIG. 9 is an explanatory view showing the determination operation in the game mode selection screen. As illustrated in FIG. 9, when the selection operation is fixed in the condition shown in FIG. 8B, the high speed processor 91 displays the game mode selection screen on the television monitor 5 in which the selectable game mode is “arbitrary match”. The “arbitrary match” mode is a game mode in which the player 11 can arbitrarily select a boxer as his opponent.
When the cursor 201 enters a determination acceptable area 213 including the OK button 207, the high speed processor 91 moves the cursor 201 to the center position of the OK button 207 irrespective of the motion of the glove type input articles 7L and 7R. For the sake of clarity in explanation, the determination acceptable area 213 is illustrated in the figure, while it is not actually displayed on the television monitor 5.
After moving the cursor 201 to the center position of the OK button 207, the high speed processor 91 gradually fills the indicator 209 with a predetermined color in the clockwise direction as time passes in order to indicate the elapsed time. The indicator 209 is completely filled with the predetermined color after a predetermined time elapses. The high speed processor 91 fixes the determination operation after the predetermined time elapses, and enters the game mode displayed in the game mode display section 200 (“arbitrary match” in the case of FIG. 9).
However, if the player 11 greatly moves the glove type input articles 7L and/or 7R to move the cursor 201 out of the determination acceptable area 213 before the predetermined time elapses, the determination operation is not fixed, and the color of the indicator 209 returns to the initial color.
FIG. 10 is a view showing an example of a game screen (tournament or arbitrary match) as displayed on the television monitor 5. As shown in FIG. 10, the game screen includes a CPU boxer 215 and gloves 217L and 217R of the boxer who is controlled by the player 11 (referred to herein as the “player's boxer”).
The high speed processor 91 controls the motion (including punches) of the CPU boxer 215 in accordance with the program stored in the memory 93. Also, the high speed processor 91 controls the motion of the glove 217L in accordance with the motion of the glove type input article 7L imaged by the image sensor 161, and controls the motion of the glove 217R in accordance with the motion of the glove type input article 7R imaged by the image sensor 161. Accordingly, the player 11 can avoid and defend himself against a punch of the CPU boxer 215 by moving the glove type input articles 7L and 7R.
The game screen further includes a physical indicator 221a and a mental indicator 221b of the CPU boxer 215 and a physical indicator 223a and a mental indicator 223b of the player's boxer.
The physical indicators 221a and 223a indicate the physical energy of the CPU boxer 215 and the physical energy of the player's boxer respectively, and each time either boxer takes a punch, his physical energy as indicated is decreased. In this case, the amount of the decrease in physical energy is determined in accordance with the force of the punch as hit. The mental indicators 221b and 223b indicate the mental energy of the CPU boxer 215 and the mental energy of the player's boxer respectively, and each time either boxer takes a punch, his mental energy as indicated decreases more greatly than his physical energy. However, his mental energy recovers at a predetermined speed, up to the residual physical energy at a maximum. When the mental energy is exhausted to zero, the high speed processor 91 judges that the boxer falls down. Then, if the boxer stays down for a predetermined period, the high speed processor 91 judges that the boxer is knocked out.
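The energy rules above can be sketched as follows. The multiplier and recovery rate are hypothetical; the specification states only that mental energy decreases more than physical energy and recovers at a predetermined speed up to the remaining physical energy:

```python
def take_punch(physical, mental, damage, mental_factor=2.0):
    """Apply a punch of the given force (damage).

    Mental energy drops faster than physical energy; mental_factor is a
    hypothetical multiplier, not a value from the specification.
    Both energies are clamped at zero.
    """
    physical = max(0.0, physical - damage)
    mental = max(0.0, mental - damage * mental_factor)
    return physical, mental


def recover_mental(physical, mental, rate=0.5):
    """Recover mental energy at a fixed rate per frame (rate is
    hypothetical), capped at the residual physical energy."""
    return min(physical, mental + rate)
```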
The game screen further includes a round indication section 219 in which the remaining time of the current round is displayed.
Next, a glove detection process, a right/left determination process and a glove motion determination process will be explained with reference to the drawings.
FIG. 11 is an explanatory view showing the glove detection process performed by the high speed processor 91. An image of 32×32 pixels, on the basis of the differential image data generated from the image data obtained when infrared light is emitted and the image data obtained when infrared light is not emitted, is illustrated in FIG. 11. In the figure, each of the small unit squares represents one pixel. Also, the origin of the XY coordinates is located at the upper left vertex.
This image includes two areas 251 and 253 having large luminance values. The areas 251 and 253 represent the retroreflective sheet 21 of the glove type input article 7L and the retroreflective sheet 21 of the glove type input article 7R. However, at this time, it cannot be determined which area corresponds to which glove type input article.
The high speed processor 91 first scans the differential image data from X=0 to X=31 with Y=0 as a start point; Y is then incremented, followed by scanning the differential image data from X=0 to X=31 again. This process is repeated until Y=31 in order to completely scan the differential image data of 32×32 pixels and determine the upper end position minY, the lower end position maxY, the left end position minX and the right end position maxX of the pixel data greater than a threshold value “ThL”.
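The raster scan above can be sketched as follows, assuming the differential image is held as a 32×32 array indexed as dif[X][Y]:

```python
def detect_ends(dif, thl):
    """Scan the 32x32 differential image row by row (Y outer, X inner,
    matching the scan order described above) and return
    (min_x, max_x, min_y, max_y) over pixels whose luminance exceeds
    the threshold thl.  Returns None when no pixel exceeds it.
    """
    min_x = min_y = 32
    max_x = max_y = -1
    for y in range(32):
        for x in range(32):
            if dif[x][y] > thl:
                min_x = min(min_x, x)
                max_x = max(max_x, x)
                min_y = min(min_y, y)
                max_y = max(max_y, y)
    if max_x < 0:
        return None               # no glove imaged in this frame
    return min_x, max_x, min_y, max_y
```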
Next, the high speed processor 91 scans the differential image data in the positive x-axis direction from the coordinates (minX, minY) as a start point, in order to calculate the distance “LT” between the start point and the pixel at which the luminance value first exceeds the threshold value “ThL”. Also, the high speed processor 91 scans the differential image data in the negative x-axis direction from the coordinates (maxX, minY) as a start point, in order to calculate the distance “RT” between the start point and the pixel at which the luminance value first exceeds the threshold value “ThL”. Furthermore, the high speed processor 91 scans the differential image data in the positive x-axis direction from the coordinates (minX, maxY) as a start point, in order to calculate the distance “LB” between the start point and the pixel at which the luminance value first exceeds the threshold value “ThL”. Still further, the high speed processor 91 scans the differential image data in the negative x-axis direction from the coordinates (maxX, maxY) as a start point, in order to calculate the distance “RB” between the start point and the pixel at which the luminance value first exceeds the threshold value “ThL”.
If the distances satisfy LT>RT, the high speed processor 91 sets a first extraction point to the coordinates (maxX, minY), and if the distances satisfy LT≦RT, the high speed processor 91 sets the first extraction point to the coordinates (minX, minY). Also, if the distances satisfy LB>RB, the high speed processor 91 sets a second extraction point to the coordinates (maxX, maxY), and if the distances satisfy LB≦RB, the high speed processor 91 sets the second extraction point to the coordinates (minX, maxY).
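The LT/RT/LB/RB comparisons above can be sketched directly. The helper name and the dif[X][Y] array layout are assumptions for illustration:

```python
def first_bright_distance(dif, thl, x0, y, step):
    """Distance from (x0, y) to the first pixel whose luminance exceeds
    thl, scanning along the x axis (step = +1 or -1)."""
    d = 0
    x = x0
    while 0 <= x < 32 and dif[x][y] <= thl:
        x += step
        d += 1
    return d


def extraction_points(dif, thl, min_x, max_x, min_y, max_y):
    """Choose the two extraction points on the top (minY) and bottom
    (maxY) rows of the bounding box, per the LT/RT and LB/RB rules."""
    lt = first_bright_distance(dif, thl, min_x, min_y, +1)
    rt = first_bright_distance(dif, thl, max_x, min_y, -1)
    lb = first_bright_distance(dif, thl, min_x, max_y, +1)
    rb = first_bright_distance(dif, thl, max_x, max_y, -1)
    p1 = (max_x, min_y) if lt > rt else (min_x, min_y)   # first extraction point
    p2 = (max_x, max_y) if lb > rb else (min_x, max_y)   # second extraction point
    return p1, p2
```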
FIG. 12 is an explanatory view showing the right/left determination process performed by the high speed processor 91. FIG. 12 shows the position TPL2 of the glove type input article 7L as determined just before (one video frame before) and the position TPL1 of the glove type input article 7L as determined twice before (two video frames before), and the position TPR2 of the glove type input article 7R as determined just before (one video frame before) and the position TPR1 of the glove type input article 7R as determined twice before (two video frames before). The positions TPL1, TPL2, TPR1 and TPR2 are positions in the image on the basis of the differential image data.
The high speed processor 91 calculates a velocity vector VL which has its start point at the position TPL1 and its end point at the position TPL2. Then, the predicted position TPLp of the glove type input article 7L is obtained as the end point of the velocity vector VL having the position TPL2 as its start point. On the other hand, the high speed processor 91 calculates a velocity vector VR which has its start point at the position TPR1 and its end point at the position TPR2. Then, the predicted position TPRp of the glove type input article 7R is obtained as the end point of the velocity vector VR having the position TPR2 as its start point.
The high speed processor 91 obtains the distance LD1 between the first extraction point TPN1 and the predicted position TPLp, the distance RD1 between the first extraction point TPN1 and the predicted position TPRp, the distance LD2 between the second extraction point TPN2 and the predicted position TPLp, and the distance RD2 between the second extraction point TPN2 and the predicted position TPRp.
If the distances satisfy LD1>RD1, the high speed processor 91 sets the current position of the glove type input article 7R to the first extraction point TPN1, and if the distances satisfy LD1≦RD1, the high speed processor 91 sets the current position of the glove type input article 7L to the first extraction point TPN1. Also, if the distances satisfy LD2>RD2, the high speed processor 91 sets the current position of the glove type input article 7R to the second extraction point TPN2, and if the distances satisfy LD2≦RD2, the high speed processor 91 sets the current position of the glove type input article 7L to the second extraction point TPN2. Incidentally, in the case where the left and right predicted positions TPLp and TPRp cannot be calculated, for example just after the game starts, the coordinates of the glove type input article 7L are set to the coordinates of whichever of the first extraction point TPN1 and the second extraction point TPN2 has the X-coordinate “minX”, and the coordinates of the glove type input article 7R are set to the coordinates of the other extraction point, whose X-coordinate is “maxX”.
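The linear prediction and the nearest-predicted-position assignment above can be sketched as follows; squared distances are used since only comparisons matter:

```python
def predict(p1, p2):
    """Extend the motion from p1 (two frames before) to p2 (one frame
    before) by one more frame: the end point of the velocity vector
    placed with its start point at p2."""
    return (2 * p2[0] - p1[0], 2 * p2[1] - p1[1])


def dist2(a, b):
    """Squared Euclidean distance (sufficient for comparisons)."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2


def assign_left_right(tpn1, tpn2, tpl1, tpl2, tpr1, tpr2):
    """Associate the extraction points TPN1 and TPN2 with the left ('L')
    or right ('R') glove by comparing each against the predicted
    positions TPLp and TPRp, per the LD1/RD1 and LD2/RD2 rules."""
    tplp = predict(tpl1, tpl2)
    tprp = predict(tpr1, tpr2)
    first = 'R' if dist2(tpn1, tplp) > dist2(tpn1, tprp) else 'L'
    second = 'R' if dist2(tpn2, tplp) > dist2(tpn2, tprp) else 'L'
    return first, second
```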
As has been discussed above, since the first extraction point TPN1 and the second extraction point TPN2 are associated respectively with left and right or right and left on the basis of the left and right predicted positions TPLp and TPRp, the high speed processor 91 can properly recognize the glove type input articles 7L and 7R in the image on the basis of the differential image data even if the relative positions of the glove type input articles 7L and 7R are switched (crossed) between left and right.
FIGS. 13A, 13C and 13E are explanatory views showing the process of calculating velocity vectors by the high speed processor 91, and FIGS. 13B, 13D and 13F are explanatory views showing the glove motion determination process performed by the high speed processor 91. These drawings are explanatory views showing the motion determination process for the glove type input article 7L.
Positions TPL1 to TPL3 of the glove type input article 7L are illustrated in FIGS. 13A, 13C and 13E. The positions TPL1 to TPL3 are positions in the image on the basis of the differential image data. As shown in these drawings, the high speed processor 91 calculates a velocity vector “V” by the use of the position of the glove type input article 7L two video frames before (for example, the position TPL1) as a start point, and the current position (for example, the position TPL3) as an end point.
Then, the high speed processor 91 places the start point TPL1 of the velocity vector “V” at the origin of the virtual screen (32×32 pixels) as illustrated in FIGS. 13B, 13D and 13F, and determines in which area the end point TPL3 is located. If the end point TPL3 of the vector “V” is located in an “immovable” area, the high speed processor 91 determines that the player 11 does not throw a left punch (refer to FIG. 13B). In other words, although the glove type input article 7L is actually moved, the high speed processor 91 does not recognize the movement as a punch. If the end point TPL3 of the vector “V” is located in a “straight” area, the high speed processor 91 determines that the player 11 throws a left straight punch (refer to FIG. 13D). In other words, the high speed processor 91 determines that the glove type input article 7L is moved straight. If the end point TPL3 of the vector “V” is located in a “cross” area, the high speed processor 91 determines that the player 11 throws a left cross punch (refer to FIG. 13F). In other words, the high speed processor 91 determines that the glove type input article 7L is moved in the form of a cross.
As understood from FIGS. 13B, 13D and 13F, in this motion determination process, the processor 91 places the start point TPL1 of the velocity vector “V” at the origin of the virtual screen. The origin of the virtual screen is located at the center position of its lower side.
For the glove type input article 7R, a virtual screen is provided by laterally inverting the virtual screen as illustrated in FIGS. 13B, 13D and 13F, and the motion (immovable, straight, cross) determination process is performed in the same manner as that for the glove type input article 7L.
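The area test above can be sketched as a classifier of the velocity vector. The area boundaries (straight_len, cross_dx) are entirely hypothetical, since the specification defines the “immovable”, “straight” and “cross” areas only in its figures; the lateral inversion for the right glove follows the paragraph above:

```python
def classify_left_motion(v, straight_len=6, cross_dx=4):
    """Classify the left-glove velocity vector v = (dx, dy), placed with
    its start point at the origin of the virtual screen (dy < 0 means
    upward motion on the screen).  Boundary values are assumptions."""
    dx, dy = v
    if dx * dx + dy * dy < straight_len * straight_len:
        return 'immovable'   # motion too small to be recognized as a punch
    if abs(dx) <= cross_dx:
        return 'straight'    # mostly straight ahead
    return 'cross'           # large lateral component


def classify_right_motion(v, **kw):
    """Right glove: the virtual screen is laterally inverted."""
    dx, dy = v
    return classify_left_motion((-dx, dy), **kw)
```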
Next, the process flow of the boxing game system will be explained with reference to a flowchart.
FIG. 14 is a flowchart showing an example of the overall process flow performed by the high speed processor 91. As shown in FIG. 14, the high speed processor 91 performs an initialization process in step S1. More specifically, the system hardware and the respective variables are initialized.
The high speed processor 91 performs the imaging process of the glove type input articles 7L and 7R in step S2. Also, in step S3, the high speed processor 91 performs the process of detecting the glove type input articles 7L and 7R on the basis of the result of the imaging process in step S2. The high speed processor 91 proceeds to step S5 if the game state is “game mode selection”, proceeds to step S6 if the game state is “fighting”, proceeds to step S7 if the game state is “falling down”, proceeds to step S8 if the game state is “end of round”, and proceeds to step S10 if the game state is “end of bout”. The game state is initialized to “game mode selection” at power up.
In step S5, the high speed processor 91 performs the game mode selection process in response to the motion of the glove type input articles 7L and 7R. In step S6, the high speed processor 91 controls the motion of the CPU boxer 215 and the motion of the gloves 217L and 217R in response to the motion of the glove type input articles 7L and 7R. The high speed processor 91 performs the fighting process between the CPU boxer 215 and the player's boxer in this way. In this case, the high speed processor 91 decreases the mental energy indicated by the mental indicator 221b or 223b of the boxer who takes a punch, judges a falling down when the mental energy is exhausted to zero, and sets the game state to “falling down”. Also, when a predetermined time elapses and one round ends, the high speed processor 91 sets the game state to “end of round”. Furthermore, when the final round ends, the high speed processor 91 sets the game state to “end of bout” after performing the process of steps S8 and S9.
In this example, if the end point of the velocity vector “V” of the glove type input article 7L is located in the straight area or the cross area shown in FIG. 13, the high speed processor 91 determines that a punch is thrown with the glove type input article 7L. This is true also for the glove type input article 7R.
In step S7, the high speed processor 91 performs the process for the falling down. In the case where the player's boxer falls down, if the glove type input articles 7L and 7R are swung a predetermined number of times, which is determined in accordance with the remaining physical energy, before the count of 10, the game state is set to “fighting” by this process. If the glove type input articles 7L and 7R are not swung the predetermined number of times before the count of 10, the high speed processor 91 judges a knockout and sets the game state to “end of bout”.
On the other hand, when the CPU boxer falls down, the game state is set to “fighting” or “end of bout” in accordance with the behavioral algorithm of the CPU boxer.
In step S8, the high speed processor 91 calculates the calorie consumption of the exercising player 11 in the current round. In step S9, the high speed processor 91 performs the settings of animation and the display positions in order to display the calorie consumption of the player 11 and the points of the respective boxers in the round.
In step S10, the high speed processor 91 sums up the energy consumptions of the player 11 in the respective rounds to calculate the total calorie consumption of the player 11 through the bout. In step S11, the high speed processor 91 performs the settings of animation and the display positions in order to display the total calorie consumption of the player 11 and the outcome of the bout. On the other hand, when the time runs out, the outcome of the bout is determined.
If there is an interrupt by a video system synchronous signal in step S12, the process proceeds to step S13; otherwise, step S12 is repeated. The interrupt by the video system synchronous signal is issued at 1/60 second intervals.
In step S13, the high speed processor 91 updates the display image (video frame) of the television monitor 5 on the basis of the animation and display positions as set in steps S5 to S11.
The sound process in step S14 is performed when an audio interrupt is issued, in order to output game music and other sound effects.
FIG. 15 is a flowchart showing an example of the imaging process of step S2 ofFIG. 14. As shown inFIG. 15, thehigh speed processor91 turns on the infraredlight emitting diode53 in step S20. In step S21, thehigh speed processor91 acquires, from theimage sensor161, image data obtained when infrared light is emitted, and stores the image data in the internal memory.
As has been discussed above, the present embodiment makes use of theimage sensor161 of 32 pixels×32 pixels. Accordingly, 32pixels 32 pixels of pixel data (luminance data for each pixel) is output as image data from theimage sensor161. This pixel data is converted into digital data by the A/D converter and stored in the internal memory as the array elements “P1 [X] [Y]”.
In step S22, thehigh speed processor91 turns off the infraredlight emitting diode53. In step S23, thehigh speed processor91 acquires, from theimage sensor161, image data (32pixels 32 pixels of pixel data (luminance data for each pixel)) obtained when infrared light is not emitted, and stores the image data in the internal memory. In this case, the image data obtained when infrared light is not emitted is stored in the array elements “P2[X][Y]” of the internal memory.
The stroboscope imaging is performed in this way. Since the image sensor 161 of 32 pixels×32 pixels is used in the case of the present embodiment, X=0 to 31 and Y=0 to 31.
FIG. 16 is a flowchart showing an example of the glove detection process of step S3 of FIG. 14. As shown in FIG. 16, in step S30, the high speed processor 91 calculates the differential data between the pixel data “P1[X][Y]” acquired when infrared light is emitted and the pixel data “P2[X][Y]” acquired when infrared light is not emitted, and the differential data is assigned to the respective array elements “Dif[X][Y]”. In step S31, when the process of calculating the 32×32 pixels of the differential image is finished, the high speed processor 91 proceeds to step S32, but if the process is not finished yet, the high speed processor 91 proceeds to step S30. In this way, the high speed processor 91 repeats the process of step S30 to generate differential image data between the image data obtained with infrared light illumination and the image data obtained without infrared light illumination. The noise due to lights other than reflected light from the retroreflective sheets 21 of the glove type input articles 7L and 7R can be eliminated, as much as possible, by obtaining the differential image data (differential image), and thereby the glove type input articles 7L and 7R can be detected with a high degree of accuracy.
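The differential-image generation in steps S30 and S31 can be sketched as follows (a minimal Python sketch; the array names P1, P2 and Dif follow the text, while the function name and the clamping of negative differences to zero are illustrative assumptions):

```python
SIZE = 32  # the embodiment uses a 32x32-pixel image sensor

def differential_image(P1, P2):
    """Step S30 repeated over all pixels: subtract the unlit frame P2 from
    the lit frame P1 so that only light reflected from the retroreflective
    sheets remains; negative differences are clamped to zero (assumption)."""
    return [[max(P1[X][Y] - P2[X][Y], 0) for Y in range(SIZE)]
            for X in range(SIZE)]
```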
In step S32, the high speed processor 91 performs the process of detecting the left, right, upper and lower ends (minX, maxX, minY, maxY) as explained with reference to FIG. 11. In step S33, the high speed processor 91 performs the process of determining the positions of two points (the first extraction point (Xtp[0], Ytp[0]) and the second extraction point (Xtp[1], Ytp[1])) as explained with reference to FIG. 11. In step S34, the high speed processor 91 calculates the center coordinates between the first extraction point (Xtp[0], Ytp[0]) and the second extraction point (Xtp[1], Ytp[1]). Then, the center coordinates are converted into the corresponding screen coordinates.
FIG. 17 is a flowchart showing an example of the process of detecting the left, right, upper and lower ends in step S32 of FIG. 16. This flowchart is an example of the process of detecting the left, right, upper and lower ends as explained with reference to FIG. 11.
As shown in FIG. 17, the high speed processor 91 assigns “0” to “X”, “Y”, “maxX”, “maxY” and “k” in step S40. Also, the high speed processor 91 assigns “31” to “minX” and “minY”.
In step S41, the high speed processor 91 compares the array element “Dif[X][Y]” with a predetermined threshold value “ThL”. If the array element “Dif[X][Y]” is larger than the predetermined threshold value “ThL” in step S42, the high speed processor 91 proceeds to step S43, and conversely if the array element “Dif[X][Y]” is not larger than the predetermined threshold value “ThL”, the high speed processor 91 proceeds to step S55.
The process in steps S41 and S42 is the process for detecting whether or not the glove type input article 7L or 7R is imaged. Since the glove type input articles 7L and 7R are provided with the retroreflective sheets 21, when the glove type input article 7L or 7R is imaged, the luminance values of the pixels on the differential image corresponding to the retroreflective sheet 21 become large. Because of this, by comparing the luminance values with the threshold value “ThL”, the pixels having luminance values larger than the threshold value “ThL” are recognized as part of the retroreflective sheet 21 as imaged.
In step S43, the high speed processor 91 increments the counter value “k” by one. In step S44, the high speed processor 91 determines whether or not the counter value “k” is “1”, and if k=1 the process proceeds to step S45, otherwise the process proceeds to step S46.
In step S45, the high speed processor 91 assigns the current Y-coordinate to the minimum Y-coordinate “minY”. In other words, scanning starts from (X, Y)=(0, 0), and “X” is incremented from “0” to “31” while “Y” is fixed; each time “X” reaches “31”, “X” is returned to “0”, “Y” is incremented, and “X” is incremented again from “0” to “31” (refer to steps S55 to S59 to be described below). Thereby, the value “Y” of the first array element “Dif[X][Y]” (i.e., pixel) exceeding the threshold value “ThL” is necessarily the minimum Y-coordinate “minY”.
In step S46, the high speed processor 91 compares the current Y-coordinate with the current maximum Y-coordinate “maxY”. If the current Y-coordinate is larger than the current maximum Y-coordinate “maxY” in step S47, the high speed processor 91 proceeds to step S48, otherwise proceeds to step S49. In step S48, the high speed processor 91 assigns the current Y-coordinate to the maximum Y-coordinate “maxY”.
In step S49, the high speed processor 91 compares the current minimum X-coordinate “minX” with the current X-coordinate. If the current X-coordinate is smaller than the current minimum X-coordinate “minX” in step S50, the high speed processor 91 proceeds to step S51, otherwise proceeds to step S52. In step S51, the high speed processor 91 assigns the current X-coordinate to the minimum X-coordinate “minX”.
In step S52, the high speed processor 91 compares the current X-coordinate with the current maximum X-coordinate “maxX”. If the current X-coordinate is larger than the current maximum X-coordinate “maxX” in step S53, the high speed processor 91 proceeds to step S54, otherwise proceeds to step S55. In step S54, the high speed processor 91 assigns the current X-coordinate to the maximum X-coordinate “maxX”.
In step S55, the high speed processor 91 increments “X” by one. If X=32 in step S56 (i.e., when the process of one line of the differential image is finished), the high speed processor 91 proceeds to step S57, otherwise the high speed processor 91 proceeds to step S41.
In step S57, the high speed processor 91 assigns “0” to “X”. In step S58, the high speed processor 91 increments “Y” by one. Since one line of the differential image is completely processed, the steps S57 and S58 are taken to repeat the process for the next line.
If Y=32 in step S59 (i.e., when the process of the 32×32 pixels of the differential image is finished), the high speed processor 91 returns to the routine of FIG. 16, otherwise the high speed processor 91 proceeds to step S41.
The minimum X-coordinate “minX”, maximum X-coordinate “maxX”, minimum Y-coordinate “minY” and maximum Y-coordinate “maxY” are finally determined when Y=32 after repeating the above steps S41 to S59.
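The raster scan of steps S40 to S59 amounts to the following sketch (variable names follow the text; returning None when no pixel exceeds the threshold is an illustrative addition not shown in the flowchart):

```python
def detect_extents(Dif, ThL, size=32):
    """Scan the differential image with "X" as the inner loop and "Y" as the
    outer loop, as in steps S40 to S59, and return (minX, maxX, minY, maxY)
    of the pixels whose value exceeds the threshold "ThL"."""
    minX, minY = size - 1, size - 1   # step S40 assigns "31" to minX, minY
    maxX = maxY = k = 0
    for Y in range(size):
        for X in range(size):
            if Dif[X][Y] > ThL:       # steps S41 and S42
                k += 1
                if k == 1:
                    minY = Y          # step S45: first hit in raster order
                maxY = max(maxY, Y)   # steps S46 to S48
                minX = min(minX, X)   # steps S49 to S51
                maxX = max(maxX, X)   # steps S52 to S54
    return (minX, maxX, minY, maxY) if k else None
```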
FIG. 18 is a flowchart showing an example of the process of determining two points in step S33 of FIG. 16. This flowchart is an example of the process of determining two points as explained with reference to FIG. 11.
As shown in FIG. 18, the high speed processor 91 assigns “0” to “M” in step S70, and repeatedly performs the process from step S71 to step S87. In this case, as shown in step S71, Ytb=minY in the first loop and Ytb=maxY in the second loop. In step S72, the high speed processor 91 starts scanning from the coordinates (minX, Ytb) as a starting point.
In step S73, the high speed processor 91 assigns “0” to the counter value “C1”. In step S74, the high speed processor 91 compares the differential data Dif[X][Y] with the threshold value “ThL”, and proceeds to step S77 if the differential data is larger than the threshold value, otherwise proceeds to step S75. In step S75, the high speed processor 91 increments the counter value “C1” by one. In step S76, the high speed processor 91 increments the coordinate “X” by one and proceeds to step S74.
When it is determined in step S74 that Dif[X][Y]>ThL, the current counter value “C1” is equal to the distance “LT” or “LB” shown in FIG. 11. That is, C1=LT if Ytb=minY, and C1=LB if Ytb=maxY.
In step S77, the high speed processor 91 starts scanning from the coordinates (maxX, Ytb) as a starting point. In step S78, the high speed processor 91 assigns “0” to the counter value “Cr”. In step S79, the high speed processor 91 compares the differential data Dif[X][Y] with the threshold value “ThL”, and proceeds to step S82 if the differential data is larger than the threshold value, otherwise proceeds to step S80. In step S80, the high speed processor 91 increments the counter value “Cr” by one. In step S81, the high speed processor 91 decrements the coordinate “X” by one, and proceeds to step S79.
When it is determined in step S79 that Dif[X][Y]>ThL, the counter value “Cr” is equal to the distance “RT” or “RB” shown in FIG. 11. That is, Cr=RT if Ytb=minY, and Cr=RB if Ytb=maxY.
In step S82, the high speed processor 91 compares the distance “Cr” with the distance “C1”. If the distance “C1” is larger than the distance “Cr” in step S83, the process proceeds to step S85, otherwise proceeds to step S84.
In step S84, “minX” is assigned to “Xtp[M]”, and “Ytb” is assigned to “Ytp[M]”. On the other hand, in step S85, “maxX” is assigned to “Xtp[M]”, and “Ytb” is assigned to “Ytp[M]”.
In this case, the coordinates (Xtp[0], Ytp[0]) are the coordinates of the first extraction point as explained with reference to FIG. 11, and the coordinates (Xtp[1], Ytp[1]) are the coordinates of the second extraction point as explained with reference to FIG. 11.
In step S86, the high speed processor 91 increments “M” by one, and proceeds to step S87. When the loop from step S71 to step S87 is finished, the process returns to the routine of FIG. 16.
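Under the same naming, the two-point determination of steps S70 to S87 can be sketched as follows (a simplified sketch that assumes each of the rows minY and maxY contains at least one above-threshold pixel, which holds by construction):

```python
def extract_two_points(Dif, ThL, extents):
    """For each of the top row (Ytb=minY) and the bottom row (Ytb=maxY),
    count the run of sub-threshold pixels inward from the left edge (C1,
    steps S72 to S76) and from the right edge (Cr, steps S77 to S81), and
    keep the end whose inward run is shorter (steps S82 to S85)."""
    minX, maxX, minY, maxY = extents
    points = []
    for Ytb in (minY, maxY):
        C1, X = 0, minX
        while Dif[X][Ytb] <= ThL:     # scan rightwards from (minX, Ytb)
            C1 += 1
            X += 1
        Cr, X = 0, maxX
        while Dif[X][Ytb] <= ThL:     # scan leftwards from (maxX, Ytb)
            Cr += 1
            X -= 1
        points.append((maxX, Ytb) if C1 > Cr else (minX, Ytb))
    return points  # [(Xtp[0], Ytp[0]), (Xtp[1], Ytp[1])]
```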
FIG. 19 is a flowchart showing an example of the selection process in step S5 of FIG. 14, i.e., of the process flow as explained with reference to FIGS. 8A and 8B and FIG. 9. As shown in FIG. 19, in step S101, the high speed processor 91 calculates the center coordinates between the center coordinates currently obtained and the center coordinates previously obtained, as calculated in step S34 of FIG. 16, and sets the coordinates as calculated here to the current coordinates of the cursor 201 (referred to as the cursor coordinates). Incidentally, the center coordinates as used are screen coordinates after conversion. In step S102, if the current cursor coordinates are located in the selection acceptable area 211 in which the selection button 203U or 203D is included, the high speed processor 91 proceeds to step S103, otherwise proceeds to step S110.
In step S103, the high speed processor 91 sets an area flag to a value corresponding to the selection button 203U or 203D in the selection acceptable area 211 in which the current cursor coordinates are located. In step S104, the high speed processor 91 resets the current position of the cursor 201 to the center of the selection button 203U or 203D in the selection acceptable area 211 in which the current cursor coordinates are located. In step S105, the high speed processor 91 performs the settings of the animation of the indicator 202U or 202D which indicates the passage of time and is provided in the selection acceptable area 211 in which the current cursor coordinates are located. By this process, the corresponding indicator 202U or 202D is gradually filled with the predetermined color as time passes.
In step S106, the high speed processor 91 checks the area flag, determines whether or not its value is the same as the previous value, and if it is the same the process proceeds to step S108, otherwise proceeds to step S107. In step S107, the high speed processor 91 resets the elapsed time (returns it to 0), and proceeds to step S108. In step S108, the high speed processor 91 determines whether or not a predetermined time has elapsed, and if it has elapsed, the process proceeds to step S109, otherwise returns to the main routine of FIG. 14. In step S109, the high speed processor 91 performs the settings of animation and the display positions in order to display the next game mode selection screen in accordance with the direction of the arrow of the corresponding indicator 202U or 202D.
On the other hand, if the current cursor coordinates are located in the determination acceptable area 213 in step S110, the process proceeds to step S111, otherwise proceeds to step S118. In step S118, the high speed processor 91 sets the area flag to a value indicating that the current cursor coordinates are located neither in the selection button 203U or 203D nor in the OK button 207.
By the way, supplementary explanation will be given in regard to the process of FIG. 11, FIG. 17 and FIG. 18. FIG. 11 illustrates an exemplary case where both the glove type input articles 7L and 7R are imaged. However, even when only one of the glove type input articles 7L and 7R is imaged, it is apparent that, by the process of FIG. 17 and FIG. 18, the maximum X-coordinate “maxX”, minimum X-coordinate “minX”, maximum Y-coordinate “maxY” and minimum Y-coordinate “minY” can be obtained, and also the first extraction point and the second extraction point can be obtained.
On the other hand, in step S111, the high speed processor 91 sets the area flag to a value corresponding to the OK button 207 in the determination acceptable area 213 in which the current cursor coordinates are located. In step S112, the high speed processor 91 resets the current position of the cursor 201 to the center position of the OK button 207. In step S113, the high speed processor 91 sets the animation of the indicator 209 indicative of the passage of time. By this process, as time passes, the indicator 209 is gradually filled with a predetermined color in the clockwise direction.
In step S114, the high speed processor 91 checks the area flag, determines whether or not its value is the same as the previous value, and if it is the same the process proceeds to step S116, otherwise proceeds to step S115. In step S115, the high speed processor 91 resets the elapsed time (returns it to 0), and proceeds to step S116. In step S116, the high speed processor 91 determines whether or not a predetermined time has elapsed, and if it has elapsed, the process proceeds to step S117, otherwise returns to the main routine of FIG. 14. In step S117, the high speed processor 91 starts the process which is to be performed in the selected game mode. In this case, the game state is set to “fighting”.
FIG. 20 is a flowchart showing an example of the process flow during fighting in step S6 of FIG. 14. As shown in FIG. 20, in step S120, the high speed processor 91 determines which of the first extraction point and the second extraction point as obtained in the process of FIG. 18 corresponds to the right or left hand.
In step S121, the high speed processor 91 evaluates the motion of the gloves 217L and 217R of the player's boxer, and determines a straight punch, a cross punch or no punch. In step S122, the high speed processor 91 updates the display positions of the gloves 217L and 217R of the player's boxer. In step S123, the high speed processor 91 calculates the difference between the horizontal component (x component) of the center coordinates as currently obtained in step S34 of FIG. 16 and the horizontal component (x component) of the center coordinates as previously obtained, i.e., the moving distance of the center point in the horizontal direction, and adds it to the accumulated value “Dm” as previously obtained. As thus described, in step S123, the accumulated value “Dm” (i.e., sum of displacement “Dm”) is acquired by successively accumulating the moving distance of the center point in the horizontal direction, i.e., the moving distance of the center point between the glove 217L and the glove 217R in the horizontal direction. Incidentally, the center coordinates are screen coordinates after conversion.
In step S124, the high speed processor 91 controls the motion of the opposing boxer (i.e., the CPU boxer 215). In other words, the high speed processor 91 performs the settings of animation and position of the opposing boxer in accordance with the behavioral algorithm of the opposing boxer. In step S125, the high speed processor 91 controls the display of the background image in response to the position of the opposing boxer and the positions of the gloves 217L and 217R of the player's boxer.
More specific description is as follows. In the case of the present embodiment, horizontal 256 pixels (width)×vertical 224 pixels (height) are displayed in the screen of the television monitor 5, and three areas are defined by dividing the display area into three in the horizontal (width) direction. The background image is scrolled in the right direction when the opposing boxer moves from the center area to the left area, while the background image is scrolled in the left direction when the opposing boxer moves from the center area to the right area, in order to control the background image such that the opposing boxer is located in the center area. Such background control is equivalent to changing the angle of a camera when taking the image of the ring.
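The three-area scroll decision described above can be sketched as follows (the function name and the use of the opposing boxer's horizontal screen coordinate as the sole input are illustrative assumptions):

```python
def scroll_direction(opponent_x, width=256):
    """Divide the 256-pixel-wide screen into three horizontal areas and
    scroll the background right when the opposing boxer is in the left
    area, left when he is in the right area, and not at all otherwise."""
    third = width // 3
    if opponent_x < third:
        return "right"
    if opponent_x >= 2 * third:
        return "left"
    return "none"
```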
Also, the positions of the opposing boxer and the background image are controlled in accordance with the motions of the center point between the first extraction point and the second extraction point which are obtained by the process of FIG. 16. In other words, the positions of the opposing boxer and the background image are controlled in accordance with the moving distance of the center point between the first extraction point and the second extraction point in the horizontal direction. The opposing boxer and the background image are scrolled to the right when the center point moves to the left in relation to the center of the screen, and the opposing boxer and the background image are scrolled to the left when the center point moves to the right in relation to the center of the screen. Such control of the opposing boxer and the background image is performed for taking the motion parallax of the player's boxer into consideration.
In step S126, the high speed processor 91 determines whether or not a punch of the player's boxer hits the opposing boxer. In step S127, the high speed processor 91 determines whether or not a punch of the opposing boxer hits the player's boxer. More specifically speaking, if either the right or left punch of the opposing boxer is located between the glove 217L and the glove 217R of the player's boxer, it is determined that the punch hits the player's boxer. On the other hand, if either the right or left punch of the opposing boxer is located outside of the glove 217L or the glove 217R of the player's boxer, it is determined that the punch is defended.
In step S128, the high speed processor 91 determines whether or not the round ends, and if the round ends the process proceeds to step S129 to set the game state to “end of round”, and returns to the main routine, otherwise proceeds to step S130. In step S130, the high speed processor 91 determines whether or not the bout is over, i.e., whether or not all the rounds end, and if the bout is over, the process proceeds to step S131 to set the game state to “end of bout”, and returns to the main routine. On the other hand, if the bout is not over, the process returns to the main routine.
FIG. 21 is a flowchart showing an example of the right/left determination process in step S120 of FIG. 20. This flowchart is also an example of the process of determining right/left as explained with reference to FIG. 12. Incidentally, the position of the glove type input article 7L is called the left extraction point, and the position of the glove type input article 7R is called the right extraction point.
As shown in FIG. 21, in step S140, the high speed processor 91 predicts the current position (Xnl, Ynl) of the left extraction point from the previous position (XL[0], YL[0]) of the left extraction point. In step S141, the high speed processor 91 predicts the current position (Xnr, Ynr) of the right extraction point from the previous position (XR[0], YR[0]) of the right extraction point. In this case, the position (Xnl, Ynl) of the left extraction point corresponds to the predicted position TPLp of FIG. 12, and the position (Xnr, Ynr) of the right extraction point corresponds to the predicted position TPRp of FIG. 12.
In step S142, the high speed processor 91 assigns “0” to “M”. In step S143, the high speed processor 91 calculates the distance Dl between the predicted position (Xnl, Ynl) and the extraction point (Xtp[M], Ytp[M]). In step S144, the high speed processor 91 calculates the distance Dr between the predicted position (Xnr, Ynr) and the extraction point (Xtp[M], Ytp[M]).
In this case, the extraction point (Xtp[0], Ytp[0]) is the first extraction point as obtained by the routine of FIG. 18, and the extraction point (Xtp[1], Ytp[1]) is the second extraction point as obtained by the routine of FIG. 18. When M=0, the distance Dl corresponds to the distance LD1 of FIG. 12, and the distance Dr corresponds to the distance RD1 of FIG. 12. Also, when M=1, the distance Dl corresponds to the distance LD2 of FIG. 12, and the distance Dr corresponds to the distance RD2 of FIG. 12.
In step S145, the high speed processor 91 compares the distance Dr and the distance Dl. If Dl>Dr in step S146, the high speed processor 91 proceeds to step S148, otherwise proceeds to step S147.
In step S147, the high speed processor 91 sets the position (XL[2], YL[2]) to the position (XL[1], YL[1]) of the left extraction point as determined twice before, and sets the position (XL[1], YL[1]) to the position (XL[0], YL[0]) of the left extraction point as determined just before. Then, the high speed processor 91 sets the coordinates (Xtp[M], Ytp[M]) to the current position (XL[0], YL[0]) of the left extraction point.
On the other hand, in step S148, the high speed processor 91 sets the position (XR[2], YR[2]) to the position (XR[1], YR[1]) of the right extraction point as determined twice before, and sets the position (XR[1], YR[1]) to the position (XR[0], YR[0]) of the right extraction point as determined just before. Then, the high speed processor 91 sets the coordinates (Xtp[M], Ytp[M]) to the current position (XR[0], YR[0]) of the right extraction point.
In step S149, the high speed processor 91 increments the variable “M” by one. In step S150, the high speed processor 91 determines whether or not M=2, and if M=2 the process returns to the routine of FIG. 20, otherwise proceeds to step S143.
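The right/left determination of FIG. 21 can be sketched as follows; since the text does not spell out how the current positions are predicted from the previous ones, a simple linear extrapolation from the last two determined positions is assumed here, and the history update of steps S147/S148 is modeled with three-element lists:

```python
import math

def assign_left_right(left_hist, right_hist, points):
    """left_hist and right_hist are lists [current, previous, twice-before]
    of (x, y) tuples; points are the two extraction points of FIG. 18.
    Each point is assigned to the hand whose predicted position is nearer
    (steps S143 to S148), and that hand's history is shifted back."""
    def predict(hist):
        (x0, y0), (x1, y1) = hist[0], hist[1]
        return (2 * x0 - x1, 2 * y0 - y1)   # assumed linear extrapolation

    pl, pr = predict(left_hist), predict(right_hist)   # steps S140, S141
    for pt in points:
        Dl = math.dist(pl, pt)   # step S143
        Dr = math.dist(pr, pt)   # step S144
        hist = right_hist if Dl > Dr else left_hist    # steps S145, S146
        hist[2], hist[1], hist[0] = hist[1], hist[0], pt
```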
FIG. 22 is a flowchart showing an example of the glove motion determination process in step S121 of FIG. 20. This flowchart is also an example of the process of determining the glove motion as explained with reference to FIG. 13.
As shown in FIG. 22, the high speed processor 91 repeats the process from step S160 to step S169. When i=0, the motion of the glove type input article 7L is determined, and when i=1, the motion of the glove type input article 7R is determined.
In step S161, the high speed processor 91 calculates the velocity vector Vi by the following equation.
Vi=(Xi[0]−Xi[2], Yi[0]−Yi[2])
In this case, the coordinates (X0[0], Y0[0]) are the current left extraction point (XL[0], YL[0]) corresponding to the left position TPL3 of FIG. 13, and the coordinates (X0[2], Y0[2]) are the left extraction point (XL[2], YL[2]) as determined twice before, corresponding to the left position TPL1 of FIG. 13. Accordingly, the velocity vector V0 corresponds to the velocity vector V of FIG. 13. On the other hand, the coordinates (X1[0], Y1[0]) are the current right extraction point (XR[0], YR[0]), and the coordinates (X1[2], Y1[2]) are the right extraction point (XR[2], YR[2]) as determined twice before.
In step S162, the high speed processor 91 places the start point of the velocity vector Vi at the origin in the virtual screen, and determines in which area the end point of the velocity vector Vi is located (refer to FIGS. 13B, 13D and 13F). If the end point of the velocity vector Vi is located in the “immovable area” in step S163, the high speed processor 91 proceeds to step S164 in which an immovable flag IFi is turned on, otherwise proceeds to step S165.
If the end point of the velocity vector Vi is located in the “straight area” in step S165, the high speed processor 91 proceeds to step S167 in which a straight flag SFi is turned on, otherwise the end point is located in the “cross area” and thereby the process proceeds to step S166 in which a cross flag CFi is turned on. In step S168, the high speed processor 91 increments the count value “Np” of a punch counter which counts the number of punches, and the process proceeds to step S169. In this manner, the number of punches is counted with no distinction between straight and cross and between right and left.
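The per-glove classification of steps S161 to S167 can be sketched as follows; the area boundaries of FIGS. 13B, 13D and 13F are not given numerically in the text, so the magnitude threshold and the split between the “straight” and “cross” areas below are illustrative assumptions only:

```python
def classify_motion(curr, twice_before, th_move=4):
    """Compute the velocity vector Vi = current - twice-before (step S161)
    and classify its end point: a small vector falls in the "immovable
    area"; otherwise a mainly vertical vector is taken as "straight" and a
    mainly horizontal one as "cross" (assumed area boundaries)."""
    vx = curr[0] - twice_before[0]
    vy = curr[1] - twice_before[1]
    if vx * vx + vy * vy < th_move * th_move:
        return "immovable"   # step S164: immovable flag IFi
    return "straight" if abs(vy) >= abs(vx) else "cross"
```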
After performing the process from step S160 to S169 twice, i.e., after the determination process is completed for the left and right glove type input articles 7L and 7R, the process returns to the routine of FIG. 20.
FIG. 23 is a flowchart showing an example of the process of updating the positions of the gloves of the player's boxer in step S122 of FIG. 20. As shown in FIG. 23, the high speed processor 91 repeats the process from step S180 to S190. When i=0, the process is performed for the glove 217L of the player's boxer, and when i=1, the process is performed for the glove 217R of the player's boxer.
The high speed processor 91 determines in step S181 whether or not the immovable flag IFi is turned on, and if turned on, the process proceeds to step S182 in which the position of the corresponding glove 217L or 217R is updated, otherwise proceeds to step S184. In this case, the positions of the gloves 217L and 217R are set to the current position of the left extraction point and the current position of the right extraction point as converted into the screen coordinates, respectively. However, the gloves 217L and 217R can be moved freely in the horizontal direction, but can be moved only limitedly in the vertical direction (for example, the centers of the gloves 217L and 217R can be moved only within the lower third of the screen). The gloves 217L and 217R show an exemplary image (indicative of a basic figure) when there is no input from the player 11. In step S183, the high speed processor 91 turns off the immovable flag IFi.
The high speed processor 91 determines whether or not the straight flag SFi is turned on in step S184, and if turned on the process proceeds to step S185 in which settings are made to animate a straight punch with the corresponding glove 217L or 217R (an image indicative of an input by the player 11, i.e., an image changing from the basic figure), otherwise proceeds to step S187. In step S186, the high speed processor 91 turns off the straight flag SFi.
In step S187, the high speed processor 91 determines whether or not the cross flag CFi is turned on, and if turned on the process proceeds to step S188 in which settings are made to animate a cross punch with the corresponding glove 217L or 217R (an image indicative of another input by the player 11, i.e., an image changing from the basic figure). In step S189, the high speed processor 91 turns off the cross flag CFi.
After performing the process from step S180 to S190 twice, i.e., after the updating process is completed for the left and right glove type input articles 7L and 7R, the process proceeds to step S191. In step S191, the high speed processor 91 determines whether or not the left glove 217L and the right glove 217R of the player's boxer are crossed, i.e., whether or not the relative positions thereof are switched between left and right, and if crossed the process proceeds to step S192. In step S192, the high speed processor 91 determines whether or not the left and right hands are continuously crossed for a predetermined time, and if this time elapses the process proceeds to step S193 in which settings are made to animate the glove 217L displayed at the right side and the glove 217R displayed at the left side so that they are switched with each other left to right.
FIG. 24 is a flowchart showing an example of the calorie consumption calculation process in step S8 of FIG. 14. As shown in FIG. 24, in step S200, the high speed processor 91 divides the accumulated value “Dm” (i.e., sum of displacement “Dm”) of the moving distances of the center point between the glove 217L and the glove 217R in the horizontal direction, which is obtained in step S123 of FIG. 20, by “256” to acquire the quotient “Um”. In this case, the fractional residue is discarded. Here, while horizontal 256 pixels×vertical 224 pixels are displayed in the screen of the television monitor 5, the divisor “256” corresponds to the number of pixels in the horizontal direction. The 256 pixels are treated as one unit of displacement.
In step S201, the high speed processor 91 multiplies the quotient “Um” and a unit motion calorie consumption “Cm” (for example, 157 calories) to acquire the product “Ef”. In this case, the one unit, i.e., the unit motion calorie consumption “Cm”, is the calorie consumption which is actually measured by having the player move the center point between the glove 217L and the glove 217R by 256 pixels in the horizontal direction. Accordingly, the product “Ef” is the calorie consumption on the basis of the motion of the glove type input articles 7L and 7R in the horizontal direction.
In step S202, the high speed processor 91 multiplies a unit punch calorie consumption “Cp” (for example, 120 calories) by the count value “Np” as obtained in step S168 of FIG. 22, i.e., the number of punches “Np”, to acquire the product “Es”. In this case, the unit punch calorie consumption “Cp” is the calorie consumption which is actually measured by having the player throw a punch. Thus, the product “Es” is the calorie consumption corresponding to the punches having been thrown. Incidentally, as understood from FIG. 22, the number of punches “Np” is counted with no distinction between straight and cross and between right and left.
In step S203, the high speed processor 91 acquires the calorie consumption “E(R)” of the current round (R+1) by adding the calorie consumption “Ef” of the horizontal motions of the glove type input articles 7L and 7R and the calorie consumption “Es” of the punches as thrown. The index R=0, 1, . . . , and the number (R+1) indicates the round number. In step S204, the high speed processor 91 clears the sum of displacement “Dm” and the number of punches “Np”.
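The calorie calculation of steps S200 to S203 can be sketched as follows (names follow the text; the default values of “Cm” and “Cp” are the example values quoted above):

```python
def round_calories(Dm, Np, Cm=157, Cp=120, unit_px=256):
    """Dm: accumulated horizontal displacement (pixels) of the center point
    between the gloves; Np: number of punches counted in the round."""
    Um = Dm // unit_px   # step S200: whole 256-pixel units, remainder discarded
    Ef = Um * Cm         # step S201: calories from horizontal motion
    Es = Np * Cp         # step S202: calories from the punches thrown
    return Ef + Es       # step S203: calorie consumption E(R) of the round
```

For example, with Dm=600 pixels and Np=3 punches, Um=2, so the round total is 2×157+3×120=674 calories.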
Incidentally, the above actual measurements have been performed with Japanese women aged 20 in advance, and the unit motion calorie consumption “Cm” and the unit punch calorie consumption “Cp” calculated from the actual measurements are implemented as parameters of the calculation. In accordance with one preferred example of implementation, the unit motion calorie consumption “Cm” and the unit punch calorie consumption “Cp” are corrected by taking account of the age, gender and weight of the player, which are entered by the player, in order to obtain a value closer to the actual calorie consumption. In any case, even if the calorie consumption as calculated includes some error, the amount of exercise by the player can be roughly recognized. In addition to this, since the daily relative increase and decrease in the calorie consumption is substantially accurate, displaying the calorie consumption is effective for encouraging the player to adhere to regular exercise and maintain health. Returning to FIG. 14, in step S10, the high speed processor 91 sums up the calorie consumptions “E(R)” as obtained in step S8 to calculate the total calorie consumption through the bout.
FIG. 25 is a view showing an exemplary screen in which the intermediate result is displayed on the basis of the processing result in step S9 of FIG. 14. As illustrated in FIG. 25, this screen contains a judgment result display area 500, a calorie consumption display area 502, an OK button 504 and the cursor 201. The judgments of the respective judges A to C of the current round (round 1 in the illustrated case) are displayed in the judgment result display area 500. In the figure, “Raz” is the name of the player's boxer.
Also, the calorie consumption (calculated in step S8) of the current round is displayed in the calorie consumption display area 502. Then, if the cursor 201 is located in the OK button 504 for a predetermined time, the process proceeds to the next round.
FIG. 26 is a view showing an exemplary screen in which the outcome of the fight is displayed on the basis of the processing result in step S11 of FIG. 14. As illustrated in FIG. 26, this screen contains a calorie consumption display area 506, a cancel button 510, an OK button 504, and the cursor 201. The total calorie consumption through the bout (as calculated in step S10) is displayed in the calorie consumption display area 506. In addition, the name of the winner is displayed (“Raz” in the figure).
Then, if the cursor 201 is located in the OK button 504 for a predetermined time, the calorie consumption being displayed is added to the accumulated value of the past calorie consumption. On the other hand, if the cursor 201 is located in the cancel button 510 for a predetermined time, the calorie consumption being displayed is not added to the accumulated value of the past calorie consumption.
FIG. 27 is a view showing an exemplary screen in which the total outcome is displayed after the outcome of the current fight is displayed in FIG. 26. As illustrated in FIG. 27, this screen contains a calorie consumption display area 506 and a total calorie consumption display area 508. The calorie consumption through the current bout (as calculated in step S10) is displayed in the calorie consumption display area 506. The accumulated value of the past calorie consumption is displayed in the total calorie consumption display area 508.
FIG. 28 is a view showing an exemplary screen in which comments are displayed after the total outcome is displayed in FIG. 27. As illustrated in FIG. 28, this screen contains a comment display area 514, a character 512, an OK button 504, and the cursor 201. Comments are displayed in the comment display area 514 in accordance with the outcome of the fight. Then, if the cursor 201 is located in the OK button 504 for a predetermined time, the process proceeds to step S5 of FIG. 14.
Incidentally, in accordance with the present invention as has been discussed above, the glove motion determination is performed on the basis of the position TPL3 as currently determined of the glove type input article 7L in the coordinates in which the past position TPL1 as determined twice before is located at the origin (FIGS. 13A to 13F).
In other words, the origin is always located at the position determined by tracing back twice from the position to be currently determined, and thereby the motion determination is based on the relative position of the glove type input article 7L. Because of this, even if there are disparities in the body height of the player 11 and in the distance between the imaging unit 51 and the player 11, it is possible to display a consistent glove image. This is true also for the glove type input article 7R.
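The relative determination described above can be sketched as follows. The circular immovable area and the direction rule are illustrative assumptions; the actual area boundaries are those of FIGS. 13A to 13F, which are not reproduced here.

```python
# Hedged sketch of relative-coordinate motion determination. The origin is
# the position TPL1 determined two frames before; TPL3 is the current one.

IMMOVABLE_RADIUS = 10  # assumed threshold, in pixels of the differential image

def classify_motion(tpl1, tpl3):
    """Classify glove motion from the displacement of TPL3 relative to TPL1."""
    dx = tpl3[0] - tpl1[0]
    dy = tpl3[1] - tpl1[1]
    # Small displacement around the origin: no punch is determined.
    if dx * dx + dy * dy <= IMMOVABLE_RADIUS ** 2:
        return "immovable"
    # Assumed illustrative rule: mainly vertical displacement reads as a
    # straight punch, mainly horizontal displacement as a cross punch.
    return "straight" if abs(dy) >= abs(dx) else "cross"
```

Because only the displacement enters the classification, a short player and a tall player performing the same motion fall into the same area regardless of where the glove sits in the differential image.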
In order to facilitate understanding of this feature, a motion determination process which is performed on the basis of the absolute position of the glove type input article 7L in the differential image will be considered. In this case, the differential image corresponds to the virtual screen. For example, when comparing a short player and a tall player playing with the glove type input article 7L in the same posture, needless to say, there is a difference between the positions of the glove type input article 7L gripped by the short and tall players in the differential image.
Accordingly, even if the short and tall players perform a similar action, the area where the glove type input article 7L of one is located may be different from the area where the glove type input article 7L of the other is located.
For example, while the glove type input article 7L is located in the straight area of the virtual screen when a tall player such as an adult throws a straight punch, the glove type input article 7L may be located in the immovable area of the virtual screen when a short player such as a child throws a straight punch. In such a case, although a similar action is taken, the glove image as displayed differs between the tall player and the short player. This shortcoming results also from the disparity in the distance between the imaging unit 51 and the player. It is not desirable that, in spite of a similar action, a different glove image is displayed depending upon the body height of the player or the distance between the imaging unit 51 and the player. This is true also for the glove type input article 7R. In accordance with the present embodiment, this shortcoming can be avoided.
Also, in the case of the present embodiment, there are the two virtual screens respectively for the two glove type input articles 7L and 7R, and the “straight area”, the “cross area” and the “immovable area” are defined for each of the glove type input articles 7L and 7R. Accordingly, a variety of glove images can be displayed respectively for the glove type input articles 7L and 7R in response to motions.
In order to facilitate understanding of this feature, it is assumed that only one virtual screen is provided for the two glove type input articles 7L and 7R. In such a case, a punch thrown with the glove type input article 7L is either a straight punch or a left cross punch (a punch toward the right), and a punch thrown with the glove type input article 7R is either a straight punch or a right cross punch (a punch toward the left).
Accordingly, the glove type input article 7L when throwing a straight punch and the glove type input article 7R when throwing a right cross punch can be located in the same area. Needless to say, the opposite is true. In such a case, in spite of the different types of motions for left and right, the glove image corresponding to the glove type input article 7L and the glove image corresponding to the glove type input article 7R become similar, so that the glove image as displayed may not correspond to the actual motion by the player 11. For example, in the case where the glove type input article 7L when throwing a straight punch and the glove type input article 7R when throwing a right cross punch are located in the same “straight area” of the virtual screen, the same animation of a straight punch is displayed, and therefore it is not appropriate as the glove image corresponding to the glove type input article 7R.
In this situation, eventually, glove images must be provided with no distinction between the types of punches with the glove type input articles 7L and 7R. Accordingly, it is meaningless to define the “straight area” and the “cross area” distinctively. In other words, the respective motions of the glove type input articles 7L and 7R cannot be reflected in the glove images. In this regard, in accordance with the present embodiment, it is possible to display a variety of glove images (the animations of a straight punch and a cross punch) reflecting the motions of the glove type input articles 7L and 7R respectively.
Furthermore, in accordance with the present embodiment, when the current position TPL3 of the glove type input article 7L is located in the “immovable area” (refer to FIGS. 13A to 13F), the glove 217L is moved in the screen in synchronization with the glove type input article 7L (refer to FIG. 10). This is true also for the glove 217R. Accordingly, the player 11 can avoid and defend himself against a punch of the CPU boxer 215 by moving the glove type input articles 7L and 7R.
Furthermore, in accordance with the present embodiment, it is possible to display glove images reflecting the intention of the player 11. This point will be explained in detail. In accordance with the present embodiment, the glove image is displayed depending upon the area in which the current position TPL3 is located in the coordinates in which the position TPL1 of the glove type input article 7L as determined twice before is located at the origin. In this case, if the current position TPL3 is located in the “immovable area” including the origin, the image as displayed is indicative of the posture in which no punch is thrown (refer to the glove 217L of FIG. 10). Accordingly, when the motion of the glove type input article 7L is small, the current position TPL3 is often located in the “immovable area”, and thereby it is avoided as much as possible that a small motion of the player 11 which is not intended as a punch is determined to be a punch. This is true also for the glove type input article 7R.
Furthermore, the position TPL1 is used as the origin of the coordinates in which the glove motion determination is performed. Particularly, in this case, the position of the glove type input article 7L traced back twice from the current position TPL3 is used as the position TPL1. Because of this, in comparison with the case where the past position TPL2 as determined once before is located at the origin, the displacement of the glove type input article 7L over a longer period can be used for determining the motion, and thereby, when the glove type input article 7L is continuously moved, appropriate motion determination is possible along the motion thereof. Also, it is possible to enhance the difference between a small motion and a large motion of the glove type input article 7L. This is true also for the glove type input article 7R.
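The bookkeeping needed to trace the origin back twice can be sketched with a three-deep position history. The names and structure below are illustrative; only the "two determinations back" relationship comes from the text.

```python
from collections import deque

# Keep the last three determined positions; the oldest one (TPL1) serves as
# the origin for judging the current position (TPL3).
history = deque(maxlen=3)

def displacement_for_determination(new_pos):
    """Append the newly determined position and, once three positions are
    available, return the displacement of TPL3 relative to TPL1."""
    history.append(new_pos)
    if len(history) < 3:
        return None  # not enough history yet to trace back twice
    tpl1, _tpl2, tpl3 = history
    return (tpl3[0] - tpl1[0], tpl3[1] - tpl1[1])
```

Using the position two determinations back, rather than one, roughly doubles the displacement that a continuous motion accumulates before classification, which is why small and large motions separate more cleanly.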
Furthermore, since the current positions of the glove type input articles 7L and 7R are determined on the basis of the currently predicted positions TPLp and TPRp of the glove type input articles 7L and 7R (refer to FIG. 12), even when the player 11 moves such that the glove type input article 7L and the glove type input article 7R are crossed to switch the relative positions thereof between left and right, the positions thereof can be determined correctly as much as possible (that is, left and right can be distinguished from each other).
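One way this left/right disambiguation could work is sketched below. The source does not give the prediction formula for TPLp and TPRp in this passage, so a simple linear extrapolation is assumed, and the pairing rule (nearest predicted position) is likewise an illustrative assumption.

```python
# Hedged sketch: assign the two extracted points to the left and right
# gloves by nearest predicted position, so crossed gloves stay distinguished.

def predict(prev, curr):
    """Assumed linear prediction: next position ~ current + (current - previous)."""
    return (2 * curr[0] - prev[0], 2 * curr[1] - prev[1])

def assign_left_right(p1, p2, pred_left, pred_right):
    """Return (left, right) by choosing the pairing with the smaller total
    squared distance to the predicted positions TPLp and TPRp."""
    def d2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    if d2(p1, pred_left) + d2(p2, pred_right) <= d2(p2, pred_left) + d2(p1, pred_right):
        return p1, p2
    return p2, p1
```

Because the pairing follows the predictions rather than the raw left-to-right order in the image, the labels survive the moment the two input articles cross.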
Furthermore, in accordance with the present embodiment, since two points are extracted (i.e., the coordinates of the first and second extraction points are determined) on the assumption that both the glove type input articles 7L and 7R are imaged, it is possible to simplify the calculation for extracting the two points (refer to FIG. 11). This point will be explained in detail. If it is not assumed that both the two glove type input articles 7L and 7R are imaged, one shape or two shapes must be detected in the differential image. This is because it is possible both that both the two glove type input articles 7L and 7R are imaged and that only one article is imaged. Furthermore, it is required to calculate the center coordinates of the one shape or two shapes as detected. Particularly, in the case where two shapes are located close to each other, it is difficult to determine which glove type input article or articles are imaged, and thereby the calculation of the center coordinates becomes quite difficult. In accordance with the present embodiment, since it is not necessary to perform the detection of the respective shapes and the calculation of the center coordinates, the above difficulties do not arise and the calculation amount is small.
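The two-point extraction could be sketched as below. It is an assumption of this sketch, not a statement of the source, that each extraction point is simply the brightest remaining pixel, with a small neighborhood of the first point masked before the second search; the point is only that no shape detection or center-of-gravity calculation is involved.

```python
# Hedged sketch of two-point extraction from a differential image, assuming
# both glove type input articles are always imaged. diff is a 2-D list of
# luminance values indexed as diff[y][x].

def extract_two_points(diff, mask_radius=5):
    h, w = len(diff), len(diff[0])

    def brightest(excluded):
        best, best_v = None, -1
        for y in range(h):
            for x in range(w):
                # Skip the masked neighborhood around the first point, if any.
                if excluded and abs(x - excluded[0]) <= mask_radius \
                        and abs(y - excluded[1]) <= mask_radius:
                    continue
                if diff[y][x] > best_v:
                    best, best_v = (x, y), diff[y][x]
        return best

    p1 = brightest(None)   # first extraction point
    p2 = brightest(p1)     # second extraction point, outside the mask
    return p1, p2
```

Two fixed scans over the image replace the variable-cost work of deciding whether one or two shapes are present and computing their centers.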
Furthermore, in accordance with the present embodiment, when the cursor coordinates are located in the area 211 or 213 including the button 203U, 203D or 207, the cursor 201 is moved to the center position of the button 203U, 203D or 207 irrespective of the positions of the glove type input articles 7L and 7R, so that the player 11 can easily move the cursor 201 to the button 203U, 203D or 207 simply by bringing the cursor 201 close to the button 203U, 203D or 207. In other words, when the cursor 201 is located close to the button 203U, 203D or 207, it is predicted that the player 11 intends to move the cursor 201 to the button 203U, 203D or 207, and thereby the cursor 201 is automatically moved to the button 203U, 203D or 207 for the purpose of lessening the operational burden of the player 11. In addition to this, since the elapsed time after the cursor 201 reaches the button 203U, 203D or 207 and the remaining time until a predetermined time elapses are displayed in the indicator 202U, 202D or 209, the player 11 can easily know the remaining time until the predetermined time at which the selection or determination is fixed, and thereby the user-friendliness for the player 11 can be improved (refer to FIGS. 8A and 8B, and FIG. 9).
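The snapping and dwell-time behaviour can be sketched as follows. The rectangular snap areas, the frame budget and the names are illustrative assumptions; only the snap-to-center and dwell-until-fixed behaviour comes from the text.

```python
# Hedged sketch of cursor snapping and the dwell-time indicator.

DWELL_FRAMES = 60  # assumed "predetermined time", expressed in frames

def snap_cursor(cursor, buttons):
    """buttons: list of (center, half_w, half_h) snap areas. If the cursor
    falls inside an area, it is moved to that button's center."""
    for center, hw, hh in buttons:
        if abs(cursor[0] - center[0]) <= hw and abs(cursor[1] - center[1]) <= hh:
            return center
    return cursor

def dwell_progress(frames_on_button):
    """Fraction of the indicator filled; 1.0 means the selection is fixed."""
    return min(frames_on_button / DWELL_FRAMES, 1.0)
```

Snapping removes the need for pixel-accurate aiming with the gloves, and the progress value drives the indicator so the player always sees how long remains before the selection is fixed.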
Furthermore, in accordance with the present embodiment, since the adapter 1 into which the cartridge 3 is inserted is placed on the floor face for playing the boxing game, the displacements of the glove type input articles 7L and 7R in the differential image tend to be enlarged to reflect the motion of the player 11 appropriately. Meanwhile, even if the player 11 performs the same motion, when the adapter 1 with the cartridge 3 inserted thereinto is placed on the top surface of the television monitor 5, the glove type input articles 7L and 7R are moved in the differential image by smaller amounts than those when the adapter 1 is placed on the floor face.
Furthermore, in accordance with the preferred embodiment, the energy consumption of the player 11 can be easily calculated by the use of the result of stroboscopic imaging. In this case, it is possible to improve the accuracy of calculating the energy consumption because the number of punches “Np” and the sum of displacement “Dm” are taken into consideration.
Embodiment 2
The hardware of the boxing game system of the embodiment 1 is used also as the hardware of the boxing game system of the embodiment 2. This boxing game system can perform an exercise process (mode) A, an exercise process (mode) B, an exercise process (mode) C and an exercise process (mode) D in addition to the boxing game process as has been discussed above in the description of the embodiment 1 with reference to FIG. 14. These processes will be explained in turn.
FIG. 29 is a view showing an example of an exercise screen displayed on the basis of the exercise process A performed by the boxing game system in accordance with the embodiment 2 of the present invention. As illustrated in FIG. 29, the high speed processor 91 displays ball objects 521A and 521B one after another on the television monitor 5 so that each object flies from the back side toward the front side. Also, the high speed processor 91 displays the glove 217L which moves in response to the motion of the glove type input article 7L and the glove 217R which moves in response to the motion of the glove type input article 7R on the television monitor 5.
The high speed processor 91 determines whether or not the ball object 521A is located within a predetermined range from the center position of the glove 217L or a predetermined range from the center position of the glove 217R, and if it is located within the predetermined range, the high speed processor 91 judges that the glove hits the ball object 521A. The high speed processor 91 counts the number of hits and moves (hits back) the ball object 521A that is hit in the backward direction. Incidentally, the motion control process of the gloves 217L and 217R and the process of calculating the calorie consumption are performed in the same manner as those of the embodiment 1.
The player makes efforts to hit back the ball object 521A as many times as possible with the glove 217L or 217R by swinging the glove type input articles 7L and 7R. Meanwhile, if the ball object 521B is located within a predetermined range from the center position of the glove 217L or a predetermined range from the center position of the glove 217R, the ball object 521B is hit back; however, it is not judged as a hit, so that the number of hits is not increased. In addition, the number of hits is displayed in the top left corner of the screen on a real-time basis, and the number of times that the ball object 521A appears is displayed in the top right corner of the screen on a real-time basis.
FIG. 30 is a view showing an example of an exercise screen displayed on the basis of the exercise process B performed by the boxing game system in accordance with the embodiment 2 of the present invention. As illustrated in FIG. 30, the high speed processor 91 displays a sandbag object 520 on the television monitor 5. Also, the high speed processor 91 displays the glove 217L which moves in response to the motion of the glove type input article 7L and the glove 217R which moves in response to the motion of the glove type input article 7R on the television monitor 5. Incidentally, the motion control process of the gloves 217L and 217R and the process of calculating the calorie consumption are performed in the same manner as those of the embodiment 1.
The high speed processor 91 counts the number of punches which are thrown in a predetermined time (refer to step S168 of FIG. 22). The player makes efforts to throw as many punches as possible with the glove 217L or 217R by swinging the glove type input articles 7L and 7R. Also, the number of punches is displayed in the top left corner of the screen on a real-time basis, and the elapsed time is displayed in the top right corner of the screen on a real-time basis.
FIG. 31 is a view showing an example of an exercise screen displayed on the basis of the exercise process C performed by the boxing game system in accordance with the embodiment 2 of the present invention. FIG. 32 is a view showing another example of the exercise screen of FIG. 31. As illustrated in FIG. 31, the high speed processor 91 displays panel objects P11, P12, P13, P21, P23, P31, P32 and P33, an opaque glove set 522, an instruction object 524, a guide 526 and a guide 528 on the television monitor 5.
The guide 526 contains eight rectangular figures corresponding respectively to the eight panel objects. The display position of the instruction object 524 is suggested by coloring (hatching in the exemplary illustration) some of the rectangular figures with a predetermined color. The arrow-shaped guide 528 shows the order of displaying the instruction object 524. The player can learn the next position of the instruction object 524 by referring to the guides 526 and 528.
If the two areas 251 and 253 having higher luminance values as illustrated in FIG. 11 cannot be separately detected, but only one area having a higher luminance value is detected, then the high speed processor 91 displays the opaque glove set 522. On the other hand, if the two areas 251 and 253 having higher luminance values can be separately detected as illustrated in FIG. 11, then the high speed processor 91 displays a semi-transparent glove 530 as shown in FIG. 32. Accordingly, the player controls the location of the opaque glove set 522 by moving the glove type input articles 7L and 7R while keeping them in contact with each other.
The high speed processor 91 displays the instruction object 524 overlapping a panel object in accordance with a program. If the glove set 522 is moved to the position of the instruction object 524 in response to the motion of the player, the high speed processor 91 increments a counter by one and displays the instruction object 524 to overlap the panel object located in the position as suggested by the guide 526. The high speed processor 91 repeats such a process for a predetermined time.
The player moves the glove type input articles 7L and 7R in order to successively place the opaque glove set 522 over the instruction object 524, which successively moves from one position to another. In addition, the number of times that the glove set 522 overlaps the instruction object 524 is displayed in the top left corner of the screen on a real-time basis, and the elapsed time is displayed in the top right corner of the screen on a real-time basis.
In this exercise process C, the glove set 522 is displayed in the position corresponding to the center point between the first extraction point and the second extraction point. Also, the calorie consumption is calculated on the basis of the accumulated amount of displacement of the glove set 522 in the horizontal direction and in the vertical direction. In this case, for each of the horizontal direction and the vertical direction, the process in step S123 of FIG. 20 and the process shown in FIG. 24 are performed, and the calorie consumption corresponding to the accumulated amount of displacement in the horizontal direction and the calorie consumption corresponding to the accumulated amount of displacement in the vertical direction are added.
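The per-axis accumulation can be sketched as follows. The conversion constant is an illustrative placeholder; the source's actual conversion is the process of step S123 of FIG. 20 and FIG. 24, which is not reproduced here.

```python
# Hedged sketch for exercise process C: the glove set position is the center
# point between the two extraction points, and calories are computed from the
# horizontal and vertical displacement accumulations separately, then added.

def glove_set_position(p1, p2):
    """Center point between the first and second extraction points."""
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)

def mode_c_calories(positions, cm=0.001):
    """positions: successive glove set positions. cm is an illustrative
    unit-calorie constant applied to each axis's accumulated displacement."""
    dx_sum = sum(abs(b[0] - a[0]) for a, b in zip(positions, positions[1:]))
    dy_sum = sum(abs(b[1] - a[1]) for a, b in zip(positions, positions[1:]))
    return cm * dx_sum + cm * dy_sum
```

Accumulating each axis separately means purely horizontal and purely vertical motions both contribute, matching the description of adding the two per-axis calorie amounts.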
FIG. 33 is a view showing an example of an exercise screen displayed on the basis of the exercise process D performed by the boxing game system in accordance with the embodiment 2 of the present invention. As illustrated in FIG. 33, the high speed processor 91 displays guides 534, 536, 538, 540 and 542, a target object 532 and the gloves 217L and 217R on the television monitor 5. Incidentally, the motion control process of the gloves 217L and 217R and the process of calculating the calorie consumption are performed in the same manner as those of the embodiment 1.
These guides instruct the player to throw a punch. In the example as illustrated, the guides 534, 536 and 540 instruct the player to throw a left straight punch, the guide 538 instructs the player to throw a right straight punch, and the guide 542 instructs the player to throw a right cross punch. Also, there is a guide prepared to instruct the player to throw a left cross punch.
A timing object 544 is displayed so as to surround a guide. Furthermore, an indicator 546 which grows along the edge of the timing object 544 as time passes is displayed. The player has to perform the action as suggested by the guide surrounded with the timing object 544 within the time indicated by the indicator 546 (before the indicator 546 has grown all around the guide).
The high speed processor 91 counts the number of times that the player performs the action as suggested by the guide surrounded with the timing object 544 within the time indicated by the indicator 546. Also, every time the leading end of the indicator 546 goes around the last guide in the screen (at the rightmost position) when the timing object 544 moves to this last guide, the high speed processor 91 switches the display of the guides, and then the timing object 544 moves again from the first guide in the screen (at the leftmost position). In addition, the above number of times is displayed in the top left corner of the screen on a real-time basis, and the number of guides which have been displayed after starting this play is displayed in the top right corner of the screen on a real-time basis.
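The pass/fail judgment for each guide can be sketched as follows; the frame budget and the punch type labels are illustrative assumptions.

```python
# Hedged sketch for exercise process D: a guide counts as performed only if
# the suggested punch type arrives before the indicator finishes growing
# around the timing object.

INDICATOR_FRAMES = 90  # assumed time for the indicator to grow all the way round

def judge_guide(punch_type, suggested_type, frames_elapsed):
    """Return True only when the right punch lands within the time limit."""
    return punch_type == suggested_type and frames_elapsed < INDICATOR_FRAMES
```

A counter of True results then gives the number displayed in the top left corner of the screen.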
FIG. 34 is a schematic diagram showing the process transition among the routines performed by the boxing game system in accordance with the embodiment 2 of the present invention. As illustrated in FIG. 34, the high speed processor 91 displays a title (for example, “power boxing”) on the television monitor 5 in step S500. In step S501, the high speed processor 91 displays a save slot selection screen on the television monitor 5, and performs the process of selecting a save slot.
FIG. 35 is a view showing an example of the save slot selection screen displayed in step S501 of FIG. 34. As illustrated in FIG. 35, the high speed processor 91 displays the save slot 552, a selected slot indication area 554, screen changing objects 550L and 550R, a cancel button 510, an OK button 504 and the cursor 201 on the television monitor 5.
The save slot 552 of this example includes an upper section in which are displayed the current class and the current stage of the user in a championship mode, and an intermediate section in which the levels that are passed in the respective exercise modes A through D are displayed with star marks. Each of the exercise modes A through D is provided with 10 levels, and the player can start exercise from any level. Also, the save slot 552 includes a lower section in which the total calorie consumption is displayed. This total calorie consumption indicates the sum of all the calories consumed in the championship mode and the exercise mode.
There are four instances of the save slot 552 (i.e., for four users) having different colors from each other. For example, the colors may be red, blue, yellow and green. The user can easily identify his own save slot 552 by the color.
The selected slot indication area 554 shows which instance of the save slot 552 is currently selected (displayed). Accordingly, when the player moves the cursor 201 to the screen changing object 550L or 550R by moving the glove type input articles 7L and 7R, another instance of the save slot 552 is displayed. The four instances of the save slot 552 are cyclically displayed by this operation. If the cursor 201 is located in the cancel button 510 for a predetermined time, the process proceeds to step S500, and if the cursor 201 is located in the OK button 504 for a predetermined time, the process proceeds to step S502.
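The cyclic switching among the four save slots can be sketched as below; the colour list follows the example in the description, while the function name and index convention are illustrative.

```python
# Hedged sketch of the cyclic save-slot display: four colour-coded slots are
# cycled through with the left/right screen changing objects.

SLOT_COLORS = ["red", "blue", "yellow", "green"]  # example colours from the text

def change_slot(current, direction):
    """direction: -1 for the left object (550L), +1 for the right (550R).
    The modulo wraps around so the four slots are displayed cyclically."""
    return (current + direction) % len(SLOT_COLORS)
```

Wrapping with the modulo operator is what makes the display cyclic: stepping right from the last slot returns to the first, and stepping left from the first returns to the last.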
Returning to FIG. 34, in step S502, the high speed processor 91 displays a play mode selection screen on the television monitor 5, and performs the process of selecting a play mode. This process is provided for selecting one of the championship mode, the exercise mode and a data view mode. When the player selects the championship mode, the high speed processor 91 proceeds to step S503; when the player selects the exercise mode, the high speed processor 91 proceeds to step S513; and when the player selects the data view mode, the high speed processor 91 proceeds to step S518. Incidentally, the player can select a mode by manipulating the cursor 201.
In step S503, the high speed processor 91 displays the mode selection screen and starts the process of selecting a mode. This process is provided for selecting one of a fighting mode and a training mode in the championship mode. When the player selects the fighting mode, the high speed processor 91 proceeds to step S504; and when the player selects the training mode, the high speed processor 91 proceeds to step S509. Incidentally, the player can select a mode by manipulating the cursor 201. The process of FIG. 14 is performed also in the fighting mode.
In step S504, the high speed processor 91 displays an opponent selection screen and starts the process of selecting an opponent. Incidentally, the player can select an opponent by manipulating the cursor 201. In step S505, the high speed processor 91 displays a class/stage selection screen, and starts the process of selecting a class and a stage. Incidentally, the player can select a class and a stage by manipulating the cursor 201.
In step S506, the high speed processor 91 performs the fighting process between the CPU boxer and the player's boxer. In step S507, the high speed processor 91 displays the outcome display screen on the television monitor 5 (refer to FIG. 26 and FIG. 27). In step S508, the high speed processor 91 displays a comment screen in accordance with the outcome of the fight (refer to FIG. 28).
On the other hand, in step S509, the high speed processor 91 displays the training selection screen, and starts the process of selecting a training mode. There are four training modes A to D, from which the player can select one training mode by manipulating the cursor 201. The training modes A to D correspond respectively to the exercise modes A to D, and the processes thereof also correspond respectively to those of the exercise modes A to D as explained above (refer to FIG. 29 to FIG. 33 respectively). However, the player cannot select an arbitrary level in the training mode, but has to pass the respective levels in order.
In step S510, the high speed processor 91 performs the training mode as selected. In step S511, the high speed processor 91 displays the outcome display screen on the television monitor 5 (in the same manner as illustrated in FIG. 26 and FIG. 27). In step S512, the high speed processor 91 displays a comment screen (the screen as shown in FIG. 28) in accordance with the outcome in the training mode.
On the other hand, the high speed processor 91 displays an exercise selection screen in step S513, and starts the selection process in the exercise mode. There are the four exercise modes A to D (refer to FIG. 29 to FIG. 33), from which the player can select one exercise mode by manipulating the cursor 201.
FIG. 36 is a view showing an example of the exercise selection screen displayed in step S513 of FIG. 34. As illustrated in FIG. 36, the high speed processor 91 displays a selected exercise indication area 555, a passed level indication area 560, screen changing objects 550L and 550R, a cancel button 510, an OK button 504 and the cursor 201 on the television monitor 5.
The selected exercise indication area 555 shows which of the exercise modes is currently selected (displayed). More specifically, the selected exercise indication area 555 comprises four rectangular objects which are arranged in the horizontal direction. Each rectangular object is provided with an exercise name corresponding thereto. The rectangular object corresponding to the exercise mode currently selected is indicated by coloring (hatching in the exemplary illustration) it with a predetermined color. In addition, the name of the exercise mode currently selected (“Punch the red balls” in the illustrated example) is displayed in the center of the screen.
In the passed level indication area 560, the levels passed in the exercise mode currently selected are indicated by star marks. Each of the exercise modes A through D is provided with 10 levels, and the player can start exercise from any level.
When the player moves the cursor 201 to the screen changing object 550L or 550R by moving the glove type input articles 7L and 7R, the name of another exercise mode and the passed level indication area 560 thereof are displayed, and the rectangular object of the selected exercise indication area 555 corresponding to the exercise mode as selected is colored with the predetermined color. If the cursor 201 is located in the cancel button 510 for a predetermined time, the process proceeds to step S502, and if the cursor 201 is located in the OK button 504 for a predetermined time, the process proceeds to step S514.
Returning to FIG. 34, in step S514, the high speed processor 91 displays a level selection screen, and performs the process of selecting a level. The player can select a level by manipulating the cursor 201.
FIG. 37 is a view showing an example of the level selection screen displayed in step S514 of FIG. 34. As illustrated in FIG. 37, the high speed processor 91 displays a level display area 561, screen changing objects 550L and 550R, a cancel button 510, an OK button 504 and the cursor 201 on the television monitor 5.
The level display area 561 is used to display the level as currently selected together with the requirements for passing the level and the predicted calorie consumption. When the player moves the cursor 201 to the screen changing object 550L or 550R by moving the glove type input articles 7L and 7R, the level display area 561 of another level is displayed. If the cursor 201 is located in the cancel button 510 for a predetermined time, the process proceeds to step S513, and if the cursor 201 is located in the OK button 504 for a predetermined time, the process proceeds to step S515.
Returning to FIG. 34, in step S515, the high speed processor 91 performs the process in the exercise mode as selected. In step S516, the high speed processor 91 displays the outcome display screens on the television monitor 5 (the screens as shown in FIG. 26 and FIG. 27). In step S517, the high speed processor 91 displays the comment screen (the screen as shown in FIG. 28) in accordance with the outcome in the exercise mode.
On the other hand, the high speed processor 91 displays the contents of saved data on the television monitor 5 in step S518.
FIG. 38 is a view showing an example of the contents of saved data displayed in step S518 of FIG. 34. As illustrated in FIG. 38, the high speed processor 91 displays a first display area 572, a second display area 574, a third display area 576, screen changing objects 550L and 550R, an exit button 562, a data clear button 564 and the cursor 201 on the television monitor 5.
This screen displays the contents of saved data in the championship mode. The first display area 572 is used to display a class, a stage, and wins and losses. The second display area 574 is used to display the skills of the player's boxer by star marks. The skills include four properties: power, speed, stamina and guard. The skill of speed is enhanced by passing the training mode corresponding to the exercise mode A; the skill of power is enhanced by passing the training mode corresponding to the exercise mode B; the skill of guard is enhanced by passing the training mode corresponding to the exercise mode C; and the skill of stamina is enhanced by passing the training mode corresponding to the exercise mode D. The third display area 576 is used to display the number of levels which are passed in the training mode and the number of levels which are failed in the training mode.
If the cursor 201 is located in the exit button 562 for a predetermined time, the process proceeds to step S502, and if the cursor 201 is located in the data clear button 564 for a predetermined time, the data displayed in the first display area 572, the second display area 574 and the third display area 576 is erased from an EEPROM. This EEPROM is not shown in the figures, but is incorporated within the cartridge 3. If the cursor 201 is located in the screen changing object 550L or 550R for a predetermined time, the screen is switched to the view shown in FIG. 39 or FIG. 40.
FIG. 39 is a view showing another example of the contents of saved data displayed in step S518 of FIG. 34. As illustrated in FIG. 39, the high speed processor 91 displays a passed level display area 570, screen changing objects 550L and 550R, an exit button 562, a data clear button 564 and the cursor 201 on the television monitor 5.
The passed level display area 570 is used to display the levels passed in the respective exercise modes A to D by star marks. If the cursor 201 is located in the exit button 562 for a predetermined time, the process proceeds to step S502, and if the cursor 201 is located in the data clear button 564 for a predetermined time, the data displayed in the passed level display area 570 is erased from the above EEPROM. If the cursor 201 is located in the screen changing objects 550L and 550R for a predetermined time, the screen is switched to the view shown in FIG. 38 or FIG. 40.
FIG. 40 is a view showing a further example of the contents of saved data displayed in step S518 of FIG. 34. As illustrated in FIG. 40, the high speed processor 91 displays a calorie consumption display area 566, a punch number display area 568, screen changing objects 550L and 550R, an exit button 562, a data clear button 564 and the cursor 201 on the television monitor 5.
The calorie consumption display area 566 is used to display the total calorie consumption as accumulated in the championship mode, the total calorie consumption as accumulated in the exercise mode, and the sum thereof. The punch number display area 568 is used to display the total number of punches as accumulated in the championship mode and the exercise mode.
If the cursor 201 is located in the exit button 562 for a predetermined time, the process proceeds to step S502, and if the cursor 201 is located in the data clear button 564 for a predetermined time, the data displayed in the calorie consumption display area 566 and the punch number display area 568 is erased from the EEPROM. If the cursor 201 is located in the screen changing objects 550L and 550R for a predetermined time, the screen is switched to the view shown in FIG. 38 or FIG. 39.
As has been discussed above, the present embodiment provides not only boxing bouts but also a variety of exercise modes. Accordingly, the player can enjoy not only a game but also exercise. In addition, since the calorie consumption is displayed in both the championship mode and the exercise mode, the player can quantitatively know how many calories are consumed.
Also, the above EEPROM stores the data as shown in the screens of FIG. 38 to FIG. 40. This data can be cleared if necessary, and therefore the user can save the data and review the course of the outcomes accumulated up to each time the data is cleared.
Meanwhile, the present invention is not limited to the above embodiments, and a variety of variations and modifications may be effected without departing from the spirit and scope thereof, as described in the following exemplary modifications.
(1) While a cartridge type is employed in the above description, it is possible to implement the respective functions of the cartridge 3 within the adapter 1 without the use of a cartridge.
(2) In the above description, the glove motion determination process is performed by locating the past position which is determined twice before (FIGS. 13A to 13F) at the origin. However, the number of times that the past position is traced back is not limited to this, but can be set to three or more times if appropriate through a trial and error process. In addition, as shown in FIG. 14, one cycle of the process is completed before an interrupt is issued by the next video system synchronous signal. In other words, one cycle of the process is completed within one video frame. However, it is possible to complete one cycle of the process in N video frames (N being an integer of two or more), such as two video frames. For example, if one cycle of the process is completed in two video frames, the positions of the glove type input articles 7L and 7R are calculated once per two video frames.
(3) In the above right/left determination process, as shown in FIG. 12, the predicted positions TPLp and TPRp of the glove type input articles 7L and 7R are calculated only on the basis of the velocity vectors VL and VR, which are calculated from the positions TPL1 and TPR1 determined twice before and the previous positions TPL2 and TPR2. However, it is possible to calculate the predicted positions TPLp and TPRp by also using the positions TPL0 and TPR0, which are determined before the positions TPL1 and TPR1. The positions TPL0, TPL1 and TPL2 are considered first (left prediction). A velocity vector VL0 is calculated such that the position TPL0 is its start point and the position TPL1 is its end point, and a velocity vector VL1 is calculated such that the position TPL1 is its start point and the position TPL2 is its end point. A predicted vector VLp is determined such that the angle between the velocity vector VL1 and the predicted vector VLp is equal to the angle between the velocity vectors VL0 and VL1. Furthermore, the magnitude of the velocity vector VL1 is multiplied by the ratio "r", which is calculated as r = (magnitude of the velocity vector VL1)/(magnitude of the velocity vector VL0), and the magnitude of the predicted vector VLp is set to the result of this multiplication. Then, the start point of the predicted vector VLp is set to the end point of the velocity vector VL1, and the predicted position TPLp is set to the end point of the predicted vector VLp. The right prediction is performed in the same manner. By this process, the predicted position can be calculated with a high degree of accuracy.
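The left prediction described above can be sketched in Python as follows. This is an illustrative sketch, not the actual implementation: positions are treated as two-dimensional coordinates, and the function name is an assumption. The predicted vector continues the turn between the two preceding velocity vectors and scales the magnitude by the ratio r, as in the text.

```python
import math

def predict_position(p0, p1, p2):
    """Predict the next position from three past positions (oldest first).

    VL0 runs from p0 to p1, VL1 from p1 to p2.  The predicted vector VLp
    makes the same angle with VL1 as VL1 makes with VL0, and its magnitude
    is |VL1| * r where r = |VL1| / |VL0|.  The predicted position is the
    end point of VLp starting at p2."""
    v0 = (p1[0] - p0[0], p1[1] - p0[1])
    v1 = (p2[0] - p1[0], p2[1] - p1[1])
    # Signed turning angle from v0 to v1.
    ang = math.atan2(v1[1], v1[0]) - math.atan2(v0[1], v0[0])
    m0 = math.hypot(*v0)
    m1 = math.hypot(*v1)
    r = m1 / m0 if m0 else 1.0  # guard against a stationary first step
    # Rotate v1 by the same angle and scale its magnitude by r.
    cos_a, sin_a = math.cos(ang), math.sin(ang)
    vp = (r * (v1[0] * cos_a - v1[1] * sin_a),
          r * (v1[0] * sin_a + v1[1] * cos_a))
    return (p2[0] + vp[0], p2[1] + vp[1])
```

For uniform straight-line motion the prediction simply continues the line, and for uniformly accelerating motion the step length keeps growing by the same ratio, which is the intended behavior of the three-point scheme.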
(4) In addition to the configuration of the above embodiment, it is possible to implement, within the respective glove type input articles 7L and 7R, an acceleration sensor circuit, an infrared light emitting diode, a microcomputer and so forth as described in Japanese Patent Published Application No. Hei 2004-49436. The microcomputer controls the acceleration sensor circuit and receives acceleration information therefrom. Then, the microcomputer drives the infrared light emitting diode in order to transmit the acceleration information of the glove type input articles 7L and 7R to the adapter 1 by infrared communication. Accordingly, the high speed processor 91 makes use of the acceleration information to determine whether or not a punch is thrown by moving the glove type input articles 7L and 7R, and makes use of the result of imaging by the imaging unit 51 to determine motions for avoiding or protecting against the punch of the opposing boxer. With this configuration, the adapter 1 with the cartridge 3 inserted thereinto can be placed on the television monitor 5 without causing any problem in playing the game.
(5) In the above description, the elapsed time and the remaining time are not represented by numerals but by variation in color as illustrated in FIG. 8 and FIG. 9. However, the representation method is not limited to this; they can be represented by numerals, a change in shape, or any other arbitrary method.
(6) In the above description, the virtual screen is divided into the "immovable area", the "straight area" and the "cross area". In addition, the gloves in the basic posture, a straight punch, or a cross punch are displayed in accordance with the area in which the input article 7L or 7R is located. However, the configuration of the virtual screen is not limited thereto, but it is possible to increase or decrease the number of areas, and to change the actions assigned to the respective areas (what image is to be displayed or what process is to be performed when the input article is located in the area).
(7) In the above description, there are two virtual screens which are mirror images of each other in the right and left direction. This is because the function (for punches) of the left-handed input article 7L is the same as the function (for punches) of the right-handed input article 7R. However, the virtual screens are not necessarily mirror images of each other in the right and left direction; left and right virtual screens which are totally different from each other can be used in accordance with the type of the game. For example, in the case where the function (for example, for moving a shield) of the left-handed input article is different from the function (for example, for swinging a sword) of the right-handed input article, left and right virtual screens which are different from each other may be used. That the virtual screens are different means that they differ in the number, dimensions and/or functions of the areas which are defined in the virtual screens.
(8) While the boxing game is explained in the above description, the application program is not limited to this and not limited to games. Also, depending upon the context of the application, it is possible to arbitrarily select the shapes of input articles and the locations of the retroreflective sheets attached to the input articles.
(9) In the above description, the amount of exercise by the player is represented by energy consumption in units of calories. However, the unit is not limited to calories; any other unit of energy can be used. Also, while energy consumption is used to directly represent the amount of exercise by the player, any appropriate representation can be used to indirectly represent the amount of exercise by the player. For example, it may be suggested how many apples are equivalent to the exercise, how many steps are equivalent to the exercise, and so forth. As has been discussed above, in this description, by "the amount of exercise" is meant a value which quantitatively represents how much the player exercises.
(10) In the above description, the motion control of the gloves 217L and 217R and the calculation of calorie consumption are performed on the basis of the positional information of the glove type input articles 7L and 7R as the state information thereof. However, speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information (perspective information), and/or positional information can be calculated as state information of the glove type input articles 7L and 7R in order to calculate energy consumption on the basis of the state information.
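As one illustration of deriving such state information, speed and moving distance can be computed from two successive positions, and an energy value can be accumulated from the per-frame distance. The function names, the frame time, and the calorie factor below are purely illustrative assumptions, not the embodiment's actual values.

```python
def movement_state(prev_pos, cur_pos, frame_time):
    """Derive state information (vector, distance, speed) from two
    successive positions of an input article.  A sketch only; the
    state information actually used by the embodiment may differ."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    speed = distance / frame_time
    return {"vector": (dx, dy), "distance": distance, "speed": speed}

def add_energy(total_calories, state, factor=0.01):
    """Hypothetical energy estimate: accumulate a value proportional to
    the distance moved each frame (the factor 0.01 is illustrative)."""
    return total_calories + factor * state["distance"]
```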
(11) It is possible to calculate the calorie consumption by detecting arbitrary motions of the player 11 while the game is played as shown in FIG. 10 and so on, and also by displaying images on the television monitor 5 through which the high speed processor 91 instructs the player 11 what motion to perform as shown in FIG. 31 to FIG. 33 and detecting the motion that the player 11 actually performs.

(12) By inserting a binarization step between step S31 and step S32 of FIG. 16, the differential image is converted into binary image data by comparing the threshold value ThL with the array elements "Dif [X] [Y]", and the processes of steps S32 and S33 are performed on the basis of the binary image data. In this case, if the array elements "Dif [X] [Y]" larger than the threshold value ThL are set to "1" and the array elements "Dif [X] [Y]" smaller than the threshold value ThL are set to "0", then the threshold value ThL for use in the processes of FIG. 17 and FIG. 18 is replaced, for example, by a value of "0".
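The binarization step in (12) can be sketched as follows. This is an illustrative Python sketch; the function name and the nested-list layout of the differential image are assumptions, and values exactly equal to ThL are treated here as "0", since the text specifies only the larger and smaller cases.

```python
def binarize(dif, thl):
    """Convert the differential image Dif[X][Y] into binary image data:
    elements larger than the threshold ThL become 1, others become 0."""
    return [[1 if dif[x][y] > thl else 0 for y in range(len(dif[0]))]
            for x in range(len(dif))]
```

After this step, the subsequent extraction processes can treat any nonzero element as part of the imaged retroreflective sheet, which is why the threshold used in the later processes can be replaced by a value of "0".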
(13) In accordance with the present invention, the player is informed of the amount of exercise he actually does in terms of calorie consumption, which helps him maintain his health. In view of this point, besides boxing, there are a variety of exercises to which the present invention can be applied. In any case, the player puts on some retroreflective portion before doing an exercise.
The foregoing description of the embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form described, and obviously many modifications and variations are possible in light of the above teaching. The embodiments were chosen in order to explain most clearly the principles of the invention and its practical application, thereby to enable others skilled in the art to utilize the invention most effectively in various embodiments and with various modifications as are suited to the particular use contemplated.