HK1171403A1 - Controller device and information processing device - Google Patents

Controller device and information processing device

Info

Publication number
HK1171403A1
HK1171403A1
Authority
HK
Hong Kong
Prior art keywords
game
terminal device
data
housing
controller
Prior art date
Application number
HK12112245.4A
Other languages
Chinese (zh)
Other versions
HK1171403B (en)
Inventor
芦田健一郎
后藤義智
岡村考師
高本純治
伊吹真人
山本伸樹
土屋人詩
末武史佳
須賀明子
山本直彌
熊崎大助
Original Assignee
任天堂株式会社
Priority date
Filing date
Publication date
Priority claimed from JP2010245299A (external priority; patent JP4798809B1)
Priority claimed from JP2011092612A (external priority; patent JP6103677B2)
Priority claimed from JP2011102834A (external priority; patent JP5837325B2)
Priority claimed from JP2011118488A (external priority; patent JP5936315B2)
Application filed by 任天堂株式会社
Publication of HK1171403A1
Publication of HK1171403B

Abstract

PURPOSE: An operation device and an operation system are provided that enable a user to easily grip the operation device. CONSTITUTION: A game system comprises a television, a game device, an optical disc, a controller, a marker device, and a terminal device. The optical disc is detachably inserted into the game device and stores an information processing program to be executed by the game device. An insertion slot for the optical disc is formed in the front surface of the game device. The game device executes game processing.

Description

Operation device and information processing device
Technical Field
The present invention relates to an operation device that can be held and operated by a player.
Background
Conventionally, there are operation devices that are used while held by a player (see, for example, Japanese Patent No. 3703473). For example, the portable game device described in Japanese Patent No. 3703473 is of a foldable type, and operation buttons are provided on the lower housing. With this game device, the user can perform game operations using the operation buttons provided on both sides of the screen while viewing the screen, and can easily perform game operations while holding the game device.
In recent years, portable terminal devices (operation devices) with larger screens, and correspondingly larger bodies, have become increasingly common. However, if a device that is meant to be used while held in the user's hand becomes large, it may become difficult to hold.
Disclosure of Invention
Therefore, an object of the present invention is to provide an operation device that can be easily held by a user.
In order to solve the above problems, the present invention employs the following configurations (1) to (18).
(1) An example of the present invention is an operation device for operation by a user. The operation device includes a substantially plate-shaped housing, a display portion, and a protrusion portion. The display unit is disposed on the front side of the housing. The protruding portions are provided at least on the left and right sides of the rear surface side of the housing above the center of the housing.
The "operation unit" may be any type of operation unit as long as it is an operation device that can be operated by a user, for example, a joystick (analog joystick), a key (button), a touch panel, a touch pad, and the like in the embodiments described later.
The "left and right positions" means that the protrusions are provided on the left and right sides of the center of the housing in the left-right direction, the protrusions may be provided on both left and right ends, or the protrusions may be provided on the center of the ends.
According to the configuration of the above (1), since the protruding portion is provided on the rear surface side of the housing, the user can easily grip the operation device by hooking fingers on the protruding portion when gripping the housing on the left and right sides of the display portion. Further, since the protrusion is provided above the center of the housing, when the user grips the housing with the index finger, middle finger, or ring finger in contact with the lower surface of the protrusion, the housing can also be supported by the palm (see fig. 10 and 11), and the operation device can be held reliably. Therefore, according to the configuration of the above (1), it is possible to provide an operation device that can be easily held by a user.
(2) The operation device may further include a first operation unit and a second operation unit provided on the left and right sides of the display unit and above the center of the housing.
According to the configuration of the above (2), since the operation portions are provided on both the left and right sides of the display portion, the user can easily operate the operation portions with the thumbs, for example, while gripping the housing on the left and right sides of the display portion. Therefore, according to the configuration of the above (2), it is possible to provide an operation device which can be easily held by a user and can be easily operated.
(3) Another example of the present invention is an operation device including a substantially plate-shaped housing, a display portion, a first operation portion, a second operation portion, and a protrusion portion. The display unit is disposed on the front side of the housing. The first operation portion and the second operation portion are arranged on the left side and the right side of the display portion, respectively. The projection is provided at a position on the back side of the housing such that, when the user grips the housing so that the first operation portion and the second operation portion can be operated with the thumbs of both hands, a finger other than the thumbs can rest on the protruding portion.
According to the configuration of the above (3), since the projecting portion is provided on the rear surface side of the housing, when the user grips the housing on the left and right sides of the display portion, the user can easily grip the operation device by resting fingers other than the thumbs on the projecting portion (see fig. 10 and 11). Further, since the operation portions are provided on both the left and right sides of the display portion, the user can easily operate the operation portions with the thumbs while holding the housing. Therefore, according to the configuration of the above (3), it is possible to provide an operation device which can be easily held by a user and can be easily operated.
(4) The projection portion may be provided in a region including a position opposing the first operation portion and the second operation portion on the back surface side of the housing.
The above-mentioned "opposing position" is not strictly limited to a state in which the protrusion is exactly aligned with the position of the operation portions; it also includes a state in which, when the region in which the operation portions are provided on the front surface side of the housing is projected onto the rear surface side, the region in which the protrusion is provided on the rear surface side of the housing partially overlaps the projected region.
According to the configuration of the above (4), when operating each operation section, the user can hold the terminal device 7 by supporting the brim 59 with the index finger, the middle finger, or the ring finger (see fig. 10 and 11). This makes it easy to hold the terminal device 7 and to handle the operation units.
(5) The operating device may further include a third operating portion and a fourth operating portion provided at left and right sides of the housing on the upper surface of the protruding portion.
According to the configuration of the above (5), the user can operate the third operation unit and the fourth operation unit with, for example, the index finger or the middle finger while holding the housing on the left and right sides of the display unit. That is, more operations can be performed in the above state, and an operation device with better operability can be provided. In addition, the user can hold the operation device by sandwiching the protrusion portion from above and below, and therefore, the user can hold the operation device more easily.
(6) The protrusion may have a brim shape extending in the left-right direction.
According to the configuration of the above (6), the user can hold the operation device with the supporting fingers placed along the lower surface of the projection portion, and therefore the user can hold the operation device more easily. Further, since the projecting portion is formed to extend in the left-right direction, when the user grips the operation device in a vertical (portrait) orientation, fingers other than the thumb can be brought into contact with the projecting portion regardless of where along one side the device is gripped. Thus, even when the operation device is held vertically, the user can reliably hold the operation device.
(7) A first locking hole may be provided in a lower surface of the protrusion, and the first locking hole may lock an attachment device independent of the operation device.
According to the configuration of the above (7), the operation device and the attachment can be firmly connected by the first locking hole. In addition, when the configuration of (6) and the configuration of (7) are combined, the first locking hole can be provided near the center of the operation device in the left-right direction, and therefore, the attachment device can be stably connected while maintaining the left-right balance.
(8) A second locking hole may be provided in a lower surface of the housing, and the second locking hole may lock the attachment.
According to the configuration of the above (8), since the operation device and the attachment device are connected by the first locking hole and the second locking hole provided at different positions, the connection can be made more firm.
(9) The operating device may further include convex portions having a convex cross section on both left and right sides of the rear surface of the housing below the protruding portion.
According to the configuration of (9) above, the user can hold the housing by hooking a finger (e.g., ring finger or little finger) on the projection, and thus can hold the operation device more reliably.
(10) The protrusion and the convex portion may be provided with a space between them.
According to the configuration of the above (10), when the user supports the protrusion with the middle finger, the ring finger, or the like, the convex portion does not interfere with those fingers, and the user can hold the operation device by hooking other fingers on the convex portion. This makes it easier to hold the operating device.
(11) The operation device may further include grip portions provided on both left and right sides of the back surface of the housing.
According to the configuration of (11) above, the user can grip the housing by hooking a finger (e.g., ring finger or little finger) on the grip portion, and therefore can grip the operation device more reliably.
(12) The operation device may further include a fifth operation unit and a sixth operation unit. The fifth operation unit is disposed below the first operation unit on the front surface side of the housing. The sixth operation portion is disposed below the second operation portion on the front surface side of the housing.
According to the configuration of the above (12), a wider variety of operations can be performed by the operation device. In addition, even when the fifth operation unit and the sixth operation unit are operated, the user can reliably hold the operation device, and thus an operation device with good operability can be provided.
(13) Another example of the present invention is an operation device including a substantially plate-shaped housing, a display portion, a protrusion portion, and an operation portion. The display unit is disposed on the front side of the housing. The protruding portions are provided at least at left and right sides of the back surface side of the housing in a protruding manner. The operation portion is provided on an upper surface of the protrusion portion.
According to the configuration of the above (13), since the projection portion is provided on the rear surface side of the housing, the user can easily hold the operation device by hooking fingers on the projection portion when holding the housing on the left and right sides of the display portion (see fig. 10 and 11). Further, since the operating portion is provided on the upper surface of the projection portion, the operating portion can be easily operated while the operation device is gripped in this way. In this case, the user can hold the operation device by sandwiching the protrusion portion from the upper and lower sides, and therefore, the user can hold the operation device more easily. As described above, according to the configuration of the above (13), it is possible to provide an operation device which can be easily held by a user and can be easily operated.
(14) Another example of the present invention is an operation device for operation by a user. The operation device includes a substantially plate-shaped housing, a display unit, and grip portions. The display unit is disposed on the front side of the housing. The grip portions are provided so as to extend in the up-down direction on both the left and right sides of the rear surface of the housing, and have a convex cross section.
According to the configuration of (14), the user can hold the housing by hooking a finger (for example, a ring finger or a little finger) on the grip portion, and thus can hold the operation device reliably. Therefore, according to the configuration of the above (14), it is possible to provide an operation device which can be easily held by a user.
(15) The operating device may further include a protruding portion provided so as to protrude on at least the left and right sides of the rear surface of the housing, above the grip portions.
According to the configuration of the above (15), since the projecting portion is provided on the rear surface side of the housing, the user can easily grip the operation device by hooking fingers on the projecting portion when gripping the housing on the left and right sides of the display portion, and can thus grip the operation device more reliably.
(16) The operation device may further include a seventh operation unit and an eighth operation unit provided on both left and right sides of the upper surface of the housing.
According to the configuration of the above (16), a wider variety of operations can be performed by the operation device. Further, since the operation unit is disposed on the upper surface of the housing, the user can reliably hold the operation device by surrounding the housing from the front surface side, the upper side, and the back surface side of the housing.
(17) The operation device may further include a touch panel provided on the screen of the display unit.
According to the configuration of (17) above, the user can intuitively and easily operate the image displayed on the display unit using the touch panel. Further, when the operating device is placed with the display portion facing upward, the protrusion causes the device to rest at a slight incline. Therefore, the touch panel can be easily operated even in a state where the operation device is placed on a surface.
(18) The operation device may further include an inertial sensor inside the housing.
According to the configuration of the above (18), the operation of swinging or moving the operation device itself can be performed, and the user can perform an intuitive and easy operation using the operation device. In addition, since the operating device is moved to be used, it is important to securely connect the operating device and the attachment device when the attachment device is connected to the operating device. Therefore, in the configuration of the above (18), it is particularly effective to securely connect the operation device and the attachment by adopting the configuration of the above (7) or (8).
(19) The operation device may further include a communication unit and a display control unit. The communication unit wirelessly transmits operation data indicating an operation performed on an operation device to a game device and receives image data transmitted from the game device. The display control unit causes the display unit to display the received image data.
According to the configuration of (19), the user can perform a game operation using the operation device which can be easily held and has excellent operability. Further, since the image transmitted from the game device is displayed on the display unit, the user can perform the game operation while viewing the image displayed on the display unit of the operation device.
(20) The operation device may further include a game processing unit and a display control unit. The game processing unit executes game processing in accordance with an operation performed on the operation device. The display control unit generates a game image based on the result of the game processing and causes the display unit to display the game image.
According to the configuration of the above (20), a portable game device that is easy to hold and easy to operate can be provided.
(21) The display unit may have a screen of 5 inches or more.
With the configuration of (21), an easy-to-view and impactful image can be displayed on the large screen. Further, when a large-screen display unit is used as in the configuration of (21), the operation device itself is inevitably large, and therefore the configurations of (1) to (20) described above, which enable the user to grip the device easily, are particularly effective.
Another example of the present invention may be a tablet-type information processing apparatus including the respective portions (the housing, the display portion, the protrusion, and the like) of (1) to (21) described above. For example, another example of the present invention may be a tablet-type information processing apparatus including: a substantially plate-shaped housing; a display unit provided on the front surface side of the housing; and a protrusion portion provided at least on the left and right sides of the rear surface side of the housing and above the center of the housing.
According to the present invention, the display portion is provided on the front side of the housing, and the protruding portions are provided on the rear side of the housing at least on the left and right sides above the center of the housing, whereby the user can easily grip the operation device.
The foregoing and other objects, features, aspects and effects of the present invention will become further apparent from the following detailed description with reference to the accompanying drawings.
Drawings
Fig. 1 is an external view of the game system 1.
Fig. 2 is a block diagram showing an internal configuration of game device 3.
Fig. 3 is a perspective view showing an external configuration of the controller 5.
Fig. 4 is a perspective view showing an external configuration of the controller 5.
Fig. 5 is a diagram showing an internal configuration of the controller 5.
Fig. 6 is a diagram showing an internal configuration of the controller 5.
Fig. 7 is a block diagram showing the configuration of the controller 5.
Fig. 8 is a diagram showing an external configuration of the terminal device 7.
Fig. 9 is a diagram showing an external configuration of the terminal device 7.
Fig. 10 is a diagram showing a state where the user holds the terminal device 7 in the lateral direction.
Fig. 11 is a diagram showing a state in which the user holds the terminal device 7 laterally.
Fig. 12 is a diagram showing a state in which the user holds the terminal device 7 in the portrait orientation.
Fig. 13 is a diagram showing a state in which the user holds the terminal device 7 in the portrait orientation.
Fig. 14 is a block diagram showing an internal configuration of the terminal device 7.
Fig. 15 is a diagram showing an example in which an additional device (input device 200) is attached to the terminal device 7.
Fig. 16 is a diagram showing an example in which an additional device (input device 200) is attached to the terminal device 7.
Fig. 17 is a diagram showing another example of the input device.
Fig. 18 is a diagram showing a state in which the input device 220 shown in fig. 17 is mounted on the terminal device 7.
Fig. 19 is a diagram showing a state in which the input device 220 shown in fig. 17 is mounted on the terminal device 7.
Fig. 20 is a diagram showing another example of connecting an attachment (cradle 210) to the terminal device 7.
Fig. 21 is a diagram showing various data used in the game process.
Fig. 22 is a main flowchart showing a flow of game processing executed by the game device 3.
Fig. 23 is a flowchart showing a detailed flow of the game control process.
Fig. 24 is a diagram showing the screen of the television 2 and the terminal device 7 in the first game example.
Fig. 25 is a diagram showing the screen of the television 2 and the terminal device 7 in the second game example.
Fig. 26 is a diagram showing an example of a television game image displayed on the television 2 in the third game example.
Fig. 27 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the third game example.
Fig. 28 is a diagram showing an example of a television game image displayed on the television 2 in the fourth game example.
Fig. 29 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example.
Fig. 30 is a diagram showing a use situation of the game system 1 in the fifth game example.
Fig. 31 is a diagram showing a connection relationship of each device included in the game system 1 when connected to an external device via a network.
Fig. 32 is a diagram showing an external configuration of a terminal device according to a modification of the present embodiment.
Fig. 33 is a diagram showing a state in which the user holds the terminal device shown in fig. 32.
Fig. 34 is a diagram showing an external configuration of a terminal device according to another modification of the present embodiment.
Fig. 35 is a diagram showing an external configuration of a terminal device according to another modification of the present embodiment.
Detailed Description
[1. overall Structure of Game System ]
A game system 1 according to an embodiment of the present invention will be described below with reference to the drawings. Fig. 1 is an external view of the game system 1. In fig. 1, a game system 1 includes a stationary display device (hereinafter referred to as a "television") 2 typified by a television receiver or the like, a stationary game device 3, an optical disc 4, a controller 5, a marker device 6, and a terminal device 7. The game system 1 executes game processing in the game device 3 in accordance with game operations performed using the controller 5, and displays a game image obtained by the game processing on the television 2 and/or the terminal device 7.
An optical disk 4 is detachably inserted into the game device 3, the optical disk 4 being an example of an information storage medium used replaceably with the game device 3. The optical disk 4 stores an information processing program (typically, a game program) to be executed by the game device 3. An insertion port for the optical disk 4 is provided on the front surface of the game device 3. The game device 3 reads the information processing program stored on the optical disk 4 inserted into the insertion port and executes it to perform the game processing.
The game device 3 is connected to the television 2 via a connection cable (cord). The television 2 displays a game image obtained by a game process executed by the game device 3. The television 2 has a speaker 2a (fig. 2), and the speaker 2a outputs game sound obtained as a result of the game processing. In other embodiments, the game device 3 may be integrated with a stationary display device. The communication between the game device 3 and the television 2 may be wireless communication.
A marker device 6 is provided around the screen of the television 2 (on the upper side of the screen in fig. 1). The user (player) can perform game operations by moving the controller 5, and the marker device 6 is used by the game device 3 to calculate the movement, position, attitude, and the like of the controller 5; details will be described later. The marker device 6 is provided with two markers 6R and 6L at its two ends. Specifically, the marker 6R (and likewise the marker 6L) is one or more infrared LEDs (Light Emitting Diodes), and outputs infrared light toward the front of the television 2. The marker device 6 is connected to the game device 3, and the game device 3 can control the lighting of each infrared LED provided in the marker device 6. Further, the marker device 6 is portable, and the user can set the marker device 6 at an arbitrary position. Fig. 1 shows a state in which the marker device 6 is disposed above the television 2, but the position and orientation in which the marker device 6 is disposed are arbitrary.
The controller 5 provides the game device 3 with operation data indicating the content of operations performed on the controller 5. The controller 5 and the game device 3 can communicate by wireless communication. In the present embodiment, for example, Bluetooth (registered trademark) is used for wireless communication between the controller 5 and the game device 3. In another embodiment, the controller 5 and the game device 3 may be connected by wire. In the present embodiment, although the game system 1 includes one controller 5, the game device 3 is capable of communicating with a plurality of controllers, and a plurality of players can play a game by using a predetermined number of controllers simultaneously. The detailed configuration of the controller 5 will be described later.
The terminal device 7 has a size that can be held by a user, and the user can use the terminal device 7 by holding it and moving it, or by placing it at an arbitrary position. The terminal device 7 includes an LCD (Liquid Crystal Display) 51 as a display unit and an input unit (a touch panel 52, a gyro sensor 74, and the like described later); the detailed configuration thereof will be described later. The terminal device 7 and the game device 3 can communicate wirelessly (or by wire). The terminal device 7 receives data of an image (for example, a game image) generated by the game device 3, and displays the image on the LCD 51. In the present embodiment, an LCD is used as the display device, but the terminal device 7 may have any other display device, such as a display device using EL (Electro Luminescence), for example. Further, the terminal device 7 transmits operation data indicating the contents of the operation performed on the terminal device 7 to the game device 3.
[2. internal Structure of Game device 3 ]
Next, the internal configuration of the game device 3 will be described with reference to fig. 2. Fig. 2 is a block diagram showing an internal configuration of game device 3. The game device 3 includes a CPU (Central Processing Unit) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, a disk drive 14, an AV-IC 15, and the like.
The CPU 10 functions as a game processor and executes game processing by executing the game program stored on the optical disk 4. The CPU 10 is connected to the system LSI 11. The system LSI 11 is connected to the external main memory 12, the ROM/RTC 13, the disk drive 14, and the AV-IC 15, in addition to the CPU 10. The system LSI 11 performs processing such as controlling data transmission between the components connected to it, generating images to be displayed, and acquiring data from external devices. The internal configuration of the system LSI 11 will be described later. The volatile external main memory 12 stores programs such as the game program read from the optical disk 4 or from the flash memory 17, and various data, and is used as a work area and a buffer area for the CPU 10. The ROM/RTC 13 includes a ROM (a so-called boot ROM) in which a boot program for the game device 3 is stored, and a clock circuit (RTC: Real Time Clock) for keeping time. The disk drive 14 reads program data, texture data, and the like from the optical disk 4, and writes the read data into the internal main memory 11e (described later) or the external main memory 12.
The system LSI 11 is provided with an input/output processor (I/O processor) 11a, a GPU (Graphics Processing Unit) 11b, a DSP (Digital Signal Processor) 11c, a VRAM (Video RAM) 11d, and an internal main memory 11e. Although not shown, these components 11a to 11e are connected to each other by an internal bus.
The GPU 11b forms a part of the drawing unit, and generates an image in accordance with a drawing Command (Graphics Command) from the CPU 10. The VRAM 11d stores data (polygon data, texture data, and the like) necessary for the GPU 11b to execute the drawing command. When generating an image, the GPU 11b creates image data using the data stored in the VRAM 11 d. In the present embodiment, the game device 3 generates both the game image displayed on the television set 2 and the game image displayed on the terminal device 7. Hereinafter, the game image displayed on the television 2 may be referred to as a "television game image", and the game image displayed on the terminal device 7 may be referred to as a "terminal game image".
The DSP 11c functions as an audio processor, and generates audio data using audio data (sound data) and audio waveform (tone) data stored in the internal main memory 11e and the external main memory 12. In the present embodiment, as for the game sound, both the game sound output from the speaker of the television set 2 and the game sound output from the speaker of the terminal device 7 are generated similarly to the game image. Hereinafter, the game sound output from the television 2 may be referred to as "television game sound", and the game sound output from the terminal device 7 may be referred to as "terminal game sound".
Of the images and sounds generated in game device 3 as described above, data of the images and sounds to be output by television set 2 is read by AV-IC 15. The AV-IC 15 outputs the read image data to the television set 2 via the AV connector 16, and outputs the read sound data to the speaker 2a built in the television set 2. Thereby, an image is displayed on the television set 2, and sound is output from the speaker 2 a.
In addition, of the images and sounds generated in game device 3, data of the images and sounds to be output in terminal device 7 is transmitted to terminal device 7 through input-output processor 11 a. Data transmission to the terminal device 7 by the input/output processor 11a and the like will be described later.
The input/output processor 11a transmits and receives data to and from the components connected to it, and downloads data from external devices. The input/output processor 11a is connected to the flash memory 17, a network communication module 18, a controller communication module 19, an expansion connector 20, a memory card connector 21, and a codec LSI 27. An antenna 22 is connected to the network communication module 18. An antenna 23 is connected to the controller communication module 19. The codec LSI 27 is connected to a terminal communication module 28, and an antenna 29 is connected to the terminal communication module 28.
The game device 3 can be connected to a network such as the Internet to communicate with external information processing devices (for example, other game devices or various servers). That is, the input/output processor 11a is connected to a network such as the Internet via the network communication module 18 and the antenna 22, and can communicate with external information processing devices connected to the network. The input/output processor 11a periodically accesses the flash memory 17, detects whether or not there is data to be transmitted to the network, and, if such data exists, transmits the data to the network through the network communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from external information processing devices and data downloaded from a download server via the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. By executing the game program, the CPU 10 reads the data stored in the flash memory 17 and uses it in the game program. The flash memory 17 may store, in addition to data transmitted and received between the game device 3 and external information processing devices, saved data of games played using the game device 3 (result data or in-progress data). The flash memory 17 may also store a game program.
In addition, the game device 3 can receive operation data from the controller 5. That is, the input/output processor 11a receives the operation data transmitted from the controller 5 via the antenna 23 and the controller communication module 19, and stores (temporarily stores) the operation data in the buffer area of the internal main memory 11e or the external main memory 12.
The game device 3 can transmit and receive data such as images and sounds to and from the terminal device 7. When a game image (terminal game image) is to be transmitted to the terminal device 7, the input/output processor 11a outputs the data of the game image generated by the GPU 11b to the codec LSI 27. The codec LSI 27 performs predetermined compression processing on the image data from the input/output processor 11a. The terminal communication module 28 performs wireless communication with the terminal device 7. Thus, the image data compressed by the codec LSI 27 is transmitted by the terminal communication module 28 to the terminal device 7 through the antenna 29. In the present embodiment, the image data transmitted from the game device 3 to the terminal device 7 is data used for a game, and if a delay occurs in the image displayed in the game, the operability of the game is adversely affected. Therefore, it is preferable to avoid delay as much as possible in the transmission of image data from the game device 3 to the terminal device 7. Accordingly, in the present embodiment, the codec LSI 27 compresses the image data using, for example, the highly efficient compression technique of the H.264 standard. Other compression techniques may be used, and when the communication speed is sufficiently high, the image data may be transmitted without being compressed. The terminal communication module 28 is, for example, a Wi-Fi-certified communication module, and may perform high-speed wireless communication with the terminal device 7 using, for example, the MIMO (Multiple-Input Multiple-Output) technology adopted in the IEEE 802.11n standard, or may use another communication method.
In addition, the game device 3 transmits the image data to the terminal device 7, and also transmits the audio data to the terminal device 7. That is, the input/output processor 11a outputs the sound data generated by the DSP 11c to the terminal communication module 28 through the codec LSI 27. The codec LSI 27 also performs compression processing on the audio data, as with the image data. The compression method for the audio data may be any method, and a method with a high compression rate and little audio degradation is preferable. In other embodiments, the audio data may be transmitted without being compressed. The terminal communication module 28 transmits the compressed image data and sound data to the terminal device 7 through the antenna 29.
In addition to the image data and the sound data, the game device 3 transmits various control data to the terminal device 7 as necessary. The control data is data indicating a control instruction for a component provided in the terminal device 7, and for example, indicates an instruction to control lighting of a marker portion (marker portion 55 shown in fig. 14), an instruction to control imaging by a camera (camera 56 shown in fig. 14), and the like. The input/output processor 11a transmits control data to the terminal device 7 in accordance with an instruction from the CPU 10. In addition, although the codec LSI 27 does not perform data compression processing on the control data in this embodiment, the codec LSI may perform data compression processing on the control data in another embodiment. The data transmitted from the game device 3 to the terminal device 7 may be encrypted or not encrypted as necessary.
The game device 3 can receive various data from the terminal device 7. In the present embodiment, the terminal device 7 transmits operation data, image data, and audio data, which will be described in detail later. Each data transmitted from the terminal device 7 is received by the terminal communication module 28 through the antenna 29. Here, the image data and the sound data from the terminal device 7 are subjected to the same compression processing as the image data and the sound data transmitted from the game device 3 to the terminal device 7. Therefore, the image data and the sound data are sent from the terminal communication module 28 to the codec LSI 27, subjected to decompression processing by the codec LSI 27, and output to the input/output processor 11a. On the other hand, the operation data from the terminal device 7 is smaller in data amount than the image data and the sound data, and therefore need not be compressed. It may also be encrypted or not, as necessary. Thus, after being received by the terminal communication module 28, the operation data is output to the input/output processor 11a via the codec LSI 27. The input/output processor 11a stores (temporarily stores) the data received from the terminal device 7 in a buffer area of the internal main memory 11e or the external main memory 12.
The game device 3 can be connected to other devices and to external storage media. That is, the expansion connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The expansion connector 20 is a connector for an interface such as USB or SCSI (Small Computer System Interface). By connecting to the expansion connector 20 a medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector, communication with the network can be performed in place of the network communication module 18. The memory card connector 21 is a connector for connecting an external storage medium such as a memory card. For example, the input/output processor 11a can access an external storage medium through the expansion connector 20 or the memory card connector 21 to store data in the external storage medium or read data from the external storage medium.
The game device 3 is provided with a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When power button 24 is turned on, power is supplied from an external power source to each component of game device 3 via an AC adapter not shown. When the reset button 25 is pressed, the system LSI 11 restarts the startup program of the game device 3. An eject button 26 is connected to the disc drive 14. When the eject button 26 is pressed, the optical disk 4 is ejected from the disk drive 14.
In another embodiment, some of the components included in the game device 3 may be configured as expansion devices separate from the game device 3. In this case, the expansion device may be connected to the game device 3 via the expansion connector 20. Specifically, the expansion device may include the codec LSI 27, the terminal communication module 28, and the antenna 29, and may be attachable to and detachable from the expansion connector 20. In this way, by connecting the expansion device to a game device that does not include the above components, that game device can be configured to be able to communicate with the terminal device 7.
[3. Structure of controller 5 ]
Next, the controller 5 will be described with reference to fig. 3 to 7. Fig. 3 is a perspective view showing an external configuration of the controller 5. Fig. 4 is a perspective view showing an external configuration of the controller 5. Fig. 3 is a perspective view of the controller 5 as viewed from the upper rear side of the controller 5, and fig. 4 is a perspective view of the controller 5 as viewed from the lower front side of the controller 5.
In fig. 3 and 4, the controller 5 has a housing 31 formed, for example, by plastic molding. The housing 31 has a substantially rectangular parallelepiped shape with its longitudinal direction in the front-rear direction (Z-axis direction shown in fig. 3), and is sized to be held by an adult or a child as a whole with one hand. The user can perform a game operation by changing the position and posture (inclination) thereof by pressing a button provided on the controller 5 and moving the controller 5 itself.
The housing 31 is provided with a plurality of operation buttons. As shown in fig. 3, on the upper surface of the housing 31, there are provided a cross button 32a, a No. 1 button 32b, a No. 2 button 32c, an A button 32d, a minus (-) button 32e, a home button 32f, a plus (+) button 32g, and a power button 32h. In the present specification, the upper surface of the housing 31 on which these buttons 32a to 32h are provided may be referred to as the "button top". On the other hand, as shown in fig. 4, a recess is formed on the lower surface of the housing 31, and a B button 32i is provided on the rear-side inclined surface of the recess. To each of these operation buttons 32a to 32i, a function corresponding to the information processing program executed by the game device 3 is assigned as appropriate. The power button 32h is used to remotely turn on/off the power of the main body of the game device 3. The home button 32f and the power button 32h are recessed so that their upper surfaces are lower than the upper surface of the housing 31. This can prevent the user from pressing the home button 32f or the power button 32h by mistake.
A connector 33 is provided on the rear surface of the housing 31. The connector 33 is used to connect other devices (e.g., other sensor units, controllers) to the controller 5. Further, locking holes 33a are provided on both sides of the connector 33 on the rear surface of the housing 31 to prevent the other devices from being easily detached.
A plurality of (four in fig. 3) LEDs 34a to 34d are provided on the rear portion of the upper surface of the housing 31. Here, the controller 5 is assigned a controller type (number) for distinguishing from other controllers. The LEDs 34a to 34d are used for the following purposes: the user is notified of the above-described controller category currently set for the controller 5, or the user is notified of the remaining battery level of the controller 5. Specifically, when a game operation is performed using the controller 5, any one of the plurality of LEDs 34a to 34d is turned on according to the controller type.
The controller 5 includes an imaging information calculation unit 35 (fig. 6), and as shown in fig. 4, a light incident surface 35a of the imaging information calculation unit 35 is provided on the front surface of the housing 31. The light incident surface 35a is made of a material that transmits at least infrared light from the markers 6R and 6L.
A sound outlet 31a for emitting sound from a speaker 47 (fig. 5) incorporated in the controller 5 to the outside is formed between the No. 1 button 32b and the home button 32f on the upper surface of the housing 31.
Next, the internal structure of the controller 5 will be described with reference to fig. 5 and 6. Fig. 5 and 6 are diagrams showing an internal structure of the controller 5. Fig. 5 is a perspective view showing a state where the upper housing (a part of the housing 31) of the controller 5 is removed. Fig. 6 is a perspective view showing a state where the lower housing (a part of the housing 31) of the controller 5 is removed. The perspective view shown in fig. 6 is a perspective view of the substrate 30 shown in fig. 5 as viewed from the back side.
In fig. 5, a substrate 30 is fixedly provided inside a housing 31, and operation buttons 32a to 32h, LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, a speaker 47, and the like are provided on an upper main surface of the substrate 30. These are connected to a microcomputer (Micro Computer) 42 (see fig. 6) via a wiring (not shown) formed on the substrate 30 or the like. In the present embodiment, the acceleration sensor 37 is disposed at a position offset from the center of the controller 5 in the X-axis direction. This makes it easy to calculate the movement of the controller 5 when the controller 5 is rotated about the Z axis. The acceleration sensor 37 is disposed forward of the center of the controller 5 in the longitudinal direction (Z-axis direction). The controller 5 functions as a wireless controller using the wireless module 44 (fig. 7) and the antenna 45.
On the other hand, in fig. 6, an imaging information calculation unit 35 is provided at the edge of the front end on the lower main surface of the substrate 30. The imaging information calculation unit 35 includes an infrared filter 38, a lens 39, an imaging device 40, and an image processing circuit 41 in this order from the front of the controller 5. These components 38 to 41 are mounted on the lower main surface of the substrate 30.
The microcomputer 42 and a vibrator 46 are provided on the lower main surface of the substrate 30. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 through wiring formed on the substrate 30 or the like. The vibrator 46 operates in accordance with instructions from the microcomputer 42, thereby causing the controller 5 to vibrate. This realizes a so-called vibration-enabled game in which the vibration is transmitted to the hand of the user holding the controller 5. In the present embodiment, the vibrator 46 is disposed slightly toward the front of the housing 31. That is, since the vibrator 46 is disposed closer to the end than to the center of the controller 5, the vibration of the vibrator 46 can cause large vibration of the entire controller 5. The connector 33 is mounted at the rear end edge of the lower main surface of the substrate 30. In addition to the components shown in fig. 5 and 6, the controller 5 includes a crystal oscillator for generating the basic clock of the microcomputer 42, an amplifier for outputting an audio signal to the speaker 47, and the like.
The shape of the controller 5, the shape of each operation button, the number and the installation positions of the acceleration sensors and the vibrators, and the like shown in fig. 3 to 6 are merely examples, and other shapes, numbers, and installation positions may be used. In the present embodiment, the imaging direction of the imaging means is the positive Z-axis direction, but the imaging direction may be any direction. That is, the position of the imaging information arithmetic unit 35 in the controller 5 (the light incident surface 35a of the imaging information arithmetic unit 35) may not be the front surface of the housing 31, and may be provided on another surface as long as light can be taken in from the outside of the housing 31.
Fig. 7 is a block diagram showing the configuration of the controller 5. The controller 5 includes an operation unit 32 (each of the operation buttons 32a to 32i), an imaging information calculation unit 35, a communication unit 36, an acceleration sensor 37, and a gyro sensor 48. The controller 5 transmits data indicating the content of an operation performed on the controller 5 to the game device 3 as operation data. In addition, hereinafter, the operation data transmitted by the controller 5 is sometimes referred to as "controller operation data", and the operation data transmitted by the terminal device 7 is sometimes referred to as "terminal operation data".
The operation unit 32 includes the operation buttons 32a to 32i, and outputs operation button data indicating the input state to the operation buttons 32a to 32i (whether or not the operation buttons 32a to 32i are pressed) to the microcomputer 42 of the communication unit 36.
The imaging information calculation unit 35 is a system for analyzing image data captured by the imaging unit, identifying a region having high brightness, and calculating the position of the center of gravity, the size, and the like of the region. The imaging information calculation unit 35 samples at a maximum rate of, for example, about 200 frames per second, and can therefore track and analyze the movement of the controller 5 even when the controller is moved relatively fast.
The imaging information calculation unit 35 includes an infrared filter 38, a lens 39, an imaging element 40, and an image processing circuit 41. The infrared filter 38 passes only the infrared component of the light incident from the front of the controller 5. The lens 39 condenses the infrared light having passed through the infrared filter 38 and makes it incident on the imaging element 40. The imaging element 40 is a solid-state imaging element such as a CMOS sensor or a CCD sensor, for example; it receives the infrared light condensed by the lens 39 and outputs an image signal. Here, the imaging targets, namely the marker unit 55 of the terminal device 7 and the marker device 6, are constituted by markers that output infrared light. Therefore, by providing the infrared filter 38, the imaging element 40 generates image data from only the infrared light that has passed through the infrared filter 38, and thus can capture the imaging targets (the marker unit 55 and/or the marker device 6) more accurately. Hereinafter, the image captured by the imaging element 40 is referred to as a captured image. The image processing circuit 41 processes the image data generated by the imaging element 40. The image processing circuit 41 calculates the position of the imaging target within the captured image. The image processing circuit 41 outputs coordinates indicating the calculated position to the microcomputer 42 of the communication unit 36. The data of the coordinates is transmitted to the game device 3 as operation data by the microcomputer 42. Hereinafter, these coordinates are referred to as "marker coordinates". Since the marker coordinates change in accordance with the orientation (inclination angle) and position of the controller 5 itself, the game device 3 can calculate the orientation and position of the controller 5 using the marker coordinates.
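As an illustration only (and not the actual circuitry of the image processing circuit 41), the following C sketch shows the kind of computation described above: thresholding a grayscale captured image and taking the center of gravity and pixel count of the bright region as a marker position and size. The threshold value and image dimensions are arbitrary assumptions.

#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch: find the centroid and size of the high-brightness
 * region in a grayscale captured image. Threshold and image size are
 * assumptions chosen for the example. */
typedef struct { double x, y; int pixel_count; } MarkerBlob;

static MarkerBlob find_bright_blob(const uint8_t *image, int width, int height,
                                   uint8_t threshold)
{
    MarkerBlob blob = {0.0, 0.0, 0};
    long sum_x = 0, sum_y = 0;

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            if (image[y * width + x] >= threshold) {   /* bright (infrared) pixel */
                sum_x += x;
                sum_y += y;
                blob.pixel_count++;
            }
        }
    }
    if (blob.pixel_count > 0) {                        /* center of gravity */
        blob.x = (double)sum_x / blob.pixel_count;
        blob.y = (double)sum_y / blob.pixel_count;
    }
    return blob;
}

int main(void)
{
    /* Tiny synthetic 8x8 image with one bright spot, for demonstration only. */
    uint8_t img[64] = {0};
    img[3 * 8 + 4] = 250;
    img[3 * 8 + 5] = 240;
    img[4 * 8 + 4] = 245;

    MarkerBlob b = find_bright_blob(img, 8, 8, 200);
    printf("marker at (%.2f, %.2f), size %d px\n", b.x, b.y, b.pixel_count);
    return 0;
}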
In other embodiments, the controller 5 may not include the image processing circuit 41, and the captured image itself may be transmitted from the controller 5 to the game device 3. In this case, the game device 3 may have a circuit or a program having the same function as the image processing circuit 41 to calculate the marker coordinates.
The acceleration sensor 37 detects the acceleration (including gravitational acceleration) of the controller 5, that is, the force (including gravity) applied to the controller 5. The acceleration sensor 37 detects, among the accelerations applied to its detection portion, the value of the acceleration in the linear direction along each sensing axis (linear acceleration). For example, in the case of a multi-axis acceleration sensor having two or more axes, the acceleration component along each axis is detected as the acceleration applied to the detection portion of the acceleration sensor. The acceleration sensor 37 is, for example, a capacitive MEMS (Micro Electro Mechanical System) type acceleration sensor, but other types of acceleration sensors may be used.
In the present embodiment, the acceleration sensor 37 detects linear accelerations in three axial directions, i.e., the vertical direction (Y-axis direction shown in fig. 3), the horizontal direction (X-axis direction shown in fig. 3), and the front-rear direction (Z-axis direction shown in fig. 3) with reference to the controller 5. Since the acceleration sensor 37 detects acceleration in a linear direction along each axis, the output from the acceleration sensor 37 represents the value of the linear acceleration of each of the three axes. That is, the detected acceleration is expressed as a three-dimensional vector on an XYZ coordinate system (controller coordinate system) set with reference to the controller 5.
Data (acceleration data) indicating the acceleration detected by the acceleration sensor 37 is output to the communication unit 36. Further, since the acceleration detected by the acceleration sensor 37 changes in accordance with the orientation (inclination angle) and movement of the controller 5 itself, the game device 3 can calculate the orientation and movement of the controller 5 using the acquired acceleration data. In the present embodiment, the game device 3 calculates the posture, the inclination angle, and the like of the controller 5 from the acquired acceleration data.
Further, a computer such as a processor (for example, CPU 10) of the game device 3 or a processor (for example, microcomputer 42) of the controller 5 may process the acceleration signal output from the acceleration sensor 37 (the same applies to the acceleration sensor 73 described later), thereby estimating or calculating (determining) more information on the controller 5, and those skilled in the art can easily understand this description. For example, in the case where the process on the computer side is executed on the assumption that the controller 5 on which the acceleration sensor 37 is mounted is in a stationary state (that is, in the case where the process is executed assuming that the acceleration detected by the acceleration sensor is only the acceleration due to gravity), if the controller 5 is actually in a stationary state, it is possible to know whether or not the posture of the controller 5 is inclined or to what degree with respect to the direction of gravity from the detected acceleration. Specifically, when the state in which the detection axis of the acceleration sensor 37 is oriented in the vertical direction is used as a reference, whether or not the controller 5 is tilted with respect to the reference can be known from whether or not 1G (gravitational acceleration) is applied, and the degree of tilt with respect to the reference can be known from the magnitude of the gravitational acceleration. In the case of the multi-axis acceleration sensor 37, the degree of inclination of the controller 5 with respect to the direction of gravity can be known in more detail by further processing the signals of the acceleration of each axis. In this case, the processor may calculate the tilt angle of the controller 5 from the output from the acceleration sensor 37, or may calculate the tilt direction of the controller 5 without calculating the tilt angle. In this way, the inclination angle or the posture of the controller 5 can be determined by using the acceleration sensor 37 in combination with the processor.
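A minimal sketch of the static-case computation just described, assuming the controller is stationary so the measured acceleration is essentially gravity (about 1 G). The sign convention used here (the sensor reporting roughly -1 G on the Y axis when the controller lies flat with its button top up) is an assumption for illustration, not a value taken from this description.

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* One acceleration sample in the controller coordinate system, in G.
 * Units and sign convention are assumptions for this sketch. */
typedef struct { double x, y, z; } Accel;

/* Angle between the controller's Y axis and the direction of gravity,
 * in degrees, valid only under the stationary assumption. */
static double tilt_from_gravity(Accel a)
{
    double mag = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    if (mag == 0.0) return 0.0;
    return acos(-a.y / mag) * 180.0 / M_PI;  /* assumed: gravity along -Y when upright */
}

int main(void)
{
    Accel level  = { 0.0, -1.0,  0.0 };    /* resting flat: ~0 degrees  */
    Accel tipped = { 0.0, -0.71, 0.71 };   /* pitched by roughly 45 degrees */
    printf("level: %.1f deg, tipped: %.1f deg\n",
           tilt_from_gravity(level), tilt_from_gravity(tipped));
    return 0;
}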
On the other hand, assuming that the controller 5 is in a dynamic state (a state in which the controller 5 is moving), the acceleration sensor 37 detects acceleration corresponding to the movement of the controller 5 in addition to the gravitational acceleration, and therefore, the movement direction of the controller 5 can be known by removing a component of the gravitational acceleration from the detected acceleration by predetermined processing. Even when the controller 5 is in a dynamic state, the inclination of the controller 5 with respect to the direction of gravity can be known by removing a component of the acceleration corresponding to the movement of the acceleration sensor from the detected acceleration through a predetermined process. In other embodiments, the acceleration sensor 37 may include an embedded processing device or other type of dedicated processing device for performing predetermined processing on the acceleration signal detected by the built-in acceleration detection unit before the acceleration signal is output to the microcomputer 42. The embedded or dedicated processing means may also convert the acceleration signal into a tilt angle (or other preferred parameter), for example in case the acceleration sensor 37 is used to detect static acceleration (e.g. gravitational acceleration).
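A common way to realize the kind of separation described above is to track a slowly varying gravity estimate with a low-pass filter and subtract it from each new sample. The following is a minimal sketch under the assumption that acceleration values are expressed in units of G; the smoothing factor ALPHA is an assumed tuning value, and this is not the specific predetermined processing referred to in the embodiment.

```python
# Illustrative sketch: separating the gravity component from motion
# acceleration with a simple low-pass filter.
ALPHA = 0.9  # assumed smoothing factor; closer to 1.0 -> slower gravity estimate

def update(gravity, sample, alpha=ALPHA):
    """Update the 3-axis gravity estimate with a new sample and return
    (new_gravity_estimate, motion_acceleration)."""
    new_gravity = tuple(alpha * g + (1.0 - alpha) * s
                        for g, s in zip(gravity, sample))
    motion = tuple(s - g for s, g in zip(sample, new_gravity))
    return new_gravity, motion

# Example: a short burst of movement along the X axis while gravity stays on Z.
gravity = (0.0, 0.0, 1.0)
for sample in [(0.0, 0.0, 1.0), (0.5, 0.0, 1.0), (0.0, 0.0, 1.0)]:
    gravity, motion = update(gravity, sample)
    print(gravity, motion)
```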
The gyro sensor 48 detects angular velocities about three axes (the XYZ axes in the present embodiment). In the present specification, with the imaging direction (Z-axis positive direction) of the controller 5 as a reference, the rotation direction about the X axis is referred to as the pitch direction, the rotation direction about the Y axis as the yaw direction, and the rotation direction about the Z axis as the roll direction. The gyro sensor 48 may be any sensor as long as it can detect angular velocities about three axes, and the number and combination of gyro sensors used may be arbitrary. For example, the gyro sensor 48 may be a three-axis gyro sensor, or may be a combination of a two-axis gyro sensor and a single-axis gyro sensor that together detect angular velocities about three axes. Data indicating the angular velocities detected by the gyro sensor 48 is output to the communication unit 36. The gyro sensor 48 may also be a sensor that detects angular velocity about one or two axes.
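For illustration only, pitch, yaw, and roll angles can be approximated by integrating the angular-velocity samples over the sampling period, as sketched below; the sampling period DT is an assumed value, and simple per-axis integration of this kind is only an approximation that holds for small simultaneous rotations.

```python
# Illustrative sketch (assumption, not the embodiment's method): accumulating
# pitch/yaw/roll angles from gyro angular-velocity samples by simple
# integration over the sampling period DT.
DT = 1.0 / 200.0  # assumed sampling period in seconds

def integrate(angles, angular_velocity, dt=DT):
    """angles and angular_velocity are (pitch, yaw, roll) in degrees and
    degrees/second respectively; returns the updated angles."""
    return tuple(a + w * dt for a, w in zip(angles, angular_velocity))

angles = (0.0, 0.0, 0.0)
# Example: the controller yaws at 90 deg/s for one second.
for _ in range(200):
    angles = integrate(angles, (0.0, 90.0, 0.0))
print(angles)  # -> roughly (0.0, 90.0, 0.0)
```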
The communication unit 36 includes a microcomputer 42, a memory 43, a wireless module 44, and an antenna 45. The microcomputer 42 controls the wireless module 44 while using the memory 43 as a storage area during processing, and the wireless module 44 wirelessly transmits data acquired by the microcomputer 42 to the game device 3.
Data output from the operation unit 32, the imaging information calculation unit 35, the acceleration sensor 37, and the gyro sensor 48 to the microcomputer 42 is temporarily stored in the memory 43. These data are transmitted to the game device 3 as operation data (controller operation data). That is, when the timing for transmission to the controller communication module 19 of the game device 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44. The wireless module 44 modulates a carrier wave of a predetermined frequency with the operation data using, for example, Bluetooth (registered trademark) technology, and transmits the resulting weak radio wave signal from the antenna 45. That is, the operation data is modulated into a weak radio wave signal by the wireless module 44 and transmitted from the controller 5. The weak radio wave signal is received by the controller communication module 19 on the game device 3 side. The game device 3 can acquire the operation data by demodulating and decoding the received weak radio wave signal. Then, the CPU 10 of the game device 3 performs game processing using the operation data acquired from the controller 5. The wireless transmission from the communication unit 36 to the controller communication module 19 is performed sequentially at predetermined intervals; since game processing is generally performed in units of 1/60 seconds (one frame time), the transmission is preferably performed at a period equal to or shorter than this time. The communication unit 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game device 3 at a rate of, for example, once every 1/200 seconds.
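The following sketch illustrates, under assumed field layouts and rates, how one sample of operation data (button, acceleration, angular velocity, and marker coordinate values) might be packed and sent at a period shorter than the 1/60-second frame time; the send() function is a stand-in for the wireless transmission, and the format is not the actual radio protocol.

```python
# Minimal sketch (assumed field layout, not the actual protocol): packing one
# controller operation-data sample and sending it at a fixed rate shorter
# than the 1/60 s frame time, e.g. every 1/200 s.
import struct
import time

def pack_operation_data(buttons, accel, gyro, marker_coords):
    """buttons: 16-bit bitmask; accel/gyro: 3 floats each;
    marker_coords: (x, y) floats. Field order is illustrative only."""
    return struct.pack("<H3f3f2f", buttons, *accel, *gyro, *marker_coords)

def send(packet):
    pass  # stand-in for the wireless (e.g. Bluetooth) transmission

SEND_INTERVAL = 1.0 / 200.0   # transmission period (shorter than 1/60 s)
for _ in range(3):            # a few iterations for demonstration
    packet = pack_operation_data(0x0001, (0.0, 0.0, 1.0),
                                 (0.0, 0.0, 0.0), (512.0, 384.0))
    send(packet)
    time.sleep(SEND_INTERVAL)
```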
As described above, the controller 5 can transmit marker coordinate data, acceleration data, angular velocity data, and operation button data as operation data indicating an operation on the controller 5. Further, the game device 3 executes game processing using the operation data as game input. Therefore, by using the controller 5, the user can perform not only a conventional general game operation of pressing each operation button but also a game operation of moving the controller 5 itself. For example, an operation of tilting the controller 5 in an arbitrary posture, an operation of instructing an arbitrary position on the screen by the controller 5, an operation of moving the controller 5 itself, and the like can be performed.
In the present embodiment, the controller 5 does not have a display unit for displaying a game image, but may have a display unit for displaying an image indicating the remaining battery level, for example.
[4. Structure of terminal device 7 ]
Next, the configuration of the terminal device 7 will be described with reference to fig. 8 to 13. Fig. 8 is a plan view showing an external configuration of the terminal device 7. Fig. 8 (a) is a front view of the terminal device 7, (b) is a top view, (c) is a right side view, and (d) is a bottom view. Fig. 9 is a rear view of the terminal device 7. Fig. 10 and 11 are diagrams showing a state in which the user holds the terminal device 7 in a lateral direction. Fig. 12 and 13 are diagrams showing a state in which the user holds the terminal device 7 in the portrait orientation.
As shown in fig. 8, the terminal device 7 includes a case 50 having a substantially horizontally long rectangular plate-like shape. That is, the terminal device 7 can be said to be a tablet-type information processing device. The housing 50 may have a curved surface or a protrusion in a part thereof, as long as the housing as a whole has a plate-like shape. The housing 50 is sized to be gripped by a user. Therefore, the user can hold and move the terminal device 7 or change the arrangement position of the terminal device 7. The length of the terminal device 7 in the longitudinal direction (z-axis direction) is preferably 100 to 150[ mm ], and is 133.5[ mm ] in the present embodiment. The length of the terminal device 7 in the transverse direction (x-axis direction) is preferably 200 to 250[ mm ], and is 228.26[ mm ] in the present embodiment. The thickness (length in the y-axis direction) of the terminal device 7 is preferably about 15 to 30[ mm ] in the plate-like portion and about 30 to 50[ mm ] including the thickest portion, and is about 23.6[ mm ] (40.26[ mm ] at the thickest part) in the present embodiment. The weight of the terminal device 7 is about 400 to 600[ g ], and is 530[ g ] in the present embodiment. Although the terminal device 7 is a relatively large terminal device (operation device) as described above, it is configured to be easily held and operated by the user, as will be described in detail later.
The terminal device 7 has an LCD 51 on the front surface (front surface side) of the casing 50. The size of the screen of the LCD 51 is preferably 5 inches or more, and here it is 6.2 inches. The operation device 7 of the present embodiment is easy to hold and operate, and is therefore easy to operate even though it is provided with a large LCD. In other embodiments, a smaller LCD 51 may be provided and the size of the operation device 7 may be smaller. The LCD 51 is disposed near the center of the front surface of the housing 50. Therefore, the user can hold and move the terminal device 7 while viewing the screen of the LCD 51 by holding the housing 50 on both sides of the LCD 51 as shown in fig. 10 and 11. Although fig. 10 and 11 show an example in which the user holds the terminal device 7 laterally (in a horizontally long orientation) by holding the housing 50 on both the left and right sides of the LCD 51, the user may also hold the terminal device 7 vertically (in a vertically long orientation) as shown in fig. 12 and 13.
As shown in fig. 8 (a), the terminal device 7 has a touch panel 52 as an operation unit on the screen of the LCD 51. In the present embodiment, the touch panel 52 is a resistive film type touch panel. However, the touch panel is not limited to the resistive film type, and any type of touch panel such as a capacitive type may be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In the present embodiment, a touch panel having the same resolution (detection accuracy) as that of the LCD 51 is used as the touch panel 52. However, the resolution of the touch panel 52 and the resolution of the LCD 51 do not necessarily have to coincide. Although input to the touch panel 52 is normally made with the stylus pen 60, the input is not limited to the stylus pen 60, and the user can also input to the touch panel 52 with a finger. The housing 50 is provided with a storage hole 60a (see fig. 8 (b)) for storing the stylus pen 60 used to operate the touch panel 52. Here, the storage hole 60a is provided in the upper side surface of the housing 50 so that the stylus pen 60 does not fall out, but it may also be provided in a side surface or the lower surface. Since the terminal device 7 includes the touch panel 52 in this manner, the user can operate the touch panel 52 while moving the terminal device 7. That is, the user can directly input to the screen of the LCD 51 (through the touch panel 52) while moving the screen.
As shown in fig. 8, the terminal device 7 includes two analog sticks 53A and 53B and a plurality of buttons (keys) 54A to 54M as operation means (operation sections). Each of the analog sticks 53A and 53B is a device capable of indicating a direction. Each of the analog sticks 53A and 53B is configured such that a movable member (stick portion) operated by a finger of the user can be slid in an arbitrary direction (an arbitrary angle in the up, down, left, right, and oblique directions) with respect to the surface of the housing 50. That is, the analog sticks 53A and 53B are direction input devices sometimes referred to as analog sliders (slide pads). The movable member of each of the analog sticks 53A and 53B may instead be of a type that tilts in an arbitrary direction with respect to the surface of the housing 50. In the present embodiment, since analog sticks of the type in which the movable member slides are used, the user can operate the analog sticks 53A and 53B without greatly moving the thumbs, and can therefore operate them while holding the housing 50 more reliably. In the case where members of the type in which the movable member tilts are used as the analog sticks 53A and 53B, the degree of input (degree of tilt) is easy for the user to recognize, so that detailed operations can be performed more easily.
The left analog stick 53A is provided on the left side of the screen of the LCD 51, and the right analog stick 53B is provided on the right side of the screen of the LCD 51. Therefore, the user can make a direction input using an analog stick with either the left or the right hand. In addition, as shown in fig. 10 and 11, the analog sticks 53A and 53B are provided at positions where the user can operate them while holding the left and right portions of the terminal device 7 (the portions on both the left and right sides of the LCD 51), so that the user can easily operate the analog sticks 53A and 53B even when holding and moving the terminal device 7.
Each of the buttons 54A to 54L is an operation means (operation section) for performing a predetermined input, and is a depressible key. As described below, the buttons 54A to 54L are provided at positions where the user can operate them while holding the left and right portions of the terminal device 7 (see fig. 10 and 11). Therefore, even when the user holds and moves the terminal device 7, the user can easily operate these operation means.
As shown in fig. 8 (a), a cross button (direction input button) 54A and buttons 54B to 54H and 54M among the respective operation buttons 54A to 54L are provided on the front surface of the housing 50. That is, these buttons 54A to 54H and 54M are arranged at positions that can be operated by the thumb of the user (refer to fig. 10 and 11).
The cross button 54A is provided on the left side of the LCD 51 and on the lower side of the left analog stick 53A. That is, the cross button 54A is arranged at a position where the user can operate with the left hand. The cross button 54A has a cross shape and is a button capable of indicating at least the up, down, left, and right directions.
Buttons 54B to 54D are provided on the lower side of the LCD 51. These three buttons 54B to 54D are disposed at positions where they can be operated with either the left or right hand. The terminal device 7 also has a power button 54M for turning the power of the terminal device 7 on/off. The power of the game device 3 can also be turned on/off remotely by operating the power button 54M. The power button 54M is provided below the LCD 51, similarly to the buttons 54B to 54D. The power button 54M is provided on the right side of the buttons 54B to 54D. Thus, the power button 54M is disposed at a position where it can be operated (easily operated) with the right hand. The four buttons 54E to 54H are provided on the right side of the LCD 51 and below the right analog stick 53B. That is, the four buttons 54E to 54H are arranged at positions where the user can operate them with the right hand. Furthermore, the four buttons 54E to 54H are arranged so as to lie above, below, to the left of, and to the right of their common center position. Therefore, the terminal device 7 can also cause the four buttons 54E to 54H to function as buttons with which the user indicates the up, down, left, and right directions.
In the present embodiment, the analog sticks 53A and 53B are disposed above the cross button 54A and the buttons 54E to 54H. Here, the analog sticks 53A and 53B protrude in the thickness direction (y-axis direction) beyond the cross button 54A and the buttons 54E to 54H. Therefore, if the positions of the analog stick 53A and the cross button 54A were reversed, the user might touch the analog stick 53A with the thumb when operating the cross button 54A with the thumb, resulting in erroneous operation. The same problem would occur if the positions of the analog stick 53B and the buttons 54E to 54H were reversed. In contrast, in the present embodiment, since the analog sticks 53A and 53B are disposed above the cross button 54A and the buttons 54E to 54H, the possibility that the user touches the cross button 54A or the buttons 54E to 54H with a finger when operating the analog sticks 53A and 53B is lower than in the above case. As described above, in the present embodiment, the possibility of erroneous operation can be reduced, and the operability of the terminal device 7 can be improved. However, in other embodiments, the analog stick 53A and the cross button 54A may be disposed the other way around, and the analog stick 53B and the buttons 54E to 54H may be disposed the other way around, as required.
Here, in the present embodiment, a plurality of operation sections (the analog sticks 53A and 53B, the cross button 54A, and the three buttons 54E to 54G) are provided on both the left and right sides of the display section (the LCD 51) above the center of the housing 50 in the vertical direction (y-axis direction). When operating these operation sections, the user mainly holds the terminal device 7 at a position above its center in the vertical direction. If the user were to grip the lower side of the housing 50 (particularly when the terminal device 7 has a relatively large size, as in the present embodiment), the terminal device 7 being held would be unstable, and the user could not easily hold the terminal device 7. In contrast, in the present embodiment, when operating the operation sections, the user holds the terminal device 7 mainly at a position above its center in the vertical direction and can support the housing 50 from the sides with the palms. Therefore, the user can hold the housing 50 in a stable state and can easily hold the terminal device 7, so that the operation sections are easy to operate. In another embodiment, at least one operation section may be provided on each of the left and right sides of the display section at a position above the center of the housing 50. For example, only the analog sticks 53A and 53B may be provided above the center of the housing 50. For example, in a case where the cross button 54A is provided above the left analog stick 53A and the four buttons 54E to 54H are provided above the right analog stick 53B, the cross button 54A and the four buttons 54E to 54H may be provided above the center of the housing 50.
In the present embodiment, a protrusion (the brim 59) is provided on the back side of the housing 50 (the side opposite to the front side on which the LCD 51 is provided) (see fig. 8 (c) and fig. 9). As shown in fig. 8 (c), the brim 59 is a mountain-shaped member provided so as to protrude from the back surface of the substantially plate-shaped housing 50. The protrusion has a height (thickness) such that it can catch on the fingers of the user holding the back surface of the housing 50. The height of the protrusion is preferably 10 to 25[ mm ], and is 16.66[ mm ] in the present embodiment. In addition, the lower surface of the protrusion preferably has an inclination of 45° or more (more preferably 60° or more) with respect to the back surface of the housing 50 so that the protrusion catches easily on the user's fingers. As shown in fig. 8 (c), the lower surface of the protrusion may be formed with a larger inclination angle than the upper surface. As shown in fig. 10 and 11, by holding the brim 59 with the fingers (by hooking the brim 59 on the fingers), the user can hold the terminal device 7 in a stable state without becoming tired, even though the terminal device 7 has a relatively large size. That is, the brim 59 can be said to be a support member with which the housing 50 is supported by the fingers, and can also be referred to as a finger-hook portion.
The brim 59 is provided above the center of the housing 50 in the vertical direction. The brim 59 is provided at a position substantially opposite to the operation sections (the analog sticks 53A and 53B) provided on the front surface of the housing 50. That is, the protrusion is provided in a region that includes the positions on the opposite side of the operation sections provided on the left and right of the display section. Therefore, when operating these operation sections, the user can hold the terminal device 7 by supporting the brim 59 with the middle fingers or ring fingers (see fig. 10 and 11). This makes it easier to hold the terminal device 7 and to operate the operation sections. In the present embodiment, since the protrusion has an eaves-like shape extending in the left-right direction, the user can hold the terminal device 7 with the middle fingers or ring fingers placed along the lower surface of the protrusion, which makes the terminal device 7 even easier to hold. The brim 59 is not limited to the shape shown in fig. 9, which extends exactly in the horizontal direction; in other embodiments, the brim 59 may extend in a direction slightly inclined from the horizontal direction. For example, the brim 59 may be inclined upward (or downward) as it extends from the left and right ends toward the center.
In the present embodiment, since the locking hole described later is provided in the brim 59, the brim 59 formed in a brim shape is used as the protrusion formed on the rear surface of the housing, but the protrusion may have any shape. For example, in another embodiment, two protrusions may be provided on both the left and right sides (the protrusions are not provided at the center in the left-right direction) on the back surface side of the housing 50 (see fig. 32). In other embodiments, the cross-sectional shape (cross-sectional shape perpendicular to the x-axis direction) of the protrusion may be a hook-shaped shape (shape with a concave lower surface) so that the terminal device 7 can be supported by the user's finger more reliably (so that the protrusion is hooked on the finger more reliably).
The width of the protrusion (brim 59) in the vertical direction may be set arbitrarily. For example, the protrusion may be formed so as to extend to the upper edge of the housing 50. That is, the upper surface of the protrusion may be formed at the same position as the upper side surface of the housing 50. In this case, the housing 50 has a two-step structure with a thinner lower side and a thicker upper side. In this way, it is preferable that surfaces facing downward (the lower surfaces of the protrusion) be formed on both the left and right sides of the back surface of the housing 50. Thereby, the user can easily hold the operation device by placing fingers against these surfaces. The "downward-facing surfaces" may be formed at any position on the back surface of the housing 50, but are preferably located above the center of the housing 50.
As shown in fig. 8 (a), (b), and (c), the first L button 54I and the first R button 54J are provided on the left and right sides of the upper surface of the housing 50, respectively. In the present embodiment, the first L button 54I and the first R button 54J are provided at the obliquely upper portions (the upper left portion and the upper right portion) of the housing 50. Specifically, the first L button 54I is provided at the left end of the upper side surface of the plate-shaped housing 50 and is exposed from the upper left side surface (in other words, exposed from both the upper side surface and the left side surface). The first R button 54J is provided at the right end of the upper side surface of the housing 50 and is exposed from the upper right side surface (in other words, exposed from both the upper side surface and the right side surface). In this way, the first L button 54I is disposed at a position where the user can operate it with the left index finger, and the first R button 54J is disposed at a position where the user can operate it with the right index finger (see fig. 10). In other embodiments, the operation sections provided on the left and right of the upper surface of the housing 50 may be provided at positions other than the left and right end portions. The operation sections may also be provided on the left and right side surfaces of the housing 50.
As shown in fig. 8 (c) and fig. 9, the second L button 54K and the second R button 54L are disposed on the protrusion (brim 59). The second L button 54K is provided near the left end of the brim 59. The second R button 54L is provided near the right end of the brim 59. That is, the second L button 54K is provided at a position slightly above the left side (left side when viewed from the front side) of the back surface of the housing 50, and the second R button 54L is provided at a position slightly above the right side (right side when viewed from the front side) of the back surface of the housing 50. In other words, the second L button 54K is provided at a position (substantially) opposite to the left analog stick 53A provided on the front surface, and the second R button 54L is provided at a position (substantially) opposite to the right analog stick 53B provided on the front surface. In this way, the second L button 54K is disposed at a position where the user can operate it with the middle finger or index finger of the left hand, and the second R button 54L is disposed at a position where the user can operate it with the middle finger or index finger of the right hand (see fig. 10 and 11). As shown in fig. 8 (c), the second L button 54K and the second R button 54L are provided on the upper surface of the brim 59. Therefore, the second L button 54K and the second R button 54L have button surfaces facing upward (obliquely upward). Since the middle fingers or index fingers of the user holding the terminal device 7 are considered to move in the up-down direction, making the button surfaces face upward allows the user to press the second L button 54K and the second R button 54L easily.
As described above, in the present embodiment, operation sections (the analog sticks 53A and 53B) are provided on the left and right sides of the display section (the LCD 51) above the center of the housing 50, and further operation sections (the second L button 54K and the second R button 54L) are provided on the back side of the housing 50 at positions opposite to those operation sections. Since the former and the latter operation sections are disposed at mutually opposite positions on the front side and the back side of the housing 50, the user can grip the housing 50 from the front side and the back side while operating them. When operating these operation sections, the user holds the housing 50 at a position above its center in the vertical direction, so the user can hold the terminal device 7 on its upper side and support the terminal device 7 with the palms (see fig. 10 and 11). As described above, the user can stably hold the housing 50 in a state in which at least four operation sections can be operated, and an operation device (terminal device 7) that the user can easily hold and that has good operability can thus be provided.
As described above, in the present embodiment, the user can easily grip the terminal device 7 by gripping the terminal device 7 with fingers in contact with the lower surface of the protruding portion (eaves portion 59). In addition, since the second L button 54K and the second R button 54L are provided on the upper surface of the protruding portion, the user can easily operate these buttons in the above state. The user can easily hold the terminal device 7 by, for example, a hand-holding method as follows.
That is, the user can grip the terminal device 7 by bringing the ring finger into contact with the lower surface (the single-dot chain line shown in fig. 10) of the brim 59 as shown in fig. 10 (so as to support the brim 59 with the ring finger). At this time, the user can operate the four buttons (the first L button 54I, the first R button 54J, the second L button 54K, and the second R button 54L) with the index finger and the middle finger. For example, when the game operation to be performed is relatively complicated due to a large number of buttons to be used, many buttons can be easily operated by holding the game as shown in fig. 10. In addition, since the analog sticks 53A and 53B are provided on the upper side of the cross button 54A and the buttons 54E to 54H, the user can operate the analog sticks 53A and 53B with the thumbs in a case where a relatively complicated operation is required, which is convenient. In fig. 10, the user holds the terminal device 7 by placing the thumb on the front surface of the case 50, the index finger on the upper surface of the case 50, the middle finger on the upper surface of the brim 59 on the rear surface of the case 50, the ring finger on the lower surface of the brim 59, and the little finger on the rear surface of the case 50. In this way, the user can reliably hold the terminal device 7 by surrounding the housing 50 from all around.
As shown in fig. 11, the user can hold the terminal device 7 by bringing the middle finger into contact with the lower surface of the brim 59 (indicated by the one-dot chain line in fig. 11). At this time, the user can easily operate the two buttons (the second L button 54K and the second R button 54L) with the index finger. For example, if the game operation to be performed is relatively simple with a small number of buttons, the game may be held as shown in fig. 11. In fig. 11, the user can hold the lower side of the housing 50 with two fingers (ring finger and little finger), and thus can hold the terminal device 7 reliably.
In the present embodiment, the lower surface of the brim 59 is positioned between the analog sticks 53A and 53B and the cross button 54A and the four buttons 54E to 54H (below the analog sticks 53A and 53B and above the cross button 54A and the four buttons 54E to 54H). Therefore, when the terminal device 7 is held with the ring fingers placed against the brim 59 (fig. 10), the analog sticks 53A and 53B are easy to operate with the thumbs, and when the terminal device 7 is held with the middle fingers placed against the brim 59 (fig. 11), the cross button 54A and the four buttons 54E to 54H are easy to operate with the thumbs. That is, in either case, the user can perform a direction input operation while holding the terminal device 7 reliably.
In addition, as described above, the user can also hold the terminal device 7 vertically. That is, as shown in fig. 12, the user can grip the terminal device 7 vertically by gripping the upper edge of the terminal device 7 with the left hand. Further, as shown in fig. 13, the user can grip the terminal device 7 vertically by gripping the lower edge of the terminal device 7 with the left hand. Although fig. 12 and 13 show the case where the terminal device 7 is held by the left hand, the terminal device 7 may be held by the right hand. In this way, since the user can hold the terminal device 7 with one hand, for example, the user can perform an operation of holding the terminal device 7 with one hand and inputting an input to the touch panel 52 with the other hand.
In the case of holding the terminal device 7 by the holding method shown in fig. 12, the user can hold the terminal device 7 securely by placing fingers (middle finger, ring finger, and little finger in fig. 12) other than the thumb on the lower surface of the brim 59 (the chain line shown in fig. 12). In particular, in the present embodiment, since the eaves portion 59 is formed to extend in the left-right direction (in the up-down direction in fig. 12), the user can place fingers other than the thumb on the eaves portion 59 regardless of the position of the upper edge of the terminal device 7, and can reliably grip the terminal device 7. That is, in the case where the terminal device 7 is used in the longitudinal direction, the eaves 59 can be used as a handle. On the other hand, when the terminal device 7 is held by the hand-holding method shown in fig. 13, the user can operate the buttons 54B to 54D with the left hand. Therefore, for example, the buttons 54B to 54D can be operated by the hand holding the terminal device 7 while the input to the touch panel 52 is performed by one hand, and more operations can be performed.
In the terminal device 7 of the present embodiment, since the protrusion (eaves 59) is provided on the rear surface, when the terminal device 7 is placed with the screen of the LCD 51 (the front surface of the housing 50) facing upward, the screen is slightly inclined. This makes it easier to view the screen in the state where the terminal device 7 is mounted. In addition, it is easy to perform an input operation on the touch panel 52 in a state where the terminal device 7 is mounted. In another embodiment, an additional protrusion having a height similar to that of the brim 59 may be formed on the rear surface of the housing 50. Accordingly, in a state where the screen of the LCD 51 is directed upward, the terminal device 7 can be placed so that the screen is horizontal by the contact of the respective protrusions with the placement surface. The additional protrusion may be a member that can be attached and detached (or folded). This makes it possible to place the terminal device in two states, i.e., a state in which the screen is slightly inclined and a state in which the screen is horizontal. That is, when the terminal device 7 is placed and used, the eaves 59 can be used as the feet.
The buttons 54A to 54L are assigned functions corresponding to the game program as appropriate. For example, the cross button 54A and the buttons 54E to 54H may be used for a direction instruction operation, a selection operation, and the like, and the buttons 54B to 54E may be used for a confirmation operation, a cancellation operation, and the like. Terminal device 7 may have a button for turning on/off the screen display of LCD 51 and a button for performing connection setting (pairing) with game device 3.
As shown in fig. 8 (a), the terminal device 7 includes a marker 55 including a marker 55A and a marker 55B on the front surface of the housing 50. The flag 55 is provided on the upper side of the LCD 51. Each of the markers 55A and 55B is composed of one or more infrared LEDs, as in the markers 6R and 6L of the marker device 6. The infrared LEDs constituting the markers 55A and 55B are disposed inside the window portion through which infrared light passes. The marker 55 is used when the game device 3 calculates the movement of the controller 5, as in the marker 6 described above. Further, the game device 3 can control the lighting of each infrared LED provided in the marker section 55.
The terminal device 7 includes a camera 56 as an imaging unit. The camera 56 includes an image pickup element (e.g., a CCD image sensor, a CMOS image sensor, etc.) having a predetermined resolution, and a lens. As shown in fig. 8, in the present embodiment, the camera 56 is provided on the front surface of the housing 50. Therefore, the camera 56 can capture the face of the user holding the terminal device 7, and can, for example, capture the user playing a game while viewing the LCD 51. In the present embodiment, the camera 56 is disposed between the two markers 55A and 55B.
The terminal device 7 further includes a microphone 79 as an audio input means. A microphone hole 50c is provided in the front surface of the housing 50. The microphone 79 is provided inside the housing 50 behind the microphone hole 50c. The microphone 79 detects sounds around the terminal device 7, such as the user's voice.
The terminal device 7 includes a speaker 77 as an audio output means. As shown in fig. 8 (d), speaker holes 57 are provided on the lower side of the front surface of the housing 50. The output sound of the speaker 77 is output from the speaker holes 57. In the present embodiment, the terminal device 7 includes two speakers, and a speaker hole 57 is provided at the position of each of the left speaker and the right speaker. The terminal device 7 is provided with a knob 64 for adjusting the sound volume of the speaker 77. The terminal device 7 also includes an audio output terminal 62 for connecting an audio output unit such as an earphone. Here, in consideration of an additional device being connected to the lower side surface of the housing, the audio output terminal 62 and the knob 64 are provided on the upper side surface of the housing 50, but they may also be provided on the left or right side surface or on the lower side surface.
The housing 50 is provided with a window 63 for emitting an infrared signal from the infrared communication module 82 to the outside of the terminal device 7. Here, in order to emit an infrared signal to the front of the user while holding both sides of the LCD 51, a window 63 is provided on the upper side surface of the housing 50. However, in other embodiments, the window 63 may be provided at any position such as the back surface of the housing 50.
The terminal device 7 includes an extension connector 58 for connecting another device to the terminal device 7. The extension connector 58 is a communication terminal for transmitting and receiving data (information) to and from another device connected to the terminal device 7. In the present embodiment, as shown in fig. 8 (d), the extension connector 58 is provided on the lower side surface of the housing 50. The additional device connected to the extension connector 58 may be any device, for example, an input device such as a controller used in a specific game (a gun-type controller or the like) or a keyboard. The extension connector 58 need not be provided if no additional device needs to be connected. The extension connector 58 may also include a terminal for supplying power to the additional device or a terminal for charging.
In addition to the extension connector 58, the terminal device 7 has a charging terminal 66 for taking in power from an additional device. When the charging terminal 66 is connected to a cradle (stand) 210 described later, electric power is supplied from the cradle 210 to the terminal device 7. In the present embodiment, the charging terminal 66 is provided on the lower side surface of the housing 50. Therefore, when the terminal device 7 is connected to an additional device (for example, the input device 200 shown in fig. 15 or the input device 220 shown in fig. 17), information can be transmitted and received through the extension connector 58, and at the same time power can be supplied from one device to the other. By providing the charging terminals 66 around the extension connector 58 (on both its left and right sides) in this way, power can be supplied while information is transmitted and received when the terminal device 7 is connected to an additional device. In addition, the terminal device 7 has a charging connector, and the housing 50 has a cover portion 61 for protecting the charging connector. The charging connector can be connected to a charger 86 described later, and when the charging connector is connected to the charger, power is supplied from the charger 86 to the terminal device 7. In the present embodiment, in consideration of an additional device being connected to the lower side surface of the housing, the charging connector (cover portion 61) is provided on the upper side surface of the housing 50, but it may also be provided on the left or right side surface or on the lower side surface.
In addition, the terminal device 7 has a battery cover 67 that is attachable and detachable with respect to the housing 50. A battery (battery 85 shown in fig. 14) is disposed inside the battery cover 67. In the present embodiment, the battery cover 67 is provided on the rear surface side of the case 50 and below the protruding portion (brim portion 59).
In addition, holes 65a and 65b for tying a string are provided in the housing 50 of the terminal device 7. As shown in fig. 8 (d), in the present embodiment, holes 65a and 65b are provided in the lower surface of the housing 50. In addition, in the present embodiment, two holes 65a and 65b are provided one on each of the right and left sides of the housing 50. That is, the hole 65a is provided on the lower surface of the housing 50 on the left side of the center thereof, and the hole 65b is provided on the lower surface of the housing 50 on the right side of the center thereof. The user can tie the lanyard to either of the holes 65a and 65b and hang the lanyard on his or her wrist. Thus, if the user drops the terminal device 7 or the terminal device 7 is taken off the hand, the terminal device 7 can be prevented from dropping or colliding with other things. In addition, in the present embodiment, since the holes are provided on the left and right sides, the user can hang the hanging rope on either hand, which is convenient.
In the terminal device 7 shown in fig. 8 to 13, the shape of each operation button and the housing 50, the number of components, the installation position, and the like are merely examples, and other shapes, numbers, and installation positions may be used.
Next, the internal configuration of the terminal device 7 will be described with reference to fig. 14. Fig. 14 is a block diagram showing the internal configuration of the terminal device 7. As shown in fig. 14, the terminal device 7 includes, in addition to the configuration shown in fig. 8, a touch panel controller 71, a magnetic sensor 72, an acceleration sensor 73, a gyro sensor 74, a user interface controller (UI controller) 75, a codec LSI 76, a speaker 77, an audio IC (Integrated Circuit) 78, a microphone 79, a wireless module 80, an antenna 81, an infrared communication module 82, a flash memory 83, a power supply IC 84, a battery 85, and a vibrator 89. These electronic components are mounted on a circuit board and housed in the housing 50.
The UI controller 75 is a circuit for controlling input/output of data to/from various input/output units. The UI controller 75 is connected to the touch panel controller 71, the analog sticks 53 (analog sticks 53A and 53B), the operation buttons 54 (operation buttons 54A to 54L), the marker section 55, the magnetic sensor 72, the acceleration sensor 73, the gyro sensor 74, and the vibrator 89. Further, the UI controller 75 is connected to the codec LSI 76 and the extension connector 58. The power supply IC 84 is connected to the UI controller 75, and power is supplied to each unit through the UI controller 75. The built-in battery 85 is connected to the power supply IC 84 and supplies electric power. Further, a charger 86 or a cable that can receive power from an external power supply can be connected to the power supply IC 84 via a charging connector, and the terminal device 7 can be supplied with power and charged from the external power supply by means of the charger 86 or the cable. The terminal device 7 can also be charged by attaching it to a cradle having a charging function. That is, a cradle (the cradle 210 shown in fig. 20) that can receive power from an external power supply can be connected to the power supply IC 84 via the charging terminal 66, and the terminal device 7 can be supplied with power and charged from the external power supply by means of the cradle.
The touch panel controller 71 is a circuit connected to the touch panel 52 and controls the touch panel 52. The touch panel controller 71 generates touch position data in a predetermined format based on a signal from the touch panel 52, and outputs the generated touch position data to the UI controller 75. The touch position data indicates coordinates of a position input on the input surface of the touch panel 52. The touch panel controller 71 reads a signal from the touch panel 52 and generates touch position data at a rate of once every predetermined time. In addition, various control instructions for the touch panel 52 are output from the UI controller 75 to the touch panel controller 71.
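As a simple illustration of generating touch position data of a predetermined format, the sketch below converts a raw resistive-panel reading into coordinates; the panel resolution and the raw value range are assumptions made for this example and are not specified by the embodiment.

```python
# Illustrative sketch (assumed values, not the actual controller firmware):
# converting a raw resistive-panel reading into touch position data whose
# coordinate range matches an assumed LCD resolution.
WIDTH, HEIGHT = 854, 480      # assumed resolution of the LCD / touch panel

def to_touch_position(raw_x, raw_y, raw_max=4095):
    """Convert raw ADC readings (0..raw_max) into panel coordinates."""
    x = int(raw_x * (WIDTH - 1) / raw_max)
    y = int(raw_y * (HEIGHT - 1) / raw_max)
    return x, y

# One such reading per predetermined interval would be reported to the UI controller.
print(to_touch_position(2048, 1024))  # -> (426, 119)
```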
The analog stick 53 outputs stick data indicating the direction and amount of the stick section sliding (or tilting) operated by the user's finger to the UI controller 75. The operation buttons 54 output operation button data indicating the input status (whether or not they are pressed) to the operation buttons 54A to 54L to the UI controller 75.
The magnetic sensor 72 detects the azimuth by detecting the magnitude and direction of the magnetic field. Azimuth data indicating the detected azimuth is output to the UI controller 75. Further, control instructions for the magnetic sensor 72 are output from the UI controller 75 to the magnetic sensor 72. As the magnetic sensor 72, a sensor using an MI (Magnetic Impedance) element, a fluxgate sensor, a Hall element, a GMR (Giant Magnetoresistance) element, a TMR (Tunnel Magnetoresistance) element, an AMR (Anisotropic Magnetoresistance) element, or the like may be used, but any sensor may be used as long as it can detect the azimuth. Strictly speaking, in a place where a magnetic field other than geomagnetism is generated, the obtained azimuth data does not indicate the true azimuth; even in this case, however, the azimuth data changes when the terminal device 7 moves, so a change in the posture of the terminal device 7 can still be calculated.
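For illustration, an azimuth can be derived from the horizontal components of the detected magnetic field as sketched below, assuming the device is held level; the axis convention and units are assumptions made for this example.

```python
# Illustrative sketch (not from the embodiment): deriving an azimuth (heading)
# from the horizontal components of the detected magnetic field, assuming the
# device is held level.
import math

def azimuth_degrees(mx, my):
    """mx, my: horizontal magnetic-field components in arbitrary units.
    Returns a heading in [0, 360) degrees, 0 = magnetic north (assumed
    axis convention)."""
    heading = math.degrees(math.atan2(my, mx))
    return heading % 360.0

print(azimuth_degrees(1.0, 0.0))   # 0.0   -> facing magnetic north
print(azimuth_degrees(0.0, -1.0))  # 270.0 -> facing west under this convention
```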
The acceleration sensor 73 is provided inside the housing 50, and detects the magnitude of linear acceleration in three axial directions (the xyz axes shown in fig. 8 (a)). Specifically, the acceleration sensor 73 detects the magnitude of linear acceleration along each axis, with the long-side direction of the housing 50 as the x axis, the direction perpendicular to the front surface of the housing 50 as the y axis, and the short-side direction of the housing 50 as the z axis. Acceleration data indicating the detected acceleration is output to the UI controller 75. Further, control instructions for the acceleration sensor 73 are output from the UI controller 75 to the acceleration sensor 73. In the present embodiment, the acceleration sensor 73 is, for example, a capacitive MEMS acceleration sensor, but in other embodiments, other types of acceleration sensors may be used. The acceleration sensor 73 may also be an acceleration sensor that detects acceleration in one or two axial directions.
The gyro sensor 74 is provided inside the housing 50, and detects angular velocities about the three axes, namely the x axis, the y axis, and the z axis. Angular velocity data indicating the detected angular velocities is output to the UI controller 75. Further, control instructions for the gyro sensor 74 are output from the UI controller 75 to the gyro sensor 74. The number and combination of gyro sensors used to detect the angular velocities about the three axes may be arbitrary; like the gyro sensor 48, the gyro sensor 74 may be configured by a two-axis gyro sensor and a single-axis gyro sensor. The gyro sensor 74 may also be a gyro sensor that detects angular velocity about one or two axes.
The vibrator 89 is, for example, a vibration motor or a solenoid, and is connected to the UI controller 75. The vibrator 89 is operated in accordance with an instruction from the UI controller 75, thereby generating vibration in the terminal device 7. This enables a so-called vibration-assisted game in which the vibration is transmitted to the hand of the user holding the terminal device 7.
The UI controller 75 outputs operation data including the touch position data, the joystick data, the operation button data, the azimuth data, the acceleration data, and the angular velocity data received from the above-described components to the codec LSI 76. In the case where another device is connected to the terminal device 7 via the expansion connector 58, the operation data may include data indicating an operation on the other device.
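The kind of record the UI controller 75 could assemble before handing it to the codec LSI 76 might look like the following sketch; the field names and types are assumptions made for illustration and do not represent the actual data format.

```python
# Minimal sketch of a record aggregating the operation data listed above;
# field names are assumptions, not the embodiment's actual format.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TerminalOperationData:
    touch_position: Optional[Tuple[int, int]]               # touch position data
    stick: Tuple[Tuple[float, float], Tuple[float, float]]  # left/right stick data
    buttons: int                                             # operation button bitmask
    azimuth: Tuple[float, float, float]                      # magnetic sensor data
    acceleration: Tuple[float, float, float]                 # acceleration data
    angular_velocity: Tuple[float, float, float]             # angular velocity data
    extension: bytes = b""                                   # data from a device on the extension connector, if any

sample = TerminalOperationData(
    touch_position=(320, 200),
    stick=((0.0, 1.0), (0.0, 0.0)),
    buttons=0b0000_0000_0000_0010,
    azimuth=(0.2, 0.0, 0.4),
    acceleration=(0.0, -1.0, 0.0),
    angular_velocity=(0.0, 0.0, 0.0),
)
print(sample)
```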
The codec LSI 76 is a circuit that performs compression processing on data transmitted to the game device 3 and decompression processing on data transmitted from the game device 3. The codec LSI 76 is connected to the LCD 51, the camera 56, the audio IC 78, the wireless module 80, the flash memory 83, and the infrared communication module 82. The codec LSI 76 also includes a CPU 87 and an internal memory 88. Although the terminal device 7 is configured not to perform game processing itself, it needs to execute a minimum program for its own management and communication. When the power is turned on, the program stored in the flash memory 83 is read into the internal memory 88 and executed by the CPU 87, whereby the terminal device 7 is activated. A part of the area of the internal memory 88 is used as a VRAM for the LCD 51.
The camera 56 captures an image in accordance with an instruction from the game device 3, and outputs the captured image data to the codec LSI 76. Further, a control instruction to the camera 56, such as an image capturing instruction of an image, is output from the codec LSI 76 to the camera 56. The camera 56 can also take a moving image. That is, the camera 56 can repeatedly perform image pickup and repeatedly output image data to the codec LSI 76.
The audio IC 78 is a circuit connected to the speaker 77 and the microphone 79, and controls the input and output of audio data to and from the speaker 77 and the microphone 79. That is, when audio data is received from the codec LSI 76, the audio IC 78 outputs an audio signal obtained by D/A conversion of the audio data to the speaker 77, and sound is output from the speaker 77. The microphone 79 detects sound transmitted to the terminal device 7 (such as the user's voice), and outputs an audio signal indicating the sound to the audio IC 78. The audio IC 78 performs A/D conversion on the audio signal from the microphone 79 and outputs audio data of a predetermined format to the codec LSI 76.
The codec LSI 76 transmits the image data from the camera 56, the sound data from the microphone 79, and the terminal operation data from the UI controller 75 to the game device 3 through the wireless module 80. In the present embodiment, the codec LSI 76 performs the same compression processing as the codec LSI 27 on the image data and the sound data. The terminal operation data and the compressed image data and sound data are output to the wireless module 80 as transmission data. The antenna 81 is connected to the wireless module 80, and the wireless module 80 transmits the transmission data to the game device 3 via the antenna 81. The wireless module 80 has the same function as the terminal communication module 28 of the game device 3. That is, the wireless module 80 has a function of connecting to a wireless LAN by a system conforming to, for example, the IEEE 802.11n standard. The transmitted data may be encrypted as necessary, or may be left unencrypted.
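The sketch below illustrates, in a generic way, bundling operation data with compressed image and sound data into one unit of transmission data; zlib is used here only as a stand-in for the compression performed by the codec LSI 76, whose actual scheme is not described here, and the header layout is an assumption for this example.

```python
# Illustrative sketch: bundling operation data with compressed image and
# sound data into one transmission unit. zlib stands in for the codec LSI's
# actual (unspecified) compression scheme; the layout is assumed.
import json
import zlib

def build_transmission_data(operation_data: dict,
                            image_data: bytes,
                            sound_data: bytes) -> bytes:
    compressed_image = zlib.compress(image_data)
    compressed_sound = zlib.compress(sound_data)
    header = json.dumps({
        "operation": operation_data,
        "image_len": len(compressed_image),
        "sound_len": len(compressed_sound),
    }).encode("utf-8")
    # Length-prefixed header followed by the two compressed payloads.
    return len(header).to_bytes(4, "little") + header + compressed_image + compressed_sound

packet = build_transmission_data({"buttons": 1}, b"\x00" * 1024, b"\x01" * 512)
print(len(packet))
```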
As described above, the transmission data transmitted from the terminal device 7 to the game device 3 includes the operation data (terminal operation data), the image data, and the sound data. When another device is connected to the terminal device 7 via the extension connector 58, the transmission data may also include data received from that other device. The infrared communication module 82 performs infrared communication with other devices in accordance with, for example, the IrDA (Infrared Data Association) standard. The codec LSI 76 may include data received by infrared communication in the transmission data and transmit it to the game device 3 as necessary.
As described above, compressed image data and sound data are transmitted from the game device 3 to the terminal device 7. These data are received by the codec LSI 76 via the antenna 81 and the wireless module 80. The codec LSI 76 decompresses the received image data and sound data. The decompressed image data is output to the LCD 51, whereby an image is displayed on the LCD 51. That is, the codec LSI 76 (CPU 87) causes the display unit to display the received image data. The decompressed sound data is output to the audio IC 78, and the audio IC 78 outputs sound from the speaker 77.
When the data received from the game device 3 includes control data, the codec LSI 76 and the UI controller 75 issue control instructions to the respective units in accordance with the control data. As described above, the control data is data indicating control instructions to the components included in the terminal device 7 (in the present embodiment, the camera 56, the touch panel controller 71, the marker section 55, the sensors 72 to 74, and the infrared communication module 82). In the present embodiment, the control instructions indicated by the control data are considered to be instructions to operate or suspend (stop) the above-described components. That is, components that are not used in the game may be suspended in order to suppress power consumption; in that case, the transmission data transmitted from the terminal device 7 to the game device 3 is made not to include data from the suspended components. Since the marker section 55 is composed of infrared LEDs, controlling it simply amounts to starting/stopping the supply of power.
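As an illustration of how such control data could be interpreted on the terminal side, the sketch below suspends the named components and omits their data from subsequent transmission; the component names and message format are assumptions made for this example, not the embodiment's actual control data.

```python
# Sketch (assumed names/format): interpreting control data to suspend unused
# components and omit their data from the transmission data.
active = {"camera": True, "touch_panel": True, "marker": True, "sensors": True}

def apply_control_data(control: dict) -> None:
    """control maps a component name to True (operate) or False (suspend)."""
    for name, enabled in control.items():
        if name in active:
            active[name] = enabled   # e.g. stop supplying power to marker LEDs

def collect_transmission_data(readings: dict) -> dict:
    """Only include data from components that are currently active."""
    return {name: value for name, value in readings.items() if active.get(name)}

apply_control_data({"camera": False})          # suspend the camera to save power
print(collect_transmission_data({"camera": b"...", "sensors": (0.0, 0.0, 1.0)}))
```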
As described above, the terminal device 7 includes the operation means such as the touch panel 52, the analog stick 53, and the operation buttons 54, but in other embodiments, another operation means may be provided instead of or in addition to these operation means.
The terminal device 7 includes the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74 as sensors for calculating the movement (including the position, the orientation, or the change in the position or the orientation) of the terminal device 7, but may be configured to include only one or two of these sensors in another embodiment. In other embodiments, other sensors may be provided instead of or in addition to these sensors.
Although the terminal device 7 includes the camera 56 and the microphone 79, in other embodiments, the camera 56 and the microphone 79 may not be provided, and only one of the camera 56 and the microphone 79 may be provided.
The terminal device 7 includes the marker section 55 as a configuration for calculating the positional relationship between the terminal device 7 and the controller 5 (the position, orientation, and the like of the terminal device 7 as viewed from the controller 5), but in another embodiment, the marker section 55 may be omitted. In other embodiments, the terminal device 7 may include other means for calculating the positional relationship. For example, in another embodiment, the controller 5 may include a marker section and the terminal device 7 may include an imaging element. In this case, the marker device 6 may also be configured to include an imaging element instead of infrared LEDs.
(Structure of additional device)
Next, an example of an add-on device that can be attached (connected) to the terminal device 7 will be described with reference to fig. 15 to 20. The attachment may have any function, and may be, for example, an additional operation device attached to the terminal device 7 for performing a predetermined operation, a charger for supplying power to the terminal device 7, or a cradle for placing the terminal device 7 in a predetermined posture.
As shown in fig. 8 (d) and 9, locking holes 59a and 59b capable of locking claw portions of the attachment device are provided in the lower surface of the protrusion (brim 59). The locking holes 59a and 59b are used when other additional devices are connected to the terminal device 7. That is, the attachment has claw portions capable of being engaged in the engaging holes 59a and 59b, and when the attachment is connected to the terminal device 7, the terminal device 7 and the attachment are fixed by engaging the claw portions in the engaging holes 59a and 59 b. Further, screw holes may be provided in the locking holes 59a and 59b, and the attachment may be firmly fixed by screws. Here, the protruding portion provided on the rear surface of the terminal device 7 is a brim portion 59 having a brim shape. That is, the eaves 59 are provided to extend in the left-right direction. As shown in fig. 9, the locking holes 59a and 59b are provided near the center (in the left-right direction) of the lower surface of the eaves 59. The number of the locking holes 59a and 59b provided in the lower surface of the brim 59 may be any number, but in the case of one locking hole, it is preferably provided in the center of the brim 59, and in the case of a plurality of locking holes, they are preferably arranged symmetrically. Thus, the attachment can be stably connected while maintaining a uniform left-right balance. In addition, when the locking hole is provided near the center, the attachment can be made smaller in size than when the locking hole is provided at both the left and right ends. That is, the brim 59 can be used as a locking member of the attachment.
In the present embodiment, as shown in fig. 8 (d), locking holes 50a and 50b are provided in the lower surface of the housing 50. Therefore, when the attachment is connected to the terminal device 7, the terminal device 7 and the attachment are fixed by the four claw portions being respectively engaged with the four engagement holes. Thereby, the additional device can be more firmly connected to the terminal device 7. Further, screw holes may be provided in the locking holes 50a and 50b to fix the attachment with screws. In other embodiments, the locking hole provided in the housing may be disposed in any arbitrary arrangement.
Fig. 15 and 16 are diagrams showing an example of mounting an add-on device to the terminal device 7. Fig. 15 is a view of the terminal device 7 and the input device 200 viewed from the front side of the terminal device 7, and fig. 16 is a view of the terminal device 7 and the input device 200 viewed from the back side of the terminal device 7. In fig. 15 and 16, an input device 200 as an example of an additional device is mounted on the terminal device 7.
The input device 200 includes a first grip portion 200a and a second grip portion 200b. Each of the grip portions 200a and 200b has a rod-like (columnar) shape and can be held with one hand of a user. The user can use the input device 200 (and the terminal device 7) while holding only one of the grip portions 200a and 200b, or while holding both of them. The input device 200 may have only one grip portion. The input device 200 further includes a support portion 205. In the present embodiment, the support portion 205 supports the rear surface (back surface) of the terminal device 7. Specifically, the support portion 205 has four claw portions (convex portions) that can be engaged in the locking holes 50a, 50b, 59a, and 59b, respectively.
As shown in fig. 15, when the input device 200 is connected to the terminal device 7, the terminal device 7 and the input device 200 are fixed to each other by engaging the four claw portions in the four locking holes 50a, 50b, 59a, and 59b, respectively. This enables the input device 200 to be firmly fixed to the terminal device 7. In another embodiment, in addition to (or instead of) the engagement of the claw portions with the locking holes, the input device 200 may be fixed to the terminal device 7 even more firmly by fastening the input device 200 and the terminal device 7 together with screws or the like. The position of the screw fastening may be arbitrary; for example, the support portion 205 of the input device 200, which abuts against the rear surface of the housing 50, and the eaves portion 59 may be fixed together by screws.
In this way, in the present embodiment, the additional device can be reliably fixed to the terminal device 7 through the locking holes 59a and 59b. The terminal device 7 includes sensors (the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74) for detecting its movement and inclination, and therefore the terminal device 7 itself can be moved while being used. For example, when the input device 200 shown in fig. 15 and 16 is connected to the terminal device 7, the user may hold the grip portion 200a and/or 200b of the input device 200 and operate the input device 200 by moving it like a gun. When the terminal device 7 itself is moved during use as in the present embodiment, it is particularly effective to fix the additional device reliably by means of the locking holes 59a and 59b.
In the present embodiment, the support portion 205 detachably supports the terminal device 7 such that the screen of the LCD 51 is oriented substantially vertically when the first grip portion 200a (or the second grip portion 200b) is oriented vertically. Each of the grip portions 200a and 200b is formed to be substantially parallel to the display portion (the front surface of the housing 50) of the terminal device 7 connected to the input device 200. In other words, each of the grip portions 200a and 200b is oriented along the up-down direction of the display portion of the terminal device 7 connected to the input device 200. In this way, the input device 200 is connected to the terminal device 7 in a posture in which the display portion of the terminal device 7 faces the user when the user holds the input device 200. Since the user can orient the screen of the display portion toward himself or herself by holding (at least one of) the grip portions 200a and 200b substantially vertically, the user can operate the input device 200 while viewing the screen of the display portion. In the present embodiment, the second grip portion 200b is oriented substantially parallel to the first grip portion 200a, but in other embodiments, at least one grip portion may be oriented substantially parallel to the screen of the LCD 51. Thus, the user can easily hold the input device 200 (and the terminal device 7) with the LCD 51 directed toward himself or herself simply by holding that grip portion.
In the above embodiment, the support portion 205 is provided on the connection member 206 that connects the first grip portion 200a and the second grip portion 200b. That is, since the support portion 205 is provided between the two grip portions 200a and 200b, the terminal device 7 connected to the input device 200 is disposed between the two grip portions 200a and 200b. In this case, the center of gravity of the operation device (operation system) composed of the terminal device 7 and the input device 200 is located between the two grip portions 200a and 200b, so the user can easily hold the operation device by gripping the two grip portions 200a and 200b. Further, in the above embodiment, one grip portion (the first grip portion 200a) is provided at a position on the front side of the screen of the terminal device 7 mounted on the input device 200, and the other grip portion (the second grip portion 200b) is provided at a position on the rear side of the screen. Therefore, the user can easily hold the operation device by gripping the two grip portions with one hand in front of the screen and the other hand behind the screen, in a manner similar to holding a gun. Accordingly, the operation device is particularly suitable for, for example, a shooting game in which a game operation is performed with the operation device regarded as a gun.
The input device 200 includes a first button 201, a second button 202, a third button 203, and a rocker 204 as an operation unit. The buttons 201 to 203 are buttons (keys) that can be pressed by the user. The rocker 204 is a device capable of indicating a direction. The operation unit is preferably provided at a position where it can be operated with a finger of the hand gripping a grip portion. In the present embodiment, the first button 201, the second button 202, and the rocker 204 are provided at positions that can be operated with the thumb or the index finger of the hand gripping the first grip portion 200a. The third button 203 is provided at a position that can be operated with the index finger of the hand gripping the second grip portion 200b.
The input device 200 may also include an imaging device (imaging unit). For example, the input device 200 may have the same configuration as the imaging information calculation unit 35 provided in the controller 5. In this case, the imaging element of the imaging information calculation unit may be disposed so as to capture an image in the frontward direction of the input device 200 (behind the screen of the terminal device 7). For example, an infrared filter may be disposed at the position of the third button 203 instead of the third button 203, and the imaging element may be disposed inside the infrared filter. Thus, if the user uses the input device 200 with its front directed toward the television 2 (the marker device 6), the game device 3 can calculate the direction and position of the input device 200. Therefore, the user can perform an operation of orienting the input device 200 in a desired direction, that is, an intuitive and easy operation using the input device 200. The input device 200 may be configured to include a camera similar to the camera 56 instead of the imaging information calculation unit. In this case as well, the camera may be disposed so as to capture an image in the frontward direction of the input device 200, similarly to the above-described imaging element. Thus, if the user uses the input device 200 with its front directed toward the television 2 (the marker device 6), an image can be captured in the imaging direction opposite to that of the camera 56 of the terminal device 7.
The input device 200 includes a connector, not shown, which is connected to the extension connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. This enables data to be transmitted and received between the input device 200 and the terminal device 7. For example, data indicating an operation on the input device 200 and data indicating an imaging result of the imaging device may be transmitted to the terminal device 7. At this time, the terminal device 7 may transmit operation data indicating an operation performed on the terminal device 7 and data transmitted from the input device to the game device 3 by wireless. The input device 200 may include a charging terminal that is connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. Thus, when the terminal device 7 is mounted on the input device 200, power can be supplied from one device to the other device. For example, the input device 200 may be connected to a charger, and the terminal device 7 may receive power from the charger via the input device 200 to perform charging.
The input device 200 may have the following configuration, for example. Fig. 17 is a diagram showing another example of the input device. Fig. 18 and 19 are diagrams showing a state in which the input device 220 shown in fig. 17 is mounted on the terminal device 7. Fig. 18 is a view of the terminal device 7 and the input device 220 as viewed from the back side of the terminal device 7, and fig. 19 is a view of the terminal device 7 and the input device 220 as viewed from the front side of the terminal device 7. The terminal device 7 may be provided with an input device 220 shown in fig. 17, for example. Next, the input device 220 will be described. In fig. 17 to 20, the same reference numerals as in fig. 15 and 16 are assigned to components corresponding to those of the input device 200 shown in fig. 15 and 16, and detailed description thereof is omitted.
As shown in fig. 17, the input device 220 includes a first grip portion 200a and a second grip portion 200b similar to the input device 200. Therefore, the user can use the input device 220 (and the terminal device 7) by holding only one of the grip portions 200a and 200b, and can use the input device 220 by holding both of them.
The input device 220 includes a support portion 205 similar to that of the input device 200. The support portion 205 has four claw portions (only three claw portions 205a to 205c are illustrated in fig. 17), as does the support portion of the input device 200. The upper two claw portions 205a and 205b can be engaged in the locking holes 59a and 59b of the terminal device 7, respectively, and the remaining lower two claw portions can be engaged in the locking holes 50a and 50b of the terminal device 7, respectively. The claw portion not shown is provided at a position symmetrical to the claw portion 205c in the left-right direction (the left-right direction of the terminal device 7 attached to the support portion 205).
As shown in fig. 18 and 19, when the input device 220 is connected to the terminal device 7, the terminal device 7 and the input device 220 are fixed by engaging the four claw portions with the four engaging holes 50a, 50b, 59a, and 59b, respectively. This enables the input device 220 to be firmly fixed to the terminal device 7. In another embodiment, in addition to (or instead of) the engagement between the claw portions and the engagement holes, the input device 220 may be more firmly fixed to the terminal device 7 by fixing the input device 220 and the terminal device 7 with screws or the like. For example, screw holes may be provided in the locking holes 50a and 50b, and the lower two claw portions may be fixed to the locking holes 50a and 50b by screws. In addition, the position to be fixed by the screw may be arbitrary.
As described above, the input device 220 can also be reliably fixed to the terminal device 7 in the same manner as the input device 200.
In the input device 220, similarly to the input device 200, the support portion 205 detachably supports the terminal device 7 such that the screen of the LCD 51 is oriented substantially vertically when the first grip portion 200a (or the second grip portion 200b) is oriented vertically. Each of the grip portions 200a and 200b is formed to be substantially parallel to the display portion (the front surface of the housing 50) of the terminal device 7 connected to the input device 220. Therefore, the user can orient the screen of the display portion toward himself or herself by holding (at least one of) the grip portions 200a and 200b substantially vertically, and can operate the input device 220 while viewing the screen of the display portion. In the input device 220, as in the input device 200, the support portion 205 supports the terminal device 7 at a position above the grip portions, so the screen is easy for the user to view while gripping a grip portion. In other embodiments, at least one of the grip portions may be oriented substantially parallel to the screen of the LCD 51.
In the input device 220, the shape of the connection portion is different from that of the input device 200. The connecting portion 209 shown in fig. 17 is connected to two positions, one position on the upper side and one position on the lower side of the first handle portion 200a, and is connected to the upper side (upper end) of the second handle portion 200 b. In the input device 220, as in the input device 200, the connection portion 209 is formed to protrude forward from the second grip portion 200 b. In the input device 220, the support portion 205 is provided on the connection member 209 connecting the first grip portion 200a and the second grip portion 200b, similarly to the input device 200. Thus, the user can easily hold the operation device by gripping the two grip portions 200a and 200 b.
The connection portion 209 has a member extending downward from its connection with the support portion 205. When the screen of the LCD 51 of the terminal device 7 connected to the support portion 205 is oriented substantially vertically, this member also extends substantially vertically. That is, the member is oriented substantially parallel to the grip portions 200a and 200b. Therefore, even when the user grips this member as a grip portion, the user can operate the input device 220 while viewing the screen of the LCD 51 by gripping the member substantially vertically. Further, since the member is disposed below the support portion 205, the screen is easy to view when the user grips the member.
The input device 220 is also the same as the input device 200 in that one grip portion (first grip portion 200a) is provided at a position on the front side of the screen of the terminal device 7 mounted on the input device 220 and the other grip portion (second grip portion 200b) is provided at a position on the rear side of the screen. Therefore, as with the input device 200, the input device 220 can be easily held by a hand-held method such as holding a gun, and is particularly suitable for a shooting game or the like in which a game operation is performed with the operating device regarded as a gun.
The input device 220 includes a fourth button 207 as an operation unit in addition to the second button 202 and the rocker 204 similar to those of the input device 200. The second button 202 and the rocker 204 are provided on the upper side of the first grip portion 200a, as in the input device 200. The fourth button 207 is a button (key) that can be pressed by the user and is disposed on the upper side of the second grip portion 200b. That is, the fourth button 207 is provided at a position that can be operated with the index finger or the like of the hand gripping the second grip portion 200b.
The input device 220 includes an imaging element (imaging device). Here, the input device 220 has the same configuration as the imaging information calculation unit 35 provided in the controller 5. The imaging element of the imaging information calculation unit is disposed so as to capture an image in the frontward direction of the input device 220 (behind the screen of the terminal device 7). Specifically, a window portion (infrared filter) 208 is provided at the distal end portion of the input device 220 (the distal end portion of the connection portion 209), and the imaging element is provided inside the window portion 208 so as to capture an image in the frontward direction through the window portion 208. As described above, if the user uses the input device 220 with its front directed toward the television 2 (the marker device 6), the game device 3 can calculate the direction and position of the input device 220. Therefore, the user can perform an operation of orienting the input device 220 in a desired direction, that is, an intuitive and easy operation using the input device 220.
The input device 220 may be configured to include a camera similar to the camera 56 instead of the imaging information calculation unit. Thus, the user can take an image in the image pickup direction opposite to the camera 56 of the terminal device 7 by using the input device 220 with the front side thereof directed to the television set 2 (the marker device 6).
The input device 220 includes a connector, not shown, similar to the input device 200, and the connector is connected to the extension connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 220. This enables data to be transmitted and received between the input device 220 and the terminal device 7. Therefore, data indicating the operation of the input device 220 and data indicating the imaging result of the imaging device may be transmitted to the game device 3 through the terminal device 7. In another embodiment, input device 220 may be configured to directly communicate with game device 3. That is, for example, data indicating an operation on the input device 220 may be directly transmitted from the input device 220 to the game device 3 by Bluetooth (registered trademark) technology or the like, similarly to wireless communication between the controller 5 and the game device 3. At this time, operation data indicating an operation performed on the terminal device 7 is transmitted from the terminal device 7 to the game device 3. Similarly to the input device 200, the input device 220 may be provided with a charging terminal connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 220.
In another embodiment, an operation device in which the terminal device 7 and the input device 200 (or the input device 220) are integrally formed may be provided. In this case, a mechanism for detachably connecting the terminal device 7 and the input device 200, such as the locking holes 50a, 50b, 59a, and 59b of the terminal device 7 and the claw portion of the input device 200, is not required.
Fig. 20 is a diagram showing another example of connecting an additional device to the terminal device 7. In fig. 20, the terminal device 7 is connected (mounted) to a cradle 210 as an example of an additional device. The cradle 210 is a support device for placing (supporting) the terminal device 7 in a standing state at a predetermined angle. The cradle 210 includes a support member 211, a charging terminal 212, and guide members 213a and 213b.
In the present embodiment, the cradle 210 also functions as a charger and has a charging terminal 212. The charging terminal 212 is a terminal that can be connected to the charging terminal 66 of the terminal device 7. In the present embodiment, each of the charging terminals 66 and 212 is a metal terminal, but may be a connector having a shape in which one can be connected to the other. When the terminal device 7 is connected to the cradle 210, the charging terminal 212 of the cradle 210 is in contact with the charging terminal 66 of the terminal device 7, and power is supplied from the cradle 210 to the terminal device 7, thereby enabling charging.
The support member 211 supports the rear surface of the terminal device 7 at a predetermined angle. The support member 211 supports a predetermined surface (here, a back surface) of the housing 50 when the terminal (the charging terminal 66) of the terminal device 7 is connected to the terminal (the charging terminal 212) of the cradle 210. As shown in fig. 20, the support member 211 has a wall portion 211a and a groove portion 211 b. The support member 211 supports the housing 50 by the wall portion 211a such that the back surface of the housing 50 is placed along a predetermined support surface (here, the surface formed by the wall portion 211 a). The groove 211b is a portion into which a part (lower portion) of the housing 50 is inserted when the terminal device 7 is connected to the cradle 210. Therefore, the groove 211b is formed to substantially conform to the shape of the above-described part of the housing 50. The groove 211b extends in a direction parallel to the support surface.
In addition, the guide members 213a and 213b can be inserted into the locking holes 50a and 50b of the terminal device 7 and are used for determining the position at which the terminal device 7 is connected to the cradle 210. The guide members 213a and 213b are provided at positions corresponding to the locking holes 50a and 50b of the terminal device 7. That is, the guide members 213a and 213b are provided at positions at which they are inserted into the locking holes 50a and 50b when the terminal device 7 is correctly connected to the cradle 210. The terminal device 7 being correctly connected to the cradle 210 means that the charging terminal 212 of the cradle 210 and the charging terminal 66 of the terminal device 7 are connected. In addition, the guide members 213a and 213b are provided such that portions thereof protrude from the bottom surface of the groove portion 211b. That is, the guide members 213a and 213b are provided such that portions thereof protrude upward from the surface of the support member 211. When the terminal device 7 is connected to the cradle 210, parts of the guide members 213a and 213b are inserted into the locking holes 50a and 50b, respectively.
In the present embodiment, each of the guide members 213a and 213b is a rotatable wheel member (roller portion). Each of the guide members 213a and 213b can rotate in a prescribed direction. Here, the predetermined direction is a direction (horizontal direction) parallel to the support surface, in other words, a left-right direction of the terminal device 7 when the terminal device 7 is connected to the cradle 210. The guide member may be a rotating member that can rotate at least in a predetermined direction. For example, in another embodiment, the guide member may be a spherical body rotatably supported by a spherical recess. In the present embodiment, the number of the guide members is two, but the number of the guide members may be set to the number of the locking holes provided in the lower surface of the terminal device 7, and the cradle 210 may include only one guide member or three or more guide members.
When the terminal device 7 is connected to the cradle 210, the back surface of the terminal device 7 comes into contact with the support member 211, and the terminal device 7 is placed on the cradle 210 at a predetermined angle. That is, the terminal device 7 is placed on the holder 210 at a predetermined angle by inserting a part of the lower side of the housing 50 into the groove 211b and supporting the back surface of the housing 50 by the wall 211 a. Therefore, in the present embodiment, the position of the terminal device 7 is determined to be a correct position by the support member 211 in the direction perpendicular to the predetermined direction.
Here, when the terminal device 7 is connected to the cradle 210 and the terminal device 7 and the cradle 210 are not in the correct positional relationship, the terminal device 7 is connected after its position is corrected by the guide members 213a and 213b. That is, when the locking holes 50a and 50b are shifted from the guide members 213a and 213b in the predetermined direction, the guide members 213a and 213b come into contact with the housing 50 around the locking holes 50a and 50b. Accordingly, the terminal device 7 slides in the predetermined direction as the guide members 213a and 213b rotate. In the present embodiment, since the two guide members 213a and 213b are arranged side by side along the predetermined direction, the lower surface of the terminal device 7 can be brought into contact with only the guide members 213a and 213b, and the terminal device 7 can therefore be moved more smoothly. Further, by providing inclined surfaces (recessed inclined surfaces) around the locking holes 50a and 50b, the terminal device 7 can be moved even more smoothly. As a result of the terminal device 7 sliding as described above, parts of the guide members 213a and 213b are inserted into the locking holes 50a and 50b. Thereby, the charging terminal 212 of the cradle 210 comes into contact with the charging terminal 66 of the terminal device 7, and charging can be performed reliably.
As described above, the user can easily attach the terminal device 7 to the cradle 210 even if the user does not place the terminal device 7 at the correct position. According to the present embodiment, since the terminal device 7 can be positioned with respect to the cradle 210 with a simple configuration such as the locking hole of the terminal device 7 and the guide member of the cradle 210, the cradle 210 can be made small and simple in configuration. In the present embodiment, the terminal device 7 is a relatively large portable device, and even if it is a large portable device, the cradle 210 itself can be formed in a small size as shown in fig. 20. Further, since the holder 210 can be connected to terminal devices of various shapes or sizes, a highly versatile support device can be provided.
In addition, in the present embodiment, the locking holes 50a and 50b are used as holes for locking the claw portions of the attachment, and are also used as objects into which the guide member is inserted. Thus, the number of holes provided in the housing 50 of the terminal device 7 can be reduced, and the shape of the housing 50 can be simplified.
In the above embodiment, the holes into which the guide members of the cradle 210 are inserted are the holes (the locking holes 50a and 50b) provided in the lower side surface of the housing 50, but the positions of the holes may be arbitrary. For example, the holes may be provided in other side surfaces of the housing 50, or in the front or rear surface of the housing 50. Since the guide portions need to be provided at positions corresponding to the positions of the holes, when the holes are provided in the front or rear surface of the housing 50, the guide portions of the cradle 210 may be provided, for example, at the position of the wall portion 211a. Further, holes may be provided in a plurality of surfaces of the housing 50, in which case the terminal device 7 can be placed on the cradle 210 in various orientations.
[5. Game processing]
Next, the game processing executed in the present game system will be described in detail. First, various data used in the game process will be described. Fig. 21 is a diagram showing various data used in the game process. Fig. 21 is a diagram showing main data stored in the main memory (external main memory 12 or internal main memory 11e) of the game device 3. As shown in fig. 21, the game program 90, the reception data 91, and the processing data 106 are stored in the main memory of the game device 3. In addition to the data shown in fig. 21, the main memory stores data necessary for the game, such as image data of various objects appearing in the game and audio data used in the game.
A part or all of the game program 90 is read from the optical disk 4 and stored in the main memory at an appropriate timing after the game device 3 is powered on. Instead of the optical disk 4, the game program 90 may be acquired from the flash memory 17 or from a device external to the game device 3 (for example, via the internet). In addition, a part of the program included in the game program 90 (for example, a program for calculating the posture of the controller 5 and/or the terminal device 7) may be stored in the game device 3 in advance.
The reception data 91 is various data received from the controller 5 and the terminal device 7. The reception data 91 contains the controller operation data 92, the terminal operation data 97, the camera image data 104, and the microphone sound data 105. When a plurality of controllers 5 are connected, a plurality of pieces of controller operation data 92 are stored. Similarly, when a plurality of terminal devices 7 are connected, a plurality of pieces of terminal operation data 97, camera image data 104, and microphone sound data 105 are stored.
The controller operation data 92 is data representing an operation of the controller 5 by a user (player). The controller operation data 92 is transmitted from the controller 5, acquired by the game device 3, and stored in the main memory. The controller operation data 92 includes first operation button data 93, first acceleration data 94, first angular velocity data 95, and marker coordinate data 96. Further, a predetermined number of controller operation data may be stored in the main memory in order from the latest (last acquired) data.
The first operation button data 93 is data indicating the input state to each of the operation buttons 32a to 32i provided on the controller 5. Specifically, the first operation button data 93 indicates whether or not each of the operation buttons 32a to 32i is pressed.
The first acceleration data 94 is data indicating the acceleration (acceleration vector) detected by the acceleration sensor 37 of the controller 5. Here, the first acceleration data 94 represents three-dimensional acceleration having acceleration in the XYZ triaxial directions shown in fig. 3 as each component, but in other embodiments, it may represent acceleration in one or more arbitrary directions.
The first angular velocity data 95 is data indicating the angular velocity detected by the gyro sensor 48 in the controller 5. Here, the first angular velocity data 95 represents each angular velocity about the XYZ triaxial directions shown in fig. 3, but in other embodiments, it may represent an angular velocity about one or more arbitrary axes.
The marker coordinate data 96 is data indicating the coordinates calculated by the image processing circuit 41 of the imaging information calculation unit 35, that is, the marker coordinates. The marker coordinates are expressed in a two-dimensional coordinate system for indicating a position on a plane corresponding to the captured image, and the marker coordinate data 96 indicates coordinate values in the two-dimensional coordinate system.
The controller operation data 92 may be data representing an operation by a user operating the controller 5, and may include only a part of the data 93 to 96. In addition, in the case where the controller 5 has another input unit (for example, a touch panel, an analog stick, or the like), the controller operation data 92 may include data indicating an operation on the other input unit. Further, in the case where the movement of the controller 5 itself is used as the game operation as in the present embodiment, the controller operation data 92 is made to include data in which the value of the first acceleration data 94, the first angular velocity data 95, or the marker coordinate data 96 changes in accordance with the movement of the controller 5 itself.
The terminal operation data 97 is data indicating an operation of the terminal device 7 by the user. The terminal operation data 97 is transmitted from the terminal device 7, acquired by the game device 3, and stored in the main memory. The terminal operation data 97 includes second operation button data 98, stick data 99, touch position data 100, second acceleration data 101, second angular velocity data 102, and azimuth data 103. Further, a predetermined number of pieces of terminal operation data may be stored in the main memory in order from the latest (most recently acquired) data.
The second operation button data 98 is data indicating the input state to each of the operation buttons 54A to 54L provided on the terminal device 7. Specifically, the second operation button data 98 indicates whether or not each of the operation buttons 54A to 54L is pressed.
The stick data 99 is data indicating the direction and amount in which the stick part of the analog stick 53 (analog sticks 53A and 53B) slides (or tilts). The directions and amounts may also be expressed as two-dimensional coordinates or two-dimensional vectors, for example.
The touch position data 100 is data indicating a position (touch position) of an input on the input surface of the touch panel 52. In the present embodiment, the touch position data 100 represents coordinate values on a two-dimensional coordinate system showing the position on the input surface. In addition, when the touch panel 52 is of the multi-touch type, the touch position data 100 may indicate a plurality of touch positions.
The second acceleration data 101 is data indicating an acceleration (acceleration vector) detected by the acceleration sensor 73. In the present embodiment, the second acceleration data 101 represents three-dimensional acceleration having acceleration in the xyz three-axis direction shown in fig. 8 as each component, but in other embodiments, acceleration in one or more arbitrary directions may be represented.
The second angular velocity data 102 is data indicating the angular velocity detected by the gyro sensor 74. In the present embodiment, the second angular velocity data 102 represents each angular velocity about the xyz three axes shown in fig. 8, but in other embodiments, it may represent an angular velocity about any axis of one or more axes.
The azimuth data 103 is data indicating the azimuth detected by the magnetic sensor 72. In the present embodiment, the azimuth data 103 represents the direction of a predetermined azimuth (for example, north) with reference to the terminal device 7. However, in a place where a magnetic field other than the geomagnetic field is generated, the azimuth data 103 does not strictly indicate an absolute azimuth (north or the like), but indicates a relative direction of the terminal device 7 with respect to the magnetic field direction of the place, and therefore, in this case as well, the posture change of the terminal device 7 can be calculated.
The terminal operation data 97 only needs to be data indicating an operation by the user on the terminal device 7, and may include only one of the data 98 to 103. In the case where the terminal device 7 has another input means (for example, a touch pad, or an imaging means like that of the controller 5), the terminal operation data 97 may include data indicating an operation on that other input means. In the case where the movement of the terminal device 7 itself is used as the game operation as in the present embodiment, the terminal operation data 97 includes data whose value changes according to the movement of the terminal device 7 itself, namely the second acceleration data 101, the second angular velocity data 102, or the azimuth data 103.
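As an illustration only, the controller operation data 92 and the terminal operation data 97 described above might be laid out in memory roughly as in the following C++ sketch. The struct and field names are hypothetical and chosen for readability; they are not part of the embodiment, and the actual data layout in the main memory may differ.

```cpp
#include <cstdint>

// Hypothetical layout of one sample of the controller operation data 92.
struct ControllerOperationData {
    uint16_t buttons;            // first operation button data 93: one bit per button 32a-32i
    float    accel[3];           // first acceleration data 94: X, Y, Z components
    float    angularVel[3];      // first angular velocity data 95: about the X, Y, Z axes
    float    markerCoord[2][2];  // marker coordinate data 96: image coordinates of markers 6R, 6L
};

// Hypothetical layout of one sample of the terminal operation data 97.
struct TerminalOperationData {
    uint16_t buttons;            // second operation button data 98: buttons 54A-54L
    float    stick[2][2];        // stick data 99: direction/amount for analog sticks 53A, 53B
    float    touch[2];           // touch position data 100 (single-touch case)
    float    accel[3];           // second acceleration data 101: x, y, z components
    float    angularVel[3];      // second angular velocity data 102: about the x, y, z axes
    float    azimuth[3];         // azimuth data 103: detected magnetic direction
};
```

Since a predetermined number of samples may be kept in order from the latest, such structures would typically be held in a small ring buffer in the main memory.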
The camera image data 104 is data indicating an image (camera image) captured by the camera 56 of the terminal device 7. The camera image data 104 is image data obtained by decompressing compressed image data from the terminal device 7 by the codec LSI 27, and is stored in the main memory by the input/output processor 11 a. Further, a predetermined number of camera image data may be stored in the main memory in order from the latest (last acquired) data.
The microphone sound data 105 is data indicating a sound (microphone sound) detected by the microphone 79 of the terminal device 7. The microphone sound data 105 is sound data obtained by decompressing the compressed sound data transmitted from the terminal device 7 by the codec LSI 27, and is stored in the main memory by the input/output processor 11 a.
The processing data 106 is data used in a game process (fig. 22) described later. The processing data 106 includes control data 107, controller gesture data 108, terminal gesture data 109, image recognition data 110, and voice recognition data 111. In addition to the data shown in fig. 21, the processing data 106 includes various data used in game processing, such as data indicating various parameters set for various objects appearing in the game.
The control data 107 is data indicating a control instruction for a component provided in the terminal device 7. The control data 107 indicates, for example, an instruction to control the lighting of the marker section 55, an instruction to control the imaging of the camera 56, and the like. The control data 107 is transmitted to the terminal device 7 at an appropriate timing.
The controller posture data 108 is data representing the posture of the controller 5. In the present embodiment, the controller attitude data 108 is calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 included in the controller operation data 92. The calculation method of the controller attitude data 108 is described later in step S23.
The terminal posture data 109 is data indicating the posture of the terminal device 7. In the present embodiment, the terminal posture data 109 is calculated from the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103 included in the terminal operation data 97. The method of calculating the terminal posture data 109 will be described later in step S24.
The image recognition data 110 is data indicating a result of predetermined image recognition processing performed on the camera image. The image recognition processing may be any processing as long as it detects some feature from the camera image and outputs the result, and may be, for example, processing of extracting a predetermined object (for example, the face of the user, a marker, or the like) from the camera image and calculating information on the extracted object.
The voice recognition data 111 is data indicating the result of predetermined voice recognition processing performed on the microphone sound. The voice recognition processing may be any processing that detects some feature from the microphone sound and outputs the result; it may be, for example, processing of detecting words spoken by the user, or processing of simply outputting the sound volume.
Next, the game processing performed in the game device 3 will be described in detail with reference to fig. 22. Fig. 22 is a main flowchart showing the flow of the game processing executed by the game device 3. When the power of the game device 3 is turned on, the CPU 10 of the game device 3 initializes each unit, such as the main memory, by executing a boot program stored in a boot ROM, not shown. Then, the game program stored in the optical disk 4 is read into the main memory, and the CPU 10 starts executing the game program. In the game device 3, the game program stored in the optical disk 4 may be executed immediately after the power is turned on; alternatively, a built-in program that displays a predetermined menu screen may be executed first after the power is turned on, and the game program stored in the optical disk 4 may then be executed when the user instructs the start of the game. The flowchart shown in fig. 22 shows the processing performed after the above processing is completed.
Note that the processing of each step in the flowchart shown in fig. 22 is merely an example, and the order of the processing of each step may be changed as long as the same result can be obtained. The values of the variables and the threshold used in the determination step are only examples, and other values may be used as necessary. In the present embodiment, the CPU 10 executes the processing of each step in the flowchart, but a processor or a dedicated circuit other than the CPU 10 may execute the processing of a part of the steps.
First, in step S1, the CPU 10 executes initial processing. The initial processing is, for example, the following processing: a virtual game space is constructed and objects appearing in the game space are arranged at initial positions, or initial values of various parameters used in game processing are set.
In the present embodiment, in the initial processing, the CPU 10 controls the lighting of the marker device 6 and the marker section 55 according to the type of the game program. Here, the game system 1 includes both the marker device 6 and the marker section 55 of the terminal device 7 as imaging targets of the imaging means (the imaging information calculation unit 35) of the controller 5. Either one or both of the marker device 6 and the marker section 55 are used depending on the game content (the type of the game program). The game program 90 includes data indicating whether each of the marker device 6 and the marker section 55 is to be lit. The CPU 10 reads out this data and determines whether or not to light them. Then, when the marker device 6 and/or the marker section 55 is to be lit, the following processing is executed.
That is, when the marker device 6 is to be turned on, the CPU 10 transmits a control signal to the marker device 6, the control signal instructing to turn on each infrared LED provided in the marker device 6. The transmission of the control signal may be only for the purpose of supplying power. In response thereto, the infrared LEDs of the marking device 6 are lit. On the other hand, when the marker section 55 is to be turned on, the CPU 10 generates control data indicating an instruction to turn on the marker section 55 and stores the control data in the main memory. The generated control data is transmitted to the terminal device 7 in step S10 described later. The control data received by the wireless module 80 of the terminal device 7 is transmitted to the UI controller 75 through the codec LSI 76, and the UI controller 75 instructs the marker 55 to light up. Thereby, the infrared LED of the marker portion 55 is turned on. Note that, although the above description has been given of the case where the marker 6 and the marker 55 are turned on, the marker 6 and the marker 55 can be turned off by the same processing as in the case of the lighting.
The process of step S2 is performed after the above step S1. Thereafter, the processing loop formed by the series of processing of steps S2 to S11 is repeatedly executed at a rate of once every predetermined time (one frame time).
In step S2, the CPU 10 acquires the controller operation data transmitted from the controller 5. Since the controller 5 repeatedly transmits the controller operation data to the game device 3, the controller communication module 19 in the game device 3 sequentially receives the controller operation data, and the received controller operation data is sequentially stored in the main memory by the input/output processor 11a. The interval of transmission and reception is preferably shorter than the processing time of the game, and is, for example, 1/200 second. In step S2, the CPU 10 reads the latest controller operation data 92 from the main memory. The process of step S3 is performed after step S2.
In step S3, the CPU 10 acquires various data transmitted from the terminal device 7. The terminal device 7 repeatedly transmits the terminal operation data, the camera image data, and the microphone sound data to the game device 3, and thus the game device 3 sequentially receives these data. In the game device 3, the terminal communication module 28 sequentially receives these data, and the codec LSI 27 sequentially performs decompression processing on the camera image data and the microphone sound data. Then, the input-output processor 11a stores the terminal operation data, the camera image data, and the microphone sound data in the main memory in order. In step S3, the CPU 10 reads the latest terminal operation data 97 from the main memory. The process of step S4 is performed after step S3.
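To make the flow of steps S2 and S3 concrete, one pass through the processing loop could be sketched as follows, reusing the hypothetical struct names from the earlier sketch. The helper functions are assumptions introduced only for illustration and do not correspond to actual APIs of the game device 3.

```cpp
// Hypothetical sketch of one pass through steps S2 to S4, executed once per frame time.
ControllerOperationData ReadLatestControllerData();  // step S2: latest controller operation data 92
TerminalOperationData   ReadLatestTerminalData();    // step S3: latest terminal operation data 97
void RunGameControlProcess(const ControllerOperationData&, const TerminalOperationData&);

void ProcessOneFrame() {
    ControllerOperationData ctrl = ReadLatestControllerData();  // step S2
    TerminalOperationData   term = ReadLatestTerminalData();    // step S3
    // The camera image data 104 and microphone sound data 105 would be read similarly.
    RunGameControlProcess(ctrl, term);                           // step S4 (fig. 23)
    // Steps S5 to S11: generate and output game images and sounds, then repeat.
}
```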
In step S4, the CPU 10 executes game control processing. The game control process is a process of advancing a game by executing a process of operating an object in a game space in accordance with a game operation performed by a user. In the present embodiment, the user can play various games using the controller 5 and/or the terminal device 7. Next, the game control process will be described with reference to fig. 23.
Fig. 23 is a flowchart showing a detailed flow of the game control process. The series of processes shown in fig. 23 are various processes that can be executed when the controller 5 and the terminal device 7 are used as operation devices, but the entire processes need not be executed, and only a part of the processes may be executed depending on the type and content of the game.
In the game control process, first, in step S21, the CPU 10 determines whether or not to change the marker to be used. As described above, in the present embodiment, the process of controlling the lighting of the marker device 6 and the marker section 55 is executed when the game process is started (step S1). Here, depending on the game, the object to be used (lit) among the marker device 6 and the marker section 55 may be changed in the middle of the game. It is also conceivable to use both the marker device 6 and the marker section 55 depending on the game, but if both are lit, one marker may be erroneously detected as the other marker. Therefore, during the game it may be preferable to switch the lighting so that only one of them is lit at a time. The process of step S21 is a process of determining, in consideration of the above, whether or not the object to be lit is to be changed in the middle of the game.
The determination in step S21 can be made, for example, by the following methods. That is, the CPU 10 can make the determination according to whether or not the game situation (the stage of the game, the operation target, or the like) has changed. This is because, when the game situation changes, the operation method can be expected to switch between an operation method in which the controller 5 is directed toward the marker device 6 and an operation method in which the controller 5 is directed toward the marker section 55. The CPU 10 can also make the determination according to the posture of the controller 5, that is, according to whether the controller 5 is directed toward the marker device 6 or toward the marker section 55. The posture of the controller 5 can be calculated from the detection results of the acceleration sensor 37 and the gyro sensor 48, for example (see step S23 described later). The CPU 10 may also make the determination based on whether or not there is a change instruction from the user.
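Purely as an illustration of the posture-based determination, the following sketch guesses which marker the controller 5 is directed toward from the pitch of its calculated posture. The threshold value and the assumption that the television 2 is ahead of and above the terminal device 7 are invented for this sketch and are not part of the embodiment.

```cpp
enum class MarkerTarget { MarkerDevice6, MarkerSection55 };

// Guess which marker the controller 5 is pointed at from the pitch of its calculated
// posture. pitchRadians > 0 means the controller points upward. Both the threshold and
// the assumed placement (television above, terminal device below) are hypothetical.
MarkerTarget GuessPointedMarker(float pitchRadians) {
    const float threshold = -0.3f;  // assumed boundary between "toward TV" and "toward terminal"
    return (pitchRadians > threshold) ? MarkerTarget::MarkerDevice6
                                      : MarkerTarget::MarkerSection55;
}
```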
In the case where the result of the determination in the above-described step S21 is affirmative, the process in step S22 is executed. On the other hand, if the determination result at step S21 is negative, the process at step S22 is skipped and the process at step S23 is executed.
In step S22, the CPU 10 controls the lighting of the marker device 6 and the marker section 55. That is, the lighting state of the marker device 6 and/or the marker portion 55 is changed. Further, as in the case of step S1 described above, the specific processing of turning on or off the marker device 6 and/or the marker portion 55 can be performed. The process of step S23 is performed after step S22.
As described above, according to the present embodiment, the light emission (lighting) of the marker device 6 and the marker unit 55 can be controlled according to the type of the game program by the processing of step S1, and the light emission (lighting) of the marker device 6 and the marker unit 55 can be controlled according to the game situation by the processing of steps S21 and S22.
In step S23, the CPU 10 calculates the posture of the controller 5. In the present embodiment, the posture of the controller 5 is calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96. Next, a method of calculating the posture of the controller 5 will be described.
First, the CPU 10 calculates the attitude of the controller 5 from the first angular velocity data 95 stored in the main memory. The method of calculating the attitude of the controller 5 from the angular velocity may be any method, and the attitude is calculated using the previous attitude (the previously calculated attitude) and the current angular velocity (the angular velocity acquired in step S2 in the current processing loop). Specifically, the CPU 10 calculates the attitude by rotating the previous attitude at the current angular velocity for a unit time. The previous posture is represented by the controller posture data 108 stored in the main memory, and the current angular velocity is represented by the first angular velocity data 95 stored in the main memory. Thus, the CPU 10 reads the controller attitude data 108 and the first angular velocity data 95 from the main memory to calculate the attitude of the controller 5. Data indicating the "posture based on the angular velocity" calculated as described above is stored in the main memory.
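As one concrete way of "rotating the previous attitude at the current angular velocity for a unit time", the attitude can be kept as a quaternion and integrated once per frame. This is a sketch of a common approach under that assumption, not necessarily the calculation actually used in the embodiment.

```cpp
#include <cmath>

struct Quat { float w, x, y, z; };

// Rotate the previous attitude q by the current angular velocity omega (rad/s,
// corresponding to the first angular velocity data 95) over a unit time dt (s).
// First-order quaternion integration: dq/dt = 0.5 * q * (0, omega).
Quat IntegrateAngularVelocity(Quat q, const float omega[3], float dt) {
    Quat dq;
    dq.w = 0.5f * (-q.x * omega[0] - q.y * omega[1] - q.z * omega[2]);
    dq.x = 0.5f * ( q.w * omega[0] + q.y * omega[2] - q.z * omega[1]);
    dq.y = 0.5f * ( q.w * omega[1] - q.x * omega[2] + q.z * omega[0]);
    dq.z = 0.5f * ( q.w * omega[2] + q.x * omega[1] - q.y * omega[0]);
    q.w += dq.w * dt;  q.x += dq.x * dt;  q.y += dq.y * dt;  q.z += dq.z * dt;
    // Renormalize so the quaternion remains a valid rotation.
    float n = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
    q.w /= n;  q.x /= n;  q.y /= n;  q.z /= n;
    return q;
}
```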
Further, in the case of calculating the posture from the angular velocity, it is preferable to determine an initial posture. That is, in the case of calculating the posture of the controller 5 from the angular velocity, the CPU 10 first calculates the initial posture of the controller 5. The initial posture of the controller 5 may be calculated from the acceleration data, or the player may be made to perform a specific operation with the controller 5 held in a specific posture, and the specific posture at the time of the specific operation may be used as the initial posture. The initial posture should be calculated when the posture of the controller 5 is to be calculated as an absolute posture with reference to a predetermined direction in space; on the other hand, when the posture of the controller 5 is to be calculated as a relative posture with reference to, for example, the posture of the controller 5 at the start of the game, the initial posture need not be calculated.
Next, the CPU 10 corrects the posture of the controller 5 calculated from the angular velocity, using the first acceleration data 94. Specifically, the CPU 10 first reads the first acceleration data 94 from the main memory and calculates the posture of the controller 5 based on the first acceleration data 94. Here, when the controller 5 is almost stationary, the acceleration applied to the controller 5 corresponds to the gravitational acceleration. Therefore, in this state, the direction of the gravitational acceleration (the direction of gravity) can be calculated using the first acceleration data 94 output by the acceleration sensor 37, and hence the orientation (posture) of the controller 5 with respect to the direction of gravity can be calculated from the first acceleration data 94. Data indicating the "posture based on acceleration" calculated in this way is stored in the main memory.
When the acceleration-based posture is calculated, the CPU 10 then corrects the angular velocity-based posture with the acceleration-based posture. Specifically, the CPU 10 reads out data indicating the posture based on the angular velocity and data indicating the posture based on the acceleration from the main memory, and performs correction to make the posture based on the angular velocity data approach the posture based on the acceleration data at a predetermined ratio. The predetermined ratio may be a predetermined fixed value or may be set based on the acceleration indicated by the first acceleration data 94. In addition, since the attitude based on the acceleration cannot be calculated with respect to the rotation direction about the gravity direction as an axis, the CPU 10 may not correct the rotation direction. In the present embodiment, the data indicating the corrected posture obtained as described above is stored in the main memory.
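The correction of making the angular-velocity-based posture approach the acceleration-based posture "at a predetermined ratio" amounts to a complementary filter. The sketch below blends the gravity direction predicted from the current posture toward the gravity direction measured by the acceleration sensor; the ratio k and its treatment here are assumptions, as is the vector-based formulation.

```cpp
#include <cmath>

// Blend the gravity direction predicted from the angular-velocity-based posture toward
// the direction measured by the acceleration sensor (first acceleration data 94).
// k is the "predetermined ratio" (0..1); it could also be varied according to how close
// the measured acceleration is to 1 G. Rotation about the gravity axis is left untouched,
// matching the remark that it cannot be corrected from acceleration alone.
void CorrectGravityDirection(float predictedGravity[3], const float accel[3], float k) {
    float n = std::sqrt(accel[0] * accel[0] + accel[1] * accel[1] + accel[2] * accel[2]);
    if (n < 1e-6f) return;  // no usable measurement
    float len = 0.0f;
    for (int i = 0; i < 3; ++i) {
        predictedGravity[i] = (1.0f - k) * predictedGravity[i] + k * (accel[i] / n);
        len += predictedGravity[i] * predictedGravity[i];
    }
    len = std::sqrt(len);
    if (len > 1e-6f) {
        for (int i = 0; i < 3; ++i) predictedGravity[i] /= len;  // keep it a unit vector
    }
}
```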
After correcting the attitude based on the angular velocity as described above, the CPU 10 further corrects the corrected attitude using the marker coordinate data 96. First, the CPU 10 calculates the posture of the controller 5 (posture based on the marker coordinates) from the marker coordinate data 96. The marker coordinate data 96 indicates the positions of the markers 6R and 6L within the captured image, and therefore the attitude of the controller 5 with respect to the roll direction (the rotational direction about the Z axis) can be calculated from these positions. That is, the attitude of the controller 5 with respect to the roll direction can be calculated from the slope of a straight line connecting the position of the marker 6R and the position of the marker 6L within the captured image. In addition, when the position of the controller 5 with respect to the marker device 6 can be specified (for example, when it can be assumed that the controller 5 is positioned on the front side of the marker device 6), the attitude of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the position of the marker device 6 within the captured image. For example, in the case where the positions of the markers 6R and 6L are moved to the left within the captured image, it can be determined that the orientation (posture) of the controller 5 is changed to the right. In this way, the attitude of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the positions of the markers 6R and 6L. Through the above processing, the posture of the controller 5 can be calculated from the marker coordinate data 96.
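The roll component derived from the marker coordinate data 96 follows directly from the slope of the line connecting the two marker positions in the captured image. A minimal sketch, assuming the two marker coordinates are given as image coordinates:

```cpp
#include <cmath>

// Roll angle (rotation about the Z axis) of the controller 5, computed from the image
// positions of the markers 6L and 6R (marker coordinate data 96). Returns radians;
// 0 when the line connecting the two markers is horizontal in the captured image.
float RollFromMarkerCoordinates(const float markerL[2], const float markerR[2]) {
    float dx = markerR[0] - markerL[0];
    float dy = markerR[1] - markerL[1];
    return std::atan2(dy, dx);  // slope of the line connecting the marker positions
}
```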
When the posture based on the marker coordinates is calculated, the CPU 10 then corrects the above-described corrected posture (posture corrected using the posture based on the acceleration) using the posture based on the marker coordinates. That is, the CPU 10 performs correction to approximate the corrected posture to the posture based on the marker coordinates at a predetermined ratio. The predetermined ratio may be a predetermined fixed value. In addition, the correction using the posture based on the marker coordinates may be performed only in any one direction or any two directions of the roll direction, the pitch direction, and the yaw direction. For example, in the case of using the marker coordinate data 96, the attitude can be calculated with high accuracy with respect to the roll direction, so the CPU 10 may correct only the roll direction with the attitude based on the marker coordinate data 96. In addition, since the posture based on the marker coordinate data 96 cannot be calculated when the marker device 6 or the marker portion 55 is not imaged by the imaging element 40 of the controller 5, the correction process using the marker coordinate data 96 may not be executed in this case.
From the above, the CPU 10 corrects the first posture of the controller 5 calculated from the first angular velocity data 95 using the first acceleration data 94 and the marker coordinate data 96. Here, with the method of using the angular velocity among the methods for calculating the attitude of the controller 5, the attitude can be calculated regardless of how the controller 5 is moving. On the other hand, in the method using angular velocities, since the attitude is calculated by adding up the sequentially detected angular velocities, there is a possibility that accuracy deteriorates due to accumulation of errors or the like, or accuracy of the gyro sensor deteriorates due to a problem called temperature drift. In addition, the method using acceleration does not accumulate errors, but in a state where the controller 5 is moved vigorously, the posture cannot be calculated with high accuracy (because the direction of gravity cannot be detected accurately). In addition, the method using the marker coordinates can calculate the attitude with high accuracy (particularly with respect to the roll direction), but cannot calculate the attitude in a state where the marker portion 55 is not captured. In contrast, according to the present embodiment, since the three methods having different advantages are used as described above, the posture of the controller 5 can be calculated more accurately. In other embodiments, the gesture may be calculated by any one or two of the above three methods. When the lighting control of the marker is performed in the processing of step S1 or S22, the CPU 10 preferably calculates the posture of the controller 5 using at least the marker coordinates.
The process of step S24 is performed after the above-described step S23. In step S24, the CPU 10 calculates the posture of the terminal device 7. That is, the terminal operation data 97 acquired from the terminal device 7 includes the second acceleration data 101, the second angular velocity data 102, and the orientation data 103, and therefore the CPU 10 calculates the posture of the terminal device 7 from these data. Here, the CPU 10 can know the amount of rotation (the amount of change in the posture) per unit time of the terminal device 7 from the second angular velocity data 102. In addition, when the terminal device 7 is almost stationary, the acceleration applied to the terminal device 7 is substantially the gravitational acceleration, so the direction of gravity applied to the terminal device 7 (that is, the posture of the terminal device 7 with reference to the gravitational direction) can be known from the second acceleration data 101. Further, a predetermined azimuth with respect to the terminal device 7 (that is, the posture of the terminal device 7 with respect to the predetermined azimuth) can be known from the orientation data 103. Even in a place where a magnetic field other than the geomagnetic field is generated, the change in the orientation data 103 when the terminal device 7 moves still allows the amount of rotation of the terminal device 7 to be known. Therefore, the CPU 10 can calculate the posture of the terminal device 7 from the second acceleration data 101, the second angular velocity data 102, and the orientation data 103. In the present embodiment, the posture of the terminal device 7 is calculated from these three pieces of data, but in another embodiment, the posture may be calculated from one or two of the three pieces of data.
Note that a specific calculation method of the orientation of the terminal device 7 may be any method, and for example, a method of correcting the orientation calculated from the angular velocity indicated by the second angular velocity data 102 using the second acceleration data 101 and the orientation data 103 is considered. Specifically, the CPU 10 first calculates the posture of the terminal device 7 based on the second angular velocity data 102. Further, the method of calculating the posture from the angular velocity may be the same as the method in step S23 described above. Next, the CPU 10 corrects the posture calculated from the angular velocity using the posture calculated from the second acceleration data 101 and/or the posture calculated from the orientation data 103 at an appropriate timing (for example, when the terminal device 7 is in a state close to a stationary state). Further, the method of correcting the angular velocity-based posture with the acceleration-based posture may be the same method as the above-described case of calculating the posture of the controller 5. In addition, when the posture based on the angular velocity is corrected by the posture based on the orientation data, the CPU 10 may make the posture based on the angular velocity approach the posture based on the orientation data at a predetermined rate. From the above, the CPU 10 can accurately calculate the posture of the terminal device 7.
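A minimal sketch of this correction scheme, assuming an Euler-angle representation, a fixed correction rate, and a near-1g test for the "close to a stationary state" condition (a real implementation would more likely use quaternions; all names and thresholds are illustrative), is shown below:

```python
import math

class TerminalAttitudeEstimator:
    """Integrate the gyro (second angular velocity data), then nudge the result
    toward the accelerometer-derived pitch/roll and the magnetometer-derived yaw
    when the device looks close to stationary."""

    def __init__(self, correction_rate=0.05):
        self.pitch = 0.0   # rotation about the device's X axis (radians)
        self.roll = 0.0    # rotation about the device's Y axis
        self.yaw = 0.0     # rotation about the gravity axis
        self.rate = correction_rate

    def update(self, gyro, accel, mag_yaw, dt):
        # 1) Integrate the angular velocities (rad/s) over the frame time.
        self.pitch += gyro[0] * dt
        self.roll += gyro[1] * dt
        self.yaw += gyro[2] * dt

        # 2) When the total acceleration is close to 1 g, assume the device is
        #    nearly stationary and that the measured acceleration is gravity.
        ax, ay, az = accel
        g = math.sqrt(ax * ax + ay * ay + az * az)
        if abs(g - 9.8) < 0.5:
            accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
            accel_roll = math.atan2(ay, az)
            self.pitch += self.rate * (accel_pitch - self.pitch)
            self.roll += self.rate * (accel_roll - self.roll)

        # 3) Pull the yaw toward the absolute azimuth from the magnetic sensor.
        self.yaw += self.rate * (mag_yaw - self.yaw)
        return self.pitch, self.roll, self.yaw

est = TerminalAttitudeEstimator()
print(est.update(gyro=(0.0, 0.0, 0.1), accel=(0.0, 0.0, 9.8), mag_yaw=0.2, dt=1 / 60))
```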
Further, since the controller 5 includes the imaging information calculation unit 35 as the infrared detection means, the game device 3 can acquire the marker coordinate data 96. Therefore, the game device 3 can know the absolute posture of the controller 5 in the real space (which posture the controller 5 is in the coordinate system set in the real space) from the marker coordinate data 96. On the other hand, the terminal device 7 does not include infrared detection means as in the imaging information calculation unit 35. Therefore, the game device 3 cannot know the absolute attitude in the actual space with respect to the rotation direction about the gravity direction as the axis only from the second acceleration data 101 and the second angular velocity data 102. Therefore, in the present embodiment, the terminal device 7 is configured to include the magnetic sensor 72, and the game device 3 acquires the azimuth data 103. As a result, the game device 3 can calculate the absolute orientation in the actual space with respect to the rotation direction about the gravity direction as the axis from the orientation data 103, and can calculate the orientation of the terminal device 7 more accurately.
As a specific process of step S24, the CPU 10 reads the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 from the main memory, and calculates the posture of the terminal device 7 from these data. Then, data indicating the calculated posture of the terminal device 7 is stored in the main memory as terminal posture data 109. The process of step S25 is performed after step S24.
In step S25, the CPU 10 executes recognition processing of the camera image. That is, the CPU 10 performs predetermined recognition processing on the camera image data 104. The recognition processing may be any processing as long as it is processing for detecting some feature from the camera image and outputting the result. For example, when the camera image includes the face of the player, the processing may be face recognition processing. Specifically, the processing may be processing for detecting a part of the face (eyes, nose, mouth, and the like), or processing for detecting an expression of the face. In addition, data indicating the result of the recognition processing is stored in the main memory as the image recognition data 110. The process of step S26 is performed after step S25.
In step S26, the CPU 10 executes the recognition processing of the microphone sound. That is, the CPU 10 performs predetermined recognition processing on the microphone sound data 105. The recognition processing may be any processing as long as it is processing for detecting some feature from the microphone sound and outputting the result. For example, the processing may be processing for detecting an instruction of the player from the microphone sound, or processing for detecting only the volume of the microphone sound. In addition, data indicating the result of the recognition processing is stored in the main memory as voice recognition data 111. The process of step S27 is performed after step S26.
In step S27, the CPU 10 executes game processing corresponding to the game input. Here, the game input may be any data as long as it is data transmitted from the controller 5 or the terminal device 7, or data obtained from such data. Specifically, the game input may be the controller operation data 92 and the terminal operation data 97 themselves, or data obtained from them (the controller posture data 108, the terminal posture data 109, the image recognition data 110, and the voice recognition data 111). The content of the game processing in step S27 may be anything; for example, it may be processing for moving an object (character) appearing in the game, processing for controlling a virtual camera, or processing for moving a cursor displayed on the screen. Further, the processing may be processing using a camera image (or a part thereof) as a game image, processing using a microphone sound as a game sound, or the like. Note that examples of the game processing will be described later. In step S27, data indicating the result of the game control process, such as data on various parameters set for characters (objects) appearing in the game, data on parameters relating to virtual cameras arranged in the game space, and data on scores, are stored in the main memory. After step S27, the CPU 10 ends the game control process of step S4.
Returning to the explanation of fig. 22, in step S5, a television game image for display on the television set 2 is generated by the CPU 10 and the GPU 11b. That is, the CPU 10 and the GPU 11b read data indicating the result of the game control processing in step S4 from the main memory, read data necessary for generating a game image from the VRAM 11d, and generate the game image. The game image may be generated by any method as long as it indicates the result of the game control process of step S4. For example, the game image may be generated by arranging a virtual camera in the virtual game space and computing a three-dimensional CG image of the game space as viewed from the virtual camera, or by generating a two-dimensional image without using a virtual camera. The generated television game image is stored in the VRAM 11d. The process of step S6 is performed after the above-described step S5.
In step S6, the CPU 10 and the GPU 11b generate a terminal game image for display on the terminal device 7. The terminal game image may be generated by any method as long as it shows the result of the game control process of step S4, similarly to the television game image. The terminal game image may be generated by the same method as the television game image or by a different method. The generated terminal game image is stored in the VRAM 11d. Note that, depending on the game content, the television game image and the terminal game image may be the same, and in this case, the game image generation process may not be executed in step S6. The process of step S7 is performed after the above-described step S6.
In step S7, a television game sound to be output to the speaker 2a of the television 2 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound corresponding to the result of the game control processing at step S4. The generated game sound may be, for example, an effect sound of a game, a sound of a character appearing in the game, a background sound (BGM), or the like. The process of step S8 is performed after the above-described step S7.
In step S8, a terminal game sound for output to the speaker 77 of the terminal device 7 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound corresponding to the result of the game control processing at step S4. The terminal game sound may be the same as or different from the television game sound. For example, the two sounds may differ only in part, such as having the same BGM but different sound effects. In addition, when the television game sound is the same as the terminal game sound, the game sound generation process may not be executed in step S8. The process of step S9 is performed after the above-described step S8.
In step S9, the CPU 10 outputs a game image and game sound to the television set 2. Specifically, the CPU 10 transmits the data of the television game image stored in the VRAM 11d and the data of the television game sound generated by the DSP 11c in step S7 to the AV-IC 15. Accordingly, the AV-IC 15 outputs the data of the image and sound to the television set 2 through the AV connector 16. Thereby, the television game image is displayed on the television 2, and the television game sound is output from the speaker 2a. The process of step S10 is performed after step S9.
In step S10, the CPU 10 transmits the game image and the game sound to the terminal device 7. Specifically, the CPU 10 transmits the image data of the terminal game image stored in the VRAM 11d and the audio data generated by the DSP 11c in step S8 to the codec LSI 27, and the codec LSI 27 performs predetermined compression processing. The data of the compressed image and audio is transmitted to the terminal device 7 through the antenna 29 by the terminal communication module 28. The terminal device 7 receives the image and audio data transmitted from the game device 3 by the wireless module 80, and performs predetermined decompression processing by the codec LSI 76. The image data subjected to the decompression processing is output to the LCD 51, and the audio data subjected to the decompression processing is output to the audio IC 78. Thereby, the terminal game image is displayed on the LCD 51, and the terminal game sound is output from the speaker 77. The process of step S11 is performed after step S10.
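The compress-and-forward path of step S10 might be sketched as follows; zlib stands in for the codec LSI 27/76 and the callbacks stand in for the wireless modules, the LCD 51, and the audio IC 78, so nothing here reflects the actual hardware interface of the embodiment:

```python
import zlib

def send_to_terminal(image_bytes, sound_bytes, transmit):
    """Game device side: compress the terminal game image and terminal game
    sound, then hand the packet to a transmit callback (placeholder for the
    terminal communication module 28)."""
    packet = {
        "image": zlib.compress(image_bytes),
        "sound": zlib.compress(sound_bytes),
    }
    transmit(packet)

def on_terminal_receive(packet, show_on_lcd, play_on_speaker):
    """Terminal side: decompress, then hand the image to the display callback
    and the sound to the audio callback (both placeholders)."""
    show_on_lcd(zlib.decompress(packet["image"]))
    play_on_speaker(zlib.decompress(packet["sound"]))

# Loopback example wiring the two ends together.
send_to_terminal(b"frame-pixels", b"pcm-samples",
                 lambda p: on_terminal_receive(p, print, print))
```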
In step S11, the CPU 10 determines whether to end the game. The determination of step S11 is made, for example, based on whether or not the game is over, whether or not the user has given an instruction to stop the game, or the like. In the case where the determination result of step S11 is negative, the process of step S2 is executed again. On the other hand, in the case where the determination result of step S11 is affirmative, the CPU 10 ends the game processing shown in fig. 22. In this way, the series of processes of steps S2 to S11 is repeatedly executed until it is determined in step S11 that the game should be ended.
As described above, in the present embodiment, the terminal device 7 includes the touch panel 52 and an inertial sensor such as the acceleration sensor 73 or the gyro sensor 74, and the outputs of the touch panel 52 and the inertial sensor are transmitted to the game device 3 as operation data and used as inputs of the game (steps S3 and S4). The terminal device 7 further includes a display device (the LCD 51), and the game image obtained by the game processing is displayed on the LCD 51 (steps S6 and S10). Therefore, the user can perform an operation of directly touching the game image on the touch panel 52, and, since the movement of the terminal device 7 is detected by the inertial sensor, can also perform an operation of moving the LCD 51 itself on which the game image is displayed. With these operations, the user can play with an operation feeling as if directly operating the game image, and therefore, for example, a game with a new operation feeling as in the first and second game examples described later can be provided.
In the present embodiment, the terminal device 7 includes the analog stick 53 and the operation buttons 54 that can be operated while holding the terminal device 7, and the game device 3 can use the operation of the analog stick 53 and the operation buttons 54 as the input of the game (steps S3 and S4). Therefore, even when the game image is directly operated as described above, the user can perform a more detailed game operation by a button operation or a joystick operation.
In the present embodiment, the terminal device 7 includes the camera 56 and the microphone 79, and transmits data of the camera image captured by the camera 56 and data of the microphone sound detected by the microphone 79 to the game device 3 (step S3). Therefore, the game device 3 can use the camera image and/or the microphone sound as a game input, and therefore the user can also perform a game operation by an operation of capturing an image by the camera 56 and an operation of inputting a sound to the microphone 79. Further, since these operations can be performed in a state where the terminal device 7 is held, by performing these operations while directly operating the game image as described above, the user can perform more various game operations.
In the present embodiment, since the game image is displayed on the LCD 51 of the portable terminal device 7 (steps S6 and S10), the user can freely arrange the terminal device 7. Therefore, when the user operates the controller 5 toward the marker, the user can play the game in any direction by disposing the terminal device 7 at any position, and the degree of freedom in operating the controller 5 can be increased. Further, since the terminal device 7 can be arranged at an arbitrary position, a game having a stronger sense of reality can be provided by arranging the terminal device 7 at a position suitable for the game content, as in a fifth game example described later.
In addition, according to the present embodiment, the game device 3 acquires operation data and the like from the controller 5 and the terminal device 7 (steps S2, S3), and therefore the user can use both the controller 5 and the terminal device 7 as operation units. Therefore, in the game system 1, a plurality of users can play a game using each device, and one user can play a game using two devices.
Further, according to the present embodiment, the game device 3 generates two kinds of game images (steps S5 and S6), and can cause the television set 2 and the terminal device 7 to display the game images (steps S9 and S10). In this way, by displaying two kinds of game images on different devices, it is possible to provide a game image that is easier for the user to view, and it is possible to improve the operability of the game. For example, when two players play a game, as in the third or fourth game example described later, each player can play the game from a viewpoint that is easy to see by displaying a game image from a viewpoint that is easy to see by one of the users on the television 2 and displaying a game image from a viewpoint that is easy to see by the other user on the terminal device 7. Further, even when a game is played by one person, for example, as in the first, second, and fifth game examples described later, by displaying two types of game images at two different viewpoints, the player can more easily grasp the appearance of the game space, and the operability of the game can be improved.
[6. Game example ]
Next, a specific example of a game to be played in the game system 1 will be described. In addition, in the game example to be described below, there is a case where a part of the configuration of each device in the game system 1 is not used, and there is a case where a part of the series of processes shown in fig. 22 and 23 is not executed. That is, the game system 1 may not have all of the above-described configurations, and the game device 3 may not perform a part of the series of processes shown in fig. 22 and 23.
(first Game example)
The first game example is a game in which an object (a hand sword) is made to fly in a game space by operating the terminal device 7. The player can specify the direction in which the sword is launched by an operation of changing the posture of the terminal device 7 and an operation of drawing a line on the touch panel 52.
Fig. 24 is a diagram showing the screen of the television 2 and the terminal device 7 in the first game example. In fig. 24, a game image showing a game space is displayed on the LCD 51 of the television 2 and the terminal device 7. A sword 121, a control surface 122, and a target 123 are displayed on the television set 2. On the LCD 51, a control surface 122 (and a sword 121) are displayed. In the first game example, the player plays the game by causing the sword 121 to fly out and hit the target 123 by the operation of the terminal device 7.
When the sword 121 flies out, the player first changes the posture of the control surface 122 disposed in the virtual game space to a desired posture by manipulating the posture of the terminal device 7. That is, the CPU 10 calculates the attitude of the terminal device 7 from the outputs of the inertial sensor (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and changes the attitude of the control surface 122 according to the calculated attitude (step S27). In the first game example, the posture of the control surface 122 is controlled to a posture corresponding to the posture of the terminal device 7 in the real space. That is, the player can change the posture of the control surface 122 in the game space by changing the posture of the terminal device 7 (the control surface 122 displayed on the terminal device 7). In the first game example, the position of the control surface 122 is fixed to a predetermined position in the game space.
Next, the player performs an operation of drawing a line on the touch panel 52 using a stylus 124 or the like (see the arrow shown in fig. 24). Here, in the first game example, the control surface 122 is displayed on the LCD 51 of the terminal device 7 so that the input surface of the touch panel 52 corresponds to the control surface 122. Therefore, the direction on the control surface 122 (the direction indicated by the line) can be calculated from the line drawn on the touch panel 52. The sword 121 is launched in the direction determined in this way. As described above, the CPU 10 calculates the direction on the control surface 122 from the touch position data 100 of the touch panel 52, and performs the process of moving the sword 121 in the calculated direction (step S27). The CPU 10 may also control the speed of the sword 121 based on, for example, the length of the line or the speed at which the line is drawn.
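A sketch of how a line drawn on the touch panel 52 could be turned into a direction on the control surface 122 is shown below; the basis vectors spanning the control surface and all names are assumptions of this illustration, not elements of the specification:

```python
import math

def launch_direction(touch_start, touch_end, surface_right, surface_up):
    """Convert a line drawn on the touch panel into a direction on the control
    surface.  touch_start/touch_end are (x, y) touch positions; surface_right
    and surface_up are unit vectors spanning the control surface in game-space
    coordinates (their values depend on the terminal device's posture)."""
    dx = touch_end[0] - touch_start[0]
    dy = touch_end[1] - touch_start[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return None  # no direction was drawn
    dx, dy = dx / length, dy / length
    # Combine the 2D stroke with the surface's basis vectors to obtain a
    # 3D direction lying in the control surface.
    return tuple(dx * r + dy * u for r, u in zip(surface_right, surface_up))

print(launch_direction((100, 300), (400, 250),
                       surface_right=(1.0, 0.0, 0.0), surface_up=(0.0, 0.0, 1.0)))
```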
As described above, according to the first game example, the game device 3 can move the control surface 122 in accordance with the movement (posture) of the terminal device 7 by using the output of the inertial sensor as the game input, and can specify the direction on the control surface 122 by using the output of the touch panel 52 as the game input. As a result, the player can move the game image (the image of the control surface 122) displayed on the terminal device 7 or perform a touch operation on the game image, and thus can play a game with a new operation feeling as if the game image were directly operated.
In addition, in the first game example, directions in a three-dimensional space can be easily specified by using the output of the inertial sensor and the output of the touch panel 52 as game inputs. That is, the player can easily specify a direction by an intuitive operation, as if actually inputting a direction in space: adjusting the posture of the terminal device 7 with one hand while drawing a line indicating the direction on the touch panel 52 with the other hand. Further, since the operation of the posture of the terminal device 7 and the line-input operation on the touch panel 52 can be performed simultaneously and in parallel, the player can quickly perform the operation of specifying a direction in three-dimensional space.
In addition, according to the first game example, in order to facilitate the touch input operation on the control surface 122, the control surface 122 is displayed in full screen on the terminal device 7. On the other hand, the television 2 displays an image of a game space including the entire control surface 122 and the target 123 so that the posture of the control surface 122 can be easily grasped and the target 123 can be easily aimed (see fig. 24). That is, in step S27, the first virtual camera for generating the game image for the television is set so that the entire control surface 122 and the target 123 are included in the visual field range, and the second virtual camera for generating the game image for the terminal is set so that the screen of the LCD 51 (the input surface of the touch panel 52) and the control surface 122 coincide with each other on the screen. Therefore, in the first game example, the game operation is more easily performed by displaying the images of the game space viewed from different viewpoints on the television set 2 and the terminal device 7.
(second Game example)
Further, the game using the sensor outputs of the inertial sensor and the touch panel 52 as game inputs is not limited to the first game example described above, and various game examples can be conceived. The second game example is a game in which the terminal device 7 is operated to fly an object (cannonball) in the game space, as in the first game example. The player can instruct the direction in which the shell is fired by an operation of changing the posture of the terminal device 7 and an operation of specifying the position on the touch panel 52.
Fig. 25 is a diagram showing the screen of the television 2 and the terminal device 7 in the second game example. In fig. 25, a cannon 131, a cannonball 132, and a target 133 are shown on the television set 2. The cannonball 132 and the target 133 are shown on the terminal device 7. The terminal game image displayed on the terminal device 7 is an image obtained by observing the game space from the position of the cannon 131.
In the second game example, the player can change the display range displayed on the terminal device 7 as the terminal game image by manipulating the posture of the terminal device 7. That is, the CPU 10 calculates the attitude of the terminal device 7 from the outputs of the inertial sensor (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and controls the position and attitude of the second virtual camera for generating the terminal game image based on the calculated attitude (step S27). Specifically, the second virtual camera is provided at the position of the cannon 131, and the orientation (posture) thereof is controlled in accordance with the posture of the terminal device 7. In this way, the player can change the range of the game space displayed on the terminal device 7 by changing the posture of the terminal device 7.
In addition, in the second game example, the player specifies the shooting direction of the cannonball 132 by an operation (touch operation) of inputting a point on the touch panel 52. Specifically, as the processing of step S27 described above, the CPU 10 calculates a position (control position) in the game space corresponding to the touched position, and calculates a direction from a predetermined position (for example, the position of the cannon 131) in the game space toward the control position as the launch direction. Then, a process of moving the cannonball 132 in the shooting direction is performed. As described above, in the first game example, the player performs an operation of drawing a line on the touch panel 52, but in the second game example, an operation of designating a point on the touch panel 52 is performed. The control position can be calculated by setting the same control surface as in the first game example (although the control surface is not shown in the second game example). That is, by arranging the control surface in accordance with the posture of the second virtual camera so as to correspond to the display range of the terminal device 7 (specifically, the control surface rotates around the position of the cannon 131 in accordance with the change in the posture of the terminal device 7), the position on the control surface corresponding to the touched position can be calculated as the control position.
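Assuming the control position has already been obtained from the touched position, the launch direction described above reduces to a normalized vector from the cannon toward that position; a minimal illustrative sketch follows:

```python
import math

def shot_direction(cannon_pos, control_pos):
    """Normalized direction from the cannon toward the control position (the
    point on the control surface corresponding to the touched position).
    Positions are (x, y, z) tuples in game-space coordinates."""
    v = tuple(c - p for c, p in zip(control_pos, cannon_pos))
    norm = math.sqrt(sum(x * x for x in v))
    return tuple(x / norm for x in v)

print(shot_direction(cannon_pos=(0.0, 2.0, 0.0), control_pos=(3.0, 4.0, 10.0)))
```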
According to the second game example described above, the game device 3 can change the display range of the terminal-use game image in accordance with the movement (posture) of the terminal device 7 by using the output of the inertial sensor as the game input, and can determine the direction (the emission direction of the cannonball 132) within the game space by using the touch input specifying the position within the display range as the game input. Therefore, in the second game example as well, the player can move the game image displayed on the terminal device 7 or perform a touch operation on the game image, as in the first game example, and therefore can play a game with a new operation feeling as if the game image were directly operated.
In the second game example, as in the first game example, the player can easily specify a direction by an intuitive operation, as if actually inputting a direction in space: adjusting the posture of the terminal device 7 with one hand while making a touch input on the touch panel 52 with the other hand. Further, since the operation of the posture of the terminal device 7 and the input operation on the touch panel 52 can be performed simultaneously and in parallel, the player can quickly perform the operation of specifying a direction in three-dimensional space.
In the second game example, the image displayed on the television set 2 may be an image viewed from the same viewpoint as the terminal device 7, but in fig. 25, the game device 3 is assumed to display an image viewed from a different viewpoint. That is, the second virtual camera for generating the terminal game image is set at the position of the cannon 131, whereas the first virtual camera for generating the television game image is set at the position behind the cannon 131. Here, for example, by displaying the range that cannot be seen on the screen of the terminal device 7 on the television 2, it is possible to realize a game system in which the player aims at the target 133 that cannot be seen on the screen of the terminal device 7 while watching the screen of the television 2. By differentiating the display ranges of the television 2 and the terminal device 7 in this way, not only the appearance in the game space can be grasped more easily, but also the interest of the game can be further improved.
As described above, according to the present embodiment, since the terminal device 7 including the touch panel 52 and the inertial sensor can be used as the operation device, it is possible to realize a game having an operation feeling of directly operating a game image as in the first and second game examples described above.
(third Game example)
Next, a third game example is explained with reference to fig. 26 and 27. The third game example is a baseball game in a two-player versus format. That is, the first player operates the batter with the controller 5, and the second player operates the pitcher with the terminal device 7. Further, a game image that makes it easy for each player to perform game operations is displayed on each of the television 2 and the terminal device 7.
Fig. 26 is a diagram showing an example of a television game image displayed on the television 2 in the third game example. The television game image shown in fig. 26 is an image mainly provided to the first player. That is, the television game image shows a game space obtained by observing a pitcher (pitcher object) 142 as an operation object of the second player from the side of a batter (batter object) 141 as an operation object of the first player. The first virtual camera for generating a television game image is disposed at a position behind the batter 141 so as to face the pitcher 142 from the batter 141.
On the other hand, fig. 27 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the third game example. The terminal game image shown in fig. 27 is an image mainly provided to the second player. That is, the terminal game image represents a game space obtained by observing the batter 141 as the operation target of the first player from the pitcher 142 side as the operation target of the second player. Specifically, in step S27, the CPU 10 controls the second virtual camera used to generate the terminal game image in accordance with the orientation of the terminal device 7. As in the second game example, the attitude of the second virtual camera is calculated in accordance with the attitude of the terminal device 7. The position of the second virtual camera is fixed at a predetermined position. The terminal game image includes a cursor 143 for indicating the direction in which the pitcher 142 throws the ball.
Further, the method of operating the batter 141 by the first player and the method of operating the pitcher 142 by the second player may be any methods. For example, the CPU 10 may detect a swing operation to the controller 5 from output data of an inertial sensor of the controller 5, and may cause the batter 141 to swing a club in accordance with the swing operation. For example, the CPU 10 may move the cursor 143 in accordance with the operation of the analog stick 53, and may cause the pitcher 142 to perform a pitching operation toward the position indicated by the cursor 143 when a predetermined button of the operation buttons 54 is pressed. Instead of operating the analog stick 53, the cursor 143 may be moved according to the posture of the terminal device 7.
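As one possible (assumed) way of detecting a swing operation from the output data of the controller 5's inertial sensor, a simple peak test on the acceleration magnitude could be used; the threshold value is an assumption of this sketch:

```python
import math

def is_swing(accel_samples, threshold=25.0):
    """Report a swing when the magnitude of the controller's acceleration
    (m/s^2) exceeds a threshold in any recent sample."""
    return any(math.sqrt(ax * ax + ay * ay + az * az) > threshold
               for ax, ay, az in accel_samples)

samples = [(0.1, 9.8, 0.2), (5.0, 12.0, 3.0), (20.0, 18.0, 9.0)]
print(is_swing(samples))  # True: the last sample's magnitude is about 28.4
```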
As described above, in the third game example, the game images are generated from different viewpoints on the television 2 and the terminal device 7, thereby providing the game images that are easy to view and easy to operate for each player.
In the third game example, two virtual cameras are set in one game space, and two kinds of game images obtained by observing the game space from the respective virtual cameras are displayed (fig. 26 and 27). Therefore, the two kinds of game images generated in the third game example have the advantage that the game processing for the game space (control of the objects in the game space, and the like) is largely shared, and each game image can be generated simply by performing the drawing process twice on the same game space, so the processing efficiency is higher than when the game processing is performed separately for each image.
In the third game example, since the cursor 143 indicating the pitching direction is displayed only on the terminal device 7 side, the first player cannot see the position indicated by the cursor 143. Therefore, the game is not spoiled by the first player learning the pitching direction, which would put the second player at a disadvantage. In this way, in the present embodiment, when one player seeing a game image would spoil the game for the other player, that game image may be displayed on the terminal device 7. This prevents problems such as a loss of game strategy. In another embodiment, depending on the content of the game (for example, when no such problem arises even if the first player sees the terminal game image), the game device 3 may display the terminal game image on the television 2 together with the television game image.
(fourth Game example)
Next, a fourth game example is explained with reference to fig. 28 and 29. The fourth game example is a shooting game in a two-player cooperative format. That is, the first player performs an operation of moving an airplane by using the controller 5, and the second player performs an operation of controlling the shooting direction of the cannon of the airplane by using the terminal device 7. In the fourth game example, as in the third game example, a game image that makes it easy for each player to perform game operations is displayed on each of the television 2 and the terminal device 7.
Fig. 28 is a diagram showing an example of a television game image displayed on the television 2 in the fourth game example. Fig. 29 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example. As shown in fig. 28, in the fourth game example, an airplane (airplane object) 151 and a target (balloon object) 153 appear in a virtual game space. In addition, the airplane 151 has a cannon (cannon object) 152.
As shown in fig. 28, an image of a game space including the airplane 151 is displayed as a television game image. The first virtual camera for generating a television game image is set to generate an image of a game space obtained by viewing the airplane 151 from behind. That is, the first virtual camera is arranged at a position behind the airplane 151 in a posture in which the airplane 151 is included in the shooting range (field range). In addition, the first virtual camera is controlled to move as the airplane 151 moves. That is, the CPU 10 controls the movement of the airplane 151 according to the controller operation data in the process of the above-described step S27, and controls the position and the attitude of the first virtual camera. In this way, the position and posture of the first virtual camera are controlled in accordance with the operation of the first player.
On the other hand, as shown in fig. 29, an image of the game space viewed from the airplane 151 (more specifically, from the cannon 152) is displayed as the terminal game image. Thus, the second virtual camera for generating the terminal game image is arranged at the position of the airplane 151 (more specifically, the position of the cannon 152). The CPU 10 controls the movement of the airplane 151 according to the controller operation data and controls the position of the second virtual camera in the process of step S27 described above. The second virtual camera may be disposed at a position around the airplane 151 or the cannon 152 (for example, a position slightly behind the cannon 152). As described above, the position of the second virtual camera is controlled by the operation of the first player (who operates the movement of the airplane 151). Therefore, in the fourth game example, the first virtual camera and the second virtual camera move in conjunction with each other.
Further, as the terminal game image, an image of the game space viewed in the direction of the shooting direction of the cannon 152 is displayed. Here, the firing direction of the cannon 152 is controlled in a manner corresponding to the attitude of the terminal device 7. That is, in the present embodiment, the posture of the second virtual camera is controlled so that the line-of-sight direction of the second virtual camera coincides with the shooting direction of the cannon 152. In the processing of step S27, the CPU 10 controls the orientation of the cannon 152 and the orientation of the second virtual camera in accordance with the orientation of the terminal device 7 calculated in step S24. In this way, the posture of the second virtual camera is controlled in accordance with the operation of the second player. In addition, the second player can change the shooting direction of the cannon 152 by changing the posture of the terminal device 7.
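The idea that the posture of the terminal device 7 drives both the second virtual camera's line of sight and the firing direction of the cannon 152 can be sketched as follows, under an assumed yaw/pitch convention chosen only for this illustration:

```python
import math

def forward_vector(yaw, pitch):
    """Forward (line-of-sight) vector for a given yaw/pitch in radians, using a
    Y-up, -Z-forward convention.  In the fourth game example the same vector
    would serve as both the second virtual camera's viewing direction and the
    cannon 152's firing direction."""
    cp = math.cos(pitch)
    return (math.sin(yaw) * cp, math.sin(pitch), -math.cos(yaw) * cp)

terminal_yaw, terminal_pitch = 0.3, 0.1      # taken from the terminal's posture
camera_dir = cannon_dir = forward_vector(terminal_yaw, terminal_pitch)
print(cannon_dir)
```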
Further, when firing a cannonball from the cannon 152, the second player presses a predetermined button of the terminal device 7. When the predetermined button is pressed, a cannonball is fired from the cannon 152 in the direction in which it is oriented. In the terminal game image, an aiming cursor 154 is displayed at the center of the screen of the LCD 51, and the cannonball is fired in the direction indicated by the aiming cursor 154.
As described above, in the fourth game example, the first player operates the airplane 151 (e.g., moves it in the direction of the desired target 153) while mainly observing the television game image (fig. 28) showing the game space observed in the traveling direction of the airplane 151. On the other hand, the second player operates the cannon 152 while mainly observing the terminal-use game image (fig. 29) representing the game space observed in the shooting direction of the cannon 152. In this way, in the fourth game example, in a game of a type in which two players cooperate, game images that are easy to observe and easy to operate for the respective players can be displayed on the television 2 and the terminal device 7, respectively.
In the fourth game example, the positions of the first virtual camera and the second virtual camera are controlled in accordance with the operation by the first player, and the posture of the second virtual camera is controlled in accordance with the operation by the second player. That is, in the present embodiment, the position or orientation of the virtual camera changes according to the game operation of each player, and as a result, the display range of the game space displayed on each display device changes. Since the display range of the game space displayed on the display device changes in accordance with the operation of each player, each player can actually feel that his or her game operation is sufficiently reflected in the progress of the game, and can sufficiently enjoy the game.
In the fourth game example, a game image viewed from behind the airplane 151 is displayed on the television 2, and a game image viewed from the position of the cannon of the airplane 151 is displayed on the terminal device 7. Here, in another game example, the game device 3 may display the game image viewed from behind the airplane 151 on the terminal device 7 and display the game image viewed from the position of the cannon 152 of the airplane 151 on the television 2. In this case, the roles of the players may be exchanged with respect to the fourth game example, so that the first player operates the cannon 152 by using the controller 5 and the second player operates the airplane 151 by using the terminal device 7.
(fifth Game example)
Next, a fifth game example is explained with reference to fig. 30. The fifth game example is a game in which a player operates using the controller 5, and the terminal device 7 is used as a display device instead of an operation device. Specifically, the fifth game example is a golf game, and the game device 3 causes the player character in the virtual game space to perform a golf swing motion in accordance with an operation (swing operation) in which the player swings the controller 5 like a golf club.
Fig. 30 is a diagram showing a use situation of the game system 1 in the fifth game example. In fig. 30, an image of a game space including (an object of) a player character 161 and (an object of) a golf club 162 is displayed on the screen of the television set 2. Although not visible in fig. 30 because it is hidden by the golf club 162, (an object of) a ball 163 disposed in the game space is also displayed on the television 2. On the other hand, as shown in fig. 30, the terminal device 7 is disposed on the floor on the front side of the television set 2 such that the screen of the LCD 51 faces vertically upward. An image showing the ball 163, an image showing a part of the golf club 162 (specifically, the head 162a of the golf club), and an image showing the floor of the game space are displayed on the terminal device 7. The terminal game image is an image obtained by observing the periphery of the ball from above.
When playing the game, the player 160 stands near the terminal device 7 and performs a swing operation of swinging the controller 5 like a golf club. At this time, in the above step S27, the CPU 10 controls the position and posture of the golf club 162 in the game space in accordance with the posture of the controller 5 calculated by the process of the above step S23. Specifically, the golf club 162 is controlled in such a manner that, when the tip direction of the controller 5 (the positive Z-axis direction shown in fig. 3) is directed toward the image of the ball 163 displayed on the LCD 51, the golf club 162 in the game space hits the ball 163.
When the tip direction of the controller 5 is directed toward the LCD 51, an image (head image) 164 showing a part of the golf club 162 is displayed on the LCD 51 (see fig. 30). In addition, in order to increase the realistic sensation, the image of the ball 163 may be displayed in a full size, or the orientation of the head image 164 may be displayed so as to rotate in accordance with the rotation of the controller 5 about the Z axis. The terminal game image may be generated by a virtual camera provided in the game space, or may be generated by image data prepared in advance. In the case of generation using image data prepared in advance, a detailed and realistic image can be generated with a small processing load without building a topographical model of a golf course in detail.
When the ball 163 is hit by the golf club 162 as a result of the player 160 swinging the golf club 162 by performing the swing operation described above, the ball 163 moves (flies). That is, in step S27, the CPU 10 determines whether or not the golf club 162 has contacted the ball 163, and moves the ball 163 when it has contacted. Here, a game image for a television is generated so as to include the moved ball 163. That is, the CPU 10 controls the position and the posture of the first virtual camera for generating the television game image so that the moving ball is included in the shooting range of the first virtual camera. On the other hand, in the terminal device 7, when the golf club 162 hits the ball 163, the image of the ball 163 moves and disappears immediately outside the screen. Thus, in the fifth game example, the situation in which the ball moves is mainly displayed on the television 2, and the player 160 can confirm the trajectory of the ball flying through the swing operation by the game image on the television.
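A sketch of the contact test and of keeping the moving ball within the first virtual camera's shooting range is given below; the hit radius and the camera offset are illustrative values only, not taken from the embodiment:

```python
import math

def club_hits_ball(head_pos, ball_pos, hit_radius=0.2):
    """Stand-in for determining whether the golf club 162 has contacted the
    ball 163: a simple distance check between the club head and the ball."""
    return math.dist(head_pos, ball_pos) <= hit_radius

def follow_ball(ball_pos, offset=(0.0, 3.0, -6.0)):
    """Keep the moving ball inside the first virtual camera's shooting range by
    placing the camera at a fixed offset behind and above the ball."""
    return tuple(b + o for b, o in zip(ball_pos, offset))

print(club_hits_ball((0.05, 0.0, 0.0), (0.0, 0.0, 0.0)))  # True: within the radius
print(follow_ball((10.0, 1.0, 40.0)))                     # camera position following the ball
```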
As described above, in the fifth game example, the player 160 can swing the golf club 162 (make the player character 161 swing the golf club 162) by swinging the controller 5. Here, in the fifth game example, control is performed in the following manner: when the tip direction of the controller 5 is directed to the image of the ball 163 displayed on the LCD 51, the golf club 162 in the game space is caused to hit the ball 163. Thus, the player can obtain a feeling as if he is swinging an actual golf club through the swing operation, so that the swing operation can be performed with more realistic feeling.
In the fifth game example, the head image 164 is also displayed on the LCD 51 with the front end direction of the controller 5 facing the terminal device 7. Therefore, the player can obtain a feeling that the posture of the golf club 162 in the virtual space corresponds to the posture of the controller 5 in the real space by directing the tip end direction of the controller 5 toward the terminal device 7, and can perform the swing operation with more realistic feeling.
As described above, in the fifth game example, when the terminal device 7 is used as the display device, the operation by the controller 5 can be performed with more realistic feeling by disposing the terminal device 7 at an appropriate position.
In the fifth game example, the terminal device 7 is disposed on the floor, and an image showing only the game space around the ball 163 is displayed on the terminal device 7. Therefore, the position and posture of the entire golf club 162 in the game space cannot be displayed on the terminal device 7, and the movement of the ball 163 after the swing operation cannot be displayed on the terminal device 7. Therefore, in the fifth game example, the entire golf club 162 is displayed on the television 2 before the ball 163 moves, and the situation where the ball 163 moves after the ball 163 moves is displayed on the television 2. In this way, according to the fifth game example, it is possible to provide the player with realistic operations and to present an easily viewable game image to the player through the two screens of the television set 2 and the terminal device 7.
In the fifth game example, the marker section 55 of the terminal device 7 is used to calculate the posture of the controller 5. That is, the CPU 10 turns on the marker section 55 (and does not turn on the marker device 6) in the initial processing of step S1, and calculates the posture of the controller 5 from the marker coordinate data 96 in step S23. This makes it possible to accurately determine whether or not the tip of the controller 5 is oriented toward the marker section 55. In the fifth game example, the above steps S21 and S22 need not be performed, but in another game example, the marker to be lit may be changed during the game by performing the above processes of steps S21 and S22. For example, in step S21, the CPU 10 may determine from the first acceleration data 94 whether the tip direction of the controller 5 is oriented in the gravity direction and, in step S22, perform control such that the marker section 55 is turned on if it is oriented in the gravity direction and the marker device 6 is turned on if it is not. Thus, when the tip direction of the controller 5 is oriented in the gravity direction, the posture of the controller 5 can be calculated with high accuracy by acquiring the marker coordinate data of the marker section 55, and when the tip direction of the controller 5 is oriented toward the television set 2, the posture of the controller 5 can be calculated with high accuracy by acquiring the marker coordinate data of the marker device 6.
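The marker selection of steps S21 and S22 described above might be sketched as follows; the sign convention of the acceleration reading and the threshold are assumptions of this illustration:

```python
def choose_marker(accel, threshold=0.8):
    """Decide which marker to light from the first acceleration data: if the
    controller's tip (its Z axis) points roughly toward the ground, light the
    terminal device's marker section 55; otherwise light the marker device 6
    near the television.  Gravity is assumed to appear as a positive Z reading
    (in units of g) when the tip points down."""
    ax, ay, az = accel
    if az > threshold:
        return "marker section 55"
    return "marker device 6"

print(choose_marker((0.0, 0.1, 0.95)))   # tip pointing down -> marker section 55
print(choose_marker((0.0, 0.9, 0.1)))    # tip level -> marker device 6
```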
As described in the fifth game example, the game system 1 can use the terminal device 7 as a display device by installing it at an arbitrary position. Thus, even when marker coordinate data is used as a game input, the controller 5 can be used facing a direction other than toward the television 2 by placing the terminal device 7 at a desired position. That is, according to the present embodiment, since the direction in which the controller 5 can be used is not limited, the degree of freedom of the operation of the controller 5 can be improved.
[7 ] other action examples of the Game System ]
The game system 1 can perform operations for playing various games as described above. The terminal device 7 can also be used as a portable display or a second display, and can also be used as a controller for performing touch input or input by movement, and according to the game system 1, a wide range of games can be implemented. Further, the following operations can be performed for applications other than games.
(operation example in which a player plays a game only with the terminal device 7)
In the present embodiment, the terminal device 7 functions as a display device and also functions as an operation device. Therefore, the terminal device 7 is used as a display unit and an operation unit without using the television set 2 and the controller 5, whereby the terminal device 7 can also be used like a portable game device.
Specifically describing the game process shown in fig. 22, the CPU 10 acquires the terminal operation data 97 from the terminal device 7 in step S3, and executes the game process using only the terminal operation data 97 as a game input (without using the controller operation data) in step S4. Then, a game image is generated in step S6, and the game image is transmitted to the terminal device 7 in step S10. In this case, steps S2, S5, and S9 may not be performed. According to the above, the game processing is performed in accordance with the operation of the terminal device 7, and the game image showing the result of the game processing is displayed on the terminal device 7. In this way, the terminal device 7 can be used as a portable game device (although the game device actually executes the game process). Therefore, according to the present embodiment, even when a game image cannot be displayed on the television set 2 due to the television set 2 being used (for example, another person is watching a television broadcast), the user can play a game using the terminal device 7.
Further, the image that the CPU 10 transmits to and displays on the terminal device 7 is not limited to the game image; for example, the menu screen displayed after power-on may also be transmitted to and displayed on the terminal device 7. This enables the player to play the game from the beginning without using the television 2, which is convenient.
Further, the display device that displays the game image can be changed from the terminal device 7 to the television 2 during the game. Specifically, the CPU 10 may output the game image to the television 2 by further executing the above step S9. The image output to the television set 2 in step S9 is the same as the game image transmitted to the terminal device 7 in step S10. Thus, by switching the input of the television 2 so that it displays the input from the game device 3, the same game image as on the terminal device 7 can be displayed on the television 2, and the display device displaying the game image can thus be changed to the television 2. Further, after the game image is displayed on the television 2, the screen display of the terminal device 7 may be turned off.
In the game system 1, an infrared remote control signal for the television 2 may be output from an infrared output unit (the marker device 6, the marker section 55, or the infrared communication module 82). Accordingly, the game device 3 can operate the television 2 by outputting the infrared remote control signal from the infrared output means in accordance with the operation on the terminal device 7. In this case, since the user can operate the television set 2 by using the terminal device 7 without operating the remote controller of the television set 2, it is convenient in the case of switching the input of the television set 2 as described above.
(example of operation for communicating with other devices via network)
As described above, since the game device 3 has a function of connecting to a network, the game system 1 can be used even when communicating with an external device via a network. Fig. 31 is a diagram showing a connection relationship of each device included in the game system 1 when connected to an external device via a network. As shown in fig. 31, the game device 3 can communicate with an external device 191 via a network 190.
As described above, when the external device 191 and the game device 3 can communicate with each other, the game system 1 can communicate with the external device 191 using the terminal device 7 as an interface. For example, the game system 1 can be used as a television telephone by transmitting and receiving images and sounds between the external device 191 and the terminal device 7. Specifically, the game device 3 receives images and sounds (images and sounds of the other party of the telephone) from the external device 191 via the network 190, and transmits the received images and sounds to the terminal device 7. Thereby, the terminal device 7 displays an image from the external device 191 on the LCD 51, and outputs sound from the external device 191 from the speaker 77. Further, the game device 3 receives the camera image captured by the camera 56 and the microphone sound detected by the microphone 79 from the terminal device 7, and transmits the camera image and the microphone sound to the external device 191 via the network 190. The game device 3 can use the game system 1 as a video phone by repeating the transmission and reception of the image and the sound with the external device 191.
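One iteration of this videophone-style relay could be sketched as below; the objects and method names are placeholders for the network connection and the terminal device, not interfaces defined in this embodiment:

```python
class _Stub:
    """Minimal stand-ins so the relay sketch can run on its own."""
    def receive(self): return b"remote-image", b"remote-sound"
    def capture(self): return b"camera-image", b"microphone-sound"
    def show(self, image): print("LCD 51 <-", image)
    def play(self, sound): print("speaker 77 <-", sound)
    def send(self, image, sound): print("to external device 191 <-", image, sound)

def relay_videophone_frame(external, terminal):
    """Forward the far side's image and sound to the terminal device, and the
    terminal's camera image and microphone sound back to the far side."""
    image, sound = external.receive()   # from the external device 191 via the network 190
    terminal.show(image)                # displayed on the LCD 51
    terminal.play(sound)                # output from the speaker 77
    cam, mic = terminal.capture()       # camera 56 image and microphone 79 sound
    external.send(cam, mic)             # sent back over the network

relay_videophone_frame(_Stub(), _Stub())
```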
In the present embodiment, since the terminal device 7 is a portable device, the user can use the terminal device 7 at an arbitrary position or orient the camera 56 in an arbitrary direction. In the present embodiment, since the terminal device 7 includes the touch panel 52, the game device 3 can also transmit the input information to the touch panel 52 (the touch position data 100) to the external device 191. For example, when an image or sound from the external device 191 is output through the terminal device 7 and characters or the like written on the touch panel 52 by the user are transmitted to the external device 191, the game system 1 can also be used as an online teaching system (e-learning system).
(example of operation in conjunction with television broadcast)
In addition, the game system 1 can also operate in conjunction with a television broadcast when the television broadcast is viewed through the television set 2. That is, when a television program is being viewed through the television set 2, the game system 1 causes the terminal device 7 to output information and the like relating to the television program. Next, an operation example when the game system 1 operates in conjunction with television broadcasting will be described.
In the above operation example, the game device 3 can communicate with a server via a network (in other words, the external device 191 shown in fig. 31 is a server). The server stores various information (television information) associated with the television broadcast for each channel of the television broadcast. The television information may be information related to a program such as subtitles and cast information, or information of an EPG (electronic program guide) or information broadcast as data broadcast. The television information may be image, sound, or text information, or a combination thereof. Further, the number of servers is not necessarily one, and a server may be provided for each channel of television broadcasting or each program, and the game device 3 may communicate with each server.
When the television 2 is outputting video and audio of a television broadcast, the game device 3 allows the user to input, using the terminal device 7, the channel of the television broadcast being viewed. Then, the server is requested via the network to transmit television information corresponding to the inputted channel. Accordingly, the server transmits data of the television information corresponding to the channel. Upon receiving the data transmitted from the server, the game device 3 outputs the received data to the terminal device 7. The terminal device 7 displays the image and character data among the data on the LCD 51, and outputs the audio data from the speaker. According to the above, the user can enjoy information and the like relating to the television program currently being viewed by using the terminal device 7.
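The request-and-display flow for the television information might be sketched as follows, with the server interface reduced to a placeholder callable (no actual server API is implied):

```python
def fetch_tv_information(request_from_server, channel):
    """Ask the server for the television information matching the channel the
    user entered on the terminal device, then split the reply into data for
    display on the LCD and data for audio output."""
    info = request_from_server(channel)
    return {
        "for_lcd": (info.get("images", []), info.get("text", "")),
        "for_speaker": info.get("sound", b""),
    }

# Example with a dummy server that returns canned data for any channel.
dummy_server = lambda ch: {"text": f"Now showing on channel {ch}", "images": [], "sound": b""}
print(fetch_tv_information(dummy_server, 4))
```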
As described above, the game system 1 can also provide information linked with television broadcasting to the user through the terminal device 7 by communicating with an external device (server) via a network. In particular, in the present embodiment, the terminal device 7 is a portable device, and therefore, the user can use the terminal device 7 at an arbitrary position, and convenience is high.
As described above, in the present embodiment, the user can use the terminal device 7 in various applications and modes in addition to using the terminal device 7 in the game.
[8. modification ]
The above embodiment is merely an example of the present invention; in other embodiments, the present invention can also be implemented with, for example, the configurations described below.
(modification having a plurality of terminal devices)
In the above embodiment, the game system 1 has only one terminal device, but the game system 1 may have a plurality of terminal devices. That is, the game device 3 may be configured as follows: the game device is capable of performing wireless communication with each of a plurality of terminal devices, transmitting game image data, game sound data, and control data to each terminal device, and receiving operation data, camera image data, and microphone sound data from each terminal device. In this case, the game device 3 may perform wireless communication with each of the terminal devices in a time division manner, or may perform communication by allocating a frequency band.
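The time-division alternative can be pictured as a fixed slot schedule, as in the sketch below; the slot structure and function names are illustrative assumptions rather than the actual wireless protocol.

```python
from itertools import cycle

def time_division_schedule(terminal_ids, slots_per_frame):
    """Return (slot index, terminal id) pairs for one frame of communication."""
    order = cycle(terminal_ids)
    return [(slot, next(order)) for slot in range(slots_per_frame)]

if __name__ == "__main__":
    terminals = ["terminal A", "terminal B"]
    for slot, terminal in time_division_schedule(terminals, slots_per_frame=4):
        # In each slot the game device would send game image, game sound, and
        # control data, and receive operation, camera image, and microphone
        # sound data from the terminal assigned to that slot.
        print(f"slot {slot}: exchange data with {terminal}")
```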
When a plurality of terminal devices are provided as described above, a wider variety of games can be played using the game system. For example, in the case where the game system 1 has two terminal devices, the game system 1 has three display devices, so game images for three players can be generated and displayed on the respective display devices. Also, in the case where the game system 1 has two terminal devices, two players can play simultaneously in a game in which a controller and a terminal device are used as one set (for example, the fifth game example described above). Further, when the game processing in step S27 is performed based on marker coordinate data output from two controllers, each of the two players can perform game operations by pointing the controller at a marker (the marker device 6 or the marker section 55). That is, one player can perform game operations with the controller pointed at the marker device 6, and the other player can perform game operations with the controller pointed at the marker section 55.
(modification of function of terminal device)
In the above embodiment, the terminal device 7 functions as a so-called thin client (thin client) that does not execute a game process. In other embodiments, a part of the series of game processes executed by the game device 3 in the above embodiments may be executed by another device such as the terminal device 7. For example, the terminal device 7 may be caused to execute a part of the processing (for example, the terminal game image generation processing). That is, the terminal device may function as a portable game device that performs game processing in accordance with an operation on the operation unit, generates a game image in accordance with the game processing, and displays the game image on the display unit. In a game system having a plurality of information processing apparatuses (game apparatuses) capable of communicating with each other, for example, the plurality of information processing apparatuses may share and execute a game process.
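One way to picture this division of work is shown below: the game device advances the game state, and the terminal device generates its own game image from the transmitted state. The state layout and functions are invented for the example and are not the processing of the above embodiment.

```python
def game_device_step(state, operation_data):
    """Game processing kept on the game device: advance the state one frame."""
    x, y = state["player_pos"]
    dx, dy = operation_data
    return {"player_pos": (x + dx, y + dy)}

def terminal_generate_image(state):
    """Terminal-side generation of the terminal game image from the received state."""
    return f"frame with player at {state['player_pos']}"

if __name__ == "__main__":
    state = {"player_pos": (0, 0)}
    for operation in [(1, 0), (0, 1)]:               # operation data from the operation section
        state = game_device_step(state, operation)   # would run on the game device
        print(terminal_generate_image(state))        # would run on the terminal device
```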
(modification of the configuration of the terminal device)
The terminal device of the above embodiment is merely an example; the shapes of the operation buttons and the housing 50 and the number and installation positions of the components are all examples, and other shapes, numbers, and installation positions may be used. For example, the terminal device may have the following configuration. Next, a modification of the terminal device will be described with reference to fig. 32 to 35.
Fig. 32 is a diagram showing an external configuration of a terminal device according to a modification of the above embodiment. Fig. 32(a) is a front view of the terminal device, fig. 32(b) is a top view, fig. 32(c) is a right side view, and fig. 32(d) is a bottom view. Fig. 33 is a diagram showing a state in which the user holds the terminal device shown in fig. 32. In fig. 32 and 33, components corresponding to those of the terminal device 7 of the above embodiment are given the same reference numerals as in fig. 8, but these components need not be identical.
As shown in fig. 32, the terminal device 8 includes a housing 50, and the housing 50 has a substantially horizontally long rectangular plate-like shape. The housing 50 is sized to be gripped by a user. Therefore, the user can hold and move the terminal device 8 or change the arrangement position of the terminal device 8.
The terminal device 8 has an LCD 51 on the front surface of the housing 50. The LCD 51 is disposed near the center of the front surface of the housing 50. Therefore, by holding the housing 50 on both sides of the LCD 51 as shown in fig. 33, the user can hold and move the terminal device while viewing the screen of the LCD 51. Fig. 33 shows an example in which the user holds the terminal device 8 sideways (in a horizontally long orientation) by holding the housing 50 on the left and right sides of the LCD 51, but the user may also hold the terminal device 8 lengthwise (in a vertically long orientation).
As shown in fig. 32(a), the terminal device 8 has a touch panel 52 on the screen of the LCD 51 as an operation means (operation section). In the present modification, the touch panel 52 is a resistive film type touch panel. However, the touch panel is not limited to the resistive type, and any type of touch panel, such as a capacitive type, can be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In the present modification, a touch panel having the same resolution (detection accuracy) as that of the LCD 51 is used as the touch panel 52. However, the resolution of the touch panel 52 and the resolution of the LCD 51 do not necessarily have to coincide. Input to the touch panel 52 is usually performed with a stylus, but the input is not limited to a stylus and may also be made with the user's finger. The housing 50 may be provided with a storage hole for storing the stylus used to operate the touch panel 52. Since the terminal device 8 includes the touch panel 52 in this manner, the user can operate the touch panel 52 while moving the terminal device 8. That is, the user can make inputs directly to the screen of the LCD 51 (through the touch panel 52) while moving the screen.
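Because the two resolutions need not coincide, a raw touch coordinate would in general be rescaled to the LCD coordinate system. The sketch below uses made-up resolutions purely to illustrate such a mapping; the actual specifications of the touch panel 52 and the LCD 51 are not implied.

```python
def touch_to_lcd(raw, touch_res, lcd_res):
    """Map a raw touch-panel coordinate to an LCD pixel coordinate."""
    rx, ry = raw
    tw, th = touch_res
    lw, lh = lcd_res
    return (rx * lw // tw, ry * lh // th)

if __name__ == "__main__":
    # Example with made-up resolutions: a 1024x1024 sensor over an 854x480 screen.
    print(touch_to_lcd((512, 512), (1024, 1024), (854, 480)))  # -> (427, 240)
```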
As shown in fig. 32, the terminal device 8 includes two analog sticks 53A and 53B and a plurality of buttons 54A to 54L as operation means (operation section). Each of the analog sticks 53A and 53B is a device for indicating a direction. Each of the analog sticks 53A and 53B is configured so that its stick part, operated by the user's finger, can be slid or tilted in an arbitrary direction (at an arbitrary angle in the up, down, left, right, and oblique directions) with respect to the surface of the housing 50. The left analog stick 53A is provided on the left side of the screen of the LCD 51, and the right analog stick 53B is provided on the right side of the screen of the LCD 51. Therefore, the user can make a direction-indicating input with an analog stick using either the left or right hand. In addition, as shown in fig. 33, the analog sticks 53A and 53B are provided at positions where the user can operate them while holding the left and right portions of the terminal device 8, so the user can easily operate the analog sticks 53A and 53B even while holding and moving the terminal device 8.
The buttons 54A to 54L are operation means for making predetermined inputs. As will be described below, the buttons 54A to 54L are provided at positions where the user can operate them while holding the left and right portions of the terminal device 8 (see fig. 33). Therefore, the user can easily operate these operation means even while holding and moving the terminal device 8.
As shown in fig. 32(a), among the operation buttons 54A to 54L, the cross button (direction input button) 54A and the buttons 54B to 54H are provided on the front surface of the housing 50. That is, these buttons 54A to 54H are arranged at positions where they can be operated with the user's thumbs (see fig. 33).
The cross button 54A is provided on the left side of the LCD 51 and below the left analog stick 53A. That is, the cross button 54A is arranged at a position where the user can operate it with the left hand. The cross button 54A has a cross shape and can be used to indicate the up, down, left, and right directions. The buttons 54B to 54D are provided below the LCD 51. These three buttons 54B to 54D are arranged at positions where they can be operated with either hand. The four buttons 54E to 54H are provided on the right side of the LCD 51 and below the right analog stick 53B. That is, the four buttons 54E to 54H are arranged at positions where the user can operate them with the right hand. Further, the four buttons 54E to 54H are arranged so as to be located above, below, to the left of, and to the right of their common center position. Therefore, the terminal device 8 can also use the four buttons 54E to 54H as buttons with which the user indicates the up, down, left, and right directions.
In addition, as shown in figs. 32(a), 32(b), and 32(c), the first L button 54I and the first R button 54J are provided on the obliquely upper portions (the upper left portion and the upper right portion) of the housing 50. Specifically, the first L button 54I is provided at the left end of the upper side surface of the plate-shaped housing 50 and is exposed on the upper and left side surfaces. The first R button 54J is provided at the right end of the upper side surface of the housing 50 and is exposed on the upper and right side surfaces. In this way, the first L button 54I is arranged at a position where the user can operate it with the left index finger, and the first R button 54J is arranged at a position where the user can operate it with the right index finger (see fig. 33).
As shown in figs. 32(b) and 32(c), the second L button 54K and the second R button 54L are arranged on leg portions 59A and 59B, which are provided so as to protrude from the back surface of the plate-shaped housing 50 (i.e., the surface opposite to the front surface on which the LCD 51 is provided). Like the eaves portion 59 of the above embodiment, the leg portions 59A and 59B are provided in regions that include positions roughly opposite the operation sections (the analog sticks 53A and 53B) provided on the left and right of the display section. The second L button 54K is provided slightly toward the upper side on the left side (the left side as viewed from the front) of the back surface of the housing 50, and the second R button 54L is provided slightly toward the upper side on the right side (the right side as viewed from the front) of the back surface of the housing 50. In other words, the second L button 54K is provided at a position roughly opposite the left analog stick 53A on the front surface, and the second R button 54L is provided at a position roughly opposite the right analog stick 53B on the front surface. In this way, the second L button 54K is arranged at a position where the user can operate it with the middle finger of the left hand, and the second R button 54L is arranged at a position where the user can operate it with the middle finger of the right hand (see fig. 33). Further, as shown in fig. 32(c), the second L button 54K and the second R button 54L are provided on the obliquely upward-facing surfaces of the leg portions 59A and 59B, and thus have button surfaces that face obliquely upward. Since the user's fingers are thought to move up and down when holding the terminal device 8, button surfaces that face upward are easy for the user to press. Moreover, providing the leg portions on the back surface of the housing 50 makes the housing 50 easy for the user to grip, and providing the buttons on the leg portions allows the user to operate them easily while gripping the housing 50.
In addition, with the terminal device 8 shown in fig. 32, since the second L button 54K and the second R button 54L are provided on the back surface, when the terminal device 8 is placed with the screen of the LCD 51 (the front surface of the housing 50) facing upward, the screen may not be completely horizontal. Thus, in another embodiment, three or more leg portions may be formed on the rear surface of the housing 50. Accordingly, in a state where the screen of the LCD 51 is facing upward, the terminal device 8 can be placed on the placement surface by the legs contacting the placement surface, and therefore, the terminal device 8 can be placed with the screen horizontal. Further, the terminal device 8 may be horizontally placed by adding a leg portion that can be attached and detached.
The buttons 54A to 54L are assigned functions corresponding to the game program as appropriate. For example, the cross button 54A and the buttons 54E to 54H may be used for a direction instruction operation, a selection operation, and the like, and the buttons 54B to 54E may be used for a confirmation operation, a cancellation operation, and the like.
Although not shown, the terminal device 8 has a power button for turning on/off the power of the terminal device 8. The terminal device 8 may have a button for turning on/off the screen display of the LCD 51, a button for performing connection setting (pairing) with the game device 3, and a button for adjusting the volume of a speaker (speaker 77 shown in fig. 10).
As shown in fig. 32(a), the terminal device 8 includes, on the front surface of the housing 50, a marker section (the marker section 55 shown in fig. 10) consisting of a marker 55A and a marker 55B. The marker section 55 is provided on the upper side of the LCD 51. The markers 55A and 55B are each composed of one or more infrared LEDs, like the markers 6R and 6L of the marker device 6. The marker section 55 is used for the game device 3 to calculate the movement of the controller 5 and the like, like the marker device 6 described above. Further, the game device 3 can control the lighting of each infrared LED provided in the marker section 55.
The terminal device 8 includes a camera 56 as an imaging unit. The camera 56 includes an image pickup element (e.g., a CCD image sensor, a CMOS image sensor, etc.) having a predetermined resolution and a lens. As shown in fig. 32, in the present modification, a camera 56 is provided on the front surface of the housing 50. Therefore, the camera 56 can capture the face of the user holding the terminal device 8, and can capture the face of the user when playing a game while viewing the LCD 51, for example.
The terminal device 8 includes a microphone (the microphone 79 shown in fig. 10) as a sound input means. A microphone hole 50c is provided in the front surface of the housing 50. The microphone 79 is provided inside the housing 50, behind the microphone hole 50c. The microphone detects sounds around the terminal device 8, such as the user's voice.
The terminal device 8 includes a speaker (speaker 77 shown in fig. 10) as an audio output means. As shown in fig. 32 (d), a speaker hole 57 is provided in the lower side surface of the housing 50. The output sound of the speaker 77 is output from the speaker hole 57. In the present modification, the terminal device 8 includes two speakers, and speaker holes 57 are provided at positions of the left speaker and the right speaker, respectively.
The terminal device 8 is provided with an extension connector 58 for connecting another device to the terminal device 8. In the present modification, as shown in fig. 32(d), the extension connector 58 is provided on the lower side surface of the housing 50. Any device may be connected to the extension connector 58; for example, it may be a controller used for a specific game (such as a gun-shaped controller) or an input device such as a keyboard. The extension connector 58 may be omitted if there is no need to connect another device.
Note that, with the terminal device 8 shown in fig. 32 as well, the shapes of the operation buttons and the housing 50, the number of components, their installation positions, and the like are merely examples, and other shapes, numbers, and installation positions may be used.
As described above, in the above modification, the two leg portions 59A and 59B, provided at positions on the left and right sides of the back surface of the housing 50, serve as the protruding portions. In this case as well, the user can comfortably hold the terminal device 8 by gripping it with the ring fingers or middle fingers hooked on the lower surfaces of the protruding portions, as in the above embodiment (see fig. 33). Further, since the second L button 54K and the second R button 54L are provided on the upper surfaces of the protruding portions as in the above embodiment, the user can easily operate these buttons in that state.
As in the above-described embodiment and modification, the projection is preferably provided so as to protrude at least at the left and right sides of the rear surface side of the housing above the center of the housing. Thus, when the user grips both the left and right sides of the housing, the terminal device can be easily gripped by hooking the protrusion to the finger. Further, by providing the projection portion on the upper side, the user can support the housing with the palm (see fig. 10 and the like), and thus the terminal device can be reliably held.
The protruding portion need not be provided above the center of the housing. For example, when operation sections are provided on the left and right of the display section, the protruding portion may be provided at any position where it can be hooked on a finger other than the thumbs while the user holds the housing in such a way that the operation sections can be operated with the thumbs of both hands. In that case as well, the user can comfortably hold the terminal device by hooking the protruding portion on the fingers.
Figs. 34 and 35 are diagrams showing the external configuration of a terminal device according to another modification of the above embodiment. Fig. 34 is a right side view of the terminal device, and fig. 35 is a bottom view. The terminal device 9 shown in figs. 34 and 35 is the same as the terminal device 7 of the above embodiment except that convex portions 230a and 230b are provided. Next, the configuration of the terminal device 9 according to this modification will be described, focusing on the differences from the above embodiment.
The convex portions 230a and 230b each have a convex cross-sectional shape and are provided on the left and right sides, respectively, on the back surface side of the housing 50. Here, the convex portion 230a is provided on the left side (the left side as viewed from the front) of the housing 50, and the convex portion 230b is provided on the right side (the right side as viewed from the front) of the housing 50. As shown in fig. 35, the convex portions 230a and 230b are provided at the left and right edges (both end portions) of the housing 50. Each of the convex portions 230a and 230b is provided below the protruding portion (the eaves portion 59), with a gap between it and the protruding portion. That is, in the housing 50, the portion between each of the convex portions 230a and 230b and the protruding portion is thinner than the convex portions and the protruding portion. Each of the convex portions 230a and 230b extends in the up-down direction and has a convex cross section in the plane perpendicular to the up-down direction.
In the present modification, the user can hold the terminal device 9 more securely by gripping it with the little fingers (and ring fingers) wrapped around the convex portions 230a and 230b. That is, the convex portions 230a and 230b function as grip portions. The grip portion may have any shape, but is preferably formed to extend in the up-down direction, since this makes the terminal device 9 easy to hold. The height of the convex portions 230a and 230b may be any height, and they may be formed lower than the protruding portion. In that case, when the terminal device 9 is placed with the screen of the LCD 51 facing upward, the lower side of the screen is lower than the upper side, so the terminal device 9 can be placed in a state in which the screen is easy to view. Further, since the convex portions 230a and 230b are provided with a gap from the protruding portion, the user can hold the terminal device 9 with the fingers resting against the lower surface of the protruding portion without the convex portions getting in the way of those fingers. As described above, according to this modification, providing the convex portions below the protruding portion allows the user to hold the terminal device more securely. In another embodiment, the protruding portion may be omitted from the back surface of the housing 50; in that case, the user can still hold the housing 50 securely by means of the convex portions (grip portions). The surfaces of the convex portions (grip portions) may be made of a non-slip material to further improve the grip function. Even when the convex portions are not provided, a non-slip material may be used on the back surface of the housing.
(modification of device to which this configuration is applied)
In the above embodiment, a terminal device used together with a stationary game device was described as an example, but the configuration of the operation device described in this specification can be applied to any device that a user holds and uses. For example, the operation device may be implemented as an information terminal such as a portable game machine, a mobile phone, or an electronic book terminal.
As described above, the present invention can be used as an operation device (terminal device) in a game system, for example, for the purpose of allowing a user to easily grip the device.
The present invention has been described in detail, but all of the points described above are merely examples of the present invention and are not intended to limit the scope thereof. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention.
Cross reference to related applications
The disclosures of Japanese Patent Applications No. 2010-245298 and No. 2010-245299 filed on November 1, 2010, No. 2011-092506 filed on April 18, 2011, No. 2011-092612 filed on April 19, 2011, No. 2011-102834 filed on May 2, 2011, No. 2011-103704, No. 2011-103705, and No. 2011-103706 filed on May 6, 2011, and No. 2011-118488 filed on May 26, 2011 are incorporated herein by reference.

Claims (20)

HK12112245.4A2010-11-012012-11-28Controller device and information processing deviceHK1171403B (en)

Applications Claiming Priority (18)

Application Number | Priority Date | Filing Date | Title
JP2010245299A (JP4798809B1) | 2010-11-01 | 2010-11-01 | Display device, game system, and game processing method
JP2010245298 | 2010-11-01
JP2010-245299 | 2010-11-01
JP2010-245298 | 2010-11-01
JP2011-092506 | 2011-04-18
JP2011092506 | 2011-04-18
JP2011092612A (JP6103677B2) | 2010-11-01 | 2011-04-19 | GAME SYSTEM, OPERATION DEVICE, AND GAME PROCESSING METHOD
JP2011-092612 | 2011-04-19
JP2011-102834 | 2011-05-02
JP2011102834A (JP5837325B2) | 2010-11-01 | 2011-05-02 | Operating device and operating system
JP2011-103706 | 2011-05-06
JP2011103704A (JP6005907B2) | 2010-11-01 | 2011-05-06 | Operating device and operating system
JP2011103706A (JP6005908B2) | 2010-11-01 | 2011-05-06 | Equipment support system and support device
JP2011-103705 | 2011-05-06
JP2011-103704 | 2011-05-06
JP2011103705 | 2011-05-06
JP2011118488A (JP5936315B2) | 2010-11-01 | 2011-05-26 | Information processing system and information processing apparatus
JP2011-118488 | 2011-05-26

Publications (2)

Publication Number | Publication Date
HK1171403A1 (en) | 2013-03-28
HK1171403B (en) | 2015-10-16

Family


Also Published As

Publication number | Publication date
HK1165745A1 (en) | 2012-10-12
HK1171399A1 (en) | 2013-03-28
KR20130020715A (en) | 2013-02-27
HK1171400A1 (en) | 2013-03-28

Similar Documents

Publication | Publication Date | Title
KR101492310B1 (en) | Operating apparatus and information processing apparatus
TWI442963B (en) | Controller device and information processing device
JP6188766B2 (en) | Operating device and operating system
WO2011096203A1 (en) | Game system, operating device, and game processing method
JP5829020B2 (en) | GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
JP4798809B1 (en) | Display device, game system, and game processing method
JP6103677B2 (en) | GAME SYSTEM, OPERATION DEVICE, AND GAME PROCESSING METHOD
JP5936315B2 (en) | Information processing system and information processing apparatus
JP2012096005A (en) | Display device, game system and game processing method
HK1171403A1 (en) | Controller device and information processing device
HK1171403B (en) | Controller device and information processing device
HK1165745B (en) | Controller device and controller system
HK1171400B (en) | Controller device and controller system
HK1171399B (en) | Device support system and support device
HK1171401A1 (en) | Display device, game system, and game method
HK1171401B (en) | Display device, game system, and game method
HK1171402B (en) | Game system, controller device, and game process method
