US11768383B2 - Display control apparatus and display control method - Google Patents

Display control apparatus and display control method

Info

Publication number
US11768383B2
Authority
US
United States
Prior art keywords
viewpoint position
virtual space
viewpoint
sight
display control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US17/225,203
Other versions
US20210223558A1 (en)
Inventor
Tomokazu Kake
Takayuki Ishida
Akira Suzuki
Yasuhiro Watari
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc
Priority to US17/225,203
Publication of US20210223558A1
Priority to US18/452,819
Application granted
Publication of US11768383B2
Priority to US18/887,090
Status: Active
Anticipated expiration

Abstract

Methods and apparatus provide for: generating virtual space images by specifying from among a plurality of viewpoint positions and a plurality of directions of line of sight in the virtual space; and displaying the virtual space images on a head-mounted display. The generating includes changing from a currently selected viewpoint position among the plurality of viewpoint positions to a newly selected viewpoint position among the plurality of viewpoint positions, which is selected in accordance with operation by a user. When generating the virtual space images in response to changing to the newly selected viewpoint position, the generating includes specifying, as a new direction of line of sight among the plurality of directions of line of sight, a direction in which a first position in the virtual space is seen from the newly selected viewpoint position. At least some of the plurality of viewpoint positions are separated from each other.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This is a continuation application of U.S. patent application Ser. No. 15/773,409, accorded a filing date of May 3, 2018, allowed, which is a national stage application of International Application No. PCT/JP2016/084937, filed Nov. 25, 2016, which claims priority to JP Application No. 2015-235897, filed Dec. 2, 2015, the entire disclosures of which are hereby incorporated by reference.
TECHNICAL FIELD
The present invention relates to a display control technology, and more particularly, to a display control apparatus and a display control method for controlling display on a head-mounted display.
BACKGROUND ART
Games are played by wearing a head-mounted display connected to a game console, watching a screen displayed on the head-mounted display, and manipulating a controller or other device. With an ordinary stationary display, a user's field-of-view range extends beyond the display screen, which can make it difficult to focus one's attention on the screen and can weaken the sense of immersion. When a head-mounted display is worn, by contrast, the user sees nothing other than the image appearing on it, which heightens the sense of immersion into the image world and further enhances the entertaining nature of the game.
SUMMARY
Technical Problem
The inventor recognized the need for a more convenient display control technology so that games using a head-mounted display can be enjoyed by a wider range of users.
Solution to Problem
In order to solve the above problem, a display control apparatus according to a mode of the present invention includes a display control section that generates a virtual space image by specifying a viewpoint position and a direction of line of sight and displays the image on a head-mounted display. The display control section can specify a plurality of positions in the virtual space as viewpoint positions and can change the viewpoint position to a position determined from among the plurality of positions in accordance with an attitude of the head-mounted display, and when the viewpoint position is changed, the display control section specifies, as a direction of line of sight, the direction in which a first position in the virtual space is seen from the changed viewpoint position.
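The viewpoint-change rule in this mode (on switching viewpoints, aim the line of sight at a fixed first position) can be sketched as a standard look-at computation. This is an illustrative sketch, not code from the patent; the function name and coordinate conventions are assumptions:

```python
import math

def look_at_direction(viewpoint, target):
    """Unit vector of the line of sight from the newly selected
    viewpoint toward the fixed 'first position' in the virtual space."""
    dx = target[0] - viewpoint[0]
    dy = target[1] - viewpoint[1]
    dz = target[2] - viewpoint[2]
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    if norm == 0.0:
        raise ValueError("viewpoint coincides with the first position")
    return (dx / norm, dy / norm, dz / norm)
```

Because every new viewpoint computes its direction toward the same first position, the user's attention stays anchored on that position across viewpoint changes.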
Another mode of the present invention is a display control apparatus. This apparatus includes a display control section and a viewpoint position control section. The display control section generates a virtual space image by specifying a viewpoint position and a direction of line of sight and displays the image on a head-mounted display. The viewpoint position control section moves the viewpoint position in accordance with a position of the head-mounted display. The viewpoint position control section moves the viewpoint position to a greater extent when the head-mounted display is moved horizontally than when the head-mounted display is moved perpendicularly.
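The asymmetric mapping in this second mode — horizontal head movement moves the viewpoint to a greater extent than perpendicular movement — might look like the following; the gain values and axis convention (y up) are illustrative assumptions, not specified by the patent:

```python
def scaled_viewpoint_delta(head_delta, horizontal_gain=2.0, vertical_gain=1.0):
    """Map an HMD displacement (dx, dy, dz), with y pointing up, to a
    viewpoint displacement. Horizontal components are amplified more
    than the vertical one; the gains here are illustrative only."""
    dx, dy, dz = head_delta
    return (dx * horizontal_gain, dy * vertical_gain, dz * horizontal_gain)
```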
It should be noted that arbitrary combinations of the above components and conversions of expressions of the present invention between method, apparatus, system, program, and so on are also effective as modes of the present invention.
Advantageous Effect of Invention
According to the present invention, it is possible to improve convenience of head-mounted display users.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a diagram illustrating an environment in which a game system according to an embodiment is used.
FIG. 2 is an external view of a head-mounted display according to the embodiment.
FIG. 3 is a functional configuration diagram of the head-mounted display.
FIG. 4 depicts diagrams illustrating an external configuration of an input apparatus.
FIG. 5 is a diagram illustrating an internal configuration of the input apparatus.
FIG. 6 is a diagram illustrating a configuration of a gaming apparatus.
FIG. 7 is a functional configuration diagram of the gaming apparatus.
FIGS. 8(a) to 8(c) are diagrams illustrating examples of images displayed on the head-mounted display.
FIG. 9 is a schematic diagram for describing a method of specifying a viewpoint position and a direction of line of sight.
FIGS. 10(a) and 10(b) are diagrams illustrating examples of images displayed on the head-mounted display.
FIG. 11 is a schematic diagram for describing details of another game provided by a game control section.
FIGS. 12(a) to 12(c) are diagrams illustrating examples of images displayed on the head-mounted display.
FIG. 13 is a schematic diagram for describing a method of moving the viewpoint position in a game according to the embodiment.
DESCRIPTION OF EMBODIMENT
In the present embodiment, a description will be given of a display technology using a head-mounted display (HMD). A head-mounted display is a display apparatus worn on a user's head in such a manner as to cover his or her eyes so that the user can view still images and videos appearing on a display screen provided in front of user's eyes. What appears on the head-mounted display may be content such as movies and television (TV) programs. In the present embodiment, however, a description will be given of an example in which a head-mounted display is used as a display apparatus for displaying game images.
FIG. 1 is a diagram illustrating an environment in which a game system 1 according to an embodiment is used. The game system 1 includes a gaming apparatus 10, an input apparatus 20, an imaging apparatus 14, a head-mounted display 100, and a display apparatus 12. The gaming apparatus 10 executes a game program. The input apparatus 20 is used to input a user instruction to the gaming apparatus 10. The imaging apparatus 14 images a real space around a user. The head-mounted display 100 displays a first game image generated by the gaming apparatus 10. The display apparatus 12 displays a second game image generated by the gaming apparatus 10.
The gaming apparatus 10 executes a game program based on an instruction input supplied from the input apparatus 20 or the head-mounted display 100, a position or attitude of the input apparatus 20 or the head-mounted display 100, and so on, generates a first game image and transports the image to the head-mounted display 100, and generates a second game image and transports the image to the display apparatus 12.
The head-mounted display 100 displays the first game image generated by the gaming apparatus 10. The head-mounted display 100 also transports, to the gaming apparatus 10, information related to user input to the input apparatus provided on the head-mounted display 100. The head-mounted display 100 may be connected to the gaming apparatus 10 with a wired cable. Alternatively, the head-mounted display 100 may be connected wirelessly through a wireless local area network (LAN) or other means.
The display apparatus 12 displays the second game image generated by the gaming apparatus 10. The display apparatus 12 may be a TV having a display and a speaker. Alternatively, the display apparatus 12 may be a computer display or other apparatus.
The input apparatus 20 has a function to transport user instruction input to the gaming apparatus 10 and is configured as a wireless controller capable of wirelessly communicating with the gaming apparatus 10 in the present embodiment. The input apparatus 20 and the gaming apparatus 10 may establish a wireless connection using the Bluetooth (registered trademark) protocol. It should be noted that the input apparatus 20 is not limited to a wireless controller and may be a wired controller connected to the gaming apparatus 10 via a cable.
The input apparatus 20 is driven by batteries and is configured to have a plurality of buttons for making instruction input so as to progress the game. When the user operates a button on the input apparatus 20, instruction input resulting from the operation is sent to the gaming apparatus 10 through wireless communication.
The imaging apparatus 14 is a video camera that includes, for example, a charge-coupled device (CCD) imaging device or a complementary metal-oxide semiconductor (CMOS) imaging device and generates, by imaging a real space at a given interval, a frame image for each interval. The imaging apparatus 14 is connected to the gaming apparatus 10 via a universal serial bus (USB) or other interface. An image captured by the imaging apparatus 14 is used by the gaming apparatus 10 to derive the positions and attitudes of the input apparatus 20 and the head-mounted display 100. The imaging apparatus 14 may be a ranging camera or a stereo camera capable of acquiring a distance. In this case, the imaging apparatus 14 makes it possible to acquire the distance between the imaging apparatus 14 and the input apparatus 20 or the head-mounted display 100.
In the game system 1 of the present embodiment, the input apparatus 20 and the head-mounted display 100 have a light-emitting section configured to emit light in a plurality of colors. During a game, the light-emitting section emits light in the color specified by the gaming apparatus 10 and is imaged by the imaging apparatus 14. The imaging apparatus 14 images the input apparatus 20, generates a frame image, and supplies the image to the gaming apparatus 10. The gaming apparatus 10 acquires the frame image and derives position information of the light-emitting section in the real space from the position and size of the image of the light-emitting section in the frame image. The gaming apparatus 10 treats the position information as a game operation instruction and reflects it in game processing, including controlling the action of a player's character.
Also, the input apparatus 20 and the head-mounted display 100 have an acceleration sensor and a gyrosensor. Sensor detection values are sent to the gaming apparatus 10 at a given interval, and the gaming apparatus 10 acquires the sensor detection values and derives attitude information of the input apparatus 20 and the head-mounted display 100 in the real space. The gaming apparatus 10 treats the attitude information as a game operation instruction and reflects it in game processing.
FIG. 2 is an external view of the head-mounted display 100 according to the embodiment. The head-mounted display 100 includes a main body section 110, a head contact section 112, and a light-emitting section 114.
The main body section 110 includes a display, a global positioning system (GPS) unit for acquiring position information, an attitude sensor, a communication apparatus, and so on. The head contact section 112 may include a biological information acquisition sensor capable of measuring the user's biological information such as temperature, pulse, blood components, perspiration, brain waves, and cerebral blood flow. As described above, the light-emitting section 114 emits light in the color specified by the gaming apparatus 10 and functions as a criterion for calculating the position of the head-mounted display 100 in the image captured by the imaging apparatus 14.
A camera for capturing the user's eyes may be further provided on the head-mounted display 100. The camera mounted to the head-mounted display 100 permits detection of the user's line of sight, movement of the pupils, blinking, and so on.
Although a description will be given of the head-mounted display 100 in the present embodiment, the display control technology of the present embodiment is applicable not only to a case in which the head-mounted display 100 in a narrow sense is worn but also to a case in which eyeglasses, an eyeglass-type display, an eyeglass-type camera, a headphone, a headset (a headphone equipped with a microphone), an earphone, an earring, an ear-mounted camera, a hat, a camera-equipped hat, or a hair band is worn.
FIG. 3 is a functional configuration diagram of the head-mounted display 100. The head-mounted display 100 includes an input interface 122, an output interface 130, a backlight 132, a communication control section 140, a network adapter 142, an antenna 144, a storage section 150, a GPS unit 161, a wireless unit 162, an attitude sensor 164, an external input/output (I/O) terminal interface 170, an external memory 172, a clock section 180, a display apparatus 190, and a control section 160. These functional blocks can be realized by hardware alone, software alone, or a combination thereof in various forms.
The control section 160 is a main processor that processes and outputs signals such as image signals and sensor signals, instructions, and data. The input interface 122 accepts an operation signal and a setup signal from input buttons and so on and supplies these signals to the control section 160. The output interface 130 receives an image signal from the control section 160 and displays it on the display apparatus 190. The backlight 132 supplies backlight to the liquid crystal display making up the display apparatus 190.
The communication control section 140 sends, to external equipment, data input from the control section 160 in a wired or wireless communication manner via the network adapter 142 or the antenna 144. The communication control section 140 also receives data from external equipment in a wired or wireless manner via the network adapter 142 or the antenna 144 and outputs the data to the control section 160.
The storage section 150 temporarily stores data and parameters processed by the control section 160, operation signals, and so on.
The GPS unit 161 receives position information from a GPS satellite in accordance with an operation signal from the control section 160 and supplies the position information to the control section 160. The wireless unit 162 receives position information from a wireless base station in accordance with an operation signal from the control section 160 and supplies the position information to the control section 160.
The attitude sensor 164 detects attitude information such as the orientation and tilt of the main body section 110 of the head-mounted display 100. The attitude sensor 164 is realized by combining a gyrosensor, an acceleration sensor, an angular acceleration sensor, and so on as appropriate.
The external I/O terminal interface 170 is an interface for connecting peripheral equipment such as a USB controller. The external memory 172 is an external memory such as a flash memory.
The clock section 180 specifies time information using a setup signal from the control section 160 and supplies the time information to the control section 160.
FIG. 4 illustrates an external configuration of the input apparatus 20; FIG. 4(a) illustrates a top surface configuration of the input apparatus 20, and FIG. 4(b) illustrates a bottom surface configuration of the input apparatus 20. The input apparatus 20 has a light-emitting body 22 and a handle 24. The light-emitting body 22 has an outside light-emitting device made of a light-transmitting resin formed in a spherical shape and a light-emitting diode or an electric bulb therein. When the light-emitting device therein emits light, the entire outside spherical body shines. Operating buttons 30, 32, 34, 36, and 38 are provided on the top surface of the handle 24, and an operating button 40 is provided on the bottom surface thereof. The user operates the operating buttons 30, 32, 34, 36, and 38 with the thumb and the operating button 40 with the index finger while holding an end portion of the handle 24 with the hand. The operating buttons 30, 32, 34, 36, and 38 include pushbuttons and are operated as the user presses them. The operating button 40 may be a button that permits entry of an analog amount.
The user plays a game while watching a game screen displayed on the display apparatus 12. The imaging apparatus 14 needs to image the light-emitting body 22 during execution of a game application. Therefore, an imaging range thereof is preferably arranged to face the same direction as the display apparatus 12. In general, the user often plays games in front of the display apparatus 12. Therefore, the imaging apparatus 14 is arranged such that an optical axis thereof matches a front direction of the display apparatus 12. Specifically, the imaging apparatus 14 is preferably arranged near the display apparatus 12 such that the imaging range thereof includes a position where the user can visually recognize the display screen of the display apparatus 12. This allows the imaging apparatus 14 to image the input apparatus 20.
FIG. 5 illustrates an internal configuration of the input apparatus 20. The input apparatus 20 includes a wireless communication module 48, a processing section 50, a light-emitting section 62, and the operating buttons 30, 32, 34, 36, 38, and 40. The wireless communication module 48 has a function to send and receive data to and from a wireless communication module of the gaming apparatus 10. The processing section 50 performs predetermined processes in the input apparatus 20.
The processing section 50 includes a main control section 52, an input acceptance section 54, a triaxial acceleration sensor 56, a triaxial gyrosensor 58, and a light emission control section 60. The main control section 52 sends and receives necessary data to and from the wireless communication module 48.
The input acceptance section 54 accepts input information from the operating buttons 30, 32, 34, 36, 38, and 40 and sends the input information to the main control section 52. The triaxial acceleration sensor 56 detects acceleration components in the three axial directions of X, Y, and Z. The triaxial gyrosensor 58 detects angular speeds on the XZ, ZY, and YX planes. It should be noted that, here, the width, height, and length directions of the input apparatus 20 are specified as the X, Y, and Z axes. The triaxial acceleration sensor 56 and the triaxial gyrosensor 58 are preferably arranged near the center inside the handle 24. The wireless communication module 48 sends, together with input information from the operating buttons, detection value information obtained by the triaxial acceleration sensor 56 and detection value information obtained by the triaxial gyrosensor 58, to the wireless communication module of the gaming apparatus 10 at a given interval. This transmission interval is set, for example, at 11.25 milliseconds.
The light emission control section 60 controls light emission of the light-emitting section 62. The light-emitting section 62 has a red light-emitting diode (LED) 64a, a green LED 64b, and a blue LED 64c, allowing it to emit light in a plurality of colors. The light emission control section 60 causes the light-emitting section 62 to emit light in a desired color by controlling light emission of the red LED 64a, the green LED 64b, and the blue LED 64c.
When a light emission instruction is received from the gaming apparatus 10, the wireless communication module 48 supplies the light emission instruction to the main control section 52, which supplies it to the light emission control section 60. The light emission control section 60 controls light emission of the red LED 64a, the green LED 64b, and the blue LED 64c such that the light-emitting section 62 emits light in the color specified by the light emission instruction. For example, the light emission control section 60 may control the lighting of each LED through pulse width modulation (PWM) control.
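As one way to realize the PWM control mentioned above, each LED's duty cycle can be made proportional to its channel in the commanded (R, G, B) color. This is a sketch under that assumption; the patent does not specify the mapping:

```python
def pwm_duty_cycles(color, max_level=255):
    """Per-LED PWM duty cycles in [0.0, 1.0] for an (R, G, B) color
    command; a brighter channel gets a longer 'on' fraction of each
    PWM period."""
    return tuple(channel / max_level for channel in color)
```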
FIG. 6 illustrates a configuration of the gaming apparatus 10. The gaming apparatus 10 includes a frame image acquisition section 80, an image processing section 82, a device information deriving section 84, a wireless communication module 86, an input acceptance section 88, an output section 90, and an application processing section 300. The processing capability of the gaming apparatus 10 in the present embodiment is realized by a central processing unit (CPU), a memory, a program loaded into the memory, and so on. Here, a configuration is depicted that is realized by these components working with each other in a coordinated fashion. The program may be built into the gaming apparatus 10 or supplied externally, stored in a recording medium. Therefore, it is to be understood by those skilled in the art that these functional blocks can be realized in various ways by hardware alone, software alone, or a combination thereof. It should be noted that the gaming apparatus 10 may have a plurality of CPUs from the viewpoint of hardware configuration.
The wireless communication module 86 establishes wireless communication with the wireless communication module 48 of the input apparatus 20. This allows the input apparatus 20 to send operating button state information and detection value information of the triaxial acceleration sensor 56 and the triaxial gyrosensor 58 to the gaming apparatus 10 at a given interval.
The wireless communication module 86 receives the operating button state information and sensor detection value information sent from the input apparatus 20 and supplies them to the input acceptance section 88. The input acceptance section 88 separates the button state information and sensor detection value information and hands them over to the application processing section 300. The application processing section 300 receives the button state information and sensor detection value information as a game operation instruction. The application processing section 300 treats the sensor detection value information as attitude information of the input apparatus 20.
The frame image acquisition section 80 is configured as a USB interface and acquires frame images at a given imaging speed (e.g., 30 frames/second) from the imaging apparatus 14. The image processing section 82 extracts a light-emitting body image from a frame image and identifies the position and size of the light-emitting body in the frame image. For example, as the light-emitting body 22 of the input apparatus 20 emits light in a color that is unlikely to be used in the user's environment, the image processing section 82 can extract a light-emitting body image from a frame image with high accuracy. The image processing section 82 may generate a binarized image by binarizing frame image data using a given threshold. This binarization encodes the pixel value of a pixel having luminance higher than the given threshold as “1” and the pixel value of a pixel having luminance equal to or lower than the given threshold as “0.” By causing the light-emitting body 22 to light up at luminance beyond this given threshold, the image processing section 82 can identify the position and size of the light-emitting body image from the binarized image. For example, the image processing section 82 identifies the coordinates of the center of gravity and the radius of the light-emitting body image in the frame image.
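The binarization and blob-measurement step described above can be sketched with NumPy. The thresholding and centre-of-gravity/radius outputs follow the text; the array handling and the disk-area radius estimate are assumptions:

```python
import numpy as np

def locate_emitter(frame, threshold):
    """Binarize a grayscale frame (1 above the threshold, 0 otherwise)
    and return the emitter blob's centre of gravity (cx, cy) and an
    approximate radius, or None if no pixel exceeds the threshold."""
    binary = (frame > threshold).astype(np.uint8)
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None  # emitter not visible in this frame
    cx, cy = float(xs.mean()), float(ys.mean())
    radius = float(np.sqrt(xs.size / np.pi))  # treat the blob as a disk
    return cx, cy, radius
```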
The device information deriving section 84 derives position information of the input apparatus 20 and the head-mounted display 100 as seen from the imaging apparatus 14 from the position and size of the light-emitting body image identified by the image processing section 82. The device information deriving section 84 derives position coordinates in camera coordinates from the center of gravity of the light-emitting body image and also derives distance information from the imaging apparatus 14 from the radius of the light-emitting body image. The position coordinates and the distance information make up the position information of the input apparatus 20 and the head-mounted display 100. The device information deriving section 84 derives position information of the input apparatus 20 and the head-mounted display 100 for each frame image and hands the position information over to the application processing section 300. The application processing section 300 receives the position information of the input apparatus 20 and the head-mounted display 100 as a game operation instruction.
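The distance-from-radius step can be modeled with a pinhole camera: the image radius of the spherical emitter shrinks inversely with distance. The focal length and real sphere radius below are hypothetical constants, not values from the patent:

```python
def emitter_distance(radius_px, real_radius_m=0.02, focal_px=500.0):
    """Camera-to-emitter distance in metres from the blob radius in
    pixels, assuming a pinhole model: radius_px ~= focal_px * R / Z,
    so Z = focal_px * R / radius_px. Both constants are illustrative."""
    return focal_px * real_radius_m / radius_px
```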
The application processing section 300 progresses the game based on the position information and attitude information of the input apparatus 20 and the button state information and generates an image signal indicating the processing results of the game application. The image signal is sent from the output section 90 to the display apparatus 12 and output as a display image.
FIG. 7 is a functional configuration diagram of the gaming apparatus 10. The application processing section 300 of the gaming apparatus 10 includes a control section 310 and a data holding section 360. The control section 310 includes a game control section 311, an instruction input acquisition section 312, an HMD information acquisition section 314, an input apparatus information acquisition section 315, a first image generation section 316, and a second image generation section 317.
The data holding section 360 holds program data of games executed in the gaming apparatus 10, various data used by the game programs, and so on.
The instruction input acquisition section 312 acquires, from the input apparatus 20 or the head-mounted display 100, information related to user instruction input accepted by the input apparatus 20 or the head-mounted display 100.
The HMD information acquisition section 314 acquires information related to the attitude of the head-mounted display from the head-mounted display 100. Also, the HMD information acquisition section 314 acquires information related to the position of the head-mounted display 100 from the device information deriving section 84. These pieces of information are conveyed to the game control section 311. Information related to the attitude of the head-mounted display 100 may also be acquired by the device information deriving section 84 analyzing a captured image of the head-mounted display 100.
The input apparatus information acquisition section 315 acquires information related to the attitude of the input apparatus 20. Also, the input apparatus information acquisition section 315 acquires information related to the position of the input apparatus 20 from the device information deriving section 84. These pieces of information are conveyed to the game control section 311. Information related to the attitude of the input apparatus 20 may also be acquired by the device information deriving section 84 analyzing a captured image of the input apparatus 20.
If the input apparatus 20 moves out of the imaging range of the imaging apparatus 14, or if the input apparatus 20 is hidden behind the user's body or an obstacle and fails to be imaged by the imaging apparatus 14, the input apparatus information acquisition section 315 calculates the position of the input apparatus 20 based on the previously acquired position of the input apparatus 20 and information related to the attitude of the input apparatus 20 acquired after that point in time. For example, the current position of the input apparatus 20 may be calculated by calculating a deviation from the previously acquired position of the input apparatus 20 based on translational acceleration data acquired from the acceleration sensor of the input apparatus 20. While the input apparatus 20 is not imaged by the imaging apparatus 14, the position of the input apparatus 20 is successively calculated in a similar manner. When the input apparatus 20 is imaged again by the imaging apparatus 14, there is a possibility that the position successively calculated from acceleration data may not indicate the correct position due to cumulative drift error. Therefore, the position of the input apparatus 20 newly calculated by the device information deriving section 84 may be used as the current position of the input apparatus 20. The same is true for the head-mounted display 100.
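While the device is out of view, the fallback described above amounts to dead reckoning: integrating translational acceleration into velocity and position each sensor interval, then snapping back to the camera-derived position once the device is imaged again. A minimal sketch of one integration step (the semi-implicit Euler scheme is an assumption):

```python
def integrate_position(position, velocity, accel, dt):
    """One dead-reckoning step while the camera cannot see the device:
    advance velocity by the measured translational acceleration, then
    advance position by the updated velocity (semi-implicit Euler).
    Drift accumulates, so the camera-derived position should replace
    the integrated one as soon as the device is imaged again."""
    velocity = tuple(v + a * dt for v, a in zip(velocity, accel))
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity
```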
The game control section 311 executes the game program and progresses the game based on user instruction input acquired by the instruction input acquisition section 312 and information related to the position or attitude of the input apparatus 20 or the head-mounted display 100. In a game field made up of a virtual three-dimensional (3D) space, the game control section 311 changes the position of a player's character, the operation target, based on input made with directional keys or an analog stick of the input apparatus 20 and on changes in position of the input apparatus 20 or the head-mounted display 100.
The first image generation section 316 generates an image to be displayed on the head-mounted display 100. The first image generation section 316 generates a game field image by specifying a viewpoint position based on the position of the operation target controlled by the game control section 311, specifying a direction of line of sight based on the attitude of the head-mounted display 100, and rendering the virtual 3D space. The first image generation section 316 associates the attitude of the head-mounted display 100 with the direction of line of sight in the game field at a given time and thereafter changes the direction of line of sight as the attitude of the head-mounted display 100 changes. As a result, the user can look around the game field by actually moving his or her head, allowing the user to feel as if he or she were really in the game field. The first image generation section 316 generates a first image by adding information related to the game, an image to be displayed on the head-mounted display 100, and so on to the generated game field image. The first image generated by the first image generation section 316 is sent to the head-mounted display 100 via a wireless communication module or a wired communication module.
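The association of HMD attitude with the in-game direction of line of sight can be sketched with yaw and pitch angles: the yaw recorded at association time anchors the mapping, and later attitudes are interpreted relative to it. The angle convention (radians, y up, zero yaw/pitch looking along +z) is an assumption for illustration:

```python
import math

def gaze_direction(base_yaw, yaw, pitch):
    """Direction of line of sight in the game field from HMD attitude.
    base_yaw is the yaw recorded when attitude and in-game direction
    were associated; shaking the head left/right changes yaw, nodding
    changes pitch, and the returned unit vector turns accordingly."""
    rel_yaw = yaw - base_yaw
    return (math.cos(pitch) * math.sin(rel_yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(rel_yaw))
```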
The secondimage generation section317 generates an image to be displayed on thedisplay apparatus12. When the same image as displayed on the head-mounteddisplay100 is displayed on thedisplay apparatus12, the first image generated by the firstimage generation section316 is also sent to thedisplay apparatus12. When an image different from the image displayed on the head-mounteddisplay100 is displayed on thedisplay apparatus12, an example of which is when the user wearing the head-mounteddisplay100 and the user watching thedisplay apparatus12 execute a head-to-head game, the secondimage generation section317 generates a game field image by specifying a viewpoint position and a direction of line of sight different from those specified by the firstimage generation section316. The secondimage generation section317 generates a second image by adding information related to the game, an image to be displayed on thedisplay apparatus12, and so on to the generated game field image. The second image generated by the secondimage generation section317 is sent to thedisplay apparatus12 via a wireless communication module or a wired communication module.
FIG. 8 illustrates examples of images displayed on the head-mounted display. The game control section 311 provides a function to switch the viewpoint position between a plurality of positions specified in the game field. In the display screen depicted in FIG. 8(a), a game field image is displayed that was generated with one of a plurality of positions specified in the game field as the viewpoint position. The display screen further shows markers 500 and 502 that indicate positions specified as viewpoint positions in the game field. When the user changes the attitude of the head-mounted display 100 by shaking his or her head horizontally or vertically, the first image generation section 316 changes the direction of line of sight in accordance with the attitude of the head-mounted display 100. When the user's direction of line of sight can be detected, for example by providing a camera inside the head-mounted display 100 that shoots the user's eyeballs, the direction of line of sight may be changed by further taking the user's line of sight into account. The user's line of sight may be detected by using any known line-of-sight tracking technology.
When a marker enters a given range specified near the center of the display screen as the user points his or her face or line of sight toward the marker, as depicted in FIG. 8(b), the game control section 311 causes the first image generation section 316 to change the manner in which the marker 500 is displayed, thereby indicating that the position depicted by the marker 500 has been selected as a candidate for specifying a viewpoint position. When the user issues an instruction to change the viewpoint position, for example by pressing a given button or performing a given gesture while the candidate is selected, the game control section 311 instructs the first image generation section 316 to change the viewpoint position to the position depicted by the selected marker 500. The first image generation section 316 generates and displays a game field image having the position of the marker 500 as the viewpoint position, as depicted in FIG. 8(c). In the display screen depicted in FIG. 8(c), a marker 504 appears that indicates the position specified as the viewpoint position in the display screen depicted in FIG. 8(a).
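The candidate selection described above amounts to testing whether a marker's projection falls inside a central region of the screen. A minimal sketch, assuming normalized 0-1 screen coordinates; the function name and selection radius are illustrative and not part of the embodiment:

```python
import math

def marker_selected(marker_screen_xy, screen_center=(0.5, 0.5), radius=0.1):
    """Return True when the marker's projected screen position lies within a
    circular range around the screen center (normalized 0-1 coordinates)."""
    dx = marker_screen_xy[0] - screen_center[0]
    dy = marker_screen_xy[1] - screen_center[1]
    return math.hypot(dx, dy) <= radius
```

A marker that passes this test would be redrawn in the "candidate" style until it leaves the range again.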
FIG. 9 is a schematic diagram for describing a method of specifying a viewpoint position and a direction of line of sight. In the present embodiment, a viewpoint position is specified on the surface of a sphere having its center near the center of the game field, and a default direction of line of sight is specified in the direction of seeing a first position near the center of the game field from the viewpoint position. As a result, no matter where the viewpoint position is specified, an image that overlooks the game field can be displayed. When changing the viewpoint position, the first image generation section 316 smoothly moves the viewpoint position along the sphere surface and generates a game field image by specifying the direction of line of sight in the direction of seeing the first position in the game field from the viewpoint position even while the viewpoint position is moving. As a result, an image that overlooks the game field can be displayed even while the viewpoint position is changing, which indicates to the user, in an easy-to-understand manner, where the viewpoint position will move even when it is moved to a large extent. It should be noted that a viewpoint position may be provided on the surface of a sphere or a spheroid having its center at an arbitrary point in the game field, or on some other curved surface. Also, when the viewpoint position is changed, it may be changed continuously in a linear or curved manner from the viewpoint position before the change to the viewpoint position after the change. In the example depicted in FIG. 8, the surface of the lower hemisphere is underground; therefore, a viewpoint position can be specified only on the surface of the upper hemisphere. However, when the game field is, for example, outer space, viewpoint positions may also be specified on the surface of the lower hemisphere.
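The smooth movement along the sphere surface, with the line of sight held on the first position throughout, can be sketched as spherical interpolation between viewpoints plus a look-at computation. This is a minimal illustration assuming a sphere centered at the origin and equal radii for both viewpoints; the names and formulas are an assumed implementation, not prescribed by the embodiment:

```python
import math

def slerp(p0, p1, t):
    """Spherical linear interpolation between two viewpoint positions on a
    sphere centered at the origin (radii assumed equal); t runs from 0 to 1."""
    r = math.sqrt(sum(c * c for c in p0))
    u0 = [c / r for c in p0]
    u1 = [c / r for c in p1]
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u0, u1))))
    theta = math.acos(dot)
    if theta < 1e-6:  # positions effectively coincide
        return list(p1)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [r * (s0 * a + s1 * b) for a, b in zip(u0, u1)]

def look_at_direction(viewpoint, target):
    """Unit vector from the viewpoint toward the first position
    (e.g. a point near the center of the game field)."""
    v = [t - p for t, p in zip(target, viewpoint)]
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]
```

During a viewpoint change, each frame would evaluate `slerp` at the current interpolation parameter and then aim the camera with `look_at_direction`, so the field stays in view throughout the move.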
FIG. 10 illustrates examples of images displayed on the head-mounted display. If the user moves the head-mounted display 100 forward by moving his or her head forward while a game field image as seen from a viewpoint position is displayed as depicted in FIG. 10(a), the game control section 311 causes the first image generation section 316 to move the viewpoint position to a second position near the center of the game field. The game control section 311 may move the viewpoint position when the head-mounted display 100 is moved forward by a given amount of travel or more. Alternatively, the game control section 311 may move the viewpoint position when the head-mounted display 100 travels at a speed equal to or higher than a given value. As a result, the viewpoint position can be moved from the spectators' seats to near the center of the game field as depicted in FIG. 10(b). Therefore, a user who was watching, for example, a soccer game from a spectator's viewpoint can feel as if he or she had entered the field where the game is taking place. This also provides an easy-to-understand method of moving the viewpoint using the head-mounted display 100.
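The two alternative triggers just described (a minimum amount of forward travel, or a minimum speed) reduce to a simple check. The threshold values below are illustrative placeholders, not values from the embodiment:

```python
def should_move_viewpoint(forward_travel, speed, min_travel=0.2, min_speed=0.5):
    """Trigger the move to the second position when the head-mounted display
    has moved forward by at least min_travel (e.g. meters), or is moving at
    min_speed (e.g. meters per second) or faster."""
    return forward_travel >= min_travel or speed >= min_speed
```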
FIG. 11 is a schematic diagram for describing details of another game provided by the game control section. In the game depicted in the present figure, the user hides inside a box 510 having holes 512, pops his or her face up from a hole 512 while being careful not to be hit by a hammer 514, and reads letters written on a plate 516. The game control section 311 changes the viewpoint position based on the position of the head-mounted display 100. The game control section 311 determines the hole 512 to be hit with the hammer 514 at a given timing and swings the hammer 514 down into the determined hole 512. If the hammer 514 is swung down into a hole 512 while the user's viewpoint position is above and outside that hole, the user is hit with the hammer 514. If the user is hit with the hammer 514 a given number of times or more, the game is over.
FIG. 12 depicts diagrams illustrating examples of images displayed on the head-mounted display. FIG. 12(a) depicts the game screen when the user looks up from the middle hole; the hammer is about to be swung down into the middle hole. FIG. 12(b) depicts the game screen after the user has moved the viewpoint position to under the right hole by moving the head-mounted display 100 to the right. Because the hammer is about to be swung down into the middle hole, it will not be swung down into the right hole for a while. At this time, if the user moves the viewpoint position up from the right hole by moving the head-mounted display 100 up, the user can visually recognize the letters written on a plate provided outside the box, as depicted in FIG. 12(c).
FIG. 13 is a schematic diagram for describing a method of moving the viewpoint position in a game according to the embodiment. When the user plays a game seated, for example, in a chair, the hip position is fixed, so the user moves his or her head in a circular arc. However, the range of head motion that causes no hindrance to the game is approximately 100 to 120 degrees at most. In order to make effective use of the possible motion range of the head-mounted display 100, the game control section 311 therefore moves the viewpoint position to a greater extent when the head-mounted display 100 is moved horizontally than when it is moved vertically. Also, if the amount of travel exceeds the amount equivalent to the width of the middle hole when the head-mounted display 100 is moved to the left or right, the area between the middle hole and the left or right hole is skipped, and the viewpoint position moves to under the left or right hole. Specifically, if the head-mounted display 100 is moved to the right while the viewpoint position is located under the middle hole, then when the head-mounted display 100 reaches a position 520, the viewpoint position jumps from the right edge of the middle hole to the left edge of the right hole. Likewise, if the head-mounted display 100 is moved to the left while the viewpoint position is located under the right hole, then when the head-mounted display 100 reaches a position 522, the viewpoint position jumps from the left edge of the right hole to the right edge of the middle hole.
As a result, the viewpoint position is not moved to areas to which there is little or no need to move it in the game, whereas it can be moved to the given areas where it is needed or to which it is often moved, thereby making effective use of the possible motion range of the head-mounted display 100. It is also possible to provide a user interface that permits easy movement of the viewpoint position by moving the head-mounted display 100, even without using, for example, the input apparatus 20, thereby improving user convenience. Hysteresis is provided by using different positions, the position 520 for causing the viewpoint position to jump during rightward movement and the position 522 for causing it to jump during leftward movement, thereby reducing the likelihood that the viewpoint position will jump to the left and right frequently when the head-mounted display 100 is near a threshold position.
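The hysteresis between position 520 (rightward jump) and position 522 (leftward jump) can be sketched as a two-threshold state update. Hole indices, the coordinate convention, and the threshold values below are illustrative assumptions:

```python
def update_viewpoint_hole(current_hole, hmd_x, jump_right_x=0.25, jump_left_x=0.20):
    """Snap the viewpoint between the middle hole (0) and the right hole (1).
    Rightward jumps use the threshold corresponding to position 520 and
    leftward jumps the threshold corresponding to position 522; because the
    two thresholds differ, small head oscillations near the boundary do not
    flip the viewpoint back and forth."""
    if current_hole == 0 and hmd_x >= jump_right_x:
        return 1  # jump: right edge of middle hole -> left edge of right hole
    if current_hole == 1 and hmd_x <= jump_left_x:
        return 0  # jump: left edge of right hole -> right edge of middle hole
    return current_hole  # inside the hysteresis band: no change
```

Note that for `hmd_x` between the two thresholds the function returns whichever hole is current, which is exactly the hysteresis behavior the paragraph describes.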
When an attempt is made to move the viewpoint position to above the left or right hole, it is necessary to move the head-mounted display 100 upward while keeping it tilted to the left or right. However, it is not easy for the user to move his or her head straight upward while keeping the body tilted to the left or right. In the present embodiment, therefore, when the head-mounted display 100 is moved up or down while tilted to the left or right, the game control section 311 moves the viewpoint position vertically but not horizontally, even if the direction of movement is tilted diagonally. The game control section 311 may move the viewpoint position vertically by the amount of travel equivalent to the vertical component of the movement vector of the head-mounted display 100, ignoring the horizontal component, or may move it vertically by the amount of travel equivalent to the magnitude of the movement vector. Thus, when the viewpoint position is changed in response to movement of the head-mounted display 100, converting the movement vector of the head-mounted display 100 into a vector in a given direction restricts the movement of the viewpoint position to that direction and prevents it from moving in unnecessary directions. It is also possible to provide a user interface that permits movement of the viewpoint position only in a necessary direction, thereby improving user convenience.
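The conversion of the movement vector into a purely vertical one, in both variants described (keep only the vertical component, or apply the full vector magnitude vertically), might look like the following sketch; the function name and 2D convention are assumptions:

```python
import math

def constrain_to_vertical(move_vec, use_magnitude=False):
    """Convert an HMD movement vector (x: horizontal, y: vertical) into a
    purely vertical viewpoint movement. By default the horizontal component
    is simply ignored; with use_magnitude=True the full vector length is
    applied vertically, signed by the vertical direction of motion."""
    dx, dy = move_vec
    if use_magnitude:
        return (0.0, math.copysign(math.hypot(dx, dy), dy))
    return (0.0, dy)
```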
Such a technology is applicable, for example, to a game in which a player's character hides behind an obstacle such as a wall to ward off oncoming bullets.
The present invention has been described above based on an embodiment. The present embodiment is illustrative, and it is to be understood by those skilled in the art that the combination of components and processes thereof can be modified in various ways and that these modification examples also fall within the scope of the present invention.
Although an image for binocular stereopsis was displayed on the display apparatus 190 of the head-mounted display 100 in the above example, a monocular image may be displayed in a different example.
Although the head-mounted display 100 was used in a game system in the above example, the technology described in the embodiment can also be used to display content other than games.
REFERENCE SIGNS LIST
10 Gaming apparatus, 20 Input apparatus, 100 Head-mounted display, 190 Display apparatus, 311 Game control section, 312 Instruction input acquisition section, 314 HMD information acquisition section, 315 Input apparatus information acquisition section, 316 First image generation section, 317 Second image generation section.
INDUSTRIAL APPLICABILITY
The present invention is applicable to a display control apparatus for controlling display on a head-mounted display.

Claims (13)

The invention claimed is:
1. A display control apparatus, comprising: a display control section adapted to generate virtual space images by (i) selecting a viewpoint position from among a plurality of viewpoint positions within a virtual space, where each viewpoint position among the plurality of viewpoint positions defines a respective position within the virtual space from which the user's point of view is determined, and (ii) selecting a direction of line of sight for the selected viewpoint position, where such selection is from among a respective plurality of directions of line of sight for each viewpoint position, wherein: the selection of the viewpoint position and the direction of line of sight for such viewpoint position establish both a currently selected viewpoint position within the virtual space from which the user's point of view is directed and a current direction of line of sight from such currently selected viewpoint position from which the user views the virtual space, the display control section is adapted to display the virtual space images on a head-mounted display, the display control section is adapted to change from the currently selected viewpoint position among the plurality of viewpoint positions to a newly selected viewpoint position among the plurality of viewpoint positions, which is selected in accordance with operation by a user, when generating the virtual space images in response to changing to the newly selected viewpoint position, the display control section specifies, as a new direction of line of sight among the plurality of directions of line of sight, a direction in which a first position in the virtual space is seen from the newly selected viewpoint position, and at least some of the plurality of viewpoint positions are separated from each other.
2. The display control apparatus of claim 1, wherein at least some of the plurality of viewpoint positions are indicated by a respective marker.
3. The display control apparatus of claim 1, wherein, when generating the virtual space images in response to changing to the newly selected viewpoint position, the display control section specifies the new direction of line of sight in a direction toward the first position in the virtual space even while moving the viewpoint position.
4. The display control apparatus of claim 1, wherein, when generating the virtual space images in response to changing to the newly selected viewpoint position, the display control section continuously changes among a set of the plurality of viewpoint positions from the currently selected viewpoint position to the newly selected viewpoint position in a linear or curved manner.
5. The display control apparatus of claim 4, wherein the display control section continuously generates the virtual space images in response to the continuous change among the set of the plurality of viewpoint positions to create an appearance of smooth motion to the user in moving from the currently selected viewpoint position to the newly selected viewpoint position.
6. The display control apparatus of claim 1, wherein, when the head-mounted display is moved by an amount equal to or more than a given amount, or is moved at a speed equal to or more than a given speed, the display control section moves the currently selected viewpoint position to a second viewpoint position among the plurality of viewpoint positions in the virtual space.
7. A display control method, comprising: generating virtual space images by (i) selecting a viewpoint position from among a plurality of viewpoint positions within a virtual space, where each viewpoint position among the plurality of viewpoint positions defines a respective position within the virtual space from which the user's point of view is determined, and (ii) selecting a direction of line of sight for the selected viewpoint position, where such selection is from among a respective plurality of directions of line of sight for each viewpoint position, wherein: the selection of the viewpoint position and the direction of line of sight for such viewpoint position establish both a currently selected viewpoint position within the virtual space from which the user's point of view is directed and a current direction of line of sight from such currently selected viewpoint position from which the user views the virtual space, the generating includes displaying the virtual space images on a head-mounted display, the generating includes changing from the currently selected viewpoint position among the plurality of viewpoint positions to a newly selected viewpoint position among the plurality of viewpoint positions, which is selected in accordance with operation by a user, when generating the virtual space images in response to changing to the newly selected viewpoint position, the generating includes specifying, as a new direction of line of sight among the plurality of directions of line of sight, a direction in which a first position in the virtual space is seen from the newly selected viewpoint position, and at least some of the plurality of viewpoint positions are separated from each other.
8. The display control method of claim 7, wherein at least some of the plurality of viewpoint positions are indicated by a respective marker.
9. The display control method of claim 7, wherein the specifying the new direction of line of sight is in a direction toward the first position in the virtual space even while moving the viewpoint position.
10. The display control method of claim 7, wherein, when generating the virtual space images in response to changing to the newly selected viewpoint position, the generating includes continuously changing among a set of the plurality of viewpoint positions from the currently selected viewpoint position to the newly selected viewpoint position in a linear or curved manner.
11. The display control method of claim 10, wherein the generating includes continuously generating the virtual space images in response to the continuous change among the set of the plurality of viewpoint positions to create an appearance of smooth motion to the user in moving from the currently selected viewpoint position to the newly selected viewpoint position.
12. The display control method of claim 7, wherein, when the head-mounted display is moved by an amount equal to or more than a given amount, or is moved at a speed equal to or more than a given speed, the generating includes moving the currently selected viewpoint position to a second viewpoint position among the plurality of viewpoint positions in the virtual space.
13. A non-transitory computer readable storage medium containing a computer program, which when executed by a computer, causes the computer to perform a display control method by carrying out actions, comprising: generating virtual space images by (i) selecting a viewpoint position from among a plurality of viewpoint positions within a virtual space, where each viewpoint position among the plurality of viewpoint positions defines a respective position within the virtual space from which the user's point of view is determined, and (ii) selecting a direction of line of sight for the selected viewpoint position, where such selection is from among a respective plurality of directions of line of sight for each viewpoint position, wherein: the selection of the viewpoint position and the direction of line of sight for such viewpoint position establish both a currently selected viewpoint position within the virtual space from which the user's point of view is directed and a current direction of line of sight from such currently selected viewpoint position from which the user views the virtual space, the generating includes displaying the virtual space images on a head-mounted display, the generating includes changing from the currently selected viewpoint position among the plurality of viewpoint positions to a newly selected viewpoint position among the plurality of viewpoint positions, which is selected in accordance with operation by a user, when generating the virtual space images in response to changing to the newly selected viewpoint position, the generating includes specifying, as a new direction of line of sight among the plurality of directions of line of sight, a direction in which a first position in the virtual space is seen from the newly selected viewpoint position, and at least some of the plurality of viewpoint positions are separated from each other.

Priority Applications (3)

- US17/225,203 (US11768383B2); priority date 2015-12-02; filing date 2021-04-08; Display control apparatus and display control method
- US18/452,819 (US12124044B2); priority date 2015-12-02; filing date 2023-08-21; Display control apparatus and display control method
- US18/887,090 (US20250004284A1); priority date 2015-12-02; filing date 2024-09-17; Display control apparatus and display control method

Applications Claiming Priority (5)

- JP2015235897A (JP6532393B2); priority date 2015-12-02; filing date 2015-12-02; Display control apparatus and display control method
- JP2015-235897; priority date 2015-12-02
- PCT/JP2016/084937 (WO2017094607A1); priority date 2015-12-02; filing date 2016-11-25; Display control device and display control method
- US201815773409A; priority date 2018-05-03; filing date 2018-05-03
- US17/225,203 (US11768383B2); priority date 2015-12-02; filing date 2021-04-08; Display control apparatus and display control method

Related Parent Applications (2)

- PCT/JP2016/084937 (WO2017094607A1); continuation; filing date 2016-11-25; Display control device and display control method
- US15/773,409 (US11042038B2); continuation; priority date 2015-12-02; filing date 2016-11-25; Display control apparatus and display control method

Related Child Applications (1)

- US18/452,819 (US12124044B2); continuation; priority date 2015-12-02; filing date 2023-08-21; Display control apparatus and display control method

Publications (2)

- US20210223558A1; publication date 2021-07-22
- US11768383B2; publication date 2023-09-26

Family ID: 58796719

Family Applications (4)

- US15/773,409 (US11042038B2); Active; priority date 2015-12-02; filing date 2016-11-25; Display control apparatus and display control method
- US17/225,203 (US11768383B2); Active; priority date 2015-12-02; filing date 2021-04-08; Display control apparatus and display control method
- US18/452,819 (US12124044B2); Active; priority date 2015-12-02; filing date 2023-08-21; Display control apparatus and display control method
- US18/887,090 (US20250004284A1); Pending; priority date 2015-12-02; filing date 2024-09-17; Display control apparatus and display control method

Country Status (5)

- US (4): US11042038B2 (en)
- EP (3): EP3474271B1 (en)
- JP (1): JP6532393B2 (en)
- CN (2): CN114296235B (en)
- WO (1): WO2017094607A1 (en)

US20150009101A1 (en)2013-07-032015-01-08Sony CorporationDisplay apparatus
JP2015011368A (en)2013-06-262015-01-19国立大学法人佐賀大学 Display control device
US20150022664A1 (en)*2012-01-202015-01-22Magna Electronics Inc.Vehicle vision system with positionable virtual viewpoint
US20150049018A1 (en)*2011-07-142015-02-19Google Inc.Virtual Window in Head-Mounted Display
US20150049003A1 (en)*2013-08-192015-02-19Seiko Epson CorporationHead mounted display and method for controlling head mounted display
US20150052458A1 (en)2013-08-162015-02-19Disney Enterprises, Inc.Cross platform sharing of user-generated content
CN104407700A (en)2014-11-272015-03-11曦煌科技(北京)有限公司Mobile head-wearing type virtual reality and augmented reality device
CN104603673A (en)2012-09-032015-05-06Smi创新传感技术有限公司Head mounted system and method to compute and render stream of digital images using head mounted system
JP2015095045A (en)2013-11-112015-05-18株式会社ソニー・コンピュータエンタテインメント Image generating apparatus and image generating method
US20150177829A1 (en)*2013-12-252015-06-25Sony CorporationImage processing device and image processing method, display device and display method, computer program, and image display system
US9075514B1 (en)*2012-12-132015-07-07Amazon Technologies, Inc.Interface selection element display
US9081177B2 (en)*2011-10-072015-07-14Google Inc.Wearable computer with nearby object response
US20150199850A1 (en)*2014-01-162015-07-16Canon Kabushiki KaishaInformation processing apparatus and information processing method
US20150293362A1 (en)*2012-11-132015-10-15Sony CorporationImage display apparatus, image display method, mobile apparatus, image display system, and computer program
JP2015182712A (en)2014-03-262015-10-22日本電気株式会社Image processing device, method and program
JP2015187797A (en)2014-03-272015-10-29シャープ株式会社Image data generation device and image data reproduction device
US20150352437A1 (en)*2014-06-092015-12-10Bandai Namco Games Inc.Display control method for head mounted display (hmd) and image generation device
JP5869177B1 (en)2015-09-162016-02-24株式会社コロプラ Virtual reality space video display method and program
US20160078681A1 (en)*2013-04-242016-03-17Kawasaki Jukogyo Kabushiki KaishaWorkpiece machining work support system and workpiece machining method
US20160078682A1 (en)*2013-04-242016-03-17Kawasaki Jukogyo Kabushiki KaishaComponent mounting work support system and component mounting method
US20160125654A1 (en)*2013-05-222016-05-05Kawasaki Jukogyo Kabushiki KaishaComponent assembly work support system and component assembly method
US20160330376A1 (en)*2015-05-062016-11-10Otoy, Inc.Apparatus and method for spherical light field capture
US20170076486A1 (en)*2015-09-142017-03-16Koei Tecmo Games Co., Ltd.Data processing apparatus and method of controlling display
US20170076497A1 (en)*2015-09-142017-03-16Colopl, Inc.Computer program for directing line of sight
US20170153713A1 (en)*2015-11-302017-06-01Fujitsu LimitedHead mounted display device and control method
US20170169540A1 (en)*2014-05-162017-06-15Unimoto IncorporatedAll-around moving image distribution system, all-around moving image distribution method, image processing apparatus, communication terminal apparatus, and control methods and control programs of image processing apparatus and communication terminal apparatus
US20170278262A1 (en)*2014-07-312017-09-28Sony CorporationInformation processing device, method of information processing, and image display system
US20180307310A1 (en)*2015-03-212018-10-25Mine One GmbhVirtual 3d methods, systems and software
US10116914B2 (en)*2013-10-312018-10-303Di LlcStereoscopic display
US20180321493A1 (en)*2015-11-112018-11-08Lg Electronics Inc.Hmd and method for controlling same
US20180349083A1 (en)*2015-09-302018-12-06Sony CorporationInformation processing system and information processing method
US10261581B2 (en)*2014-09-182019-04-16Fxgear Inc.Head-mounted display controlled by sightline, method for controlling same, and computer program for controlling same
US20190121515A1 (en)*2015-10-152019-04-25Sony CorporationInformation processing device and information processing method

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP2002263371A (en) | 2001-03-07 | 2002-09-17 | Enix Corp | Communication game system, recording medium and program
SE0401582L (en)* | 2004-06-18 | 2005-05-10 | Totalfoersvarets Forskningsins | Interactive procedure for presenting information in an image
US8160400B2 (en)* | 2005-11-17 | 2012-04-17 | Microsoft Corporation | Navigating images using image based geometric alignment and object based controls
US8576276B2 (en)* | 2010-11-18 | 2013-11-05 | Microsoft Corporation | Head-mounted display device which provides surround video
US20130293530A1 (en)* | 2012-05-04 | 2013-11-07 | Kathryn Stone Perez | Product augmentation and advertising in see through displays
JP5538483B2 (en)* | 2012-06-29 | 2014-07-02 | 株式会社ソニー・コンピュータエンタテインメント | Video processing apparatus, video processing method, and video processing system
KR101385681B1 (en)* | 2012-11-08 | 2014-04-15 | 삼성전자 주식회사 | Head-mount type display apparatus and control method thereof
US20160162248A1 (en)* | 2014-12-09 | 2016-06-09 | Nathan Bassett | Fish Finder Display System
JP6691351B2 (en)* | 2015-03-31 | 2020-04-28 | 株式会社バンダイナムコエンターテインメント | Program and game system
JP6684559B2 (en)* | 2015-09-16 | 2020-04-22 | 株式会社バンダイナムコエンターテインメント | Program and image generation device
JP6532393B2 (en)* | 2015-12-02 | 2019-06-19 | 株式会社ソニー・インタラクティブエンタテインメント | Display control apparatus and display control method

Patent Citations (123)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5742264A (en)* | 1995-01-24 | 1998-04-21 | Matsushita Electric Industrial Co., Ltd. | Head-mounted display
US6124843A (en)* | 1995-01-30 | 2000-09-26 | Olympus Optical Co., Ltd. | Head mounting type image display system
JPH08202281A (en) | 1995-01-30 | 1996-08-09 | Olympus Optical Co Ltd | Head mounted video display device system
US20020024521A1 (en)* | 1996-05-02 | 2002-02-28 | Kabushiki Kaisha Sega Enterprises | Game device, method of processing and recording medium for the same
US5993318A (en)* | 1996-11-07 | 1999-11-30 | Kabushiki Kaisha Sega Enterprises | Game device, image sound processing device and recording medium
US5907328A (en) | 1997-08-27 | 1999-05-25 | International Business Machines Corporation | Automatic and configurable viewpoint switching in a 3D scene
JP2000020753A (en) | 1998-07-07 | 2000-01-21 | Mitsubishi Electric Corp | Virtual space controller
JP2000250699A (en) | 1999-03-04 | 2000-09-14 | Shimadzu Corp | Eye-gaze input device
US6831656B2 (en)* | 2000-03-24 | 2004-12-14 | Konami Computer Entertainment Japan, Inc. | Game system and computer readable storage medium storing game program
JP2002170132A (en) | 2000-12-04 | 2002-06-14 | Mixed Reality Systems Laboratory Inc | Information presenting device and method and storage medium
US20040254771A1 (en) | 2001-06-25 | 2004-12-16 | Robert Riener | Programmable joint simulator with force and motion feedback
CN1529880A (en) | 2001-06-25 | 2004-09-15 | Robert Riener | Programmable joint simulator with force feedback and motion feedback
US8108190B2 (en) | 2001-06-25 | 2012-01-31 | Robert Riener | Programmable joint simulator with force and motion feedback
US20030017872A1 (en) | 2001-07-19 | 2003-01-23 | Konami Corporation | Video game apparatus, method and recording medium storing program for controlling viewpoint movement of simulated camera in video game
US7298384B2 (en) | 2002-01-22 | 2007-11-20 | Canon Kabushiki Kaisha | Mixed reality presenting apparatus and image processing method
JP2004054590A (en) | 2002-07-19 | 2004-02-19 | Canon Inc | Virtual space drawing display device and virtual space drawing display method
US7492362B2 (en) | 2002-07-19 | 2009-02-17 | Canon Kabushiki Kaisha | Virtual space rendering/display apparatus and virtual space rendering/display method
US20070109296A1 (en) | 2002-07-19 | 2007-05-17 | Canon Kabushiki Kaisha | Virtual space rendering/display apparatus and virtual space rendering/display method
CN1409218A (en) | 2002-09-18 | 2003-04-09 | 北京航空航天大学 | Virtual environment forming method
CN1708982A (en) | 2002-10-25 | 2005-12-14 | 索尼计算机娱乐公司 | Method and device for generating new image using image data changed along time axis
CN101123692A (en) | 2002-10-25 | 2008-02-13 | 索尼计算机娱乐公司 | Method and device for generating new image using image data changed along time axis
JP2004283521A (en) | 2003-03-20 | 2004-10-14 | Square Enix Co Ltd | Video game system, recording medium and program
US20050024388A1 (en)* | 2003-07-30 | 2005-02-03 | Canon Kabushiki Kaisha | Image displaying method and apparatus
JP2005174021A (en) | 2003-12-11 | 2005-06-30 | Canon Inc | Information presentation method and apparatus
US7646394B1 (en)* | 2004-03-05 | 2010-01-12 | Hrl Laboratories, Llc | System and method for operating in a virtual environment
US20050270309A1 (en)* | 2004-05-07 | 2005-12-08 | Namco Ltd. | Program product, image generation method and image generation system
US7479967B2 (en) | 2005-04-11 | 2009-01-20 | Systems Technology Inc. | System for combining virtual and real-time environments
JP2006061716A (en) | 2005-10-31 | 2006-03-09 | Namco Ltd | Game device and information storage medium
US20070252833A1 (en)* | 2006-04-27 | 2007-11-01 | Canon Kabushiki Kaisha | Information processing method and information processing apparatus
US20070258658A1 (en) | 2006-05-02 | 2007-11-08 | Toshihiro Kobayashi | Information processing apparatus and control method thereof, image processing apparatus, computer program, and storage medium
JP2007299326A (en) | 2006-05-02 | 2007-11-15 | Canon Inc | Information processing apparatus and control method therefor, image processing apparatus, program, and storage medium
JP2007312930A (en) | 2006-05-24 | 2007-12-06 | Sony Computer Entertainment Inc | Game apparatus and game control method
US20080132334A1 (en)* | 2006-11-17 | 2008-06-05 | Nintendo Co., Ltd. | Game system and storage medium storing game program
US20080297437A1 (en)* | 2007-05-31 | 2008-12-04 | Canon Kabushiki Kaisha | Head mounted display and control method therefor
US20090069096A1 (en) | 2007-09-12 | 2009-03-12 | Namco Bandai Games Inc. | Program, information storage medium, game system, and input instruction device
CN101414383A (en) | 2007-10-19 | 2009-04-22 | 佳能株式会社 | Image processing apparatus and image processing method
US20110014977A1 (en)* | 2008-03-26 | 2011-01-20 | Yukihiro Yamazaki | Game device, game processing method, information recording medium, and program
CN101970067A (en) | 2008-03-26 | 2011-02-09 | 科乐美数码娱乐株式会社 | Game device, game processing method, information recording medium, and program
US20100289878A1 (en)* | 2008-06-02 | 2010-11-18 | Satoshi Sato | Image processing apparatus, method and computer program for generating normal information, and viewpoint-converted image generating apparatus
US20100026714A1 (en)* | 2008-07-31 | 2010-02-04 | Canon Kabushiki Kaisha | Mixed reality presentation system
US20100066734A1 (en)* | 2008-09-16 | 2010-03-18 | Nintendo Co., Ltd. | Storage Medium Storing Three-Dimensional Image Processing Program, Three-Dimensional Image Processing Apparatus and Three-Dimensional Image Processing Method
US20100087258A1 (en) | 2008-10-08 | 2010-04-08 | Namco Bandai Games Inc. | Information storage medium, game system, and method of controlling game system
CN101783967A (en) | 2009-01-21 | 2010-07-21 | 索尼公司 | Signal processing device, image display device, signal processing method, and computer program
US20100182409A1 (en) | 2009-01-21 | 2010-07-22 | Sony Corporation | Signal processing device, image display device, signal processing method, and computer program
US8564645B2 (en) | 2009-01-21 | 2013-10-22 | Sony Corporation | Signal processing device, image display device, signal processing method, and computer program
US8879787B2 (en)* | 2009-02-19 | 2014-11-04 | Sony Corporation | Information processing device and information processing method
US20120039507A1 (en)* | 2009-02-19 | 2012-02-16 | Sony Computer Entertainment Inc. | Information Processing Device And Information Processing Method
US20110034300A1 (en)* | 2009-08-05 | 2011-02-10 | David Hall | Sensor, Control and Virtual Reality System for a Trampoline
CN102317892A (en) | 2009-11-10 | 2012-01-11 | 索尼计算机娱乐公司 | Method of controlling information input device, information input device, program and information storage medium
US20120212429A1 (en)* | 2009-11-10 | 2012-08-23 | Sony Computer Entertainment Inc. | Control method for information input device, information input device, program therefor, and information storage medium therefor
US9250799B2 (en) | 2009-11-10 | 2016-02-02 | Sony Corporation | Control method for information input device, information input device, program therefor, and information storage medium therefor
US20110140994A1 (en)* | 2009-12-15 | 2011-06-16 | Noma Tatsuyoshi | Information Presenting Apparatus, Method, and Computer Program Product
US20110245942A1 (en)* | 2010-03-31 | 2011-10-06 | Namco Bandai Games Inc. | Information storage medium, terminal, image generation method, and network system
JP2012008745A (en) | 2010-06-23 | 2012-01-12 | Softbank Mobile Corp | User interface device and electronic apparatus
JP2012048597A (en) | 2010-08-30 | 2012-03-08 | Univ Of Tokyo | Mixed reality display system, image providing server, display device and display program
US20130194305A1 (en) | 2010-08-30 | 2013-08-01 | Asukalab Inc. | Mixed reality display system, image providing server, display device and display program
CN102566052A (en) | 2010-12-29 | 2012-07-11 | 索尼公司 | Head-mounted display
US9268138B2 (en) | 2010-12-29 | 2016-02-23 | Sony Corporation | Head-mounted display
US20120169725A1 (en) | 2010-12-29 | 2012-07-05 | Sony Corporation | Head-mounted display
US20130009957A1 (en) | 2011-07-08 | 2013-01-10 | Toshiba Medical Systems Corporation | Image processing system, image processing device, image processing method, and medical image diagnostic device
CN102860837A (en) | 2011-07-08 | 2013-01-09 | 株式会社东芝 | Image processing system, image processing device, image processing method, and medical image diagnostic device
US20150049018A1 (en)* | 2011-07-14 | 2015-02-19 | Google Inc. | Virtual Window in Head-Mounted Display
US9081177B2 (en)* | 2011-10-07 | 2015-07-14 | Google Inc. | Wearable computer with nearby object response
US20140327613A1 (en)* | 2011-12-14 | 2014-11-06 | Universita' Degli Studi di Genova | Improved three-dimensional stereoscopic rendering of virtual objects for a moving observer
US20150022664A1 (en)* | 2012-01-20 | 2015-01-22 | Magna Electronics Inc. | Vehicle vision system with positionable virtual viewpoint
US20130258461A1 (en)* | 2012-03-29 | 2013-10-03 | Fujitsu Limited | 3d display device and method
US9380287B2 (en) | 2012-09-03 | 2016-06-28 | Sensomotoric Instruments Gesellschaft Fur Innovative Sensorik Mbh | Head mounted system and method to compute and render a stream of digital images using a head mounted display
US20150288944A1 (en) | 2012-09-03 | 2015-10-08 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Head mounted system and method to compute and render a stream of digital images using a head mounted display
CN104603673A (en) | 2012-09-03 | 2015-05-06 | Smi创新传感技术有限公司 | Head mounted system and method to compute and render stream of digital images using head mounted system
US20150293362A1 (en)* | 2012-11-13 | 2015-10-15 | Sony Corporation | Image display apparatus, image display method, mobile apparatus, image display system, and computer program
US20140160129A1 (en) | 2012-12-10 | 2014-06-12 | Sony Corporation | Information processing apparatus and recording medium
US9075514B1 (en)* | 2012-12-13 | 2015-07-07 | Amazon Technologies, Inc. | Interface selection element display
US9256284B2 (en) | 2012-12-27 | 2016-02-09 | Sony Corporation | Information processing apparatus and recording medium
JP2014127987A (en) | 2012-12-27 | 2014-07-07 | Sony Corp | Information processing apparatus and recording medium
US20140186002A1 (en)* | 2012-12-27 | 2014-07-03 | Sony Corporation | Information processing apparatus and recording medium
US9448625B2 (en) | 2013-01-15 | 2016-09-20 | Seiko Epson Corporation | Head-mounted display device, control method for head-mounted display device, and image display system
US20140198033A1 (en)* | 2013-01-15 | 2014-07-17 | Seiko Epson Corporation | Head-mounted display device, control method for head-mounted display device, and image display system
JP2014137396A (en) | 2013-01-15 | 2014-07-28 | Seiko Epson Corp | Head-mounted type display device, head-mounted type display device control method and image display system
US20140225920A1 (en)* | 2013-02-13 | 2014-08-14 | Seiko Epson Corporation | Image display device and display control method for image display device
US20140241586A1 (en)* | 2013-02-27 | 2014-08-28 | Nintendo Co., Ltd. | Information retaining medium and information processing system
US20160078682A1 (en)* | 2013-04-24 | 2016-03-17 | Kawasaki Jukogyo Kabushiki Kaisha | Component mounting work support system and component mounting method
US20160078681A1 (en)* | 2013-04-24 | 2016-03-17 | Kawasaki Jukogyo Kabushiki Kaisha | Workpiece machining work support system and workpiece machining method
US20160125654A1 (en)* | 2013-05-22 | 2016-05-05 | Kawasaki Jukogyo Kabushiki Kaisha | Component assembly work support system and component assembly method
US20140354515A1 (en)* | 2013-05-30 | 2014-12-04 | Oculus Vr, Llc | Perception based predictive tracking for head mounted displays
US20140361956A1 (en)* | 2013-06-09 | 2014-12-11 | Sony Computer Entertainment Inc. | Head Mounted Display
US20140362446A1 (en)* | 2013-06-11 | 2014-12-11 | Sony Computer Entertainment Europe Limited | Electronic correction based on eye tracking
US9684169B2 (en) | 2013-06-24 | 2017-06-20 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method for viewpoint determination
US20140375687A1 (en)* | 2013-06-24 | 2014-12-25 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method
CN104238665A (en) | 2013-06-24 | 2014-12-24 | 佳能株式会社 | Image processing apparatus and image processing method
US20140375560A1 (en)* | 2013-06-25 | 2014-12-25 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, and method for calculating specified position
JP2015011368A (en) | 2013-06-26 | 2015-01-19 | 国立大学法人佐賀大学 | Display control device
US9846303B2 (en) | 2013-07-03 | 2017-12-19 | Sony Corporation | Display system having display device and sensor on substrate
CN104280882A (en) | 2013-07-03 | 2015-01-14 | 索尼公司 | Display system and display method
US20150009101A1 (en) | 2013-07-03 | 2015-01-08 | Sony Corporation | Display apparatus
US9658737B2 (en) | 2013-08-16 | 2017-05-23 | Disney Enterprises, Inc. | Cross platform sharing of user-generated content
CN104376194A (en) | 2013-08-16 | 2015-02-25 | 迪士尼企业公司 | Cross platform sharing of user-generated content
US20150052458A1 (en) | 2013-08-16 | 2015-02-19 | Disney Enterprises, Inc. | Cross platform sharing of user-generated content
CN104423041A (en) | 2013-08-19 | 2015-03-18 | 精工爱普生株式会社 | Head mounted display and method for controlling head mounted display
US20150049003A1 (en)* | 2013-08-19 | 2015-02-19 | Seiko Epson Corporation | Head mounted display and method for controlling head mounted display
US9740009B2 (en) | 2013-08-19 | 2017-08-22 | Seiko Epson Corporation | Head mounted display and method for controlling head mounted display
US10116914B2 (en)* | 2013-10-31 | 2018-10-30 | 3Di Llc | Stereoscopic display
JP2015095045A (en) | 2013-11-11 | 2015-05-18 | 株式会社ソニー・コンピュータエンタテインメント | Image generating apparatus and image generating method
US20160282619A1 (en) | 2013-11-11 | 2016-09-29 | Sony Interactive Entertainment Inc. | Image generation apparatus and image generation method
US20150177829A1 (en)* | 2013-12-25 | 2015-06-25 | Sony Corporation | Image processing device and image processing method, display device and display method, computer program, and image display system
US20150199850A1 (en)* | 2014-01-16 | 2015-07-16 | Canon Kabushiki Kaisha | Information processing apparatus and information processing method
JP2015182712A (en) | 2014-03-26 | 2015-10-22 | 日本電気株式会社 | Image processing device, method and program
JP2015187797A (en) | 2014-03-27 | 2015-10-29 | シャープ株式会社 | Image data generation device and image data reproduction device
US20170169540A1 (en)* | 2014-05-16 | 2017-06-15 | Unimoto Incorporated | All-around moving image distribution system, all-around moving image distribution method, image processing apparatus, communication terminal apparatus, and control methods and control programs of image processing apparatus and communication terminal apparatus
US20150352437A1 (en)* | 2014-06-09 | 2015-12-10 | Bandai Namco Games Inc. | Display control method for head mounted display (hmd) and image generation device
US20170278262A1 (en)* | 2014-07-31 | 2017-09-28 | Sony Corporation | Information processing device, method of information processing, and image display system
US10261581B2 (en)* | 2014-09-18 | 2019-04-16 | Fxgear Inc. | Head-mounted display controlled by sightline, method for controlling same, and computer program for controlling same
CN104407700A (en) | 2014-11-27 | 2015-03-11 | 曦煌科技(北京)有限公司 | Mobile head-wearing type virtual reality and augmented reality device
US20180307310A1 (en)* | 2015-03-21 | 2018-10-25 | Mine One GmbH | Virtual 3d methods, systems and software
US20160330376A1 (en)* | 2015-05-06 | 2016-11-10 | Otoy, Inc. | Apparatus and method for spherical light field capture
US20170076486A1 (en)* | 2015-09-14 | 2017-03-16 | Koei Tecmo Games Co., Ltd. | Data processing apparatus and method of controlling display
US9928650B2 (en)* | 2015-09-14 | 2018-03-27 | Colopl, Inc. | Computer program for directing line of sight
US10029176B2 (en)* | 2015-09-14 | 2018-07-24 | Koei Tecmo Games Co., Ltd. | Data processing apparatus and method of controlling display
US20170076497A1 (en)* | 2015-09-14 | 2017-03-16 | Colopl, Inc. | Computer program for directing line of sight
JP5869177B1 (en) | 2015-09-16 | 2016-02-24 | 株式会社コロプラ | Virtual reality space video display method and program
US20180349083A1 (en)* | 2015-09-30 | 2018-12-06 | Sony Corporation | Information processing system and information processing method
US20190121515A1 (en)* | 2015-10-15 | 2019-04-25 | Sony Corporation | Information processing device and information processing method
US20180321493A1 (en)* | 2015-11-11 | 2018-11-08 | Lg Electronics Inc. | Hmd and method for controlling same
US20170153713A1 (en)* | 2015-11-30 | 2017-06-01 | Fujitsu Limited | Head mounted display device and control method

Non-Patent Citations (14)

* Cited by examiner, † Cited by third party
Title
Caroline Jay et al: "Amplifying Head Movements with Head-Mounted Displays", PRESENCE, vol. 12, No. 3, pp. 268-276, Jun. 1, 2003.
Examination Report for corresponding EP Application No. 16870544.0, 5 pages, dated Feb. 25, 2020.
Examination Report for corresponding EP Application No. 18206819.7, 7 pages, dated Feb. 25, 2020.
Extended European Search Report for corresponding EP Application No. 18206819, 8 pages, dated Mar. 14, 2019.
Extended European Search Report for corresponding EP Application No. 21208486.7, 8 pages, dated Feb. 17, 2022.
International Preliminary Report on Patentability and Written Opinion for corresponding PCT Application No. PCT/JP2016/084937, 10 pages, dated Jul. 6, 2018.
International Search Report for corresponding PCT Application No. PCT/JP2016/084937, 3 pages, dated Feb. 28, 2017.
Notice of Reasons for Refusal for corresponding JP Application No. 2020055471, 6 pages, dated Feb. 4, 2021.
Notification of Decision to Grant corresponding CN Application No. 201680069099.8, 8 pages, dated Sep. 30, 2021.
Notification of Reasons for Refusal for corresponding JP Application No. 2021-088448, 4 pages, dated Apr. 6, 2022.
Partial Supplementary European Search Report for corresponding EP Application No. 16870544.0, 15 pages, dated Oct. 16, 2018.
Reconsideration Report before Appeal for corresponding JP Application No. 2021-088448, 4 pages, dated Dec. 16, 2022.
The First Office Action for corresponding CN Application No. 201680069099.8, 22 pages, dated Jul. 3, 2020.
The Second Office Action for corresponding CN Application No. 201680069099.8, 17 pages, dated Mar. 25, 2021.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US12166958B2 (en)* | 2021-09-27 | 2024-12-10 | Canon Kabushiki Kaisha | Head mounted display and cross reality system

Also Published As

Publication number | Publication date
CN114296235B (en) | 2024-08-30
EP3385942B1 (en) | 2022-01-19
EP3982355B1 (en) | 2025-06-04
CN108292490B (en) | 2021-11-16
US12124044B2 (en) | 2024-10-22
EP3474271B1 (en) | 2021-09-22
US11042038B2 (en) | 2021-06-22
WO2017094607A1 (en) | 2017-06-08
JP6532393B2 (en) | 2019-06-19
EP3385942A1 (en) | 2018-10-10
EP3474271A1 (en) | 2019-04-24
EP3385942A4 (en) | 2019-02-27
US20230408833A1 (en) | 2023-12-21
CN114296235A (en) | 2022-04-08
JP2017102297A (en) | 2017-06-08
CN108292490A (en) | 2018-07-17
US20210223558A1 (en) | 2021-07-22
US20180329215A1 (en) | 2018-11-15
US20250004284A1 (en) | 2025-01-02
EP3982355A1 (en) | 2022-04-13

Similar Documents

Publication | Title
US12124044B2 (en) | Display control apparatus and display control method
US10427033B2 (en) | Display control apparatus and display control method
US20180373349A1 (en) | Display control apparatus and display control method
US11030771B2 (en) | Information processing apparatus and image generating method
US10627628B2 (en) | Information processing apparatus and image generating method
JP6368404B1 (en) | Information processing method, program, and computer
JP2019155115A (en) | Program, information processor and information processing method
JP2019133309A (en) | Program, information processor and information processing method
JP6705929B2 (en) | Display control device and display control method
JP7462591B2 (en) | Display control device and display control method
JP6683862B2 (en) | Display control device and display control method
JP6891319B2 (en) | Display control device and display control method
TWI746463B (en) | Virtual reality apparatus
JP2019032715A (en) | Information processing method, device, and program for causing computer to execute the method
JP6718930B2 (en) | Program, information processing apparatus, and method

Legal Events

Date | Code | Title | Description
FEPP | Fee payment procedure

Free format text:ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP | Information on status: patent application and granting procedure in general

Free format text:APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP | Information on status: patent application and granting procedure in general

Free format text:DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general

Free format text:NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general

Free format text:RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general

Free format text:FINAL REJECTION MAILED

STPP | Information on status: patent application and granting procedure in general

Free format text:RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general

Free format text:ADVISORY ACTION MAILED

STPP | Information on status: patent application and granting procedure in general

Free format text:DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general

Free format text:NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP | Information on status: patent application and granting procedure in general

Free format text:PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF | Information on status: patent grant

Free format text:PATENTED CASE

CC | Certificate of correction
