US10960297B2 - Systems and methods for tracking a physical object using a passive object having a reflective surface - Google Patents

Systems and methods for tracking a physical object using a passive object having a reflective surface

Info

Publication number
US10960297B2
US10960297B2 (application US16/133,597; US201816133597A)
Authority
US
United States
Prior art keywords
simulation
passive
passive object
real
location profile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires 2038-11-17
Application number
US16/133,597
Other versions
US20200086208A1 (en)
Inventor
Steven M. Chapman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Disney Enterprises Inc
Original Assignee
Disney Enterprises Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Disney Enterprises Inc
Priority to US16/133,597
Assigned to DISNEY ENTERPRISES, INC. Assignment of assignors interest (see document for details). Assignors: CHAPMAN, STEVEN M.
Publication of US20200086208A1
Application granted
Publication of US10960297B2
Legal status: Active
Adjusted expiration

Abstract

Systems, methods, and devices are disclosed for tracking physical objects using a passive reflective object. A computer-implemented method includes obtaining a location profile derived from content capturing a passive object having a reflective surface reflecting one or more real-world objects. The passive object is attached to a physical object. The method further includes transmitting the location profile to a simulation device. The method further includes generating a virtual representation of the physical object based on the location profile of the passive object. The method further includes presenting the virtual representation in a simulation experience.

Description

TECHNICAL FIELD
The present disclosure relates generally to simulated experiences, such as that experienced in gaming environments and augmented reality (AR)/virtual reality (VR) environments.
BRIEF SUMMARY OF THE DISCLOSURE
Embodiments of the present disclosure include systems, methods, and devices capable of tracking the location of a physical object using a passive reflective surface for display in virtual environments such as augmented reality (AR)/virtual reality (VR).
In accordance with the technology described herein, a method for tracking physical objects using a passive reflective surface is disclosed. The computer-implemented method includes obtaining a location profile derived from content capturing a passive object having a reflective surface reflecting one or more real-world objects. The passive object is attached to a physical object. The method further includes transmitting the location profile to a simulation device. The method further includes generating a virtual representation of the physical object based on the location profile of the passive object. The method further includes presenting the virtual representation in a simulation experience.
In embodiments, the passive object includes a pattern to improve the location profile.
In embodiments, the pattern is a set of etched parallel lines on the passive object.
In embodiments, the location profile includes a position and an orientation of the passive object in the real-world environment.
In embodiments, obtaining a location profile derived from content further includes detecting the one or more real-world objects reflected in the passive object. Obtaining the location profile derived from content further includes generating vectors from the passive object to the one or more real-world objects.
In embodiments, the vector includes a Rodrigues' vector.
In embodiments, a camera capturing the content is integrated into the simulation device.
In embodiments, the passive object is a spheroid.
In embodiments, the simulation experience includes at least one of an augmented reality (AR) experience, a virtual reality (VR) experience, a motion capture performance, and a gaming experience.
In embodiments, the method may include obtaining a second location profile of the passive object. The second location profile includes a second position and a second orientation. The method may further include transmitting the second location profile to the simulation device. The method may further include presenting the virtual representation in a second location in the simulation experience based on the second location profile.
In accordance with additional aspects of the present disclosure, a system includes a camera. The system further includes a simulation device operatively connected to the camera. The simulation device may present a virtual representation of a physical object with an attached passive object having a reflective surface based on a location profile derived from content captured by the camera of one or more objects reflected off the passive object. The simulation device presents a dynamically moving virtual representation of the physical object associated with movement of the physical object in a simulation experience.
In embodiments, the passive object includes a mark associated with one or more changes to the simulation experience.
In embodiments, the mark is one or more of a barcode, QR code, UPC code, and serial number.
In embodiments, the one or more changes includes one or more of a virtual representation of the physical object, an additional simulation experience, an additional simulation event occurrence, and an additional virtual representation of a character in the simulation device.
In embodiments, the location profile includes a location and orientation of the passive object at a time.
In embodiments, the passive object is a spheroid and includes a pattern. The pattern may be a set of etched lines on the passive object running from a first pole of the passive object to a second pole of the passive object.
In embodiments, the simulation device further includes a lighting device in a fixed location in a real-world environment. The lighting device may be captured by the camera as a reflection in the passive object to calibrate the location profile by using a known location of the lighting device.
In accordance with additional aspects of the present disclosure, a computer-implemented method includes receiving content captured by a camera of a passive object having a reflective surface reflecting one or more real-world objects. The computer-implemented method further includes generating a location profile of the passive object attached to a physical object. The location profile includes a location and orientation of a passive object. The computer-implemented method further includes generating a location of a virtual representation of the physical object in the simulation experience. The computer-implemented method further includes presenting the simulation experience with the virtual representation in a virtual location.
In embodiments, generating a location profile of the passive object includes detecting the one or more real-world objects reflected in the passive object. Generating a location profile of the passive object further includes generating vectors from the passive object to the one or more real-world objects.
In embodiments, detecting the one or more real world objects includes one or more of centroid detection and corner detection.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure, in accordance with one or more various embodiments, is described in detail with reference to the following figures. The figures are provided for purposes of illustration only and merely depict typical or example embodiments.
FIG. 1 is an operational flow diagram illustrating an example process for providing a simulation experience with physical objects attached to passive objects having reflective surfaces in accordance with one embodiment.
FIG. 2A illustrates an example system for providing a simulation experience using physical objects having attached passive objects in accordance with various embodiments.
FIG. 2B illustrates example components of the simulation device in the system of FIG. 2A.
FIG. 3 illustrates an example simulation experience in accordance with one embodiment.
FIG. 4 illustrates an example simulation experience in accordance with one embodiment.
FIG. 5 is an example computing component that may be used to implement various features of embodiments described in the present disclosure.
The figures are not exhaustive and do not limit the present disclosure to the precise form disclosed.
DETAILED DESCRIPTION
Systems that provide or generate simulated environments often rely on embedded electronics to incorporate and track real-world objects in those simulated environments. For example, some conventional simulation environments rely on purely virtual objects generated by simulation devices. Some conventional simulation environments can include virtual objects that are generated based on real-world objects having embedded electronics. Other simulation environments may use laser and radio technology to track real-world objects for use as virtual objects in the simulation environments. For example, a real-world object having embedded electronics may communicate with a simulation device or network to associate that real-world object with a virtual object in a simulation environment. Such communications may involve the embedded electronics sending electronic signals about the real-world object's position. In this way, movement of and/or actions involving the real-world object in the real-world correspond to movement and/or actions in the simulation environment.
For example, a simulation experience may include VR, AR, or other virtual experiences. VR can refer to the creation of a fully immersive virtual world/space experience with which users may interact. AR can refer to the blending of VR elements and real life. For example, AR may involve providing a live, displayed experience of a physical, real-world environment in which the real-world elements are augmented or replaced by computer-generated images, video, or text. Still, other environments, such as that created in a video game, can involve simulating player interaction(s) within the gaming environment.
Accordingly, various embodiments of the present disclosure are directed to a device, apparatus, or other passive object 204 having a reflective surface (FIG. 2A) that can be attached to a (real-world) physical object 202. The passive object 204 allows a user to interact with or engage in a simulation experience within a simulation environment using that actual, physical object 202. In some embodiments, passive object 204 is a spheroid. It should be appreciated that passive object 204 could be other shapes. In some embodiments, the reflections reflected off passive object 204 may be captured by camera 205. The captured reflection may be used to generate a location profile that can be relayed to a simulation device or system 206 (either directly or indirectly through network 212) such that movement of the physical object 202 can be accurately represented and tracked in the simulation environment. In some embodiments, the location profile of the physical object 202 may be improved by patterns on passive object 204.
As used herein, the term physical object can refer to a real-world analog used to interact with or interact in a simulated environment, such as a gaming environment, an AR/VR experience, and the like. It is distinguished from dedicated controllers and other objects with embedded electronics that are used for controlling or interacting within a simulated experience or environment. For example, a user may use a pencil to simulate a wand in a simulated game generated or presented by a gaming system.
FIG. 1 is an operational flow diagram illustrating an example process for providing a simulation experience that utilizes physical objects having passive objects attached in accordance with one embodiment. The operations of the various methods described herein are not necessarily limited to the order described or shown in the figures, and one of skill in the art will appreciate, upon studying the present disclosure, variations of the order of the operations described herein that are within the spirit and scope of the disclosure.
The operations and sub-operations of the flow diagram may be carried out, in some cases, by one or more of the components, elements, devices, and circuitry of system 200. This may include one or more of: camera 205; simulation device 206 (including the corresponding components of simulation device 206); network 212; server system 214; server 216; processor 220; database 218; presentation output device 210; display 208; speakers 209; and/or computing component 500, described herein and referenced with respect to at least FIGS. 2A, 2B, and 5, as well as subcomponents, elements, devices, components, and circuitry depicted therein and/or described with respect thereto. In such instances, the description of the flow diagram may refer to a corresponding component, element, etc., but regardless of whether an explicit reference is made, it will be appreciated, upon studying the present disclosure, when the corresponding component, element, etc. may be used. Further, it will be appreciated that such references do not necessarily limit the described methods to the particular component, element, etc. referred to. Thus, it will be appreciated that aspects and features described above in connection with (sub-) components, elements, devices, circuitry, etc., including variations thereof, may be applied to the various operations described in connection with the flow diagram without departing from the scope of the present disclosure.
At operation 100, a location profile including a three-dimensional position of passive object 204 and an orientation or direction of passive object 204 is obtained. In embodiments, the location profile may be derived from objects reflected off passive object 204. For example, camera 205 may capture a video or images of passive object 204 (e.g., a spheroid). Passive object 204 may be reflecting spheroid representations of one or more objects in the real-world environment. In embodiments, corner detection, centroid detection, and/or other object recognition techniques may be used to detect the one or more objects captured in the spheroid representation. A vector, one example of a location profile, may be generated from the center of the passive object to a center of the one or more detected objects in the real-world environment. In some embodiments, the vector may be a Rodrigues' vector, a vector describing the three-dimensional position, an axis around which rotation occurs, and an angle of rotation of an object (e.g., passive object 204). The magnitude of the Rodrigues' vector may be the tangent of half the angle of rotation. In some embodiments, other location information (e.g., time, position, etc.) may be obtained to determine the location and rotation of passive object 204.
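As a concrete illustration of this operation, the sketch below (a minimal example, not the patent's implementation) performs centroid detection on the bright reflections inside the spheroid's image-space silhouette and forms unit vectors from the spheroid's center to each reflection. It assumes OpenCV and NumPy, a grayscale frame, and that the spheroid's image circle (center and radius) has already been located; the brightness threshold and function name are illustrative.

```python
# Minimal sketch of deriving per-frame reflection vectors from a spherical reflector.
# Assumes: grayscale uint8 frame, and the spheroid's image circle is already known.
import cv2
import numpy as np

def reflection_vectors(frame_gray, sphere_center, sphere_radius, threshold=200):
    """Return unit vectors (in image space) from the spheroid's center to the
    centroid of each bright reflection found inside the spheroid's silhouette."""
    cx, cy = sphere_center
    # Mask out everything outside the spheroid so only reflections are analyzed.
    mask = np.zeros_like(frame_gray)
    cv2.circle(mask, (int(cx), int(cy)), int(sphere_radius), 255, -1)
    roi = cv2.bitwise_and(frame_gray, frame_gray, mask=mask)

    # Centroid detection: threshold bright reflections, then take contour moments.
    _, binary = cv2.threshold(roi, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    vectors = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] == 0:
            continue
        ox, oy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        v = np.array([ox - cx, oy - cy], dtype=float)
        norm = np.linalg.norm(v)
        if norm > 1e-6:
            vectors.append(v / norm)
    return vectors
```

In a full pipeline these image-space vectors would be combined with the spheroid's known geometry to recover real-world directions, but that step is omitted here for brevity.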
Camera 205 may be stationary (e.g., fixed above or around the user, or otherwise capturing the real-world environment). In some embodiments, camera 205 may be mobile (e.g., attached to a user, integrated into an AR/VR headset, etc.). In some embodiments, camera 205 may be a 360 degree camera, a webcam, a video camera, a phone camera, and/or other type or form of camera. In some embodiments, there may be more than one camera.
In some embodiments, passive object 204 may be attached to physical object 202. This allows the physical object 202 to be incorporated into the simulation environment and experiences. For example, passive object 204 may be attached to physical objects using adhesives, Velcro® straps, buckles, mounts, and/or other attachment mechanisms. In one example, the physical object 202 may be a collectible associated with the simulation experience, such as a laser gun from a sci-fi game. The laser gun may be configured to receive passive object 204. Passive object 204 may have an optically capturable mark that includes information associated with the simulation experience.
Passive object 204 may include patterns to improve the accuracy of the location profile. For example, the patterns may be etched lines that mark axes of the physical object 202. It should be appreciated that other patterns may be used (e.g., parallel lines, orthogonally intersecting lines, curved lines that meet at the poles of passive object 204 like longitudinal lines, etc.), and other methods may be used to make the patterns (e.g., tape, paint, etc.). In embodiments, the location profile may also be improved by using supplemental lighting in the real-world environment to help calibrate the location profile based on a permanent known position. For example, a flashing LED may be positioned in a real-world environment. Based on the length of time between flashes of the LED, the flashing LED may be used as a calibration mechanism of a known position in the real-world environment.
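As a rough illustration of the LED calibration idea (an assumption-laden sketch, not taken from the patent), the code below estimates the blink period of one candidate reflection from per-frame brightness samples and matches it against a hypothetical table of known LEDs at known positions. The KNOWN_LEDS table, the tolerance, and the function name are all illustrative assumptions.

```python
# Sketch: identify a flashing calibration LED by its blink period.
import numpy as np

# Hypothetical table mapping blink period (seconds) to a known (x, y, z) position in metres.
KNOWN_LEDS = {0.5: (0.0, 2.4, 1.0), 1.0: (3.2, 2.4, 1.0)}

def match_calibration_led(timestamps, brightness, tolerance=0.05):
    """Estimate the blink period of a reflection and return the known world
    position of the matching LED, or None if no LED matches."""
    on = np.asarray(brightness) > np.median(brightness)
    # Timestamps at which the reflection switches from off to on.
    rising = [t for prev, cur, t in zip(on[:-1], on[1:], timestamps[1:]) if cur and not prev]
    if len(rising) < 2:
        return None
    period = float(np.mean(np.diff(rising)))
    for known_period, position in KNOWN_LEDS.items():
        if abs(period - known_period) <= tolerance:
            return position
    return None
```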
In embodiments, passive object 204 may include an optically capturable mark (not shown) that associates passive object 204 with changes in a simulation experience. It should be appreciated that there may be more than one mark on passive object 204 for multiple changes in a simulation experience. For example, the optically capturable mark may have a universal product code (UPC), a serial number, a barcode, a model number, a QR code, or other identifier which a user may scan with a device, such as a smart phone or camera 205. The scanning device can obtain a profile associated with the physical object 202 from local memory, simulation device 206, and/or the server system 214. The scanning device may transmit the profile to passive object 204 using wireless communications via Bluetooth, Wi-Fi, infrared communications, Near Field Communications (NFC), for example, or through a wired connection, such as via Universal Serial Bus (USB). The optically capturable mark may be etched into or otherwise integrated into passive object 204.
For example, the profile may be transmitted to a personal computing device, such as a PC, a smart phone, a tablet, a dedicated configuration device, a gaming console, a simulation device (discussed in greater detail below), or similar device. When scanned and captured, the mark may be associated with an upgraded item, provide a new simulation experience, unlock a new character, and/or present other changes reflected in the simulation environment. In embodiments, passive object 204 can have one or more marks for one or more changes. A particular profile associated with a mark can be selected by interacting with the personal computing device.
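For the mark itself, a minimal sketch (assuming a QR code and OpenCV, neither of which the disclosure requires) might decode the mark and look up an associated profile. The PROFILES table is a hypothetical stand-in for the local memory, simulation device 206, or server system 214 described above.

```python
# Sketch: decode an optically capturable mark (here a QR code) and fetch its profile.
import cv2

PROFILES = {  # hypothetical identifiers and profiles
    "SWORD-001": {"virtual_object": "sword", "color": "blue"},
    "BLASTER-07": {"virtual_object": "laser gun", "unlocks": "bonus level"},
}

def profile_from_mark(image_bgr):
    """Decode a QR mark captured by the camera and return its profile, if any."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(image_bgr)
    if not data:
        return None
    return PROFILES.get(data)
```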
FIG. 2A illustrates an example system in which various embodiments may be implemented. One or more location profiles may be obtained from server system 214 or simulation device 206 that has processed the video or images from camera 205 of passive object 204, as described above. Server system 214 may include a server 216, a database 218, and a processor 220 from which location profiles can be obtained via network 212. As will be appreciated, the server 216, the database 218, and the processor 220 may be communicatively coupled to transmit signals and/or information within the server system. In some embodiments, simulation device 206 and/or server system 214 may transmit a location profile that includes the minimum requirements to determine the location of passive object 204 to improve processing times.
Network 212 may be any communications network such as a cellular or data network, a satellite network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a personal area network (PAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), or any combination thereof. Accordingly, network 212 may employ various communication media, such as a coaxial cable, fiber optic cable system, Ethernet, radio waves, etc. Further still, network 212 may be one or more combinations of any of the aforementioned networks.
At operation 102, the location profile obtained from the passive object may be transmitted to a simulation device. Referring again to FIG. 2A, simulation device 206 may be a set top box, a gaming console, or in some embodiments, may be part of a TV, monitor, head-mounted display (HMD) device, or some similar device. Transmission of the location profile can occur wirelessly via Bluetooth, Wi-Fi, infrared communications, NFC, for example, or through a wired connection, such as via USB. In this way, simulation device 206 can be apprised of where physical object 202, which is to be used for interaction with or in a simulation environment generated by simulation device 206, is located in the real-world environment to inform the simulation environment.
At operation 104, a virtual representation of the physical object may be generated based on the location profile of the passive object. Passive object 204 and simulation device 206 may interact such that movement of the physical object 202 to which passive object 204 is attached or associated is dynamically tracked as a virtual representation in a simulation experience generated by simulation device 206. The dynamic tracking is based on the changing location profile. In some embodiments, a first location of passive object 204 may have a first location profile and a second location of passive object 204 may have a second location profile. Depending on the frame rate of camera 205, the rate at which the location profile updates, and accordingly, the rate at which the virtual representation is dynamically tracked, may change. The resolution of camera 205, the portion of passive object 204 captured by camera 205, and/or other factors may affect updating the location of the virtual representation of physical object 202 to which passive object 204 is attached.
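The dynamic tracking described here can be pictured as a simple per-frame update loop. The sketch below is a minimal, assumption-heavy illustration: LocationProfile, derive_profile(), and render_virtual_object() are illustrative stand-ins, and the effective update rate is bounded by the camera's frame rate, as noted above.

```python
# Sketch: update the virtual representation from successive location profiles.
from dataclasses import dataclass

@dataclass
class LocationProfile:
    position: tuple       # (x, y, z) in the real-world frame
    orientation: tuple    # rotation vector (e.g., Rodrigues-style)
    timestamp: float      # capture time in seconds

def track(camera_frames, derive_profile, render_virtual_object):
    """Consume (frame, timestamp) pairs, derive a location profile per frame,
    and push each newer pose to the simulation's virtual representation."""
    last = None
    for frame, timestamp in camera_frames:
        profile = derive_profile(frame, timestamp)  # e.g., built on reflection_vectors() above
        if profile is None:
            continue  # passive object occluded or not detected in this frame
        if last is None or profile.timestamp > last.timestamp:
            render_virtual_object(profile.position, profile.orientation)
            last = profile
```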
At operation 106, the virtual representation is presented in the simulation experience. The simulation experience may be presented on a presentation output device 210. In some embodiments, presentation output device 210 may include at least one of display 208 for presenting visual aspects of the simulation experience, and one or more speakers 209 through which audio aspects of the simulation experience may be presented.
Display 208 may provide a simulation experience through visual information presented thereon. Visual information may include information that may be observed visually, such as an image, video, and/or other visual information. Display 208 may be included in or be embodied as one or more of an HMD 410 (see FIG. 4), an HMD in which simulation device 206 (or alternatively, presentation device 210) may be integrated, a see-through display, an optical see-through display, a video see-through display, a visor, eyeglasses, sunglasses, a computer, a laptop, a smartphone, a tablet, a mobile device, a projector, a monitor, a TV, and/or other displays.
In some implementations, display 208 may include a motion, position, and/or orientation tracking component, in addition to camera 205, so that the visual information presented on display 208 changes as the position and/or orientation of display 208 and the user changes. Display 208 may be configured to display a simulation experience using AR, VR, or other simulation presentation technology. For example, display 208 may visually provide the simulation experience by displaying an overlay image over one or more of an image, a video, and/or other visual information so that one or more parts of real-world objects appear to be augmented by one or more parts of virtual-world objects. In some implementations, display 208 may use AR or VR technology to display a simulation experience by using systems and methods described in U.S. patent application Ser. No. 14/966,754, entitled "SYSTEMS AND METHODS FOR AUGMENTING AN APPEARANCE OF AN ACTUAL VEHICLE COMPONENT WITH A VIRTUAL VEHICLE COMPONENT," filed Dec. 11, 2015, the foregoing being incorporated herein by reference in its entirety. Other systems and methods of providing a simulation experience are contemplated.
Speaker 209 may provide a simulation experience through audio information generated by speaker 209. Audio information may include information that may be observed audibly. Audio information may include one or more of sound, vibration, and/or other audio information associated with the simulation experience and/or virtual representation of physical object 202, such as the sound of a laser blast. Speaker 209 may include one or more of a headphone, an earphone, a headset, an earset, and/or other speakers. In some implementations, speaker 209 may include a speaker associated with display 208.
FIG. 2B illustrates example components that may make up simulation device 206. Simulation device 206 may include the following components: a simulation profile component 206A, a simulation event occurrence component 206B, a simulation stimuli component 206C, a simulation location profile component 206D, a simulation provision component 206E, an environmental information component 206F, and at least one sensor 206G. It should be noted that not all of the aforementioned components are necessarily needed, and other components, such as local memory, processors, communication components, user interface components, etc. (some of which are depicted in FIG. 4) may be present.
Referring to simulation device 206, the profiles associated with one or more marks on passive object 204 (or information therein) may be received or obtained by a simulation profile component 206A. Simulation profile component 206A may obtain the profile from a local memory unit, buffer, or cache. Simulation profile component 206A may download or otherwise retrieve a profile from one or more data stores that may be remotely located. It should be noted that the content of a profile may differ depending on the mark associated with the profile. Some profiles may simply include information indicating a type of virtual object, whereas some profiles may additionally include information indicating color, weight, length, height, or any other characteristic that may be relevant in generating a simulation experience. In some embodiments, only information regarding one or more particular characteristics may be received or obtained by simulation device 206 rather than the entire profile. The user may alter or update one or more of the profiles received by simulation profile component 206A. For example, the user may wish to customize aesthetic aspects to be represented in the simulation environment, such as a color of a virtual representation of physical object 202, a different material, branding, adding or subtracting graphics, etc., that may become available based on the mark. The user may wish to adjust certain characteristics so that the user can determine how changes to the one or more characteristics may affect the performance of passive object 204 attached to physical object 202 in the simulation environment.
Simulation event occurrence component 206B is configured to receive and/or generate event occurrence information. Simulation event occurrence component 206B may be configured to identify occurrences of simulation events within the simulation experience based on use of physical object 202, but from the perspective of simulation device 206 and/or environmental information component 206F. Simulation event occurrence component 206B may identify occurrences of simulation events based upon information received at environmental information component 206F or communicated to simulation event occurrence component 206B. A simulation event may refer to one or more of specific motions, specific actions, specific sounds, specific locations, specific surroundings, and/or other specific conditions relating to physical object 202, the user of physical object 202, and/or the contextual premise of the simulation experience. Occurrences of simulation events may be identified based on one or more of motion information, activity information, and environment information. Simulation event occurrence component 206B may be configured to identify an occurrence of a simulation event when one or more of motion information, activity information, and/or environment information indicates an occurrence of one or more of specific motions, specific actions, specific sounds, specific locations, specific surroundings, and/or other specific conditions relating to physical object 202 and/or the user of physical object 202 that corresponds to a specific simulation event.
Simulation stimuli component 206C of simulation device 206 may be configured to receive and/or generate simulation stimuli that correspond to simulation events for which occurrences are identified. A simulation stimulus may refer to one or more of a visual, an audio, a haptic, and/or other simulation that may change a simulation experience. Simulation stimuli component 206C may also receive instructions to generate simulation stimuli local to the user and physical object 202. Simulation stimuli component 206C may include one or more stimuli output components (not shown), such as LED lights, one or more speakers, etc. Information for instructing simulation stimuli component 206C to generate stimuli may be received.
Simulation device 206 may access simulation location profile component 206D to determine a location profile based on the received content or location profile. In embodiments, simulation device 206 and/or server system 214 may receive content (e.g., video or images) and process the content reflected off passive object 204 to generate a location profile. The location profile may include one or more vectors, such as, for example, a Rodrigues' vector, from the center of passive object 204 to the center of the one or more detected objects that are reflected off passive object 204. The location profile may be used to determine a location and orientation of passive object 204 in the simulation environment.
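To make the rotation encoding concrete, here is a minimal sketch (not from the patent) that converts a Rodrigues vector under the convention stated above, where the vector points along the rotation axis and its magnitude is the tangent of half the rotation angle, into a 3x3 rotation matrix. Note that OpenCV's cv2.Rodrigues uses a different convention (the vector's magnitude is the angle itself), so this pure-NumPy version follows the description in the text instead.

```python
# Sketch: rotation matrix from a Rodrigues (Gibbs) vector g with |g| = tan(theta / 2).
import numpy as np

def rotation_from_rodrigues(g):
    """Return the 3x3 rotation matrix encoded by Gibbs vector g."""
    gx, gy, gz = np.asarray(g, dtype=float)
    # Skew-symmetric cross-product matrix of g.
    K = np.array([[0.0, -gz,  gy],
                  [ gz, 0.0, -gx],
                  [-gy,  gx, 0.0]])
    g2 = gx * gx + gy * gy + gz * gz          # |g|^2 = tan^2(theta / 2)
    # Cayley form: R = I + 2 (K + K @ K) / (1 + |g|^2)
    return np.eye(3) + 2.0 * (K + K @ K) / (1.0 + g2)
```

For example, g = (0, 0, tan(pi/8)) yields a 45 degree rotation about the z-axis.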
Simulation events obtained from simulation event occurrence component 206B may be communicated to simulation provision component 206E. Simulation provision component 206E may be configured to provide a simulated experience by operating presentation output device 210. The simulation experience can be achieved through one or more of visual, audio, haptic, and/or other simulations, where the visual, audio, haptic, and/or other simulation changes responsive to simulation event occurrences and simulation stimuli. The simulation event occurrences and/or simulation stimuli may be based on the profile associated with a mark of physical object 202.
Environmental information component 206F may be configured to obtain or receive information regarding the surrounding actions, elements, or other relevant factors or aspects of the surrounding environment that may impact or be affected by the use of the passive object 204 attached to physical object 202. Without limitation, environmental information may include motion, action, sound, location, surroundings, and/or other information relating to physical object 202 with attached passive object 204 and/or a person using physical object 202. Environmental information may be obtained or received from output signals generated by sensor 206G. However, such environmental information is obtained at, or from, the perspective of simulation device 206. In some embodiments, environmental information component 206F may be used to obtain all relevant information regarding the surrounding actions, elements, and/or other relevant factors and/or aspects of the surrounding environment that may impact or be affected by the use of physical object 202.
Sensor 206G may include one or more of image sensors, audio sensors, temperature sensors, motion sensors, accelerometers, tilt sensors, inclination sensors, angular rate sensors, gyroscopes, navigation sensors, geolocation sensors, magnetometers, radar detectors, radar sensors, proximity sensors, distance sensors, vibration sensors, light detection sensors, vehicle sensors, engine control component sensors, and/or other sensors. In some embodiments, sensors may further include cameras, a tracking marker, a microphone, or any other component that captures environmental information. In some embodiments, sensor 206G may be worn by a user. In some embodiments, sensor 206G may be installed in simulation device 206 or otherwise coupled to simulation device 206. It should be noted that although only one sensor 206G is illustrated, various embodiments contemplate the use of more than one sensor or some combination of the aforementioned sensors.
FIG. 3 illustrates an example simulation experience with a physical object provided to a user in accordance with various embodiments. A user 300 may wish to engage in a simulation experience, such as through a computer game or AR/VR experience utilizing physical object 302, i.e., a real-world racquet. The simulation experience may be provided by a gaming console 306 in which a simulation device may be embodied. Physical object 302 is distinguished from a conventional paddle controller 307, usually associated with gaming console 306, that includes embedded electronics and is used for interacting in the simulation experience.
To utilize physical object 302, user 300 may attach a passive object 304 to physical object 302. The manner in which passive object 304 is attached to physical object 302 can vary. In some embodiments, passive object 304 is provided with a reusable adhesive allowing user 300 to attach passive object 304 to physical object 302. In some embodiments, passive object 304 may be configured with one or more attachment mechanisms, such as a Velcro® or buckled strap, a clamp, a magnet, a suction cup, or other attachment mechanism. In some embodiments, passive object 304 may be a spheroid reflective mirror, a mirror, a multi-faceted mirror, or other passive object. It should be appreciated that the shape of a given passive object may offer less reflection of the environment than the shape of another given passive object.
FIG. 3 illustrates two cameras 322 and 324. In this embodiment, camera 322 may be implemented at gaming console 306, while camera 324 may be implemented anywhere a user wishes to locate a camera, such as on a room wall. It should be noted that more or fewer cameras may be utilized in accordance with various embodiments. Although not necessary in all embodiments, multiple cameras can increase the precision of a simulation experience by increasing the amount of data indicative of the movement and/or positioning of passive object 304 attached to physical object 302. Camera 322 or camera 324 may capture a real-world environment surrounding user 300, a portion of passive object 304 attached to physical object 302, user 300, etc. The captured content may be used to derive a location profile of the passive object, as described above. In embodiments, one of camera 322 or 324 may be a tracking marker or tracking sensor picked up by sensors and/or other electronics of gaming console 306 to track conventional controllers (e.g., conventional paddle controller 307).
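Where more than one camera sees the passive object, their estimates can be combined. The sketch below shows one deliberately simple, illustrative way to do it: a confidence-weighted average of two per-camera position estimates. The weights (for example, the pixel area of the passive object visible to each camera) and the function name are assumptions for illustration, not something the disclosure prescribes.

```python
# Sketch: fuse position estimates of the passive object from two cameras.
import numpy as np

def fuse_positions(pos_a, weight_a, pos_b, weight_b):
    """Weighted average of two 3-D position estimates; returns None if both weights are zero."""
    total = weight_a + weight_b
    if total == 0:
        return None
    return (np.asarray(pos_a, dtype=float) * weight_a
            + np.asarray(pos_b, dtype=float) * weight_b) / total
```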
User 300 may select a profile for physical object 302 through a user interface on gaming console 306. Selection of the profile can occur at any time and need not only occur upon attachment of passive object 304 to physical object 302.
A simulation experience may be initiated via gaming console 306. As illustrated, user 300 may now utilize physical object 302 to participate, interact, control, or otherwise engage in the simulation experience, which may be presented on a presentation output device 310, i.e., a TV. Depending on the profile selected by user 300 and/or any alterations to one or more characteristics based on the profile, the simulation device embodied as, or within, gaming console 306 can generate an accurate representation and experience for user 300. The simulation device may obtain, receive, and/or generate a location profile; obtain, receive, and/or generate simulation event occurrence information; obtain, receive, and/or generate simulation stimuli; etc. In this way, user 300 can engage in a more accurate simulation experience than can be provided by conventional controller 307.
FIG. 4 illustrates an example simulation experience with a physical object provided to a user in accordance with various embodiments. A user 400 may wish to engage in a simulation experience, such as through a computer game or AR/VR experience utilizing physical object 402, i.e., a real-world sword toy built at an amusement park, that the user has brought home along with an associated passive object 404. The simulation experience may be provided by an HMD 410 in which a simulation device and presentation output device may be embodied.
To utilize physical object 402, user 400 may attach passive object 404 to physical object 402. Camera 412 may be used to capture passive object 404. The content captured by camera 412 may be processed to recognize one or more objects reflected off passive object 404 in a real-world environment and generate positional and orientation vectors from the center of the passive object 404 to the one or more objects. In some embodiments, camera 412 may be integrated into HMD 410. User 400 may select a profile associated with physical object 402 through a user interface on HMD 410. In this embodiment, the profile for physical object 402 may be obtained by establishing a connection to the amusement park server/database, which may be an embodiment of server system 214 (see FIG. 2A). In some embodiments, the mark (not shown) on the passive object 404 may update or supplement the profile from server system 214 based on a date, time of day, location, or other factor. The profile may have been created upon user 400 building physical object 402 and a corresponding passive object 404 that has been configured with one or more profiles via a mark. The profile may be stored in the amusement park server/database.
A simulation experience may be initiated via HMD 410. As illustrated, user 400 may now utilize physical object 402 (represented as a virtual sword 411) to participate, interact, control, or otherwise engage in the simulation experience, which may be presented through HMD 410. Depending on the profile selected by user 400, the simulation device embodied as, or within, HMD 410 can generate an accurate representation and experience for user 400. The simulation device may obtain, receive, and/or generate a reflection location profile commensurate with the location profile, exchange simulation event occurrence information to generate simulation stimuli (such as sword sound effects), etc.
It should be noted that user 400 may elect to change one or more profiles associated with physical object 402, such as the color or style, as described in greater detail above. User 400 can change such profiles via a user interface on HMD 410 or via a connected computing device user interface, such as a smart phone application.
It should be further noted that in the context of the simulation experience presented to user 400, user 400 may be given the opportunity to "unlock" and play with other versions of the virtual representation of the physical object, e.g., a virtual sword having different characteristics, such as more power in the simulation experience, a different color in the simulation experience, etc. Unlocking these other versions may be predicated upon user 400 paying an upgrade fee within the simulation experience, which may be an online gaming experience. Accordingly, HMD 410 may be configured with functionality to present a transaction interface and/or connectivity to a service provider so that the user 400 can engage in a transaction to purchase the upgraded or modified version of the virtual representation of the physical object.
The functionality described in the present disclosure can also be utilized in the context of performances. For example, a performer may perform in a green screen room to be adapted into a virtual character in a movie. Instead of using expensive laser grid systems or embedded electronics in a body suit, the disclosed technology may be used. One or more passive objects may be attached to a performer (e.g., a passive object on each arm, each leg, the torso, and the head). More passive objects attached to a performer may improve the granularity of a performance (e.g., attaching a passive object to the forearm and the upper arm may make it possible to distinguish a performer's forearm movement from the upper arm movement).
FIG. 5 illustrates an example computing component that may be used to implement various features of the system and methods disclosed herein, for example, one or more elements of system 200, such as simulation device 206.
As used herein, the term component might describe a given unit of functionality that can be performed in accordance with one or more embodiments of the present application. As used herein, a component might be implemented utilizing any form of hardware, software, or a combination thereof. For example, one or more processors, controllers, ASICs, PLAs, PALs, CPLDs, FPGAs, logical components, software routines or other mechanisms might be implemented to make up a component. In implementation, the various components described herein might be implemented as discrete parts or the functions and features described can be shared in part or in total among one or more components. In other words, as would be apparent to one of ordinary skill in the art after reading this description, the various features and functionality described herein may be implemented in any given application and can be implemented in one or more separate or shared components in various combinations and permutations. Even though various features or elements of functionality may be individually described or claimed as separate components, one of ordinary skill in the art will understand that these features and functionality can be shared among one or more common software and hardware elements, and such description shall not require or imply that separate hardware or software components are used to implement such features or functionality.
Where components of the application are implemented in whole or in part using software, in one embodiment, these software elements can be implemented to operate with a computing or processing component capable of carrying out the functionality described with respect thereto. One such example computing component is shown in FIG. 5. Various embodiments are described in terms of this example computing component 500. After reading this description, it will become apparent to a person skilled in the relevant art how to implement the application using other computing components or architectures.
Referring now to FIG. 5, computing component 500 may represent, for example, computing or processing capabilities found within a self-adjusting display, desktop, laptop, notebook, and tablet computers; hand-held computing devices (tablets, PDAs, smart phones, cell phones, palmtops, etc.); workstations or other devices with displays; servers; or any other type of special-purpose or general-purpose computing devices as may be desirable or appropriate for a given application or environment. Computing component 500 might also represent computing capabilities embedded within or otherwise available to a given device. For example, a computing component might be found in other electronic devices such as, for example, navigation systems, portable computing devices, VR/AR devices, HMDs, simulation devices, gaming consoles, and other electronic devices that might include some form of processing capability.
Computing component 500 might include, for example, one or more processors, controllers, control components, or other processing devices, such as a processor 504. Processor 504 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, a controller, or other control logic. In the illustrated example, processor 504 is connected to a bus 502, although any communication medium can be used to facilitate interaction with other components of computing component 500 or to communicate externally.
Computing component 500 might also include one or more memory components, simply referred to herein as main memory 508. For example, random access memory (RAM) or other dynamic memory might be used for storing information and instructions to be executed by processor 504. Main memory 508 might also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 504. Computing component 500 might likewise include a read only memory ("ROM") or other static storage device coupled to bus 502 for storing static information and instructions for processor 504.
The computing component 500 might also include one or more various forms of information storage mechanism 510, which might include, for example, a media drive 512 and a storage unit interface 520. The media drive 512 might include a drive or other mechanism to support fixed or removable storage media 514. For example, a hard disk drive, a solid state drive, a magnetic tape drive, an optical disk drive, a compact disc (CD) or digital video disc (DVD) drive (R or RW), or other removable or fixed media drive might be provided. Accordingly, storage media 514 might include, for example, a hard disk, an integrated circuit assembly, magnetic tape, cartridge, optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to, or accessed by media drive 512. As these examples illustrate, the storage media 514 can include a computer usable storage medium having stored therein computer software or data.
In alternative embodiments, information storage mechanism 510 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing component 500. Such instrumentalities might include, for example, a fixed or removable storage unit 522 and an interface 520. Examples of such storage units 522 and interfaces 520 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory component) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 522 and interfaces 520 that allow software and data to be transferred from the storage unit 522 to computing component 500.
Computing component 500 might also include a communications interface 524. Communications interface 524 might be used to allow software and data to be transferred between computing component 500 and external devices. Examples of communications interface 524 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX, or other interface), a communications port (such as, for example, a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface. Software and data transferred via communications interface 524 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical), or other signals capable of being exchanged by a given communications interface 524. These signals might be provided to communications interface 524 via channel 528. This channel 528 might carry signals and might be implemented using a wired or wireless communication medium. Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communication channels.
Further still, computing component 500 may include a user interface 530. User interface 530 may include a display, a physical input mechanism such as one or more buttons, softkeys, or other actuatable components, or a combination thereof.
In this document, the terms "computer program medium" and "computer usable medium" are used to generally refer to transitory or non-transitory media such as, for example, memory 508, storage unit 520, media 514, and channel 528. These and other various forms of computer program media or computer usable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution. Such instructions embodied on the medium are generally referred to as "computer program code" or a "computer program product" (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing component 500 to perform features or functions of the present application as discussed herein.
Although described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the application, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present application should not be limited by any of the above-described exemplary embodiments.
Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term “including” should be read as meaning “including, without limitation” or the like; the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; the terms “a” or “an” should be read as meaning “at least one,” “one or more” or the like; and adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, where this document refers to technologies that would be apparent or known to one of ordinary skill in the art, such technologies encompass those apparent or known to the skilled artisan now or at any time in the future.
The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent. The use of the term “component” does not imply that the parts or functionality described or claimed as part of the component are all configured in a common package. Indeed, any or all of the various parts of a component, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
Additionally, the various embodiments set forth herein are described in terms of exemplary block diagrams, flow charts, and other illustrations. As will become apparent to one of ordinary skill in the art after reading this document, the illustrated embodiments and their various alternatives can be implemented without confinement to the illustrated examples. For example, block diagrams and their accompanying description should not be construed as mandating a particular architecture or configuration.

Claims (19)

What is claimed is:
1. A computer-implemented method, comprising:
obtaining a location profile of a passive object, including:
capturing, using a first camera, a first image of the passive object having a reflective surface reflecting a real-world object, wherein the passive object is attached to a physical object,
detecting, in the first image, a reflection of the real-world object reflected in the passive object, and
generating the location profile of the passive object based on the reflection detected in the first image;
transmitting the location profile to a simulation device, wherein the simulation device comprises a lighting device in a known fixed location in a real-world environment, the lighting device is captured by the first camera as a second reflection reflected in the passive object, and the second reflection is configured to be used in a calibration of the location profile by using the known fixed location of the lighting device;
generating a virtual representation of the physical object based on the location profile of the passive object; and
presenting the virtual representation in a simulation experience, wherein the passive object includes an optically capturable mark configured to associate the passive object with a change in the simulation experience, and wherein the change in the simulation experience comprises at least one of a change to the virtual representation of the physical object, an additional simulation experience, an additional simulation event, or a virtual character.
2. The computer-implemented method of claim 1, wherein the optically capturable mark includes a pattern of etched parallel lines.
3. The computer-implemented method of claim 1, wherein the location profile comprises a position and an orientation of the passive object in a real-world environment.
4. The computer-implemented method of claim 1, wherein obtaining the location profile further comprises generating a vector from a center of the passive object to a center of the real-world object.
5. The computer-implemented method of claim 1, wherein the simulation experience comprises at least one of an augmented reality (AR) experience, a virtual reality (VR) experience, a motion capture performance, or a gaming experience.
6. The computer-implemented method of claim 3, further comprising:
obtaining a second location profile of the passive object, wherein the second location profile comprises a second position and a second orientation of the passive object;
transmitting the second location profile to the simulation device; and
presenting the virtual representation in a second location in the simulation experience based on the second location profile.
7. A system, comprising:
a first camera; and
a simulation device operatively connected to the first camera, the simulation device presenting a virtual representation of a physical object with an attached passive object having a reflective surface, wherein the virtual representation is based on a location profile of the passive object, and the location profile is configured to be generated by:
capturing, using the first camera, a first image of the passive object reflecting a real-world object,
detecting, in the first image, a reflection of the real-world object reflected in the passive object, and
generating the location profile of the passive object based on the reflection; wherein the simulation device is configured to present a virtual representation of the physical object in a simulation experience,
wherein the passive object includes an optically capturable mark configured to associate the passive object with a change in the simulation experience,
wherein the change in the simulation experience comprises at least one of a change to the virtual representation of the physical object, an additional simulation experience, an additional simulation event, or a virtual character, and
wherein the simulation device comprises a lighting device in a known fixed location in a real-world environment, the lighting device is captured by the first camera as a second reflection reflected in the passive object, and the second reflection is configured to be used in a calibration of the location profile by using the known fixed location of the lighting device.
8. The system of claim 7, wherein the optically capturable mark comprises one or more of a barcode, a QR code, a UPC code, or a serial number.
9. The system of claim 7, wherein the location profile comprises a position and an orientation of the passive object in a real-world environment.
10. The system of claim 7, wherein the optically capturable mark includes a pattern of etched lines.
11. A computer-implemented method, comprising:
capturing, by a first camera, content of a passive object having a reflective surface reflecting a real-world object, the passive object being attached to a physical object;
receiving the content;
generating a location profile of the passive object, the location profile comprising a location and an orientation of the passive object;
generating, with a simulation device, a virtual representation of the physical object in a simulation experience, wherein the simulation device comprises a lighting device in a known fixed location in a real-world environment, the lighting device is captured by the first camera as a second reflection reflected in the passive object, and the second reflection is configured to be used in a calibration of the location profile by using the known fixed location of the lighting device; and
presenting the simulation experience with the virtual representation in a virtual location, wherein generating the location profile of the passive object includes detecting the real-world object reflected in the passive object, wherein the passive object includes an optically capturable mark configured to associate the passive object with a change in the simulation experience, and wherein the change in the simulation experience comprises at least one of a change to the virtual representation of the physical object, an additional simulation experience, an additional simulation event, or a virtual character.
12. The computer-implemented method of claim 11, wherein generating the location profile of the passive object includes generating a vector from the passive object to the real-world object.
13. The computer-implemented method of claim 11, wherein detecting the real-world object comprises one or more of centroid detection or corner detection.
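Claim 13's centroid and corner detection can be illustrated with standard OpenCV calls. The threshold and detector parameters below are arbitrary placeholders, and a grayscale crop of the reflective region is assumed as input.

```python
import cv2
import numpy as np

def detect_reflection_features(gray_image: np.ndarray):
    """Locate the reflected real-world object in the mirrored region using
    centroid detection and corner detection."""
    _, mask = cv2.threshold(gray_image, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"]) if m["m00"] else None
    corners = cv2.goodFeaturesToTrack(gray_image, maxCorners=20,
                                      qualityLevel=0.01, minDistance=5)
    return centroid, corners
```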
14. The computer-implemented method of claim 1, further comprising presenting a transaction interface configured to allow a user to purchase the change in the simulation experience.
15. The computer-implemented method of claim 1, wherein the optically capturable mark is associated with a profile including information indicating a type of virtual representation of the physical object.
16. The computer-implemented method of claim 15, wherein the profile comprises information indicating one of a color, a weight, a length, or a height of the virtual representation of the physical object.
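The mark-associated profile of claims 15 and 16 can be thought of as a lookup from mark identifier to representation attributes; the keys and attribute values below are invented examples.

```python
# Hypothetical profiles keyed by decoded mark identifier (claims 15-16).
MARK_PROFILES = {
    "prop-sword-01": {"type": "sword", "color": "silver",
                      "weight_kg": 1.2, "length_m": 0.9, "height_m": 0.1},
}

def representation_profile(mark_id):
    """Return the attributes used to build the virtual representation."""
    return MARK_PROFILES.get(mark_id, {})
```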
17. The computer-implemented method of claim 1, wherein obtaining the location profile further comprises:
capturing, using a second camera, a second image of the passive object reflecting the real-world object;
detecting, in the second image, the reflection of the real-world object reflected in the passive object, and
generating the location profile of the passive object based on the reflection detected in the first image and the second image.
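When the reflection is detected by two calibrated cameras, as in claim 17, the two detections can be combined by triangulation. The sketch below uses OpenCV's triangulatePoints with normalized (identity-intrinsics) projection matrices; real cameras would need their calibrated intrinsics and extrinsics.

```python
import cv2
import numpy as np

def triangulate_reflection(P1, P2, pixel_cam1, pixel_cam2):
    """Recover a 3-D point from the reflection's pixel location in two views.
    P1 and P2 are 3x4 projection matrices; pixel coordinates are (u, v)."""
    pts1 = np.asarray(pixel_cam1, float).reshape(2, 1)
    pts2 = np.asarray(pixel_cam2, float).reshape(2, 1)
    point_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # homogeneous 4x1
    return (point_h[:3] / point_h[3]).ravel()

# Toy example: camera 2 is shifted 0.1 units along x; a point 2 units ahead
# projects to (0, 0) in camera 1 and roughly (-0.05, 0) in camera 2.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
point = triangulate_reflection(P1, P2, (0.0, 0.0), (-0.05, 0.0))  # ~ (0, 0, 2)
```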
18. The system of claim 7, further comprising a second camera, wherein one of the first camera or the second camera is a 360 degree camera.
19. The system of claim 18, wherein the 360 degree camera is located above a user.
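For the 360 degree camera of claims 18 and 19, an equirectangular frame can be turned into viewing rays with basic spherical trigonometry. The pixel-to-angle layout below is an assumption about the camera's projection, not a statement from the patent.

```python
import numpy as np

def equirectangular_ray(u, v, width, height):
    """Unit viewing ray for pixel (u, v) of an equirectangular 360 degree frame,
    assuming u maps to longitude and v to latitude. With the camera mounted
    above the user, the ray points from the camera toward whatever it sees,
    including the reflective passive object."""
    lon = (u / width) * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (v / height) * np.pi
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

ray = equirectangular_ray(1920, 540, 3840, 2160)  # example pixel in a 4K equirectangular frame
```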
US16/133,597 | Priority date: 2018-09-17 | Filing date: 2018-09-17 | Systems and methods for tracking a physical object using a passive object having a reflective surface | Active, expires 2038-11-17 | US10960297B2 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US16/133,597 (US10960297B2) | 2018-09-17 | 2018-09-17 | Systems and methods for tracking a physical object using a passive object having a reflective surface

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
US16/133,597 (US10960297B2) | 2018-09-17 | 2018-09-17 | Systems and methods for tracking a physical object using a passive object having a reflective surface

Publications (2)

Publication Number | Publication Date
US20200086208A1 (en) | 2020-03-19
US10960297B2 (en) | 2021-03-30

Family

ID=69774645

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US16/133,597 (US10960297B2, Active, expires 2038-11-17) | Systems and methods for tracking a physical object using a passive object having a reflective surface | 2018-09-17 | 2018-09-17

Country Status (1)

Country | Link
US (1) | US10960297B2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
GB2586059B (en)* | 2019-08-01 | 2023-06-07 | Sony Interactive Entertainment Inc | System and method for generating user inputs for a video game
JP7672800B2 (en)* | 2020-06-23 | 2025-05-08 | 株式会社ソニー・インタラクティブエンタテインメント | Information processing device, method, program, and information processing system
KR20240169079A (en)* | 2022-03-30 | 2024-12-02 | 유니버셜 시티 스튜디오스 엘엘씨 | Systems and methods for generating responses to interactions within an interactive environment
US11995249B2 (en)* | 2022-03-30 | 2024-05-28 | Universal City Studios LLC | Systems and methods for producing responses to interactions within an interactive environment

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20060028474A1 (en)* | 2004-08-05 | 2006-02-09 | Hanspeter Pfister | Rendering deformable and animated surface reflectance fields
US20110069299A1 (en)* | 2009-09-23 | 2011-03-24 | En-Feng Hsu | Distance-measuring device of measuring distance according to variation of imaging location and calibrating method thereof
US20120276997A1 (en)* | 2011-04-29 | 2012-11-01 | Xmg Studio, Inc. | Systems and methods of importing virtual objects using barcodes
US20130050426A1 (en)* | 2011-08-30 | 2013-02-28 | Microsoft Corporation | Method to extend laser depth map range
US20140267412A1 (en)* | 2013-03-15 | 2014-09-18 | Disney Enterprises, Inc. | Optical illumination mapping
US20160262913A1 (en)* | 2013-08-13 | 2016-09-15 | Brainlab AG | Medical Registration Apparatus and Method for Registering an Axis
US20170221224A1 (en)* | 2014-03-03 | 2017-08-03 | Mitsubishi Electric Corporation | Position measurement apparatus for measuring position of object having reflective surface in the three-dimensional space
US20170352184A1 (en)* | 2016-06-06 | 2017-12-07 | Adam G. Poulos | Optically augmenting electromagnetic tracking in mixed reality


Legal Events

Code | Title | Description
FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
AS | Assignment | Owner name: DISNEY ENTERPRISES, INC., CALIFORNIA; ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHAPMAN, STEVEN M.;REEL/FRAME:046931/0660; Effective date: 20180917
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
STCF | Information on status: patent grant | PATENTED CASE
MAFP | Maintenance fee payment | PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY; Year of fee payment: 4

