CROSS-REFERENCE TO RELATED APPLICATION

The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 15/814,368, filed Nov. 15, 2017, now U.S. Pat. No. 10,425,654, which is incorporated herein by reference in its entirety.
BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention generally relates to game session content. More specifically, the present invention relates to synchronization of session content to external content.
2. Description of the Related Art

Presently available digital content may allow for sharing of images, video, and other content generated during a game session with one or more players. For example, a player playing a digital game during a game session may have performed a notable feat, achieved a notable status, or otherwise wish to share content relating to an in-game event. Notwithstanding, images and even video captured during the game session may fail to engage other individuals (e.g., online audiences). One reason for such failure to engage is the impersonal nature of such content. Because such content is generated within the in-game environment of a game title, many images or videos that are captured therefrom may appear monotonous and lacking in emotion. While references herein may be made specifically to a game or game session, such reference should be understood to encompass any variety of different types of digital content made available via sessions as known in the art.
Audience members may be engaged when they see the faces of people they know, when they hear personalized reactions and accounts, and when their experience of the game session includes human interactions, reactions, and emotions. One way to incorporate such human interactions into game session content is to generate a reaction video that is captured as one or more individuals (e.g., player(s), non-playing friends and family in the same room, remote players and non-players) watch the game transpire.
Some game consoles may be associated with a peripheral camera or other device that captures images or video of the room in which a game is being played. Such peripheral cameras are usually fixed, however, as well as being set at a distance from the player(s) and other individuals in the room. Such long shots may capture images and video of more area within the room, but lack the immediacy and emotional engagement of close-up shots. While personal devices (e.g., smartphone, webcam, Wi-Fi connected handheld camera) may be used to capture such close-up shots—whether by photo or video—there is presently no way for such content that is external to the game to be synchronized automatically to in-game content so that there may be context to the individuals' reactions.
There is, therefore, a need in the art for improved systems and methods for synchronization of session content to external content.
SUMMARY OF THE CLAIMED INVENTION

Embodiments of the present invention allow for synchronization of session content to external content. Session video of a plurality of game sessions may be captured at a content synchronization server. Each captured session video of each game session may be associated with an identifier of the respective game session. Additional content may be sent over a communication network to the content synchronization server. Such content may be external to the game session and identified as being associated with a game session identifier. One of the captured session videos may be identified as being associated with a game session identifier that matches the game session identifier associated with the received external content. The received external content may be synchronized to the identified session video based on the matching game session identifiers. A composite video may be generated that includes the received external content synchronized to the identified session video.
Various embodiments of the present invention may include systems for synchronization of session content to external content. Such systems may include a content delivery server that hosts a plurality of different game sessions, captures session video for each of the different game sessions where each captured session video of each game session is associated with an identifier of the respective game session, receives content external to the game session, identifies that the external content is associated with a game session identifier, identifies one of the captured session videos as being associated with a game session identifier that matches the game session identifier associated with the received external content, synchronizes the received external content to the identified session video based on the matching game session identifiers, and generates a composite video comprising the received external content synchronized to the identified session video. Systems may further include one or more game consoles that generates session content captured in the session video during the respective game session associated with the matching game session identifier.
Further embodiments of the present invention may include methods for synchronization of session content to external content. Such methods may include capturing session video for each of a plurality of different game sessions at a content synchronization server where each captured session video of each game session is associated with an identifier of the respective game session, receiving additional content sent over a communication network to the content synchronization server where the additional content is external to the game session, identifying that the external content is associated with a game session identifier, identifying one of the captured session videos as being associated with a game session identifier that matches the game session identifier associated with the received external content, synchronizing the received external content to the identified session video based on the matching game session identifiers, and generating a composite video comprising the received external content synchronized to the identified session video.
Yet further embodiments of the present invention may include non-transitory computer-readable storage media having embodied thereon a program executable by a processor to perform a method for synchronization of session content to external content as described above.
BRIEF DESCRIPTIONS OF THE DRAWINGS

FIG. 1 illustrates a network environment in which a system for synchronization of session content to external content may be implemented.
FIG. 2A illustrates an exemplary layout of a composite video in which session content has been synchronized to external content.
FIG. 2B illustrates an alternative exemplary layout of a composite video in which session content has been synchronized to external content.
FIG. 2C illustrates another alternative exemplary layout of a composite video in which session content has been synchronized to external content.
FIG. 3 is a flowchart illustrating an exemplary method for synchronization of session content to external content.
FIG. 4 is an exemplary electronic entertainment system that may be used in synchronization of session content to external content.
DETAILED DESCRIPTION

Embodiments of the present invention allow for synchronization of session content to external content. Session video of a plurality of game sessions may be captured at a content synchronization server. Each captured session video of each game session may be associated with an identifier of the respective game session. Additional content may be sent over a communication network to the content synchronization server. Such additional content may be external to the game session and identified as being associated with a game session identifier. One of the captured session videos may be identified as being associated with a game session identifier that matches the game session identifier associated with the received external content. The received external content may be synchronized to the identified session video based on the matching game session identifiers. A composite video may be generated that includes the received external content synchronized to the identified session video.
FIG. 1 illustrates a network environment 100 in which a system for synchronization of session content to external content may be implemented. The network environment 100 may include one or more content source servers 110 that provide digital content (e.g., games) for distribution, one or more content provider server application program interfaces (APIs) 120, a content delivery network server 130, a content synchronization server 140, and one or more client devices 150.
Content source servers 110 may maintain and provide a variety of content available for distribution. The content source servers 110 may be associated with any content provider that makes its content available for access over a communication network. Such content may include not only digital games, but also pre-recorded content (e.g., DVR content, music) and live broadcasts (e.g., live sporting events, live e-sporting events, broadcast premieres). Any images, video clips, or other portions of such content may also be maintained at content source servers 110.
The content source servers 110 may maintain content associated with any content provider that makes its content available to be accessed, including individuals who upload content from their personal client devices 150. Such content may be generated at such personal client devices 150 using native cameras, microphones, and other components for capturing images, audio, and video.
The content from content source server 110 may be provided through a content provider server API 120, which allows various types of content source servers 110 to communicate with other servers in the network environment 100 (e.g., content synchronization server 140). The content provider server API 120 may be specific to the particular language, operating system, protocols, etc. of the content source server 110 providing the content. In a network environment 100 that includes multiple different types of content source servers 110, there may likewise be a corresponding number of content provider server APIs 120 that allow for various formatting, conversion, and other cross-device and cross-platform communication processes for providing content (e.g., composites of different types) to different client devices 150, which may use different content media player applications to play such content. As such, content titles of different formats may be made available so as to be compatible with client device 150.
The content provider server API 120 may further facilitate access of each of the client devices 150 to the content hosted by the content source servers 110, either directly or via content delivery network server 130. Additional information, such as metadata, about the accessed content can also be provided by the content provider server API 120 to the client device 150. As described below, the additional information (i.e., metadata) can be usable to provide details about the content being provided to the client device 150. Finally, additional services associated with the accessed content, such as chat services, ratings, and profiles, can also be provided from the content source servers 110 to the client device 150 via the content provider server API 120.
The content delivery network server 130 may include a server that provides resources and files related to the content from content source servers 110, including promotional images and service configurations with client devices 150. The content delivery network server 130 can also be called upon by the client devices 150 that request to access specific content. Content delivery network server 130 may include game servers, streaming media servers, servers hosting downloadable content, and other content delivery servers known in the art.
The content provider server API 120 may communicate with a content synchronization server 140 in order to synchronize and generate composite content (e.g., from two different content source servers 110) for the client device 150. As noted herein, one type of content source is an individual who uploads content to content source server 110. Such content may be external to other content (e.g., content captured during a digital game session), but may nevertheless be in reaction to or otherwise relate to the game. Because a game session of a digital game may take place over a period of time, different in-game events may take place throughout the time period. Each reaction may therefore correspond to a particular point in time at which a respective in-game event occurs. Because the reaction content (e.g., mobile device-captured video of human reactions) is external to the digital game, however, such external content may be captured and saved separately (e.g., as a separate file) from content captured during a play session of a digital game on a game console and/or hosted by a game server.
Content synchronization server 140 may identify that such external (e.g., reaction) content is associated with a game session identifier. Such game session identifier may be generated by the client devices 150 (e.g., game console) engaging in the game session in which the game is being played. The game session identifier may be communicated to the client device 150 (e.g., mobile device) that generated the external content. Such communication may occur via a mobile application downloaded to the mobile client device 150 (e.g., from a game server or other content delivery network server 130). The user of the mobile client device 150 may use the mobile application to select another client device 150 (e.g., a particular game console device) and request (e.g., via Bluetooth or Wi-Fi connection) the game session identifier. The mobile application may then offer a variety of different session content with which to pair external content. Such session content may include in-game content from a game session, pre-recorded or live content made available during a game session, etc.
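By way of illustration only, the pairing exchange described above may be sketched as follows. This is a minimal, non-limiting sketch: the GameConsole and MobileApp classes, their method names, and the clip-record fields are hypothetical stand-ins for the console and mobile application behaviors described in this disclosure.

```python
import secrets


class GameConsole:
    """Hypothetical console that issues a unique identifier per game session."""

    def __init__(self):
        self.session_id = None

    def start_session(self):
        # Generate a unique game session identifier when the session begins.
        self.session_id = secrets.token_hex(8)
        return self.session_id

    def share_session_id(self):
        # Shared with a paired mobile device (e.g., over Bluetooth or Wi-Fi).
        return self.session_id


class MobileApp:
    """Hypothetical companion app that tags external clips with the shared ID."""

    def __init__(self):
        self.session_id = None

    def pair_with(self, console):
        # Request the game session identifier from the selected console.
        self.session_id = console.share_session_id()

    def tag_clip(self, clip_name, timestamp):
        # External content is stamped with the shared session identifier and
        # a timestamp relative to the game session timeline.
        return {"file": clip_name,
                "session_id": self.session_id,
                "timestamp": timestamp}
```

In this sketch, any reaction clip captured after pairing carries the same identifier as the console's session video, which is what later allows the two files to be matched.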
The shared game session identifier may therefore create a pairing that associates the external content with the selected session content (e.g., video of the game session). In some embodiments, multiple external content files may be paired to the same session. Examples of external content may include reaction videos of the game player, reaction videos of audience members (local or remote), lip-synching videos in relation to music or video, commentary videos, different angles of the same live event, etc. Multiple different external content files may be associated with the same session content. Such external content may not be required to be captured or generated at the same time, however. One external content file may be captured in real-time during the game session, while another external content file may be captured in relation to a replay of a recording of the game session. Such external content files may nevertheless be synchronized to the in-game content based on the shared session identifier and timestamps.
The in-game content (e.g., clips captured during the game session) may therefore be matched to external content based on a common game session identifier. Further, the paired content—in-game content and external content—may be associated with timestamps regarding points in time within the game session. Using such timestamps that appear in the two or more different content files (e.g., in-game session video and external video), content synchronization server 140 may be able to synchronize the content files. The synchronized files may further be composited in a variety of different display configurations. The resulting composite video may thereafter be stored, accessed, and played, thereby presenting multiple synchronized content files within a single composite display. In some embodiments, composite videos may be generated based on default configurations (e.g., based on number of content files being composited, default settings), as well as on-the-fly based on input from producers, broadcasters, or other users. The composite videos may be maintained and made available for access, play, sharing, social media, streaming, broadcast, etc. by various client devices 150.
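The identifier matching and timestamp-based synchronization may be illustrated with the following sketch. The dictionary field names (session_id, timestamp, start_timestamp) are assumptions introduced for illustration and are not part of the claimed system; an actual implementation would operate on the metadata of stored video files.

```python
def match_session_video(session_videos, external_clip):
    """Return the captured session video whose game session identifier
    matches the identifier stamped on the external clip, or None."""
    for video in session_videos:
        if video["session_id"] == external_clip["session_id"]:
            return video
    return None


def sync_offset(session_video, external_clip):
    """Compute where (in seconds) the external clip falls on the session
    video's timeline. Both files carry timestamps relative to the same
    game session, so the clip's timestamp maps directly onto the
    session video once its start time is subtracted."""
    return external_clip["timestamp"] - session_video["start_timestamp"]
```

A clip stamped at session time 215.0 against a session video that began at 200.0 would thus be overlaid 15 seconds into the session video, regardless of whether the clip was captured live or against a replay.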
The client device 150 may include a plurality of different types of computing devices. For example, the client device 150 may include any number of different gaming consoles, mobile devices, laptops, and desktops. A particular player may be associated with a variety of different client devices 150. Each client device 150 may be associated with the particular player by virtue of being logged into the same player account. Such client devices 150 may also be configured to access data from other storage media, such as, but not limited to, memory cards or disk drives as may be appropriate in the case of downloaded services. Such devices 150 may include standard hardware computing components such as, but not limited to, network and media interfaces, non-transitory computer-readable storage (memory), and processors for executing instructions that may be stored in memory. These client devices 150 may also run using a variety of different operating systems (e.g., iOS, Android), applications, or computing languages (e.g., C++, JavaScript). An exemplary client device 150 is described in detail herein with respect to FIG. 4.
FIG. 2A illustrates an exemplary layout of a composite video in which session content has been synchronized to external content. As illustrated, the composite video 200A combines in-game content 210A with external content 220A in a picture-in-picture configuration. The external content 220A may be overlaid on top of the game environment displayed in the in-game content 210A. The placement of such external content 220A may be static or move (e.g., so as not to block the view of events in the game environment).
FIG. 2B illustrates an alternative exemplary layout of a composite video in which session content has been synchronized to external content. As illustrated, the composite video 200B combines in-game content 210B with a plurality of different external content files 220B-D. The external content 220B-D may be captured by different end-user client devices 150 (e.g., mobile phones, tablets). Each such client device 150 may be local (e.g., in the same room) or remote from the client device 150 (e.g., game console) upon which the game is played. While the composite video 200B of FIG. 2B includes three sections for displaying different external content 220B-D, there may be even more external content (e.g., from other client devices 150), which may be switched in and out of the defined sections within the composite video.
FIG. 2C illustrates another alternative exemplary layout of a composite video in which session content has been synchronized to external content. As illustrated, the composite video 200C first displays in-game content 210C intercut with external content 220E and external content 220F before being switched back to the in-game content 210C.
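By way of illustration only, a static picture-in-picture placement such as that of FIG. 2A may be computed as follows. This is a simplified sketch under stated assumptions: frame dimensions are in pixels, the overlay is anchored to the bottom-right corner, and the margin value is an arbitrary default, none of which is dictated by this disclosure.

```python
def pip_position(base_size, overlay_size, margin=16):
    """Return the (x, y) top-left coordinate at which to place the external
    content overlay so that it sits in the bottom-right corner of the
    in-game frame, inset by the given margin. Sizes are (width, height)."""
    base_w, base_h = base_size
    overlay_w, overlay_h = overlay_size
    return (base_w - overlay_w - margin, base_h - overlay_h - margin)
```

For a 1920x1080 session video and a 320x180 reaction clip, this sketch would anchor the overlay at (1584, 884); a moving placement (as contemplated above, to avoid blocking in-game events) would recompute this position over time.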
FIG. 3 illustrates a method 300 for synchronization of session content to external content. The method 300 of FIG. 3 may be embodied as executable instructions in a non-transitory computer-readable storage medium including, but not limited to, a CD, DVD, or non-volatile memory such as a hard drive. The instructions of the storage medium may be executed by a processor (or processors) to cause various hardware components of a computing device hosting or otherwise accessing the storage medium to effectuate the method. The steps identified in FIG. 3 (and the order thereof) are exemplary and may include various alternatives, equivalents, or derivations thereof, including but not limited to the order of execution of the same.
In method 300 of FIG. 3, session video of a plurality of game sessions may be captured at a game server. Each captured session video of each game session may be associated with an identifier of the respective game session. Additional content may be sent over a communication network to the game server. Such additional content may be external to the game session and identified as being associated with a game session identifier. One of the captured session videos may be identified as being associated with a game session identifier that matches the game session identifier associated with the received external content. The received external content may be synchronized to the identified session video based on the matching game session identifiers. A composite video may be generated that includes the received external content synchronized to the identified session video.
In step 310, in-game video may be captured during a game session whereby a client device 150 (e.g., game console) is playing a game hosted by content delivery network server 130. Such in-game video may be provided to content source server 110 for storage in association with a session identifier that is unique to the particular game session from which the in-game video was captured.
In step 320, external content may be received over a communication network (e.g., Internet). A user wishing to capture external content in association with the in-game video may download a mobile application to their mobile client device 150. Such mobile application may allow for identification and selection of a particular other client device 150 with which to pair (e.g., the game console hosting the game session). The game console may then generate a unique game session identifier. External content later captured by the mobile client device 150 may be associated with the game session identifier, as well as be stamped with timestamp(s) related to the game session.
In steps 330 and 340, content synchronization server 140 may identify the game session identifier associated with the external content and find a matching game session identifier associated with an in-game video.
In step 350, the content files—both in-session and external—associated with the same game session identifier may therefore be synchronized with each other based on their respective timestamps. In step 360, a composite video may be generated based on the synchronized in-game video and external content. In step 370, the composite video may be made available to one or more client devices 150 for download or sharing (e.g., via social networks or other online forums, as well as with connections).
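Steps 330 through 360 may be sketched end-to-end as follows. This is an illustrative approximation only: an actual implementation would perform video muxing and rendering rather than building dictionaries, and all field names here are assumptions introduced for illustration.

```python
def generate_composites(session_videos, external_contents):
    """Sketch of steps 330-360: for each external clip, find the session
    video with a matching game session identifier, compute the clip's
    offset on the shared timeline, and describe the resulting composite."""
    composites = []
    for clip in external_contents:
        # Steps 330-340: match external content to a captured session video
        # by game session identifier; unmatched clips are skipped.
        video = next((v for v in session_videos
                      if v["session_id"] == clip["session_id"]), None)
        if video is None:
            continue
        # Step 350: synchronize via timestamps on the shared session timeline.
        offset = clip["timestamp"] - video["start_timestamp"]
        # Step 360: describe the composite video to be generated.
        composites.append({"session_id": video["session_id"],
                           "base": video["file"],
                           "overlay": clip["file"],
                           "offset_seconds": offset})
    return composites
```

Because matching is keyed only on the shared identifier, multiple external clips (captured live or against a replay) fold naturally into composites over the same session video, consistent with the multi-clip layouts of FIGS. 2B and 2C.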
FIG. 4 is an exemplary electronic entertainment system that may be used in synchronization of session content to external content. The entertainment system 400 of FIG. 4 includes a main memory 405, a central processing unit (CPU) 410, vector unit 415, a graphics processing unit 420, an input/output (I/O) processor 425, an I/O processor memory 430, a peripheral interface 435, a memory card 440, a Universal Serial Bus (USB) interface 445, and a communication network interface 450. The entertainment system 400 further includes an operating system read-only memory (OS ROM) 455, a sound processing unit 460, an optical disc control unit 470, and a hard disc drive 465, which are connected via a bus 475 to the I/O processor 425.
Entertainment system 400 may be an electronic game console. Alternatively, the entertainment system 400 may be implemented as a general-purpose computer, a set-top box, a hand-held game device, a tablet computing device, or a mobile computing device or phone. Entertainment systems may contain more or fewer operating components depending on a particular form factor, purpose, or design.
The CPU 410, the vector unit 415, the graphics processing unit 420, and the I/O processor 425 of FIG. 4 communicate via a system bus 485. Further, the CPU 410 of FIG. 4 communicates with the main memory 405 via a dedicated bus 480, while the vector unit 415 and the graphics processing unit 420 may communicate through a dedicated bus 490. The CPU 410 of FIG. 4 executes programs stored in the OS ROM 455 and the main memory 405. The main memory 405 of FIG. 4 may contain pre-stored programs and programs transferred through the I/O processor 425 from a CD-ROM, DVD-ROM, or other optical disc (not shown) using the optical disc control unit 470. I/O processor 425 of FIG. 4 may also allow for the introduction of content transferred over a wireless or other communications network (e.g., 4G, LTE, 1G, and so forth). The I/O processor 425 of FIG. 4 primarily controls data exchanges between the various devices of the entertainment system 400, including the CPU 410, the vector unit 415, the graphics processing unit 420, and the peripheral interface 435.
The graphics processing unit 420 of FIG. 4 executes graphics instructions received from the CPU 410 and the vector unit 415 to produce images for display on a display device (not shown). For example, the vector unit 415 of FIG. 4 may transform objects from three-dimensional coordinates to two-dimensional coordinates, and send the two-dimensional coordinates to the graphics processing unit 420. Furthermore, the sound processing unit 460 executes instructions to produce sound signals that are outputted to an audio device such as speakers (not shown). Other devices may be connected to the entertainment system 400 via the USB interface 445 and the communication network interface 450, such as wireless transceivers, which may also be embedded in the system 400 or as a part of some other component such as a processor.
A user of the entertainment system 400 of FIG. 4 provides instructions via the peripheral interface 435 to the CPU 410, which allows for use of a variety of different available peripheral devices (e.g., controllers) known in the art. For example, the user may instruct the CPU 410 to store certain game information on the memory card 440 or other non-transitory computer-readable storage media, or instruct a character in a game to perform some specified action.
The present invention may be implemented in an application that may be operable by a variety of end user devices. For example, an end user device may be a personal computer, a home entertainment system (e.g., Sony PlayStation2® or Sony PlayStation3® or Sony PlayStation4®), a portable gaming device (e.g., Sony PSP® or Sony Vita®), or a home entertainment system of a different albeit inferior manufacturer. The present methodologies described herein are fully intended to be operable on a variety of devices. The present invention may also be implemented with cross-title neutrality wherein an embodiment of the present system may be utilized across a variety of titles from various publishers.
The present invention may be implemented in an application that may be operable using a variety of devices. Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, and any other memory chip or cartridge.
Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU. Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.
The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.