CROSS-REFERENCE TO RELATED APPLICATIONS
The present application is a U.S. National Stage Application corresponding to PCT Patent Application No. PCT/US2011/027981, filed Mar. 10, 2011, which claims priority to U.S. Provisional Application No. 61/416,708, filed Nov. 23, 2010, entitled “3D VIDEO CONVERTER.” The present application is also a continuation-in-part of: PCT Patent Application No. PCT/US2011/025262, filed Feb. 17, 2011, entitled “BLANKING INTER-FRAME TRANSITIONS OF A 3D SIGNAL;” PCT Patent Application No. PCT/US2011/027175, filed Mar. 4, 2011, entitled “FORMATTING 3D CONTENT FOR LOW FRAME-RATE DISPLAYS;” PCT Patent Application No. PCT/US2011/027933, filed Mar. 10, 2011, entitled “DISPLAYING 3D CONTENT ON LOW FRAME-RATE DISPLAYS;” PCT Patent Application No. PCT/US2011/032549, filed Apr. 14, 2011, entitled “ADAPTIVE 3-D SHUTTERING DEVICES;” and PCT Patent Application No. PCT/US2011/031115, filed Apr. 4, 2011, entitled “DEVICE FOR DISPLAYING 3D CONTENT ON LOW FRAME-RATE DISPLAYS.” The entire content of each of the foregoing applications is incorporated by reference herein.
BACKGROUND
1. The Field of the Invention
This invention relates to systems, methods, and computer program products related to conversion and presentation of three-dimensional video content.
2. Background and Relevant Art
Three-dimensional (3D) display technology involves presenting two-dimensional images in such a manner that the images appear to the human brain to be 3D. The process typically involves presenting “left” image data to the left eye, and “right” image data to the right eye. When received, the brain perceives this data as a 3D image. 3D display technology generally incorporates the use of a filtering device or blanking device, such as glasses, which filter displayed image data to the correct eye. Filtering devices can be passive, meaning that image data is filtered passively (e.g., by color code or by polarization), or active, meaning that the image data is filtered actively (e.g., by shuttering).
Traditional display devices, such as computer monitors, television sets, and portable display devices, are typically either incapable of producing suitable image data for 3D viewing, or produce an inferior 3D viewing experience. For instance, using traditional display devices to view 3D content generally results in blurry images and/or images that have “ghosting” effects, both of which may cause dizziness, headache, discomfort, and even nausea in the viewer. This is true even for display devices that incorporate more recent display technologies, such as Liquid Crystal Display (LCD), Plasma, Light Emitting Diode (LED), Organic Light Emitting Diode (OLED), etc.
Recently, 3D display devices designed specifically for displaying 3D content have become increasingly popular. These 3D display devices are generally used in connection with active filtering devices (e.g., shuttering glasses) to produce 3D image quality not previously available from traditional display devices. These 3D display devices, however, are relatively expensive when compared with traditional display devices.
As a result, consumers who desire to view 3D content are faced with the purchase of expensive 3D display devices, even when they may already have traditional display devices available. Accordingly, there are a number of considerations to be made regarding the display of 3D content.
BRIEF SUMMARY
Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to allow for viewing of three-dimensional (3D) content on a broad range of display devices. In particular, implementations of the present invention provide for shuttering of transitions between left eye images and right eye images of 3D content. One or more implementations of the present invention allow a viewer to experience a level of quality that can match or even exceed the quality experienced with specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling the viewing of 3D content on traditional display devices in a high quality manner.
For example, an implementation of a method of shuttering 3D content to improve the perception of the 3D content can involve receiving a shuttering signal. The method can also involve shuttering first and second shuttering components in response to the shuttering signal. In particular, the method can involve shuttering the first shuttering component during a first time period corresponding to display of second eye 3D content. The method can also involve shuttering both the first and second shuttering components concurrently during a second time period corresponding to the display of a transition from the second eye 3D content to first eye 3D content. In addition, the method can involve shuttering the second shuttering component during a third time period corresponding to display of the first eye 3D content.
Another implementation of a method of shuttering displayed 3D content can involve a shuttering device receiving a shuttering signal that includes a plurality of shuttering instructions synchronized to 3D content to be displayed at a display device. The method can also involve identifying a first shuttering instruction that instructs the shuttering device to shutter a first shuttering component for a first time period. In response to the first shuttering instruction, the method can involve shuttering the first shuttering component. The method can further involve identifying a second shuttering instruction that instructs the shuttering device to shutter both the first shuttering component and a second shuttering component for a second time period corresponding to a frame transition period. In response, the method can involve concurrently shuttering the first and second shuttering components for the second time period. Additionally, the method can involve identifying a third shuttering instruction that instructs the shuttering device to shutter the second shuttering component for a third time period. Furthermore, the method can involve shuttering the second shuttering component for the third time period.
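The three-period sequence described in the methods above can be sketched as follows. This is a purely illustrative sketch, not language from the claims; the component names "first" and "second" are placeholders.

```python
# Illustrative sketch of the three consecutive time periods: the first
# component shutters during second-eye content, both components shutter
# concurrently during the frame transition, and the second component
# shutters during first-eye content.

def three_period_sequence():
    """Return which shuttering components are closed in each of the
    three consecutive time periods."""
    return [
        {"first"},            # period 1: display of second eye content
        {"first", "second"},  # period 2: frame transition, blank both
        {"second"},           # period 3: display of first eye content
    ]
```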
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Additional features and advantages of exemplary implementations of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such exemplary implementations. The features and advantages of such implementations may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features will become more fully apparent from the following description and appended claims, or may be learned by the practice of such exemplary implementations as set forth hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. It should be noted that the figures are not drawn to scale, and that elements of similar structure or function are generally represented by like reference numerals for illustrative purposes throughout the figures. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
FIG. 1 illustrates a schematic state diagram of a method of shuttering three-dimensional (3D) content in accordance with one or more implementations of the invention;
FIG. 2 illustrates a timing diagram demonstrating a received shuttering signal and corresponding displayed 3D content in accordance with one or more implementations of the invention;
FIG. 3 illustrates a schematic diagram of a shuttering device in accordance with one or more implementations of the invention;
FIG. 4 illustrates a schematic diagram of a system for viewing 3D content in accordance with one or more implementations of the invention;
FIG. 5 illustrates a flowchart of a series of acts in a method in accordance with an implementation of the present invention of shuttering displayed 3D content in response to a shuttering signal; and
FIG. 6 illustrates a flowchart of a series of acts in a method in accordance with an implementation of the present invention of shuttering displayed 3D content in response to a synchronous shuttering signal.
DETAILED DESCRIPTION
Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to allow for viewing of three-dimensional (3D) content on a broad range of display devices. In particular, implementations of the present invention provide for shuttering of transitions between left eye images and right eye images of 3D content. One or more implementations of the present invention allow a viewer to experience a level of quality that can match or even exceed the quality experienced with specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling the viewing of 3D content on traditional display devices in a high quality manner.
Specialized 3D display devices attempt to provide an enhanced 3D viewing experience by modifying physical characteristics of the display device, such as by increasing the frame-rate and decreasing a frame overlap or transition period. The frame-rate refers to the number of unique video frames the display device can render in a given amount of time (e.g., one second). A frame transition period refers to the period of time that elapses when transitioning between two frames. During the frame transition period, the display device displays at least a portion of two or more video frames concurrently. Longer frame transition periods are perceptible to the human eye, and can lead to a degraded viewing experience. For example, longer frame transition periods can cause motion blurring or ghosting. These effects are a particular problem when viewing 3D video content.
One or more implementations of the present invention provide for filtering a user's view of displayed 3D content. This involves a blanking or shuttering device (e.g., shuttering glasses) receiving a shuttering signal that includes one or more shuttering instructions. The shuttering instruction(s) can instruct the shuttering device to synchronously shutter portions of the user's view while the user is viewing the displayed 3D content. The shuttering instructions can include at least one inter-frame shuttering instruction which instructs the shuttering device to shutter all or part of one or more frame transition periods from the user's view. Thus, one or more implementations allow for viewing of 3D content on a broad range of display devices, including devices that may have lower frame-rates and longer frame transition periods.
FIG. 1, for example, illustrates a schematic state diagram for shuttering 3D content in accordance with one or more implementations of the present invention. In particular, FIG. 1 illustrates that one or more shuttering devices 116 can shutter (or obfuscate) the display of 3D content at a display device 108. In one or more implementations, the shuttering device 116 can comprise stereoscopic shuttering glasses that include two or more shuttering components 118, 120, which are capable of selectively obfuscating all or part of a wearer's view of the display device 108. In one or more implementations, the shuttering components correspond to lenses or portions of lenses of the shuttering glasses 116. As discussed more fully hereinafter, a shuttering signal can control the shuttering device 116.
FIG. 1 illustrates that shuttering 3D content can include at least three different shuttering states 102, 104, and 106. In state 102, for example, the shuttering device 116 can process a shuttering instruction 124 (blank right) received from a video processing device 122. As instructed by the shuttering instruction 124, the shuttering device 116 can use a shuttering component 120 to shutter all or part of the user's right eye view of the display device 108. Additionally, the shuttering device 116 can place shuttering component 118 in an un-shuttered or open state, allowing the user's left eye to view the video frame 110 displayed by the display device 108. State 102 may correspond to at least a portion of a period of time in which the display device 108 displays a video frame 110, which comprises 3D video content intended for viewing by the user's left eye.
Similarly, in state 106, the shuttering device 116 can process a shuttering instruction 128 (blank left) received from the video processing device 122. In response, the shuttering device 116 can use a shuttering component 118 to shutter all or part of the user's left eye view of the display device 108. Additionally, the shuttering device 116 can place shuttering component 120 in an un-shuttered or open state, allowing the user's right eye to view the video frame 114 displayed by the display device 108. State 106 may correspond to at least a portion of a period of time in which the display device 108 displays a video frame 114 comprising 3D video content intended for viewing by the user's right eye.
One will appreciate that states 102 and 106 are not limited to shuttering left and right eyes in the manner illustrated. For instance, in state 102, the shuttering device 116 can use the shuttering component 118 to shutter the viewer's left eye during display of right eye content. Furthermore, in state 106, the shuttering device 116 can use the shuttering component 120 to shutter the viewer's right eye during display of left eye content. It will be appreciated, then, that states 102, 104, 106 should not be interpreted as necessarily occurring in the illustrated order.
In state 104, the shuttering device 116 can process and respond to an inter-frame shuttering instruction 126 (blank both) by using both shuttering components 118, 120 to concurrently shutter the user's left and right eye views of the display device 108. State 104 may correspond to at least a portion of a frame transition period in which the display device 108 transitions between displaying two video frames. For instance, the display device 108 might be transitioning from display of the “left eye” video frame 110 to the “right eye” video frame 114, or vice versa. Thus, in one or more implementations, the display device 108 displays a frame transition 112, in which the display device 108 concurrently displays at least a portion of two or more different video frames (e.g., video frame 110 and video frame 114). By shuttering both of the user's eyes during display of the frame transition 112, the shuttering device 116 can prevent the user from viewing all or part of the frame transition 112 during all or part of the frame transition period.
One will appreciate that the appropriate shuttering of a single eye, as in states 102 and 106, synchronous with the display of appropriate 3D video content, can provide the illusion that two-dimensional images are actually 3D. Furthermore, inter-frame shuttering, or the synchronous shuttering of both eyes during frame transition periods, can enhance the clarity of the perceived 3D image. For example, inter-frame shuttering can reduce or eliminate undesirable effects such as motion blurring and ghosting. Thus, the disclosed inter-frame shuttering techniques can allow for viewing 3D content on display devices that may have lower frame-rates and/or longer frame overlap or transition periods.
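The state transitions of FIG. 1 can be sketched as a simple instruction-driven state machine. The sketch below is illustrative only; the instruction codes mirror the figure labels (blank right, blank both, blank left), and the class and attribute names are assumptions, not part of the specification.

```python
class ShutteringDevice:
    """Minimal sketch of a shuttering device responding to the
    instructions 124 ("BR", blank right), 126 ("BB", blank both),
    and 128 ("BL", blank left)."""

    def __init__(self):
        # Both shuttering components start open (un-shuttered).
        self.left_shuttered = False
        self.right_shuttered = False

    def apply(self, instruction):
        """Enter the shuttering state requested by one instruction."""
        if instruction == "BR":    # state 102: left eye views the frame
            self.left_shuttered, self.right_shuttered = False, True
        elif instruction == "BL":  # state 106: right eye views the frame
            self.left_shuttered, self.right_shuttered = True, False
        elif instruction == "BB":  # state 104: inter-frame, blank both
            self.left_shuttered = self.right_shuttered = True
        else:
            raise ValueError(f"unknown shuttering instruction: {instruction}")
```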
FIG. 2 shows a timing diagram 200 demonstrating a received shuttering signal 204 and corresponding displayed 3D content 202 in accordance with at least one implementation. In particular, FIG. 2 illustrates a snapshot of time that includes a plurality of time periods (e.g., time periods 206, 208, 210, 212, 214) during which the shuttering device 116 receives the shuttering signal 204 and shutters displayed 3D content. The horizontal ellipses 216 and 218 indicate that any number of additional time periods can extend to any point before or after the illustrated snapshot. As illustrated, the displayed 3D content 202 can comprise a plurality of video frames 110, 114, 222 as well as a plurality of frame transitions 112, 220. Correspondingly, the received shuttering signal 204 can comprise a plurality of shuttering instructions 124, 126, 128 that instruct the shuttering device 116 to shutter the user's view of the displayed 3D content 202.
FIG. 2 illustrates that the shuttering device 116 can receive a shuttering instruction 124 (“BR,” or blank right) in connection with a “left eye” video frame 110 of the displayed 3D content 202. This instruction can instruct the shuttering device 116 to shutter the user's right eye during all or part of the time period 206 in which the display device 108 displays the “left eye” video frame 110. In response, the shuttering device 116 can shutter the shuttering component 120 during all or part of the time period 206. The shuttering device 116 can open, or un-obscure, the other shuttering component 118 during all or part of the time period 206, allowing the viewer to view the left eye video frame 110 with the left eye.
The shuttering device 116 can also receive an inter-frame shuttering instruction 126 (“BB,” or blank both) in connection with a frame transition 112 of the displayed 3D content 202. The frame transition 112 may occur as a result of the display device 108 transitioning between display of the “left eye” video frame 110 and a “right eye” video frame 114. The inter-frame shuttering instruction 126 can instruct the shuttering device 116 to concurrently shutter both of the user's eyes during all or part of the time period 208 in which the display device 108 displays the frame transition 112. In response, the shuttering device 116 can shutter both of the shuttering components 118, 120 during all or part of the time period 208, preventing the viewer from seeing all or part of the frame transition 112 with either eye.
In addition, the shuttering device 116 can receive a shuttering instruction 128 (“BL,” or blank left) in connection with the “right eye” video frame 114 of the displayed 3D content 202. This instruction 128 can instruct the shuttering device 116 to shutter the user's left eye during all or part of the time period 210 in which the display device 108 displays the “right eye” video frame 114. This may occur, for example, after the display device 108 fully transitions to display of the “right eye” video frame 114. In response, the shuttering device 116 can shutter the shuttering component 118 during all or part of the time period 210. The shuttering device 116 can open, or un-obscure, the other shuttering component 120 during all or part of the time period 210, allowing the viewer to view the right eye video frame 114 with the right eye.
Additionally, in time periods 212 and 214, the display device 108 can subsequently pass through another frame transition 220 to display another left video frame 222. In connection therewith, the shuttering signal 204 can include appropriate shuttering instructions 126, 124, in response to which the shuttering device 116 can shutter the appropriate shuttering components 118, 120.
FIG. 2 shows that the displayed 3D content 202 comprises a series of alternating left and right video frames (in any order), and that the shuttering signal 204 comprises an appropriate corresponding sequence of shuttering instructions 124, 126, 128. One will appreciate in view of the disclosure herein, however, that one or more implementations extend to shuttering any sequence of video frames. In one or more implementations, for example, the displayed 3D content 202 comprises differing sequences of left and right video frames (e.g., left, left, right, right). In one or more other implementations, the displayed 3D content 202 comprises only video frames intended for viewing with both eyes. In yet other implementations, the displayed 3D content 202 comprises a combination of different video frame types. One combination, for instance, can include both video frames intended for viewing with both eyes, as well as video frames intended for viewing with a single eye.
Thus, one will appreciate in light of the disclosure herein that the shuttering signal 204 can include any appropriate sequence of shuttering instructions that correspond to the displayed 3D content 202. For instance, if the displayed 3D content 202 includes a different sequence of left and right video frames, the shuttering signal 204 can include an appropriately different sequence of shuttering instructions. Furthermore, the shuttering signal 204 can depart from the illustrated shuttering instructions. For example, the shuttering signal 204 can refrain from shuttering during one or more frame transition time periods. Furthermore, the shuttering signal 204 can include any number of additional shuttering instructions, such as a shuttering instruction that does no shuttering (e.g., when the display device 108 displays a video frame intended for viewing with both eyes).
In one or more implementations, the shuttering device 116 receives the shuttering signal 204 prior to the display of the 3D content 202 by the display device 108. Thus, the shuttering device 116 may store (e.g., cache) at least a portion of the shuttering signal 204 prior to performing any instructed shuttering. In additional implementations, the shuttering device 116 receives the shuttering signal 204 concurrently (or substantially concurrently) with the display of the 3D content 202 by the display device 108. Furthermore, in some instances, the shuttering instructions of the shuttering signal 204 can instruct the shuttering device 116 to shutter entire time periods. One will appreciate, however, that the shuttering instructions can also instruct the shuttering device 116 to shutter only a portion of a corresponding time period. Furthermore, the shuttering signal 204 can also instruct the shuttering device 116 to shutter more than a corresponding time period.
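The correspondence between displayed frame types and shuttering instructions in the timing diagram of FIG. 2 can be sketched as a simple mapping. The code below is illustrative; the "NONE" code for frames intended for both eyes is a hypothetical addition suggested by the preceding paragraph, not a label from the figures.

```python
# Map each displayed frame type to the shuttering instruction issued
# for the corresponding time period, per the FIG. 2 labels.
INSTRUCTION_FOR_FRAME = {
    "left": "BR",        # left-eye frame: blank right (instruction 124)
    "right": "BL",       # right-eye frame: blank left (instruction 128)
    "transition": "BB",  # frame transition: blank both (instruction 126)
    "both": "NONE",      # hypothetical: frame intended for both eyes
}

def build_shuttering_signal(frame_sequence):
    """Derive the shuttering-instruction sequence for a sequence of
    displayed frame types."""
    return [INSTRUCTION_FOR_FRAME[frame] for frame in frame_sequence]
```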
FIG. 3 illustrates a schematic diagram of a shuttering device 116 in accordance with one or more implementations. As noted hereinabove, the shuttering device 116 may, in one or more implementations, take the form of shuttering glasses worn by a user. In alternative implementations, the shuttering device 116 can take any appropriate alternate form that can filter displayed 3D content in the manner disclosed. For example, the shuttering components 118, 120 may take the form of shuttering contact lenses (or another eye shield with separated left and right lenses) wirelessly coupled with other appropriate components. Accordingly, none of the disclosure herein should be viewed as limiting the shuttering device 116 to glasses. Furthermore, none of the disclosure herein should be viewed as limiting the shuttering components 118, 120 to physically separated components.
FIG. 3 illustrates that the shuttering device 116 can include a receiver 302. In one or more implementations, and as illustrated, the receiver 302 can comprise a wireless receiver (e.g., Wi-Fi, BLUETOOTH, infrared). The receiver 302 can, in other implementations, comprise a wired receiver (e.g., optical, electronic wire). In any event, the receiver 302 can receive a shuttering signal 204. As illustrated, the shuttering signal 204 can comprise a plurality of shuttering instructions 124, 126, 128, etc. In one or more embodiments, the shuttering signal 204 comprises a digital signal which includes a plurality of data packets. In such instances, each of the data packets can include one or more shuttering instructions. In one or more other embodiments, the shuttering signal 204 comprises an analog signal which can encode the shuttering instructions as a waveform.
In any event, the receiver 302 can communicate the shuttering signal 204 as a whole, or the shuttering instructions individually, to a processing component 308. FIG. 3 illustrates that the processing component 308 can include a plurality of subsidiary processing components or modules, such as a shuttering signal processor 310 and a user input processor 312. As illustrated by the arrows between the shuttering signal processor 310 and the user input processor 312, any of the subsidiary components or modules of the processing component 308 can be communicatively coupled in any appropriate manner. Further, as illustrated by the vertical ellipses between the shuttering signal processor 310 and the user input processor 312, the processing component 308 can include any number of additional subsidiary components or modules, as appropriate.
Illustratively, the processing component 308 can use the shuttering signal processor 310 to identify shuttering instructions in the shuttering signal 204. Once identified, the shuttering signal processor 310 can instruct one or more of the shuttering components 118, 120 to alter a shuttering state (e.g., shuttered or not shuttered), as appropriate for the shuttering instruction. The shuttering signal processor 310 can use any number of other processing components or modules to perform the processing and to instruct the shuttering components. For instance, when the shuttering signal processor 310 identifies an inter-frame shuttering instruction 126, the shuttering signal processor 310 can instruct both of the shuttering components 118, 120 to enter a shuttered state. If one or more of the shuttering components 118, 120 is already in the shuttered state, the shuttering signal processor 310 can instruct the shuttering component to remain in the shuttered state.
In one or more implementations, the shuttering device 116 can comprise a universal and/or a configurable shuttering device. When the shuttering device 116 is a universal device, the processing component 308 can include any number of other processing components for identifying a signal type of the shuttering signal 204. Thus, the receiver 302 can receive any of a number of types of shuttering signals 204, both analog and digital, and the processing component 308 can determine the type of signal and process it accordingly. Furthermore, when the shuttering device 116 is a configurable shuttering device, the processing component 308 can include any number of other processing components for receiving updates. Thus, the shuttering device 116 can receive any of a number of updates, such as updates to processing components, updates to signal types, new signal types, etc.
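The signal-type identification performed by a universal device might be sketched as below. The discrimination rule is entirely an assumption for illustration (byte packets treated as digital, sampled float waveforms as analog); the specification does not define how the signal type is detected.

```python
def detect_signal_type(signal):
    """Hypothetical signal-type discriminator for a 'universal'
    shuttering device: digital shuttering signals are assumed to
    arrive as byte packets, analog ones as a sampled waveform of
    floating-point values."""
    if isinstance(signal, (bytes, bytearray)):
        return "digital"
    if isinstance(signal, list) and signal and all(
        isinstance(sample, float) for sample in signal
    ):
        return "analog"
    raise ValueError("unrecognized shuttering-signal type")
```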
FIG. 3 also illustrates that the shuttering device 116 can include additional components, such as a transmitter 304 and a user input component 314. The shuttering device 116 can communicate with one or more other devices via the transmitter 304. For example, the shuttering device 116 can use the transmitter 304 to communicate with other shuttering devices, the video processing device 122, the display device 108, the Internet, etc. In one or more implementations, the shuttering device 116 combines the transmitter 304 and the receiver 302 as a single component, while in one or more other implementations the transmitter 304 and the receiver 302 are separate components. The transmitter 304 can transmit an output signal 306 that can be separate from or combined with the shuttering signal 204 and that can contain one or more instructions or packets 320. Similar to the receiver 302, the transmitter 304 can use any wired or wireless communications mechanism, analog or digital.
The user input component 314 can comprise any means for receiving user input, including any combination of one or more buttons, switches, touch devices, microphones, cameras, light sensing devices, pressure sensing devices, etc. The user input processor 312 can process user input received via the user input component 314. The user input can comprise any appropriate type of user input, and the shuttering device 116 can use the user input in any appropriate manner. For example, user input may comprise volume control, selection of a shuttering signal type, user feedback to be sent to another device (e.g., the video processing device 122), selection of a mode of the shuttering device (e.g., on, off, standby, configuration, update), etc.
As illustrated by the vertical ellipses between the user input component 314 and an “other” component 322, the shuttering device 116 can include any number of additional components. The additional components can communicate with the processing component 308 or any other component where appropriate. Additional components may include, for example, one or more speakers, a power source, lights, one or more microphones, etc. When additional components are present, components already discussed may be modified accordingly. For instance, when the other components include speakers, the receiver 302 can also receive an analog or digital audio signal, either as part of the shuttering signal 204 or as a separate signal. Then, the processing component 308 may include an audio decoder which sends a decoded audio signal to the speakers.
In one or more implementations, the other component(s) 322 can include a shuttering signal continuity component, which ensures that shuttering occurs even when the shuttering signal 204 is lost. For instance, the shuttering signal continuity component can implement a phase lock loop (or any similar logic) that analyzes the shuttering signal 204 and generates a replacement signal when appropriate. The analysis can include determining average sequence and timing information about the shuttering instructions contained in the shuttering signal 204, and developing a model of this information. Then, the shuttering signal continuity component can use the phase lock loop (or other logic) to generate a series of substitute shuttering instructions that are estimated to duplicate any shuttering instructions that would have been received if the shuttering signal 204 had not been lost. These generated shuttering instructions can be used to control the shuttering components 118, 120 in the absence of the shuttering signal 204. When the shuttering signal 204 is regained, the shuttering signal continuity component can synchronize the generated substitute shuttering instructions with the shuttering signal 204.
The shuttering signal 204 may be lost for a variety of reasons, such as physical or electromagnetic interference, loss of signal strength, insufficient transmission bandwidth, etc. For example, if the shuttering signal 204 is transmitted via an infrared transmission, the shuttering signal 204 may be lost if a person or animal walks between the wearer of the shuttering glasses and the transmission source, if the wearer turns his or her head, etc. Similarly, if the shuttering signal 204 is transmitted via a BLUETOOTH or Wi-Fi transmission, the shuttering signal 204 may be lost due to interference on the same frequency as the transmission (e.g., from a microwave oven or other wireless devices), if the wearer moves out of range, if data packets in the transmission arrive out of order or delayed, etc. Accordingly, the ability to maintain shuttering signal continuity can improve the robustness and reliability of the shuttering device 116.
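The continuity idea above, modeling the timing and sequence of recent instructions and extrapolating substitutes during signal loss, can be sketched as follows. This is a simplified stand-in for the phase-lock-loop behavior: the average inter-instruction interval approximates the tracked phase, and the repeating three-code cycle approximates the sequence model. All names and the input format are assumptions.

```python
def generate_substitutes(timestamps, codes, count):
    """Extrapolate `count` substitute shuttering instructions after
    signal loss.

    `timestamps` are arrival times of recently received instructions
    and `codes` their instruction codes; both are assumed inputs.
    Returns a list of (estimated_time, code) pairs.
    """
    if len(timestamps) < 2:
        raise ValueError("need at least two instructions to model timing")
    # Average inter-instruction interval stands in for phase tracking.
    avg_interval = (timestamps[-1] - timestamps[0]) / (len(timestamps) - 1)
    # Assume the last three codes (e.g., BR, BB, BL) form the repeating cycle.
    cycle = codes[-3:] if len(codes) >= 3 else list(codes)
    substitutes = []
    for i in range(1, count + 1):
        code = cycle[(i - 1) % len(cycle)]
        substitutes.append((timestamps[-1] + i * avg_interval, code))
    return substitutes
```

When the real signal returns, a fuller implementation would re-align (synchronize) these estimated times against the newly received instructions rather than continuing to extrapolate.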
The shuttering components 118, 120 can comprise any component that can selectively obfuscate/shutter all or a portion of a user's view of the display device 108. For example, in one or more implementations the shuttering components 118, 120 can comprise one or more liquid crystal layers that respond to applied voltage. The liquid crystal layers can have the property of becoming opaque (or substantially opaque) when voltage is applied (or, alternatively, when voltage is removed). Otherwise, the liquid crystal layers can have the property of being transparent (or substantially transparent) when voltage is removed (or, alternatively, when voltage is applied). One will recognize in view of the disclosure herein that liquid crystal layers are not the only available shuttering technology. For example, alternate electronic shuttering technologies (e.g., polarized lead lanthanum zirconate titanate (PLZT)) and mechanical shuttering technologies are also available and within the scope of the disclosure herein.
FIG. 4 illustrates a schematic diagram of a system 400 for viewing 3D content, in accordance with one or more implementations. The system 400 is one possible environment in which the shuttering techniques disclosed herein may be used. FIG. 4 illustrates that the system 400 can include the video processing device 122, one or more shuttering devices 116, and the display device 108. These devices can be separate or combined. For instance, in one or more implementations the video processing device 122 and the display device 108 are separate units, while in one or more other implementations these devices form a single unit.
In one or more implementations, the video processing device 122 receives 3D content from a media device. The media device can comprise any number of devices capable of transmitting 3D content to the video processing device 122. For example, FIG. 4 illustrates that the media device can comprise a streaming source 408 (e.g., a satellite box, a cable box, the Internet), a gaming device (e.g., XBOX 410, PLAYSTATION 416), a storage media player device (e.g., Blu-Ray player 412, DVD player 414) capable of reading storage media 418, and the like. The video processing device 122 can, itself, comprise one or more media devices.
The video processing device 122 can communicate with the display device 108 and the shuttering device(s) 116 in any appropriate manner. For instance, an appropriate wired mechanism, such as HDMI, component, composite, coaxial, network, optical, and the like, can couple the video processing device 122 and the display device 108 together. Additionally, or alternatively, an appropriate wireless mechanism, such as BLUETOOTH, Wi-Fi, etc., can couple the video processing device 122 and the display device 108 together. Furthermore, as discussed herein above, any appropriate wired or wireless mechanism (e.g., BLUETOOTH, infrared, etc.) can couple the video processing device 122 and the shuttering device(s) 116 together.
The video processing device 122 can generate an appropriate output signal comprising 3D content received from a media device. For example, when the video processing device 122 and the display device 108 are coupled via a digital mechanism (e.g., HDMI), the video processing device 122 can generate a digital output signal. On the other hand, when the video processing device 122 and the display device 108 are coupled via an analog mechanism (e.g., component, composite, or coaxial), the video processing device 122 can generate an analog output signal. The video processing device 122 can process 3D content received from the media device to convert the received 3D content to a format better suited to the particular display device 108. Additionally, the video processing device 122 can generate a shuttering signal that is synchronized to the output signal and send it to the shuttering device(s) 116.
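Purely as an illustrative sketch, generating a shuttering signal synchronized to the output signal could look like the following. The function, instruction names, and timing parameters are assumptions; the disclosure does not prescribe a particular scheduling algorithm.

```python
def build_shutter_schedule(frame_period, transition_period, frames):
    """Sketch: builds a list of (time, instruction) pairs synchronized to
    the output signal.  While a left-eye frame is displayed the right eye
    is shuttered (and vice versa); both eyes are shuttered across each
    inter-frame transition.  All names and units are illustrative."""
    schedule = []
    t = 0.0
    for i in range(frames):
        # Even frames are left-eye frames, odd frames are right-eye frames.
        eye = "SHUTTER_RIGHT" if i % 2 == 0 else "SHUTTER_LEFT"
        schedule.append((round(t, 6), eye))
        t += frame_period
        # Blank both eyes while the display transitions between frames.
        schedule.append((round(t, 6), "SHUTTER_BOTH"))
        t += transition_period
    return schedule
```

Because the instruction times are derived directly from the output signal's frame and transition periods, the resulting shuttering signal stays synchronized to what the display device presents.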
One will appreciate in view of the disclosure herein that the video processing device 122 can take any of a variety of forms. For example, the video processing device 122 may be a set-top box or other customized computing system. The video processing device 122 may also be a general purpose computing system (e.g., a laptop computer, a desktop computer, a tablet computer, etc.). Alternatively, the video processing device 122 may be a special purpose computing system (e.g., a gaming console) that has been adapted to implement one or more disclosed features.
The display device 108 can be any one of a broad range of display devices that incorporate a variety of display technologies, both current and future (e.g., cathode ray, plasma, LCD, LED, OLED). Furthermore, the display device 108 can take any of a number of forms, such as a television set, a computer display (e.g., a desktop computer monitor, laptop computer display, or tablet computer display), a handheld display (e.g., a cellular telephone, PDA, handheld gaming device, or handheld multimedia device), or any other appropriate form. While the display device 108 can be a display device designed specifically for displaying 3D content, the display device 108 can also be a more traditional display device, such as a lower frame-rate device. One will appreciate in light of the disclosure herein that the display device 108 can include both digital and analog display devices.
Accordingly, FIGS. 1-4 provide a number of components and mechanisms for shuttering the display of 3D content. The disclosed shuttering includes shuttering at least one inter-frame transition between displayed video frames from a user's view. Thus, one or more disclosed implementations allow for viewing of 3D content on a broad range of display devices, even display devices that may have lower frame rates and/or longer frame transition periods.
Additionally, implementations of the present invention can also be described in terms of flowcharts comprising one or more acts in a method for accomplishing a particular result. Along these lines, FIGS. 5-6 illustrate flowcharts of computerized methods of shuttering displayed 3D content in response to a shuttering signal to improve the perception of the displayed 3D content by concurrently shuttering both of a viewer's eyes during frame transitions. The acts of FIGS. 5 and 6 are described herein below with respect to the schematics, diagrams, devices, and components shown in FIGS. 1-4.
For example, FIG. 5 illustrates that one or more implementations of a method of shuttering displayed 3D content in response to a shuttering signal can comprise an act 502 of receiving a shuttering signal. Act 502 can include receiving a shuttering signal that is synchronized to 3D content to be displayed at a display device 108. For example, act 502 can include the receiver 302 at the shuttering device 116 receiving a shuttering signal 204 that includes one or more shuttering instructions 124, 126, 128. Act 502 can also include determining a signal type of the shuttering signal 204 and processing the shuttering signal 204 according to the signal type to determine the shuttering instructions.
FIG. 5 also illustrates that the method can comprise an act 504 of shuttering a first shuttering component. Act 504 can include shuttering a first shuttering component during a first time period corresponding to a display of second eye 3D content at the display device. In particular, act 504 can include shuttering the shuttering component 120 during at least a portion of time period 206 in which left eye frame 110 is displayed on the display device 108. In one or more implementations, act 504 can include applying voltage to a liquid crystal layer within shuttering component 120, thereby causing shuttering component 120 to become opaque.
In addition, FIG. 5 shows that the method can comprise an act 506 of concurrently shuttering the first shuttering component and a second shuttering component. Act 506 can include shuttering the first shuttering component and a second shuttering component concurrently during a second time period corresponding to a display of a transition from the second eye 3D content to first eye 3D content at the display device in response to the shuttering signal. In particular, act 506 can include shuttering the shuttering components 118, 120 during at least a portion of time period 208 in which at least a portion of both the left eye frame 110 and the right eye frame 114 are displayed on the display device 108. In one or more implementations, act 506 can include applying voltage to liquid crystal layers within shuttering components 118, 120, thereby causing shuttering components 118, 120 to become opaque.
FIG. 5 further shows that the method can comprise an act 508 of shuttering the second shuttering component. Act 508 can include shuttering the second shuttering component during a third time period corresponding to a display of the first eye 3D content at the display device. In particular, act 508 can include shuttering the shuttering component 118 during at least a portion of time period 210 in which right eye frame 114 is displayed on the display device 108. In one or more implementations, act 508 can include applying voltage to a liquid crystal layer within shuttering component 118, thereby causing shuttering component 118 to become opaque.
The method can also involve other acts of shuttering, such as an act of shuttering neither the first shuttering component nor the second shuttering component during a fourth time period. Furthermore, it will be appreciated that the first eye can correspond to a left eye and the second eye to a right eye, or vice versa. Thus, shuttering the first and second shuttering components can involve shuttering a user's left and/or right eyes, in any order. It will also be appreciated that shuttering can involve shuttering all or part of the display of 3D content. For instance, concurrent shuttering can comprise concurrent shuttering during an entire display of the transition from the second eye 3D content to the first eye 3D content, or during only a portion of the display of the transition.
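The phases traversed by acts 502-508, together with the fourth "shutter neither" period, can be summarized in the following sketch. The phase names and the per-eye shutter objects are illustrative assumptions, not part of the disclosed implementation.

```python
class Shutter:
    """Minimal stand-in for a shuttering component such as 118 or 120."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True
    def open(self):
        self.closed = False

def apply_phase(left, right, phase):
    """Applies one display phase of the method of FIG. 5."""
    if phase == "LEFT_FRAME":      # act 504: left-eye content displayed
        right.close()              # shutter the right eye only
        left.open()
    elif phase == "TRANSITION":    # act 506: inter-frame transition
        left.close()               # shutter both eyes concurrently
        right.close()
    elif phase == "RIGHT_FRAME":   # act 508: right-eye content displayed
        left.close()               # shutter the left eye only
        right.open()
    elif phase == "BOTH_OPEN":     # fourth time period: shutter neither
        left.open()
        right.open()
```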
In addition to the foregoing, FIG. 6 illustrates that one or more additional implementations of shuttering displayed 3D content in response to a synchronous shuttering signal can comprise an act 602 of receiving a plurality of shuttering instructions. Act 602 can include receiving a shuttering signal that includes a plurality of shuttering instructions synchronized to 3D content to be displayed at a display device. For example, act 602 can include the receiver 302 receiving a shuttering signal 204 that includes a plurality of shuttering instructions 124, 126, 128. In one or more implementations, act 602 can involve receiving a digital signal 204 that includes one or more data packets, which in turn include the shuttering instructions. Act 602 can also include determining a signal type of the shuttering signal. Thus, the act can involve receiving and processing a variety of types of shuttering signals.
FIG. 6 also illustrates that the method can comprise an act 604 of identifying a first shuttering instruction. Act 604 can include identifying a first shuttering instruction that instructs the shuttering device to shutter a first shuttering component for a first time period. For example, act 604 can include the shuttering signal processor 310 identifying that the first shuttering instruction instructs the shuttering device 116 to shutter the shuttering component 120. In one or more implementations, the first time period can correspond to at least a portion of time during which a display device 108 displays only 3D content intended for viewing by a left eye (i.e., video frame 110).
In addition, FIG. 6 illustrates that the method can include an act 606 of shuttering a first shuttering component. Act 606 can include shuttering the first shuttering component for the first time period. For example, act 606 can include shuttering the shuttering component 120 during at least a portion of time period 206 in which left eye frame 110 is displayed on the display device 108. Act 606 can include the processing component 308 instructing the shuttering component 120 to enter a shuttered state. Act 606 can also involve instructing the shuttering component 118 to open or leave the shuttered state so that only shuttering component 120 is shuttered during the first time period 206.
FIG. 6 also shows that the method can comprise an act 608 of identifying a second shuttering instruction corresponding to a frame transition period. Act 608 can include identifying a second shuttering instruction that instructs the shuttering device to concurrently shutter the first shuttering component and a second shuttering component during a second time period corresponding to a frame transition period. For example, act 608 can include the shuttering signal processor 310 identifying that the second shuttering instruction instructs the shuttering device 116 to shutter both the shuttering component 118 and the shuttering component 120 concurrently. The identified second time period may be equal to the entire frame transition period, a portion of the frame transition period, or a period longer than the frame transition period.
In response to the second shuttering instruction, the method can comprise an act 610 of shuttering the first shuttering component and the second shuttering component concurrently. Act 610 can include shuttering the first shuttering component and the second shuttering component concurrently for the second time period. For example, act 610 can include the processing component 308 instructing the shuttering component 120 to enter or remain in the shuttered state and instructing the shuttering component 118 to enter the shuttered state so that both shuttering components 118, 120 shutter concurrently.
Additionally, FIG. 6 illustrates that the method can comprise an act 612 of identifying a third shuttering instruction. Act 612 can include identifying a third shuttering instruction that instructs the shuttering device to shutter the second shuttering component for a third time period. For example, act 612 can include the shuttering signal processor 310 identifying that the third shuttering instruction instructs the shuttering device 116 to shutter the shuttering component 118. In one or more implementations, the third time period can correspond to at least a portion of time during which a display device 108 displays only 3D content intended for viewing by a right eye (i.e., video frame 114).
In response to the third shuttering instruction, the method can comprise an act 614 of shuttering the second shuttering component. Act 614 can include shuttering the second shuttering component for the third time period. For example, act 614 can include shuttering the shuttering component 118 during at least a portion of time period 210 in which right eye frame 114 is displayed on the display device 108. Act 614 can include the processing component 308 instructing the shuttering component 118 to enter a shuttered state. Act 614 can also involve instructing the shuttering component 120 to open or leave the shuttered state so that only shuttering component 118 is shuttered during the third time period 210.
The method can include identifying other shuttering instructions as well. For example, the method can include an act of identifying a fourth shuttering instruction that instructs the shuttering device to refrain from shuttering either of the first shuttering component or the second shuttering component. This instruction may correspond, for example, with the display of content at the display device intended for viewing with both eyes. Thus, in response to this fourth shuttering instruction, the shuttering device can refrain from shuttering either of the first shuttering component or the second shuttering component. This can involve, for example, the processing component 308 sending a command to the shuttering components 118, 120 to leave the shuttering mode, when appropriate.
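The instruction-driven dispatch of acts 604-614, including the fourth "shutter neither" instruction, can be sketched as follows. The numeric instruction codes and dictionary layout are assumptions for illustration only; the disclosure does not specify a wire format for the shuttering instructions.

```python
# Instruction codes are assumptions; the disclosure does not specify a
# wire format for the shuttering instructions.
ACTIONS = {
    1: ("first",),           # shutter first component only   (acts 604-606)
    2: ("first", "second"),  # shutter both during transition (acts 608-610)
    3: ("second",),          # shutter second component only  (acts 612-614)
    4: (),                   # fourth instruction: shutter neither
}

class Component:
    """Minimal stand-in for a shuttering component such as 118 or 120."""
    def __init__(self):
        self.shuttered = False
    def shutter(self):
        self.shuttered = True
    def open(self):
        self.shuttered = False

def apply_instruction(code, components):
    """Dispatches one shuttering instruction, where `components` maps
    'first'/'second' to shuttering component objects."""
    targets = ACTIONS.get(code)
    if targets is None:
        raise ValueError("unknown shuttering instruction: %r" % code)
    for name, component in components.items():
        if name in targets:
            component.shutter()
        else:
            component.open()
```

Each incoming instruction thus fully determines the state of both components, so the shuttering device never depends on remembering which instruction came before.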
Accordingly, FIGS. 1-6 provide a number of components and mechanisms for shuttering the display of 3D content using inter-frame shuttering techniques. One or more disclosed implementations allow for viewing of 3D video content on a broad range of display devices, including devices that may have lower frame rates and longer frame transition periods, or that are not otherwise specifically designed for displaying 3D video content.
Implementations of the present invention can comprise special purpose or general-purpose computing systems. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered computing systems, such as DVD players, Blu-Ray players, gaming systems, and video converters. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor.
The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems. In its most basic configuration, a computing system typically includes at least one processing unit and memory. The memory may be physical system memory, which may be volatile, non-volatile, or some combination of the two. The term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well. As used herein, the term “module” or “component” can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
Implementations of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.