CN113826380A - Reducing peak current usage in light emitting diode arrays - Google Patents

Reducing peak current usage in light emitting diode arrays

Info

Publication number
CN113826380A
Authority
CN
China
Prior art keywords
led
row
frame
leds
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080034749.1A
Other languages
Chinese (zh)
Inventor
William Thomas Blank
Ilias Pappas
Michael Yee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Platforms Technologies LLC
Original Assignee
Facebook Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Facebook Technologies LLC
Publication of CN113826380A

Abstract


Embodiments relate to driving a first light emitting diode (LED) of a first color to emit light during a first subframe of an emission frame, driving a second LED of a second color to emit light during a second subframe of the emission frame, and driving a third LED of a third color to emit light during the second subframe. Light emitted from the first, second, and third LEDs is directed onto a mirror, which reflects the light onto a plurality of pixel locations of an image field. The first, second, and third LEDs are aligned on an LED array such that the first LED is a first distance from the second LED and the first LED is a second distance from the third LED.


Description

Reducing peak current usage in light emitting diode arrays
Cross Reference to Related Applications
This application claims priority from Greek Application No. 20190100204, filed on May 9, 2019, and U.S. Application No. 16/430,135, filed on June 3, 2019. The contents of Greek Patent Application No. 20190100204 and U.S. Application No. 16/430,135 are hereby incorporated by reference in their entirety for all purposes.
Background
The present disclosure relates to a display device, and more particularly, to a display device in which subsets of light emitting diodes (LEDs) are turned on at different times to reduce peak current.
Display devices are commonly used as head mounted displays or near eye displays in Virtual Reality (VR) or Augmented Reality (AR) systems. A display device typically includes an array of LEDs that emit light toward the eyes of a viewer, and an optics block located between the display and the eyes. The optics block includes optical components that receive light emitted from the array and adjust the orientation of the light such that the light is projected onto an image field to form an image. To display an image on the display device, the LED array emits light, and the light is projected onto pixel locations on the image field.
In a given display device, an array of LEDs may include a large number of LEDs (e.g., 2560 LEDs) for each color (e.g., red, blue, green) used to display an image, and turning on all LEDs in the array (e.g., 7,680 LEDs) simultaneously consumes a large amount of instantaneous current. When a large amount of current is required to drive the LED array, a complex system design of the display device may be required to provide the driving current to the LED array. However, developing and manufacturing complex system designs is a time consuming and costly process. Furthermore, when a large amount of current is used to operate the display device, there is a risk of deterioration of silicon performance, which may result in an increase in defective products and a reduction in the life of the display device.
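The savings from staggering the LEDs can be illustrated with simple arithmetic. The sketch below assumes a per-LED drive current of 1 mA and the 2,560-LEDs-per-color example above; both figures are illustrative assumptions rather than values prescribed by this disclosure.

```python
# Illustrative arithmetic only: per-LED drive current and array sizes are
# assumed values, not figures from this disclosure.
LEDS_PER_COLOR = 2560        # e.g., one row of 2,560 LEDs per color
NUM_COLORS = 3               # red, green, blue
CURRENT_PER_LED_MA = 1.0     # assumed drive current per LED, in mA

# All LEDs on simultaneously.
all_on_peak = NUM_COLORS * LEDS_PER_COLOR * CURRENT_PER_LED_MA      # 7680 mA

# Two sub-frames: red alone, then green and blue together, so the
# worst-case instantaneous current covers two colors.
two_subframe_peak = 2 * LEDS_PER_COLOR * CURRENT_PER_LED_MA         # 5120 mA

# Three sub-frames: only one color is on at any time.
three_subframe_peak = 1 * LEDS_PER_COLOR * CURRENT_PER_LED_MA       # 2560 mA
```

With these assumed numbers, splitting the emission frame into three sub-frames cuts the instantaneous current to one third of the all-on case.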
SUMMARY
Embodiments relate to projecting light from an array of LEDs onto an image field to display an image to a user. Light emitted from the LED array can be directed towards the rotating mirror and redirected to the image field by the rotating mirror located between the LED array and the image field. Depending on the orientation of the mirrors, the light is directed to illuminate a particular subset of pixel locations on the image field. As the mirror rotates, light is projected onto different subsets of pixel locations and an image is generated over the image field as the mirror scans the entire image field. The array of LEDs may include a plurality of LEDs (e.g., 2560 LEDs) for each color (e.g., red, blue, green), arranged such that LEDs of the same color are in the same row. When driving the LED array to emit light, different subsets of LEDs are turned on at different sub-frames of the emission frame to reduce current usage at a given time.
In some embodiments, each emission frame used to operate the LED array is divided into two sub-frames. During the first sub-frame, the first LED is turned on to emit light of a first color (e.g., red), while the second LED of the second color (e.g., green) and the third LED of the third color (e.g., blue) are turned off. During a second sub-frame subsequent to the first sub-frame, the first LED is turned off, and the second LED and the third LED are turned on. The row of first LEDs is a first distance from the row of second LEDs, and the row of third LEDs is a second distance from the row of second LEDs, where the first distance is different from the second distance. The first and second distances are determined based on at least the emission timing of the first and second sub-frames, such that light emitted from the first, second, and third LEDs is projected onto the appropriate pixel locations.
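One way to picture how the row distances relate to sub-frame timing is the following sketch. The one-row-pitch-per-frame scan advance, the 10 μm pitch, and the integer-plus-fraction spacing rule are all assumptions made for illustration; the disclosure itself only states that the distances are determined from the emission timing.

```python
# Hedged sketch: assume the mirror advances one pixel-row pitch per
# emission frame. A color row that fires k sub-frames later than a
# neighboring row must then sit (n + k / num_subframes) row pitches away,
# for some integer n, so that its light lands on the intended pixel row.
ROW_PITCH_UM = 10.0  # assumed array spacing corresponding to one pixel row

def row_spacing(n_whole_rows, k_subframes_later, num_subframes):
    """Array distance between two color rows under the assumed model."""
    return ROW_PITCH_UM * (n_whole_rows + k_subframes_later / num_subframes)

# Two sub-frames: red (sub-frame 1) vs. green (sub-frame 2) differ by
# k = 1, while green and blue both fire in sub-frame 2 (k = 0), so the
# two spacings come out unequal, matching the description above.
d1 = row_spacing(1, 1, 2)  # red-to-green: 15.0 um
d2 = row_spacing(1, 0, 2)  # green-to-blue: 10.0 um
```

Under the same model, colors fired in consecutive sub-frames of a three-sub-frame scheme end up with equal spacings, consistent with the next paragraph.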
In some embodiments, each emission frame is divided into three sub-frames. During the first sub-frame, the first LED is on, and the second and third LEDs are off. During the second sub-frame, the second LED is on, and the first and third LEDs are off. During the third sub-frame, the third LED is on, and the first and second LEDs are off. The row of first LEDs is a first distance from the row of second LEDs, and the row of third LEDs is a second distance from the row of second LEDs, where the first and second distances are equal.
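The per-sub-frame on/off patterns of the two schemes can be written down directly. The set-based encoding below is merely one convenient representation, not anything specified by this disclosure.

```python
# Minimal sketch of the enable patterns described above; the set-based
# encoding is an illustrative choice, not part of the disclosure.
TWO_SUBFRAMES = [
    {"red"},            # sub-frame 1: red on; green and blue off
    {"green", "blue"},  # sub-frame 2: green and blue on; red off
]
THREE_SUBFRAMES = [
    {"red"},    # sub-frame 1
    {"green"},  # sub-frame 2
    {"blue"},   # sub-frame 3
]

def enabled_colors(pattern, subframe_index):
    """Colors driven during the given sub-frame of an emission frame."""
    return pattern[subframe_index % len(pattern)]

# Worst-case number of simultaneously driven color rows:
peak_rows_two = max(len(s) for s in TWO_SUBFRAMES)      # 2
peak_rows_three = max(len(s) for s in THREE_SUBFRAMES)  # 1
```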
Brief Description of Drawings
Fig. 1 is a perspective view of a near-eye display (NED) according to an embodiment.
Fig. 2 is a cross-sectional view of the eyewear of the NED shown in fig. 1, according to an embodiment.
Fig. 3A is a perspective view of a display device according to an embodiment.
FIG. 3B illustrates a block diagram of a source assembly, according to one embodiment.
Fig. 4 is a schematic diagram illustrating a Light Emitting Diode (LED) array according to an embodiment.
FIG. 5A is a diagram illustrating the projection of light emitted from LEDs into an image field according to one embodiment.
FIG. 5B is a diagram illustrating lines of pixel locations on an image field according to an embodiment.
Fig. 6A is a timing diagram illustrating an emission pattern of an LED with two sub-frames in a single frame, according to an embodiment.
FIG. 6B is a timing diagram illustrating an emission pattern of an LED with three sub-frames in a single frame according to one embodiment.
Figs. 7A-7C are conceptual diagrams illustrating the rows onto which LEDs project light for an emission frame having two sub-frames, according to an embodiment.
Figs. 8A-8C are conceptual diagrams illustrating the rows onto which LEDs project light for an emission frame having three sub-frames, according to an embodiment.
Fig. 9 is a flowchart depicting a process of operating a display device according to an embodiment.
The figures depict embodiments of the present disclosure for purposes of illustration only.
Detailed Description
Embodiments relate to operating a display device in a way that reduces the peak current used by the LED array of the display device to display an image. Instead of turning on all the LEDs in the array simultaneously when displaying an image, a first LED of a first color is driven to emit light during a first sub-frame of an emission frame, while a second LED of a second color is disabled. During a second sub-frame of the same emission frame, the second LED is driven to emit light, while the first LED is disabled.
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some way before being presented to the user, and may include, for example, Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), hybrid reality (hybrid reality), or some combination and/or derivative thereof. The artificial reality content may include fully generated content or content generated in combination with captured (e.g., real world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of them may be presented in a single channel or multiple channels (e.g., stereoscopic video that produces a three-dimensional effect to a viewer). Further, in some embodiments, the artificial reality may also be associated with an application, product, accessory, service, or some combination thereof, that is used, for example, to create content in the artificial reality and/or otherwise used in the artificial reality (e.g., to perform an activity in the artificial reality). An artificial reality system that provides artificial reality content may be implemented on a variety of platforms, including a Head Mounted Display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
Near-eye display
Fig. 1 is an illustration of a near-eye display (NED) 100 according to an embodiment. NED 100 presents media to a user. Examples of media presented by NED 100 include one or more images, video, audio, or some combination thereof. In some embodiments, the audio is presented via an external device (e.g., speakers and/or headphones) that receives audio information from NED 100, a console (not shown), or both, and presents audio data based on the audio information. NED 100 may operate as a VR NED. However, in some embodiments, NED 100 may be modified to also operate as an Augmented Reality (AR) NED, a Mixed Reality (MR) NED, or some combination thereof. For example, in some embodiments, NED 100 may augment a view of a physical, real-world environment with computer-generated elements (e.g., images, video, sound, etc.).
The NED 100 shown in fig. 1 includes a frame 105 and a display 110. The frame 105 includes one or more optical elements that together display media to a user. The display 110 is configured to enable a user to view content presented by the NED 100. As discussed below in connection with fig. 2, the display 110 includes at least a source assembly to generate image light to present media to a user's eye. The source assembly includes, for example, a light source, an optical system, or some combination thereof.
Fig. 1 is merely an example of a VR system. However, in alternative embodiments, the device shown in fig. 1 may also be referred to as a Head Mounted Display (HMD).
Fig. 2 is a cross-sectional view 200 of the NED 100 shown in fig. 1, in accordance with embodiments. Cross-section 200 shows at least one waveguide assembly 210. The exit pupil is the position where the eye 220 is located in the viewing window area 230 when the NED 100 is worn by the user. In some embodiments, the frame 105 may represent the frame of a pair of eyeglasses. For illustration purposes, fig. 2 shows the cross-section 200 associated with a single eye 220 and a single waveguide assembly 210, but in an alternative embodiment not shown, another waveguide assembly separate from the waveguide assembly 210 shown in fig. 2 provides image light to the other eye 220 of the user.
As shown below in fig. 2, the waveguide assembly 210 guides the image light through an exit pupil to the eye 220. The waveguide assembly 210 can be composed of one or more materials (e.g., plastic, glass, etc.) having one or more refractive indices that effectively minimize the weight of the NED 100 and widen its field of view (hereinafter, the 'FOV'). In an alternative configuration, the NED 100 includes one or more optical elements between the waveguide assembly 210 and the eye 220. The optical elements may act to correct aberrations in the image light emitted from the waveguide assembly 210, magnify the image light, perform some other optical adjustment of the light, or some combination thereof. Examples of optical elements may include apertures, Fresnel lenses, convex lenses, concave lenses, filters, or any other suitable optical element that affects image light. In one embodiment, the waveguide assembly 210 may generate and direct a number of pupil replicas to the viewing window area 230, in a manner discussed in further detail below in connection with FIG. 5B.
Fig. 3A illustrates a perspective view of a display device 300 according to an embodiment. In some embodiments, the display device 300 is a component of the NED 100 (e.g., the waveguide assembly 210 or a portion of the waveguide assembly 210). In alternative embodiments, the display device 300 is part of some other NED or another system that directs display image light to a particular location. Depending on the embodiment and implementation, the display device 300 may also be referred to as a waveguide display and/or a scanning display. However, in other embodiments, the display device 300 does not include a scanning mirror. For example, the display device 300 may include a matrix of light emitters that project light through a waveguide to an image field, but without a scanning mirror. In another embodiment, the image emitted by the two-dimensional matrix of light emitters may be magnified by an optical component (e.g., a lens) before the light reaches the waveguide or screen.
For particular embodiments using waveguides and optical systems, the display device 300 may include a source assembly 310, an output waveguide 320, and a controller 330. The display device 300 may provide images for both eyes or for a single eye. For purposes of illustration, FIG. 3A shows the display device 300 associated with a single eye 220. Another display device (not shown), separate (or partially separate) from the display device 300, provides image light to the other eye of the user. In a partially separated system, one or more components may be shared between the display devices for each eye.
The source assembly 310 generates image light 355. The source assembly 310 includes a light source 340 and an optical system 345. The light source 340 is an optical component that generates image light using a plurality of light emitters arranged in a matrix. Each light emitter may emit monochromatic light. The light source 340 generates image light including, but not limited to, red image light, blue image light, green image light, infrared image light, and the like. Although RGB is often discussed in this disclosure, the embodiments described herein are not limited to using red, blue, and green as the primary colors. Other colors may also be used as the primary colors of the display device. Furthermore, a display device according to an embodiment may use more than three primary colors.
The optical system 345 performs a set of optical processes on the image light generated by the light source 340, including, but not limited to, focusing, combining, conditioning, and scanning processes. In some embodiments, the optical system 345 includes a combining assembly, a light conditioning assembly, and a scanning mirror assembly, as described in detail below in connection with fig. 3B. The source assembly 310 generates the image light 355 and outputs the image light 355 to the coupling element 350 of the output waveguide 320.
The output waveguide 320 is an optical waveguide that outputs image light to the user's eye 220. The output waveguide 320 receives the image light 355 at one or more coupling elements 350 and directs the received input image light to one or more decoupling elements 360. The coupling element 350 may be, for example, a diffraction grating, a holographic grating, some other element that couples the image light 355 into the output waveguide 320, or some combination thereof. For example, in embodiments where the coupling element 350 is a diffraction grating, the pitch of the diffraction grating is selected such that total internal reflection occurs and the image light 355 propagates internally toward the decoupling element 360. The pitch of the diffraction grating may be in the range of 300 nm to 600 nm.
The decoupling element 360 decouples the totally internally reflected image light from the output waveguide 320. The decoupling element 360 may be, for example, a diffraction grating, a holographic grating, some other element that decouples image light from the output waveguide 320, or some combination thereof. For example, in embodiments in which the decoupling element 360 is a diffraction grating, the pitch of the diffraction grating is selected so that incident image light exits the output waveguide 320. By changing the orientation and position of the image light 355 entering the coupling element 350, the orientation and position of the image light exiting from the output waveguide 320 is controlled. The pitch of the diffraction grating may be in the range of 300 nm to 600 nm.
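For intuition about why a pitch in that range can support total internal reflection, the first-order grating equation can be checked numerically. The wavelength, refractive index, and pitch below are assumed values, not parameters taken from this disclosure.

```python
import math

# Hedged numerical check of the first-order grating equation,
# n * sin(theta) = wavelength / pitch, for normally incident light
# coupled into a waveguide of refractive index n. All values are assumed.
n_waveguide = 1.5        # assumed index, e.g., glass
wavelength_nm = 532.0    # assumed green wavelength
pitch_nm = 400.0         # within the 300 nm to 600 nm range quoted above

theta_deg = math.degrees(math.asin(wavelength_nm / (n_waveguide * pitch_nm)))
critical_deg = math.degrees(math.asin(1.0 / n_waveguide))

# Total internal reflection requires theta_deg > critical_deg
# (roughly 62.5 vs. 41.8 degrees with these assumed numbers).
```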
The output waveguide 320 may be composed of one or more materials that promote total internal reflection of the image light 355. The output waveguide 320 may be composed of, for example, silicon, plastic, glass, or polymer, or some combination thereof. The output waveguide 320 has a relatively small form factor. For example, the output waveguide 320 may be about 50 mm wide in the X dimension, about 30 mm long in the Y dimension, and about 0.5-1 mm thick in the Z dimension.
The controller 330 controls the image rendering operations of the source assembly 310. The controller 330 determines instructions for the source assembly 310 based at least on one or more display instructions. Display instructions are instructions for rendering one or more images. In some embodiments, a display instruction may simply be an image file (e.g., a bitmap). The display instructions may be received from, for example, a console (not shown here) of the VR system. Scan instructions are instructions used by the source assembly 310 to generate the image light 355. The scan instructions may include, for example, the type of image light source (e.g., monochromatic, polychromatic), the scan rate, the orientation of the scanning device, one or more illumination parameters, or some combination thereof. The controller 330 includes a combination of hardware, software, and/or firmware not shown here, to avoid obscuring other aspects of the disclosure.
FIG. 3B is a block diagram illustrating an example source assembly 310, according to an embodiment. The source assembly 310 includes a light source 340 that emits light that is optically processed by an optical system 345 to produce image light 355 to be projected onto an image field (not shown). The light source 340 is driven by the driver circuit 370 based on data sent from the controller 330 or the image processing unit 375. In one embodiment, the driver circuit 370 is a circuit panel that connects and mechanically holds the various light emitters of the light source 340. The combined driver circuit 370 and light source 340 may sometimes be referred to as a display panel 380, or an LED panel if some form of LED is used as the light emitter.
The light source 340 may generate spatially coherent or partially spatially coherent image light. The light source 340 may include a plurality of light emitters. The light emitters may be Vertical Cavity Surface Emitting Laser (VCSEL) devices, Light Emitting Diodes (LEDs), micro-LEDs, tunable lasers, and/or some other light emitting devices. In one embodiment, the light source 340 includes a matrix of light emitters. In another embodiment, the light source 340 includes multiple sets of light emitters, each set grouped by color and arranged in a matrix. The light source 340 emits light in the visible band (e.g., from about 390 nm to 700 nm). The light source 340 emits light according to one or more illumination parameters that are set by the controller 330 and potentially adjusted by the image processing unit 375 and the driver circuit 370. An illumination parameter is an instruction used by the light source 340 to generate light. The illumination parameters may include, for example, source wavelength, pulse rate, pulse amplitude, beam type (continuous or pulsed), other parameters that affect the emitted light, or some combination thereof. The light source 340 emits source light 385. In some embodiments, the source light 385 includes a plurality of beams of red, green, and blue light, or some combination thereof.
The optical system 345 may include one or more optical components that optically condition and potentially redirect the light from the light source 340. Conditioning the light from the light source 340 may include, for example, expanding, collimating, correcting one or more optical errors (e.g., field curvature, chromatic aberration, etc.), some other adjustment of the light, or some combination thereof. The optical components of the optical system 345 may include, for example, lenses, mirrors, apertures, gratings, or some combination thereof. The light emitted from the optical system 345 is referred to as image light 355.
The optical system 345 may redirect the image light 355 via one or more of its reflective and/or refractive portions such that the image light 355 is projected toward the output waveguide 320 (shown in fig. 3A) in a particular orientation. The direction in which the image light is redirected is based on the particular orientation of the one or more reflective and/or refractive portions. In some embodiments, the optical system 345 comprises a single scanning mirror that scans in at least two dimensions. In other embodiments, the optical system 345 may include a plurality of scanning mirrors, each scanning in a direction orthogonal to the others. The optical system 345 may perform a raster scan (horizontal or vertical), a dual resonant (biresonant) scan, or some combination thereof. In some embodiments, the optical system 345 may perform controlled vibrations in the horizontal and/or vertical directions with a particular oscillation frequency to scan in two dimensions and generate a two-dimensional projected line image of the media presented to the user's eyes. In other embodiments, the optical system 345 may also include a lens that functions similarly or identically to one or more scanning mirrors.
In some embodiments, the optical system 345 comprises a galvanometer mirror. For example, the galvanometer mirror may represent any electromechanical instrument that indicates that it has sensed an electric current by deflecting a beam of image light with one or more mirrors. The galvanometer mirror may scan in at least one orthogonal dimension to generate the image light 355. The image light 355 from the galvanometer mirror represents a two-dimensional line image of the media presented to the user's eye.
In some embodiments, the source assembly 310 does not include an optical system. The light from the light source 340 is projected directly into the waveguide 320 (shown in fig. 3A).
The controller 330 controls the operation of the light source 340 and, in some cases, the optical system 345. In some embodiments, the controller 330 may be a Graphics Processing Unit (GPU) of the display device. In other embodiments, the controller 330 may be another kind of processor. The operations performed by the controller 330 include retrieving content for display and dividing the content into discrete portions. The controller 330 instructs the light source 340 to sequentially present the discrete portions using light emitters corresponding to respective rows in the image that is ultimately displayed to the user. The controller 330 instructs the optical system 345 to perform different adjustments of the light. For example, the controller 330 controls the optical system 345 to scan the presented discrete portions to different regions of the coupling element of the output waveguide 320 (shown in FIG. 3A). Thus, at the exit pupil of the output waveguide 320, each discrete portion is presented at a different location. While each discrete portion is presented at a different time, the presentation and scanning of the discrete portions occurs fast enough that the user's eye integrates the different portions into a single image or series of images. The controller 330 may also provide scan instructions to the light source 340 that include addresses corresponding to individual source elements of the light source 340 and/or the electrical biases applied to individual source elements.
The image processing unit 375 may be a general-purpose processor and/or one or more dedicated circuits for performing the features described herein. In one embodiment, a general-purpose processor may be coupled to a memory to execute software instructions that cause the processor to perform certain processes described herein. In another embodiment, the image processing unit 375 may be one or more circuits dedicated to performing particular features. While in fig. 3B the image processing unit 375 is shown as a unit separate from the controller 330 and the driver circuit 370, in other embodiments the image processing unit 375 may be a sub-unit of the controller 330 or the driver circuit 370. In other words, in those embodiments, the controller 330 or the driver circuit 370 performs the various image processing procedures of the image processing unit 375. The image processing unit 375 may also be referred to as an image processing circuit.
Light emitting diode array
FIG. 4 is a top view of a Light Emitting Diode (LED) array 400 that may be included in the light source 340 of FIGS. 3A and 3B, according to one embodiment. The array 400 includes a plurality of LEDs organized into rows and columns. In the example shown in fig. 4, the array 400 includes red LEDs 410, green LEDs 420, and blue LEDs 430 arranged such that LEDs of the same color are in the same row. The red LEDs 410 are aligned along a line on one side of the green LEDs 420. The blue LEDs 430 are placed along a line parallel to the red LEDs 410, on the opposite side of the green LEDs 420.
The row of red LEDs 410 is a first distance D1 from the row of green LEDs 420, and the row of blue LEDs 430 is a second distance D2 from the row of green LEDs 420. The first distance D1 and the second distance D2 are associated with the plurality of sub-frames of an emission frame, at least during a display mode, as described in detail below with reference to figs. 6A and 6B. The first distance D1 and the second distance D2 may be equal to or different from each other. Although not shown in fig. 4, various other configurations of LEDs are also within the scope of the present disclosure. For example, the array 400 may include LEDs of different colors, may have a different color arrangement, and/or may have additional rows of LEDs for each color.
Although the LEDs shown in FIG. 4 are arranged in rows and in columns perpendicular to the rows, in other embodiments, the LEDs on the array 400 may be arranged in other patterns. For example, some LEDs may be arranged diagonally or in other arrangements (regular or irregular, symmetrical or asymmetrical). Also, the terms row and column may describe two relative spatial relationships of elements. Although, for simplicity, the columns described herein are generally associated with vertical lines of elements, it should be understood that the columns need not be vertically (or longitudinally) aligned. Likewise, the rows need not be horizontally (or laterally) aligned. Rows and columns can sometimes also describe non-linear arrangements, and they do not necessarily imply any parallel or perpendicular arrangement. Sometimes a row or a column may be referred to as a line. In other embodiments, there may be two or more rows of LEDs for each color. In some embodiments, the number of rows of LEDs varies from color to color based on the brightness of each color. For example, a red LED may be brighter than a blue LED. To compensate for the brightness differences in the array, there may be 2 rows of red LEDs and 5 rows of blue LEDs on the array.
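The 2-rows-of-red versus 5-rows-of-blue example can be reproduced with a simple calculation. The per-LED relative brightness values and the common target below are assumptions chosen for illustration only.

```python
from math import ceil

# Illustrative sketch: pick rows per color so each color reaches roughly
# the same total brightness. The per-LED relative brightness numbers are
# assumptions chosen to reproduce the 2-red / 5-blue example above.
relative_brightness = {"red": 1.0, "green": 0.8, "blue": 0.4}
target_brightness = 2.0  # arbitrary common target, in the same units

rows_per_color = {
    color: ceil(target_brightness / per_led)
    for color, per_led in relative_brightness.items()
}
# rows_per_color -> {"red": 2, "green": 3, "blue": 5}
```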
In one embodiment, the LEDs may be micro-LEDs. In other embodiments, other types of light emitters may be used, such as Vertical Cavity Surface Emitting Lasers (VCSELs). A "micro-LED" may be a particular type of LED having a small active light emitting area (e.g., less than 2,000 μm² in some embodiments, and in other embodiments less than 20 μm² or less than 10 μm²). In some embodiments, the emitting surface of the micro-LED may have a diameter of less than about 5 μm, although in other embodiments emitting surfaces of smaller (e.g., 2 μm) or larger diameter may be utilized. In some examples, the micro-LEDs may also have a collimated or non-Lambertian light output, which may increase the brightness level of light emitted from the small active light emitting area.
Examples of image formation
FIG. 5A is a diagram illustrating the projection of light emitted from LEDs onto an image field, according to one embodiment. Fig. 5A illustrates how an image is formed in the display device using light emitted from the light source 340. An image field is an area that receives the light emitted by the light source 340 and on which an image forms. For example, the image field may correspond to a portion of the coupling element 350 or a portion of the decoupling element 360 in fig. 3A. In some cases, the image field is not an actual physical structure but the area onto which the image light is projected and on which the image is formed. In one embodiment, the image field is the surface of the coupling element 350, and the image formed on the image field is magnified as the light passes through the output waveguide 320. In another embodiment, the image field is formed after the light passes through the waveguides, which combine the different colors of light to form the image field. In some embodiments, the image field may be projected directly into the user's eye.
During a scanning operation, the display device 500 projects light from the light source 340 to the image field 530 using the scanning mirror 520. The display device 500 may correspond to the near-eye display 100 or to another scanning-type display device. The light source 340 may correspond to the light source 340 shown in fig. 3B, or may be used in other display devices. The light source 340 includes the array 400, which has a plurality of rows and columns of light emitting devices, as indicated by the dots in inset 515. In one embodiment, the light source 340 may include a single row of LEDs for each color. In other embodiments, the light source 340 may include more than one row of LEDs for each color. The light 502 emitted by the light source 340 may be a set of collimated light beams. For example, the light 502 in fig. 5A illustrates a plurality of light beams emitted by a row of LEDs. Before reaching the mirror 520, the light 502 may be conditioned by different optical devices, such as a conditioning assembly. The mirror 520 reflects and projects the light 502 from the light source 340 onto the image field 530. The mirror 520 rotates about an axis 522. The mirror 520 may be a micro-electromechanical system (MEMS) mirror or any other suitable mirror. The mirror 520 may be an embodiment of the optical system 345 in fig. 3B, or a part of the optical system 345. As the mirror 520 rotates, the light 502 is directed to different portions of the image field 530, shown as the reflected light 504 in solid lines and in dashed lines.
At a particular orientation (i.e., a particular rotational angle) of mirror 520, array 400 illuminates a portion of image field 530 (e.g., a particular subset of the plurality of pixel locations 532 on image field 530). In one embodiment, the LEDs are arranged and spaced such that the light beam from each LED is projected onto a respective pixel location 532. The distance between adjacent rows of LEDs is described below with reference to FIGS. 6A and 6B. In another embodiment, small light emitters, such as micro-LEDs, are used for the LEDs, such that the light beams from a subset of the plurality of light emitters are projected together onto the same pixel location 532. In other words, a subset of the plurality of LEDs collectively illuminates a single pixel location 532 at a time.
The image field 530 may also be referred to as a scan field because, when the light 502 is projected onto a region of the image field 530, that region is illuminated by the light 502. The image field 530 may be spatially defined by a matrix of pixel locations 532 (represented by the blocks in inset 534) arranged in rows and columns. A pixel location here refers to a single pixel. The pixel locations 532 (or simply pixels) in the image field 530 may not be actual physical structures. Rather, the pixel locations 532 may be spatial regions that divide the image field 530. Further, the size and location of the pixel locations 532 may depend on the projection of the light 502 from the light source 340. For example, at a given rotational angle of the mirror 520, the light beams emitted from the light source 340 fall on a region of the image field 530. As such, the size and location of the pixel locations 532 of the image field 530 may be defined based on the location of each light beam. In some cases, the pixel locations 532 may be spatially subdivided into sub-pixels (not shown). For example, a pixel location 532 may include a red sub-pixel, a green sub-pixel, and a blue sub-pixel. The red sub-pixel corresponds to a location onto which one or more red light beams are projected, and so on. When sub-pixels are present, the color of a pixel 532 is based on the temporal and/or spatial average of its sub-pixels.
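Where sub-pixels are present, the pixel's perceived color is the average of the sub-pixel contributions over the scan period. A minimal sketch of that averaging follows; the function name and data layout are illustrative, not taken from the patent:

```python
def pixel_color(subpixel_samples):
    """Average the per-channel sub-pixel intensity samples observed over a
    scan period into one (R, G, B) value, mimicking the temporal/spatial
    integration by the eye described above. `subpixel_samples` maps a
    channel key ("r", "g", "b") to a list of intensity samples in [0, 1]."""
    return tuple(
        sum(samples) / len(samples) if samples else 0.0
        for samples in (subpixel_samples.get(c, []) for c in ("r", "g", "b"))
    )

# A pixel hit twice by a full-intensity red beam and once by a half-intensity
# green beam averages to (1.0, 0.5, 0.0):
color = pixel_color({"r": [1.0, 1.0], "g": [0.5], "b": []})
```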
The number of rows and columns of LEDs in the array 400 of the light source 340 may be the same as or different from the number of rows and columns of pixel locations 532 in the image field 530. In one embodiment, the number of LEDs in a row of the array 400 is equal to the number of pixel locations 532 in a row of the image field 530, while the number of LEDs in a column of the array 400 is two or more but less than the number of pixel locations 532 in a column of the image field 530. In other words, in such embodiments, the light source 340 has the same number of columns of LEDs in the array 400 as the number of columns of pixel locations 532 in the image field 530, but fewer rows than the image field 530. For example, in one particular embodiment, the light source 340 has approximately 1280 columns of LEDs in the array 400, which is the same number of columns as the pixel locations 532 of the image field 530, but only three rows of LEDs. The light source 340 has a first length L1, measured from the first row to the last row of LEDs. The image field 530 has a second length L2, measured from row 1 to row p of the scan field 530. In one embodiment, L2 is larger than L1 (e.g., L2 is 50 to 10,000 times larger than L1).
In some embodiments, since the number of rows of pixel locations 532 is greater than the number of rows of LEDs in the array 400, the display device 500 uses the mirror 520 to project the light 502 onto different rows of pixels at different times. As the mirror 520 rotates and the light 502 scans rapidly across the image field 530, an image is formed on the image field 530. In some embodiments, the light source 340 also has fewer columns than the image field 530. The mirror 520 can rotate in two dimensions to fill the image field 530 with light (e.g., a raster-type scan down the rows followed by movement to a new column of the image field 530).
The display device may operate within a predefined display period. The display period may correspond to the duration of time over which an image is formed. For example, the display period may be the inverse of the frame rate, where the frame rate represents the number of frames displayed in a given time (e.g., 1 second). In a particular embodiment of the display device 500 comprising a rotating mirror, the display period may also be referred to as a scan period: a full rotation cycle of the mirror 520. The scan period here refers to a predetermined cycle time during which the entire image field 530 is completely scanned. The scan period may be divided into a plurality of emission frames, each emission frame corresponding to light projected onto a particular subset of pixel locations on the image field 530 (e.g., three rows of pixel locations of the image field). The scanning of the image field 530 is controlled by the mirror 520, and the generation of light by the display device 500 may be synchronized with the rotation of the mirror 520. For example, in one embodiment, the movement of the mirror 520 from an initial position projecting light onto row 1 of the image field 530 to a final position projecting light onto row p of the image field 530, and then back to the initial position, equals one scan period. The scan period is also related to the frame rate of the display device 500: each completed scan cycle forms one image (e.g., a frame) on the image field 530, so the frame rate corresponds to the number of scan cycles per second.
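The relationship between frame rate, scan period, and emission-frame duration described above can be sketched as follows; the function name and the particular numbers are illustrative assumptions, not values from the patent:

```python
def emission_frame_duration(frame_rate_hz, emission_frames_per_scan):
    """One scan period covers the entire image field, so its duration is
    the inverse of the frame rate; dividing it by the number of emission
    frames per scan cycle gives the duration of a single emission frame."""
    scan_period_s = 1.0 / frame_rate_hz
    return scan_period_s / emission_frames_per_scan

# At a 60 Hz frame rate with 960 emission frames per scan cycle, each
# emission frame lasts roughly 17.4 microseconds.
dt = emission_frame_duration(60, 960)
```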
As the mirror 520 rotates, the light scans across the image field and forms an image. The actual color value and light intensity (brightness) of a given pixel location 532 may be an average of the beams of the various colors illuminating that pixel location during the scan period. After a scan cycle is completed, the mirror 520 returns to its initial position to again project light onto the first few rows of the image field 530, except that a new set of drive signals may be fed to the LEDs. The same process repeats as the mirror 520 rotates through successive cycles, so that different images are formed in the scan field 530 in different frames.
FIG. 5B is a diagram illustrating rows of pixel locations on an image field, according to an embodiment. The image field 530 may include rows of pixel locations, each row including a plurality of pixel locations. In the example shown in FIG. 5B, there are z rows in the image field 530. As the scanning mirror 520 rotates, light emitted from the array 400 is projected onto different portions of the image field 530, illuminating different rows of the image field 530 at a given time.
In one embodiment, all of the LEDs on the array 400 are driven on at the beginning of an emission frame and emit light during an emission period corresponding to the first half of the emission frame. During the emission period, light emitted from the array 400 is projected onto the first three rows of pixel locations (e.g., row 1, row 2, row 3) of the image field 530. For example, red light emitted from the red LEDs 410 is projected onto row 1, green light emitted from the green LEDs 420 is projected onto row 2, and blue light emitted from the blue LEDs 430 is projected onto row 3. After an emission frame, the mirror 520 rotates so that, in the subsequent emission frame, light emitted from the array 400 is projected onto three rows of pixel locations on the image field 530 that are offset by one row relative to the previous emission frame. For example, red light is now projected onto row 2, green light onto row 3, and blue light onto row 4.
When all the LEDs on the array 400 are driven to emit light simultaneously during an emission period, however, a large instantaneous current is required to turn them all on at once. Providing the current required to turn on all of the LEDs of the array 400 simultaneously makes the near-eye display 100 bulky, because it requires a large current source, and expensive, due to the high cost of designing and manufacturing such a source. Furthermore, sustained operation at high current increases the risk of the silicon failing over time, and poses a safety risk because the resulting heat is not easily dissipated. To reduce the peak current, a time-division approach may be used that divides each emission frame into sub-frames and turns on a different portion of the LEDs on the array 400 in each sub-frame, rather than turning on all of the LEDs on the array 400 simultaneously, as described in detail below with reference to FIGS. 6A and 6B.
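The peak-current benefit of the time-division approach can be estimated with a simple model: the peak supply current is set by the largest number of LEDs that are on at the same instant. The function name, LED count, and per-LED current below are illustrative assumptions only:

```python
def peak_current(rows_on_per_subframe, leds_per_row, i_led_amps):
    """Peak supply current for an LED array driven in sub-frames.
    `rows_on_per_subframe` lists, for each sub-frame, how many LED rows
    are driven; the worst-case sub-frame sets the peak."""
    return max(rows_on_per_subframe) * leds_per_row * i_led_amps

# All three rows at once vs. the red / green+blue split of FIG. 6A,
# assuming 1280 LEDs per row at 1 mA each:
simultaneous = peak_current([3], 1280, 1e-3)     # ~3.84 A
time_divided = peak_current([1, 2], 1280, 1e-3)  # ~2.56 A
```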
FIG. 6A is a timing diagram illustrating an emission pattern of LEDs according to an embodiment in which each emission frame has two sub-frames A, B. One method of operating the array 400 is to turn on different subsets of the LEDs on the array 400 during different sub-frames of the emission frame. The controller 330 or image processing unit 375 determines the value of the voltage applied to each LED 410 such that the light emitted from the LED corresponds to a sub-pixel at a pixel location on the image field 530. In the example shown in FIG. 6A, a first subset of the LEDs on the array 400 emits light during sub-frame A of the emission frame, and a second subset of the LEDs on the array 400 emits light during sub-frame B of the emission frame, which follows sub-frame A. In one embodiment, the first subset of LEDs is the red LEDs 410, and the second subset of LEDs is the green LEDs 420 and the blue LEDs 430. In another embodiment, the first and second subsets of LEDs may have different colors. For example, the first subset of LEDs may be the green LEDs 420, while the second subset of LEDs may be the red LEDs 410 and the blue LEDs 430.
In the timing diagram shown in FIG. 6A, the red LEDs 410 disposed on the first row of the array 400 emit light during sub-frame A of each emission frame. When the red LEDs 410 are turned on to emit light, the green LEDs 420 and the blue LEDs 430 are turned off. The green LEDs 420 are disposed on the second row of the array 400 and emit light during sub-frame B, which follows sub-frame A, of each emission frame. The blue LEDs 430 are disposed on the third row of the array 400, below the green LEDs 420, and emit light simultaneously with the green LEDs 420 during sub-frame B. When the green LEDs 420 and the blue LEDs 430 are turned on to emit light, the red LEDs 410 are turned off.
To generate an image on the image field 530, light emitted from the red LEDs 410, the green LEDs 420, and the blue LEDs 430 is projected onto the image field 530. Each pixel on the image field 530 is divided into a red sub-pixel, a green sub-pixel, and a blue sub-pixel. When all of the LEDs on the array 400 are illuminated simultaneously, the spacing between adjacent rows of LEDs on the array 400 is equal.
However, when different portions of the LEDs are driven to emit light at different times, the distances between the rows on the array 400 need to compensate for the time delay to maintain image alignment on the image field 530. As shown in FIG. 4, the distance between the red LEDs 410 and the green LEDs 420 is D1, and the distance between the green LEDs 420 and the blue LEDs 430 is D2, where the row of green LEDs 420 is between the row of red LEDs 410 and the row of blue LEDs 430. When the array 400 is configured to turn on the red LEDs during sub-frame A and the green LEDs during sub-frame B, the distance D1 between the red LEDs 410 and the green LEDs 420 is equal to (n + 1/2) times the distance D2 between the green LEDs 420 and the blue LEDs 430, where n is 0 or an integer greater than 0.
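The (n + 1/2) · D2 spacing rule for the two-sub-frame scheme can be expressed as a small helper; the function name and example spacing are illustrative assumptions:

```python
def red_green_spacing(d2, n=0):
    """Row spacing D1 between the red and green LED rows for the
    two-sub-frame scheme, per the D1 = (n + 1/2) * D2 relation above,
    where n is 0 or a positive integer."""
    if n < 0 or int(n) != n:
        raise ValueError("n must be 0 or a positive integer")
    return (n + 0.5) * d2

# With D2 = 4 um and n = 0, the red row sits 2 um from the green row.
d1 = red_green_spacing(4.0)
```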
FIG. 6B is a timing diagram illustrating an emission pattern of LEDs according to an embodiment in which each emission frame has three sub-frames A', B', C'. During sub-frame A', a first subset of LEDs emits light of a first color. During sub-frame B', a second subset of LEDs, having a second color different from the first subset, emits light. During sub-frame C', a third subset of LEDs emits light of a third color different from the first and second colors. In the example shown in FIG. 6B, the first subset of LEDs is the red LEDs 410, the second subset is the green LEDs 420, and the third subset is the blue LEDs 430. In other examples, the subsets of LEDs may have colors different from those shown in FIG. 6B. The sub-frames A', B', C' may have equal or different durations. However, the sum of the durations of sub-frames A', B', and C' must be equal to or shorter than the duration of one frame.
During sub-frame A', the red LEDs 410 are on, while the green LEDs 420 and the blue LEDs 430 are off. During sub-frame B', the green LEDs 420 are on, and the red LEDs 410 and the blue LEDs 430 are off. During sub-frame C', the blue LEDs 430 are on, and the red LEDs 410 and the green LEDs 420 are off.
As shown in the timing diagram of FIG. 6B, when there are three sub-frames and each row of LEDs on the array 400 is driven to emit light during a different sub-frame, the distance D1 between the red LEDs 410 and the green LEDs 420 and the distance D2 between the green LEDs 420 and the blue LEDs 430 are the same. However, for a display device configured to emit light from all three rows of the array 400 simultaneously during an emission frame (e.g., all three rows of the array emit light during the first half of the emission frame), the distances D1 and D2 are greater than the distance between adjacent rows.
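Putting the two spacing rules side by side (equal spacing for three sub-frames; half-integer-multiple spacing for two sub-frames), as a hedged sketch with illustrative names:

```python
def row_spacings(d2, scheme, n=0):
    """Return (D1, D2) row spacings implied by the text: D1 == D2 for the
    three-sub-frame scheme of FIG. 6B, and D1 == (n + 1/2) * D2 for the
    two-sub-frame scheme of FIG. 6A. Helper names are not from the patent."""
    if scheme == "three_subframe":
        return (d2, d2)
    if scheme == "two_subframe":
        return ((n + 0.5) * d2, d2)
    raise ValueError("unknown scheme: " + scheme)
```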
FIGS. 7A-7C are conceptual diagrams illustrating the rows onto which the LEDs project light for an emission frame having two sub-frames, as described above with reference to FIG. 6A, according to an embodiment. During the first sub-frame of each emission frame, a first subset of the LEDs on the array 400 is turned on while a second subset of the LEDs on the array 400 is turned off. During the second sub-frame of each emission frame, which follows the first sub-frame, the first subset of the LEDs on the array 400 is turned off and the second subset of the LEDs on the array 400 is turned on. The red LEDs 410, green LEDs 420, and blue LEDs 430 are turned on and off by the controller 330 to emit light during the appropriate emission sub-frames. Light emitted from the LEDs is reflected off the surface of the mirror 520, which rotates to scan the light across the image field 530.
FIG. 7A shows the projection of light onto the image field 530 during the first emission frame. The number of LEDs in each row of the array 400 matches the number of pixel locations in a row of the image field 530. As shown in the left diagram, the red LEDs 410 on the array 400 are on during the first sub-frame of the first of three consecutive emission frames. During the first sub-frame, light emitted from the red LEDs 410 is projected onto the nth row of the image field 530. The green LEDs 420 and the blue LEDs 430 are off during the first sub-frame. The light emitted from each red LED 410 illuminates the red sub-pixel of a pixel location on the nth row of the image field 530.
As shown in the right diagram, during the second sub-frame of the first emission frame, the red LEDs 410 are turned off, and the green LEDs 420 and the blue LEDs 430 are turned on. Light emitted from the green LEDs 420 is projected onto row n-1 of the image field 530. Row n-1 is adjacent to row n, onto which the red LEDs 410 projected light during the first sub-frame. Light emitted from the blue LEDs 430 is projected onto row n-2 of the image field 530.
FIG. 7B illustrates the projection of light onto the image field 530 during a second emission frame that follows the first emission frame, according to one embodiment. The mirror 520 redirects the light emitted from the array 400 such that the light emitted from the red, green, and blue LEDs 410, 420, 430 is projected onto rows that are offset by one row relative to FIG. 7A. As shown in the left diagram, during the first sub-frame of the second emission frame, light from the red LEDs 410 on the array 400 is projected onto the (n+1)th row, where the red LEDs 410 correspond to the red sub-pixels of the pixel locations on the (n+1)th row. As shown in the right diagram, light from the green LEDs 420 is projected onto the nth row, and light from the blue LEDs 430 is projected onto the (n-1)th row. Light emitted from the green LEDs 420 corresponds to the green sub-pixels of the pixel locations on the nth row, and light emitted from the blue LEDs 430 corresponds to the blue sub-pixels of the pixel locations on the (n-1)th row.
FIG. 7C shows the projection of light onto the image field 530 during a third emission frame. As shown in the left diagram, during the first sub-frame of the third emission frame, light from the red LEDs 410 is projected onto the (n+2)th row. During the second sub-frame of the third emission frame, light from the green LEDs 420 is projected onto the (n+1)th row, and light from the blue LEDs 430 is projected onto the nth row.
Because each emission frame lasts only a short time (e.g., 345 ns), the transition from one emission frame to the next is indistinguishable to the human eye. Over the three consecutive emission frames shown in FIGS. 7A-7C, light emitted from the red LEDs 410, the green LEDs 420, and the blue LEDs 430 is projected onto the nth row. Light emitted from the red LEDs 410 during the first sub-frame of the first emission frame, light emitted from the green LEDs 420 during the second sub-frame of the second emission frame, and light emitted from the blue LEDs 430 during the second sub-frame of the third emission frame together appear as one row of pixels on the nth row of the image field. Although the different colors of light are projected at different times, each pixel appears as a single color, rather than as three different sub-pixels each appearing in a different emission frame, because of blurring and spatial integration of light by the human eye.
Although not shown, in a fourth emission frame following the three emission frames shown in FIGS. 7A-7C, light from the red LEDs 410 is projected onto row n+3 during the first sub-frame, light from the green LEDs 420 is projected onto row n+2 during the second sub-frame, and light from the blue LEDs 430 is projected onto row n+1 during the second sub-frame. Light emitted from the red LEDs 410 during the first sub-frame of the second emission frame, light emitted from the green LEDs 420 during the second sub-frame of the third emission frame, and light emitted from the blue LEDs 430 during the second sub-frame of the fourth emission frame appear as one row of pixels on the (n+1)th row.
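The way a single image row accumulates all three colors across consecutive emission frames can be checked with a short simulation of the two-sub-frame scheme. The row offsets follow FIGS. 7A-7C (red onto row n, green onto row n-1, blue onto row n-2, with the whole pattern stepping down one row per frame); the function name and indexing are illustrative:

```python
def colors_hitting_row(target_row, base_row, num_frames):
    """Simulate the two-sub-frame scheme of FIGS. 7A-7C: in frame f the
    red row lights pixel row base_row + f (first sub-frame) while green
    and blue light rows base_row + f - 1 and base_row + f - 2 (second
    sub-frame). Returns the (frame, color) events landing on target_row."""
    events = []
    for f in range(num_frames):
        for color, row in (("red", base_row + f),
                           ("green", base_row + f - 1),
                           ("blue", base_row + f - 2)):
            if row == target_row:
                events.append((f, color))
    return events

# Row n (= base_row) collects red in frame 0, green in frame 1, and
# blue in frame 2, matching the description above.
hits = colors_hitting_row(10, 10, 4)
```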
FIGS. 8A-8C are conceptual diagrams illustrating the rows onto which the LEDs project light for an emission frame having three sub-frames A', B', C', according to an embodiment. FIG. 8A shows the projection of light onto the image field 530 during the first emission frame. The first emission frame is divided into three sub-frames A', B', C', and during each sub-frame a different row of LEDs is turned on to emit light. As shown in the left diagram, the red LEDs 410 are on during the first sub-frame of the first emission frame and project onto the nth row, while the green LEDs 420 and the blue LEDs 430 are off. As shown in the middle diagram, the green LEDs 420 are on during the second sub-frame of the first emission frame and project onto the (n-1)th row, while the red LEDs 410 and the blue LEDs 430 are off. As shown in the right diagram, the blue LEDs 430 are on during the third sub-frame of the emission frame and project onto the (n-2)th row, while the red LEDs 410 and the green LEDs 420 are off.
FIG. 8B shows the projection of light onto the image field 530 during a second emission frame that immediately follows the first emission frame. The mirror 520 redirects the light emitted from the array 400 such that the light emitted from the red, green, and blue LEDs 410, 420, 430 is projected onto rows that are offset by one row relative to FIG. 8A. As shown in the left diagram, the red LEDs 410 are on during the first sub-frame of the second emission frame and project onto the (n+1)th row, while the green LEDs 420 and the blue LEDs 430 are off. As shown in the middle diagram, the green LEDs 420 are on during the second sub-frame of the second emission frame and project onto the nth row, while the red LEDs 410 and the blue LEDs 430 are off. As shown in the right diagram, the blue LEDs 430 are on during the third sub-frame of the second emission frame and project onto the (n-1)th row, while the red LEDs 410 and the green LEDs 420 are off.
FIG. 8C shows the projection of light onto the image field 530 during a third emission frame immediately following the second emission frame. As shown in the left diagram, the red LEDs 410 are on during the first sub-frame of the third emission frame and project onto the (n+2)th row, while the green LEDs 420 and the blue LEDs 430 are off. As shown in the middle diagram, the green LEDs 420 are on during the second sub-frame of the third emission frame and project onto the (n+1)th row, while the red LEDs 410 and the blue LEDs 430 are off. As shown in the right diagram, the blue LEDs 430 are on during the third sub-frame of the third emission frame and project onto the nth row, while the red LEDs 410 and the green LEDs 420 are off.
Light emitted from the red LEDs 410 during the first sub-frame of the first emission frame, light emitted from the green LEDs 420 during the second sub-frame of the second emission frame, and light emitted from the blue LEDs 430 during the third sub-frame of the third emission frame together appear as one row of pixels on the nth row of the image field.
Example method of operating a display device
FIG. 9 is a flowchart depicting a process of operating a display device, according to an embodiment. The display device is configured to operate 910 a first LED of a first color during a first sub-frame of an emission frame to emit light from the first LED. The display device is configured to direct 920 the light emitted from the first LED onto a plurality of pixel locations of an image field. The light emitted from the first LED may be directed to a rotating mirror that projects the light onto the plurality of pixel locations. The plurality of pixel locations may correspond to a row of pixels on the image field. While the first LED emits light during the first sub-frame, the display device is configured to disable 930 a second LED of a second color during the first sub-frame of the emission frame.
The display device is configured to operate 940 the second LED during a second sub-frame of the emission frame. The display device is configured to direct 950 the light emitted from the second LED onto the plurality of pixel locations of the image field. While the second LED emits light during the second sub-frame, the display device is configured to disable 960 the first LED during the second sub-frame of the emission frame.
In some embodiments, the display device includes a third LED of a third color. The display device may operate the third LED during the second sub-frame of the emission frame such that the second LED and the third LED emit light during the second sub-frame.
In other embodiments, the display device may operate the third LED during a third sub-frame of the emission frame. The display device may be configured to disable the first LED and the second LED during the third sub-frame while the third LED emits light.
In some embodiments, the first LEDs are arranged along a first row of the array, the second LEDs are arranged along a second row parallel to the first row on one side of the first row, and the third LEDs are arranged along a third row parallel to the first and second rows on an opposite side of the first row.
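The enable/disable sequence described above can be sketched as a minimal controller loop. `set_row_enable` stands in for a hypothetical LED-driver interface that the patent does not specify, and the schedule tuples are illustrative:

```python
def drive_emission_frame(set_row_enable, scheme="two_subframe"):
    """Minimal sketch of the FIG. 9 flow: within one emission frame,
    enable one color subset per sub-frame and disable the rest.
    `set_row_enable(color, on)` is a stand-in for the real driver."""
    if scheme == "two_subframe":
        schedule = [("red",), ("green", "blue")]      # as in FIG. 6A
    else:
        schedule = [("red",), ("green",), ("blue",)]  # as in FIG. 6B
    for subframe in schedule:
        for color in ("red", "green", "blue"):
            set_row_enable(color, color in subframe)
        # ... the mirror holds its orientation while light lands on the
        # corresponding rows of the image field ...
    return schedule

log = []
drive_emission_frame(lambda color, on: log.append((color, on)))
```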
The language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based thereupon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.

Claims (20)

1. A display device, comprising:
a plurality of first Light Emitting Diodes (LEDs) configured to emit at least a first color of light onto a corresponding plurality of pixel locations on an image field;
a plurality of second LEDs configured to emit at least a second color of light onto the plurality of pixel locations on the image field; and
a controller configured to:
operate the first LED and disable the second LED during a first sub-frame of an emission frame, and
operate the second LED and disable the first LED during a second sub-frame of the emission frame.
2. The display device of claim 1, further comprising a mirror separate from the first and second LEDs, the mirror being operative to:
reflect light from the first LED onto the plurality of pixel locations during a first sub-frame of one of the emission frames, and
reflect light from the second LED onto the plurality of pixel locations during a second sub-frame of another one of the emission frames.
3. The display device of claim 2, wherein the plurality of pixel locations is a row of pixel locations in the image field.
4. The display device of claim 3, further comprising:
a plurality of third LEDs configured to emit light of a third color onto the plurality of pixel locations,
wherein the first LEDs are aligned along a first row, the second LEDs are aligned along a second row parallel to the first row on one side of the first row, and the third LEDs are aligned along a third row parallel to the first and second rows on an opposite side of the first row.
5. The display device of claim 4, wherein the third LED is operated by the controller during the first sub-frame and disabled by the controller during the second sub-frame.
6. The display device of claim 5, wherein the first row is a first distance from the second row and the first row is a second distance from the third row, and wherein the first distance is (n + 1/2) times the second distance, where n is 0 or an integer greater than 0.
7. The display device of claim 4, wherein the third LED is operated by the controller during a third sub-frame and is disabled during the first sub-frame and the second sub-frame.
8. The display device of claim 7, wherein the first row is a first distance from the second row and the first row is a second distance from the third row, and wherein the first distance is equal to the second distance.
9. A method, comprising:
operating a first LED of a first color during a first sub-frame of an emission frame to emit light from the first LED;
directing light emitted from said first LED onto a plurality of pixel locations of an image field;
disabling a second LED of a second color during the first sub-frame of the emission frame;
operating the second LED during a second sub-frame of the emission frame to emit light from the second LED;
directing light emitted from the second LED onto the plurality of pixel locations of the image field; and
disabling the first LED during the second sub-frame of the emission frame.
10. The method of claim 9, wherein directing light from the first LED comprises reflecting light from the first LED onto the plurality of pixel locations by a mirror during a first sub-frame of one of the emission frames, and
wherein directing light from the second LED comprises reflecting light from the second LED onto the plurality of pixel locations by the mirror during a second sub-frame of another of the emission frames.
11. The method of claim 10, wherein the plurality of pixel locations is a row of pixel locations in the image field.
12. The method of claim 11, further comprising:
operating a plurality of third LEDs to emit light from the third LEDs, wherein the first LEDs are aligned along a first row, the second LEDs are aligned along a second row parallel to the first row on one side of the first row, and the third LEDs are aligned along a third row parallel to the first row and the second row on an opposite side of the first row.
13. The method of claim 12, wherein the third LED is operated during the first sub-frame of the emission frame and disabled during the second sub-frame of the emission frame.
14. The method of claim 13, wherein the first row is a first distance from the second row and the first row is a second distance from the third row, and wherein the first distance is equal to (n + 1/2) times the second distance, where n is 0 or an integer greater than 0.
15. The method of claim 12, wherein the third LED is operated during a third sub-frame and is disabled during the first sub-frame and the second sub-frame.
16. The method of claim 15, wherein the first row is a first distance from the second row and the first row is a second distance from the third row, and wherein the first distance is equal to the second distance.
17. An LED array, comprising:
a plurality of first LEDs arranged along a first row and configured to emit light of a first color onto a plurality of pixel locations on an image field;
a plurality of second LEDs arranged along a second row on one side of and spaced a first distance from the first row, the second LEDs configured to emit light of a second color onto the plurality of pixel locations on the image field; and
a plurality of third LEDs arranged along a third row on the other side of and spaced apart from the first row by a second distance, the third LEDs configured to emit light of a third color onto the plurality of pixel locations on the image field, the first distance corresponding to (n + 1/2) times the second distance, where n is 0 or an integer greater than 0.
18. The LED array of claim 17, wherein the second LED and the third LED are configured to be operated while the first LED is disabled during a first sub-frame of an emission frame.
19. The LED array of claim 18, wherein the first LED is configured to be operated while the second LED and the third LED are disabled during a second sub-frame of the emission frame.
20. The LED array of claim 17, wherein light emitted from the first, second, and third LEDs is reflected from a mirror onto the plurality of pixel locations.
Applications Claiming Priority (5)

- GR20190100204 — priority date 2019-05-09
- US 16/430,135 (published as US10861373B2), "Reducing peak current usage in light emitting diode array" — priority date 2019-05-09, filed 2019-06-03
- PCT/US2020/031371 (published as WO2020227242A1), "Reducing peak current usage in light emitting diode array" — priority date 2019-05-09, filed 2020-05-04

Publications (1)

- CN113826380A — published 2021-12-21

Family

ID=73046519

Family Applications (1)

Application NumberTitlePriority DateFiling Date
CN202080034749.1APendingCN113826380A (en)2019-05-092020-05-04Reducing peak current usage in light emitting diode arrays

Country Status (6)

CountryLink
US (1)US10861373B2 (en)
EP (1)EP3932052A1 (en)
JP (1)JP2022531073A (en)
KR (1)KR20220007883A (en)
CN (1)CN113826380A (en)
WO (1)WO2020227242A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
US11308848B2 (en)* | 2020-08-05 | 2022-04-19 | Jade Bird Display (shanghai) Limited | Scan needle and scan display system including same
US11317065B2 (en) | 2020-08-05 | 2022-04-26 | Jade Bird Display (shanghai) Limited | Scan needle and scan display system including same
US11270632B2 (en)* | 2020-08-05 | 2022-03-08 | Jade Bird Display (shanghai) Limited | Scan needle and scan display system including same
US11527572B2 (en) | 2020-08-05 | 2022-12-13 | Jade Bird Display (shanghai) Limited | Scan needle and scan display system including same
US11308862B2 (en) | 2020-08-05 | 2022-04-19 | Jade Bird Display (shanghai) Limited | Scan needle and scan display system including same
US11270620B2 (en) | 2020-08-05 | 2022-03-08 | Jade Bird Display (shanghai) Limited | Scan needle and scan display system including same
JP2023087311A (en)* | 2021-12-13 | 2023-06-23 | Canon Inc. | Image display device and method for controlling image display device

Citations (7)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
US20030210213A1 (en)* | 2002-05-07 | 2003-11-13 | Jiahn-Chang Wu | Projector with array LED matrix light source
US20050168710A1 (en)* | 2004-02-04 | 2005-08-04 | Seiko Epson Corporation | Projection-type display apparatus and control method for projection-type display apparatus as well as control program for projection-type display apparatus
US20060044297A1 (en)* | 2001-05-28 | 2006-03-02 | Canon Kabushiki Kaisha | Image display apparatus
US20090273288A1 (en)* | 2008-03-12 | 2009-11-05 | Freescale Semiconductor, Inc. | LED driver with dynamic power management
US20100026203A1 (en)* | 2008-07-31 | 2010-02-04 | Freescale Semiconductor, Inc. | LED driver with frame-based dynamic power management
US20150302805A1 (en)* | 2012-12-21 | 2015-10-22 | Sharp Kabushiki Kaisha | Liquid crystal display device
US20160181337A1 (en)* | 2014-12-19 | 2016-06-23 | Samsung Display Co., Ltd. | Organic light emitting display

Family Cites Families (11)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
WO2001069584A1 (en) | 2000-03-14 | 2001-09-20 | Mitsubishi Denki Kabushiki Kaisha | Image display and image displaying method
JP2005025160A (en)* | 2003-06-13 | 2005-01-27 | Seiko Epson Corp | Method for driving spatial light modulator and projector
JP2007133095A (en)* | 2005-11-09 | 2007-05-31 | Sharp Corp | Display device and manufacturing method thereof
US8643630B2 (en)* | 2006-12-29 | 2014-02-04 | Texas Instruments Incorporated | Method and system for generating a display
US8542167B2 (en)* | 2007-08-01 | 2013-09-24 | Himax Technologies Limited | Projection type display apparatus
US20090128786A1 (en)* | 2007-11-19 | 2009-05-21 | Texas Instruments Incorporated | Imaging system with a microelectromechanical light valve
US8400391B2 (en)* | 2008-01-10 | 2013-03-19 | Honeywell International Inc. | Method and system for improving dimming performance in a field sequential color display device
JP5417948B2 (en)* | 2009-04-03 | 2014-02-19 | Seiko Epson Corporation | Head-mounted display device
US8810561B2 (en)* | 2011-05-02 | 2014-08-19 | Microvision, Inc. | Dual laser drive method, apparatus, and system
JP6075590B2 (en)* | 2012-02-28 | 2017-02-08 | Nippon Seiki Co., Ltd. | Vehicle display device
WO2016072368A1 (en)* | 2014-11-06 | 2016-05-12 | Sharp Corporation | Lighting device and display device

Also Published As

Publication number | Publication date
KR20220007883A (en) | 2022-01-19
US20200357331A1 (en) | 2020-11-12
WO2020227242A1 (en) | 2020-11-12
EP3932052A1 (en) | 2022-01-05
WO2020227242A8 (en) | 2021-10-07
JP2022531073A (en) | 2022-07-06
US10861373B2 (en) | 2020-12-08

Similar Documents

Publication | Title
US11450250B1 (en) | Scanning waveguide display
US10861373B2 (en) | Reducing peak current usage in light emitting diode array
US10951867B2 (en) | Light emitter architecture for scanning display device
CN112292628B (en) | Single chip super bright LED array for waveguide display
US11598965B2 (en) | Super-resolution scanning display for near-eye displays
US20190019448A1 (en) | Redundant microleds of multiple rows for compensation of defective microled
US10690919B1 (en) | Superluminous LED array for waveguide display
US10705341B1 (en) | Temporally incoherent and spatially coherent source for waveguide displays
US11237397B1 (en) | Multi-line scanning display for near-eye displays

Legal Events

Code | Title | Description
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
CB02 | Change of applicant information | Address after: California, USA; Applicant after: Yuan Platform Technology Co., Ltd.; Address before: California, USA; Applicant before: Facebook Technologies, LLC
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 2021-12-21

