CN108375840B - Light field display unit based on small array image source and three-dimensional near-to-eye display device using light field display unit - Google Patents


Info

Publication number: CN108375840B
Authority: CN (China)
Prior art keywords: light, microstructure, micro, light field, display unit
Legal status: Active (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number: CN201810154739.6A
Other languages: Chinese (zh)
Other versions: CN108375840A
Inventors: 姚成, 程德文
Current Assignee: Beijing NED+AR Display Technology Co., Ltd. (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original Assignee: Beijing Ned+ar Display Technology Co., Ltd.
Priority and filing date: 2018-02-23 (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority: CN201810154739.6A

Abstract

The invention discloses a light field display unit suitable for optical see-through display, in particular a light field display unit based on a small array image source. The light field display unit comprises a small array image source formed on a substrate and a microstructure array device arranged near the image source. The small array image source and the microstructure array device are aligned in a predetermined manner, so that the line (4) connecting the center of each micro-display area unit (201) on the image source with the center of the corresponding microstructure unit (301) of the microstructure array device intersects at the center of the exit pupil (5) of the display unit. In a head-mounted near-eye display device built from this light field display unit, ambient light passes directly through to the human eye in see-through fashion, while the virtual image information is modulated by the microstructure array device into a spatial light field that reproduces the way light enters the eye under natural conditions. This matches the observation habits of the human eye, effectively relieves the convergence-focus conflict in virtual-real fusion, and is particularly suited to the requirements of augmented reality applications.

Description

Light field display unit based on small array image source and three-dimensional near-to-eye display device using light field display unit
Technical Field
The invention relates to the field of light field display, and in particular to an optical see-through display unit based on a small array image source, which is especially useful for realizing optical see-through true three-dimensional display in a head-mounted near-eye display device.
Background
As a near-eye display device, the head-mounted display is an important part of the human-computer interaction interface in virtual reality and augmented reality. Because it is mobile, portable and private, it has become the preferred display device for personal consumer virtual reality and augmented reality. In addition, since a head-mounted display can produce stereoscopic vision by exploiting binocular vision, it is also widely used to display three-dimensional information.
For augmented reality applications, a head-mounted display must allow the external environment to be seen while the virtual information is observed, so as to superimpose the virtual on the real, enhance the sense of reality and improve interaction efficiency. The see-through display schemes currently in use are video see-through and optical see-through. The video see-through type captures an image of the external environment with a camera in real time and presents it in front of the eyes, which inevitably introduces color distortion, reduced sharpness and noticeable latency, so the optical see-through scheme is generally preferred. The optical see-through displays commonly studied are mainly of the transflective (beam-splitter) type and the waveguide type.
Realistic virtual-real superposition naturally requires three-dimensional display of the virtual image. A traditional head-mounted display produces the stereoscopic effect using binocular parallax, as shown in Fig. 1: the same object point is shown at different positions on the screens corresponding to the two eyes and imaged at a distant position, so that when it is observed the two visual axes form an included angle, i.e. a vergence state, and the object point is perceived at a specific depth. However, in the transflective or waveguide scheme, the focal length and object distance are fixed for every point on the screen according to the principle of lens imaging, so the actual optical image plane lies at a fixed depth. Binocular convergence can change by adjusting the included angle of the visual axes, but an image plane at a fixed position forces a unique focusing state of the eyes, so convergence and focusing usually disagree and comfortable three-dimensional display cannot be achieved. The larger the difference between convergence and focusing, the stronger the discomfort. This is especially true for the optical see-through type: because the eyes observe the virtual image and the real ambient light simultaneously, the convergence-focus conflict greatly affects the virtual-real fusion effect. When the eyes view an object of the real environment together with an object of the virtual scene, the real object serves as a reference and makes the discomfort caused by the convergence-focus difference more obvious, so it is difficult for the virtual image to appear vivid.
One effective way to alleviate this problem is to project virtual images at different distances from the eye, simulating the way light enters the eye in a real environment; this can be called true three-dimensional display. Light field display is one of the main approaches to true three-dimensional display. Most current light field display schemes use a microstructure array, in the form of an aperture array, a point light source array, a micro-lens array, and so on. These devices use the basic principle of pinhole imaging or single-lens imaging: the spatial correspondence between the elements of the microstructure array and the pixels of the display device constructs a set of rays with directional information, i.e. a light field. The aperture (pinhole) array scheme offers high sharpness and is easy to add to a see-through display path for augmented reality, but its brightness loss is significant; the point light source array scheme is suitable for transmissive display but is more complex for full-color display; the micro-lens array scheme achieves higher brightness, but there is currently no suitable way to obtain a high-quality see-through display for augmented reality with it. In general, therefore, no near-eye display technology combines true three-dimensional display and see-through capability well.
Disclosure of Invention
In view of the above, the present invention provides a light field display unit based on a small array image source for optical see-through display. During observation, the light emitted by the small array image source is modulated by the microstructure array so as to restore the spatial distribution of the light that an object point would emit under natural conditions. This simulates the natural state of the eye when observing a real environment, conforms to the observation habits of the human eye, effectively relieves the convergence-focus conflict in virtual-real fusion, and meets the requirements of augmented reality applications.
The light field display unit comprises a small array image source and a microstructure array device. The small array image source is formed on a substrate and emits light containing image information toward the microstructure array device; the microstructure array device is arranged near the small array image source and receives and modulates that light. The small array image source and the microstructure array device are aligned in a predetermined manner, such that the line connecting the center of each micro-display area unit on the image source with the center of the corresponding microstructure unit of the microstructure array device intersects the center of the exit pupil of the display unit.
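A minimal sketch of this alignment rule, not taken from the patent itself: with the exit pupil center as origin, each microstructure unit center must lie on the line joining the corresponding micro-display area unit center and the exit pupil center, which by similar triangles is just a scaling of the unit centers. The function name, coordinate layout, gap and eye-relief values below are illustrative assumptions.

    # Python sketch (illustrative, assumed parameter values)
    def microstructure_centers(display_centers, gap_mm, eye_relief_mm):
        """Project micro-display area unit centers toward the exit pupil center.

        display_centers: list of (x, y) unit centers on the image-source plane, in mm.
        gap_mm:          spacing between the image source and the microstructure array.
        eye_relief_mm:   distance from the microstructure array to the exit pupil.
        Returns the (x, y) center of each corresponding microstructure unit.
        """
        scale = eye_relief_mm / (eye_relief_mm + gap_mm)   # similar triangles
        return [(x * scale, y * scale) for (x, y) in display_centers]

    # Example: a 3 x 3 rectangular arrangement with 2 mm pitch, 3 mm gap, 20 mm eye relief.
    pitch = 2.0
    centers = [(i * pitch, j * pitch) for i in (-1, 0, 1) for j in (-1, 0, 1)]
    print(microstructure_centers(centers, gap_mm=3.0, eye_relief_mm=20.0))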
Specifically, the small array image source is formed on a substrate; its pixel-covered, non-transparent regions serve as micro-display area units, which are arranged in a predetermined manner with gaps between them. The microstructure array device comprises a substrate made of a transparent dielectric material and a plurality of microstructure units corresponding to the micro-display area units, with gaps between the microstructure units that allow external ambient light to pass through.
The substrate of the small array image source may be made of a transparent dielectric material, in which case the gaps contain no pixels and remain transparent so that external ambient light can pass through them; or of a non-transparent dielectric material, which blocks external ambient light from reaching the microstructure array device.
Each microstructure unit modulates, by refraction or filtering, the light emitted by each effective pixel of the corresponding micro-display area unit into a parallel or nearly parallel spatial beam. Each pixel corresponds to a spatial beam in exactly one direction, and the spatial beams formed by all the pixels constitute a spatial light field distributed between the microstructure array device and the exit pupil.
The microstructure unit may be any one, or a combination, of the following: a lens with optical power, a holographic lens, or an aperture stop.
The transparent substrate of the microstructure array device is provided with a first surface and a second surface which are opposite, and the microstructure units are distributed on the first surface of the transparent substrate.
Preferably, an opaque ring-shaped diaphragm is included at the edge portion of the microstructure unit or a portion corresponding to the edge of the microstructure unit to shield stray light.
The invention also relates to a near-eye display device based on the light field display unit. It further comprises an image rendering module that preprocesses the image to be displayed into image information and inputs it to the small array image source by wired or wireless transmission; after modulation by the microstructure array device, a complete picture suitable for viewing by the human eye is formed at the exit pupil position.
The specific preprocessing steps for rendering the image may include:
generating sub-images according to the predetermined alignment, where the number of sub-images equals the number of micro-display area units and microstructure units, and the generation of each sub-image conforms to a camera model.
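A minimal sketch of this sub-image generation step, under assumed names: each (micro-display area unit, microstructure unit) pair is treated as a pinhole camera whose center of projection is the microstructure unit center, so each pixel samples one beamlet direction of the light field. The callback sample_scene, the coordinate layout and all parameters are illustrative assumptions, not definitions from the patent.

    # Python sketch (illustrative)
    import numpy as np

    def render_sub_image(unit_center, micro_center, pixel_pitch, n_px, sample_scene):
        """Return an (n_px, n_px, 3) sub-image for one unit pair.

        unit_center:  (x, y, z) center of the micro-display area unit, in mm.
        micro_center: (x, y, z) center of the corresponding microstructure unit, in mm.
        sample_scene: callback (origin, direction) -> RGB, supplied by the application.
        """
        unit_center = np.asarray(unit_center, dtype=float)
        micro_center = np.asarray(micro_center, dtype=float)
        img = np.zeros((n_px, n_px, 3), dtype=np.float32)
        half = (n_px - 1) / 2.0
        for r in range(n_px):
            for c in range(n_px):
                pixel_pos = unit_center + np.array([(c - half) * pixel_pitch,
                                                    (r - half) * pixel_pitch,
                                                    0.0])
                direction = micro_center - pixel_pos       # beamlet direction through the
                direction /= np.linalg.norm(direction)     # microstructure unit center
                img[r, c] = sample_scene(pixel_pos, direction)
        return img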
With the light field display unit and near-eye display device described above, because the relative position of each micro-display area unit and its microstructure unit is determined, the spatial beams formed by modulating the light emitted by each pixel through the microstructure array are arranged in a controllable manner. These thin spatial beams carry the color and intensity information of the pixels from which they originate, and can describe the position and direction distribution of rays in space and thus restore a spatial light field. The intersection of the backward extensions of the spatial beams that carry the same image information is an image point formed by the system, while the transparent gaps allow external ambient light to reach the exit pupil area within a predefined range, thereby simultaneously realizing see-through display.
Drawings
FIG. 1 is a schematic view showing the principle of the convergence-focus contradiction
FIG. 2 is a schematic diagram of a system according to an embodiment of the present invention
FIG. 3 illustrates the optical path relationship among a small array image source, a microstructure array device and the exit pupil according to the present invention
FIG. 4 is a schematic diagram of a small array image source configuration according to an embodiment of the present invention
FIG. 5 is a schematic structural diagram of a microstructure array device according to an embodiment of the present invention
FIG. 6 shows the propagation path of real ambient light according to an embodiment of the present invention
FIG. 7 shows a virtual image information ray propagation path according to an embodiment of the invention
FIG. 8 shows the propagation path of stray light according to an embodiment of the present invention
FIG. 9 is a schematic diagram of an embodiment of the invention for eliminating stray light using an annular diaphragm
FIG. 10 is a schematic diagram of an optical path of a microstructure device based on a pinhole array according to another embodiment of the present invention
FIG. 11 is a schematic diagram showing the optical path of a microstructure device containing optical power in the invention
FIG. 12 shows a schematic diagram of a near-eye display device for virtual-real fusion applications according to the present invention
Detailed Description
The present invention will now be described in further detail with reference to the accompanying drawings, in which specific embodiments are shown. It should be understood by those skilled in the art that specific names, terms, etc. used in the following descriptions do not limit the technical aspects of the present invention, and in the following descriptions, the same reference numerals will be used for the same components for convenience of description.
In the following embodiments, the expressions "first" and "second" and the like may modify various constituent elements of the present invention, but do not limit the corresponding constituent elements. For example, the expressions do not limit the order or importance of the corresponding constituent elements, etc. Expressions may be used to distinguish one constituent element from another constituent element. For example, the first user device and the second user device are all user devices and represent different user devices. For example, a first constituent element may be named a second constituent element without departing from the spirit and scope of the present disclosure. Likewise, even the second constituent element may be named the first constituent element.
The light field display unit based on a small array image source according to the present invention can be implemented as a near-eye display device as shown in Fig. 2. There, the near-eye display device is worn in the form of glasses; the device includes a head-mounted part worn on the head, for example a frame structure including glasses legs 12, and an image rendering module 13. Although the image rendering module is shown in Fig. 2 as externally connected by cables, those skilled in the art will understand that it can also be placed inside the frame structure or connected to the display part by any other means capable of carrying signals. Likewise, the head-mounted part is not limited to a glasses-type structure; any structure that allows the light field display unit of the present invention to be worn in front of the user's eyes can be regarded as the head-mounted part of the present invention.
As shown in Fig. 2, the light field display units 1 of the present invention, each comprising a small array image source device based on a transparent medium and a microstructure array device based on a transparent medium, are assembled on a glasses-type structure to form a binocular pair. Because of the transparent substrates, the light field display unit provides high transmittance for ambient light and is therefore well suited to virtual-real fusion display for augmented reality. Specifically, as shown in Fig. 3, on the small array image source 2 each area that is covered with pixels and provides image display is a micro-display area unit 201; each micro-display area unit 201 contains a certain number of pixel display points, and a plurality of micro-display area units 201 arranged according to a certain rule form the small array image source 2. A gap 202 between the units forms an area without pixel coverage, which remains a transparent medium substrate and allows light to pass straight through. On the microstructure array device 3, the microstructure units 301 are arranged to correspond to the micro-display areas, and the region 302 without microstructure units remains a transparent medium that allows light to pass straight through. The small array image source 2 and the microstructure array device 3 are aligned in a predetermined manner such that the lines connecting the center pixel of each micro-display area unit 201 and the center of the light-transmitting aperture of each microstructure unit intersect at the center of the exit pupil. The image information displayed on the small array image source 2 is generated by the image rendering module 13; the small array image source 2 lies in the imaging region of the microstructure array device 3, and the image light from each area enters the human eye 6 at the exit pupil 5 via the corresponding microstructure unit 301.
Fig. 4 shows the front structure of a small array image source 2, on which the micro-display area units 201 are arranged according to a certain rule. Typical arrangements include, but are not limited to, a rectangular or a hexagonal arrangement; that is, a plurality of micro-display area units 201 are distributed over the light-emitting surface of the image source 2 and together form a rectangular or hexagonal array image display area, and the gaps 202 between the units are transparent dielectric substrate. The microstructure units 301 on the microstructure array device 3 follow the same arrangement rule as the micro-display area units 201, and the gap 302 regions between them likewise remain transparent dielectric substrate. The image source 2 also includes hardware circuit portions (not shown), such as drive lines, data transmitters and data processors, typically located on or within the substrate.
The micro-display area units 201 shown in Fig. 4 are arranged periodically on the transparent medium substrate. The arrangement depicted in the figure is rectangular, but it is not limited thereto and may also be hexagonal or checkerboard to improve the uniformity of the spatial distribution of the light field. The small array image source 2 is preferably fabricated by covering a transparent dielectric substrate with OLED/LED light-emitting dots; the substrate may be, but is not limited to, a resin or glass material. The portions covered by the OLED/LED light-emitting dots are opaque areas, which serve as the micro-display area units 201.
The small array image source 2 of the present invention can be implemented by customizing an OLED micro-screen. Generally, ITO glass is used as the base material, and the anodes and cathodes 203 are laid out as two groups of gratings arranged orthogonally. Compared with prior-art micro-screens, the resolution required of each unit is small (for example 50 x 50); a higher pixel density (PPI) helps obtain a more refined display, but makes the small array image source harder to manufacture. Each pixel has a complete OLED structure and is self-luminous; the emitted light can be monochromatic or colored, and the pixel is opaque. To maintain the see-through capability and the ability to reconstruct the light field, each micro-display area unit 201 should cover a size no greater than 1 mm (taken as the side length of a square). The small array image source 2 can also be implemented in a similar way with micro-LEDs as the light-emitting units, and the arrangement of the cathodes and anodes 203 may be other than orthogonal, such as hexagonal or checkerboard. The display of the pixels is driven by circuitry 204 surrounding the array; the data source is the image rendering module 13, which transmits the data over a wired or wireless connection. The circuitry 204 may be opaque, because it does not take part in the display, but it can be made transparent with a suitable process to enlarge the visible range.
Fig. 5 shows a schematic structural view of the imaging-function region of the microstructure array device 3 corresponding to Fig. 4; structure not related to the imaging function is omitted. The substrate of the device 3 is likewise a transparent dielectric material, including but not limited to resin or glass; it may be a parallel flat plate or a shape with refractive power, and the arrangement rule of the microstructure units 301 corresponds to that of the micro-display area units 201. When the microstructure unit 301 is a device with optical power, such as a micro-lens or a holographic lens, the light-emitting surface of the micro-display area unit 201 should be located at or near the front focal plane of the corresponding microstructure unit 301, so that the light emitted by each pixel is modulated by the microstructure unit into a parallel or nearly parallel beam. To improve image sharpness, the aperture of the beam formed when the light from each pixel passes through the microstructure unit should be less than 1 mm; in other words, it is preferable that the micro-display area unit 201 emits light toward the microstructure unit 301 in a directional manner.
Fig. 6 shows how the range of the exit pupil 5 is determined. The extent of the exit pupil 5 is the region within which the ambient light 7 reaching the eye passes only through the gap 202 areas of the small array image source 2 and the gap 302 areas of the microstructure array device 3; the overlap of all such extents defines the region within which the pupil can move (the eyebox), and the position of maximum aperture is the position of the exit pupil 5. Geometrically, take the lines connecting the lower edge of each micro-display area unit 201 with the lower edge of the corresponding microstructure unit 301; the intersection of all these lines defines the upper edge of the exit pupil 5. Take the lines connecting the upper edge of each micro-display area unit 201 with the upper edge of the corresponding microstructure unit 301; the intersection of all these lines defines the lower edge of the exit pupil 5. Within this range, the ambient light 7 seen by the human eye 6 is transmitted only through the gap 202 region of the small array image source 2 and the gap 302 region of the microstructure array device 3. For a near-eye display device, the exit pupil 5 is about 20 mm from the outermost surface of the microstructure array device 3 nearest to it (the eye relief).
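A minimal one-dimensional (vertical) sketch of this eyebox estimate, not from the patent: each edge line is extended from the image-source plane through the microstructure plane to the eye-relief plane, and the most restrictive crossings bound the exit pupil. The function name, the plane layout and all numeric inputs are illustrative assumptions.

    # Python sketch (illustrative, 1D vertical cut)
    def eyebox_extent(unit_edges, micro_edges, gap_mm, eye_relief_mm):
        """unit_edges / micro_edges: lists of (lower, upper) y-edges per unit pair, in mm,
        on the image-source plane and microstructure plane respectively.
        Returns the (lower, upper) y-extent of the exit pupil, or None if it is empty."""
        upper_bounds, lower_bounds = [], []
        for (u_lo, u_hi), (m_lo, m_hi) in zip(unit_edges, micro_edges):
            slope_lo = (m_lo - u_lo) / gap_mm            # lower-edge line -> upper bound
            upper_bounds.append(m_lo + slope_lo * eye_relief_mm)
            slope_hi = (m_hi - u_hi) / gap_mm            # upper-edge line -> lower bound
            lower_bounds.append(m_hi + slope_hi * eye_relief_mm)
        lower, upper = max(lower_bounds), min(upper_bounds)
        return (lower, upper) if upper > lower else None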
Although the display device shown in Fig. 2 is binocular, with the left and right eyes each corresponding to one light field display unit 1, those skilled in the art will appreciate that the device may also be implemented as a monocular display, and can likewise be adapted to non-see-through immersive display (virtual reality) by removing the transparency of the substrate and shielding real ambient light. Similarly, the light field display unit 1 of the present invention is applicable not only to near-eye display but also to large display sizes such as window glass, windshields, display cases and other mobile devices; in that case the required pixel density (PPI) of the array image source 2 is lower, the aperture of the corresponding micro-lens is larger, and the distance (eye relief) from the exit pupil 5 to the outermost surface of the microstructure array device 3 nearest to it is set toward infinity. Although a micro-display area unit 201 containing only a single pixel can still produce a display in the large-size, low-PPI case, it is desirable to keep at least 2 x 2 pixels in each micro-display area unit 201 in order to realize the light field display effect.
Fig. 7 illustrates the display of virtual information from the rendering module. After modulation by the corresponding microstructure unit 301, the light emitted by each pixel of the micro-display area unit 201 is distributed between the microstructure array 3 and the exit pupil 5 as a parallel or nearly parallel beam, and the position and direction of each beam are determined by the position of the pixel and the position of the center of the corresponding microstructure, so that each virtual image point can be defined by beams at different positions and directions within the light field 8. For the beams corresponding to pixels rendered from the same point of the image information, the backward extensions of their chief rays intersect at one point in image space, i.e. the virtual image is formed there. Depending on the angles of the rays, the system can generate a virtual image point 901 at a short distance or a virtual image point 902 at a long distance.
According to the distribution of the spatial light field, i.e. the correspondence between the pixels of the small array image source and the spatial beamlets (the camera model), the image rendering module assigns to each pixel of the small array image source the gray value (for monochrome display) or the value of each color channel (e.g. RGB) at the intersection of the corresponding beamlet with the surface of the three-dimensional virtual object, thereby generating the sub-image displayed by each micro-display area unit. Parameters of the camera model such as the field of view, the direction of the visual axis and the viewpoint position are determined by the size of the micro-display area unit 201, the separation between the small array image source 2 and the microstructure array device 3, and the distribution of the center connecting lines 4. For example, in common OpenGL each micro-display-area-unit / microstructure-unit pair can be treated as a view frustum, i.e. assigned a projection matrix; the viewpoint of each frustum is at the geometric center of the microstructure unit, and its viewing axis lies on the line connecting the geometric centers of the micro-display area unit and the microstructure unit and passes through the center of the exit pupil. The frustum uses the micro-display area unit as its near clipping plane, and the far clipping plane can be placed at any distant position.
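A minimal sketch of such a per-unit projection, under assumed parameter names: the eye point is the microstructure unit center, the near plane is the micro-display area unit at the image-source/array spacing, and the matrix follows the standard OpenGL glFrustum convention. The unit half-size, spacing and far-plane values below are illustrative assumptions.

    # Python sketch (illustrative, glFrustum convention)
    import numpy as np

    def frustum(left, right, bottom, top, near, far):
        """Perspective projection matrix in the OpenGL glFrustum convention."""
        m = np.zeros((4, 4))
        m[0, 0] = 2.0 * near / (right - left)
        m[1, 1] = 2.0 * near / (top - bottom)
        m[0, 2] = (right + left) / (right - left)
        m[1, 2] = (top + bottom) / (top - bottom)
        m[2, 2] = -(far + near) / (far - near)
        m[2, 3] = -2.0 * far * near / (far - near)
        m[3, 2] = -1.0
        return m

    def unit_frustum(unit_half_size_mm, gap_mm, far_mm=1.0e5):
        """Frustum for one micro-display / microstructure pair.

        Eye point: geometric center of the microstructure unit.
        Near plane: the micro-display area unit, at distance gap_mm.
        Because the center connecting line passes through both centers, the frustum is
        taken here as symmetric about the viewing axis; an asymmetric frustum can be
        built by passing different left/right/bottom/top values to frustum().
        """
        s = unit_half_size_mm
        return frustum(-s, s, -s, s, gap_mm, far_mm)

    # Example: 0.5 mm half-size units, 3 mm spacing between image source and array.
    print(unit_frustum(0.5, 3.0))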
At the same time, the image rendering module can also apply a further intensity modulation to the sub-image of each micro-display area unit 201. In general, the farther a pixel in a micro-display area unit 201 is from the central pixel, the more easily its rays coincide, when refocused on the retina, with rays emitted from neighboring micro-display area units 201. The brightness of such pixels can therefore be reduced according to their distance from the central pixel of the unit, in order to suppress the visible brightness increase caused by ray overlap and improve the uniformity of the picture.
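A minimal sketch of this per-sub-image weighting: brightness is reduced with distance from the central pixel of each micro-display area unit. The Gaussian form and the sigma value are illustrative assumptions; the text only states that brightness should decrease with that distance.

    # Python sketch (illustrative falloff profile)
    import numpy as np

    def apply_radial_falloff(sub_image, sigma=0.6):
        """Weight a sub-image so brightness falls off away from the central pixel.

        sub_image: (H, W) or (H, W, C) array of pixel intensities.
        sigma:     width of the assumed Gaussian falloff, in units of the unit half-size.
        """
        h, w = sub_image.shape[:2]
        y, x = np.mgrid[0:h, 0:w]
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        # Radial distance from the central pixel, normalized to the unit half-size.
        r = np.hypot((y - cy) / max(cy, 1.0), (x - cx) / max(cx, 1.0))
        weight = np.exp(-(r ** 2) / (2.0 * sigma ** 2))
        if sub_image.ndim == 3:
            weight = weight[..., None]
        return sub_image * weight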
In the present invention, the intensity of the ambient light 7 and the intensity of the image-information light are a pair of conflicting quantities. With the image source brightness fixed, the two are inversely related; their ratio is determined by the area duty ratio of the gap region 302 in the microstructure array device 3 and can be adjusted by changing that duty ratio. For a better display effect, it is preferable that the area of the region A defined by the outermost micro-display area units 201 on the small array image source 2 is larger than the area of the region B defined by the outermost microstructures on the microstructure array device 3 (i.e. area A is larger than area B), as shown in Figs. 4 and 5, and that the gap 202 fraction within region A is larger than the gap 302 fraction within region B.
Since the distance from the source of the ambient light 7 to the human eye 6 is large compared with the size of the microstructure array device 3, the ambient light 7 can be treated as approximately parallel incidence, and the absorption of light by the transparent media of the light field display unit 1 itself can be neglected; the transmittance of the incident light is then approximately equal to the area duty ratio of the gap region 302 in the microstructure array device 3. The intensity of the image information provided by the image rendering module can be adjusted by changing the light-transmitting aperture of each microstructure unit 301, or by changing the pixel brightness of the small array image source 2. To ensure display quality, the maximum dimension of each microstructure unit 301 should not exceed 1 mm in the near-eye display case. Since the pupil diameter of the human eye 6 is generally between 2 mm and 8 mm, the eye can at all times receive both the light emitted by the small array image source 2 and modulated by the microstructure array device 3 and the external ambient light passing through the light field display unit 1.
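A minimal sketch of the transmittance estimate stated above: for near-parallel ambient light, transmittance is approximately the area fraction of the transparent gap region 302 within one period of the microstructure array. Square apertures, the annular-stop term and the example dimensions are illustrative assumptions.

    # Python sketch (illustrative geometry)
    def ambient_transmittance(cell_pitch_mm, micro_aperture_mm, stop_ring_width_mm=0.0):
        """Approximate fraction of ambient light passing one period of the array.

        cell_pitch_mm:      period of the microstructure arrangement.
        micro_aperture_mm:  clear aperture (square side) of one microstructure unit.
        stop_ring_width_mm: width of an opaque annular stop around the unit, if present.
        """
        cell_area = cell_pitch_mm ** 2
        blocked = (micro_aperture_mm + 2.0 * stop_ring_width_mm) ** 2
        return max(0.0, (cell_area - blocked) / cell_area)

    # Example: 2 mm pitch, 0.8 mm aperture, 0.1 mm annular stop -> about 0.75.
    print(ambient_transmittance(2.0, 0.8, 0.1))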
The light field display unit 1 may introduce stray light because of various structural and alignment issues; Fig. 8 shows an analysis of the stray light that may be present in the system. To obtain a stray-light-free display without adding a stop, the microstructure units would need larger apertures, so that all light emitted by the micro-display area units 201 is modulated by the microstructure units 301 before entering the exit pupil 5. Under the constraints of ordinary manufacturing processes and display quality, however, especially for microstructure units 301 with optical power, the aperture is usually smaller than this ideal size. Part of the light emitted from a micro-display area unit 201 toward the microstructure array device 3 can therefore enter the exit pupil 5 through the gap area 302 outside the microstructure unit 301, forming stray light 10 that is received by the human eye 6. The stray light 10 is stronger at positions farther from the center of the exit pupil 5. A preferred remedy is to adjust the structural parameters of the light field display unit 1, such as reducing the size of the exit pupil 5 or adjusting the eye relief, so as to attenuate the stray light 10 that actually reaches the human eye 6. When the ratio of the maximum stray-light illuminance within the exit pupil 5 to the average illuminance of the normal light field is below a certain threshold (e.g. 50%), and the ratio of the stray-light illuminance at the center of the exit pupil 5 to the average illuminance of the normal light field is below a certain threshold (e.g. 10%), the distribution of the stray light 10 can be regarded as acceptable to the human eye. As those skilled in the art will appreciate, reducing the range of the exit pupil 5 gives a higher transmittance for the ambient light 7, but also makes it harder to accommodate wearers with different interpupillary distances; an interpupillary distance adjustment mechanism may be added to register the position of the exit pupil 5 with the user's eye 6.
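A minimal sketch of the acceptance test described above, using the example thresholds given in the text (50% at the worst point of the exit pupil, 10% at its center). The illuminance values are assumed to come from a simulation or measurement; the function and parameter names are illustrative.

    # Python sketch (illustrative acceptance check)
    def stray_light_acceptable(stray_max, stray_center, signal_mean,
                               max_ratio=0.5, center_ratio=0.1):
        """Return True if the stray-light distribution is within the acceptable range.

        stray_max:    peak stray-light illuminance anywhere inside the exit pupil.
        stray_center: stray-light illuminance at the exit pupil center.
        signal_mean:  average illuminance of the intended light field.
        """
        return (stray_max / signal_mean) <= max_ratio and \
               (stray_center / signal_mean) <= center_ratio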
To this end, the present invention further proposes a way to eliminate stray light, shown in Fig. 9: an opaque annular stop 303 is added to the gap region 302 of the microstructure array 3, along the periphery of each microstructure unit 301, by coating or by adding a mechanical structure, so that the former stray light 10 is absorbed or reflected before it can exit through the microstructure array device 3 and thus never enters the exit pupil 5 region. The coating can be a reflective film or an absorbing film. Adding the annular stop 303 helps enlarge the exit pupil 5, so that in most cases the wearer's interpupillary distance can be matched without an extra adjustment mechanism, improving wearing and viewing comfort. In this case the gap 302 within region B should not include the area covered by the stop; that is, the gap on the microstructure array 3 is the light-transmitting portion that has no modulating effect on the image, and preferably occupies 30% or more of region B to ensure transmission of the ambient image. Each opaque portion within region A, i.e. each micro-display area unit 201 displaying a sub-image, is smaller than the area defined by the outermost circle of the corresponding stop.
When the light field display unit 1 of the present invention is implemented as a glasses-type near-eye display device as shown in Fig. 2, the circuitry driving the small array image source, other signal processors, sensors and the like may be mounted inside the frame 12 of the glasses; with one light field display unit 1 for each of the left and right eyes 6, binocular, truly three-dimensional stereoscopic vision free of the convergence-focus conflict can be achieved.
Alternatively, the present invention also covers a see-through display scheme based on an aperture array, as shown in Fig. 10. The microstructure units 301 are formed by small holes without optical power, so that the light emitted by the micro-display area units 201 is limited to a small angular range by the opening angle behind each hole; the light thus acquires directionality and its distribution forms a light field. Furthermore, an annular stop 303 is arranged around each small hole to block light that would otherwise enter the exit pupil directly without passing through the hole, thereby eliminating stray light.
Although in the embodiments above the substrates of the microstructure array devices 3 are shown as flat plates, they are not limited to this form; the substrate of the microstructure array device 3 may also be given optical power, as shown in Fig. 11, in which case the light field display unit of the present invention supports vision correction when the wearer observes ambient light. Specifically, a lens 11 with different surface shapes on its two sides, and therefore with optical power, is used as the base of the microstructure array device 3. The microstructure units 1101 are located on the surface of the lens 11 facing the small array image source, and an annular stop 1103 is located on the other surface of the lens 11. During observation, light from the external real environment is incident on the lens 11 through the transparent base of the small array image source and, while passing through the gap area 1102, is refracted by the optical power of the lens 11, realizing vision correction for the external ambient light. The light emitted by the micro-display area units 201, after modulation by the microstructures 1101, also passes through and is refracted by the lens 11, realizing vision correction for the virtual information as well.
Fig. 12 shows an AR application based on the light field display unit of the present invention. Because the light field display unit of the present invention keeps the convergence and focus states of the eyes consistent, the virtual information 16 (the image information corresponding to each pixel point) can be rendered at any position; that is, the virtual-image light received by the eye appears to be emitted from any designated location before finally entering the eye. In this application the eyes can move between positions 601 and 602 in the real environment, and a near-eye display device using the light field display unit of the present invention can always render the virtual information at a position consistent with the real object 15, according to the eye position fed back by a sensor or similar means. Whether the eyes are near, at 601 (e.g. within 0.5 m), or viewing the virtual information from the remote position 602, a comfortable state can be maintained for a long time and the virtual information remains as sharp as the real environment.
With the light field display unit disclosed by the invention used for augmented reality display, the micro-display area units are arranged on the inner surface of the substrate and are opaque; external ambient light that enters a micro-display area unit region from the outer surface of the substrate is blocked by the micro-display device and cannot reach the microstructure array, so external ambient light can reach the microstructure array only by passing directly through the gaps of the transparent medium substrate. This avoids the loss of real-environment light that occurs when a transflective beam-splitting film is used in the usual way to combine the virtual image with the real ambient light. Within the range of the exit pupil, only the external ambient light passing through the gaps of the small array image source and the virtual image modulated by the microstructure units are present, and the virtual image appears to be emitted from its intended real-world position into the eye, so the virtual-real fusion is more realistic.
While embodiments of the invention have been illustrated in detail, it should be understood that the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Modifications and adaptations to these embodiments may occur to one skilled in the art without departing from the scope of the present invention as set forth in the following claims.

Claims (10)

Priority application: CN201810154739.6A, filed 2018-02-23

Publications: CN108375840A (2018-08-07); CN108375840B, grant (2021-07-27)

Family ID: 63017863




Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
TA01: Transfer of patent application right
  Effective date of registration: 2020-04-22
  Address after: No.108, floor 1, building 7, No.30 yard, Shixing street, Shijingshan District, Beijing 100043
  Applicant after: Beijing NED+AR Display Technology Co.,Ltd.
  Address before: 215500 Suzhou City, Suzhou, Jiangsu, Changshou City high tech Industrial Development Zone Lake Mountain Road No. 333 Tongji Science and Technology Square 1 302
  Applicant before: SUZHOU NED+AR TCOE TECHNOLOGY Co.,Ltd.
GR01: Patent grant
