Disclosure of Invention
An object of the present application is to provide an image source module, a near-eye display system, a control method, and a near-eye display device, so as to solve problems of image display in near-eye display scenarios.
An embodiment of the present application provides an image source module for use in a near-eye display system having an eye tracking function. The image source module includes: an image light source, a control unit, and a scanning unit comprising at least two scanners, wherein
the control unit is connected to the image light source and to the scanning unit, respectively, so as to transmit control signals to the image light source and the scanning unit; based on the image beam output by the image light source, each scanner in the scanning unit scans and outputs a light beam corresponding to a different sub-field of view; and the working state of each scanner is individually controllable.
Further, the scanner is a fiber optic scanner or a micro-electro-mechanical system scanner.
An embodiment of the present application further provides a near-eye display system, comprising a waveguide, an eye tracking module, and the image source module described above, wherein
the image source module generates a laser beam containing image information, scans and outputs the laser beam to the waveguide, and adjusts the output laser beam according to an eye movement position signal sent by the eye tracking module;
the waveguide expands the laser beam input by the image source module in a first direction and a second direction, and the expanded beam is output from the waveguide;
the eye tracking module monitors the orientation of the user's eyes in real time to determine the sub-field of view in which the user's field of view is located, and accordingly generates a corresponding eye movement position signal and sends it to the image source module to adjust the outgoing light beam.
An embodiment of the present application further provides a control method for the near-eye display system, where the method includes:
the eye tracking module monitors the orientation of the human eyes, determines a focus area of the human eyes and the sub-field of view in which the focus area is located according to the orientation, generates an eye movement position signal, and sends the eye movement position signal to the image source module;
a control unit in the image source module generates an image control signal according to the eye movement position signal input by the eye tracking module and sends the image control signal to an image light source, so as to control the image light source to generate an image corresponding to the focus area of the human eyes; and generates a scanning control signal and sends it to a scanning unit, so as to control a scanner in the scanning unit corresponding to the focus area to perform image scanning.
Further, each scanner in the scanning unit corresponds to a respective sub-field of view.
Further, controlling a scanner corresponding to a focus area in the scanning unit to perform image scanning includes:
when the focus area of the human eyes falls within any one sub-field of view, the scanner in the scanning unit corresponding to that sub-field of view is controlled to perform image scanning.
Further, controlling a scanner corresponding to a focus area in the scanning unit to perform image scanning includes:
when the focus area of the human eyes spans two or more sub-fields of view, the scanners in the scanning unit corresponding to those sub-fields of view are controlled to perform image scanning simultaneously.
Further, the control method is triggered based on an operation instruction of a user.
Further, the control method is triggered based on an operation instruction of a user, specifically: receiving an operation instruction from the user to start a power saving mode, starting the power saving mode based on the operation instruction, and thereby triggering the control method.
An embodiment of the present application further provides a near-eye display device for augmented reality display. The near-eye display device comprises at least one set of the above near-eye display system and is controlled by the above control method; external ambient light passes through the near-eye display device into the human eyes, and at the same time light beams emitted by the near-eye display system enter the human eyes, thereby realizing augmented reality display.
By adopting the technical solutions in the embodiments of the present application, the following technical effects can be achieved:
the near-eye display system comprises a plurality of scanners, and the light beam scanned and output by each scanner corresponds to a different sub-field of view, so that the outgoing beams act on the human eyes or on a display medium (such as a lens) and the user can view the corresponding image, thereby realizing near-eye display. Meanwhile, under the monitoring of the eye tracking module, the near-eye display system can control one or some of the scanners to work in real time according to the orientation of the user's eyes, while the remaining scanners do not work, so that scanning output is performed only for the focus area of the user's eyes; this saves system power consumption and at the same time reduces the interference of displayed content with the user.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
For ease of understanding, the basic principle of laser scanning imaging in existing near-eye display devices is explained first. Fig. 1 is a schematic diagram that includes: a laser light source 101, a scanner 102, and a human eye retina 103.
When an image is displayed, the laser emitted by the laser source is output by the scanner onto a certain pixel position, thereby scanning that pixel position, and then moves to the next pixel position under the control of the scanner. In other words, the laser beam output by the scanner lights up each pixel position with the corresponding color, gray scale, or brightness in a certain sequence. Within one frame time, the laser beam traverses every pixel position at a sufficiently high speed; owing to the persistence of vision of the human eye, the eye cannot perceive the movement of the laser beam across the pixel positions and instead sees a complete image (in fig. 1, the user sees an image whose content is "Hi"). Of course, fig. 1 is only a simple illustration of the basic principle of laser scanning imaging in near-eye display, provided to facilitate understanding of the technical solutions in the embodiments of the present application, and should not be taken as a limitation of the present application.
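To make the traversal concrete, the following minimal Python sketch mimics the raster-scan principle described above; the grid size, frame rate, and the step_scanner/drive_laser helpers are illustrative assumptions and not part of the embodiments of the present application.

    import time

    FRAME_RATE_HZ = 60      # assumed refresh rate, for illustration only
    ROWS, COLS = 720, 1280  # assumed pixel grid, for illustration only

    def step_scanner(row, col):
        """Placeholder: deflect the beam to the given pixel position."""
        pass

    def drive_laser(pixel_value):
        """Placeholder: modulate the laser's color/gray scale/brightness."""
        pass

    def scan_frame(image):
        """Traverse every pixel position once within one frame time."""
        dwell = 1.0 / (FRAME_RATE_HZ * ROWS * COLS)  # time budget per pixel
        for row in range(ROWS):
            for col in range(COLS):
                step_scanner(row, col)        # move the beam to the pixel
                drive_laser(image[row][col])  # light it with the pixel value
                time.sleep(dwell)             # hold until the next pixel slot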
However, in a near-eye display scene, the field of vision of the human eye is limited. Specifically, referring to fig. 2, assume that the image in the near-eye display is as shown in fig. 2; in practice, at such a short distance, the field of vision of the human eye cannot cover the entire image. Taking areas a1 and a2 as examples: assuming the human eyes focus on area a1, the field of vision can cover that area, meaning that the human eyes can view the image content in area a1; at this time, however, due to the limited field of vision, the image content in the other areas is difficult or even impossible to see. If the viewer wants to view the image content in area a2, the focus position of the human eyes needs to shift from area a1 to area a2. If the display is performed by an AR device, the corresponding image beam acts directly on the human eyes, and the different areas in fig. 2 can then also be understood as different sub-fields of view that are tiled to form a complete field of view, so that a complete image can be displayed. Of course, the above only explains the way an image is viewed by human eyes in a near-eye display scene, so as to facilitate understanding of the subsequent aspects of the present application, and should not be construed as limiting the present application.
Based on the foregoing, embodiments of the present application provide a near-eye display system, as shown in fig. 3a. The near-eye display system includes: an image source module 20, a waveguide 30, and an eye tracking module 40. Wherein,
the image source module 20 generates a laser beam containing image information, scans and outputs it to the waveguide 30, and adjusts the output laser beam according to the eye movement position signal sent by the eye tracking module 40. For convenience of description, the laser beam may be simply referred to as a beam in the embodiments of the present application; unless otherwise specified, the two terms represent the same concept herein.
The waveguide 30 expands the laser beam input from the image source module 20 in a first direction and a second direction, and the expanded beam is output from the waveguide 30. As shown in fig. 3a, the first direction and the second direction respectively represent propagation directions of the light beam in the waveguide 30. When the user actually uses the display device corresponding to the near-eye display system, the first direction can be regarded as the vertical direction in the visual field plane of the human eyes and the second direction as the horizontal direction in that plane; hence, the first direction and the second direction may also be referred to as the vertical direction and the horizontal direction, respectively. It is to be understood that the terms "first" and "second" are used here for distinguishing and should not be construed as limiting any order.
The eye tracking module 40 monitors the orientation of the user's eyes in real time to determine the sub-field of view in which the user's field of view is located, and accordingly generates a corresponding eye movement position signal that is sent to the image source module 20 to adjust the outgoing light beams. It should be noted that the eye tracking module 40 can be implemented with conventional components and algorithms, so a detailed description is omitted here.
Referring to fig. 3a and 3b, the image source module 20 may include: a control unit 201, an image light source 202, and a scanning unit 203 including at least two scanners 2031. Wherein,
The control unit 201 is connected to the image light source 202 and to the scanning unit 203, respectively, so as to provide an image signal to the image light source 202 and corresponding control signals to the image light source 202 and the scanning unit 203. Based on the image beam output by the image light source 202, each scanner 2031 in the scanning unit 203 scans and outputs a light beam corresponding to a different sub-field of view, and the working state of each scanner 2031 is individually controllable.
In the embodiments of the present application, the image light source 202 may specifically be an atomic laser, an ion laser, or a semiconductor laser. To ensure the display effect, any one or a combination of red (R), green (G), and blue (B) monochromatic lasers is generally adopted, or a white laser (it should be understood that the white laser may be separated into the aforementioned RGB monochromatic lasers by corresponding optical devices); of course, a laser of the appropriate color and type may be selected according to the needs of the actual application. The image light source 202 may further include optical elements such as a beam combiner, which may be configured according to the actual application requirements and will not be described here again.
In one embodiment, the image light source 202 may transmit its output laser beam to each scanner 2031 in the scanning unit 203 through optical fibers.
The scanning unit 203 includes at least two scanners 2031, and each scanner 2031 corresponds to a different sub-field within the same field of view, so that during imaging the different sub-fields scanned and output by the different scanners 2031 are spliced into a complete field of view and the user can view a complete image. Moreover, the working state of each scanner 2031 is individually controllable.
Referring to fig. 3b, a case in which the scanning unit 203 includes 6 scanners 2031 is shown. In this case, the complete field of view scanned and output by the scanning unit 203 is formed by splicing 6 sub-fields (that is, each scanner 2031 corresponds to one sub-field), and one or some of the scanners 2031 may be controlled to operate according to the control signal of the control unit 201, thereby implementing display control over the different sub-fields. Of course, it is understood that the number of scanners 2031 included in the scanning unit 203 may be set according to the needs of the practical application and is not particularly limited here.
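Purely as an illustration of the individually controllable working states (the class and method names below are hypothetical and do not appear in the embodiments), the scanning unit 203 might be modeled as follows:

    class Scanner:
        """One scanner 2031; its working state is individually controllable."""
        def __init__(self, subfield_id):
            self.subfield_id = subfield_id
            self.active = False

        def set_active(self, active):
            self.active = active

    class ScanningUnit:
        """Scanning unit 203: at least two scanners, one per sub-field."""
        def __init__(self, subfield_ids):
            self.scanners = {sid: Scanner(sid) for sid in subfield_ids}

        def apply_control_signal(self, active_subfields):
            # Enable only the scanners named in the control signal;
            # every other scanner stays idle.
            for sid, scanner in self.scanners.items():
                scanner.set_active(sid in active_subfields)

    # Example: the six-scanner configuration of fig. 3b
    unit = ScanningUnit(["a1", "a2", "a3", "a4", "a5", "a6"])
    unit.apply_control_signal({"a1"})  # only the scanner for sub-field a1 works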
In some embodiments, the scanner 2031 may be a fiber scanner or a Micro-Electro-Mechanical System (MEMS) scanner, and an appropriate scanner may be selected according to the actual application requirement, which is not described herein again (in the embodiments of the present application, a fiber scanner is taken as an example). Of course, no matter what kind of scanner is used, two-dimensional scanning can be performed in the actual scanning display process.
It should be noted here that each scanner 2031 in the scanning unit 203 is packaged separately, so that the scanners 2031 do not affect each other when they operate.
Referring to fig. 4, the waveguide 30 may include a coupling-in member 301, an expansion member 302, and a coupling-out member 303 disposed on the waveguide 30.
The waveguide 30 may adopt a spatial light modulator, and the coupling-in member 301 is located on the surface of the waveguide 30, opposite to the light outlet of the image source module 20, so that the light beam output by the image source module 20 can be coupled into the waveguide 30; under the action of the coupling-in member 301, the light beam entering the waveguide 30 is further directed to the expansion member 302. The coupling-in member 301 is not limited to the position shown in fig. 4 and may also be located on a side face of the waveguide 30, as the requirements of the practical application dictate.
The expansion member 302 and the coupling-out member 303 expand the light beam entering the waveguide 30 in the first direction and the second direction, respectively. Specifically, the expansion member 302 is disposed on the light exit path of the coupling-in member 301 and extends along the direction of that path, and the light exit direction of the expansion member 302 is perpendicular to its extension direction. The light incident side of the coupling-out member 303 is parallel to and faces the extension direction of the expansion member 302, the coupling-out member 303 extends along the light exit direction of the expansion member 302, and the coupling-out member 303 emits light toward the human eye side.
It should be noted that, in practical applications, the size and shape of each component in the waveguide 30 are not limited to the state shown in fig. 4, and are only an embodiment given for explaining the waveguide 30, and therefore should not be construed as limiting the present application.
On the basis of the above, an embodiment of the present application further provides a control method based on the near-eye display system, as shown in fig. 5. The method specifically includes the following steps:
Step S501: the eye tracking module 40 monitors the orientation of the human eyes, determines the focus area of the human eyes and the sub-field in which the focus area is located according to the orientation, generates an eye movement position signal, and transmits it to the image source module 20.
As previously described, the full field of view of the near-eye display system may be divided into a corresponding number of sub-fields of view depending on the number of scanners included in the scanning unit 203. Referring to the near-eye display system shown in fig. 3b, the scanning unit 203 comprises 6 scanners 2031; accordingly, as shown in fig. 6, the complete field of view of the near-eye display system can be formed by splicing the 6 sub-fields a1 to a6 (i.e., the beam scanned and output by each scanner 2031 uniquely corresponds to one sub-field). On this basis, the eye tracking module 40 can determine the focus area of the human eyes and the sub-field of view to which the focus area belongs by monitoring the orientation of the human eyes.
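As a hypothetical sketch of this sub-field lookup (the 3x2 arrangement of a1 to a6 and the normalized coordinates are assumptions made only for illustration; they are not fixed by the embodiments):

    # Assumed 3x2 tiling of the sub-fields a1..a6 on the imaging plane.
    SUBFIELDS = [["a1", "a2", "a3"],
                 ["a4", "a5", "a6"]]

    def subfield_of(x, y):
        """Return the sub-field containing the normalized gaze point (x, y),
        where x and y lie in [0, 1) across the full field of view."""
        col = min(int(x * 3), 2)  # three sub-field columns
        row = min(int(y * 2), 1)  # two sub-field rows
        return SUBFIELDS[row][col]

    print(subfield_of(0.1, 0.2))  # -> "a1"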
Step S503: the control unit 201 in the image source module 20 generates an image control signal according to the eye movement position signal input by the eye movement tracking module 40, sends the image control signal to the image light source 202, and controls the image light source 202 to generate an image corresponding to a human eye focusing area; and generates a scanning control signal to be sent to the scanning unit 203, which controls the scanner 2031 corresponding to the focus area in the scanning unit 203 to perform image scanning.
For image control, the control unit 201 controls one or some of the scanners 2031 to operate, and at the same time, since the focus area of the human eyes is limited, the image light source 202 may be controlled to output only the light beam of the image content corresponding to that focus area. In some embodiments of the present application, the complete image content must be displayed in full by the multiple scanners in the scanning unit 203 through spliced scanning, and the relative position of the image content is fixed; that is, the complete image content exceeds the field of view of the human eyes. For such a case, the control unit 201 may control the image light source 202 to output only the light beam of the local image content corresponding to the focus area of the human eyes. In other embodiments of the present application, the image content to be displayed is small (e.g., weather information, a date, etc.) and can be displayed entirely within the focus area of the human eyes; for that case, the control unit 201 controls the image light source 202 to output the light beam of the entire image content.
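The two cases above might be sketched as follows; the function, its parameters, and the pixel-range representation of sub-fields are hypothetical and serve only to illustrate the image control performed by the control unit 201:

    def build_image_signal(frame, active_subfields, subfield_bounds, movable):
        """Sketch of the image control performed by control unit 201.

        frame            -- full image content as a 2D list of pixel values
        active_subfields -- sub-fields covering the eyes' focus area
        subfield_bounds  -- {sub-field: (row0, row1, col0, col1)} pixel ranges
        movable          -- True for small, position-free content (e.g. a date)
        """
        if movable:
            # Small content is rendered wholly inside the focus area,
            # so the entire image content is output.
            return frame
        # Fixed-position content: keep only the pixels of the sub-fields
        # being scanned; everything else is blanked so the light source
        # emits no beam for it.
        blanked = [[0 for _ in row] for row in frame]
        for sid in active_subfields:
            r0, r1, c0, c1 = subfield_bounds[sid]
            for r in range(r0, r1):
                for c in range(c0, c1):
                    blanked[r][c] = frame[r][c]
        return blanked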
For scan control, the eye tracking module 40 may generate the eye movement position signal in the form of coordinate data. That is, the eye tracking module 40 may determine the coordinate range of the focus area of the human eyes (i.e., of the virtual image) on the imaging plane (a two-dimensional coordinate system being established on the imaging plane) and generate the corresponding eye movement position signal accordingly. Of course, this can be implemented with conventional eye tracking algorithms, so the details are omitted here.
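As one hedged illustration of how such a coordinate-range signal could be formed (the gaze-angle inputs, the planar projection, and the fixed focus-area size are all assumptions for this sketch, not the actual algorithm of the eye tracking module 40):

    import math

    def eye_position_signal(yaw_deg, pitch_deg, plane_dist, half_size):
        """Turn a monitored gaze direction into the coordinate range of the
        focus area on the imaging plane (all inputs are assumed units)."""
        cx = plane_dist * math.tan(math.radians(yaw_deg))    # gaze hit x
        cy = plane_dist * math.tan(math.radians(pitch_deg))  # gaze hit y
        return (cx - half_size, cy - half_size,
                cx + half_size, cy + half_size)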
Specifically, referring to fig. 7, assume that in one embodiment the eye tracking module 40 monitors the rotation of the human eyes and determines that the focus area of the human eyes falls within sub-field a1; it then generates a corresponding eye movement position signal and sends it to the control unit 201 in the image source module 20. Further, based on the eye movement position signal, the control unit 201 controls the scanner 2031 (which may be referred to as a first scanner) corresponding to sub-field a1 in the scanning unit 203 to operate, while the other scanners in the scanning unit 203 do not operate, so that the near-eye display system outputs the image of sub-field a1.
In other embodiments, the focus area of the human eyes may span two or even more sub-fields. Referring to fig. 8, the eye tracking module 40 monitors the rotation of the human eyes and determines that the focus area falls partly in sub-field a1 and partly in sub-field a2; it then generates a corresponding eye movement position signal and sends it to the control unit 201 in the image source module 20. Further, based on the eye movement position signal, the control unit 201 controls the two scanners 2031 (which may be referred to as a first scanner and a second scanner) corresponding to sub-fields a1 and a2 in the scanning unit 203 to operate, while the other scanners in the scanning unit 203 do not operate, so that the near-eye display system outputs the images of the relevant parts of sub-fields a1 and a2, which together form the image in the focus area of the human eyes.
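Both the fig. 7 case (one sub-field) and the fig. 8 case (two sub-fields) reduce to finding which sub-field rectangles the focus rectangle overlaps. A minimal sketch, assuming axis-aligned sub-field bounds on the imaging plane (the names below are hypothetical):

    def overlapping_subfields(focus, subfield_bounds):
        """Find every sub-field that the focus rectangle touches.

        focus           -- (x0, y0, x1, y1) coordinate range from module 40
        subfield_bounds -- {sub-field: (x0, y0, x1, y1)} on the imaging plane
        """
        fx0, fy0, fx1, fy1 = focus
        hits = set()
        for sid, (sx0, sy0, sx1, sy1) in subfield_bounds.items():
            # Standard axis-aligned rectangle overlap test
            if fx0 < sx1 and sx0 < fx1 and fy0 < sy1 and sy0 < fy1:
                hits.add(sid)
        return hits

    # Fig. 7 case: a focus rectangle inside a1 yields {"a1"}; fig. 8 case:
    # one straddling the a1/a2 border yields {"a1", "a2"}. The control unit
    # would then activate exactly these scanners, e.g. via
    # unit.apply_control_signal(hits) from the earlier sketch.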
It should be noted here that, in an actual near-eye display scene, the near-eye display system and the corresponding control method described above may be applied to an AR display device (e.g., AR glasses). In some embodiments, having multiple scanners working simultaneously may drain the power of the AR display device too quickly; therefore, the control method in the present application may be executed in a power saving mode of the AR display device. That is, a user may start the power saving mode of the AR display device, after which the multiple scanners in the AR display device adjust their working states according to the rotation of the human eyes; the specific adjustment manner is as described above and is not repeated here.
In other embodiments, the user may also define the display scheme of the AR device, so that the AR device automatically executes it during operation according to the user's definition. Specifically, the scheme may be defined as follows: in the default display state, the working state of each scanner is controlled by the above control method, while in the case of multimedia playing (such as video playing) or other situations requiring full-field display, all the scanners are turned on for scanning output, as sketched below.
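A brief sketch of such a user-defined scheme, reusing the hypothetical overlapping_subfields helper from the previous sketch (the mode strings are assumptions for illustration only):

    def choose_active_subfields(mode, focus, subfield_bounds):
        """Pick the scanners to run under a user-defined display scheme."""
        if mode == "full_field":         # e.g. multimedia/video playing
            return set(subfield_bounds)  # turn on every scanner
        # Default display state: only the scanners covering the focus area
        return overlapping_subfields(focus, subfield_bounds)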
The light beam emitted by the near-eye display system acts on the human eyes or on a display medium (such as a lens), so that the user can view the corresponding image, thereby realizing near-eye display. Meanwhile, under the monitoring of the eye tracking module 40, the near-eye display system can adjust the outgoing light beams in real time according to the rotation of the user's eyes, and the adjusted light beams display images only within the field of view of the eyes, which saves system power consumption while reducing the interference of displayed content with the user.
In practical applications, when the near-eye display system provided in the embodiments of the present application is applied to a near-eye display device, such as an AR device, the near-eye display device may include at least one set of the near-eye display system described above and may be controlled by the control method described above.
Referring to fig. 9, the near-eye display device takes the form of AR glasses. In this case, the near-eye display device may include only one set of the near-eye display system S1; light beams emitted by the near-eye display system S1 enter the human eyes, and at the same time external ambient light also enters the human eyes through the imaging module, so that the user views the corresponding augmented reality image. Of course, fig. 9 shows only one possible form of the near-eye display device; in practical applications, two sets of the near-eye display system described above may also be used on the near-eye display device to act on the left eye and the right eye respectively, although the present application is not limited thereto.
The embodiments in the present application are described in a progressive manner, and the same and similar parts among the embodiments can be referred to each other, and each embodiment focuses on the differences from the other embodiments. Especially, as for the device, apparatus and medium type embodiments, since they are basically similar to the method embodiments, the description is simple, and the related points may refer to part of the description of the method embodiments, which is not repeated here.
Thus, particular embodiments of the present subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may be advantageous.
The expressions "first", "second", "said first" or "said second" used in various embodiments of the present disclosure may modify various components regardless of order and/or importance, but these expressions do not limit the respective components. The above description is only configured for the purpose of distinguishing elements from other elements. For example, the first user equipment and the second user equipment represent different user equipment, although both are user equipment. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure.
When an element (e.g., a first element) is referred to as being "operably or communicatively coupled" or "connected" to another element (e.g., a second element), it is to be understood that the element may be directly connected to the other element or indirectly connected to it via yet another element (e.g., a third element). In contrast, when an element (e.g., a first element) is referred to as being "directly connected" or "directly coupled" to another element (e.g., a second element), no element (e.g., a third element) is interposed between them.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.