Disclosure of Invention
The application aims to provide a correction parameter acquisition method and device for an AR imaging system, so as to correct the AR imaging system and improve the user experience.
A first aspect of the present application provides a correction parameter acquisition method for an AR imaging system, wherein the AR imaging system comprises a display assembly, a light source, an opto-mechanical lens module, a waveguide assembly, and an imaging lens module arranged along an optical path. The method comprises:
controlling the display assembly to emit a correction image, the light source projecting the correction image to the opto-mechanical lens module, the opto-mechanical lens module contracting and expanding the beam, the waveguide assembly reflecting the correction image to the imaging lens module, and the imaging lens module focusing the correction image onto a camera for imaging, wherein the camera is configured to capture the image formed by the correction image through the AR imaging system;
acquiring the current distance z between the opto-mechanical lens module and the light source;
acquiring the original length wL, the original width hL, and the original imaging distance zL of the correction image;
determining the length wC, the width hC, and the imaging distance zC of the camera image of the correction image from the image formed on the camera;
calculating the correction parameters corresponding to the current distance z using the following first relation, wherein the correction parameters comprise a magnification α(z) and a waveguide optical path deviation β(z);
the first relation is as follows:
α(z) = wC/wL = hC/hL;
zC = α(z)*zL + β(z).
A second aspect of the present application provides a correction parameter acquisition device for an AR imaging system, comprising a control module and an acquisition module;
the AR imaging system comprises a display assembly, a light source, an opto-mechanical lens module, a waveguide assembly, and an imaging lens module arranged along an optical path;
the control module is configured to control the display assembly to emit a correction image, the light source projecting the correction image to the opto-mechanical lens module, the opto-mechanical lens module contracting and expanding the beam, the waveguide assembly reflecting the correction image to the imaging lens module, and the imaging lens module focusing the correction image onto a camera for imaging, wherein the camera is configured to capture the image formed by the correction image through the AR imaging system;
the acquisition module is used for executing the following steps:
acquiring the current distance z between the opto-mechanical lens module and the light source;
acquiring the original length wL, the original width hL, and the original imaging distance zL of the correction image;
determining the length wC, the width hC, and the imaging distance zC of the camera image of the correction image from the image formed on the camera;
calculating the correction parameters corresponding to the current distance z using the following first relation, wherein the correction parameters comprise a magnification α(z) and a waveguide optical path deviation β(z);
the first relation is as follows:
α(z) = wC/wL = hC/hL;
zC = α(z)*zL + β(z).
According to the correction parameter acquisition method and device for the AR imaging system, a camera is placed at the exit pupil of the AR imaging system to simulate a human eye and captures the image of the correction image. From this image, the imaging distance correction parameters and the spatial frequency response (SFR) of the AR imaging system can be obtained, and the AR imaging system can be corrected accordingly, so that the imaging distance projected by the AR imaging system matches the user's intuition. This improves the quality of the user's interaction with the virtual image, gives the user a stronger sense of immersion, and improves the user experience.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
It is noted that unless otherwise indicated, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs.
Fig. 1 shows a schematic diagram of an AR imaging system according to the present application. As shown in fig. 1, the AR imaging system includes a display assembly 100, a light source 200, an opto-mechanical lens module 300, a waveguide assembly 400, and an imaging lens module 500 arranged along an optical path. As also shown in fig. 1, a camera simulating a human eye is placed at the exit pupil of the AR imaging system to be corrected, and captures the image of the correction image.
The waveguide assembly 400 includes an input coupling grating 410, a waveguide 420, and an output coupling grating 430.
Fig. 2 is a flowchart of the correction parameter acquisition method for an AR imaging system according to the present application. As shown in fig. 2, the method includes the following steps:
S101, controlling the display assembly to emit a correction image, the light source projecting the correction image to the opto-mechanical lens module, the opto-mechanical lens module contracting and expanding the beam, the waveguide assembly reflecting the correction image to the imaging lens module, and the imaging lens module focusing the correction image onto the camera for imaging;
wherein the camera is configured to capture the image formed by the correction image through the AR imaging system.
The correction image may be a knife-edge image, or any other image suitable for correction; the present application is not limited in this respect.
S102, acquiring the current distance z between the opto-mechanical lens module 300 and the light source 200;
S103, acquiring the original length wL, the original width hL, and the original imaging distance zL of the correction image;
here, the imaging distance is the distance between the imaging lens module 500 and the image formed on the camera, and the original imaging distance zL is the imaging distance of the AR imaging system to be corrected, before correction.
S104, determining the length wC, the width hC, and the imaging distance zC of the camera image of the correction image from the image formed on the camera;
S105, calculating the correction parameters corresponding to the current distance z using the first relation, wherein the correction parameters comprise a magnification α(z) and a waveguide optical path deviation β(z);
the first relation is as follows:
α(z) = wC/wL = hC/hL;
zC = α(z)*zL + β(z).
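As an illustration only, the following is a minimal Python sketch of this computation for a single measurement; the function name and variable names are assumptions introduced here, not part of the application. Since zC = α(z)*zL + β(z), the deviation follows as β(z) = zC − α(z)*zL once α(z) is known:

    def correction_params(wC, hC, zC, wL, hL, zL):
        # wC, hC, zC: length, width, and imaging distance measured on the camera.
        # wL, hL, zL: original length, width, and imaging distance of the correction image.
        alpha = wC / wL  # magnification alpha(z); ideally wC/wL equals hC/hL
        assert abs(alpha - hC / hL) < 1e-3, "width and height magnifications disagree"
        beta = zC - alpha * zL  # waveguide optical path deviation, from zC = alpha(z)*zL + beta(z)
        return alpha, beta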
The first relation may be obtained by the following procedure:
Before the AR imaging system is corrected, the imaging distance calibration of the camera may be performed with reference to fig. 3. After the opto-mechanical lens module and the camera are aligned on the optical axis, a knife-edge image is input from the light source in a darkroom and imaged on the camera, and the automatic focusing module of the opto-mechanical lens module is adjusted (that is, the opto-mechanical lens module is displaced, changing the distance between it and the imaging lens module) to obtain the magnification of the knife-edge image at different distances. A fitted curve of the camera's standard imaging distance versus the knife-edge image magnification is then defined; this fitted curve is the first relation.
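A sketch of such a calibration fit, assuming NumPy is available; the measured magnifications, standard distances, and the polynomial degree below are hypothetical placeholders:

    import numpy as np

    # Knife-edge magnifications measured at several auto-focus positions, and the
    # camera's standard imaging distance at each position (hypothetical values).
    magnifications = np.array([0.95, 1.00, 1.05, 1.10])
    std_distances = np.array([1.8, 2.0, 2.2, 2.4])  # e.g. in metres

    # Fit the standard imaging distance as a curve of the knife-edge magnification;
    # this fitted curve plays the role of the first relation.
    distance_of = np.poly1d(np.polyfit(magnifications, std_distances, deg=1))
    print(distance_of(1.02))  # predicted imaging distance at magnification 1.02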
As shown in fig. 1, the opto-mechanical lens module is disposed on the optical axis and auto-focused at the initialization position z0; that is, the current distance between the opto-mechanical lens module and the light source is z0. At this point, the length, width, and imaging distance of the camera image of the correction image are (wC0, hC0, zC0), which satisfy:
α(z0) = wC0/wL = hC0/hL;  zC0 = α(z0)*zL + β(z0).
From these relations, the magnification α(z0) and the waveguide optical path deviation β(z0) corresponding to the initialization position z0 are obtained, and the correction parameters (α, β) are finally fed back to the opto-mechanical lens module to complete the measurement and correction of the imaging distance corresponding to the initialization position z0.
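Continuing the sketch above, the measurement at z0 could be processed as follows (all numeric values are hypothetical):

    # Hypothetical camera measurement of the correction image at position z0,
    # and the original correction-image parameters.
    wC0, hC0, zC0 = 12.0, 9.0, 2.1
    wL, hL, zL = 10.0, 7.5, 2.0

    alpha0, beta0 = correction_params(wC0, hC0, zC0, wL, hL, zL)
    # (alpha0, beta0) = (1.2, -0.3) is then fed back to the opto-mechanical
    # lens module to correct the imaging distance at z0.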
In some embodiments of the application, the step of constructing the first relation comprises:
changing the distance between the opto-mechanical lens module and the light source a plurality of times, with a change of ±Δz each time;
recording the correction parameters corresponding to each Δz;
constructing the first relation from the correction parameters corresponding to each Δz.
Specifically, the process of correcting the initialization position z0 is extended. As shown in fig. 4, the distance between the opto-mechanical lens module 300 and the light source 200 is changed by ±Δz each time, for example to z0−Δz, z0−2Δz, z0−3Δz, and the imaging distance, the magnification α(z) of the correction image, and the waveguide optical path deviation β(z) corresponding to each variation Δz are recorded, finally yielding the correction relation zC = α(z)*zL + β(z). Feeding this correction relation back to the opto-mechanical lens module completes the correction of the imaging distance and improves its accuracy.
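A sketch of this sweep-and-fit procedure, reusing correction_params and the original correction-image parameters (wL, hL, zL) from the sketches above; measure_camera_image is a hypothetical measurement routine, and the step size, positions, and polynomial degree are assumptions:

    import numpy as np

    z0, dz = 5.0, 0.1  # initialization position and step (hypothetical units)
    positions = z0 + dz * np.array([-3, -2, -1, 0, 1, 2, 3])

    alphas, betas = [], []
    for z in positions:
        wC, hC, zC = measure_camera_image(z)  # hypothetical: drive focus to z, measure image
        a, b = correction_params(wC, hC, zC, wL, hL, zL)
        alphas.append(a)
        betas.append(b)

    # Fit alpha(z) and beta(z) over the swept positions (degree 2 is an assumption).
    alpha_of = np.poly1d(np.polyfit(positions, alphas, 2))
    beta_of = np.poly1d(np.polyfit(positions, betas, 2))

    def predicted_zC(z):
        # First relation: zC = alpha(z) * zL + beta(z).
        return alpha_of(z) * zL + beta_of(z)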
In some embodiments of the present application, the correction parameter acquisition method for an AR imaging system may further include calculating a spatial frequency response (SFR) value of the camera image of the correction image, and determining whether the opto-mechanical lens module is out of focus according to the SFR value.
The SFR (Spatial Frequency Response) mainly measures how an image degrades as the spatial frequency of line patterns increases, and serves as an evaluation of the resolving power of the imaging system as a whole; the closer the SFR value is to 1, the better the imaging performance of the system.
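As a rough illustration only, a one-dimensional knife-edge SFR can be sketched as below; the standard ISO 12233 slanted-edge procedure additionally involves oversampling and edge-angle estimation, which are omitted here:

    import numpy as np

    def sfr_from_edge(edge_profile):
        # edge_profile: 1-D intensity profile taken across the knife edge in
        # the camera image of the correction image.
        lsf = np.diff(edge_profile.astype(float))  # edge spread -> line spread function
        lsf = lsf * np.hanning(lsf.size)           # window to reduce spectral leakage
        spectrum = np.abs(np.fft.rfft(lsf))
        sfr = spectrum / spectrum[0]               # normalize so that SFR(0) = 1
        freqs = np.fft.rfftfreq(lsf.size)          # spatial frequency in cycles/pixel
        return freqs, sfr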
Specifically, as shown in fig. 5, the step of determining whether the optomechanical lens module is out of focus according to the SFR value includes:
S201, judging whether the SFR value of the camera image of the correction image at a preset spatial frequency is smaller than a standard SFR value, wherein the standard SFR value is the SFR value at the preset spatial frequency when the opto-mechanical lens module is in focus, calibrated in advance;
S202, if yes, determining that the opto-mechanical lens module is out of focus;
S203, if not, determining that the opto-mechanical lens module is in focus.
Specifically, by analyzing the SFR value of the camera image of the correction image, it can be determined whether the projection of the opto-mechanical lens module lies on the focal plane, that is, whether the opto-mechanical lens module is out of focus or in focus.
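A minimal defocus check built on the SFR sketch above; the preset spatial frequency and the standard SFR threshold below are hypothetical placeholders:

    def is_defocused(edge_profile, preset_freq=0.25, standard_sfr=0.6):
        # Returns True if the SFR at the preset spatial frequency falls below
        # the pre-calibrated in-focus (standard) SFR value.
        freqs, sfr = sfr_from_edge(edge_profile)
        measured = np.interp(preset_freq, freqs, sfr)
        return measured < standard_sfr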
As shown in fig. 6, the opto-mechanical lens module 300 in the left view is in focus and produces a clear camera image, while the opto-mechanical lens module 300 in the right view is out of focus and the camera image is blurred.
As shown in fig. 7, when the opto-mechanical lens module is defocused, the SFR value of the camera image at the preset spatial frequency is significantly lower than the standard SFR value. For example, with a preset spatial frequency of 0.25, fig. 7 shows that the SFR value when the opto-mechanical lens module is defocused is significantly smaller than the SFR value when it is in focus.
According to the correction parameter acquisition method for the AR imaging system, a camera is placed at the exit pupil of the AR imaging system to simulate a human eye and captures the image of the correction image. From this image, the imaging distance correction parameters and the spatial frequency response SFR of the AR imaging system can be obtained, and the AR imaging system can be corrected accordingly, so that the imaging distance projected by the AR imaging system matches the user's intuition. This improves the quality of the user's interaction with the virtual image, gives the user a stronger sense of immersion, and improves the user experience.
An embodiment of the application further provides a correction parameter acquisition device for an AR imaging system, corresponding to the correction parameter acquisition method for an AR imaging system described above. As shown in fig. 8, the device comprises a control module 101 and an acquisition module 102;
the AR imaging system comprises a display assembly, a light source, an opto-mechanical lens module, a waveguide assembly, and an imaging lens module arranged along an optical path;
the control module 101 is configured to control the display assembly to emit a correction image, the light source projecting the correction image to the opto-mechanical lens module, the opto-mechanical lens module contracting and expanding the beam, the waveguide assembly reflecting the correction image to the imaging lens module, and the imaging lens module focusing the correction image onto the camera for imaging;
the acquiring module 102 is configured to perform the following steps:
acquiring the current distance z between the opto-mechanical lens module and the light source;
acquiring the original length wL, the original width hL, and the original imaging distance zL of the correction image;
determining the length wC, the width hC, and the imaging distance zC of the camera image of the correction image from the image formed on the camera;
calculating the correction parameters corresponding to the current distance z using the following first relation, wherein the correction parameters comprise a magnification α(z) and a waveguide optical path deviation β(z);
the first relation is as follows:
α(z) = wC/wL = hC/hL;
zC = α(z)*zL + β(z).
in one possible implementation, the obtaining module 102 is further configured to:
changing the distance between the opto-mechanical lens module and the light source a plurality of times, with a change of ±Δz each time;
recording the correction parameters corresponding to each Δz;
correcting the AR imaging system according to the correction parameters corresponding to each Δz.
In one possible implementation, the obtaining module 102 is further configured to:
calculating a spatial frequency response (SFR) value of the camera image of the correction image;
judging whether the opto-mechanical lens module is out of focus according to the SFR value.
In one possible implementation manner, the acquiring module 102 is specifically configured to:
judging whether the SFR value of the camera image of the correction image at a preset spatial frequency is smaller than a standard SFR value, wherein the standard SFR value is the SFR value at the preset spatial frequency when the opto-mechanical lens module is in focus;
if yes, determining that the opto-mechanical lens module is out of focus;
if not, determining that the opto-mechanical lens module is in focus.
In one possible implementation, the correction image is a knife-edge image.
According to the correction parameter acquisition device for the AR imaging system, a camera is placed at the exit pupil of the AR imaging system to simulate a human eye and captures the image of the correction image. From this image, the imaging distance correction parameters and the spatial frequency response SFR of the AR imaging system can be obtained, and the AR imaging system can be corrected accordingly, so that the imaging distance projected by the AR imaging system matches the user's intuition. This improves the quality of the user's interaction with the virtual image, gives the user a stronger sense of immersion, and improves the user experience.
It should be noted that:
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components of the device according to an embodiment of the present application may be implemented in practice using a microprocessor or a digital signal processor (DSP). The present application can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer-readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the invention, and all equivalent structural changes made by the specification and drawings of the present invention or direct/indirect application in other related technical fields are included in the scope of the present invention.