CN114441142B - Method and device for obtaining correction parameters of AR imaging system - Google Patents

Method and device for obtaining correction parameters of AR imaging system

Info

Publication number
CN114441142B
Authority
CN
China
Prior art keywords
imaging
lens module
image
correction
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111660633.1A
Other languages
Chinese (zh)
Other versions
CN114441142A (en)
Inventor
黄国书
申志兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd
Priority to CN202111660633.1A
Publication of CN114441142A
Application granted
Publication of CN114441142B
Status: Active
Anticipated expiration

Abstract

Translated from Chinese


The present application provides a method and device for obtaining correction parameters of an AR imaging system, wherein the method includes: controlling a display component to emit a correction image, projecting the correction image from a light source to an optomechanical lens module, then shrinking and expanding the beam through the optomechanical lens module, then reflecting to an imaging lens module through a waveguide component, and then focusing to a camera through the imaging lens module to form an image; obtaining the current distance z between the optomechanical lens module and the light source; obtaining the original length wL , width hL , and original imaging distance zL of the correction image; determining the length wC , width hC , and imaging distance zC of the camera imaging of the correction image according to the camera imaging; using the following first relationship to calculate the correction parameters corresponding to the current distance z, the correction parameters including the magnification α(z) and the waveguide optical path deviation β(z). Compared with the prior art, the present application can improve the quality of virtual image interaction between the AR imaging system and the user, allowing the user to have a stronger sense of immersion and improve the user experience.

Description

Correction parameter acquisition method and device of AR imaging system
Technical Field
The application relates to the technical field of optical testing, in particular to an imaging correction method and device of an AR imaging system.
Background
AR (Augmented Reality) technology uses computer technology to simulate physical information (such as visual information and sound) that is difficult to experience within a given time and space in the real world, superimposes this virtual information onto the real world, and makes it perceptible to the human senses, thereby achieving a sensory experience beyond reality. Augmented reality displays real-world information and virtual information simultaneously, with the two kinds of information complementing and overlaying each other.
The projection system of an AR product consists of a display module, an optical machine, a lens, accessories, and a bracket. The display module projects content through the optical machine onto the lens, which reflects it into the human eye, so that the person can interact with the virtual image, achieving the AR augmented-reality function.
Therefore, the relative assembly precision of the display module, the optical machine, and the lens structure is a key factor in the AR imaging effect, and the manufacturing and assembly tolerances of these components introduce errors in the imaging distance.
Disclosure of Invention
The application aims to provide a correction parameter acquisition method and device for an AR imaging system, so as to correct the AR imaging system and improve user experience.
The first aspect of the present application provides a correction parameter acquiring method for an AR imaging system, including:
the AR imaging system comprises a display component, a light source, an optical machine lens module, a waveguide component and an imaging lens module which are arranged along a light path;
Controlling the display assembly to emit a correction image, projecting the correction image to the optical machine lens module by the light source, shrinking and expanding the beam by the optical machine lens module, reflecting the correction image to the imaging lens module by the waveguide assembly, and focusing the correction image to a camera by the imaging lens module for imaging, wherein the camera is used for shooting the imaging of the correction image through the AR imaging system;
acquiring the current distance z between the optical machine lens module and the light source;
Acquiring the original length wL, the original width hL and the original imaging distance zL of the corrected image;
Determining a length wC, a width hC, an imaging distance zC of camera imaging of the corrected image from imaging on the camera;
Calculating a correction parameter corresponding to the current distance z by using the following first relation, wherein the correction parameter comprises a magnification alpha (z) and a waveguide light path deviation beta (z);
the first relation is as follows:
α(z)=wC/wL=hC/hL;
zC=α(z)*zL+β(z)。
The second aspect of the application provides a correction parameter acquisition device of an AR imaging system, which comprises a control module and an acquisition module;
the AR imaging system comprises a display component, a light source, an optical machine lens module, a waveguide component and an imaging lens module which are arranged along a light path;
The control module is used for controlling the display assembly to emit a correction image, projecting the correction image to the optical machine lens module by the light source, shrinking and expanding the beam by the optical machine lens module, reflecting the correction image to the imaging lens module by the waveguide assembly, and focusing the correction image to a camera by the imaging lens module for imaging, wherein the camera is used for shooting the imaging of the correction image through the AR imaging system;
the acquisition module is used for executing the following steps:
acquiring the current distance z between the optical machine lens module and the light source;
Acquiring the original length wL, the original width hL and the original imaging distance zL of the corrected image;
Determining a length wC, a width hC, an imaging distance zC of camera imaging of the corrected image from imaging on the camera;
Calculating a correction parameter corresponding to the current distance z by using the following first relation, wherein the correction parameter comprises a magnification alpha (z) and a waveguide light path deviation beta (z);
the first relation is as follows:
α(z)=wC/wL=hC/hL;
zC=α(z)*zL+β(z)。
According to the correction parameter acquisition method and device for the AR imaging system, the camera is arranged at the exit pupil of the AR imaging system to simulate human eyes, imaging of the correction image is shot, the imaging distance correction parameter and the space frequency response SFR of the AR imaging system can be obtained according to the imaging of the correction image, the AR imaging system can be corrected according to the correction parameter and the SFR, the imaging distance projected by the AR imaging system accords with intuition of a user, the interaction quality of the AR imaging system and a virtual image of the user is improved, the user has stronger immersion feeling, and the user experience is improved.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 shows a schematic diagram of an AR imaging system provided by the present application;
FIG. 2 is a flow chart illustrating a method for obtaining correction parameters for an AR imaging system according to the present application;
FIG. 3 is a schematic diagram illustrating the calibration parameter acquisition principle of an AR imaging system according to the present application;
FIG. 4 is a schematic view of an optical path of an AR imaging system according to the present application;
FIG. 5 is a flowchart of the present application for determining whether the opto-mechanical lens module is out of focus;
FIG. 6 is a schematic view of focusing and defocusing optical paths of the optical machine lens module according to the present application;
FIG. 7 is a graph showing the SFR values of the in-focus and out-of-focus of the opto-mechanical lens module according to the present application;
Fig. 8 is a schematic diagram of a correction parameter acquiring apparatus of an AR imaging system according to the present application.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
It is noted that unless otherwise indicated, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs.
Fig. 1 shows a schematic diagram of an AR imaging system according to the present application, and as shown in fig. 1, the AR imaging system includes a display module 100, a light source 200, an opto-mechanical lens module 300, a waveguide module 400, and an imaging lens module 500 disposed along an optical path. As shown in fig. 1, the present application also places a camera to simulate a human eye at the exit pupil of the AR imaging system to be corrected, taking an image of the corrected image.
The waveguide assembly 400 includes, among other things, an input coupling grating 410, a waveguide 420, and an output coupling grating 430.
Fig. 2 is a flowchart of a method for obtaining correction parameters of an AR imaging system according to the present application, as shown in fig. 2, where the method includes the following steps:
S101, controlling a display assembly to emit a correction image, projecting the correction image to the optical machine lens module by the light source, contracting and expanding the beam through the optical machine lens module, reflecting the correction image to the imaging lens module through the waveguide assembly, and focusing the correction image onto the camera for imaging through the imaging lens module;
wherein the camera is used for shooting the imaging of the correction image through the AR imaging system.
The correction image may be a knife edge image, or may be any other image with a correction function, which is not limited in the present application.
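Since the patent leaves the chart geometry open, a minimal Python sketch of generating a slanted knife-edge chart follows; the resolution, the 5-degree slant, and the full black/white gray levels are illustrative choices, not values from the patent:

```python
import numpy as np

def knife_edge_image(width=640, height=480, angle_deg=5.0):
    """Generate a slanted knife-edge test chart as an 8-bit grayscale array.

    All parameters here are illustrative assumptions; the patent only says
    the correction image may be a knife-edge image.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    slope = np.tan(np.deg2rad(angle_deg))
    # x-coordinate of the slanted edge on each row, passing through the centre.
    edge_x = width / 2.0 + slope * (ys - height / 2.0)
    return np.where(xs < edge_x, 255, 0).astype(np.uint8)

chart = knife_edge_image()
```

A slight slant is the usual convention in slanted-edge SFR measurement, because it samples the edge at sub-pixel phases across successive rows.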
S102, acquiring the current distance z between the optical machine lens module 300 and the light source 200;
S103, acquiring the original length wL and the original width hL of the corrected image and the original imaging distance zL;
The imaging distance is the distance between the imaging lens module 500 and the image formed on the camera; the original imaging distance zL is the imaging distance of the AR imaging system to be corrected, before correction.
S104, determining the length wC, the width hC and the imaging distance zC of camera imaging of the corrected image according to imaging on the camera;
S105, calculating and obtaining a correction parameter corresponding to the current distance z by using a first relation, wherein the correction parameter comprises a magnification alpha (z) and a waveguide light path deviation beta (z);
the first relation is as follows:
α(z)=wC/wL=hC/hL;
zC=α(z)*zL+β(z)。
the first relation may be obtained by the following procedure:
Before the AR imaging system is corrected, the imaging distance of the camera may be calibrated with reference to fig. 3. After aligning the optical machine lens module and the camera on the optical axis, a knife-edge image is input from the light source in a darkroom and imaged in the camera. The automatic focusing module of the optical machine lens module is then adjusted (that is, the optical machine lens module is displaced, changing the distance between it and the imaging lens module) to obtain the magnification of the knife-edge image at different distances, and a fitting curve between the standard camera imaging distance and the knife-edge image magnification is defined, namely the first relation.
As shown in fig. 1, the optical lens module is disposed on the optical axis, and the optical lens module is automatically focused at the initialization position z0, that is, the current distance between the optical lens module and the light source is z0, and at this time, the length, width and imaging distance of the camera imaging of the corrected image are (wC0,hC0,zC0), which has the following relationship:
α(z0)=wC0/wL=hC0/hL;
zC0=α(z0)*zL+β(z0).
The magnification alpha (z0) corresponding to the initialization position z0 and the waveguide light path deviation beta (z0) of the optical machine lens module are obtained through the relational expression, and finally the correction parameters (alpha, beta) are fed back to the optical machine lens module to finish measurement and correction of the imaging distance corresponding to the initialization position z0.
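The arithmetic at the initialization position follows directly from the first relation; a minimal Python sketch, in which every numeric value is made up purely for illustration:

```python
def correction_parameters(w_l, h_l, z_l, w_c, h_c, z_c):
    """Solve the first relation for alpha(z) and beta(z):
       alpha = wC / wL = hC / hL
       zC = alpha * zL + beta
    """
    alpha_w = w_c / w_l
    alpha_h = h_c / h_l
    # The two ratios should agree; averaging them is an assumption made
    # here to tolerate small measurement noise.
    alpha = (alpha_w + alpha_h) / 2.0
    beta = z_c - alpha * z_l
    return alpha, beta

# Made-up measurements: original image 10 x 8 at zL = 1000, camera image
# 20 x 16 at zC = 2050 (units arbitrary).
alpha0, beta0 = correction_parameters(10.0, 8.0, 1000.0, 20.0, 16.0, 2050.0)
```

With these illustrative numbers the magnification comes out as 2.0 and the waveguide optical path deviation as 2050 - 2.0*1000 = 50.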
In some embodiments of the application, the step of constructing the first relation comprises:
The distance between the optical machine lens module and the light source is changed several times, each time by ±Δz;
Recording the correction parameters corresponding to each Δz;
Constructing the first relation according to the correction parameters corresponding to each Δz.
Specifically, the correction process at the initialization position z0 is extended. As shown in fig. 4, the distance between the optical lens module 300 and the light source 200 is changed in steps of ±Δz, for example z0-Δz, z0-2Δz, z0-3Δz. For each step Δz, the imaging distance, the magnification α(z) of the corrected image, and the waveguide optical path deviation β(z) are recorded, finally yielding the correction relationship zC=α(z)*zL+β(z). Feeding this correction relation back to the optical machine lens module completes the correction of the imaging distance and improves its accuracy.
In some embodiments of the present application, the method for obtaining correction parameters of an AR imaging system may further include calculating a spatial frequency response SFR value of camera imaging of the correction image, and determining whether the opto-mechanical lens module is out of focus according to the SFR value.
SFR (Spatial Frequency Response) measures how image contrast falls off as spatial frequency increases and evaluates the resolving power of the imaging system as a whole; the closer the SFR value is to 1, the better the imaging effect of the imaging system.
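The full slanted-edge SFR procedure is standardized (ISO 12233); the heavily simplified 1-D sketch below keeps only the core ESF → LSF → FFT chain, enough to illustrate why a wider (defocused-looking) edge yields lower SFR values:

```python
import numpy as np

def sfr_1d(edge_profile):
    """Heavily simplified 1-D SFR: differentiate the edge-spread function
    (ESF) to get the line-spread function (LSF), take its FFT magnitude,
    and normalise so that the SFR at zero frequency equals 1.

    Real slanted-edge SFR (ISO 12233) adds sub-pixel edge binning and
    windowing; this sketch keeps only the core chain for illustration.
    """
    esf = np.asarray(edge_profile, dtype=float)
    lsf = np.diff(esf)                 # line-spread function
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                # SFR(0) == 1 by construction

# Two synthetic edges: defocus widens the transition region.
x = np.arange(-32, 32)
sharp = 1.0 / (1.0 + np.exp(-x / 1.0))   # in-focus-looking edge
soft = 1.0 / (1.0 + np.exp(-x / 4.0))    # defocused-looking edge
sfr_sharp = sfr_1d(sharp)
sfr_soft = sfr_1d(soft)
```

At any nonzero frequency bin the wider edge produces the smaller SFR value, which is exactly the drop the defocus test in the following steps exploits.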
Specifically, as shown in fig. 5, the step of determining whether the optomechanical lens module is out of focus according to the SFR value includes:
S201, judging whether an SFR value of camera imaging of the corrected image at a preset space frequency is smaller than a standard SFR value, wherein the standard SFR value refers to the SFR value at the preset space frequency when the optical machine lens module is focused, and is calibrated in advance;
s202, if yes, determining that the optical machine lens module is out of focus;
s203, if not, determining that the optical machine lens module focuses.
Specifically, by analyzing the SFR value of the camera imaging of the corrected image, it can be determined whether the projection of the optical machine lens module is on the focusing plane, that is, whether the optical machine lens module is out of focus or in focus can be determined according to the SFR value.
As shown in fig. 6, the optical lens module 300 in the left view is in focus, producing a sharp camera image, while in the right view it is out of focus, producing a blurred camera image.
As shown in fig. 7, when the optical machine lens module is out of focus, the SFR value of the camera image at the preset spatial frequency drops significantly below the standard SFR value. For example, with a preset spatial frequency of 0.25, fig. 7 shows that the SFR value at that frequency when the module is out of focus is clearly smaller than the SFR value when it is in focus.
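The decision rule of steps S201 to S203 reduces to a single comparison; the numeric values below (a standard SFR of 0.60 at the preset spatial frequency of 0.25) are illustrative only, not from the patent:

```python
def is_defocused(sfr_at_preset_frequency, standard_sfr):
    """S201-S203: the optical machine lens module is judged out of focus
    when the measured SFR at the preset spatial frequency is less than the
    standard (in-focus) SFR value calibrated in advance."""
    return sfr_at_preset_frequency < standard_sfr

# Illustrative values: standard SFR 0.60 at spatial frequency 0.25.
STANDARD_SFR = 0.60
assert is_defocused(0.35, STANDARD_SFR)        # clearly blurred image
assert not is_defocused(0.62, STANDARD_SFR)    # meets the calibrated standard
```

Note the strict "less than" in the comparison mirrors S201, so a measured value exactly equal to the calibrated standard counts as in focus.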
According to the correction parameter acquisition method of the AR imaging system, the camera is placed at the exit pupil of the AR imaging system to simulate human eyes, imaging of the correction image is shot, the imaging distance correction parameter and the space frequency response SFR of the AR imaging system can be obtained according to the imaging of the correction image, the AR imaging system can be corrected according to the correction parameter and the SFR, the imaging distance projected by the AR imaging system accords with intuition of a user, the interaction quality of the AR imaging system and a virtual image of the user is improved, the user has stronger immersion feeling, and the user experience is improved.
The embodiment of the application also provides a correction parameter acquisition device of the AR imaging system, which corresponds to the correction parameter acquisition method of the AR imaging system, as shown in FIG. 8, and comprises a control module 101 and an acquisition module 102;
the AR imaging system comprises a display component, a light source, an optical machine lens module, a waveguide component and an imaging lens module which are arranged along a light path;
The control module 101 is configured to control the display module to emit a correction image, project the correction image to the optical machine lens module by the light source, expand and contract the beam through the optical machine lens module, reflect the correction image to the imaging lens module through the waveguide module, and focus the correction image to the camera for imaging through the imaging lens module;
the acquiring module 102 is configured to perform the following steps:
acquiring the current distance z between the optical machine lens module and the light source;
Acquiring the original length wL, the original width hL and the original imaging distance zL of the corrected image;
Determining a length wC, a width hC, an imaging distance zC of camera imaging of the corrected image from imaging on the camera;
Calculating a correction parameter corresponding to the current distance z by using the following first relation, wherein the correction parameter comprises a magnification alpha (z) and a waveguide light path deviation beta (z);
the first relation is as follows:
α(z)=wC/wL=hC/hL;
zC=α(z)*zL+β(z)。
in one possible implementation, the obtaining module 102 is further configured to:
The distance between the optical machine lens module and the light source is changed several times, each time by ±Δz;
Recording the correction parameters corresponding to each Δz;
Correcting the AR imaging system according to the correction parameters corresponding to each Δz.
In one possible implementation, the obtaining module 102 is further configured to:
Calculating a spatial frequency response, SFR, value of a camera image of the corrected image;
Judging whether the optical machine lens module is out of focus or not according to the SFR value.
In one possible implementation manner, the acquiring module 102 is specifically configured to:
Judging whether the SFR value of the camera imaging of the corrected image at the preset space frequency is smaller than a standard SFR value, wherein the standard SFR value refers to the SFR value at the preset space frequency when the optical machine lens module focuses;
if yes, determining that the optical machine lens module is out of focus;
If not, determining that the optical machine lens module focuses.
In one possible implementation, the correction image employs a knife edge image.
According to the correction parameter acquisition device of the AR imaging system, the camera is arranged at the exit pupil of the AR imaging system to simulate human eyes, imaging of a correction image is shot, imaging distance correction parameters and space frequency response SFR of the AR imaging system can be obtained according to imaging of the correction image, the AR imaging system can be corrected according to the correction parameters and the SFR, the imaging distance projected by the AR imaging system accords with intuition of a user, the interaction quality of the AR imaging system and virtual images of the user is improved, the user has stronger immersion feeling, and the user experience is improved.
It should be noted that:
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components of the apparatus according to an embodiment of the present application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the invention, and all equivalent structural changes made by the specification and drawings of the present invention or direct/indirect application in other related technical fields are included in the scope of the present invention.

Claims (8)

Translated from Chinese
1.一种AR成像系统的校正参数获取方法,其特征在于,包括:1. A method for obtaining correction parameters of an AR imaging system, comprising:所述AR成像系统包括沿光路设置的显示组件、光源、光机透镜模组、波导组件和成像透镜模组;The AR imaging system includes a display component, a light source, an optomechanical lens module, a waveguide component and an imaging lens module arranged along an optical path;控制所述显示组件发出校正图像,由所述光源投影校正图像至所述光机透镜模组,再经由所述光机透镜模组缩扩束,再经由所述波导组件反射至所述成像透镜模组,再经由所述成像透镜模组聚焦至相机上成像;所述相机用于拍摄所述校正图像经过所述AR成像系统的成像;Control the display component to emit a correction image, and the light source projects the correction image to the optical machine lens module, and then the correction image is contracted and expanded by the optical machine lens module, and then reflected to the imaging lens module by the waveguide component, and then focused to the camera by the imaging lens module to form an image; the camera is used to capture the image of the correction image passing through the AR imaging system;获取所述光机透镜模组和所述光源之间的当前距离z;Obtaining a current distance z between the optomechanical lens module and the light source;获取所述校正图像原始的长wL、宽hL,以及原始成像距zLObtaining the original length wL , width hL , and original imaging distance zL of the corrected image;根据所述相机上的成像,确定所述校正图像的相机成像的长wC、宽hC、成像距zCDetermine the length wC , width hC , and imaging distance zC of the camera imaging of the correction image according to the imaging on the camera;利用以下第一关系式计算得到所述当前距离z对应的校正参数,所述校正参数包括放大倍率α(z)与波导光路偏差β(z);The correction parameters corresponding to the current distance z are calculated using the following first relationship, wherein the correction parameters include the magnification α(z) and the waveguide optical path deviation β(z);所述第一关系式如下:The first relationship is as follows:α(z)=wC/wL=hC/hLα(z)=wC /wL =hC /hL ;zC=α(z)*zL+β(z);zC =α(z)*zL +β(z);构建所述第一关系式的步骤包括:The steps of constructing the first relational expression include:多次改变所述光机透镜模组和所述光源之间的距离,每次的变化量为±Δz;Changing the distance between the optical machine lens module and the light source multiple times, with 
each change being ±Δz; recording the correction parameters corresponding to each Δz; and constructing the first relation from the correction parameters corresponding to each Δz.

2. The method according to claim 1, wherein the method further comprises: calculating the spatial frequency response (SFR) value of the camera image of the correction image; and determining, according to the SFR value, whether the optomechanical lens module is out of focus.

3. The method according to claim 2, wherein determining according to the SFR value whether the optomechanical lens module is out of focus comprises: determining whether the SFR value of the camera image of the correction image at a preset spatial frequency is less than a standard SFR value, the standard SFR value being the SFR value at the preset spatial frequency when the optomechanical lens module is in focus; if so, determining that the optomechanical lens module is out of focus; if not, determining that the optomechanical lens module is in focus.

4. The method according to any one of claims 1 to 3, wherein the correction image is a knife-edge image.

5. A correction parameter acquisition device for an AR imaging system, wherein the device comprises a control module and an acquisition module; the AR imaging system comprises a display component, a light source, an optomechanical lens module, a waveguide component and an imaging lens module arranged along an optical path; the control module is configured to control the display component to emit a correction image, which the light source projects onto the optomechanical lens module, where the beam is contracted and expanded, then reflected by the waveguide component to the imaging lens module, and finally focused by the imaging lens module onto a camera for imaging; the camera is used to capture the image formed by the correction image after passing through the AR imaging system; the acquisition module is configured to perform the following steps: obtaining the current distance z between the optomechanical lens module and the light source; obtaining the original length wL, width hL and original imaging distance zL of the correction image; determining, from the image on the camera, the length wC, width hC and imaging distance zC of the camera image of the correction image; and calculating the correction parameters corresponding to the current distance z using the following first relation, the correction parameters comprising the magnification α(z) and the waveguide optical-path deviation β(z); the first relation being:

α(z) = wC/wL = hC/hL;

zC = α(z)·zL + β(z);

the acquisition module being further configured to: change the distance between the optomechanical lens module and the light source multiple times, each change being ±Δz; record the correction parameters corresponding to each Δz; and construct the first relation from the correction parameters corresponding to each Δz.

6. The device according to claim 5, wherein the acquisition module is further configured to: calculate the spatial frequency response (SFR) value of the camera image of the correction image; and determine, according to the SFR value, whether the optomechanical lens module is out of focus.

7. The device according to claim 6, wherein the acquisition module is specifically configured to: determine whether the SFR value of the camera image of the correction image at a preset spatial frequency is less than a standard SFR value, the standard SFR value being the SFR value at the preset spatial frequency when the optomechanical lens module is in focus; if so, determine that the optomechanical lens module is out of focus; if not, determine that the optomechanical lens module is in focus.

8. The device according to any one of claims 5 to 7, wherein the correction image is a knife-edge image.
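The computations named in the claims can be sketched in code. The following Python sketch is illustrative only, not the patentee's implementation: the function names (`correction_params`, `is_defocused`, `build_relation`) and every numeric value are hypothetical stand-ins, and the SFR measurement itself (e.g. a knife-edge/slanted-edge analysis) is assumed to be done elsewhere. It applies the first relation α(z) = wC/wL = hC/hL and zC = α(z)·zL + β(z), the claim-3 defocus test, and the ±Δz sweep of claim 1.

```python
# Illustrative sketch of the claimed computations -- hypothetical names/values.

def correction_params(wL, hL, zL, wC, hC, zC):
    """Correction parameters for one light-source/lens-module distance z.

    First relation from the claims:
        alpha(z) = wC / wL = hC / hL      (magnification)
        zC = alpha(z) * zL + beta(z)      (beta = waveguide optical-path deviation)
    """
    # The claims state wC/wL == hC/hL; average the two to tolerate pixel noise.
    alpha = 0.5 * (wC / wL + hC / hL)
    beta = zC - alpha * zL
    return alpha, beta

def is_defocused(measured_sfr, standard_sfr):
    """Claim 3: out of focus iff the SFR of the camera image at the preset
    spatial frequency is below the in-focus (standard) SFR value."""
    return measured_sfr < standard_sfr

def build_relation(z0, dz_steps, measure):
    """Claim-1 sweep: move the lens module by +/- dz several times, measure the
    corrected image each time, and tabulate (alpha, beta) against z.

    measure(z) -> (wC, hC, zC); wL, hL, zL are the fixed source-image values.
    """
    wL, hL, zL = 40.0, 30.0, 1000.0   # hypothetical source length/width/imaging distance
    table = {}
    for step in dz_steps:             # e.g. [+dz, -dz, +2*dz, -2*dz, ...]
        z = z0 + step
        wC, hC, zC = measure(z)
        table[z] = correction_params(wL, hL, zL, wC, hC, zC)
    return table
```

From the resulting table, α(z) and β(z) can be interpolated or fitted for intermediate distances, which is one way to "construct the first relation" from the recorded per-Δz parameters.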
CN202111660633.1A — priority date 2021-12-30, filing date 2021-12-30 — Method and device for obtaining correction parameters of AR imaging system — Active — granted as CN114441142B (en)

Priority Applications (1)

- CN202111660633.1A — priority date 2021-12-30, filing date 2021-12-30 — Method and device for obtaining correction parameters of AR imaging system (CN114441142B, en)

Applications Claiming Priority (1)

- CN202111660633.1A — priority date 2021-12-30, filing date 2021-12-30 — Method and device for obtaining correction parameters of AR imaging system (CN114441142B, en)

Publications (2)

- CN114441142A (en) — published 2022-05-06
- CN114441142B (en) — published 2025-02-11 (grant)

Family

ID=81365709

Family Applications (1)

- CN202111660633.1A — Active — priority date 2021-12-30, filing date 2021-12-30 — CN114441142B

Country Status (1)

- CN — CN114441142B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party

- CN115685485A * — priority date 2022-09-22, published 2023-02-03 — GoerTek Optical Technology Co., Ltd. — A method, system and device for confirming the focus position of an optical module

Citations (3)

- US6194697B1 (en) * — priority date 1999-04-13, published 2001-02-27 — Hewlett-Packard Company — Calibration system for an imaging apparatus and method
- CN110780445A (zh) * — priority date 2018-11-12, published 2020-02-11 — 芋头科技(杭州)有限公司 — Method and system for active calibration of an assembled optical imaging system
- CN113114905A (zh) * — priority date 2021-04-16, published 2021-07-13 — 广州立景创新科技有限公司 — Camera module, focusing adjustment system and focusing method

Family Cites Families (18)

- JP2590891B2 * — priority date 1987-07-02, published 1997-03-12 — 株式会社ニコン — Projection optical device
- JP3483381B2 * — priority date 1996-01-06, published 2004-01-06 — キヤノン株式会社 — Image forming apparatus and magnification correction method using the same
- JPH10261563A * — priority date 1997-03-18, published 1998-09-29 — Nikon Corp — Projection exposure method and apparatus
- JP2007072296A * — priority date 2005-09-08, published 2007-03-22 — Canon Inc — Image correction method for image correction for frequency response of image to display device
- JP5544901B2 * — priority date 2010-01-26, published 2014-07-09 — カシオ計算機株式会社 — Angle-of-view center deviation correction apparatus, imaging apparatus, angle-of-view center deviation correction method, and program
- US8576390B1 * — priority date 2012-07-31, published 2013-11-05 — Cognex Corporation — System and method for determining and controlling focal distance in a vision system camera
- US20170169612A1 * — priority date 2015-12-15, published 2017-06-15 — N.S. International, LTD — Augmented reality alignment system and method
- CN107563987A * — priority date 2016-07-01, published 2018-01-09 — 北京疯景科技有限公司 — Demarcate the method and device of imaging difference
- KR20200102408A * — priority date 2018-01-02, published 2020-08-31 — 루머스 리미티드 — Augmented Reality Display with Active Alignment and Corresponding Method
- US11002969B2 * — priority date 2018-01-25, published 2021-05-11 — Facebook Technologies, Llc — Light projection system including an optical assembly for correction of differential distortion
- US10725302B1 * — priority date 2018-11-02, published 2020-07-28 — Facebook Technologies, Llc — Stereo imaging with Fresnel facets and Fresnel reflections
- US20200186786A1 * — priority date 2018-12-06, published 2020-06-11 — Novarad Corporation — Calibration for Augmented Reality
- CN110160749B * — priority date 2019-06-05, published 2022-12-06 — 歌尔光学科技有限公司 — Calibration device and calibration method applied to augmented reality equipment
- CN110650290B * — priority date 2019-10-12, published 2021-06-15 — 惠州市德赛自动化技术有限公司 — Active focusing adjustment method for camera
- US20210141130A1 * — priority date 2019-11-12, published 2021-05-13 — Facebook Technologies, Llc — High-index waveguide for conveying images
- CN112040224A * — priority date 2020-08-31, published 2020-12-04 — 南昌欧菲晶润科技有限公司 — Method, medium and electronic device for verifying camera module performance test equipment
- CN112326206B * — priority date 2020-11-06, published 2023-06-13 — 歌尔光学科技有限公司 — AR module binocular fusion detection device and detection method
- CN113124830B * — priority date 2021-04-09, published 2024-07-19 — 广州得尔塔影像技术有限公司 — Imaging optical gradient testing method and testing equipment for camera module


Also Published As

- CN114441142A (en) — published 2022-05-06


Legal Events

- PB01 — Publication
- SE01 — Entry into force of request for substantive examination
- TA01 — Transfer of patent application right — Effective date of registration: 2022-11-18 — Applicant after: GOERTEK TECHNOLOGY Co., Ltd., No. 500 Songling Road, Laoshan District, Qingdao City, Shandong Province, 266100 — Applicant before: GoerTek Optical Technology Co., Ltd., Plant 1, Phase III, Goer Photoelectric Industrial Park, No. 3999 Huixian Road, Yongchun Community, Qingchi Street, High-tech Zone, Weifang City, Shandong Province, 261000
- GR01 — Patent grant
