Disclosure of Invention
To address the defects of the prior art, the invention provides a measuring method and a measuring device for a near-eye display, and aims to solve the problems in the prior art of large measurement errors for the exit pupil, eye point and optical axis of a near-eye display, as well as complex and expensive equipment, complicated operation and low efficiency.
In order to achieve the above purpose, the technical solution adopted by the invention is as follows:
the invention provides a measuring method for a near-eye display, which uses an area array photoelectric sensor without an imaging lens together with a transmission device to obtain the exit pupil parameters of the near-eye display to be measured, and specifically comprises the following steps. The area array photoelectric sensor is arranged in the human-eye viewing region of the near-eye display to be measured, this region also being called the eye box of the near-eye display, and the area array photoelectric sensor directly receives the light output from the near-eye display to be measured and obtains an illumination distribution image. According to the optical imaging principle of the near-eye display, the exit pupil is the common exit of the imaging beams from every point of the near-eye display; that is, the emergent rays of the output image of the near-eye display all pass through this exit, so every point on the exit contains light information from every point of the whole output image, and the illumination distribution there is uniform with a distinct boundary. If the area array photoelectric sensor is located at the exit pupil of the near-eye display, and the size of its sensitive area can completely cover the whole exit pupil, the sensor receives an exit pupil image with uniform illumination distribution and a clear boundary; at this position the area enclosed by the sharpest image boundary is smaller than the area of the spot image received when the sensor is located at any other position. If the area array photoelectric sensor deviates from the exit pupil, the illumination distribution image it obtains becomes larger and its boundary becomes blurred, and not every point in the image receives rays from every point of the output image plane of the near-eye display. Therefore, the area array photoelectric sensor or the near-eye display to be measured is controlled by the transmission device so that the two move relative to each other, the area array photoelectric sensor obtains illumination distribution images at different spatial positions, and an image boundary recognition algorithm is used to analyse these images; when the obtained image has the sharpest boundary, its area is the smallest relative to the other positions, the plane on which the area array photoelectric sensor then lies is the plane of the exit pupil of the near-eye display to be measured, and the illuminated area obtained by the sensor is the exit pupil of the near-eye display to be measured. As shown in fig. 1, 1 is the near-eye display to be measured, 2 is the object plane of the near-eye display to be measured, 3 is the optical axis of the near-eye display to be measured, 5 is the illumination distribution image formed when a sufficiently large area array photoelectric sensor receives the rays emitted by the image-space picture output by the near-eye display to be measured, 6 is the area array photoelectric sensor, and 7 is the sensitive area of the area array photoelectric sensor.
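The search described above can be illustrated with the following minimal Python sketch. It is illustrative only: `capture_at` is a hypothetical function returning the illumination distribution image recorded by the area array photoelectric sensor at a given axial position, and the gradient-based score is merely one possible realization of the boundary-sharpness criterion.

```python
import numpy as np

def boundary_sharpness(img: np.ndarray) -> float:
    """Score boundary sharpness: mean gradient magnitude over the
    strongest-gradient pixels, i.e. the boundary region of the spot."""
    gy, gx = np.gradient(img.astype(float))
    grad = np.hypot(gx, gy)
    edge = grad > 0.5 * grad.max()            # keep only boundary-like pixels
    return float(grad[edge].mean()) if edge.any() else 0.0

def find_exit_pupil_plane(z_positions, capture_at):
    """Scan the sensor along the viewing direction; the position whose image
    has the sharpest boundary is taken as the exit pupil plane.
    capture_at(z) is a hypothetical acquisition function."""
    scores = [boundary_sharpness(capture_at(z)) for z in z_positions]
    best = int(np.argmax(scores))
    return z_positions[best], scores
```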
In the above technical solution, the exit pupil parameters include: spatial position information of the exit pupil, the eye point and the optical axis, and two-dimensional size information of the exit pupil boundary. In order to avoid ambiguity, the technical solution further describes the exit pupil, the eye point and the optical axis as follows: the exit pupil is the optical image of the entrance pupil of the optical system of the near-eye display to be measured, and is the common exit of the outgoing beams formed after every point on the object plane (namely the original emitting plane of the displayed image) is imaged by the optical system of the near-eye display; the position and size of the exit pupil are important parameters of the near-eye display. The eye point lies on the plane of the exit pupil at the center of the exit pupil, and is also the intersection point of the optical axis with the exit pupil; the optical axis is the straight line passing through the eye point and perpendicular to the plane of the exit pupil. As shown in fig. 2, 1 is the near-eye display to be measured, 2 is the object plane of the near-eye display to be measured, 3 is the optical axis of the near-eye display to be measured, 4 is the exit pupil, 8 is the image-space picture output by the near-eye display to be measured (the image plane, also referred to in this patent as the output picture of the near-eye display to be measured), 9 is the eye box, and 10 is the eye point.
For convenience of description, the picture output by the near-eye display refers to the image-space picture that is emitted from the object plane of the near-eye display and imaged by the optical system of the near-eye display.
Further, in the above technical solution, once the area array photoelectric sensor lies in the plane of the exit pupil of the near-eye display to be measured, the center of the illumination distribution image obtained by the sensor is determined with an image center recognition algorithm, thereby obtaining the center position of the exit pupil; this center position is the eye point of the near-eye display to be measured.
Further, in the above technical solution, the optical axis of the near-eye display to be measured is obtained as follows: after the center position of the exit pupil has been obtained with the image center recognition algorithm, a perpendicular to the exit pupil of the near-eye display to be measured is erected through this center position, and the straight line along this perpendicular is the optical axis of the near-eye display to be measured.
Further, in the above technical solution, the image boundary recognition algorithm includes, but is not limited to, a boundary sharpness recognition algorithm and/or a boundary contrast recognition algorithm. The sharpest boundary is obtained when the image boundary sharpness and/or contrast reaches an extreme value. The algorithm is used to judge the clarity of the boundary of the illumination distribution image obtained at each orientation and position.
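As one possible, non-limiting implementation of such criteria, the sketch below assumes a 1-D intensity profile sampled across the spot boundary (for instance along a line perpendicular to the boundary), scores sharpness by the 10%-90% edge transition width and contrast by the Michelson formula; both quantities are only examples of the sharpness and contrast criteria named above.

```python
import numpy as np

def edge_width(profile: np.ndarray) -> float:
    """10%-90% transition width of a 1-D intensity profile taken across the
    boundary (dark outside, bright inside); smaller means sharper."""
    p = profile.astype(float)
    lo, hi = p.min(), p.max()
    if hi <= lo:
        return float("inf")                    # flat profile: no boundary
    norm = (p - lo) / (hi - lo)
    idx10 = int(np.argmax(norm >= 0.1))        # first sample above 10 %
    idx90 = int(np.argmax(norm >= 0.9))        # first sample above 90 %
    return float(abs(idx90 - idx10))

def michelson_contrast(profile: np.ndarray) -> float:
    """Boundary contrast criterion: (Imax - Imin) / (Imax + Imin)."""
    p = profile.astype(float)
    return float((p.max() - p.min()) / (p.max() + p.min() + 1e-12))
```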
Further, in the above technical solution, the image center recognition algorithm includes, but is not limited to, a gray-scale gravity center method or a geometric method. The gray-scale gravity center method calculates the coordinates of the gray-level weighted center from the gray-level distribution of the illumination distribution image obtained by the area array photoelectric sensor. The geometric method obtains the figure enclosed by the boundary of the illumination distribution image obtained by the area array photoelectric sensor and calculates the geometric center of that figure.
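A minimal illustration of both methods is given below; it assumes the illumination distribution image is available as a NumPy array and that a simple intensity threshold is adequate to delimit the enclosed figure, which is a simplification of the boundary extraction described above.

```python
import numpy as np

def gray_centroid(img: np.ndarray):
    """Gray-scale gravity center: gray-value-weighted mean pixel coordinate."""
    img = img.astype(float)
    total = img.sum()
    if total == 0:
        raise ValueError("image is completely dark")
    rows, cols = np.indices(img.shape)
    return (float((rows * img).sum() / total),
            float((cols * img).sum() / total))

def geometric_center(img: np.ndarray, thresh: float):
    """Geometric method: centroid of the region enclosed by the boundary,
    here approximated by thresholding the illumination image."""
    rows, cols = np.nonzero(img.astype(float) >= thresh)
    return float(rows.mean()), float(cols.mean())
```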
Further, in the above technical solution, when the size of the light-sensitive surface of the area array photoelectric sensor is smaller than the size of the exit pupil of the near-eye display to be measured, the sensor cannot obtain the complete illumination pattern in one exposure; instead it obtains illumination distribution images step by step at different positions, and the image boundary recognition algorithm is used to judge them: when the illumination distribution images have the sharpest boundary or the relatively smallest area, the area array photoelectric sensor is located on the exit pupil of the near-eye display to be measured. The area array photoelectric sensor and the near-eye display to be measured are translated relative to each other by the transmission device, the complete exit pupil is obtained by stitching, and the eye point and the optical axis of the near-eye display to be measured can then be obtained from the center position of the exit pupil.
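For illustration, a stitching sketch is shown below under the assumption that the stage translations are known and can be converted into integer pixel offsets; this convention, as well as the omission of sub-pixel registration and overlap blending, is an assumption made only for the sketch.

```python
import numpy as np

def stitch_exit_pupil(tiles, offsets_px):
    """Stitch sub-images taken at known sensor translations into one map.
    tiles: list of 2-D arrays; offsets_px: (row, col) origin of each tile in
    the stitched frame, derived from the stage translation and pixel pitch."""
    rows = max(r + t.shape[0] for t, (r, _) in zip(tiles, offsets_px))
    cols = max(c + t.shape[1] for t, (_, c) in zip(tiles, offsets_px))
    canvas = np.zeros((rows, cols))
    for tile, (r0, c0) in zip(tiles, offsets_px):
        canvas[r0:r0 + tile.shape[0], c0:c0 + tile.shape[1]] = tile
    return canvas
```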
Further, in the above technical solution, the area array photoelectric sensor or the near-eye display to be measured is controlled by the transmission device so that the two move relative to each other, the area array photoelectric sensor obtains illumination distribution images at different spatial positions, and an image boundary recognition algorithm is used to analyse these images; when the area of the illumination distribution image obtained by the area array photoelectric sensor is the smallest relative to the other positions, the plane on which the sensor lies is the plane of the exit pupil of the near-eye display to be measured, and the illuminated area obtained by the sensor is the exit pupil of the near-eye display to be measured. Furthermore, this minimum-area criterion can be used to verify the spatial position of the exit pupil obtained with the boundary-sharpness criterion applied to the illumination distribution images obtained by the area array photoelectric sensor.
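A sketch of this mutual verification is given below. It reuses a hypothetical `capture_at` acquisition function and a user-supplied sharpness function, and treats the pixel count above an illumination threshold as the area criterion; these choices are assumptions for illustration only.

```python
import numpy as np

def spot_area(img: np.ndarray, thresh: float) -> int:
    """Area criterion: number of pixels above an illumination threshold."""
    return int((img.astype(float) >= thresh).sum())

def cross_check_planes(z_positions, capture_at, thresh, sharpness_fn):
    """Locate the exit pupil plane independently by minimum spot area and by
    maximum boundary sharpness, so the two results can verify each other."""
    images = [capture_at(z) for z in z_positions]        # hypothetical capture
    z_area = z_positions[int(np.argmin([spot_area(i, thresh) for i in images]))]
    z_sharp = z_positions[int(np.argmax([sharpness_fn(i) for i in images]))]
    return z_area, z_sharp, abs(z_area - z_sharp)        # discrepancy check
```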
Further, in the above technical solution, the relative motion between the area array photoelectric sensor and the near-eye display to be measured includes translation in the three directions up-down, left-right and front-back, rotation about two or more mutually perpendicular rotation axes, or compound motion with multiple degrees of freedom.
In some alternative embodiments, the pictures output by the near-eye display to be measured include, but are not limited to, a full white picture or a white frame on a black background. According to the optical imaging principle of the near-eye display, the exit pupil is the common exit of the imaging beams from every point on the object plane of the near-eye display, and the emergent rays all pass through this plane, i.e. every point on this plane contains illumination information from every point of the complete image. The type of output picture of the near-eye display to be measured therefore does not affect the illumination uniformity or the boundary characteristics of the exit pupil.
Further, in the above technical solution, the process of controlling the area array photoelectric sensor and the near-eye display to be measured with the transmission device so that the two move relative to each other specifically includes: processing and analysing the illumination distribution image obtained at the current position of the area array photoelectric sensor to obtain the corresponding position and posture ("pose") adjustment information, and adjusting the relative position of the area array photoelectric sensor and the near-eye display to be measured through the transmission device.
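By way of example only, the lateral part of such pose adjustment information might be derived as below: the offset of the gray-scale gravity center from the sensor center is converted into a stage translation using an assumed pixel pitch. The names `sensor` and `stage` in the usage comment are hypothetical device wrappers, not part of the claimed apparatus.

```python
import numpy as np

def pose_adjustment(img: np.ndarray, pixel_pitch_mm: float):
    """Lateral correction: offset of the gray centroid from the sensor
    center, converted to millimetres via the pixel pitch."""
    img = img.astype(float)
    rows, cols = np.indices(img.shape)
    total = img.sum() + 1e-12
    cr = (rows * img).sum() / total
    cc = (cols * img).sum() / total
    dr = (cr - (img.shape[0] - 1) / 2.0) * pixel_pitch_mm
    dc = (cc - (img.shape[1] - 1) / 2.0) * pixel_pitch_mm
    return dr, dc   # pass to the transmission device as a translation command

# usage sketch (hypothetical wrappers):
# dr, dc = pose_adjustment(sensor.read(), pixel_pitch_mm=0.0055)
# stage.translate(dx=dc, dy=dr)
```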
The invention also discloses a measuring method for a near-eye display, which uses an area array photoelectric sensor without an imaging lens, a transmission device and an imaging device to obtain the optical axis position of the near-eye display to be measured, and specifically comprises the following steps:
S1, the near-eye display to be measured outputs a first detection picture, with or without an image center mark; the area array photoelectric sensor is arranged in the human-eye viewing region of the near-eye display to be measured, directly receives the light output from the near-eye display to be measured and obtains an illumination distribution image;
S2, the center position of the illumination distribution image is obtained with an image center recognition algorithm, and the area array photoelectric sensor and/or the imaging device and/or the near-eye display to be measured are controlled by the transmission device so as to adjust their poses, so that the center of the entrance pupil of the imaging device is placed at the center point of the image obtained by the area array photoelectric sensor;
S3, the near-eye display to be measured outputs a second detection picture with an image center mark, the imaging device focuses on the second detection picture, the imaging device or the near-eye display to be measured is controlled by the transmission device so that the two move relative to each other until the center of the receiving target surface of the imaging device coincides with the center of the acquired second detection picture, and the imaging device and its optical axis position are recorded at this moment;
S4, the near-eye display to be measured outputs the first detection picture, and the area array photoelectric sensor and/or the imaging device and/or the near-eye display to be measured are controlled by the transmission device so as to adjust their relative poses, so that the area array photoelectric sensor is oriented perpendicular to the optical axis of the imaging device recorded in step S3; the area array photoelectric sensor is then adjusted back and forth along the direction of this optical axis until the illumination distribution image with the sharpest boundary is obtained;
S5, steps S2 to S4 are repeated until the sharpness of the boundary of the illumination distribution image obtained in step S4 reaches a set range; the optical axis position recorded at this point is the optical axis position of the near-eye display to be measured, the area occupied by the sharply bounded illumination distribution image obtained by the area array photoelectric sensor is the exit pupil of the near-eye display, and the center of this illumination distribution image is the eye point. An illustrative program sketch of this iterative loop is given below.
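The following Python sketch outlines one possible realization of steps S1 to S5. All device objects (`display`, `sensor`, `camera`, `stage`) and their methods are hypothetical wrappers standing in for the near-eye display under test, the area array photoelectric sensor, the imaging device and the transmission device; the control flow is illustrative rather than prescriptive.

```python
def measure_optical_axis(display, sensor, camera, stage, sharpness_fn,
                         center_fn, sharpness_tol, max_iter=20):
    """Iterative sketch of steps S1-S5; every device method is a hypothetical
    interface (show, read, optical_axis_pose, move/rotate/focus commands)."""
    axis, exit_pupil_img = None, None
    for _ in range(max_iter):
        display.show("first_detection_picture")            # S1
        eye_box_center = center_fn(sensor.read())           # S2: image-center algorithm
        stage.place_entrance_pupil_at(eye_box_center)       #     camera pupil -> center
        display.show("second_detection_picture")            # S3: center-marked picture
        stage.rotate_camera_about_entrance_pupil(camera)    #     align target center & mark
        axis = camera.optical_axis_pose()                    #     record candidate axis
        display.show("first_detection_picture")             # S4
        stage.set_sensor_perpendicular_to(axis)              #     sensor perpendicular to axis
        stage.focus_sensor_along(axis, sharpness_fn)         #     axial search, sharpest edge
        exit_pupil_img = sensor.read()
        if sharpness_fn(exit_pupil_img) >= sharpness_tol:    # S5: converged?
            break
    return axis, exit_pupil_img        # optical axis pose and exit pupil image
```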
In the above technical solution, according to the near-eye display imaging principle, when the area array photoelectric sensor obtains the illumination distribution image with the relatively sharpest boundary, the area enclosed by that image boundary is smaller than the area of the spot image received when the sensor is located at any other position. Thus, for step S4, the size of the illumination distribution image area may be used instead of the sharpness of the illumination distribution image boundary as the criterion, and the exit pupil parameters obtained with the two criteria can be used to verify each other.
According to the optical imaging principle of the near-eye display, the optical axis of the near-eye display to be measured passes through the eye point perpendicular to the exit pupil, every illuminated plane in the eye box parallel to the exit pupil is perpendicular to the optical axis, and the optical axis of the near-eye display to be measured passes through the center of that illuminated area. Therefore, the illumination distribution image obtained by the area array photoelectric sensor can be analysed with an image center recognition algorithm to obtain its center, the center of the entrance pupil of the imaging device is then placed at this center, the near-eye display to be measured is controlled to output the second detection picture, and with the entrance pupil as the rotation center the relative pose of the imaging device and the near-eye display to be measured is adjusted by the transmission device until the center of the imaging device coincides with the picture center of the near-eye display to be measured, at which moment the optical axis position of the imaging device is recorded. The optical axis position of the imaging device then substantially coincides with the optical axis position of the near-eye display. In order to obtain the optical axis position of the near-eye display to be measured more accurately, verification and adjustment are required: the area array photoelectric sensor is placed perpendicular to the optical axis position of the imaging device just obtained, the center of the illumination distribution image is obtained again, the relative pose of the imaging device and the near-eye display to be measured is adjusted according to this center and the picture center of the near-eye display to be measured, and the optical axis position of the imaging device is obtained again. The relative poses of the area array photoelectric sensor, the near-eye display and the imaging device are adjusted back and forth by the transmission device, with the area array photoelectric sensor used to obtain the sharply bounded illumination distribution image, namely the exit pupil of the near-eye display, until the boundary sharpness of the illumination distribution image obtained by the area array photoelectric sensor reaches the set error range; when the whole boundary of the obtained illumination distribution image is sharpest, the optical axis position of the imaging device at that moment is the optical axis position of the near-eye display to be measured, the area occupied by the sharply bounded illumination distribution image obtained by the area array photoelectric sensor is the exit pupil of the near-eye display, and the center of the illumination distribution image is the eye point.
Further, in the above technical solution, in step S2 the transmission device is used to control the area array photoelectric sensor and/or the imaging device and/or the near-eye display to be measured so as to adjust their relative poses, and this can be implemented in various ways. For example, the near-eye display is kept still, and the area array photoelectric sensor and the imaging device are controlled through the transmission device so that their poses are adjusted until the center of the entrance pupil of the imaging device is placed at the center point of the image obtained by the area array photoelectric sensor.
Further, in the above technical solution, the image center recognition algorithm includes, but is not limited to, the gray-scale gravity center method or the geometric method. The gray-scale gravity center method calculates the coordinates of the gray-level weighted center from the gray-level distribution of the image. The geometric method identifies the midpoints of the four sides of the illuminated image; the coordinates of the intersection of the lines connecting the midpoints of the two pairs of opposite sides are the coordinates of the image center.
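A minimal sketch of the geometric method as described here is given below, assuming the illuminated region is roughly quadrilateral and can be segmented with a simple intensity threshold; both assumptions are made only for illustration.

```python
import numpy as np

def geometric_image_center(img: np.ndarray, thresh: float):
    """Midpoints of the four sides of the illuminated region; the image
    center is the intersection of the two lines joining opposite midpoints."""
    mask = img.astype(float) >= thresh
    rows, cols = np.nonzero(mask)
    top, bottom = rows.min(), rows.max()
    left, right = cols.min(), cols.max()
    # midpoint of each side of the illuminated region (row, col)
    p_top = np.array([top,    cols[rows == top].mean()])
    p_bot = np.array([bottom, cols[rows == bottom].mean()])
    p_lft = np.array([rows[cols == left].mean(),  left])
    p_rgt = np.array([rows[cols == right].mean(), right])
    # intersect the line p_top-p_bot with the line p_lft-p_rgt
    d1, d2 = p_bot - p_top, p_rgt - p_lft
    A = np.column_stack([d1, -d2])
    t, _ = np.linalg.solve(A, p_lft - p_top)
    return tuple(p_top + t * d1)              # (row, col) of the image center
```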
Further, in step S3 the transmission device controls the imaging device or the near-eye display to be measured so that the two move relative to each other, generally by relative rotation, with the rotation center located at the center of the entrance pupil of the imaging device lens.
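For illustration, rotating a point rigidly attached to the imaging device about the entrance pupil center can be expressed as in the sketch below; the yaw/pitch axis convention is an assumption made only for this example.

```python
import numpy as np

def rotate_about_entrance_pupil(point, pivot, yaw_deg, pitch_deg):
    """Rotate a point attached to the imaging device about the entrance pupil
    center (pivot): yaw about the z axis, pitch about the y axis (degrees)."""
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0, 0.0, 1.0]])
    Ry = np.array([[np.cos(pitch), 0.0, np.sin(pitch)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(pitch), 0.0, np.cos(pitch)]])
    p = np.asarray(point, float) - np.asarray(pivot, float)
    return Rz @ Ry @ p + np.asarray(pivot, float)
```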
In some alternative embodiments, the first detection picture output by the near-eye display to be measured includes, but is not limited to, a full white picture or a white frame on a black background, with or without a center mark. The second detection picture output by the near-eye display to be measured is an image with a center mark, including but not limited to a cross-hair picture centered on the image.
In some alternative embodiments, the first detection picture and the second detection picture may be the same.
The invention also discloses a measuring device for a near-eye display, which comprises the near-eye display to be measured, a sample stage for clamping the near-eye display to be measured, an area array photoelectric sensor, a first transmission device and a program control system; the area array photoelectric sensor is placed facing the output direction of the near-eye display to be measured, the area array photoelectric sensor or the sample stage is connected with the first transmission device, and the first transmission device is controlled by the program control system so that the near-eye display to be measured and the area array photoelectric sensor move relative to each other; the program control system is electrically connected with the first transmission device and the area array photoelectric sensor respectively. The relative motion of the area array photoelectric sensor and the near-eye display to be measured includes translation in the up-down, left-right and front-back directions, rotation about two or more mutually perpendicular rotation axes, or compound motion with multiple degrees of freedom.
In some alternative embodiments, the area array photoelectric sensor includes, but is not limited to, a CMOS or CCD sensor with the corresponding data acquisition and transmission circuitry. It should be noted that this is only an example, and a person skilled in the art may adapt it according to common general knowledge.
In some alternative embodiments, the first transmission means comprises a rotation mechanism and/or a translation mechanism.
In some alternative embodiments, the first transmission is a robotic device having four or more axes of rotation.
Further, in the above technical solution, the device further comprises a second transmission device; the second transmission device is connected with the sample stage and drives the sample stage to move, and comprises a rotation mechanism and a translation mechanism.
The invention also discloses a measuring device for a near-eye display, which comprises a sample stage for clamping the near-eye display to be measured, an area array photoelectric sensor, an imaging device, a first transmission device and a program control system; the area array photoelectric sensor and the imaging device are each arranged facing the output direction of the near-eye display to be measured and are each connected with the first transmission device, and the first transmission device is controlled by the program control system to adjust the poses of the area array photoelectric sensor and the imaging device. The program control system is electrically connected with the first transmission device, the area array photoelectric sensor and the imaging device respectively.
In some alternative embodiments, the area array photoelectric sensor includes, but is not limited to, a CMOS or CCD sensor with the corresponding data acquisition and transmission circuitry. It should be noted that this is only an example, and a person skilled in the art may adapt it according to common general knowledge.
In some alternative embodiments, the first transmission means comprises a rotation mechanism and/or a translation mechanism.
In some alternative embodiments, the first transmission is a robotic device having four or more axes of rotation.
Further, in the above technical solution, the device further comprises a second transmission device; the second transmission device is connected with the sample stage and drives the sample stage to move, and comprises a rotation mechanism and a translation mechanism.
The invention has the following beneficial effects: the sampling device used in the measuring method of the near-eye display provided by the invention is only a common area array photoelectric sensor, no lens with a complex optical design needs to be carried, and the cost of the scheme is greatly reduced. At the same time, the implementation steps of the measurement scheme are simple and practicable, the measurement steps for the optical axis and eye point of the near-eye display are greatly simplified, measurement efficiency is improved, and measurement cost is reduced. In addition, in the measuring method of the near-eye display according to the invention, the measuring device formed by combining a common area array photoelectric sensor with an imaging device greatly improves measurement precision and efficiency while reducing measurement cost.
Detailed Description
The following detailed description of the invention, taken in conjunction with the accompanying drawings, is given by way of illustration only and not by way of limitation. It will be appreciated by those skilled in the art that modifications may be made to the following embodiments without departing from the scope and spirit of the invention. The scope of the invention is defined by the appended claims.
Example one
This embodiment discloses a measuring device for a near-eye display, as shown in fig. 3, comprising the near-eye display (1) to be measured, a sample stage (15) for clamping the near-eye display (1) to be measured, an area array photoelectric sensor (6) and a supporting frame (11) for clamping the area array photoelectric sensor (6). The supporting frame (11) comprises a three-dimensional translation stage (16) and a two-axis rotary stage (12); the near-eye display (1) to be measured is arranged on the sample stage (15), and the three-dimensional translation stage (16) and the two-axis rotary stage (12) respectively control the translation and rotation of the area array photoelectric sensor (6), thereby changing the positional relation between the area array photoelectric sensor (6) and the near-eye display (1) to be measured and realizing illumination distribution measurement at different positions; when the area array photoelectric sensor (6) is located at different positions, the received illumination distribution differs. The illumination distribution images are analysed with the boundary recognition algorithm to obtain the exit pupil parameters of the near-eye display to be measured.
Example two
This embodiment discloses a measuring device for a near-eye display, as shown in fig. 4, comprising the near-eye display (1) to be measured, a sample stage (15) for clamping the near-eye display (1) to be measured, an area array photoelectric sensor (6) and a supporting frame (11) for clamping the area array photoelectric sensor (6). The sample stage comprises a three-dimensional translation stage (16) and the supporting frame (11) comprises a two-axis rotary stage (12); the near-eye display (1) to be measured is arranged on the sample stage (15), the three-dimensional translation stage (16) controls the translation of the near-eye display to be measured, and the two-axis rotary stage (12) controls the rotation of the area array photoelectric sensor (6), thereby changing the positional relation between the area array photoelectric sensor (6) and the near-eye display (1) to be measured and realizing illumination distribution measurement at different positions; the illumination distribution images are analysed with the boundary recognition algorithm to obtain the exit pupil parameters of the near-eye display to be measured.
Example three
This embodiment discloses a measuring device for the optical axis of a near-eye display, as shown in fig. 5, comprising the near-eye display (1) to be measured, a sample stage (15) for clamping the near-eye display (1) to be measured, an area array photoelectric sensor (6), a supporting frame (11-1) for clamping the area array photoelectric sensor (6), an imaging device (13), a supporting frame (11-2) for clamping the imaging device, a three-dimensional translation stage (16) and a two-axis rotary stage (12). The near-eye display (1) to be measured is arranged on the sample stage (15), the three-dimensional translation stage (16) and the two-axis rotary stage (12) control the translation and rotation of the near-eye display (1) to be measured, the area array photoelectric sensor (6) is arranged on the supporting frame (11-1), and the imaging device (13) is arranged on the supporting frame (11-2). The exit pupil parameters and the optical axis position of the near-eye display to be measured are obtained with the area array photoelectric sensor (6) and the imaging device (13).
Example four
This embodiment discloses a measuring device for the optical axis of a near-eye display, as shown in fig. 6, comprising the near-eye display (1) to be measured, a sample stage (15) for clamping the near-eye display (1) to be measured, an area array photoelectric sensor (6), an imaging device (13), a four-axis moving platform (14) and a two-axis rotary stage (12). The near-eye display (1) to be measured is arranged on the sample stage (15), the two-axis rotary stage (12) controls the rotation of the near-eye display (1) to be measured, the area array photoelectric sensor (6) and the imaging device (13) are respectively arranged on the four-axis moving platform (14), and the four-axis moving platform (14) controls the movement of the area array photoelectric sensor (6) and the imaging device (13). The exit pupil parameters and the optical axis position of the near-eye display to be measured are obtained with the area array photoelectric sensor (6) and the imaging device (13).
This embodiment also discloses a measuring method for the near-eye display; as shown in fig. 7, the measuring steps include:
S1, controlling the output picture (2) of the near-eye display (1) to be measured;
S2, acquiring the illumination distribution of the near-eye display to be measured at an initial spatial position with the area array photoelectric sensor (6), performing algorithmic evaluation of the illumination distribution image, and obtaining the corresponding pose adjustment information;
S3, changing the relative pose of the area array photoelectric sensor (6) and the near-eye display (1) to be measured with the three-dimensional translation stage (16) and the two-axis rotary stage (12), acquiring the illumination distribution of the near-eye display to be measured again, and evaluating the illumination distribution image with the image algorithm again;
S4, if the sharpness of the boundary of the illumination distribution image is judged not yet to be the highest, acquiring the corresponding pose adjustment information again and repeating step S3; if the illumination distribution image is judged to meet the threshold requirement set by the algorithm, the plane on which the area array photoelectric sensor (6) lies is the plane of the exit pupil of the near-eye display to be measured, the light spot area on the area array photoelectric sensor (6) is then the exit pupil (4) of the near-eye display to be measured, the center of the exit pupil (4) is the eye point (5), and the axis perpendicular to the exit pupil (4) through its center is the optical axis (3) of the near-eye display (1) to be measured.
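As a small illustration of the decision in step S4, a hypothetical convergence check on the sequence of boundary sharpness scores might look as follows; the threshold and plateau tolerance are illustrative parameters, not values prescribed by the method.

```python
def boundary_converged(sharpness_history, threshold, plateau_tol=1e-3):
    """Return True when the boundary sharpness reaches the set threshold,
    or when further pose adjustment no longer improves it (plateau)."""
    current = sharpness_history[-1]
    if current >= threshold:
        return True
    if len(sharpness_history) >= 2:
        return abs(current - sharpness_history[-2]) < plateau_tol
    return False
```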
This embodiment also discloses another measuring method for the near-eye display; as shown in fig. 8, the measuring steps include:
S1, the near-eye display to be measured outputs a first detection picture; the area array photoelectric sensor (6) is arranged in the human-eye viewing region of the near-eye display (1) to be measured, directly receives the light output from the near-eye display (1) to be measured and obtains an illumination distribution image;
S2, the center position of the illumination distribution image is obtained with the image center recognition algorithm, and the relative pose adjustment information of the imaging device (13) and the near-eye display (1) to be measured is determined; the imaging device (13) is controlled by the four-axis moving platform (14) to adjust the pose of the imaging device (13);
S3, the near-eye display to be measured outputs a second detection picture, the imaging device (13) focuses on the second detection picture, the near-eye display (1) to be measured is controlled by the two-axis rotary stage (12) so that the near-eye display (1) to be measured and the imaging device (13) rotate relative to each other until the center of the receiving target surface of the imaging device coincides with the center of the acquired second detection picture, and the imaging device and its optical axis position are recorded;
S4, the area array photoelectric sensor (6) is controlled by the four-axis moving platform (14) so that the area array photoelectric sensor (6) is perpendicular to the optical axis of the imaging device obtained in step S3; the area array photoelectric sensor (6) is then adjusted back and forth along the direction of this optical axis until the illumination distribution image (5) with the sharpest boundary is obtained;
S5, steps S1 to S4 are repeated until the boundary sharpness of the illumination distribution image obtained in step S4 reaches the set range; the optical axis position recorded at this point is the optical axis position of the near-eye display to be measured, the area occupied by the sharply bounded illumination distribution image obtained by the area array photoelectric sensor is the exit pupil of the near-eye display, and the center of the illumination distribution image is the eye point.