CN111243028A - Electronic equipment and lens association method and device - Google Patents


Info

Publication number
CN111243028A
Authority
CN
China
Prior art keywords: light spot, lens module, lens, spot image, image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811334220.2A
Other languages
Chinese (zh)
Other versions
CN111243028B (en)
Inventor
王春茂
浦世亮
徐鹏
俞海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201811334220.2A (granted as CN111243028B)
Priority to PCT/CN2019/112850 (WO2020093873A1)
Publication of CN111243028A
Application granted
Publication of CN111243028B
Legal status: Active
Anticipated expiration

Abstract

The embodiments of the present application provide an electronic device and a lens association method and apparatus. The device comprises a first lens module, a second lens module, a projection component and a processor. The projection component projects a light spot; the first lens module captures a first light spot image of the light spot; the second lens module captures a second light spot image of the light spot; and the processor calibrates a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image. In this scheme, the projection component projects the light spot, each lens module captures an image of the light spot, and the processor automatically calibrates the correlation result from the captured images; the whole calibration process requires no human participation, which improves the convenience of lens association.

Description

Electronic equipment and lens association method and device
Technical Field
The application relates to the technical field of security protection, in particular to an electronic device and a lens association method and device.
Background
In some scenarios, different lenses need to be associated. Lens association refers to establishing a correspondence between the same objects in the pictures captured by different lenses. For example, a long-focus lens can capture images with higher definition, while a short-focus lens can capture images with a wider viewing angle; by associating the long-focus lens with the short-focus lens, both a higher-definition image and a wider-angle image of the same scene can be obtained.
In one approach, a professional calibrates the internal and external parameters of the lenses to be associated, and the coordinates of the overlapping regions among the lenses are then associated according to these parameters. In this scheme, once the lenses have been associated, their positions, angles and focal lengths are fixed; if any of them needs to be adjusted, the lenses can only be re-associated by a professional, so the scheme is inconvenient to use.
Disclosure of Invention
An object of the embodiments of the present application is to provide an electronic device, a lens association method and an apparatus, so as to improve convenience of lens association.
To achieve the above object, an embodiment of the present application provides an electronic device, including: the device comprises a first lens module, a second lens module, a projection component and a processor; wherein,
the projection component is used for projecting light spots;
the first lens module is used for acquiring a first light spot image aiming at the light spot;
the second lens module is used for acquiring a second light spot image aiming at the light spot;
the processor is configured to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image through calibration.
Optionally, the first lens module includes a long-focus lens, the second lens module includes a short-focus lens, the first light spot image corresponds to a partial region of the second light spot image, and an angle of the first lens module changes along with an angle of the projection component.
Optionally, the apparatus further includes a third lens module, and an angle of the third lens module changes synchronously with an angle of the projection component;
the third lens module is used for acquiring a third light spot image aiming at the light spot, and the third light spot image corresponds to a partial area of the second light spot image;
the processor is specifically configured to obtain, through calibration, a correlation result among the first lens module, the second lens module and the third lens module based on the first light spot image, the second light spot image and the third light spot image.
Optionally, the first lens module includes a first lens and a first imaging device, the first lens is configured to collect optical signals for the light spot, and the first imaging device is configured to convert the optical signals collected by the first lens into electrical signals to obtain a first light spot image;
the second lens module comprises a second lens and a second imaging device, the second lens is used for collecting optical signals aiming at the light spots, and the second imaging device is used for converting the optical signals collected by the second lens into electric signals to obtain a second light spot image.
In order to achieve the above object, an embodiment of the present application further provides a lens association method, including:
acquiring a first light spot image; the first light spot image is an image collected by the first lens module aiming at a light spot projected by the projection component;
acquiring a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot;
and calibrating to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
Optionally, the light spot comprises a positioning area; the calibration obtaining of the correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image includes:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the corner points in the matching point pairs.
Optionally, the first lens module includes a long-focus lens, the second lens module includes a short-focus lens, and the first light spot image corresponds to a partial region of the second light spot image.
Optionally, the light spot further includes an encoded texture region, and the rectangular block included in the positioning region is different in size from the rectangular block included in the encoded texture region.
In order to achieve the above object, an embodiment of the present application further provides a lens related apparatus, including:
the first acquisition module is used for acquiring a first light spot image; the first light spot image is an image collected by the first lens module aiming at a light spot projected by the projection component;
the second acquisition module is used for acquiring a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot;
and the calibration module is used for calibrating to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
Optionally, the light spot comprises a positioning area; the calibration module is specifically configured to:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the corner points in the matching point pairs.
Optionally, the first lens module includes a long-focus lens, the second lens module includes a short-focus lens, and the first light spot image corresponds to a partial region of the second light spot image.
Optionally, the light spot further includes an encoded texture region, and the rectangular block included in the positioning region is different in size from the rectangular block included in the encoded texture region.
In order to achieve the above object, an embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements any of the lens association methods described above.
The electronic device provided by the embodiment of the present application includes a first lens module, a second lens module, a projection component and a processor. The projection component projects a light spot; the first lens module captures a first light spot image of the light spot; the second lens module captures a second light spot image of the light spot; and the processor calibrates a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image. In this scheme, the projection component projects the light spot, each lens module captures an image of the light spot, and the processor automatically calibrates the correlation result from the captured images; the whole calibration process requires no human participation, which improves the convenience of lens association.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a first schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a first spot image provided in the embodiment of the present application;
fig. 3 is a second structural schematic diagram of an electronic device according to an embodiment of the present application;
fig. 4a is a schematic diagram of a light spot provided in the embodiment of the present application;
fig. 4b is a schematic view of another light spot provided in the embodiment of the present application;
fig. 5 is a schematic view of a corner point provided in an embodiment of the present application;
fig. 6a is a schematic view of a second light spot image provided in the embodiment of the present application;
fig. 6b is a schematic view of a third spot image provided in the embodiment of the present application;
fig. 6c is a schematic view of a fourth light spot image provided in the embodiment of the present application;
fig. 7 is a schematic flowchart of a lens association method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a lens association apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to solve the foregoing technical problems, embodiments of the present application provide an electronic device, a lens association method, and a lens association apparatus.
Referring to fig. 1, the electronic device may include a first lens module, a second lens module, a projection part, and a processor; wherein,
a projection component for projecting a light spot;
the first lens module is used for acquiring a first light spot image aiming at the light spot;
the second lens module is used for acquiring a second light spot image aiming at the light spots;
and the processor is used for calibrating and obtaining the correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
The electronic device provided in the embodiment of the present application includes at least two lens modules; two lens modules are taken as an example in the following description. For ease of description, one lens module is referred to as the first lens module and the other as the second lens module; the image captured by the first lens module of the light spot is referred to as the first light spot image, and the image captured by the second lens module of the light spot is referred to as the second light spot image.
For example, the lens module may include a lens and an imaging device. The imaging device may be a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor, among others; this is not specifically limited. The lens collects optical signals, and the imaging device converts the optical signals into electrical signals, thereby obtaining an image.
In this case, the first lens module comprises a first lens and a first imaging device; the first lens collects optical signals of the light spot, and the first imaging device converts the optical signals collected by the first lens into electrical signals to obtain the first light spot image.
The second lens module comprises a second lens and a second imaging device; the second lens collects optical signals of the light spot, and the second imaging device converts the optical signals collected by the second lens into electrical signals to obtain the second light spot image.
Both the first lens module and the second lens module capture images of the light spot. In one embodiment, the first lens module includes a long-focus lens and the second lens module includes a short-focus lens; in other words, the first lens is a long-focus lens and the second lens is a short-focus lens. The short-focus lens has a wider viewing angle and the long-focus lens a narrower one, so the image captured by the second lens module (short focal length) can be used as the main picture and the image captured by the first lens module (long focal length) as the sub-picture. Referring to fig. 2, the first light spot image corresponds to a partial area of the second light spot image; in other words, the second light spot image corresponds to a large scene, and the first light spot image corresponds to a small scene that is part of the large scene.
In one case, as shown in fig. 2, the light spot may be slightly larger than the field of view of the first light spot image; that is, the first light spot image covers only part of the light spot. In this case, the center of the light spot and the center of the first light spot image may approximately coincide. Alternatively, the first light spot image may substantially coincide with the light spot, or the light spot may be slightly smaller than the field of view of the first light spot image.
Referring to fig. 3, the angle (or collection direction) and the focal length of the two lens modules, as well as the angle (or projection direction) of the projection component, are adjustable. Optionally, the focal length of the projection component is also adjustable; this is not limited here. In this embodiment, the angle of the first lens module changes synchronously with the angle of the projection component, so that the small scene corresponding to the long-focus lens substantially coincides with the light spot projected by the projection component; in other words, the long-focus lens remains aimed at the light spot.
In one embodiment, the light spot may include a positioning region and an encoded texture region, where the rectangular blocks in the positioning region differ in size from those in the encoded texture region. For example, referring to fig. 4a, the positioning region is part of the middle area and includes four small rectangles and a cross line, while the encoded texture region is the remaining area, a black-and-white checkerboard. For another example, referring to fig. 4b, the positioning region is the middle row and the middle column, the remaining area is the encoded texture region, and the checkerboard squares of the positioning region are larger than those of the encoded texture region.
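As an illustration of the fig. 4b style of pattern, the sketch below generates a black-and-white checkerboard whose middle row and middle column use larger squares as the positioning region; the cell count and pixel sizes are assumptions chosen for illustration, not values from the patent.

```python
import numpy as np

def make_spot_pattern(cells=9, small=20, large=40):
    """Checkerboard with an enlarged middle row/column acting as the positioning region."""
    sizes = [small] * cells
    sizes[cells // 2] = large                        # positioning row/column uses larger squares
    edges = np.concatenate(([0], np.cumsum(sizes)))  # pixel boundaries of the cells
    side = int(edges[-1])
    img = np.zeros((side, side), dtype=np.uint8)
    for i in range(cells):
        for j in range(cells):
            if (i + j) % 2 == 0:                     # alternate black and white cells
                img[edges[i]:edges[i + 1], edges[j]:edges[j + 1]] = 255
    return img
```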
The processor calibrates the two lens modules based on the first light spot image and the second light spot image. Specifically, a plurality of corner points in the first light spot image may be determined as first corner points, and a plurality of corner points in the second light spot image as second corner points; the first corner points are matched with the second corner points according to the positional relationship between the positioning region and each corner point, to obtain a plurality of matching point pairs; and the correlation result between the first lens module and the second lens module is calibrated according to the coordinate values of the corner points in the matching point pairs.
A corner point can be understood as an intersection of a black square and a white square of the checkerboard, as shown in fig. 5. For example, a Harris corner detection algorithm may be applied to the first light spot image and the second light spot image respectively to obtain the corner points in each image. For ease of description, a corner point in the first light spot image is referred to as a first corner point, and a corner point in the second light spot image as a second corner point.
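For concreteness, a minimal sketch of the Harris-based option is given below; the grayscale file names and the response threshold are assumptions for illustration, not values from the patent.

```python
import cv2
import numpy as np

def detect_corners(path, block_size=2, ksize=3, k=0.04, thresh_ratio=0.01):
    """Return an (N, 2) array of (x, y) corner coordinates found by Harris detection."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE).astype(np.float32)
    response = cv2.cornerHarris(gray, block_size, ksize, k)
    ys, xs = np.where(response > thresh_ratio * response.max())
    return np.stack([xs, ys], axis=1).astype(np.float32)

first_corners = detect_corners("spot_tele.png")   # corners in the first light spot image
second_corners = detect_corners("spot_wide.png")  # corners in the second light spot image
```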
As another example, checkerboard corners have distinctive features; based on these features, a filter template may be constructed, and the corners in the image may be identified by applying the filter template.
The correspondence between the first corner points and the second corner points can be determined based on the positioning region in the light spot images, and matching point pairs are formed according to this correspondence; one matching point pair comprises a first corner point and a second corner point that correspond to the same point of the projected light spot. From the coordinate values of the two corner points in each matching point pair, the internal parameters of the lens modules and the external parameters between the lens modules can be calibrated; from the internal and external parameters, the mapping relationship between pixel coordinates in the first light spot image and pixel coordinates in the second light spot image can be obtained, and this mapping relationship can serve as the correlation result between the lens modules.
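The patent does not spell out the matching step beyond its use of the positioning region; one plausible reading, sketched below under that assumption, is that the positioning region fixes a common origin and orientation so each detected corner can be labelled with integer grid coordinates (row, col), and corners carrying the same label in both images form a matching point pair. The grid-labelling step is assumed to have been performed beforehand, and match_by_grid is an illustrative helper, not a function from any library.

```python
def match_by_grid(first_grid_to_px, second_grid_to_px):
    """Pair up corners that carry the same grid label relative to the positioning region.

    Both arguments map an integer (row, col) grid label to an (x, y) pixel position.
    Returns a list of ((x1, y1), (x2, y2)) matching point pairs.
    """
    pairs = []
    for label, px_first in first_grid_to_px.items():
        px_second = second_grid_to_px.get(label)
        if px_second is not None:  # this spot corner is visible in both images
            pairs.append((px_first, px_second))
    return pairs
```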
For the internal parameters, in one case, distortion correction may first be performed on the light spot images to obtain corrected images, and the internal parameters of each lens module, such as the focal length and the principal point coordinates, may then be solved based on the corrected images. For example, the internal parameters may be calculated by Zhang's calibration method or with OpenCV (Open Source Computer Vision Library). The specific manner of calibrating the internal parameters is not limited here.
The external parameters between the lens modules, which include a rotation relationship and a translation relationship, may then be solved through an optimization function, such as a reprojection error function, or through the solvePnP function.
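A minimal sketch of this step with OpenCV is shown below. It assumes the corners of the projected light spot have known planar coordinates (z = 0) and that several views are available for the intrinsic calibration; cv2.calibrateCamera (a Zhang-style calibration) and cv2.solvePnP are existing OpenCV functions, but the surrounding structure is an illustration rather than the patent's exact procedure.

```python
import cv2
import numpy as np

def calibrate_intrinsics(board_pts, image_pts_views, image_size):
    """Zhang-style intrinsic calibration of one lens module.

    board_pts: (N, 3) float32 planar spot-corner coordinates with z = 0.
    image_pts_views: list of (N, 2) float32 arrays, one per captured view.
    image_size: (width, height) of the captured images in pixels.
    """
    object_pts_views = [board_pts] * len(image_pts_views)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_pts_views, image_pts_views, image_size, None, None)
    return K, dist

def spot_plane_pose(board_pts, image_pts, K, dist):
    """Pose (R, t) of the spot plane in one lens module's coordinate frame."""
    ok, rvec, tvec = cv2.solvePnP(board_pts, image_pts, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec

# The relative extrinsics between the two lens modules follow by composing the
# two poses: R_12 = R2 @ R1.T and t_12 = t2 - R_12 @ t1.
```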
After the correlation result between the lens modules is obtained through calibration, coordinates can be mapped between the images captured by the two lens modules. For example, if the first lens module including the long-focus lens captures a face image and the second lens module including the short-focus lens captures a whole-body image of the person, the face image can be mapped into the whole-body image through the correlation result, so that the face region is located in the whole-body image.
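As a simplified illustration of such coordinate mapping, the sketch below fits a single planar homography to matched corner pairs and uses it as the correlation result; this stands in for the full intrinsic/extrinsic chain described above and is a reasonable approximation only when the matched corners lie on the projected spot plane. The corner pairs and face-box coordinates are placeholder values.

```python
import cv2
import numpy as np

# Matched corner pairs (placeholder values): row i of tele_pts and wide_pts is one pair.
tele_pts = np.float32([[120, 90], [410, 95], [405, 380], [115, 375]])
wide_pts = np.float32([[560, 410], [640, 412], [638, 495], [558, 492]])

# Estimate the pixel-coordinate mapping; with more pairs, RANSAC rejects mismatches.
H, inliers = cv2.findHomography(tele_pts, wide_pts, cv2.RANSAC, 3.0)

# Map a face bounding box detected in the tele (first) image into the wide (second) image.
face_box_tele = np.float32([[150, 120], [350, 120], [350, 360], [150, 360]])
face_box_wide = cv2.perspectiveTransform(face_box_tele.reshape(-1, 1, 2), H).reshape(-1, 2)
print(face_box_wide)  # corners of the corresponding region in the whole-body image
```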
With the electronic device provided by the embodiment of the present application, the projection component projects the light spot, each lens module captures an image of the light spot, and the processor automatically calibrates the correlation result from the captured images; the whole calibration process requires no human participation, which improves the convenience of lens association.
In some related schemes, a fixed calibration board usually needs to be arranged, and the camera captures a calibration image of the calibration board so that the camera can be calibrated from the calibration image. In harsh environments such as deserts or mountains, setting up a fixed calibration board is very inconvenient. With the electronic device provided by the embodiment of the present application, the projection component projects the light spot, no fixed calibration board is required, and the convenience of calibration is improved.
The above description takes two lens modules as an example. The embodiment of the present application does not limit the number of lens modules; the electronic device may include more than two lens modules, and the association scheme is similar.
As an embodiment, the apparatus further includes a third lens module, an angle of which changes synchronously with an angle of the projection component;
the third lens module is used for acquiring a third light spot image aiming at the light spot, and the third light spot image corresponds to a partial area of the second light spot image;
the processor is specifically configured to obtain, through calibration, a correlation result among the first lens module, the second lens module and the third lens module based on the first light spot image, the second light spot image and the third light spot image.
In this embodiment, the third lens module also includes a long-focus lens. In one case, referring to fig. 6a, the focal length of the third lens module is greater than that of the first lens module, and the third light spot image corresponds to a partial area of the first light spot image. In another case, referring to fig. 6b, the focal length of the third lens module is smaller than that of the first lens module, and the first light spot image corresponds to a partial area of the third light spot image. In yet another case, referring to fig. 6c, the first light spot image and the third light spot image may partially overlap.
The image captured by the second lens module including the short-focus lens can be used as the main picture, and the images captured by the first lens module and the third lens module, each including a long-focus lens, can be used as sub-pictures. There may be one or more sub-pictures; this is not specifically limited.
The electronic device provided by the embodiment of the application can also be understood as a lens correlation system.
The embodiment of the present application further provides a lens association method and a lens association apparatus. The method and the apparatus may be applied to the processor in the above electronic device, or to other electronic devices; this is not specifically limited.
Fig. 7 is a schematic flowchart of a lens association method according to an embodiment of the present application, including:
s701: and acquiring a first light spot image. The first light spot image is an image collected by the first lens module aiming at the light spot projected by the projection component.
S702: and acquiring a second light spot image. And the second light spot image is an image acquired by the second lens module aiming at the light spot.
In the embodiment of the invention, the projection component projects the light spot, and different lens modules respectively acquire images aiming at the light spot to obtain a light spot image. For distinguishing descriptions, one of the lens modules is called a first lens module, and the other lens module is called a second lens module; the image collected by the first lens module aiming at the light spot is called a first light spot image, and the image collected by the second lens module aiming at the light spot is called a second light spot image.
For example, the lens module may include a lens and an imaging device. The imaging device may be a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor, among others; this is not specifically limited.
In one embodiment, the first lens module includes a long-focus lens and the second lens module includes a short-focus lens. The short-focus lens has a wider viewing angle and the long-focus lens a narrower one, so the image captured by the second lens module (short focal length) can be used as the main picture and the image captured by the first lens module (long focal length) as the sub-picture. Referring to fig. 2, the first light spot image corresponds to a partial area of the second light spot image; in other words, the second light spot image corresponds to a large scene, and the first light spot image corresponds to a small scene that is part of the large scene.
In one case, as shown in fig. 2, the light spot may be slightly larger than the field of view of the first light spot image; that is, the first light spot image covers only part of the light spot. In this case, the center of the light spot and the center of the first light spot image may approximately coincide. Alternatively, the first light spot image may substantially coincide with the light spot, or the light spot may be slightly smaller than the field of view of the first light spot image.
In this embodiment, the angle of the first lens module changes synchronously with the angle of the projection component, so that the small scene corresponding to the long-focus lens substantially coincides with the light spot projected by the projection component, or the long-focus lens is aligned to the light spot.
In one embodiment, the light spot may include a positioning region and an encoded texture region, where the rectangular blocks in the positioning region differ in size from those in the encoded texture region. For example, referring to fig. 4a, the positioning region is part of the middle area and includes four small rectangles and a cross line, while the encoded texture region is the remaining area, a black-and-white checkerboard. For another example, referring to fig. 4b, the positioning region is the middle row and the middle column, the remaining area is the encoded texture region, and the checkerboard squares of the positioning region are larger than those of the encoded texture region.
S703: calibrating to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
As an embodiment, a plurality of corner points in the first light spot image may be determined as first corner points, and a plurality of corner points in the second light spot image as second corner points; the first corner points are matched with the second corner points according to the positional relationship between the positioning region and each corner point, to obtain a plurality of matching point pairs; and the correlation result between the first lens module and the second lens module is calibrated according to the coordinate values of the corner points in the matching point pairs.
A corner point can be understood as an intersection of a black square and a white square of the checkerboard, as shown in fig. 5. For example, a Harris corner detection algorithm may be applied to the first light spot image and the second light spot image respectively to obtain the corner points in each image. For ease of description, a corner point in the first light spot image is referred to as a first corner point, and a corner point in the second light spot image as a second corner point.
As another example, checkerboard corners have distinctive features; based on these features, a filter template may be constructed, and the corners in the image may be identified by applying the filter template.
The correspondence between the first corner points and the second corner points can be determined based on the positioning region in the light spot images, and matching point pairs are formed according to this correspondence; one matching point pair comprises a first corner point and a second corner point that correspond to the same point of the projected light spot. From the coordinate values of the two corner points in each matching point pair, the internal parameters of the lens modules and the external parameters between the lens modules can be calibrated; from the internal and external parameters, the mapping relationship between pixel coordinates in the first light spot image and pixel coordinates in the second light spot image can be obtained, and this mapping relationship can serve as the correlation result between the lens modules.
Specifically, in one case, distortion correction may first be performed on the light spot images to obtain corrected images, and the internal parameters of each lens module, such as the focal length and the principal point coordinates, may then be solved based on the corrected images. For example, the internal parameters may be calculated by Zhang's calibration method or with OpenCV (Open Source Computer Vision Library). The specific manner of calibrating the internal parameters is not limited here.
The external parameters between the lens modules, which include a rotation relationship and a translation relationship, may then be solved through an optimization function, such as a reprojection error function, or through the solvePnP function.
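As an illustration of the reprojection-error criterion mentioned here, the minimal sketch below computes the mean pixel error for a candidate rotation and translation; the inputs (R, t, intrinsic matrix K, distortion vector) are whatever values an outer optimization is evaluating, and a real implementation would minimise this value over R and t.

```python
import cv2
import numpy as np

def reprojection_error(R, t, K, dist, object_pts, measured_pts):
    """Mean pixel distance between projected spot corners and the corners measured in the image."""
    rvec, _ = cv2.Rodrigues(np.asarray(R, dtype=np.float64))
    projected, _ = cv2.projectPoints(np.asarray(object_pts, dtype=np.float64),
                                     rvec, np.asarray(t, dtype=np.float64),
                                     np.asarray(K, dtype=np.float64),
                                     np.asarray(dist, dtype=np.float64))
    diff = projected.reshape(-1, 2) - np.asarray(measured_pts, dtype=np.float64)
    return float(np.linalg.norm(diff, axis=1).mean())
```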
After the correlation result between the lens modules is obtained through calibration, coordinates can be mapped between the images captured by the two lens modules. For example, if the first lens module including the long-focus lens captures a face image and the second lens module including the short-focus lens captures a whole-body image of the person, the face image can be mapped into the whole-body image through the correlation result, so that the face region is located in the whole-body image.
With the lens association method provided by the embodiment of the present application, the correlation result is automatically calibrated from the images captured by the lens modules; the whole calibration process requires no human participation, which improves the convenience of lens association.
An embodiment of the present application further provides a lens association apparatus, as shown in fig. 8, including:
a first obtaining module 801, configured to obtain a first light spot image; the first light spot image is an image collected by the first lens module aiming at a light spot projected by the projection component;
a second obtaining module 802, configured to obtain a second light spot image; the second light spot image is an image acquired by a second lens module aiming at the light spot;
a calibration module 803, configured to calibrate to obtain a correlation result between the first lens module and the second lens module based on the first light spot image and the second light spot image.
In one embodiment, the light spot includes a positioning area; the calibration module 803 is specifically configured to:
determining a plurality of corner points in the first light spot image as first corner points;
determining a plurality of corner points in the second light spot image as second corner points;
matching the first corner point with the second corner point according to the position relation between the positioning area and each corner point to obtain a plurality of matching point pairs;
and calibrating to obtain a correlation result between the first lens module and the second lens module according to the coordinate values of the corner points in the matching point pairs.
In one embodiment, the first lens module includes a long focal length lens, the second lens module includes a short focal length lens, and the first light spot image corresponds to a partial region of the second light spot image.
In one embodiment, the light spot further includes an encoded texture region, and the positioning region includes a rectangular block having a different size from a rectangular block included in the encoded texture region.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the computer program implements any of the lens association methods described above.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The embodiments in this specification are described in a related manner; the same or similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the others. In particular, the method embodiment, the apparatus embodiment and the computer-readable storage medium embodiment are substantially similar to the device embodiment, so their descriptions are relatively brief; for relevant details, refer to the corresponding parts of the device embodiment description.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (12)

CN201811334220.2A, filed 2018-11-09 (priority 2018-11-09): Electronic equipment and lens association method and device. Active; granted as CN111243028B (en).

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN201811334220.2A (CN111243028B) | 2018-11-09 | 2018-11-09 | Electronic equipment and lens association method and device
PCT/CN2019/112850 (WO2020093873A1) | 2018-11-09 | 2019-10-23 | Electronic device, method and device for lens association

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201811334220.2A (CN111243028B) | 2018-11-09 | 2018-11-09 | Electronic equipment and lens association method and device

Publications (2)

Publication Number | Publication Date
CN111243028A | 2020-06-05
CN111243028B (en) | 2023-09-08

Family

ID=70610817

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201811334220.2A (Active, granted as CN111243028B) | Electronic equipment and lens association method and device | 2018-11-09 | 2018-11-09

Country Status (2)

Country | Link
CN (1) | CN111243028B (en)
WO (1) | WO2020093873A1 (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2006120146A2 (en)* | 2005-05-11 | Sony Ericsson Mobile Communication Ab | Digital cameras with triangulation autofocus systems and related methods
US20100328200A1 (en)* | 2009-06-30 | Chi-Chang Yu | Device and related method for converting display screen into touch panel screen
CN102148965A (en)* | 2011-05-09 | 2011-08-10 | 上海芯启电子科技有限公司 | Multi-target tracking close-up shooting video surveillance system
CN102291569A (en)* | 2011-07-27 | 2011-12-21 | 上海交通大学 | Double-camera automatic coordination multi-target eagle eye observation system and observation method thereof
CN103116889A (en)* | 2013-02-05 | 2013-05-22 | 海信集团有限公司 | Positioning method and electronic device
EP2615580A1 (en)* | 2012-01-13 | 2013-07-17 | Softkinetic Software | Automatic scene calibration
CN104092939A (en)* | 2014-07-07 | 2014-10-08 | 山东神戎电子股份有限公司 | Laser night vision device synchronous zooming method based on piecewise differentiation technology
CN104363986A (en)* | 2014-10-31 | 2015-02-18 | 华为技术有限公司 | Image processing method and device
US20150160539A1 (en)* | 2013-12-09 | Geo Semiconductor Inc. | System and method for automated test-pattern-free projection calibration
CN106934861A (en)* | 2017-02-09 | 2017-07-07 | 深圳先进技术研究院 | Object dimensional method for reconstructing and device
CN107370934A (en)* | 2017-09-19 | 2017-11-21 | 信利光电股份有限公司 | A kind of multi-cam module

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN205563716U (en)* | 2016-03-30 | 2016-09-07 | 广州市盛光微电子有限公司 | Panoramic camera calibration device based on many camera lenses multisensor
CN106846415B (en)* | 2017-01-24 | 2019-09-20 | 长沙全度影像科技有限公司 | A kind of multichannel fisheye camera binocular calibration device and method
CN106791337B (en)* | 2017-02-22 | 2023-05-12 | 北京汉邦高科数字技术股份有限公司 | Zoom camera with double-lens optical multiple expansion and working method thereof


Also Published As

Publication number | Publication date
CN111243028B (en) | 2023-09-08
WO2020093873A1 (en) | 2020-05-14


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
