Disclosure of Invention
The application provides a three-dimensional measurement method and a three-dimensional measurement system, which are used to solve the technical problem of the large amount of calculation involved in the three-dimensional measurement process.
In a first aspect, the application provides a three-dimensional measurement method, which comprises: sequentially projecting a plurality of projection images onto the surface of an object to be detected by visible light, wherein each projection image comprises a plurality of sinusoidal stripes, the sinusoidal stripes of the plurality of projection images have the same frequency, and the phases of the sinusoidal stripes in each projection image are shifted according to a preset rule; acquiring a plurality of reflection images obtained after the object to be detected reflects the light of the plurality of projection images; determining the wrapping phases corresponding to the plurality of reflection images according to the sinusoidal stripes of the plurality of reflection images; determining the phase progression corresponding to the plurality of reflection images according to the preset rule of the sinusoidal stripes in the plurality of reflection images; and determining three-dimensional information of the surface of the object to be detected according to the wrapping phases and the phase progression.
In an embodiment of the first aspect of the present application, the preset rule includes performing phase shift processing on each sinusoidal stripe according to a phase shift value corresponding to the position of that sinusoidal stripe in the projection image.
In a first embodiment of the first aspect of the present application, the light intensity distribution function of the projection image is:
I_n(x, y) = I'(x, y) + I''(x, y)·cos(φ(x, y) + δ_n + z(x, y))
wherein I_n(x, y) is the light intensity of the nth projection image at the (x, y) position, φ(x, y) is the phase of the sinusoidal stripe at the (x, y) position, I'(x, y) is a light intensity offset value, I''(x, y) is the sine amplitude, δ_n = 2πn/(2N) is the time-varying phase shift, 2N is the step number of the time-varying phase shift, and z(x, y) is the shift value of the (x, y) position in the projection image under the preset rule.
In an embodiment of the first aspect of the present application, the method further includes calculating the shift value of the (x, y) position in the projection image under the preset rule according to the formula z(x, y) = SN(n)·σ(x, y);
wherein SN(n) is a sign function, SN(n) = −1 when n < N and SN(n) = 1 when n ≥ N; σ(x, y) is a space-variant phase shift value determined by the phase progression k(x, y), the frequency f of the sinusoidal stripes and the upper limit value R of the shift value under the preset rule.
In an embodiment of the first aspect of the present application, the determining the phase progression corresponding to the plurality of reflection images according to the preset rule of the sinusoidal stripes in the plurality of reflection images includes: determining the space-variant phase shift values of the plurality of projection images according to the light intensities of the plurality of reflection images and the preset rule of the phase changes of the plurality of reflection images; and determining the phase progression corresponding to the plurality of projection images according to the space-variant phase shift values and the frequency of the sinusoidal stripes in the plurality of reflection images.
In an embodiment of the first aspect of the present application, the determining, according to the light intensities of the plurality of reflection images and the preset rule of the phase changes of the plurality of reflection images, the space-variant phase shift values of the plurality of projection images includes: when N is an even number, determining the space-variant phase shift value σ(x, y) by a corresponding formula; and when N is an odd number, determining the space-variant phase shift value σ(x, y) by another corresponding formula.
In an embodiment of the first aspect of the present application, the determining, according to the space-variant phase shift value and the frequency of the sinusoidal stripes, the phase progression corresponding to the plurality of projection images includes: determining the phase progression k(x, y) according to the space-variant phase shift value σ(x, y), the frequency f of the sinusoidal stripes and the upper limit value R.
In an embodiment of the first aspect of the present application, the determining, according to the sinusoidal stripes of the plurality of reflection images, the wrapping phases corresponding to the plurality of reflection images includes: determining the wrapping phase by an arctangent operation on the light intensities of the plurality of reflection images.
In an embodiment of the first aspect of the present application, the determining the three-dimensional information of the surface of the object to be detected according to the wrapping phase and the phase progression includes: determining the real phase φ(x, y) of the surface to be detected according to the formula φ(x, y) = φ_w(x, y) + 2π·k(x, y), wherein φ_w(x, y) is the wrapping phase and k(x, y) is the phase progression; and determining the three-dimensional information of the surface of the object to be detected according to the change of the real phase φ(x, y).
In a second aspect, the application provides a three-dimensional measurement system, which comprises an electronic device, an emitting device and an acquisition device, wherein the emitting device and the acquisition device are respectively connected with the electronic device. The electronic device is configured to control the emitting device to sequentially project a plurality of projection images onto the surface of an object to be detected by visible light, wherein each projection image comprises a plurality of sinusoidal stripes, the sinusoidal stripes of the plurality of projection images have the same frequency, and the phases of the sinusoidal stripes in each projection image are shifted according to a preset rule. The electronic device is further configured to control the acquisition device to acquire a plurality of reflection images obtained after the object to be detected reflects the light of the plurality of projection images. The electronic device is further configured to determine the wrapping phases corresponding to the plurality of reflection images according to the sinusoidal stripes of the plurality of reflection images, determine the phase progression corresponding to the plurality of reflection images according to the preset rule of the sinusoidal stripes in the plurality of reflection images, and determine three-dimensional information of the surface of the object to be detected according to the wrapping phases and the phase progression.
In summary, according to the three-dimensional measurement method and system provided by the application, in addition to the time-varying phase shift, a space-variant phase shift related to the position in the projection image is added to the projection images projected onto the surface of the object to be detected, so that after the electronic device receives the reflection images, the phase progression can be determined according to the phase changes in the reflection images, the real phases corresponding to the plurality of reflection images can be obtained by combining the wrapping phases that can be determined from the reflection images, and the measurement of the three-dimensional information of the object surface is finally completed according to the real phases. That is, in the embodiments of the application, the electronic device only needs to determine a plurality of projection images with the same frequency, and after the emitting device projects the plurality of projection images, the electronic device can measure the three-dimensional information of the object surface according to the plurality of reflection images with the same frequency acquired by the acquisition device.
Therefore, because the embodiments of the application embed the phase progression into the space-variant phase shift of the images, fewer images need to be determined, emitted, acquired and detected, and detection can be completed with as few as four images. This reduces the number of images to be processed by the electronic device, the emitting device and the acquisition device in the whole three-dimensional automatic optical detection process, reduces the amount of calculation of the electronic device in the three-dimensional optical detection process, avoids heavy occupation of the computing resources of the electronic device, improves the detection speed and efficiency owing to the smaller processing amount, lowers the requirement on the computing capability of the electronic device, and reduces the cost required in the three-dimensional automatic optical detection process, so that the whole detection process is more cost-effective. In addition, the embodiments of the application embed the phase progression into the space-variant phase shift of the stripe images rather than into the pixel intensity of the images, so that a higher sine amplitude of the images can be retained and the image quality is ensured.
Detailed Description
The embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without creative effort shall fall within the scope of protection of the application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Before the embodiments of the present application are formally described, the scenario to which the present application is applied and the problems existing in the prior art will be described with reference to the accompanying drawings. Specifically, the application is applied to three-dimensional automatic optical inspection (3D Automatic Optical Inspection, abbreviated as 3D AOI) technology. The 3D AOI technology measures three-dimensional point cloud data of an object surface in a non-contact manner by means of optical projection, and may, for example, be applied to a production line of products such as printed circuit boards (Printed Circuit Board, abbreviated as PCB) to perform high-speed and high-precision quantitative defect detection on the produced products.
For example, fig. 1 is a schematic diagram of an application scenario of the present application. In the scenario shown in fig. 1, a three-dimensional measurement system composed of an electronic device 40, an emitting device 20 and an acquisition device 30 is used to measure three-dimensional information of the surface of an object 10 to be detected, where the emitting device 20 and the acquisition device 30 may be controlled by the same electronic device 40, or may be controlled by separately connected electronic devices. In order to measure the three-dimensional information of the surface of the object 10 to be detected, the electronic device 40 first projects a projection image of sinusoidal phase-shifted stripes onto the surface of the object 10 through the emitting device 20, where the emitting device 20 may be a projector or a similar device. Then, after the light of the projection image is reflected by the surface of the object 10, the electronic device 40 acquires the reflected light from the surface of the object 10 through the acquisition device 30 to obtain a reflection image, where the acquisition device 30 may be a camera or a similar device. Because the uneven positions of the surface of the object 10 bend the light, the sinusoidal stripes in the projection image projected onto the surface of the object 10 by the emitting device 20 undergo a phase shift under the action of the uneven areas of the object 10, so that the sinusoidal stripes in the reflection image actually collected by the acquisition device 30 are phase-shifted, and the reflection image carries a wrapping phase value that contains the deformation information of the surface of the detected object 10. Therefore, the electronic device 40 can obtain the phase change from the reflection image collected by the acquisition device 30 and further determine the three-dimensional information of the object surface.
In some embodiments, the multi-frequency differential phase shift method is widely used for detection in the scenario shown in fig. 1 because of its high accuracy, high resolution, and insensitivity to changes in the reflectivity of the object surface. Specifically, fig. 2 is a schematic diagram of the process performed by the emitting device in the multi-frequency differential phase shift method. Before the emitting device emits the projection images, the electronic device generates a plurality of sinusoidal phase-shift stripe projection images with different frequencies, the plurality of projection images of each frequency are phase-modulated, and the generated projection images are then sequentially emitted to the object surface through the emitting device. For example, in step S11 shown in fig. 2, the electronic device generates three projection images of frequency ① and three projection images of frequency ②, and the three projection images of each frequency are phase-modulated with a certain step, that is, the three projection images of each frequency have the same frequency and different phases. In some embodiments, the projection images may be represented by the following formula 1:
I_n(x, y) = I'(x, y) + I''(x, y)·cos(φ(x, y) + δ_n)   formula 1
wherein I_n(x, y) represents the light intensity of the (x, y) pixel position in the nth projection image, i.e., the image encoding intensity, n = 0, 1, …, N−1, N is the number of phase shift steps (for example, N = 3 in fig. 2), δ_n = 2πn/N is the phase shift of the nth projection image, I'(x, y) is the light intensity offset value, I''(x, y) is the sine amplitude, and φ(x, y) is the phase of the sinusoidal stripe at the (x, y) position.
After the electronic device sequentially projects the plurality of projection images of the plurality of frequencies generated in fig. 2 onto the object surface in the form of visible light through the emitting device, the electronic device collects, through the acquisition device, the reflection images of the light of each projection image on the object surface, the number of reflection images being the same as the number of projection images, and the electronic device then uses the obtained images to calculate back from formula 1 the phase information corresponding to the stripes in the reflection images. For example, fig. 3 is a schematic diagram of the process performed by the acquisition device in the multi-frequency differential phase shift method. In step S21 shown in fig. 3, the electronic device obtains, through the acquisition device, the same number of reflection images of the object surface as the images emitted in fig. 2 by the emitting device; it can be seen that the distribution direction of the stripes, which are vertical in fig. 2, is changed in the reflection images due to the unevenness of the object surface. Subsequently, in S22, the electronic device decodes the reflection images through the arctangent function to obtain a deformation phase map corresponding to each frequency; for example, in the example shown in fig. 3, one deformation phase map may be obtained according to the three reflection images of frequency ①, and another deformation phase map may be obtained according to the three reflection images of frequency ②. Then, in S23, the electronic device performs heterodyne processing on the plurality of deformation phase maps obtained in S22 to obtain an equivalent phase map, which may also be referred to as a wrapping phase map.
Because the arctangent function can only obtain wrapping phases in the range (−π, π] during phase decoding, the wrapping phase map obtained in S23 is equivalent to the real phase after a modulo-2π operation, so the wrapping phase is not unique and phase ambiguity is caused. Therefore, the electronic device also needs to perform phase unwrapping (Phase Unwrapping) on the wrapping phase obtained in S23; through the correspondence between the deformed phase in S23 and the original phase in S11, the real phase in S24 can be recovered after the wrapping phase in S23 is obtained. The correspondence between the wrapping phase of S23 and the real phase of S24 can be expressed by the following formula 2:
φ(x, y) = φ_w(x, y) + 2π·k(x, y)   formula 2
wherein φ(x, y) is the real phase, φ_w(x, y) is the wrapping phase, and k(x, y) is the phase progression. As can be seen from formula 2, performing phase unwrapping is equivalent to determining the phase progression, and after the phase progression is obtained by heterodyning the plurality of deformation phase maps in S23, it can be combined with the wrapping phase to obtain the real phase φ(x, y).
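For ease of understanding, the conventional encoding and decoding described by formulas 1 and 2 can be illustrated with the following minimal numerical sketch; the image width, the two frequencies and the step number are illustrative assumptions chosen for the example, not values prescribed by the application.

```python
import numpy as np

W = 256                     # image width in pixels (illustrative assumption)
N = 3                       # phase shift steps per frequency, as in fig. 2
x = np.arange(W) / W        # normalized horizontal coordinate of one image row

def wrapped_phase(freq):
    """Encode N phase-shifted fringe images (formula 1) for one frequency and
    decode the wrapped phase with the standard arctangent estimator."""
    phi = 2 * np.pi * freq * x                    # fringe phase of the row
    I_p = I_pp = 127.5                            # offset and amplitude
    num = den = 0.0
    for n in range(N):
        delta_n = 2 * np.pi * n / N               # time-varying phase shift
        I_n = I_p + I_pp * np.cos(phi + delta_n)  # formula 1
        num += I_n * np.sin(delta_n)
        den += I_n * np.cos(delta_n)
    return np.arctan2(-num, den)                  # wrapped into (-pi, pi]

# Two frequencies, as in the multi-frequency scheme of fig. 2 and fig. 3.
phi_1 = wrapped_phase(16.0)
phi_2 = wrapped_phase(15.0)
# S23: heterodyning the two wrapped phase maps gives an equivalent phase map
# whose equivalent frequency is 16 - 15 = 1, i.e. free of ambiguity.
equivalent = np.mod(phi_1 - phi_2, 2 * np.pi)
# Formula 2: the real phase is the wrapped phase plus 2*pi times the phase
# progression k(x, y); here k is computed from the known carrier purely to
# verify the relation.
k = np.round((2 * np.pi * 16.0 * x - phi_1) / (2 * np.pi))
real_phase = phi_1 + 2 * np.pi * k                # equals 2*pi*16*x
```

The last lines mirror S23 and formula 2: the heterodyne difference has a much lower equivalent frequency, and adding 2π times the phase progression restores the real phase.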
In summary, in the three-dimensional automatic optical detection method using the multi-frequency phase shift method as shown in fig. 2-3, although the electronic device can determine the phase value of the image reflected by the object from the emitted and received sinusoidal stripe images and further determine the three-dimensional information, in the above process a deformation phase map has to be obtained from a plurality of images for each frequency in S21-S22, and heterodyne processing has to be performed on the deformation phase maps of the plurality of different frequencies in S22-S23, so that the number of images the electronic device needs to process in the whole three-dimensional automatic optical detection process is large. In the encoding stage shown in fig. 2, the electronic device needs to encode more projection images, and in the decoding stage shown in fig. 3, the electronic device needs to decode more reflection images to obtain the wrapping phase and the real phase. This greatly increases the amount of calculation of the electronic device in the three-dimensional automatic optical detection process, occupies precious computing resources of the electronic device, places higher requirements on the computing capability of the electronic device, and also increases the cost required in the three-dimensional automatic optical detection process.
Therefore, in order to overcome the technical problems in the technologies shown in fig. 2-3, the application provides a three-dimensional measurement method and a three-dimensional measurement system in which a space-variant phase shift is added to the plurality of projected projection images, so that after receiving the reflection images, the electronic device can determine the phase progression according to the phase change in the reflection images. As a result, the emitting device only needs to emit a plurality of projection images of one frequency, and the acquisition device only needs to acquire a plurality of reflection images of one frequency; the electronic device can determine the phase progression and the real phase corresponding to the object surface according to the plurality of reflection images of that one frequency and finally complete the measurement of the three-dimensional information of the object surface, thereby reducing the number of images to be processed by the electronic device in the whole three-dimensional automatic optical detection process, improving the detection efficiency and speed, and achieving higher cost-effectiveness.
The technical solution of the application is described in detail below by means of specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 4 is a schematic flowchart of an embodiment of the three-dimensional measurement method provided by the present application. The method shown in fig. 4 can be applied to the scenario shown in fig. 1, in which the electronic device 40, the emitting device 20 and the acquisition device 30 jointly measure the three-dimensional information of the object 10 to be detected. Specifically, the three-dimensional measurement method shown in fig. 4 includes:
S101, the electronic device 40 determines a plurality of projection images to be projected.
Each projection image may specifically be a phase-shifted image of sinusoidal stripes, each projection image comprises a plurality of sinusoidal stripes, and the sinusoidal stripes of the plurality of projection images have the same frequency. When the plurality of projection images are phase-modulated, two components are added at the same time: a time-varying phase shift and a space-variant phase shift. The time-varying phase shift means that the phase of the sinusoidal stripes is shifted between the plurality of projection images by a preset number of phase shift steps and is related to the sequence number of the image; the space-variant phase shift means that the phase of the sinusoidal stripes is shifted inside each projection image according to a preset rule and is related to the coordinates within the image. In some embodiments, when the sinusoidal stripes in a projection image are space-variant phase-shifted, each sinusoidal stripe may be phase-shifted according to a space-variant phase shift value uniquely corresponding to its position in the projection image, so that the phase shift values applied to the sinusoidal stripes at different positions in the projection image are different, which may be referred to as a "space-variant phase shift".
Specifically, in S101 the electronic device may determine the images to be projected according to the following formula 3:
I_n(x, y) = I'(x, y) + I''(x, y)·cos(φ(x, y) + δ_n + z(x, y))   formula 3
wherein formula 3 is the light intensity distribution function of the projection image, I_n(x, y) is the light intensity value of the nth image of the plurality of projection images at the (x, y) pixel position, φ(x, y) is the phase of the sinusoidal stripe at the (x, y) position, I'(x, y) is the light intensity offset value, I''(x, y) is the sine amplitude, I'(x, y) and I''(x, y) may both be set to (2^bit − 1)/2, where bit represents the coding bit number of the projector, δ_n is the time-varying phase shift value, δ_n = 2πn/(2N), n = 0, 1, …, 2N−1, 2N is the step number of the time-varying phase shift, N ≥ 2, and z(x, y) is the shift value applied at the (x, y) position in the projection image under the preset rule, which contains the space-variant phase shift value σ(x, y), see formula 4. As can be seen from formula 3, the sinusoidal stripes within each projection image are phase-shifted, and the phase shift comprises two parts: the time-varying phase shift δ_n and the space-variant phase shift z(x, y).
In some embodiments, since the space-variant phase shift is used to represent the difference between different positions within the projection image, any form that can uniquely determine the pixel coordinates or the phase progression can be used. For example, the shift value z(x, y) under the preset rule within the projection image can be calculated according to the following formula 4:
z(x, y) = SN(n)·σ(x, y)   formula 4
wherein SN(n) is a sign function, SN(n) = −1 when n < N and SN(n) = 1 when n ≥ N; σ(x, y) is the space-variant phase shift value, which is determined by the phase progression k(x, y), the frequency f of the sinusoidal stripes (namely the number of sine cycles in a projection image), and R, the upper limit value of the space-variant shift value under the preset rule.
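A minimal sketch of the encoding of formulas 3 and 4 is given below. Because the specific expression of σ(x, y) is not reproduced here, the sketch assumes the simplest mapping consistent with the definitions above, namely σ(x, y) proportional to the phase progression k(x, y) and bounded by R, and writes the carrier phase as 2πf·x/W; both choices are illustrative assumptions rather than the formulas claimed by the application.

```python
import numpy as np

W = 1024                         # projector width in pixels (assumed)
f = 8                            # number of sine cycles across the image
N = 2                            # 2N = 4 projection images in total
bit = 8                          # coding bit number of the projector
I_p = I_pp = (2 ** bit - 1) / 2  # offset and amplitude, (2^bit - 1) / 2
R = np.pi / 4                    # upper limit of the space-variant shift (assumed)

x = np.arange(W)
phi = 2 * np.pi * f * x / W      # carrier phase of the stripes (assumed form)
k = np.floor(f * x / W)          # phase progression (fringe order), 0 .. f - 1

# Assumed mapping: sigma grows with k and stays below R (illustrative only).
sigma = R * k / f

images = []
for n in range(2 * N):
    delta_n = 2 * np.pi * n / (2 * N)              # time-varying phase shift
    SN = -1.0 if n < N else 1.0                    # sign function of formula 4
    z = SN * sigma                                 # space-variant shift, formula 4
    I_n = I_p + I_pp * np.cos(phi + delta_n + z)   # formula 3
    images.append(np.round(I_n).astype(np.uint8))
```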
In some embodiments, fig. 5 is a schematic diagram of determining the projection images according to the present application. S311 shows the distribution of the space-variant phase shift values at different positions in the projection image; it can be seen that the space-variant phase shift calculated according to formula 4 differs at different positions in one projection image and may change periodically. S312 shows the plurality of projection images after the time-varying phase shift has been applied. Then, in S32, the plurality of projection images in S312 are further phase-shifted according to the space-variant phase shift in S311, so that the plurality of images obtained in S32 have undergone both the time-varying phase shift and the space-variant phase shift.
S102, the electronic device 40 sequentially projects the plurality of projection images determined in S101 onto the surface of the object 10 to be detected through the emitting device 20 in the form of visible light.
The specific manner of projecting the projection images in this embodiment may refer to the prior art and is not limited here. S101-S102 may form part P1 in fig. 4, which is the process in which the electronic device 40 determines and projects the projection images.
S103, the electronic device 40 sequentially collects, through the acquisition device 30, the plurality of reflection images obtained after the object to be detected reflects the light of the plurality of projection images emitted by the emitting device 20 in S102.
S103 may form part P2 in fig. 4, and S102-S103 may be executed alternately. For example, after the electronic device determines the plurality of projection images in S101, it sends a first projection image to the object to be detected through S102 and collects a first reflection image corresponding to the first projection image through S103, then sends a second projection image to the object to be detected through S102 and collects a second reflection image corresponding to the second projection image through S103, and so on, finally obtaining the plurality of reflection images.
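The alternation of S102 and S103 can be expressed as a simple control loop; the names project and capture below are hypothetical placeholders for whatever interfaces the emitting device 20 and the acquisition device 30 actually expose.

```python
def measure(projection_images, project, capture):
    """Alternately run S102 and S103 for every encoded projection image.

    project(img) drives the emitting device 20 and capture() reads one frame
    from the acquisition device 30; both are hypothetical callables supplied
    by the surrounding system.
    """
    reflection_images = []
    for img in projection_images:
        project(img)                          # S102: emit the next projection image
        reflection_images.append(capture())   # S103: acquire its reflection image
    return reflection_images
```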
Subsequently, after the processing flow of the two parts P1 (projection) and P2 (acquisition), the electronic device 40 has acquired a plurality of reflection images, and the three-dimensional information of the object surface can then be measured according to the plurality of reflection images through the detection part P3 in fig. 4. The P3 detection part specifically includes:
In S106, the wrapping phases φ_w(x, y) and the phase progression k(x, y) calculated from the plurality of reflection images are used to unwrap the wrapping phases, and the real phase values of the surface of the object to be detected corresponding to the plurality of reflection images are obtained through the formula φ(x, y) = φ_w(x, y) + 2π·k(x, y). Finally, the three-dimensional information of the surface of the object to be detected, such as the specific positions of the uneven areas of the surface and the heights of the uneven areas, can be determined according to the change of the real phase value φ(x, y); the application does not limit the subsequent processing of the real phase values.
Fig. 6 is a schematic diagram of the detection process according to the present application. As shown in fig. 6, in order to obtain the real phase values (corresponding to S44 in fig. 6), it is also necessary to determine the wrapping phase φ_w(x, y) from the plurality of reflection images (corresponding to S41 in fig. 6) through S104 (corresponding to S43 in fig. 6), and to determine the phase progression k(x, y) through S105 (corresponding to S42 in fig. 6); the real phase values can then be determined in S106. The execution order of S104 and S105 is not limited, and they may be performed simultaneously or sequentially.
In S104, after the electronic device collects the plurality of reflection images, the light intensity of each reflection image can be represented by the intensity function of formula 3, and the electronic device may determine the wrapping phase φ_w(x, y) by applying an arctangent operation to these intensity functions according to formula 5, wherein I_n(x, y) is the intensity function of the nth image and δ_n is its time-varying phase shift. As can be seen from the intensity function in formula 3, when the wrapping phase is determined in S104, the wrapping phase φ_w(x, y), σ(x, y), I'(x, y) and I''(x, y) are four unknowns, so that at least 4 intensity functions, i.e., at least 4 images, are required; after the equations are solved simultaneously, the wrapping phase φ_w(x, y) is obtained.
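The closed-form arctangent estimator of formula 5 is not reproduced here. Purely to illustrate the statement that the four unknowns (the wrapping phase, σ(x, y), I'(x, y) and I''(x, y)) can be recovered once at least four intensity samples per pixel are available, the following sketch fits the assumed model of formulas 3 and 4 numerically by least squares; it is a generic substitute under the same assumptions, not the formula used by the application.

```python
import numpy as np
from scipy.optimize import least_squares

def decode_pixel(samples, N):
    """Recover (wrapping phase, sigma, I', I'') for one pixel from its 2N
    intensity samples, assuming the signal model of formulas 3 and 4."""
    samples = np.asarray(samples, dtype=float)
    n = np.arange(2 * N)
    delta = 2 * np.pi * n / (2 * N)           # time-varying phase shifts
    SN = np.where(n < N, -1.0, 1.0)           # sign function of formula 4

    def residual(p):
        phase, sigma, I_p, I_pp = p
        return I_p + I_pp * np.cos(phase + delta + SN * sigma) - samples

    guess = [0.0, 0.0, samples.mean(), np.ptp(samples) / 2]
    return least_squares(residual, guess).x   # [wrapping phase, sigma, I', I'']
```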
In S105, since a space-variant phase shift is further added when the projection images provided by the embodiments of the present application are encoded by formula 3, the space-variant phase shift carries the information of the phase progression. Therefore, after the electronic device collects the plurality of reflection images, the space-variant phase shift values corresponding to the plurality of images can be determined jointly from the light intensity functions of the images and the preset rule of the phase change, and the phase progression k(x, y) contained in the space-variant phase shift values can then be determined.
In some embodiments, when the electronic device calculates the space-variant phase shift value in S105, the manner used is related to the phase shift step number N. Specifically, when N is an even number, the space-variant phase shift value σ(x, y) may be calculated by formula 6, and when N is an odd number, the space-variant phase shift value σ(x, y) may be calculated by formula 7.
In some embodiments, after the space-variant phase shift value is calculated by formula 6 or formula 7, the phase progression k(x, y) may be calculated according to formula 8, wherein Round() is a rounding function.
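Under the same illustrative assumption used in the encoding sketch above, namely that the preset rule maps the phase progression to σ(x, y) = R·k(x, y)/f, the rounding of formula 8 reduces to the following one-line inversion; the actual mapping is whatever formula 8 prescribes.

```python
import numpy as np

def phase_progression(sigma, f, R):
    """Invert the assumed mapping sigma = R * k / f and round to the nearest
    integer order, mirroring the Round() operation of formula 8."""
    return np.round(np.asarray(sigma) * f / R).astype(int)
```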
In summary, in the three-dimensional measurement method provided by the embodiments of the application, in addition to the time-varying phase shift, a space-variant phase shift related to the position in the projection image is added to the projection images projected onto the surface of the object to be detected, so that after receiving the reflection images, the electronic device can determine the phase progression according to the phase change in the reflection images, obtain the real phases corresponding to the plurality of reflection images by combining the wrapping phases that can be determined from the reflection images, and finally complete the measurement of the three-dimensional information of the object surface according to the real phases. That is, in the embodiments of the application, the electronic device only needs to determine a plurality of projection images with the same frequency, and after the emitting device projects the plurality of projection images, the electronic device can measure the three-dimensional information of the object surface according to the plurality of reflection images with the same frequency acquired by the acquisition device.
Therefore, because the embodiments of the application embed the phase progression into the space-variant phase shift of the images, compared with the prior art shown in fig. 2-3, in which the phase progression can only be obtained by heterodyning a plurality of reflection images corresponding to a plurality of frequencies, fewer images need to be determined, emitted, acquired and detected, and detection can be completed with as few as four images. This reduces the number of images to be processed by the electronic device, the emitting device and the acquisition device in the whole three-dimensional automatic optical detection process, reduces the amount of calculation of the electronic device in the three-dimensional optical detection process, avoids heavy occupation of the computing resources of the electronic device, improves the detection speed and efficiency owing to the smaller processing amount, lowers the requirement on the computing capability of the electronic device, and reduces the cost required in the three-dimensional automatic optical detection process, so that the whole detection process is more cost-effective. In addition, the embodiments of the application embed the phase progression into the space-variant phase shift of the stripe images rather than into the pixel intensity of the images, so that a higher sine amplitude of the images can be retained and the image quality is ensured.
It should be noted that, in the three-dimensional measurement system according to the present embodiment, the division of the devices/apparatuses is merely a division of logical functions; in actual implementation they may be fully or partially integrated into one physical entity or may be physically separated. For example, both the emitting device and the acquisition device may be integrated in the electronic device. These modules can be implemented in the form of software invoked by a processing element, in the form of hardware, or partly in the form of software invoked by a processing element and partly in the form of hardware. The function of the determination module above may be implemented by a separately arranged processing element, or may be integrated into a chip of the above apparatus, or may be stored in a memory of the above apparatus in the form of program code and invoked and executed by a processing element of the above apparatus. The implementation of the other modules is similar. In addition, all or part of the modules may be integrated together or implemented independently. The processing element described herein may be an integrated circuit having signal processing capability. In implementation, each step of the above method or each of the above modules may be implemented by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules/units may be one or more integrated circuits configured to implement the above methods, such as one or more application specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field programmable gate arrays (FPGAs), and so on. For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can invoke the program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
In the above embodiments, all or part of the method steps performed by the electronic device in the three-dimensional measurement system may be implemented by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), etc.
The application also provides an electronic device comprising a processor and a memory, wherein the memory stores a computer program, which when executed by the processor is operable to perform a three-dimensional measurement method as in any of the previous embodiments of the application.
The present application also provides a computer readable storage medium storing a computer program which when executed is operable to perform a three-dimensional measurement method as in any of the foregoing embodiments of the application.
The embodiment of the application also provides a chip for running the instructions, and the chip is used for executing the three-dimensional measurement method executed by the electronic equipment in any of the previous embodiments of the application.
Embodiments of the present application also provide a program product comprising a computer program stored in a storage medium, from which at least one processor can read, which at least one processor, when executing the computer program, can implement a three-dimensional measurement method as performed by an electronic device in any of the previous embodiments of the present application.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of implementing the various method embodiments described above may be implemented by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs the steps comprising the method embodiments described above, and the storage medium described above includes various media capable of storing program code, such as ROM, RAM, magnetic or optical disk.
It should be noted that the above embodiments are merely for illustrating the technical solution of the present application and not for limiting the same, and although the present application has been described in detail with reference to the above embodiments, it should be understood by those skilled in the art that the technical solution described in the above embodiments may be modified or some or all of the technical features may be equivalently replaced, and these modifications or substitutions do not make the essence of the corresponding technical solution deviate from the scope of the technical solution of the embodiments of the present application.