Detailed Description
It should be noted that the embodiments of the present application and the features of the embodiments may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments and the accompanying drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application described herein can be practiced in sequences other than those illustrated or described herein.
Furthermore, the terms "first," "second," and the like used in different embodiments need not refer to the same element in each embodiment; such designations may be reassigned where appropriate to facilitate the description of the embodiments of the present application herein.
Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present application, a three-dimensional scanner is provided.
Fig. 1 is a schematic diagram of a three-dimensional scanner according to an embodiment of the present application. As shown in Fig. 1, the three-dimensional scanner includes an image projection device 10 and an image acquisition device 20.
The image projection device 10 is configured to project, in each preset period, the preset stripe patterns corresponding to that period onto the target object, where each preset stripe pattern is arranged according to preset color-coded stripes, each preset stripe pattern includes stripes of at least one color of the preset color-coded stripes, the plurality of preset stripe patterns together include stripes of at least two colors of the preset color-coded stripes, and the stripes in each preset stripe pattern are arranged consistently with the stripes of the same color in the preset color-coded stripes.
It should be noted that projecting the preset stripe patterns corresponding to each preset period onto the target object may proceed as follows: the image projection device 10 projects preset stripe patterns periodically, projecting a plurality of preset stripe patterns within each preset period in a time-shared manner. For example, the image projection device 10 projects a first preset stripe pattern in a first sub-period and a second preset stripe pattern in a second sub-period, while the image acquisition device 20 captures the first preset stripe pattern in the first sub-period and the second preset stripe pattern in the second sub-period; this process repeats until the scan of the target object is complete.
In an alternative example, as shown in Fig. 2, the image projection device 10 further includes a DLP projection unit 11, wherein the image projection device 10 projects, in each preset period, the plurality of preset stripe patterns corresponding to that period onto the target object through the DLP projection unit 11.
That is, the image projection device 10 may realize its function through the DLP projection unit 11.
Specifically, the DLP projection unit 11 projects, in each preset period, the plurality of preset stripe patterns corresponding to that period onto the target object, where each preset stripe pattern is arranged according to the preset color-coded stripes, each preset stripe pattern includes stripes of at least one color of the preset color-coded stripes, the plurality of preset stripe patterns together include stripes of at least two colors of the preset color-coded stripes, and the stripes in each preset stripe pattern are arranged consistently with the stripes of the same color in the preset color-coded stripes.
In an alternative example, the image projection device 10 further includes: a light emitting portion 12, configured to emit, in each preset period, a plurality of initial lights corresponding to that period, where each initial light is composed of light of at least one stripe color, a stripe color being the color of a stripe in the preset color-coded stripes; and a light transmission portion 13 disposed on the transmission path of the initial lights, wherein each initial light, after being transmitted through the pattern of preset color-coded stripes arranged on the light transmission portion 13, generates the corresponding preset stripe pattern, which is projected onto the target object, and the stripes in the preset stripe pattern are arranged in the same way as the stripes of the same color in the preset color-coded stripes.
It should be noted that the preset color-coded stripes constitute a preset stripe arrangement standard. In the present application, a preset stripe pattern meeting this standard can be projected directly through the DLP projection unit 11, or the light transmission portion 13 can serve as the carrier of the standard; that is, the light transmission portion 13 determines the preset stripe arrangement, and the initial light passes through the light transmission portion 13 to generate a preset stripe pattern arranged according to that standard.
That is, the image projection device 10 may realize its function through the light emitting portion 12 and the light transmission portion 13.
Specifically, the three-dimensional scanner may form the different preset stripe patterns projected onto the target object by transmission projection. The stripes of each generated preset stripe pattern are arranged according to the preset color-coded stripes arranged on the light transmission portion 13; each preset stripe pattern includes stripes of at least one color of the preset color-coded stripes, the plurality of preset stripe patterns together include stripes of at least two colors of the preset color-coded stripes, and the stripes in each preset stripe pattern are arranged consistently with the stripes of the same color in the preset color-coded stripes.
Optionally, the light emitting portion 12 further includes a plurality of light source units 121, each light source unit 121 emitting light in a different wavelength band, wherein the light emitting portion 12 emits the initial light through the plurality of light source units 121. The initial light may be light of a single wavelength band emitted by a single light source unit 121, or light of a plurality of wavelength bands emitted by several light source units 121 simultaneously.
For example: as shown in Fig. 1, the light emitting portion 12 includes three light source units 121, each emitting light in a different wavelength band. For instance, the first light source unit 121 emits light in the 605-700 nm band, i.e., red light; the second light source unit 121 emits light in the 435-480 nm band, i.e., blue light; and the third light source unit 121 emits light in the 500-560 nm band, i.e., green light.
In period A of the preset period, the first light source unit 121 emits light in the 605-700 nm band; in period B, the second light source unit 121 emits light in the 435-480 nm band; in period C, the first light source unit 121 emits light in the 605-700 nm band.
Alternatively, in period A of the preset period, the first light source unit 121 emits light in the 605-700 nm band; in period B, the second light source unit 121 emits light in the 435-480 nm band; in period C, the third light source unit 121 emits light in the 500-560 nm band.
It should be noted that the configurations of the first, second, and third light source units 121 described above are illustrative examples; the wavelength band of the light emitted by each light source unit 121 is not specifically limited and may be chosen arbitrarily beyond these examples.
It should also be noted that the assignment of light source units 121 operating in periods A, B, and C above is likewise an illustrative example; which light source units 121 emit in each preset period is not specifically limited and may be chosen arbitrarily beyond the illustrated arrangement.
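As an illustration only, the band definitions and the two per-period schedules above can be written out as follows; the dictionary keys, band values, and schedule layout are assumptions for exposition, not limitations of the embodiment.

```python
# Hedged sketch of the example light-source schedules. Bands are in nm.

LIGHT_SOURCES = {
    "red":   (605, 700),  # first light source unit 121
    "blue":  (435, 480),  # second light source unit 121
    "green": (500, 560),  # third light source unit 121
}

# The two illustrative schedules: sub-period -> active light source unit(s).
SCHEDULE_1 = {"A": ["red"], "B": ["blue"], "C": ["red"]}
SCHEDULE_2 = {"A": ["red"], "B": ["blue"], "C": ["green"]}

def initial_light(schedule, sub_period):
    """Return the wavelength bands composing the initial light of a sub-period."""
    return [LIGHT_SOURCES[name] for name in schedule[sub_period]]

print(initial_light(SCHEDULE_2, "C"))  # -> [(500, 560)]
```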
Optionally, the light source unit 121 may include at least one of an LED light source and a laser emitter.
That is, the light source unit 121 may realize its function through a laser emitter or through an LED light source. A laser has the advantages of directional emission, extremely high brightness, high spectral purity, and good coherence.
Specifically, the light emitting portion 12 may further include a plurality of LED light sources, each emitting light in a different wavelength band, wherein the light emitting portion 12 emits the initial light through the plurality of LED light sources.
Alternatively, the light emitting portion 12 may further include a plurality of laser emitters, each emitting light in a different wavelength band, wherein the light emitting portion 12 emits the initial light through the plurality of laser emitters.
Optionally, the light emitting portion 12 further includes a light converging unit disposed on the transmission path of the light emitted by the plurality of light source units 121, wherein the light emitted by the plurality of light source units 121 is converged by the light converging unit and then projected to the light transmission portion 13 along the same transmission path.
That is, the initial light is a combination of light rays that, after being converged by the light converging unit, are projected to the light transmission portion 13 along the same transmission path, and the light converging unit may realize its function through half-reflecting, half-transmitting prisms 22c.
For example: as shown in Fig. 1, the light emitting portion 12 includes three light source units 121, each emitting light in a different wavelength band. A first half-reflecting, half-transmitting prism 22c is disposed on the light paths of the first and second light source units 121 and combines the light they emit, projecting it onto a second half-reflecting, half-transmitting prism 22c. The third light source unit 121 is disposed on the side of the second half-reflecting, half-transmitting prism 22c facing away from the combined light, and the light emitted by the third light source unit 121 is combined with that light by the second half-reflecting, half-transmitting prism 22c to produce a light combination projected to the light transmission portion 13 along the same transmission path.
Optionally, the light transmission portion 13 further includes a grating; specifically, the light transmission portion 13 generates the preset stripe patterns through the grating so as to project them onto the target object.
Specifically, different regions are arranged on the grating, and the different regions correspond to different wavelength bands; that is, different regions transmit light of different wavelength bands, and the regions on the grating determine the preset color-coded stripes. This can also be understood as follows: each region on the grating is arranged consistently with a stripe in the preset color-coded stripes, and the wavelength band of a region corresponds to the stripe color of the correspondingly arranged stripe. For example, the grating may include a first region that transmits light of a first wavelength band and a second region that transmits light of a second wavelength band; light of the first wavelength band passing through the grating forms stripes of the first wavelength band arranged consistently with the first region, and light of the second wavelength band passing through the grating forms stripes of the second wavelength band arranged consistently with the second region.
That is, the light emitting portion 12 emits a different initial light in each sub-period of the preset period; when a given initial light is projected onto the grating, light of each of its colors is transmitted through the corresponding regions, forming the corresponding preset stripe pattern.
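A minimal model of this region-to-stripe relation is sketched below; the left-to-right region layout and the color names are illustrative assumptions.

```python
# Hedged model of the grating: each region passes one wavelength band, so the
# stripes formed by a given initial light are exactly the regions whose band
# is present in that light, in the regions' own arrangement.

GRATING_REGIONS = ["red", "blue", "red", "green", "blue", "green"]  # left to right

def stripes_formed(grating, initial_light_bands):
    """Return (position, color) of every stripe a given initial light produces."""
    return [(i, band) for i, band in enumerate(grating)
            if band in initial_light_bands]

# A red-only initial light yields only the red-region stripes, arranged exactly
# like the red stripes of the preset color-coded stripes:
print(stripes_formed(GRATING_REGIONS, {"red"}))            # [(0, 'red'), (2, 'red')]
print(stripes_formed(GRATING_REGIONS, {"green", "blue"}))  # mixed-color pattern
```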
It should be noted that, in the case where the light emitting portion 12 emits the initial light through a plurality of laser emitters, the light emitting portion 12 may further include a phase modulation unit disposed on the transmission path of the initial light, so that the initial light is projected to the light transmission portion 13 after diffraction speckle is eliminated by the phase modulation unit.
Specifically, the phase modulation unit may include: a phase modulation element disposed on the transmission path of the initial light and rotating around a preset axis, wherein the transmission path of the initial light is parallel to the preset axis of the phase modulation element; and a beam coupling element disposed on the transmission path of the initial light and used to collimate the initial light and reduce its divergence angle.
The phase modulation element may take any of the following forms: a transparent optical material sheet, a micro-optical element, or a random phase plate. The phase modulation unit further includes a driving motor, which drives the phase modulation element to rotate around the rotation axis at a certain speed.
The beam coupling element may be composed of a collimating system and a converging lens, or of an optical system with an equivalent function.
The phase modulation element may be located either before or after the beam coupling element.
It should be noted that, in the case where the light emitting portion 12 emits the initial light through the plurality of light source units 121, the light emitting portion 12 may further include a solid medium element disposed on the transmission path of the initial light; the initial light is reflected and mixed multiple times by the solid medium element and then projected to the light transmission portion 13 with a uniform light field intensity.
In particular, the solid medium element may take any of the following forms: a slender hexahedral prism, a cylindrical prism, or a pyramidal prism. The solid medium element may be a hollow rod that reflects light multiple times within the space enclosed by its solid interface, or a solid rod that reflects light multiple times within a solid transparent medium, wherein the input and output end faces of the solid rod are coated with antireflection films and the inner surface of the hollow rod is coated with a reflection-enhancing film. In addition, the exit end face and the entrance end face of the solid medium element are arranged in parallel.
Optionally, the three-dimensional scanner further includes a timing control unit connected to the image projection device 10 and the image acquisition device 20, configured to control the image projection device 10 to project, in each preset period, the preset stripe patterns corresponding to that period, and to control the image acquisition device 20 to acquire the light modulated by the target object in each of the plurality of preset periods, so as to obtain the stripe image corresponding to each preset stripe pattern.
That is, through the timing control unit, the three-dimensional scanner controls the image projection device 10 to project the preset stripe patterns corresponding to each preset period and controls the image acquisition device 20 to acquire the light modulated by the target object in the respective preset periods, thereby obtaining the stripe image corresponding to each preset stripe pattern.
In other words, the timing control unit keeps the image projection device 10 and the image acquisition device 20 operating in synchronization.
Optionally, the three-dimensional scanner further includes a timing control unit connected to the light source units 121 and the image acquisition device 20, configured to control the light source units 121 to emit light in different preset sub-periods, so as to generate, in each preset period, the initial lights corresponding to that period, and to control the image acquisition device 20 to acquire the light modulated by the target object in each of the plurality of preset periods, so as to obtain the stripe image corresponding to each initial light.
That is, through the timing control unit, the three-dimensional scanner controls the plurality of light source units 121 to emit light in different preset sub-periods, generating the preset stripe patterns projected onto the target object for the corresponding periods, and controls the image acquisition device 20 to acquire the light modulated by the target object in those periods, obtaining the stripe image corresponding to each initial light.
In other words, the timing control unit keeps the plurality of light source units 121 and the image acquisition device 20 operating in synchronization.
It should be noted that the two kinds of timing control units are alternative examples of the present application; that is, the three-dimensional scanner includes a first timing control unit or a second timing control unit. The first timing control unit is connected to the image projection device 10 and the image acquisition device 20 and is configured to control the image projection device 10 to project, in each preset period, the preset stripe patterns corresponding to that period, and to control the image acquisition device 20 to acquire the light modulated by the target object in each of the plurality of preset periods, so as to obtain the stripe image corresponding to each preset stripe pattern. The second timing control unit is connected to the light source units 121 and the image acquisition device 20 and is configured to control the light source units 121 to emit light in different preset sub-periods, so as to generate, in each preset period, the initial lights corresponding to that period, and to control the image acquisition device 20 to acquire the light modulated by the target object in each of the plurality of preset periods, so as to obtain the stripe image corresponding to each initial light.
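The second variant, in which the controller switches the light source units directly and the grating turns each initial light into its stripe pattern, might be sketched as follows; the class, method names, and schedule format are assumptions, not the embodiment's actual control logic.

```python
# Hedged sketch of the second timing-control variant: the timing control unit
# switches the light source units per sub-period and triggers the camera so
# that each captured frame is the stripe image of one initial light.

class SourceTimingController:
    def __init__(self, sources, camera, schedule):
        self.sources = sources    # e.g. {"red": red_led, "blue": blue_led, ...}
        self.camera = camera
        self.schedule = schedule  # sub-period -> names of active sources

    def run_period(self):
        """One preset period: emit each scheduled initial light, capture image."""
        images = []
        for sub_period, active in self.schedule.items():
            for name, source in self.sources.items():
                source.set_on(name in active)      # only scheduled sources emit
            images.append(self.camera.capture())   # stripe image for this light
        return images
```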
Optionally, the three-dimensional scanner further includes an illuminator 30, wherein the image acquisition device 20 is further configured to acquire the illumination light reflected by the target object, so as to obtain texture data of the target object while the target object is illuminated by the illuminator 30.
Further, in the case where the three-dimensional scanner includes the illuminator 30, the image acquisition device 20 can identify and distinguish red, blue, and green light, so that the image acquisition device 20 acquires a texture image of the target object while the illuminator 30 projects illumination light onto it, and a three-dimensional model having the same (or substantially the same) color as the target object is generated from the texture image and the three-dimensional data; that is, a true-color scan is performed.
For example: the illuminator 30 may be an LED lamp emitting white light, and if the image projection device 10 includes the DLP projection unit 11, the illuminator 30 may be integrated into the image projection device 10, with the DLP projection unit 11 projecting the illumination light.
Further, in the case where the three-dimensional scanner includes the illuminator 30, the timing control unit is further connected to the illuminator 30 and is configured to control the illuminator 30 to project illumination light onto the target object and to control the image acquisition device 20 to acquire the texture map of the target object while the illuminator 30 projects the illumination light.
Further, in the case where the three-dimensional scanner includes the illuminator 30 and the timing control unit is connected to the illuminator 30, the timing control unit is configured to control the image projection device 10 and the illuminator 30 to alternately project the preset stripe patterns and the illumination light onto the target object, to control the image acquisition device 20 to acquire the preset stripe patterns in synchronization with the image projection device 10, and to control the image acquisition device 20 to acquire the texture images in synchronization with the illuminator 30. Alternatively, the timing control unit is configured to control the plurality of light source units 121 and the illuminator 30 to alternately project the preset stripe patterns and the illumination light onto the target object, to control the image acquisition device 20 to acquire the preset stripe patterns in synchronization with the light source units 121, and to control the image acquisition device 20 to acquire the texture images in synchronization with the illuminator 30.
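One way to picture this alternation is as an interleaved frame schedule; the generator below is purely illustrative, with hypothetical source labels.

```python
# Hedged sketch of the alternating schedule: a white-light texture frame is
# interleaved after each structured-light frame, and the camera is triggered
# in synchronization with whichever source is currently active.

def interleaved_schedule(stripe_patterns):
    """Yield (active_source, payload) pairs for the camera to capture against."""
    for pattern in stripe_patterns:
        yield ("projector", pattern)     # capture a stripe image here
        yield ("illuminator", "white")   # capture a texture image here

for source, payload in interleaved_schedule(["pattern_A", "pattern_B"]):
    print(source, payload)
```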
Optionally, the three-dimensional scanner further includes a mirror 40, and the mirror 40 is configured to change the transmission path of light.
For example: the mirror 40 is disposed on the transmission path of the preset stripe patterns; specifically, a preset stripe pattern is reflected by the mirror 40 onto the target object, modulated by the target object, and then reflected back to the image acquisition device 20. In this case, the installation constraints on the image projection device 10 and the image acquisition device 20 are relaxed, and the space they require can be reduced.
For example: the mirror 40 is disposed on the transmission path of the light emitted by the light source units 121; specifically, the mirror 40 changes the transmission path of that light, relaxing the installation constraints on the light source units 121 and reducing the space they require.
Optionally, in the case where the three-dimensional scanner includes both the illuminator 30 and the mirror 40, with the mirror 40 disposed on the transmission path of the preset stripe patterns, the illuminator 30 may be disposed on the outer circumference of the mirror 40, as shown in Fig. 3. Alternatively, the illuminator 30 may be disposed at another position of the scanner and configured to cooperate with the mirror 40 so that the illumination light is reflected onto the target object by the mirror 40; for example, the illuminator 30 may be disposed on the side of the first imaging lens 14 close to the light source units 121, so that both the illumination light and the light projected by the light source units 121 pass through the first imaging lens 14 and are reflected onto the target object by the mirror 40.
For example: the three-dimensional scanner may include a holding portion and an entrance portion provided at the front end of the holding portion, with the image projection device 10 and the image acquisition device 20 both mounted to the holding portion, the mirror 40 mounted to the entrance portion, and the illuminator 30 mounted to either the entrance portion or the holding portion.
The image acquisition device 20 is configured to acquire the light modulated by the target object while the target object is illuminated with a preset stripe pattern, so as to obtain a plurality of stripe images, wherein the acquired stripe images serve as coding maps for determining each stripe sequence and as reconstruction maps for three-dimensional reconstruction of the target object, generating the three-dimensional data of the target object.
That is, when a preset stripe pattern is projected onto the target object, the pattern is mapped onto the object and deformed (i.e., modulated) by the object's shape; the image acquisition device 20 acquires the deformed preset stripe pattern and thereby obtains a stripe image, which is used to determine each stripe sequence and to reconstruct the target object in three dimensions.
In an optional example, the image acquisition device 20 further includes a plurality of cameras 21, including at least one black-and-white camera 21, wherein the image acquisition device 20 acquires the light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images. The stripe image obtained by at least one black-and-white camera 21 serves as a reconstruction map for three-dimensional reconstruction of the target object, and the stripe images obtained by at least a plurality of black-and-white cameras 21 serve as coding maps for determining each stripe sequence, and/or the stripe image obtained by at least one color camera 21 serves as a coding map for determining each stripe sequence.
That is, the image acquisition device 20 acquires the light modulated by the target object through the plurality of cameras 21 to obtain a plurality of stripe images, and the plurality of cameras 21 include at least one black-and-white camera 21, whose stripe image serves as a reconstruction map for three-dimensional reconstruction of the target object.
It should be noted that the imaging resolution of a black-and-white camera 21 is higher than that of a color camera 21; therefore, including at least one black-and-white camera 21 among the plurality of cameras 21 and using its stripe image for three-dimensional reconstruction improves the accuracy of the three-dimensional reconstruction of the target object.
Specifically, using the stripe image obtained by at least one black-and-white camera 21 as a reconstruction map for three-dimensional reconstruction of the target object includes: using the stripe image obtained by one black-and-white camera 21 as the reconstruction map; using the stripe images obtained by a plurality of black-and-white cameras 21 as reconstruction maps; using the stripe images obtained by one black-and-white camera 21 and at least one color camera 21 as reconstruction maps; or using the stripe images obtained by a plurality of black-and-white cameras 21 and at least one color camera 21 as reconstruction maps.
Specifically, using the stripe images obtained by at least a plurality of black-and-white cameras 21 as coding maps, and/or using the stripe image obtained by at least one color camera 21 as a coding map, to determine each stripe sequence includes: using the stripe images obtained by a plurality of black-and-white cameras 21 as coding maps; using the stripe image obtained by at least one color camera 21 as a coding map; or using the stripe images obtained by at least one color camera 21 and at least one black-and-white camera 21 as coding maps.
That is, the stripe information contained in the at least one stripe image serving as the coding map must be sufficient to determine the coding sequence of each stripe; in other words, the coding map is composed of stripe images from which the coding sequence of each stripe can be determined.
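To illustrate what determining each stripe sequence can mean in practice, the sketch below greedily matches the color order observed in a coding map against the known preset color code; this particular matching rule is an assumption for exposition, not the algorithm claimed by the application.

```python
# Hedged sketch: index observed stripes against the known preset color code.
# Assumes the observed stripes appear in order as a subsequence of the code.

PRESET_CODE = ["red", "green", "blue", "green", "red", "blue"]  # known order

def index_stripes(observed_colors):
    """Map each observed stripe color to its position in the preset code."""
    indices, cursor = [], 0
    for color in observed_colors:
        while PRESET_CODE[cursor] != color:  # advance to next matching stripe
            cursor += 1
        indices.append(cursor)
        cursor += 1
    return indices

print(index_stripes(["red", "blue", "green"]))  # -> [0, 2, 3]
```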
Optionally, the camera 21 may be a CCD camera or a CMOS camera. The camera type is not specifically limited in the present application, and those skilled in the art may substitute cameras according to technical requirements.
It should be noted that a CCD camera has the characteristics of small volume, light weight, immunity to magnetic fields, and resistance to vibration and impact. Therefore, when the three-dimensional scanner uses CCD cameras to acquire the stripe images, the volume of the scanner can be correspondingly reduced, making it convenient to hold in the hand and suitable for scanning environments with little space (such as the oral cavity).
For example: a pre-designed preset stripe image A is projected onto the target object in period a of the preset period, and a pre-designed preset stripe image B is projected onto the target object in period b, while the image acquisition device 20 is controlled to rapidly acquire images of the target object carrying the preset stripe images. The cameras 21 included in the image acquisition device 20 acquire different stripe images respectively; for example, camera 211 is a color camera 21 that acquires a color stripe image while the target object carries preset stripe pattern A, and camera 212 is a black-and-white camera 21 that acquires a black-and-white stripe image while the target object carries preset stripe pattern B.
The color stripe image and the black-and-white stripe image are then transmitted to a computer, which uses the color stripe image as coding information and the black-and-white stripe image as a reconstruction map to obtain the three-dimensional topography of the target object.
In an optional example, the image acquisition device 20 further includes a light beam processing device 22, which includes a light inlet portion and at least two light outlet portions, each camera 21 being disposed to correspond to a different light outlet portion, wherein the image acquisition device 20 acquires the light modulated by the target object through the light beam processing device 22.
The image acquisition device 20 further includes a second imaging lens 23 disposed to correspond to the light inlet portion of the light beam processing device 22, wherein the light collected by the image acquisition device 20 enters the light inlet portion through the second imaging lens 23 and reaches the different light outlet portions of the light beam processing device 22.
That is, the image acquisition device 20 is provided with the light beam processing device 22 so that the plurality of cameras 21 can each image from the coaxial light entering through the same second imaging lens 23; in other words, the stripe patterns acquired by the plurality of cameras 21 share a uniform field of view and viewing angle. Specifically, the second imaging lens 23 is disposed at the light inlet portion of the light beam processing device 22; the light outlet portions of the light beam processing device 22 are disposed in one-to-one correspondence with the cameras 21; and the light beam processing device 22 adjusts the direction and/or separates the wavelength bands of the light incident into it, so that each camera 21 can image from light of the same incident direction and from light of a designated wavelength band.
For example: as shown in Fig. 4, light from the target object enters through the light inlet portion of the light beam processing device 22; the light beam processing device 22 splits the image light of the target object so that it exits from at least two light outlet portions and is projected to the plurality of cameras 21. In this case, the stripe images collected by the plurality of cameras 21 are all acquired from the same viewing angle.
Optionally, the light beam processing device 22 further includes at least one first light beam splitting unit, configured to split the light projected from the light inlet portion so that the light exits from at least two light outlet portions to the cameras 21 disposed at those light outlet portions.
That is, the light beam processing device 22 separates the received light into light projected in a plurality of directions through the first light beam splitting unit. For example: after processing by the first light beam splitting unit, one beam containing red and blue light forms two beams, each containing red and blue light, emitted in different directions.
Optionally, the light beam processing device 22 further includes at least one second light beam splitting unit, configured to separate the light to be acquired by a designated camera 21 so that the designated camera 21 acquires light of a designated wavelength band, where the designated wavelength band includes at least one wavelength band contained in at least one of the initial lights.
That is, the light beam processing device 22 can separate the received light into light of partial wavelength bands through the second light beam splitting unit. For example: a beam containing red and blue light is processed by the second light beam splitting unit to form a beam containing only blue light.
It should be noted that the first light beam splitting unit and the second light beam splitting unit in the present application may be integrated into one physical unit, or may each exist as a separate physical unit.
For example: the first light beam splitting unit may be a half-reflecting, half-transmitting prism 22c; the second light beam splitting unit may be a filter 22d; the first and second light beam splitting units may be integrated in a right-angle two-channel dichroic prism 22a; or the first and second light beam splitting units may be integrated in a three-channel dichroic prism 22b.
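The two splitting roles can be pictured as simple operations on sets of wavelength bands, as in the sketch below; the set-based model and the band names are illustrative assumptions.

```python
# Hedged model of the two splitting roles. A first (direction) splitter sends
# the same bands toward two exits; a second (band) splitter passes only the
# bands a designated camera should receive.

def first_split(beam):
    """Half-reflecting, half-transmitting prism: same bands, two directions."""
    return set(beam), set(beam)

def second_split(beam, passband):
    """Filter / dichroic channel: keep only the designated wavelength bands."""
    return set(beam) & set(passband)

exit_a, exit_b = first_split({"red", "blue"})   # two red-and-blue beams
print(second_split(exit_b, {"blue"}))           # -> {'blue'}, as in the example
```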
For example: in period a of the preset period, the image projection device 10 projects a pre-designed preset stripe image A, formed by combining blue stripes and green stripes, onto the target object. When camera 211 in the image acquisition device 20 acquires the light modulated by the target object, the second light beam splitting unit corresponding to camera 211 separates that light so that camera 211 obtains the green light and the blue light. Preferably, camera 211 can acquire only green light and blue light.
Preferably, the plurality of cameras 21 included in the image acquisition device 20 correspond one-to-one to the plurality of preset stripe patterns; that is, each camera 21 can identify the light colors matching the stripe colors contained in its corresponding preset stripe pattern.
Optionally, the number of stripe colors in the reconstruction map is smaller than the number of stripe colors in the preset color-coded stripes, so that the spacing between adjacent stripes is not too small, avoiding the problem that stripes cannot be matched accurately during stripe matching when the spacing is too small. Preferably, the reconstruction map consists of stripes of only one color. Preferably, the reconstruction map is acquired by a black-and-white camera 21. Preferably, the reconstruction map is a black-and-white stripe image generated by blue light only, since blue light has higher interference resistance and stability than light of other colors.
It should be noted that the three-dimensional scanner may further include a heat dissipation system, a heating anti-fog system, a software algorithm system, and the like. The heat dissipation system prevents overheating inside the three-dimensional scanner from damaging the scanner; the heating anti-fog system prevents the optical instruments in the three-dimensional scanner from fogging, thereby avoiding the situation in which accurate stripe images cannot be acquired because of fogging; and the software algorithm system is configured to reconstruct the target object in three dimensions from the at least one stripe image acquired by the image acquisition device 20.
In summary, the three-dimensional scanner provided in the embodiments of the present application, based on a spatially coded stripe extraction algorithm, eliminates the need for dynamic projection and achieves three-dimensional reconstruction of the target object from only a small number of two-dimensional images, thereby solving the technical problem in the related art that three-dimensional reconstruction methods require costly hardware, which hinders the popularization of three-dimensional scanners.
In addition, by using color as the spatial coding information, the three-dimensional scanner also improves the accuracy of three-dimensional identification.
In order to make the technical solutions of the present application more clearly understood by those skilled in the art, specific embodiments are described below.
Embodiment one:
Taking Fig. 1 as an example, the light beam processing device 22 includes a right-angle two-channel dichroic prism 22a having a third light outlet portion and a fourth light outlet portion, wherein the light beam processing device 22 splits the light projected from the light inlet portion through the right-angle two-channel dichroic prism 22a, so that the light is projected from the third and fourth light outlet portions to the cameras 21 disposed at the respective light outlet portions.
Correspondingly, the image acquisition device 20 includes a third camera 21 disposed to correspond to the third light outlet portion and a fourth camera 21 disposed to correspond to the fourth light outlet portion; the third camera 21 generates a third stripe image from the collected light, the fourth camera 21 generates a fourth stripe image from the collected light, and the third and fourth stripe images each contain stripes of at least two colors, the stripes of the at least two colors being distinguishable.
It should be noted that requiring the third and fourth stripe images to contain stripes of at least two colors serves to distinguish the stripes by color, not to limit the colors themselves.
In addition, the light beam processing device 22 separates, through the right-angle two-channel dichroic prism 22a, the light obtained by each designated camera 21 so that the designated camera 21 obtains light of a designated wavelength band; specifically, the third camera 21 acquires light of a third designated wavelength band, and the fourth camera 21 acquires light of a fourth designated wavelength band.
The following example illustrates this:
Preferably, the third camera 21 is a black-and-white camera 21, and the fourth camera 21 is a color camera 21.
The light emitting portion 12 emits red light to the light transmission portion 13 in a first time period, and the red light is transmitted through the pattern of preset color-coded stripes arranged on the light transmission portion 13 to generate a first preset stripe pattern. The first preset stripe pattern is projected onto the target object in the form of red coding stripes, and the light, after being modulated by the target object, travels to the light beam processing device 22. In this embodiment, the right-angle two-channel dichroic prism 22a separates red from green and blue, so that red light exits from the third light outlet portion while green and blue light exit from the fourth light outlet portion. The red coding stripes therefore exit from the third light outlet portion through the right-angle two-channel dichroic prism 22a and are collected by the black-and-white camera 21, which generates a third stripe image containing the red stripes.
The light emitting portion 12 emits green light and blue light to the light transmission portion 13 in a second time period, and the green and blue light are transmitted through the pattern of preset color-coded stripes arranged on the light transmission portion 13 to generate a second preset stripe pattern. The second preset stripe pattern is projected onto the target object in the form of green-blue coding stripes, and the light, after being modulated by the target object, travels to the light beam processing device 22. The green-blue coding stripes exit from the fourth light outlet portion through the right-angle two-channel dichroic prism 22a and are collected by the color camera 21, which generates a fourth stripe image containing the green stripes and the blue stripes.
The illuminator 30 projects illumination light onto the target object in an eighth time period; the illumination light, after being reflected by the target object, travels to the light beam processing device 22. The blue and green components of the illumination light are collected by the color camera 21 to generate a fourth texture map, the red component is collected by the black-and-white camera 21 to generate a third texture map, and the third and fourth texture maps are synthesized into the texture map of the target object. It can be seen that, to obtain the texture map of the target object, the red, green, and blue light must all be collected and identified, either entirely by a color camera 21, or jointly by a color camera 21 and a black-and-white camera 21, with part of the colored light collected and identified by the color camera 21 and part by the black-and-white camera 21.
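The synthesis of the two texture maps can be pictured as a per-channel merge, sketched below under the assumption of registered, equally sized numpy images; the function and variable names are illustrative only.

```python
# Hedged sketch of the texture synthesis: the red channel comes from the
# black-and-white camera's texture map, green and blue from the color
# camera's map; registration between the two cameras is taken as given.
import numpy as np

def merge_texture(third_map_red, fourth_map_rgb):
    """Combine the per-camera texture maps into one RGB texture image."""
    h, w = third_map_red.shape
    texture = np.zeros((h, w, 3), dtype=np.uint8)
    texture[..., 0] = third_map_red           # R from the black-and-white camera
    texture[..., 1] = fourth_map_rgb[..., 1]  # G from the color camera
    texture[..., 2] = fourth_map_rgb[..., 2]  # B from the color camera
    return texture
```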
Further, since the third and fourth stripe images both correspond to the same light transmission portion 13, the stripes in the third and fourth stripe images correspond to each other; specifically, when the third and fourth stripe images are merged in the same coordinate system, their stripes together correspond to the preset color-coded stripes on the light transmission portion 13.
Specifically, the third stripe image serves as the reconstruction map and the fourth stripe image as the coding map. The fourth stripe image is collected by the color camera 21, and both its green stripes and its blue stripes can be identified, so the coding sequence of each stripe in the fourth stripe image can be determined; based on the stripe correspondence between the third and fourth stripe images, every stripe of the third stripe image can then be identified and matched through the coding sequence of the fourth stripe image, achieving three-dimensional reconstruction.
Preferably, the black-and-white camera 21 obtains only monochromatic light, so that the stripes of the third stripe image can be identified and determined, and the third stripe image can be combined with the fourth stripe image to determine the coding sequence of each stripe; that is, the third and fourth stripe images together serve as the coding map.
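This correspondence step can be illustrated by merging the stripes of the two images by position in a shared coordinate system, as sketched below; the scanline coordinates and colors are invented for the example.

```python
# Hedged sketch of embodiment one's correspondence step: stripes from the red
# (reconstruction) image and the green/blue (coding) image, merged by position,
# reproduce the full preset color code, which indexes every red stripe.

def merge_by_position(red_stripes, gb_stripes):
    """Each input is a list of (x_position, color); merge and sort by position."""
    return sorted(red_stripes + gb_stripes)

red_stripes = [(10, "red"), (70, "red")]
gb_stripes = [(30, "green"), (50, "blue")]
print(merge_by_position(red_stripes, gb_stripes))
# [(10, 'red'), (30, 'green'), (50, 'blue'), (70, 'red')] -- the preset code
# order, so each red stripe inherits its index for matching and triangulation.
```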
In addition, the filter 22d may or may not be provided in this embodiment; when provided, the filter 22d cooperates with the right-angle two-channel dichroic prism 22a.
It is worth emphasizing that, in this embodiment, the light beam processing device 22 splits the light projected from the light inlet portion through the right-angle two-channel dichroic prism 22a, so that the light is projected from the third and fourth light outlet portions to the cameras 21 disposed at the respective light outlet portions; that is, the light beam processing device 22 realizes the function of the first light beam splitting unit through the right-angle two-channel dichroic prism 22a.
Likewise, in this embodiment, the light beam processing device 22 also separates, through the right-angle two-channel dichroic prism 22a, the light obtained by each designated camera 21 so that the designated camera 21 obtains light of the designated wavelength band; that is, the light beam processing device 22 realizes the function of the second light beam splitting unit through the right-angle two-channel dichroic prism 22a.
For example, the right-angle two-channel dichroic prism 22a may allow red light to exit from the third light outlet portion and green and blue light to exit from the fourth light outlet portion: when a beam containing red, green, and blue light passes through the right-angle two-channel dichroic prism 22a, the red light is separated from the green and blue light, the red light exits through the third light outlet portion, and the green and blue light exit through the fourth light outlet portion.
Embodiment two:
Taking Fig. 5 as an example, the light beam processing device 22 includes a three-channel dichroic prism 22b having a fifth, a sixth, and a seventh light outlet portion, wherein the light beam processing device 22 splits the light projected from the light inlet portion through the three-channel dichroic prism 22b, so that the light is projected from the fifth, sixth, and seventh light outlet portions to the cameras 21 disposed at the respective light outlet portions.
Correspondingly, the image acquisition device 20 includes a fifth camera 21 disposed to correspond to the fifth light outlet portion, a sixth camera 21 disposed to correspond to the sixth light outlet portion, and a seventh camera 21 disposed to correspond to the seventh light outlet portion; the fifth camera 21 generates a fifth stripe image from the collected light, the sixth camera 21 a sixth stripe image, and the seventh camera 21 a seventh stripe image, and the fifth, sixth, and seventh stripe images each contain stripes of at least two colors, the stripes of the at least two colors being distinguishable.
It should be noted that requiring the fifth, sixth, and seventh stripe images to contain stripes of at least two colors serves to distinguish the stripes by color, not to limit the colors themselves.
Here, the light beam processing device 22 separates, through the three-channel dichroic prism 22b, the light obtained by each designated camera 21 so that the designated camera 21 obtains light of a designated wavelength band, which at least includes: the fifth camera 21 acquiring light of a fifth designated wavelength band and the sixth camera 21 acquiring light of a sixth designated wavelength band, the fifth designated wavelength band being different from the sixth.
Preferably, at least one of the fifth, sixth, and seventh cameras 21 is a black-and-white camera 21. Specifically, the fifth camera 21 is a black-and-white camera 21 while the sixth and seventh cameras 21 are color cameras 21; or the fifth and sixth cameras 21 are black-and-white cameras 21 while the seventh camera 21 is a color camera 21; or the fifth, sixth, and seventh cameras 21 are all black-and-white cameras 21.
The following example illustrates this:
Preferably, the fifth camera 21, the sixth camera 21, and the seventh camera 21 are all black-and-white cameras 21.
The light emitting portion 12 emits red light to the light transmission portion 13 in a third time period, and the red light is transmitted through the preset color-coded stripes arranged on the light transmission portion 13 to generate a third preset stripe pattern. The third preset stripe pattern is projected onto the target object in the form of red coding stripes, and the light, after being modulated by the target object, travels to the light beam processing device 22. In this embodiment, the light beam processing device includes a three-channel dichroic prism 22b that separates red, green, and blue, so that red light exits from the fifth light outlet portion, blue light from the sixth light outlet portion, and green light from the seventh light outlet portion. The red coding stripes are therefore separated by the three-channel dichroic prism 22b and collected through the fifth light outlet portion by the fifth camera 21, which generates a fifth stripe image containing the red stripes.
The light emitting portion 12 emits blue light to the light transmission portion 13 in a fourth time period, and the blue light is transmitted through the preset color-coded stripes arranged on the light transmission portion 13 to generate a fourth preset stripe pattern. The fourth preset stripe pattern is projected onto the target object in the form of blue coding stripes, and the light, after being modulated by the target object, travels to the light beam processing device 22. The blue coding stripes are separated by the three-channel dichroic prism 22b and collected through the sixth light outlet portion by the sixth camera 21, which generates a sixth stripe image containing the blue stripes.
The light emitting portion 12 emits green light to the light transmission portion 13 in a fifth time period, and the green light is transmitted through the preset color-coded stripes arranged on the light transmission portion 13 to generate a fifth preset stripe pattern. The fifth preset stripe pattern is projected onto the target object in the form of green coding stripes, and the light, after being modulated by the target object, travels to the light beam processing device 22. The green coding stripes are separated by the three-channel dichroic prism 22b and collected through the seventh light outlet portion by the seventh camera 21, which generates a seventh stripe image containing the green stripes.
The illuminator 30 projects illumination light onto the target object in a ninth time period; the illumination light, after being reflected by the target object, travels to the light beam processing device 22. The red component of the illumination light is collected by the fifth camera 21 to generate a fifth texture map, the blue component by the sixth camera 21 to generate a sixth texture map, and the green component by the seventh camera 21 to generate a seventh texture map, and the fifth, sixth, and seventh texture maps are synthesized into the texture map of the target object. It can be seen that, to obtain the texture map of the target object, the red, green, and blue light must all be collected and identified: entirely by a color camera 21; jointly by a color camera 21 and a black-and-white camera 21, with part of the colored light collected and identified by each; or entirely by black-and-white cameras 21, with each colored light collected and identified by a separate black-and-white camera 21.
Further, since the fifth, sixth, and seventh stripe images all correspond to the same light transmission portion 13, the stripes in the fifth, sixth, and seventh stripe images correspond to one another; specifically, when the fifth, sixth, and seventh stripe images are merged, their stripes correspond to the preset color-coded stripes on the light transmission portion 13.
Specifically, any stripe image combination determined from the fifth, sixth, and seventh stripe images may serve as the reconstruction map, and any such combination may serve as the coding map. Preferably, the fifth, sixth, and seventh stripe images together serve as the coding map to determine the coding sequence of each stripe, and together serve as the reconstruction map to achieve the three-dimensional reconstruction.
In addition, the filter 22d may or may not be provided in this embodiment; when provided, the filter 22d is used in cooperation with the three-channel dichroic prism 22b.
It is worth emphasizing that: in this embodiment, the light beam processing device 22 implements, through the three-channel dichroic prism 22b, light splitting processing on the light projected from the light inlet portion, so that the light is projected from the fifth, sixth and seventh light outlet portions to the cameras 21 respectively disposed corresponding to those light outlet portions; that is, the light beam processing device 22 realizes the function corresponding to the first light beam splitting unit through the three-channel dichroic prism 22b.
Similarly, in this embodiment, the light beam processing device 22 also performs separation processing, through the three-channel dichroic prism 22b, on the light to be acquired by the designated camera 21, so that the designated camera 21 acquires light containing the designated wavelength band; that is, the light beam processing device 22 realizes the function corresponding to the second light beam splitting unit through the three-channel dichroic prism 22b.
Example three:
taking fig. 6 as an example, the light beam processing device 22 includes a half-reflecting and half-transmitting prism 22c, which includes a first light outlet portion and a second light outlet portion. The light beam processing device 22 implements light splitting processing on the light projected from the light inlet portion through the half-reflecting and half-transmitting prism 22c, so that the light is projected from the first and second light outlet portions to the cameras 21 corresponding to the respective light outlet portions;
correspondingly, the image capturing device 20 includes a first camera 21 disposed corresponding to the first light outlet portion and a second camera 21 disposed corresponding to the second light outlet portion. The first camera 21 generates a first stripe image based on the collected light, and the second camera 21 generates a second stripe image based on the collected light, where the first stripe image and the second stripe image each include stripes of at least two colors, and those stripes of at least two colors are identifiable.
It should be noted that: requiring the first stripe image and the second stripe image to each include stripes of at least two colors serves to make the stripes distinguishable by color; it is not a limitation on which colors are used.
In addition, in an embodiment, the light beam processing apparatus 22 further includes a filter 22d, where the light beam processing apparatus 22 performs separation processing on the light to be acquired by the designated camera 21 through the filter 22d, so that the designated camera 21 acquires light containing the designated wavelength band, and at least one of the plurality of cameras 21 is the designated camera 21.
In an alternative example, the optical filter 22d is disposed between the first light outlet portion and the first camera 21 so that the first camera 21 acquires light of a first designated wavelength band, and/or between the second light outlet portion and the second camera 21 so that the second camera 21 acquires light of a second designated wavelength band.
The following example illustrates this:
preferably, the first camera 21 is a black-and-white camera 21, the second camera 21 is a color camera 21, and the black-and-white camera 21 is disposed corresponding to the filter 22d.
The light emitting part 12 emits red light to the light transmitting part 13 in a sixth time period, and the red light is projected through the preset pattern (i.e., the preset coding stripes) arranged on the light transmitting part 13 to generate a sixth preset stripe pattern; the sixth preset stripe pattern is projected onto the target object in the form of red coding stripes, and the light is modulated by the target object and then transmitted to the image processing device; at this time, the red coding stripes are decomposed by the half-reflecting and half-transmitting prism 22c into two red light beams, at least one of which is collected by the black-and-white camera 21 to generate a first stripe image.
In addition, the light is filtered by a red filter 22d before being collected by the black-and-white camera 21. That is, the filter color of the filter 22d provided in front of a camera 21 corresponds to the color of the light beam to be collected by that camera 21.
The light emitting part 12 emits red light and blue light to the light transmitting part 13 in a seventh time period, and the red light and the blue light are projected through the preset pattern arranged on the light transmitting part 13 to generate a seventh preset stripe pattern; the seventh preset stripe pattern is projected onto the target object in the form of red-blue coding stripes, and the light is modulated by the target object and then transmitted to the image processing device; at this time, the red-blue coding stripes are decomposed by the half-reflecting and half-transmitting prism 22c into two red-blue light beams, at least one of which is collected by the color camera 21 to generate a second stripe image.
The illuminating part 30 projects illumination light onto the target object in a tenth time period; the illumination light is reflected by the target object and then transmitted to the image processing device, and the red light, the blue light and the green light in the illumination light are collected by the second camera 21 to generate a texture map. In this embodiment, if the optical filter 22d is disposed in front of the color camera 21, then to obtain the texture map of the target object, the red, green and blue light needs to be collected and recognized by the color camera 21 together with the black-and-white camera 21, that is, part of the colored light is collected and recognized by the color camera 21 and part by the black-and-white camera 21.
Further, since the first stripe image and the second stripe image both correspond to the same light transmitting portion 13, the stripes in the two images correspond to one another; specifically, the first stripe image and the second stripe image, when combined, correspond to the preset pattern on the light transmitting portion 13.
Specifically, the first stripe image is used as the reconstruction image and the second stripe image is used as the coding image. The second stripe image is collected by the color camera 21, and both the red stripes and the blue stripes in it can be identified and determined, so the coding sequence of each stripe in the second stripe image can be determined; based on the stripe correspondence between the first stripe image and the second stripe image, each stripe of the first stripe image can then be identified and matched through the coding sequence of the second stripe image, realizing three-dimensional reconstruction.
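A minimal sketch of this matching step, under the assumption that the stripes of both images have been extracted in the same left-to-right order, might look as follows; the helper and its inputs are hypothetical.

def label_reconstruction_stripes(recon_stripes, code_sequence):
    """Attach a code word to each stripe of the reconstruction image.

    recon_stripes : stripes extracted from the black-and-white
                    camera's first stripe image, ordered left to right.
    code_sequence : code words decoded from the color camera's second
                    stripe image, in the same order.

    Because both images correspond to the same light transmitting
    portion 13, the i-th stripe of one matches the i-th stripe of the
    other, so matching reduces to pairing by index."""
    if len(recon_stripes) != len(code_sequence):
        raise ValueError("stripe count mismatch between the two images")
    return list(zip(code_sequence, recon_stripes))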
It should be noted that: disposing the filter 22d in front of the black-and-white camera 21 is only an optional example, and whether a filter 22d is disposed in front of a camera 21 is not specifically limited in the present application; it is only necessary to ensure that stripes of at least two colors in the stripe images acquired by the cameras 21 can be identified and determined.
Specifically, if no optical filter 22d is arranged in front of the black-and-white camera 21, the first stripe image acquired by the black-and-white camera 21 includes red stripes; alternatively, if a blue filter 22d is disposed in front of the color camera 21, the second stripe image acquired by the color camera 21 includes blue stripes. Because the light emitting part 12 emits red light in the sixth time period and red light and blue light in the seventh time period, a red filter 22d cannot be disposed in front of the color camera 21; otherwise the stripe images acquired by the black-and-white camera 21 and the color camera 21 would both contain only red stripes, and stripes of at least two colors could not be identified. Alternatively, a dichroic filter 22d may be provided in front of the color camera 21, in which case the second stripe image acquired by the color camera 21 includes both red stripes and blue stripes.
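For illustration, the requirement that the stripe images, taken together, must contain identifiable stripes of at least two colors can be checked with a small hypothetical helper such as the following; the helper and color names are assumptions for the sketch.

def config_is_valid(identifiable_colors_per_camera):
    """Check that a camera/filter configuration leaves stripes of at
    least two colors identifiable across all acquired stripe images.

    identifiable_colors_per_camera: one set of color names per
    camera 21, e.g. [{"red"}, {"red", "blue"}] for a black-and-white
    camera behind a red filter plus an unfiltered color camera."""
    all_colors = set().union(*identifiable_colors_per_camera)
    return len(all_colors) >= 2

# A red filter in front of the color camera would leave only red
# stripes identifiable, which the check rejects:
print(config_is_valid([{"red"}, {"red"}]))          # False
print(config_is_valid([{"red"}, {"red", "blue"}]))  # True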
It should be noted that, in each period, the preset stripe pattern and the illumination light are projected at a very small time interval, so as to ensure that the three-dimensional scanner remains stationary or substantially stationary within the period and that the preset stripe pattern and the illumination light are projected onto (substantially) the same region of the target object.
It is worth emphasizing that: in this embodiment, the light beam processing device 22 transmits and reflects light through the half-reflecting and half-transmitting prism 22c to realize light splitting processing on the light projected from the light inlet portion, so that the light is projected from the first light outlet portion and the second light outlet portion to the cameras 21 respectively arranged corresponding to those light outlet portions; that is, the light beam processing device 22 realizes the function corresponding to the first light beam splitting unit through the half-reflecting and half-transmitting prism 22c.
At the same time, it is also worth emphasizing that: in this embodiment, the light beam processing device 22 performs separation processing on the light to be acquired by the designated camera 21 through the optical filter 22d, so that the designated camera 21 acquires light containing the designated wavelength band; that is, the light beam processing device 22 realizes the function corresponding to the second light beam splitting unit through the filter 22d.
It should be noted that: the first embodiment, the second embodiment and the third embodiment are listed in the present application as exemplary illustrations so that a person skilled in the art can understand the technical solution of the present application more clearly, and the present application is not specifically limited thereto. Other specific devices may also be used as practical solutions of the present application, provided that they can implement the functions of the light beam processing device 22 as described herein.
Further, it should be noted that, for example in the second embodiment and the third embodiment, after the light beam processing device 22 has realized the function corresponding to the second light beam splitting unit through the right-angle two-channel dichroic prism 22a or the three-channel dichroic prism 22b, the light beam processing device 22 may further realize the function corresponding to the second light beam splitting unit again through the optical filter 22d.
In summary, compared with the prior art, the invention has the following beneficial effects:
1. the stripe extraction algorithm based on spatial coding achieves the technical purpose that the target object can be three-dimensionally reconstructed with only a small number of two-dimensional images, thereby reducing the required frame rate of the camera 21 and the computational cost of the algorithm;
2. using color as the spatial coding information makes the coded information easy to identify, achieving the technical effect of improving identification accuracy;
3. based on the technical principle of the three-dimensional scanner, pattern projection can be performed in a simple transmission projection mode; furthermore, when the three-dimensional scanner performs pattern projection in the transmission projection mode, the hardware cost is greatly reduced;
4. in the case where the three-dimensional scanner performs pattern projection processing using a laser as the light source, the luminance and the depth of field of the projection device (i.e., the combination of the light emitting part 12 and the light transmitting part 13) can be improved, achieving the technical effect of low cost, high luminance and large depth of field.
That is, the three-dimensional scanner provided by the present application has the advantages of low hardware cost, low real-time frame rate requirement, high brightness and large depth of field of the optical system, and miniaturization of the equipment; moreover, the three-dimensional scanner can directly perform dynamic real-time three-dimensional scanning with colored textures on reflective, translucent and light-diffusing materials such as teeth and gums in the mouth.
According to an embodiment of the application, a three-dimensional scanning system is also provided.
Fig. 7 is a schematic diagram of a three-dimensional scanning system according to an embodiment of the present application. As shown in fig. 7, the three-dimensional scanning system includes: a three-dimensional scanner 71 and an image processor 73.
The three-dimensional scanner 71 is configured to project, in each preset period, the preset stripe pattern corresponding to that period onto the target object, and to collect the light modulated by the target object while the preset stripe pattern is projected onto it, so as to obtain a plurality of stripe images, where the stripes of each preset stripe pattern are arranged according to the preset color coding stripes, each preset stripe pattern includes stripes of at least one color of the preset color coding stripes, the plurality of preset stripe patterns include stripes of at least two colors of the preset color coding stripes, and the stripes in each preset stripe pattern are arranged consistently with the stripes of the same color in the preset color coding stripes;
an image processor 73, connected to the three-dimensional scanner 71 and configured to acquire the plurality of stripe images collected by the three-dimensional scanner 71, to determine the sequence of each stripe by using the stripe images as a coding map, and to perform three-dimensional reconstruction of the target object by using the stripe images as a reconstruction map.
it should be noted that: the three-dimensional scanner 71 is any one of the three-dimensional scanners provided in the above embodiments.
It should also be noted that: based on the spatially coded stripe extraction algorithm, the three-dimensional scanning system achieves the technical effects that the three-dimensional scanner 71 can perform pattern projection in a simple transmission projection mode and that the three-dimensional reconstruction of the target object can be realized with only a small number of two-dimensional images, thereby solving the technical problems that the three-dimensional reconstruction methods in the related art require high hardware cost, which is not conducive to the popularization and use of three-dimensional scanning devices.
In addition, the three-dimensional scanning system also improves three-dimensional identification accuracy by using color as the spatial coding information.
In an optional example, where the three-dimensional scanner 71 collects the light modulated by the target object through a plurality of cameras 21 to obtain a plurality of stripe images, and the plurality of cameras 21 include at least one black-and-white camera 21, the image processor 73 is further configured to: use the stripe image obtained by the at least one black-and-white camera 21 as a reconstruction image to perform three-dimensional reconstruction of the target object; and use the stripe image obtained by the at least one black-and-white camera 21 as a coding map to determine the sequence of each stripe, and/or use the stripe image obtained by at least one color camera 21 as a coding map to determine the sequence of each stripe.
According to an embodiment of the present application, a three-dimensional scanning method is further provided.
It should be noted that: the three-dimensional scanning method provided by the embodiment of the application is applied to the three-dimensional scanner provided by the embodiment of the application. The following describes a three-dimensional scanning method provided in an embodiment of the present application.
Fig. 8 is a flowchart of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 8, the three-dimensional scanning method includes:
step S801, respectively emitting, in each preset period, the initial light corresponding to that period, where each initial light consists of light of at least one color in the preset color coding stripes; after passing through the pattern of the preset color coding stripes arranged on the light transmitting portion 13, each initial light generates the corresponding preset color stripes, which are projected onto the target object;
step S803, respectively collecting, in the preset periods, the light modulated by the target object, and obtaining a plurality of stripe images based on the light, where the obtained stripe images are used as a coding map to determine the stripe sequences and as a reconstruction map to perform three-dimensional reconstruction of the target object;
step S805, determining the sequence of each stripe in the plurality of stripe images based on the coding map;
step S807, performing three-dimensional reconstruction on the reconstruction map based on the sequences to acquire the three-dimensional data of the target object.
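Purely as an illustrative sketch of steps S801 to S807, one scan cycle could be orchestrated as below; projector, cameras, decode and reconstruct are hypothetical stand-ins for the hardware and algorithms described in this application, not a definitive implementation.

def scan_one_period(projector, cameras, decode, reconstruct):
    """Minimal sketch of steps S801-S807 for one preset period."""
    # S801: emit the initial light; the pattern of preset color coding
    # stripes on the light transmitting portion 13 turns it into the
    # preset color stripes projected onto the target object.
    projector.project_preset_stripe_pattern()

    # S803: collect the light modulated by the target object to obtain
    # the stripe images (used as both coding map and reconstruction map).
    stripe_images = [camera.capture() for camera in cameras]

    # S805: determine the sequence of each stripe from the coding map.
    sequences = decode(stripe_images)

    # S807: reconstruct three-dimensional data of the target object
    # from the reconstruction map and the stripe sequences.
    return reconstruct(stripe_images, sequences)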
In summary, the three-dimensional scanning method provided in the embodiment of the present application is based on the spatially coded stripe extraction algorithm, so that the three-dimensional scanner can perform pattern projection in a simple transmission projection manner and the three-dimensional reconstruction of the target object can be achieved with only a small number of two-dimensional images, thereby solving the technical problems that the three-dimensional reconstruction methods in the related art require high hardware cost, which is not conducive to the popularization and use of three-dimensional scanning devices.
In addition, the three-dimensional scanning method also achieves the technical effect of improving three-dimensional identification accuracy by using color as the spatial coding information.
In an optional example, the three-dimensional scanning method further comprises: projecting illumination light onto the target object and acquiring texture data of the target object based on the illumination light; and acquiring color three-dimensional data of the target object based on the three-dimensional data and the texture data of the target object.
Alternatively, the texture data may be acquired by a single camera 21, or may be synthesized from data acquired by a plurality of cameras 21.
Preferably, in step S803, the light modulated by the target object is collected and at least two stripe images are acquired based on the light, where at least one stripe image is acquired by the black-and-white camera 21, and the stripe image acquired by the black-and-white camera 21 is used as the reconstruction image.
Specifically, in step S805, the sequence of each stripe in the plurality of stripe images is determined based on the coding map, and the coding sequence is determined based on the arrangement information and the color information of each stripe in the coding map. For example, if four stripes arranged red, green, green, red are coded with red as (1,0) and green as (0,1), the coding sequence is (1,0), (0,1), (0,1), (1,0); and if five stripes arranged red, blue, green, blue, red are coded with red as (1,0,0), green as (0,1,0) and blue as (0,0,1), the coding sequence is (1,0,0), (0,0,1), (0,1,0), (0,0,1), (1,0,0).
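The decoding rule of this example can be reproduced in a few lines of Python; the codebooks below copy the code words given above, while the function and variable names are hypothetical.

CODE_2 = {"red": (1, 0), "green": (0, 1)}
CODE_3 = {"red": (1, 0, 0), "green": (0, 1, 0), "blue": (0, 0, 1)}

def coding_sequence(stripe_colors, codebook):
    """Map an ordered list of stripe colors to its coding sequence."""
    return [codebook[color] for color in stripe_colors]

print(coding_sequence(["red", "green", "green", "red"], CODE_2))
# [(1, 0), (0, 1), (0, 1), (1, 0)]
print(coding_sequence(["red", "blue", "green", "blue", "red"], CODE_3))
# [(1, 0, 0), (0, 0, 1), (0, 1, 0), (0, 0, 1), (1, 0, 0)]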
specifically, in step S807, stripe matching is performed on each stripe of the reconstructed image based on the coding sequence, for binocular reconstruction, twoimage acquisition devices 20 are provided in combination with the present embodiment, stripe matching is performed on the reconstructed images of the twoimage acquisition devices 20, point cloud reconstruction is performed after matching, three-dimensional data of the target object is obtained, for monocular reconstruction, oneimage acquisition device 20 is provided in combination with the present embodiment, stripe matching is performed on the reconstructed image of theimage acquisition device 20 and a preset color coding stripe provided on thelight transmission portion 13, point cloud reconstruction is performed after matching, and three-dimensional data of the target object is obtained.
This is explained below with specific examples:
in an alternative example, the light emitting part 12 and the light transmitting part 13 project red-blue color coding stripes onto the target object in a first time period; the red-blue color coding stripes are modulated by the target object and then transmitted to the image processing device, where the light of the red-blue color coding stripes is separated by the half-reflecting and half-transmitting prism 22c into at least one beam of red-blue color coding stripe light, one beam of which is collected by the color camera 21, and the color camera 21 generates the corresponding red-blue color coding stripe image. The light emitting part 12 and the light transmitting part 13 then project blue coding stripes onto the target object in a second time period; the blue coding stripes are modulated by the target object and then transmitted to the image processing device, where the light of the blue coding stripes is separated by the half-reflecting and half-transmitting prism 22c into at least one beam of blue coding stripe light, one beam of which is collected by the black-and-white camera 21 through the blue filter 22d, and the black-and-white camera 21 generates the corresponding blue stripe image.
In addition, the illuminating part 30 irradiates white light onto the target object in a third time period; the white light is reflected by the target object and collected by the color camera 21, and the color camera 21 generates a texture map. The coding sequence of each stripe is determined based on the red-blue color coding stripe image; stripe matching is performed on each stripe of the blue stripe image based on the coding sequence to realize three-dimensional reconstruction and obtain the three-dimensional data of the target object; and the true color three-dimensional data of the target object is obtained based on the three-dimensional data and the texture map.
In an alternative example, the light emitting part 12 and the light transmitting part 13 project red-green color coding stripes onto the target object in a first time period; the red-green color coding stripes are modulated by the target object and then transmitted to the image processing device, where the light of the red-green color coding stripes is decomposed by the right-angle two-channel dichroic prism 22a into a beam of red-green color coding stripe light, which is collected by the color camera 21, and the color camera 21 generates the corresponding red-green color coding stripe image. The light emitting part 12 and the light transmitting part 13 then project blue coding stripes onto the target object in a second time period; the blue coding stripes are modulated by the target object and then transmitted to the image processing device, where the light of the blue coding stripes is decomposed by the right-angle two-channel dichroic prism 22a into a beam of blue coding stripe light, which is collected by the black-and-white camera 21, and the black-and-white camera 21 generates the corresponding blue stripe image.
In addition, the illuminating part 30 irradiates white light onto the target object in a third time period; the white light is reflected by the target object and collected by the color camera 21 and the black-and-white camera 21, where the color camera 21 generates a texture map based on the red light and the green light, and the black-and-white camera 21 generates a texture map based on the blue light. The coding sequence of each stripe is determined based on the red-green color coding stripe image; stripe matching is performed on each stripe of the blue stripe image based on the coding sequence to realize three-dimensional reconstruction and obtain the three-dimensional data of the target object; the texture map under white light is synthesized from the texture map of the color camera 21 and the texture map of the black-and-white camera 21, and the true color three-dimensional data of the target object is obtained based on the three-dimensional data and the synthesized texture map.
In an alternative example, the light emitting part 12 and the light transmitting part 13 project red coding stripes onto the target object in a first time period; the red coding stripes are modulated by the target object and then transmitted to the image processing device, where the light of the red coding stripes is decomposed by the three-channel dichroic prism 22b into a beam of red coding stripe light, which is collected by the first black-and-white camera 21, and the first black-and-white camera 21 generates the corresponding red coding stripe image. The light emitting part 12 and the light transmitting part 13 project green coding stripes onto the target object in a second time period; the green coding stripes are modulated by the target object and then transmitted to the image processing device, where the light of the green coding stripes is decomposed by the three-channel dichroic prism 22b into a beam of green coding stripe light, which is collected by the second black-and-white camera 21, and the second black-and-white camera 21 generates the corresponding green coding stripe image. The light emitting part 12 and the light transmitting part 13 then project blue coding stripes onto the target object in a third time period; the blue coding stripes are modulated by the target object and then transmitted to the image processing device, where the light of the blue coding stripes is decomposed by the three-channel dichroic prism 22b into a beam of blue coding stripe light, which is collected by the third black-and-white camera 21, and the third black-and-white camera 21 generates the corresponding blue coding stripe image.
In addition, the illuminating part 30 irradiates white light onto the target object in a fourth time period; the white light is reflected by the target object and collected by the three black-and-white cameras 21, where the first black-and-white camera 21 generates a texture map based on the red light, the second black-and-white camera 21 generates a texture map based on the green light, and the third black-and-white camera 21 generates a texture map based on the blue light. The coding sequence of each stripe is determined based on the combination of the red stripe image, the green stripe image and the blue stripe image; stripe matching is performed on each stripe of the red, green and blue stripe images based on the coding sequence to realize three-dimensional reconstruction and obtain the three-dimensional data of the target object; the texture map under white light is synthesized from the texture maps of the three black-and-white cameras 21, and the true color three-dimensional data of the target object is obtained based on the three-dimensional data and the synthesized texture map.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order different from that presented herein.
An embodiment of the present invention provides a storage medium on which a program is stored, where the program, when executed by a processor, implements the above three-dimensional scanning method.
An embodiment of the present invention provides a processor configured to run a program, where the above three-dimensional scanning method is executed when the program runs.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
In addition, in the above embodiments of the present invention, the description of each embodiment has a respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related description of other embodiments.