Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments in the present application without inventive effort fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of an image denoising method according to the present application, including the following steps:
step S11, acquiring an infrared image and a visible light image at the current time, where the infrared image includes an infrared luminance channel and the visible light image includes a visible luminance channel and a visible color channel. Taking the visible light image in YUV format as an example, the visible luminance channel refers to the Y channel, and the visible color channel refers to the UV channel.
In the process of acquiring a monitored image, the monitoring device is usually subject to interference from the external environment and from its own software and hardware, for example, defects in sensor materials, electronic components, circuit structures, transmission media, recording devices, and the like, so noise in the image is often difficult to avoid.
Noise in an image not only degrades its visual appearance but also directly affects the performance of downstream computer vision applications such as face recognition and vehicle detection. For this reason, noise suppression must be performed by a noise reduction algorithm; however, when the noise in an image is severe, the algorithm has difficulty distinguishing signal from noise, which poses a great challenge. This phenomenon becomes more pronounced in low-illumination environments, and low-illumination conditions are common in practical application scenarios. Although a white-light fill lamp can compensate well for weak ambient light, it also causes fairly serious light pollution, so its usage scenarios are limited. By contrast, an infrared fill lamp does not introduce excessive light pollution, and the infrared image collected by an infrared camera has a higher signal-to-noise ratio. The infrared image has an inherent defect, however: it carries almost no color information about an object and therefore cannot reflect the real information of the object well. Using either the visible light image or the infrared image alone thus has disadvantages. Since the two images are complementary, if the infrared image information can guide the noise reduction of the visible light image, signal and noise can be discriminated effectively, so that the noise reduction effect on the visible light image is better, the real scene information can be reflected, and the defect that the infrared image cannot convey object color information is avoided.
Therefore, a combined noise reduction algorithm needs to be developed that processes the image by mining infrared image information and feeding it back into the noise reduction process of the visible light image, so that the noise reduction effect on the visible light image is better and the visual appearance is improved.
However, the infrared image itself may have a low signal-to-noise ratio region, an infrared information loss region, an overexposure region, and the like, and the infrared image also needs noise reduction processing. Of course, the infrared luminance channel of the infrared image after noise reduction may also be used to perform noise reduction guidance on the visible light image. In addition, after the infrared image and the visible light image are acquired, the infrared image and the visible light image are registered and aligned, so that subsequent noise reduction processing is facilitated.
Step S12, performing spatial domain noise reduction and time domain noise reduction on the infrared image and the visible light image, respectively, by using signals of the infrared brightness channel, the visible brightness channel, and the visible color channel. After the infrared image and the visible light image at the current moment are obtained, the color information of the infrared image is discarded, because the infrared image carries almost no color information of the photographed object; the infrared brightness channel, the visible brightness channel, and the visible color channel are thereby obtained. The visible color channel is not limited to one channel; that is, the present application does not limit the representation format of the visible light image (for example HSV or YUV), as long as a brightness channel and a color channel are included. When there are multiple color channels, the same noise reduction algorithm is used for all of them. Taking YUV as an example, if the color channels consist of a U channel and a V channel, the same denoising algorithm is applied to the U channel and the V channel to obtain the denoising strength at each position; the two denoising strengths are then weighted and fused to obtain the denoising strength corresponding to the visible color channel, where the weighting coefficients can be specified externally. Hereinafter, the "three channels" are the infrared brightness channel, the visible brightness channel, and the visible color channel; when there are multiple visible color channels, the "three channels" include the fused visible color channel.
After the signals of the three channels are obtained, the signals can be used for respectively carrying out space domain noise reduction and time domain noise reduction on the infrared image and the visible light image. When the noise reduction is performed on the visible light image in a space domain or a time domain, different noise reduction intensities can be adopted for the visible brightness channel and the visible color channel so as to improve the performance of the noise reduction algorithm. In addition, the order of spatial domain noise reduction and time domain noise reduction is not limited, and time domain noise reduction can be performed first and then spatial domain noise reduction can be performed, or spatial domain noise reduction can be performed first and then time domain noise reduction can be performed.
In this embodiment, when performing spatial domain noise reduction and time domain noise reduction on the infrared image and the visible light image, signals of one or more of the infrared brightness channel of the infrared image, the visible brightness channel of the visible light image, and the visible color channel can be used. For spatial domain or time domain noise reduction of the visible light image, different noise reduction algorithms can be realized according to the different characteristics of the brightness and color channels, so that differentiated noise reduction processing is performed on the two kinds of channel. Therefore, the image denoising method can be used to process the image, reduce the noise in the image, improve the signal-to-noise ratio of the image, and produce a better visual appearance.
In some embodiments, referring to fig. 2, fig. 2 is a flowchart illustrating an embodiment of step S12 in fig. 1, and the following steps may be performed to perform spatial domain noise reduction and temporal domain noise reduction on the infrared image and the visible light image, respectively.
Step S21, acquiring motion information corresponding to each of the infrared brightness channel, the visible brightness channel, and the visible color channel.
Specifically, referring to fig. 3, fig. 3 is a flowchart illustrating an embodiment of step S21 in fig. 2, and the motion information corresponding to the infrared luminance channel, the visible luminance channel, and the visible color channel may be obtained through the following steps.
Step S31, acquiring a first frame difference image, a second frame difference image, and a third frame difference image between the current frame image and the previous frame image of the infrared brightness channel, the visible brightness channel, and the visible color channel, respectively. Here, the third frame difference image may be obtained from a single color channel or by fusing multiple color channels, and the fusion weights may be specified externally.
In any channel, the previous frame image is subtracted from the current frame image to obtain the frame difference image between the two frames. In another embodiment, when a noise-reduced version of the previous frame image exists, it may be subtracted from the current frame image instead to obtain the corresponding frame difference image. Since the infrared brightness channel, the visible brightness channel, and the visible color channel total three channels, three corresponding frame difference images can be obtained.
Step S32, performing mean filtering processing on the first frame difference image, the second frame difference image, and the third frame difference image, and taking the absolute value result as the motion information corresponding to the infrared luminance channel, the visible luminance channel, and the visible color channel, respectively.
After the frame difference image corresponding to each of the three channels is obtained, n x n mean filtering is performed on each frame difference image, and the absolute value of the mean result is taken as the corresponding motion information. The window size n of the mean filtering is specified externally, and the specific operation of mean filtering is the same as that disclosed in the prior art and is not described again here. After this processing, the motion information corresponding to the infrared brightness channel, the visible brightness channel, and the visible color channel is obtained. In addition, the larger the motion information value of a pixel, the more likely that pixel belongs to a motion region; conversely, the more likely it belongs to a static region.
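By way of illustration only, the frame differencing and mean filtering described above can be sketched in code; the function name motion_info and the default window size are assumptions for illustration and are not part of the present application:

```python
import numpy as np

def motion_info(cur, prev, n=3):
    """Motion information of one channel: |n x n mean filter of (cur - prev)|.

    cur and prev are 2-D arrays holding the current frame and the previous
    frame (or its noise-reduced version) of a single channel; n is the
    externally specified mean-filter window size (assumed odd here).
    """
    diff = cur.astype(np.float64) - prev.astype(np.float64)
    pad = n // 2
    padded = np.pad(diff, pad, mode="edge")  # replicate border pixels
    out = np.empty_like(diff)
    h, w = diff.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + n, j:j + n].mean()
    # Larger values indicate pixels more likely to lie in a motion region.
    return np.abs(out)
```

A static region yields motion information near zero, while a region that changed between frames yields larger values.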
The present application obtains the motion information of each channel by frame differencing and mean filtering before the subsequent noise reduction processing. These operations are mature, which improves the accuracy and efficiency of the noise reduction algorithm.
Step S22, acquiring, by using the motion information, the infrared brightness spatial domain noise reduction intensity, the visible brightness spatial domain noise reduction intensity, and the visible color spatial domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel, and the visible color channel; and acquiring, by using the motion information, the infrared brightness time domain noise reduction intensity, the visible brightness time domain noise reduction intensity, and the visible color time domain noise reduction intensity respectively corresponding to those channels.
After the motion information corresponding to the three channels is obtained, the motion information is used to obtain the spatial domain noise reduction intensity and the temporal domain noise reduction intensity corresponding to the three channels, that is, three spatial domain noise reduction intensities and three temporal domain noise reduction intensities are obtained in total, so as to perform spatial domain noise reduction and temporal domain noise reduction on the infrared image and the visible light image respectively in different channels, and a specific process of obtaining the noise reduction intensity will be described below.
Step S23, performing spatial domain noise reduction on the infrared brightness channel, the visible brightness channel, and the visible color channel by using the infrared brightness spatial domain noise reduction intensity, the visible brightness spatial domain noise reduction intensity, and the visible color spatial domain noise reduction intensity, respectively, so as to perform spatial domain noise reduction on the infrared image and the visible light image; and performing time domain noise reduction on the three channels by using the infrared brightness time domain noise reduction intensity, the visible brightness time domain noise reduction intensity, and the visible color time domain noise reduction intensity, respectively, so as to perform time domain noise reduction on the infrared image and the visible light image.
After the three space domain noise reduction strengths and the three time domain noise reduction strengths are obtained, the three space domain noise reduction strengths are utilized to respectively perform space domain noise reduction on the three corresponding channels, and the three time domain noise reduction strengths are utilized to respectively perform time domain noise reduction on the three corresponding channels, so that the noise reduction processing on the infrared image and the visible light image is completed. The specific spatial domain or temporal domain denoising process is the same as that in the prior art, and is not described herein again.
In this embodiment, the motion information of each channel is obtained by frame differencing and mean filtering; three spatial domain noise reduction intensities and three time domain noise reduction intensities corresponding to the three channels are then obtained from the motion information, and the infrared image and the visible light image are subjected to noise reduction processing in the three channels respectively. When noise reduction is performed on the visible light image in the spatial or time domain, different noise reduction intensities can be realized according to the different characteristics of the brightness and color channels, so that differentiated noise reduction processing is performed on the two kinds of channel. Therefore, the image denoising method can be used to process the image, reduce the noise in it, and improve the performance of the denoising algorithm.
In some embodiments, referring to fig. 4, fig. 4 is a flowchart illustrating an embodiment of step S22 in fig. 2, and the spatial noise reduction strength corresponding to each of the three channels may be obtained through the following steps.
Step S41, obtaining the infrared brightness initial spatial domain noise reduction intensity, the visible brightness initial spatial domain noise reduction intensity, and the visible color initial spatial domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel, and the visible color channel, according to the edge information and non-edge information of those channels.
Wavelet denoising based on a hard threshold is taken as an example. The initial spatial domain noise reduction intensity is the size of the hard threshold: for a non-edge region of the image, a larger hard threshold is specified for region smoothing, and for an edge region, a smaller hard threshold is specified to preserve edges. Edge regions and non-edge regions can be distinguished by an edge detection operator such as Sobel or Prewitt. In addition, when the edge and non-edge regions are computed, the image relied on can be the current frame image or any historical noise-reduced frame. These operations are performed in the infrared brightness channel, the visible brightness channel, and the visible color channel respectively, so as to obtain the infrared brightness initial spatial domain noise reduction intensity, the visible brightness initial spatial domain noise reduction intensity, and the visible color initial spatial domain noise reduction intensity, i.e., three initial spatial domain noise reduction intensities in total. Of course, in other noise reduction algorithms such as NLM and BM3D, the initial spatial domain noise reduction intensities of the different channels can be specified according to the specific algorithm.
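As a hedged sketch of the edge-adaptive hard threshold described above (the Sobel kernels are standard, but the gradient threshold and the two hard-threshold values are illustrative assumptions, not values from the present application):

```python
import numpy as np

def init_spatial_strength(img, grad_thresh=30.0, t_edge=5.0, t_flat=20.0):
    """Per-pixel initial spatial domain hard threshold:
    small (t_edge) in edge regions to preserve edges,
    large (t_flat) in non-edge regions for smoothing."""
    img = img.astype(np.float64)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T  # Sobel kernels for horizontal / vertical gradients
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    grad = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            grad[i, j] = np.hypot((win * kx).sum(), (win * ky).sum())
    return np.where(grad > grad_thresh, t_edge, t_flat)
```

Pixels whose gradient magnitude exceeds the threshold are treated as edge pixels and receive the smaller hard threshold.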
Step S42, acquiring the infrared brightness differentiated spatial domain noise reduction intensity, the visible brightness differentiated spatial domain noise reduction intensity, and the visible color differentiated spatial domain noise reduction intensity by using the motion information; and acquiring the spatial domain fusion noise reduction intensity by using the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity.
Specifically, referring to fig. 5, fig. 5 is a flowchart illustrating an embodiment of step S42 in fig. 4, and the infrared brightness differentiated spatial domain noise reduction intensity, the visible brightness differentiated spatial domain noise reduction intensity, and the visible color differentiated spatial domain noise reduction intensity may be obtained through the following steps.
Step S51, determining whether the motion information is smaller than a first threshold.
Step S52, if so, assigning the corresponding differentiated spatial domain noise reduction intensity as a first intensity.
Step S53, otherwise, it is further determined whether the motion information is greater than a second threshold, where the second threshold is greater than the first threshold.
Step S54, if so, assigning the corresponding differentiated spatial domain noise reduction intensity as a second intensity, where the second intensity is greater than the first intensity.
Step S55, otherwise, assigning the corresponding differentiated spatial domain noise reduction intensity as a third intensity, where the third intensity is greater than or equal to the first intensity, less than or equal to the second intensity, and positively correlated with the motion information.
Steps S51-S55 are repeated for the infrared brightness channel, the visible brightness channel, and the visible color channel respectively, so as to obtain the differentiated spatial domain noise reduction intensity corresponding to each channel. Taking the infrared brightness channel as an example, please refer to fig. 6, which is a diagram of the functional relationship between the infrared brightness differentiated spatial domain noise reduction intensity and the motion information. In the diagram, T1 and T2 correspond respectively to the first threshold and the second threshold of the motion information of the infrared brightness channel, and S1 and S2 correspond respectively to the first intensity and the second intensity of the infrared brightness differentiated spatial domain noise reduction intensity. As can be seen from fig. 6, in the infrared brightness channel, when the motion information is less than the first threshold T1, the differentiated spatial domain noise reduction intensity is the first intensity S1; when the motion information is greater than the second threshold T2, it is the second intensity S2; and when the motion information is between T1 and T2, it is a third intensity that lies between S1 and S2 and is positively correlated with the motion information, changing linearly from S1 to S2 as the motion information changes from T1 to T2.
For each of the three channels, namely the infrared brightness channel, the visible brightness channel, and the visible color channel, the first threshold, the second threshold, the first intensity, and the second intensity can be specified externally and may be the same or different across channels; likewise, the functional relationship between the differentiated spatial domain noise reduction intensity and the motion information may be the same or different in each channel. When the motion information value of a pixel in a channel is smaller than the first threshold, the pixel is located in a static region; when it is larger than the second threshold, the pixel is located in a motion region.
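The piecewise relationship of fig. 6 can be written directly; a minimal sketch (the function name and the sample values in the usage below are assumptions for illustration):

```python
def diff_spatial_strength(motion, t1, t2, s1, s2):
    """Differentiated spatial domain noise reduction intensity versus
    motion information: s1 below the first threshold t1 (static region),
    s2 above the second threshold t2 (motion region, s2 > s1), and a
    linear ramp from s1 to s2 in between."""
    if motion < t1:
        return s1
    if motion > t2:
        return s2
    return s1 + (s2 - s1) * (motion - t1) / (t2 - t1)
```

For example, with t1 = 10, t2 = 20, s1 = 1, and s2 = 5, a pixel whose motion information is 15 receives an intensity of 3.0.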
In this embodiment, the motion region and the static region are processed differently when the spatial domain noise reduction intensity is obtained: the differentiated spatial domain noise reduction intensity of the motion region is significantly higher than that of the static region. This effectively suppresses the trailing effect that would be caused by simply increasing the noise reduction intensity of the motion region, so that the final noise reduction effect is better and the accuracy of the noise reduction algorithm is improved.
Further, the spatial domain fusion noise reduction intensity can be obtained by utilizing the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity through the following steps:
and taking the sum of the product of the infrared brightness initial spatial domain noise reduction intensity and the first spatial domain fusion coefficient and the product of the visible brightness initial spatial domain noise reduction intensity and the second spatial domain fusion coefficient as the spatial domain fusion noise reduction intensity, wherein the sum of the first spatial domain fusion coefficient and the second spatial domain fusion coefficient is 1.
That is, the spatial domain fusion noise reduction intensity beta_fuvisy is calculated by the following formula:
beta_fuvisy = beta_orinir * m1 + beta_orivisy * m2;
where beta_orinir and beta_orivisy respectively represent the infrared brightness initial spatial domain noise reduction intensity and the visible brightness initial spatial domain noise reduction intensity, and m1 and m2 respectively represent the first spatial domain fusion coefficient and the second spatial domain fusion coefficient; both range from 0 to 1, and m1 + m2 = 1.
Specifically, the spatial domain fusion noise reduction intensity beta_fuvisy can be calculated according to the quality difference between the infrared image and the visible light image; here, the brightness of the image is taken as the example criterion for judging image quality. First, a first brightness range and a second brightness range are defined: if the brightness of a pixel in the infrared image lies within the first brightness range, the quality of that pixel in the infrared image is excellent, and if the brightness of a pixel in the visible light image lies within the second brightness range, the quality of that pixel in the visible light image is excellent. The first brightness range and the second brightness range are both specified externally and may be the same or different.
When the brightness of a pixel in the infrared image is within the first brightness range and the brightness of the corresponding pixel in the visible light image is within the second brightness range, the quality of both the infrared image and the visible light image is considered excellent, and m1 = m2 = 0.5 can be defined; that is, the fusion weights of the infrared brightness channel and the visible brightness channel are equivalent.
When the brightness of a pixel in the infrared image is within the first brightness range and the brightness of the corresponding pixel in the visible light image is not within the second brightness range, the quality of the infrared image is considered better than that of the visible light image, and m1 > 0.5 > m2 can be defined; that is, spatial domain fusion is performed mainly from the infrared brightness channel.
When the brightness of a pixel in the infrared image is not within the first brightness range and the brightness of the corresponding pixel in the visible light image is within the second brightness range, the quality of the visible light image is considered better than that of the infrared image, and m2 > 0.5 > m1 can be defined; that is, spatial domain fusion is performed mainly from the visible brightness channel.
When the brightness of a pixel in the infrared image is not within the first brightness range and the brightness of the corresponding pixel in the visible light image is not within the second brightness range, the quality of both the infrared image and the visible light image is considered poor. In this case m2 > 0.5 > m1 is still defined, that is, spatial domain fusion is still performed mainly from the visible brightness channel, but the visible brightness initial spatial domain noise reduction intensity used in the formula is replaced by that of a historical noise-reduced visible light image.
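The four quality cases above can be summarised as follows; this is a sketch only, and the concrete 0.7/0.3 split is an illustrative choice satisfying m1 > 0.5 > m2 (the present application fixes only the inequalities, not the values):

```python
def fusion_weights(ir_ok, vis_ok):
    """Spatial domain fusion coefficients (m1, m2) with m1 + m2 = 1.

    ir_ok / vis_ok indicate whether the pixel's brightness lies inside
    the externally specified first / second brightness range."""
    if ir_ok and vis_ok:
        return 0.5, 0.5  # both excellent: equivalent weights
    if ir_ok:
        return 0.7, 0.3  # infrared better: m1 > 0.5 > m2 (illustrative)
    # Visible better, or both poor (in the latter case beta_orivisy is
    # taken from a historical noise-reduced visible light image).
    return 0.3, 0.7      # m2 > 0.5 > m1

def fused_spatial_strength(beta_orinir, beta_orivisy, m1, m2):
    """beta_fuvisy = beta_orinir * m1 + beta_orivisy * m2."""
    return beta_orinir * m1 + beta_orivisy * m2
```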
Of course, in other embodiments, metrics such as the variance of the image or its high and low frequency information may be used as the criterion for judging the quality of the infrared image and the visible light image; the present application does not limit this criterion.
In this embodiment, when the infrared image is used to guide the spatial domain noise reduction of the visible light image, various cases are considered: the infrared image quality may be worse than, better than, or comparable to that of the visible light image. For the case where both the visible light image and the infrared image are of poor quality, the historical denoising result, which has less noise and better edge information, is used to generate the spatial domain fusion noise reduction intensity, so that the spatial domain noise reduction better distinguishes edge regions from non-edge regions, the final noise reduction effect is better, and the accuracy of the noise reduction algorithm is improved.
Step S43, acquiring the infrared brightness spatial domain noise reduction intensity by using the infrared brightness initial spatial domain noise reduction intensity and the infrared brightness differentiated spatial domain noise reduction intensity; acquiring the visible brightness spatial domain noise reduction intensity by using the spatial domain fusion noise reduction intensity and the visible brightness differentiated spatial domain noise reduction intensity; and acquiring the visible color spatial domain noise reduction intensity by using the visible color initial spatial domain noise reduction intensity and the visible color differentiated spatial domain noise reduction intensity.
Specifically, step S43 includes:
obtaining a first product of the infrared brightness differentiated spatial domain noise reduction intensity and a first weight and a second product of the infrared brightness initial spatial domain noise reduction intensity and a second weight, and taking the sum of the first product and the second product as the infrared brightness spatial domain noise reduction intensity, where the sum of the first weight and the second weight is 1; obtaining a third product of the visible brightness differentiated spatial domain noise reduction intensity and a third weight and a fourth product of the spatial domain fusion noise reduction intensity and a fourth weight, and taking the sum of the third product and the fourth product as the visible brightness spatial domain noise reduction intensity, where the sum of the third weight and the fourth weight is 1; and obtaining a fifth product of the visible color differentiated spatial domain noise reduction intensity and a fifth weight and a sixth product of the visible color initial spatial domain noise reduction intensity and a sixth weight, and taking the sum of the fifth product and the sixth product as the visible color spatial domain noise reduction intensity, where the sum of the fifth weight and the sixth weight is 1.
That is, the infrared luminance spatial domain noise reduction intensity, the visible luminance spatial domain noise reduction intensity, and the visible color spatial domain noise reduction intensity may be calculated using the following formulas:
beta_nir = alpha0 * beta_movenir + (1 - alpha0) * beta_orinir,
beta_visy = alpha1 * beta_movevisy + (1 - alpha1) * beta_fuvisy,
beta_visuv = alpha2 * beta_movevisuv + (1 - alpha2) * beta_orivisuv;
where beta_nir, beta_visy, and beta_visuv respectively represent the infrared brightness spatial domain noise reduction intensity, the visible brightness spatial domain noise reduction intensity, and the visible color spatial domain noise reduction intensity; beta_movenir, beta_movevisy, and beta_movevisuv respectively represent the infrared brightness differentiated spatial domain noise reduction intensity, the visible brightness differentiated spatial domain noise reduction intensity, and the visible color differentiated spatial domain noise reduction intensity; beta_orinir and beta_orivisuv respectively represent the infrared brightness initial spatial domain noise reduction intensity and the visible color initial spatial domain noise reduction intensity; beta_fuvisy represents the spatial domain fusion noise reduction intensity; and alpha0, alpha1, and alpha2 respectively represent the preset first weight, third weight, and fifth weight, which can be specified externally and all range from 0 to 1.
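The three weighted sums above can be sketched as one function; this is a sketch under the simplifying assumption that the intensities are scalars, although in practice they are per-pixel maps:

```python
def final_spatial_strengths(beta_movenir, beta_orinir,
                            beta_movevisy, beta_fuvisy,
                            beta_movevisuv, beta_orivisuv,
                            alpha0, alpha1, alpha2):
    """Final spatial domain noise reduction intensities of the three
    channels; alpha0, alpha1, alpha2 are externally specified weights
    in [0, 1] (the first, third, and fifth weights)."""
    beta_nir = alpha0 * beta_movenir + (1 - alpha0) * beta_orinir
    beta_visy = alpha1 * beta_movevisy + (1 - alpha1) * beta_fuvisy
    beta_visuv = alpha2 * beta_movevisuv + (1 - alpha2) * beta_orivisuv
    return beta_nir, beta_visy, beta_visuv
```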
In this embodiment, the spatial domain noise reduction intensities corresponding to the three channels are each obtained by weighted calculation with externally specified weight coefficients. The guidance of the infrared image over the spatial domain noise reduction of the visible light image is realized in this calculation, and the differentiated spatial domain noise reduction intensities for the motion region and the static region are also introduced into the final calculation, so that the final noise reduction effect is better and the accuracy of the noise reduction algorithm is improved.
In some embodiments, referring to fig. 7, fig. 7 is a schematic flowchart of another embodiment of step S22 in fig. 2, and the time domain noise reduction intensities corresponding to the three channels may be obtained through the following steps.
Step S61, acquiring the infrared brightness initial time domain noise reduction intensity, the visible brightness initial time domain noise reduction intensity, and the visible color initial time domain noise reduction intensity respectively corresponding to the infrared brightness channel, the visible brightness channel, and the visible color channel by using the motion information, and assigning the infrared brightness time domain noise reduction intensity as the infrared brightness initial time domain noise reduction intensity.
Specifically, after the motion information corresponding to the three channels is obtained in steps S31-S32, the infrared brightness initial time domain noise reduction intensity, the visible brightness initial time domain noise reduction intensity, and the visible color initial time domain noise reduction intensity are assigned as the motion information corresponding to the infrared brightness channel, the visible brightness channel, and the visible color channel, respectively. The infrared brightness time domain noise reduction intensity is then assigned as the infrared brightness initial time domain noise reduction intensity; that is, the infrared brightness time domain noise reduction intensity equals the motion information corresponding to the infrared brightness channel.
And step S62, acquiring a first time domain fusion noise reduction intensity by using the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity, and assigning the visible brightness time domain noise reduction intensity as the first time domain fusion noise reduction intensity.
Specifically, the first time domain fusion noise reduction strength may be obtained by:
and taking the sum of the product of the infrared brightness initial time domain noise reduction intensity and the first time domain fusion coefficient and the product of the visible brightness initial time domain noise reduction intensity and the second time domain fusion coefficient as the first time domain fusion noise reduction intensity, wherein the sum of the first time domain fusion coefficient and the second time domain fusion coefficient is 1.
That is, the first time domain fusion noise reduction intensity gama_fuvisy is calculated using the following formula:
gama_fuvisy = gama_orinir * n1 + gama_orivisy * n2;
wherein gama_fuvisy represents the first time domain fusion noise reduction intensity, gama_orinir and gama_orivisy respectively represent the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity, and n1 and n2 respectively represent the first time domain fusion coefficient and the second time domain fusion coefficient, both ranging from 0 to 1, with n1 + n2 = 1.
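The weighted fusion above can be sketched as follows; this is a minimal illustration in which the function name and scalar inputs are assumptions, and a real implementation would apply the same formula per pixel:

```python
def fuse_temporal_strength(gama_orinir, gama_orivisy, n1):
    """Weighted fusion of the infrared-brightness and visible-brightness
    initial time domain noise reduction intensities.

    n1 is the first time domain fusion coefficient (0..1); the second
    coefficient n2 is constrained to n2 = 1 - n1 so that n1 + n2 = 1.
    """
    n2 = 1.0 - n1
    return gama_orinir * n1 + gama_orivisy * n2
```

With n1 = 0.5 the two channels contribute equally; raising n1 toward 1 shifts the fusion toward the infrared brightness channel.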
As with the spatial domain fusion, the coefficients used to calculate the first time domain fusion noise reduction intensity gama_fuvisy can be determined according to the quality difference between the infrared image and the visible light image; here, image brightness is taken as the example criterion for judging image quality. First, a first brightness range and a second brightness range are defined: if the brightness of a pixel point in the infrared image is within the first brightness range, the quality of that pixel point in the infrared image is considered good; if the brightness of a pixel point in the visible light image is within the second brightness range, the quality of that pixel point in the visible light image is considered good. The first brightness range and the second brightness range are both externally specified and may be the same or different.
When the brightness of a certain pixel point in the infrared image is within the first brightness range and the brightness of the corresponding pixel point in the visible light image is not within the second brightness range, the quality of the infrared image is considered to be better than that of the visible light image, and n1 > 0.5 > n2 may be set, that is, the time domain fusion is dominated by the infrared brightness channel.
When the brightness of a certain pixel point in the infrared image is not within the first brightness range and the brightness of the corresponding pixel point in the visible light image is within the second brightness range, the quality of the visible light image is considered to be better than that of the infrared image, and n2 > 0.5 > n1 may be set, that is, the time domain fusion is dominated by the visible brightness channel.
When the brightness of a certain pixel point in the infrared image is within the first brightness range and the brightness of the corresponding pixel point in the visible light image is within the second brightness range, or when neither brightness falls within its respective range, the quality of the infrared image and the quality of the visible light image are considered to be both good or both poor, and n1 = n2 = 0.5 may be set, that is, the infrared brightness channel and the visible brightness channel are given equal fusion weight.
After the first time domain fusion noise reduction intensity is obtained through the steps, the visible brightness time domain noise reduction intensity is assigned as the first time domain fusion noise reduction intensity, namely, the infrared brightness channel is used for guiding the time domain noise reduction of the visible brightness channel.
And step S63, acquiring a second time domain fusion noise reduction intensity by using the visible color initial time domain noise reduction intensity and the first time domain fusion noise reduction intensity, and assigning the visible color time domain noise reduction intensity as the second time domain fusion noise reduction intensity.
Specifically, the second time domain fusion noise reduction strength may be obtained by:
and taking the sum of the product of the visible color initial time domain noise reduction intensity and the third time domain fusion coefficient and the product of the first time domain fusion noise reduction intensity and the fourth time domain fusion coefficient as the second time domain fusion noise reduction intensity, wherein the sum of the third time domain fusion coefficient and the fourth time domain fusion coefficient is 1.
That is, the second time domain fusion noise reduction intensity gama_fuvisuv is calculated using the following formula:
gama_fuvisuv = gama_orivisuv * n3 + gama_fuvisy * n4;
wherein gama_fuvisuv represents the second time domain fusion noise reduction intensity, gama_fuvisy represents the first time domain fusion noise reduction intensity, gama_orivisuv represents the visible color initial time domain noise reduction intensity, and n3 and n4 respectively represent the third time domain fusion coefficient and the fourth time domain fusion coefficient, both ranging from 0 to 1, with n3 + n4 = 1.
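Taken together, steps S61-S63 form a cascade in which the visible color channel inherits information from both brightness channels through the first fusion result. The following is a minimal sketch under the assumption of scalar motion-information inputs and hypothetical names; per the description, each initial intensity equals the motion information of its channel:

```python
def temporal_noise_reduction_strengths(motion_nir, motion_visy, motion_visuv,
                                       n1=0.5, n3=0.5):
    """Cascaded time domain noise reduction intensities for the three
    channels (steps S61-S63), with n2 = 1 - n1 and n4 = 1 - n3.
    """
    # Step S61: infrared brightness intensity = its motion information.
    gama_nir = motion_nir
    # Step S62: fuse infrared and visible brightness initial intensities.
    gama_visy = motion_nir * n1 + motion_visy * (1.0 - n1)
    # Step S63: fuse visible color initial intensity with the first
    # fusion result, so color uses both brightness channels indirectly.
    gama_visuv = motion_visuv * n3 + gama_visy * (1.0 - n3)
    return gama_nir, gama_visy, gama_visuv
```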
Because the first time domain fusion noise reduction intensity is obtained by weighted calculation of the infrared brightness initial time domain noise reduction intensity and the visible brightness initial time domain noise reduction intensity, the time domain noise reduction of the visible color channel in this embodiment can effectively utilize the information of both the infrared brightness channel and the visible brightness channel at the same time. That is, this embodiment uses the infrared image to guide the time domain noise reduction of both the visible brightness channel and the visible color channel, so that the final noise reduction effect is better and the accuracy of the noise reduction algorithm is improved.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an embodiment of an image noise reduction apparatus according to the present application, where the image noise reduction apparatus includes a memory 810 and a processor 820 coupled to each other, the memory 810 stores program instructions, and the processor 820 can execute the program instructions to implement the image noise reduction method of any of the above embodiments. For details, reference may be made to the above embodiments, which are not repeated here.
In addition, the present application further provides a computer-readable storage medium. Referring to fig. 9, fig. 9 is a schematic structural diagram of an embodiment of the computer-readable storage medium of the present application, where the storage medium 900 stores program instructions 910, and the program instructions 910 can be executed by a processor to implement the image denoising method of any of the above embodiments. For details, reference may be made to the above embodiments, which are not repeated here.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.