Method for detecting an unmanned aerial vehicle target against a complex low-altitude background
Technical Field
The invention belongs to the technical field of image detection, and particularly relates to a method for detecting an unmanned aerial vehicle target against a complex low-altitude background using a panoramic infrared search system.
Background
In recent years, with the rapid development and wide application of unmanned aerial vehicle technology, unmanned aerial vehicle target detection has become a hot research topic. At present, most mainstream detection methods based on panoramic infrared search systems, at home and abroad, extract targets by means such as background differencing. These methods consume a large amount of hardware resources to store background images and place high demands on the stabilization accuracy of the search platform and on the performance of the registration algorithm, which makes them difficult to implement in a real-time embedded system built around a DSP.
Disclosure of Invention
(I) Technical problem to be solved
The invention provides a method for detecting an unmanned aerial vehicle target against a complex low-altitude background, which aims to solve the technical problem of how to effectively suppress interference from complex ground-object backgrounds, birds and clouds and accurately detect an unmanned aerial vehicle target against the sky background.
(II) Technical scheme
In order to solve the above technical problem, the invention provides a method for detecting an unmanned aerial vehicle target against a complex low-altitude background, which comprises the following steps:
S1, image filtering: the following filtering template is convolved with the original image; in the filtered image, the relatively uniform sky background is compressed to a low gray level, while bright small targets in the sky and the edges of bright ground backgrounds retain a high gray level;
-1,-2,-3,-2,-1
-2,3,4,3,-2
-3,4,4,4,-3
-2,3,4,3,-2
-1,-2,-3,-2,-1
S2, suspected-target extraction: setting a gray threshold, an edge-tracking threshold and a size threshold for the filtered image data, and extracting the suspected targets in the image that satisfy the gray-threshold and size-threshold requirements by an edge-tracking method;
S3, background interference elimination: counting the number of bright pixels in the neighborhood of each suspected target from step S2; a suspected target whose bright-pixel count exceeds the threshold range is judged to be ground-object or cloud background interference and is removed, and the remaining targets are suspected small targets in the sky;
S4, target-track establishment by temporal-information association: the unmanned aerial vehicle target is further confirmed by judging the motion trajectory of the suspected small targets.
Further, step S4 specifically includes: for a given suspected target in the previous frame, searching within a neighborhood of a certain size in the current frame for the suspected target whose gray level is closest to it; if such a target exists, the two are regarded as the same target, and the displacement and direction of its motion are calculated from the target positions in the two frames; if the motion directions of a suspected target over two consecutive frames are consistent and the displacement lies within a certain range, the suspected target is judged to be an unmanned aerial vehicle target.
(III) Advantageous effects
The invention provides a method for detecting an unmanned aerial vehicle target against a complex low-altitude background, comprising four steps: image filtering, suspected-target extraction, background interference elimination, and target-track establishment by temporal-information association. The filtering template used in image filtering is designed specifically for dim small targets against a sky background; after filtering, the contrast between the target and its neighborhood background is significantly improved, which facilitates target detection in low signal-to-noise-ratio images and also simplifies the selection of the gray threshold and the edge-tracking threshold during target extraction. By statistically evaluating neighborhood characteristic values of suspected targets, the method can effectively eliminate most typical background interference such as ground objects and cloud layers without registering and differencing adjacent frames; it therefore places low demands on the stability of the search turntable, has lower time complexity than registration-based algorithms, and is better suited to implementation in a real-time embedded system. Finally, the method exploits the differences in motion characteristics among unmanned aerial vehicles, clouds, ground objects and birds: a suspected target is judged to be an unmanned aerial vehicle only if its track displacement lies within the allowed range and its motion direction is consistent, which keeps the false-alarm rate low.
Drawings
Fig. 1 is a flowchart of the unmanned aerial vehicle target detection method according to an embodiment of the present invention;
Fig. 2 is a flowchart of the edge-tracking algorithm according to an embodiment of the present invention;
Fig. 3 is a diagram illustrating a target neighborhood in an embodiment of the present invention;
Fig. 4a is an infrared image of an unmanned aerial vehicle at a distance of 700 m and a height of 80 m in example 1;
Fig. 4b is the filtered image of Fig. 4a in example 1;
Fig. 4c is the result of the suspected targets extracted from Fig. 4b in example 1;
Fig. 4d is the target detection result after the background-removal processing of Fig. 4c in example 1;
Fig. 4e is the result of superimposing a rectangular window marking the detection result on Fig. 4a in example 1;
Fig. 5a is an infrared image of an unmanned aerial vehicle at a distance of 1.2 km and a height of 70 m in example 2;
Fig. 5b is the image-enhancement result of the local region containing the target of Fig. 5a in example 2;
Fig. 5c is the filtered image of Fig. 5a in example 2;
Fig. 5d is the target detection result after the background-removal processing of Fig. 5c in example 2;
Fig. 5e is the result of superimposing a rectangular window marking the detection result on Fig. 5b in example 2;
Fig. 6a is an infrared image of an unmanned aerial vehicle at a distance of 400 m and a height of 80 m with a cloud background in example 3;
Fig. 6b is the filtered image of Fig. 6a in example 3;
Fig. 6c is the result of the suspected targets extracted from Fig. 6b in example 3;
Fig. 6d is the target detection result after the background-removal processing of Fig. 6c in example 3;
Fig. 6e is the result of superimposing a rectangular window marking the detection result on Fig. 6a in example 3.
Detailed Description
In order to make the objects, contents and advantages of the present invention clearer, the embodiments of the present invention are described in detail below in conjunction with the accompanying drawings and examples.
This embodiment provides a method for detecting an unmanned aerial vehicle target against a complex low-altitude background; the flow of the method is shown in Fig. 1. The target detection method comprises the following steps:
S1, image filtering: the following filtering template is convolved with the original image; in the filtered image, the relatively uniform sky background is compressed to a low gray level, while bright small targets in the sky and the edges of bright ground backgrounds retain a high gray level, which facilitates the suspected-target extraction of step S2.
-1,-2,-3,-2,-1
-2,3,4,3,-2
-3,4,4,4,-3
-2,3,4,3,-2
-1,-2,-3,-2,-1
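A minimal Python sketch of this filtering step is given below. The function name, the use of scipy.ndimage.convolve, the border handling and the clipping of negative responses to zero are illustrative assumptions and are not specified by the method itself.

import numpy as np
from scipy.ndimage import convolve

# 5x5 zero-sum high-pass template of step S1: a uniform background
# convolves to (approximately) zero, while a bright small target or a
# sharp bright edge keeps a large positive response.
FILTER_TEMPLATE = np.array([
    [-1, -2, -3, -2, -1],
    [-2,  3,  4,  3, -2],
    [-3,  4,  4,  4, -3],
    [-2,  3,  4,  3, -2],
    [-1, -2, -3, -2, -1],
], dtype=np.int32)

def filter_image(image):
    """Convolve the raw infrared frame with the 5x5 template (step S1)."""
    response = convolve(image.astype(np.int32), FILTER_TEMPLATE, mode="nearest")
    return np.clip(response, 0, None)  # keep positive responses only (assumption)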
S2, suspected-target extraction: reasonable values of the gray threshold, edge-tracking threshold and size threshold are set for the filtered image data, and the suspected targets in the image that satisfy the gray-threshold and size-threshold requirements are extracted by an edge-tracking method. The flow of the edge-tracking algorithm is shown in Fig. 2.
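Since the full edge-tracking flow of Fig. 2 is not reproduced here, the following Python sketch only approximates step S2 with a simple region-growing scheme: pixels above the gray threshold TH1 seed a region, the region grows over neighbors above the edge-tracking threshold TH2, and only regions whose pixel count lies within the size bounds are kept. The function name and the exact growing rule are assumptions.

from collections import deque
import numpy as np

def extract_candidates(filtered, th1, th2, size_min, size_max):
    """Approximate step S2: seed on TH1, grow over pixels >= TH2,
    keep regions whose size lies within [size_min, size_max]."""
    rows, cols = filtered.shape
    visited = np.zeros((rows, cols), dtype=bool)
    candidates = []
    for r0, c0 in zip(*np.nonzero(filtered >= th1)):
        if visited[r0, c0]:
            continue
        visited[r0, c0] = True
        region, queue = [], deque([(r0, c0)])
        while queue:
            r, c = queue.popleft()
            region.append((r, c))
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < rows and 0 <= cc < cols
                            and not visited[rr, cc] and filtered[rr, cc] >= th2):
                        visited[rr, cc] = True
                        queue.append((rr, cc))
        if size_min <= len(region) <= size_max:
            ys = [p[0] for p in region]
            xs = [p[1] for p in region]
            candidates.append({
                "pixels": region,
                "gray": float(np.mean([filtered[r, c] for r, c in region])),
                "centroid": (sum(ys) / len(ys), sum(xs) / len(xs)),
            })
    return candidates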
S3, background interference elimination: a real target lies against a relatively uniform sky background, so after filtering the number of bright pixels in its neighborhood (pixels whose gray value exceeds a certain threshold are marked as bright pixels) is small, whereas ground-object and cloud backgrounds produce many bright pixels after filtering. This difference is used to distinguish a sky target from ground-object background interference: the number of bright pixels in the neighborhood of each suspected target from step S2 is counted, a suspected target whose bright-pixel count exceeds the threshold range is judged to be ground-object or cloud background interference and removed, and the remaining targets are suspected small targets in the sky.
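The following sketch illustrates the neighborhood bright-pixel test of step S3. The neighborhood is taken here as a square window of half-width radius around the candidate centroid with the candidate's own pixels excluded; the exact neighborhood shape of Fig. 3 and all parameter names are assumptions.

import numpy as np

def remove_background_clutter(filtered, candidates, radius, bright_th, count_th):
    """Keep only candidates with few bright pixels in their neighborhood (step S3)."""
    rows, cols = filtered.shape
    kept = []
    for cand in candidates:
        cy = int(round(cand["centroid"][0]))
        cx = int(round(cand["centroid"][1]))
        r0, r1 = max(0, cy - radius), min(rows, cy + radius + 1)
        c0, c1 = max(0, cx - radius), min(cols, cx + radius + 1)
        # bright pixels in the window, excluding the candidate's own pixels
        bright = int(np.count_nonzero(filtered[r0:r1, c0:c1] >= bright_th))
        bright -= sum(1 for (r, c) in cand["pixels"]
                      if r0 <= r < r1 and c0 <= c < c1 and filtered[r, c] >= bright_th)
        if bright <= count_th:  # few bright neighbors -> likely a sky target
            kept.append(cand)
    return kept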
S4, target-track establishment by temporal-information association: after step S3, the suspected small targets may still include interference such as birds, point-like clouds, and isolated ground scenery. The unmanned aerial vehicle target is further confirmed from the motion trajectory of the suspected small targets, using the facts that the lateral motion of an unmanned aerial vehicle is slower than that of birds and that point-like clouds and isolated ground scenery are essentially stationary. Specifically, for a given suspected target in the previous frame, the current frame is searched within a neighborhood of a certain size for the suspected target whose gray level is closest to it; if such a target exists, the two are regarded as the same target, and its displacement and direction of motion are calculated from the target positions in the two frames. Because the unmanned aerial vehicle moves approximately in a straight line over a short time (three adjacent frames), its displacement direction remains consistent; therefore, if the motion directions of a suspected target over two consecutive frames are consistent and the displacement lies within a certain range, the suspected target is judged to be an unmanned aerial vehicle target.
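A Python sketch of the frame-to-frame association and track confirmation of step S4 follows. The search radius, the displacement bounds, and the angular tolerance used to decide that two directions are consistent are assumptions; the method itself only requires consistent directions and a bounded displacement over two consecutive frame pairs.

import math

def associate(prev_targets, curr_candidates, search_radius):
    """For each target of the previous frame, pick the gray-closest candidate
    inside its search neighborhood in the current frame (same-target decision)."""
    matches = []
    for prev in prev_targets:
        py, px = prev["centroid"]
        nearby = [c for c in curr_candidates
                  if abs(c["centroid"][0] - py) <= search_radius
                  and abs(c["centroid"][1] - px) <= search_radius]
        if nearby:
            best = min(nearby, key=lambda c: abs(c["gray"] - prev["gray"]))
            matches.append((prev, best))
    return matches

def confirm_uav(track, disp_min, disp_max, angle_tol_deg=30.0):
    """Confirm a UAV from the last three positions of a track: both displacements
    must lie in [disp_min, disp_max] and their directions must agree."""
    if len(track) < 3:
        return False
    (y0, x0), (y1, x1), (y2, x2) = track[-3:]
    d1 = (y1 - y0, x1 - x0)
    d2 = (y2 - y1, x2 - x1)
    n1, n2 = math.hypot(*d1), math.hypot(*d2)
    if not (disp_min <= n1 <= disp_max and disp_min <= n2 <= disp_max):
        return False
    ang = abs(math.degrees(math.atan2(d1[0], d1[1]) - math.atan2(d2[0], d2[1])))
    ang = min(ang, 360.0 - ang)
    return ang <= angle_tol_deg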
The following description is given by way of specific examples.
A panoramic infrared search system is used to acquire 14 infrared images of a low-altitude unmanned aerial vehicle. The image resolution is 640 × 512, the unmanned aerial vehicle is a quadrotor with a diameter of 300 mm, the flying height is 50-120 m, and the horizontal distance is 0.1-1.2 km.
Example 1
Fig. 4a is an infrared image of an unmanned aerial vehicle at a distance of 700 m and a height of 80 m; the image contains ground-object backgrounds such as tree crowns and building roofs. The image is filtered with the above template, and the result is shown in Fig. 4b. Fig. 4c shows the suspected targets extracted with the edge-tracking algorithm of Fig. 2 using detection thresholds TH1 = 300-500 and TH2 = 200-300. With the target neighborhood size set to 8-15 pixels, the bright-pixel gray judgment threshold set to 120-140, and the bright-pixel count judgment threshold for ground-object and cloud backgrounds set to 12-18, ground-object removal is applied to the extracted suspected targets; the target detection result is shown in Fig. 4d, and the result of superimposing a rectangular window marking the detection on the original image is shown in Fig. 4e.
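Purely as an illustration, the sketches given above could be wired together for a single frame using midpoints of the parameter ranges reported in this example; the size bounds and the specific values chosen within each range are assumptions.

def detect_frame(raw_frame, prev_targets):
    """Hypothetical single-frame pipeline using example-1 parameter midpoints."""
    filtered = filter_image(raw_frame)                         # step S1
    cands = extract_candidates(filtered, th1=400, th2=250,     # step S2
                               size_min=1, size_max=100)       # size bounds assumed
    cands = remove_background_clutter(filtered, cands,         # step S3
                                      radius=12, bright_th=130, count_th=15)
    matches = associate(prev_targets, cands, search_radius=12) # step S4
    return matches, cands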
Example 2
Fig. 5a is an infrared image of an unmanned aerial vehicle at a distance of 1.2 km and a height of 70 m; the image contains ground-object backgrounds such as telegraph poles, tree crowns and building roofs. Fig. 5b shows the image-enhancement result of the region containing the target. The image is filtered with the above template, and the result is shown in Fig. 5c. With the edge-tracking algorithm of Fig. 2, detection thresholds TH1 = 300-400 and TH2 = 150-300, a target neighborhood size of 8-15 pixels, a bright-pixel gray judgment threshold of 120-140, and a bright-pixel count judgment threshold of 12-18 for ground-object and cloud backgrounds, suspected-target extraction and ground-object removal are applied to the image; the target detection result is shown in Fig. 5d, and the result of superimposing a rectangular window marking the detection on Fig. 5b is shown in Fig. 5e.
Example 3
Fig. 6a is an infrared image of an unmanned aerial vehicle at a distance of 400 m and a height of 80 m; the sky contains considerable cloud background. The image is filtered with the above template, and the result is shown in Fig. 6b. Fig. 6c shows the suspected targets extracted with the edge-tracking algorithm of Fig. 2 using detection thresholds TH1 = 300-500 and TH2 = 200-300. With the target neighborhood size set to 8-15 pixels, the bright-pixel gray judgment threshold set to 120-140, and the bright-pixel count judgment threshold for ground-object and cloud backgrounds set to 12-18, ground-object removal is applied to the extracted suspected targets; the target detection result is shown in Fig. 6d, and the result of superimposing a rectangular window marking the detection on the original image is shown in Fig. 6e.
The above description is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make several modifications and variations without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as falling within the protection scope of the present invention.