Disclosure of Invention
The invention aims to overcome the defect in the prior art that, because the captured image changes constantly with the high-speed movement of an unmanned aerial vehicle, a tracking target is difficult to frame-select in the image, and provides a method for frame-selecting a tracking target in an image.
The invention solves the technical problems through the following technical scheme:
a method for selecting a tracking target in image tracking comprises the following steps:
transmitting images acquired by a camera in the unmanned aerial vehicle pod to a tracking control end in real time;
judging whether a target tracking instruction sent by the tracking control end is received, if so, then:
controlling the pod to maintain a current angle;
calculating the moving angle of the camera in the pod at the current angle relative to the object in the image;
controlling the pod to rotate reversely by the movement angle;
judging whether a tracking target image selected by the tracking control end in the images is received, if so, then:
and identifying the tracking target image to obtain a tracking target.
Preferably, the step of calculating the moving angle of the camera in the pod at the current angle relative to the object in the image comprises:
and calculating the movement angle of the camera relative to an object in the image in a time interval from the previous frame image to the current frame image by using the previous frame image and the current frame image through an optical flow algorithm.
Preferably, the method further comprises, after obtaining the tracking target:
identifying the position of the tracking target in each newly acquired image through an image recognition algorithm, and keeping the tracking target present in each acquired image by controlling the rotation of the pod.
A method for selecting a tracking target in image tracking comprises the following steps:
receiving images acquired by a camera in an unmanned aerial vehicle pod in real time;
sending a target tracking instruction to an unmanned aerial vehicle, wherein the target tracking instruction is used for triggering the unmanned aerial vehicle to execute the following steps: controlling the pod to keep a current angle, calculating a movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to reversely rotate the movement angle;
selecting a tracking target image in the images;
and sending the tracking target image to the unmanned aerial vehicle.
An apparatus for selecting a tracking target in image tracking, comprising:
the image transmission module is used for transmitting the image acquired by the camera in the unmanned aerial vehicle pod to the tracking control end in real time;
the pod control module is used for judging whether a target tracking instruction sent by the tracking control end is received or not, and if yes, controlling the pod to keep the current angle;
the image calculation module is used for calculating the moving angle of the camera in the pod at the current angle relative to the object in the image after the pod keeps the current angle;
the pod control module is also used for controlling the pod to reversely rotate by the movement angle after the movement angle is calculated;
and the image identification module is used for judging whether a tracking target image selected by the tracking control end in the image is received or not, and identifying the tracking target image to obtain a tracking target if the tracking target image is received.
Preferably, the image calculation module is specifically configured to calculate, by using an optical flow algorithm, a moving angle of the camera with respect to an object in the image in a time interval from a previous frame image to a current frame image by using the previous frame image and the current frame image.
Preferably, the apparatus further comprises:
and the target tracking module is used for identifying the position of the tracking target in each newly acquired image through an image recognition algorithm and keeping the tracking target present in each acquired image by controlling the rotation of the pod.
An unmanned aerial vehicle, comprising:
a pod having a camera disposed therein;
the apparatus for selecting a tracking target in image tracking as described above.
An apparatus for selecting a tracking target in image tracking, comprising:
the image receiving module is used for receiving, in real time, images acquired by the camera in the unmanned aerial vehicle pod;
the command sending module is used for sending a target tracking command to the unmanned aerial vehicle, and the target tracking command is used for triggering the unmanned aerial vehicle to execute the following operations: controlling the pod to keep a current angle, calculating a movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to reversely rotate the movement angle;
the target selection module is used for selecting a tracking target image in the images;
and the target sending module is used for sending the tracking target image to the unmanned aerial vehicle.
On the basis of common knowledge in the field, the above preferred conditions can be combined arbitrarily to obtain the preferred embodiments of the invention.
The positive progress effects of the invention are as follows: the pod is controlled, through the target tracking instruction, to adjust its rotation angle so as to cancel the relative rotation of the camera with respect to the photographed object, so that the image collected by the camera is relatively stable and the object in the image will not rapidly move out of the frame even if the unmanned aerial vehicle keeps moving rapidly; the tracking target is then selected in the stabilized image, which provides the user with sufficient selection time and reduces the difficulty of selecting the tracking target.
Detailed Description
The invention is further illustrated by the following examples, which are not intended to limit the scope of the invention.
Example 1
This embodiment provides a method for selecting a tracking target in image tracking. The method is generally applied to an unmanned aerial vehicle (especially a fixed-wing aircraft) and, through control of the unmanned aerial vehicle, assists the user in accurately selecting and tracking a target while the unmanned aerial vehicle is moving. As shown in fig. 1, the method comprises the following steps:
step 101: and transmitting the image collected by the camera in the unmanned aerial vehicle pod to the tracking control end in real time. Wherein, in order to be adapted to unmanned aerial vehicle's high-speed removal, gather clear image, the camera can adopt high-speed global shutter (global shutter) camera.
Step 102: judging whether a target tracking instruction sent by the tracking control end has been received; if so, executing step 103; otherwise, returning to step 102 to perform the judgment again.
Step 103: controlling the pod to maintain a current angle. While the pod maintains the current angle, the angle of the camera is also fixed.
Step 104: calculating the movement angle of the camera in the pod, at the current angle, relative to an object in the image. The object may be an object subsequently selected as the tracking target, or any object appearing in the image, such as an object with a distinctive color or a relatively large size.
Step 105: controlling the pod to rotate reversely by the movement angle. The reverse rotation rotates the pod in the direction opposite to the movement angle, so as to cancel the relative rotation between the camera and the object in the image and stabilize the picture relative to the object. For example, if the camera has moved to the left relative to the object in the image at the current angle, the pod should be controlled to rotate to the right accordingly. Of course, steps 104 and 105 may be replaced by: calculating the movement angle of the object in the image relative to the camera in the pod at the current angle, and controlling the pod to rotate by the movement angle in the same direction. The replaced steps follow the same principle and achieve the same function as the original steps. When the rotation of the pod is specifically controlled, the specific structure and principle of the driving mechanism that rotates the pod can be taken into account to convert the movement angle into a control quantity for the driving mechanism, and a pod stabilization algorithm may be used in the conversion to ensure smooth rotation of the pod.
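The reverse rotation of step 105 can be sketched as follows. This is a minimal illustration, assuming the pod accepts pan/tilt angle commands in degrees; the command format is an assumption made for illustration, not a detail from the patent.

```python
def reverse_rotation_command(wx, wy):
    """Negate the measured movement angles so the pod cancels the camera's
    apparent motion relative to the object (hypothetical command format)."""
    return {"pan": -wx, "tilt": -wy}

# If the camera drifted 7.5 degrees right and 2.0 degrees down relative to
# the object, the pod should rotate 7.5 degrees left and 2.0 degrees up:
cmd = reverse_rotation_command(7.5, 2.0)
print(cmd)  # {'pan': -7.5, 'tilt': -2.0}
```

The equivalent alternative described above (measuring the object's movement relative to the camera and rotating the pod in the same direction) flips both signs twice and therefore yields the same command.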
Step 106: judging whether a tracking target image selected by the tracking control end in the image has been received; if so, executing step 107; otherwise, returning to step 106 to perform the judgment again. The tracking target image may be selected by the tracking control end in various ways, for example by dragging a mouse or by touch to directly frame the tracking target image in the image, and the framed region may be a rectangle, a circle, an ellipse, or another irregular shape.
Step 107: identifying the tracking target image to obtain the tracking target.
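A minimal sketch of obtaining the tracking target from the selected image: the user's rectangular selection is cropped out of the frame as a template that a recognizer can later search for. Representing frames as 2-D lists and the concrete region values are illustrative assumptions, not details from the patent.

```python
def crop_target(frame, x, y, w, h):
    """Extract the user's rectangular selection as the tracking template."""
    return [row[x:x + w] for row in frame[y:y + h]]

# Toy 5x6 "frame" whose pixel at (row r, col c) has value 10*r + c:
frame = [[10 * r + c for c in range(6)] for r in range(5)]
template = crop_target(frame, x=2, y=1, w=3, h=2)
print(template)  # [[12, 13, 14], [22, 23, 24]]
```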
In the method, the image collection and image transmission of step 101 are continuous throughout the tracking process; the tracking control end selects the tracking target by observing the images and records the tracking process and the tracking result.
According to the method, the pod is controlled, through the target tracking instruction, to adjust its rotation angle so as to cancel the relative rotation of the camera with respect to the photographed object, so that the image collected by the camera is relatively stable and the object in the image will not rapidly move out of the frame even if the unmanned aerial vehicle keeps moving rapidly; the tracking target is then selected in the stabilized image, which provides the user with sufficient selection time and reduces the difficulty of selecting the tracking target.
In this embodiment, in order to calculate the movement angle quickly and accurately and to prevent the original object from moving out of the image before the pod angle is adjusted, step 104 may specifically include, but is not limited to: calculating, through an optical flow algorithm and using the previous frame image and the current frame image, the movement angle of the camera relative to an object in the image over the time interval from the previous frame to the current frame. Specifically, the method comprises the following steps:
comparing the pixel displacement conditions of the previous frame image and the current frame image through an optical flow algorithm, and calculating a displacement value Ht between the two frames;
calculating the movement angles Wx and Wy of the camera relative to the object in the image over the time interval from the previous frame to the current frame, in combination with the field of view Va of the camera, the number Hx of X-axis pixels of the image frame, and the number Hy of Y-axis pixels of the image frame, wherein:
Wx = Ht / Hx * Va;
Wy = Ht / Hy * Va.
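The two formulas above can be sketched in code as follows. The brute-force shift search is only a crude stand-in for a real optical flow algorithm, and the toy frames, 60-degree field of view, and 8x8 resolution are illustrative assumptions rather than values from the patent.

```python
def estimate_shift(prev, curr, max_shift=2):
    """Find the (dx, dy) pixel displacement between two grayscale frames by
    minimizing the sum of absolute differences (a crude optical-flow proxy)."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = 0
            for y in range(max_shift, h - max_shift):
                for x in range(max_shift, w - max_shift):
                    err += abs(curr[y][x] - prev[y - dy][x - dx])
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best

def movement_angle(ht_x, ht_y, va, hx, hy):
    """Patent formulas Wx = Ht/Hx * Va and Wy = Ht/Hy * Va, applied per axis."""
    return ht_x / hx * va, ht_y / hy * va

# Toy frames: a bright 2x2 square that moves one pixel to the right.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in range(3, 5):
    for x in range(3, 5):
        prev[y][x] = 255
        curr[y][x + 1] = 255

dx, dy = estimate_shift(prev, curr)
wx, wy = movement_angle(dx, dy, va=60.0, hx=8, hy=8)
print((dx, dy), (wx, wy))  # one pixel of shift maps to 1/8 * 60 = 7.5 degrees
```

In practice the displacement would come from an established optical flow method such as Lucas-Kanade rather than this exhaustive search.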
In addition, in order to realize real-time tracking of the tracking target, the method may further include, after step 107:
the position of the tracking target in the newly acquired image is identified by an image recognition algorithm and the tracking target is kept present in each acquired image by controlling the rotation of the nacelle. The image recognition algorithm may be an existing image recognition algorithm, which is not limited in this embodiment.
The method can further and completely realize selection and tracking of the tracked target, reduces the difficulty for the user of selecting the tracked target during the high-speed movement of the unmanned aerial vehicle, and brings great progress to the field of image tracking.
Example 2
This embodiment provides a method for selecting a tracking target in image tracking. The method is generally applied to a tracking control end, which can be operated by a user to select a tracking target and may be a computer, a remote controller, or other ground-based control equipment capable of communicating with the unmanned aerial vehicle. As shown in fig. 2, the method comprises the following steps:
Step 201: receiving, in real time, images collected by the camera in the unmanned aerial vehicle pod.
Step 202: sending a target tracking instruction to the unmanned aerial vehicle, wherein the target tracking instruction is used for triggering the unmanned aerial vehicle to execute the following steps: controlling the pod to maintain a current angle, calculating the movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to rotate reversely by the movement angle. The tracking control end then waits for the unmanned aerial vehicle to complete these steps and produce a stable image.
Step 203: selecting a tracking target image in the images. The tracking target image may be selected in various ways, for example by dragging a mouse or by touch to directly frame the tracking target image in the image, and the framed region may be a rectangle, a circle, an ellipse, or another irregular shape.
Step 204: sending the tracking target image to the unmanned aerial vehicle. The tracking control end may then wait to receive the image frames of the tracking target returned by the unmanned aerial vehicle.
The method of this embodiment can be used in combination with the method of Example 1 to realize image tracking, for example: as shown in fig. 3, the unmanned aerial vehicle 3 transmits the images collected by the camera to the tracking control end 4 in real time; meanwhile, the tracking control end 4 receives the images in real time, the user determines, by observing the images, whether a target that needs to be tracked is present, and after the determination a target tracking instruction is sent to the unmanned aerial vehicle 3 through the tracking control end 4; after receiving the target tracking instruction, the unmanned aerial vehicle 3 controls the pod to maintain the current angle, calculates the movement angle of the camera in the pod at the current angle relative to an object in the image, and controls the pod to rotate reversely by the movement angle; through the above adjustment of the pod of the unmanned aerial vehicle 3, the tracking control end 4 obtains a stable image, the user can select the target to be tracked from the stable image, and the tracking control end 4 sends the selected tracking target image to the unmanned aerial vehicle 3; the unmanned aerial vehicle 3 then identifies the tracking target image to obtain the tracking target and finally carries out real-time tracking.
Example 3
This embodiment provides an unmanned aerial vehicle, which may be a fixed-wing aircraft. As shown in fig. 4, the drone includes a pod 31 and a device 32 for selecting a tracking target in image tracking. The pod 31 is provided with a camera 311; in order to adapt to the high-speed movement of the unmanned aerial vehicle and collect clear images, the camera 311 may be a high-speed global shutter camera.
The device 32 for selecting a tracking target in image tracking specifically includes: an image transmission module 321, a pod control module 322, an image calculation module 323, and an image recognition module 324.
The image transmission module 321 is used for transmitting the image collected by the camera 311 in the unmanned aerial vehicle pod 31 to the tracking control end in real time. The image transmission module 321 is preferably a wireless transmission module, which may also be configured to receive the target tracking instruction sent by the tracking control end and the tracking target image selected in the image.
The pod control module 322 is configured to determine whether a target tracking instruction sent by the tracking control end has been received; if so, it controls the pod 31 to maintain the current angle, and if not, it may perform the determination again.
The image calculation module 323 is used for calculating, after the pod 31 maintains the current angle, the movement angle of the camera 311 in the pod 31 at the current angle relative to an object in the image. The object may be an object subsequently selected as the tracking target, or any object appearing in the image, such as an object with a distinctive color or a relatively large size.
The pod control module 322 is further configured to control the pod 31 to rotate reversely by the movement angle after the movement angle is calculated. The reverse rotation rotates the pod 31 in the direction opposite to the movement angle, so as to cancel the relative rotation between the camera 311 and the object in the image and stabilize the picture relative to the object. For example, if the camera 311 has moved to the left relative to the object in the image at the current angle, the pod 31 should be controlled to rotate to the right accordingly. Of course, the image calculation module 323 and the pod control module 322 may instead be configured to: calculate the movement angle of the object in the image relative to the camera 311 in the pod 31 at the current angle, and control the pod 31 to rotate by the movement angle in the same direction. The replaced configuration follows the same principle and achieves the same function as the original. When the rotation of the pod 31 is specifically controlled, the specific structure and principle of the driving mechanism that rotates the pod 31 can be taken into account to convert the movement angle into a control quantity for the driving mechanism, and a pod stabilization algorithm may be used in the conversion to ensure smooth rotation of the pod 31.
The image recognition module 324 is configured to determine whether a tracking target image selected by the tracking control end in the image has been received; if so, it recognizes the tracking target image to obtain the tracking target, and if not, it may perform the determination again. The tracking target image may be selected by the tracking control end in various ways, for example by dragging a mouse or by touch to directly frame the tracking target image in the image, and the framed region may be a rectangle, a circle, an ellipse, or another irregular shape.
In the device 32, the image collection and image transmission of the camera 311 are continuous throughout the tracking process; the tracking control end selects the tracking target by observing the images and records the tracking process and the tracking result.
The device of this embodiment controls the pod 31, through the target tracking instruction, to adjust its rotation angle so as to cancel the relative rotation of the camera 311 with respect to the photographed object, so that the image collected by the camera 311 is relatively stable and the object in the image will not rapidly move out of the frame even if the unmanned aerial vehicle keeps moving rapidly; the tracking target is then selected in the stabilized image, which provides the user with sufficient selection time and reduces the difficulty of selecting the tracking target.
In this embodiment, in order to calculate the movement angle quickly and accurately and to prevent the original object from moving out of the image before the angle of the pod 31 is adjusted, the image calculation module 323 may be specifically configured to calculate, through an optical flow algorithm and using the previous frame image and the current frame image, the movement angle of the camera 311 relative to an object in the image over the time interval from the previous frame to the current frame. Specifically, the method comprises the following steps:
comparing the pixel displacement conditions of the previous frame image and the current frame image through an optical flow algorithm, and calculating a displacement value Ht between the two frames;
calculating the movement angles Wx and Wy of the camera 311 relative to the object in the image over the time interval from the previous frame to the current frame, in combination with the field of view Va of the camera 311, the number Hx of X-axis pixels of the image frame, and the number Hy of Y-axis pixels of the image frame, wherein:
Wx = Ht / Hx * Va;
Wy = Ht / Hy * Va.
In addition, in order to realize real-time tracking of the tracking target, the apparatus may further include: a target tracking module 325.
The target tracking module 325 is used to identify the position of the tracking target in each newly acquired image through an image recognition algorithm and to keep the tracking target present in each acquired image by controlling the rotation of the pod 31. The image recognition algorithm may be an existing image recognition algorithm, which is not limited in this embodiment.
The device 32 can further and completely realize the selection and tracking of the tracking target, reduces the difficulty for the user of selecting the tracking target during the high-speed movement of the unmanned aerial vehicle, and brings great progress to the field of image tracking.
Example 4
This embodiment provides a device for selecting a tracking target in image tracking. The device is typically used as a tracking control end, which can be operated by a user to select a tracking target and may be a computer, a remote controller, or other ground-based control equipment capable of communicating with the drone. As shown in fig. 5, the device includes: an image receiving module 41, an instruction sending module 42, a target selection module 43, and a target sending module 44.
The image receiving module 41 is used for receiving, in real time, images collected by the camera in the unmanned aerial vehicle pod.
The instruction sending module 42 is configured to send a target tracking instruction to the drone, where the target tracking instruction is used to trigger the drone to perform the following operations: controlling the pod to maintain a current angle, calculating the movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to rotate reversely by the movement angle.
The target selection module 43 is used to select a tracking target image from the images. The tracking target image may be selected in various ways, for example by dragging a mouse or by touch to directly frame the tracking target image in the image, and the framed region may be a rectangle, a circle, an ellipse, or another irregular shape.
The target sending module 44 is configured to send the tracking target image to the drone.
The device of this embodiment can be used in cooperation with the device of Example 3, the former serving as the tracking control end and the latter as the controller on the unmanned aerial vehicle; the specific cooperation process is described in Example 2 and is not repeated here.
While specific embodiments of the invention have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and that the scope of the invention is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the spirit and scope of the invention, and these changes and modifications are within the scope of the invention.