CN112672032A - Unmanned aerial vehicle and method and device for selecting tracking target in image tracking - Google Patents

Unmanned aerial vehicle and method and device for selecting tracking target in image tracking

Info

Publication number
CN112672032A
Authority
CN
China
Prior art keywords
image
tracking
target
pod
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910977313.5A
Other languages
Chinese (zh)
Inventor
田瑜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Autoflight Co Ltd
Original Assignee
Shanghai Autoflight Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Autoflight Co Ltd
Priority to CN201910977313.5A
Publication of CN112672032A
Legal status: Pending (current)

Abstract

The invention discloses an unmanned aerial vehicle and a method and a device for selecting a tracking target in image tracking. The method comprises the following steps: transmitting images acquired by a camera in the unmanned aerial vehicle pod to a tracking control end in real time; judging whether a target tracking instruction sent by the tracking control end has been received, and if so, controlling the pod to keep its current angle; calculating the movement angle of the camera in the pod, at the current angle, relative to an object in the image; controlling the pod to rotate in reverse by the movement angle; and judging whether a tracking target image selected by the tracking control end within the images has been received, and if so, identifying the tracking target image to obtain the tracking target. By controlling the pod to adjust its rotation angle, the relative rotation of the camera with respect to the photographed object is cancelled out, so the image stays relatively stable and the object in the image does not rapidly leave the frame even while the unmanned aerial vehicle keeps moving at high speed, which reduces the difficulty of selecting the tracking target.

Description

Unmanned aerial vehicle and method and device for selecting tracking target in image tracking
Technical Field
The invention belongs to the field of image tracking, and particularly relates to an unmanned aerial vehicle and a method and a device for selecting a tracking target in image tracking.
Background
As unmanned aerial vehicle applications become more and more widespread, UAVs are increasingly used for forest fire prevention, power line inspection, border patrol and coastal patrol. When an electro-optical pod is used for real-time image monitoring, a particular target often needs to be tracked. For an unmanned aerial vehicle that must keep flying at high speed at all times (such as a fixed-wing aircraft), the captured picture is constantly moving, so it is very difficult for the user to frame-select the target object to be tracked.
At present, the following methods are used to select a target object with the electro-optical pod of an unmanned aerial vehicle:
(1) Direct frame selection: the unmanned aerial vehicle transmits the captured images back to the ground control end in real time, and the user monitors the images at the ground control end and directly frames the target object to be tracked in the image. Because the image captured by the unmanned aerial vehicle changes constantly as the vehicle moves, the target object crosses the picture in an instant, so directly frame-selecting the target object in the image is very difficult for the user, and the faster the unmanned aerial vehicle moves, the harder the selection becomes.
(2) Freeze-frame selection: the unmanned aerial vehicle transmits the captured images back to the ground control end in real time; once a target object the user wants to track appears in the image, the user freezes the image to complete the frame selection, the image of the framed target object is then uploaded to the pod, and finally the unmanned aerial vehicle finds the position of the target object in the latest image through image comparison. This approach has two disadvantages: first, uploading the image of the framed target object to the unmanned aerial vehicle is slow; second, when the unmanned aerial vehicle moves very fast, the target object may already have moved out of the picture by the time the tracking target image reaches the electro-optical pod, so its latest position can no longer be found through visual recognition.
Disclosure of Invention
The invention aims to overcome the defect in the prior art that, because the captured image changes constantly while the unmanned aerial vehicle moves at high speed, it is difficult to frame-select a tracking target in the image, and provides an unmanned aerial vehicle and a method and a device for selecting a tracking target in image tracking.
The invention solves the technical problems through the following technical scheme:
a method for selecting a tracking target in image tracking comprises the following steps:
transmitting images acquired by a camera in the unmanned aerial vehicle pod to a tracking control end in real time;
judging whether a target tracking instruction sent by the tracking control end is received, if so, then:
controlling the pod to maintain a current angle;
calculating the moving angle of the camera in the pod at the current angle relative to the object in the image;
controlling the pod to rotate in reverse by the movement angle;
judging whether a tracking target image selected by the tracking control end in the images is received, if so, then:
and identifying the tracking target image to obtain a tracking target.
Preferably, the step of calculating the moving angle of the camera in the pod at the current angle relative to the object in the image comprises:
and calculating the movement angle of the camera relative to an object in the image in a time interval from the previous frame image to the current frame image by using the previous frame image and the current frame image through an optical flow algorithm.
Preferably, the method further comprises, after obtaining the tracking target:
the position of the tracking target in the newly acquired image is identified by an image recognition algorithm and the tracking target is kept present in each acquired image by controlling the rotation of the pod.
A method for selecting a tracking target in image tracking comprises the following steps:
receiving images acquired by a camera in an unmanned aerial vehicle pod in real time;
sending a target tracking instruction to an unmanned aerial vehicle, wherein the target tracking instruction is used for triggering the unmanned aerial vehicle to execute the following steps: controlling the pod to keep a current angle, calculating a movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to rotate in reverse by the movement angle;
selecting a tracking target image in the images;
and sending the tracking target image to the unmanned aerial vehicle.
An apparatus for selecting a tracking target in image tracking, comprising:
the image transmission module is used for transmitting the image acquired by the camera in the unmanned aerial vehicle pod to the tracking control end in real time;
the pod control module is used for judging whether a target tracking instruction sent by the tracking control end is received or not, and if yes, controlling the pod to keep the current angle;
the image calculation module is used for calculating the moving angle of the camera in the pod at the current angle relative to the object in the image after the pod keeps the current angle;
the pod control module is also used for controlling the pod to reversely rotate by the movement angle after the movement angle is calculated;
and the image identification module is used for judging whether a tracking target image selected by the tracking control end in the image is received or not, and identifying the tracking target image to obtain a tracking target if the tracking target image is received.
Preferably, the image calculation module is specifically configured to calculate, by using an optical flow algorithm, a moving angle of the camera with respect to an object in the image in a time interval from a previous frame image to a current frame image by using the previous frame image and the current frame image.
Preferably, the apparatus further comprises:
and the target tracking module is used for identifying the position of the tracking target in the newly acquired image through an image recognition algorithm and keeping the tracking target in each acquired image by controlling the rotation of the pod.
An unmanned aerial vehicle, comprising:
a pod having a camera disposed therein;
the device for selecting a tracking target in image tracking as described above.
An apparatus for selecting a tracking target in image tracking, comprising:
the image receiving module is used for receiving images acquired by a camera in the unmanned aerial vehicle pod in real time;
the instruction sending module is used for sending a target tracking instruction to the unmanned aerial vehicle, and the target tracking instruction is used for triggering the unmanned aerial vehicle to execute the following operations: controlling the pod to keep a current angle, calculating a movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to rotate in reverse by the movement angle;
the target selection module is used for selecting a tracking target image in the images;
and the target sending module is used for sending the tracking target image to the unmanned aerial vehicle.
On the basis of common knowledge in the field, the above preferred conditions can be combined arbitrarily to obtain preferred embodiments of the invention.
The positive effects of the invention are as follows: the pod is controlled to adjust its rotation angle in response to the target tracking instruction, the relative rotation of the camera with respect to the photographed object is cancelled out, the image collected by the camera stays relatively stable, and the object in the image does not rapidly leave the frame even if the unmanned aerial vehicle keeps moving rapidly; the tracking target is then selected in the stable image, which provides the user with sufficient selection time and reduces the difficulty of selecting the tracking target.
Drawings
FIG. 1 is a flowchart illustrating a method for selecting a tracking target in image tracking according to a preferred embodiment 1 of the present invention;
FIG. 2 is a flowchart illustrating a method for selecting a tracking target in image tracking according to a preferred embodiment 2 of the present invention;
FIG. 3 is a schematic diagram of image tracking;
fig. 4 is a schematic block diagram of an unmanned aerial vehicle according to preferred embodiment 3 of the present invention;
fig. 5 is a schematic block diagram of an apparatus for selecting a tracking target in image tracking according to preferred embodiment 4 of the present invention.
Detailed Description
The invention is further illustrated by the following examples, which are not intended to limit the scope of the invention.
Example 1
This embodiment provides a method for selecting a tracking target in image tracking. The method is generally applied on unmanned aerial vehicles (especially fixed-wing aircraft) and, by controlling the unmanned aerial vehicle, helps the user accurately select and track a target while the unmanned aerial vehicle is moving. As shown in fig. 1, the method comprises the following steps:
step 101: and transmitting the image collected by the camera in the unmanned aerial vehicle pod to the tracking control end in real time. Wherein, in order to be adapted to unmanned aerial vehicle's high-speed removal, gather clear image, the camera can adopt high-speed global shutter (global shutter) camera.
Step 102: judging whether a target tracking instruction sent by the tracking control end has been received; if so, executing step 103, otherwise returning to step 102 and performing the judgment again.
Step 103: controlling the pod to maintain the current angle. While the pod maintains the current angle, the angle of the camera is also fixed.
Step 104: calculating the movement angle of the camera in the pod at the current angle relative to an object in the image. The object may be the object subsequently selected as the tracking target, or any object appearing in the image, such as an object with a distinctive color or a relatively large size.
Step 105: controlling the pod to rotate in reverse by the movement angle. Reverse rotation means rotating the pod in the direction opposite to the movement angle, so as to cancel out the relative rotation between the camera and the object in the image and keep the picture stable with respect to the object. For example, if the camera at the current angle has moved to the left relative to the object in the image, the pod should be controlled to rotate to the right accordingly. Of course, steps 104 and 105 may be replaced by: calculating the movement angle of the object in the image relative to the camera in the pod at the current angle, and controlling the pod to rotate by that angle in the same direction. The replaced steps follow the same principle and have the same effect as the original ones. When the rotation of the pod is actually controlled, the specific structure and principle of the drive mechanism that rotates the pod can be taken into account, the movement angle is converted into a control quantity for the drive mechanism, and a pod stabilization algorithm can be used during the conversion to ensure that the pod rotates smoothly.
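For illustration only, the following minimal Python sketch shows how a computed movement angle might be turned into a reverse rotation command for the pod; the `pod.set_angle_offset` call is a hypothetical driver interface standing in for the pod's actual drive mechanism and stabilization algorithm, which the description leaves unspecified.

```python
def counter_rotate_pod(pod, wx_deg, wy_deg):
    """Step 105 (sketch): rotate the pod opposite to the camera's movement
    angles (Wx, Wy) so that its rotation relative to the photographed object
    is cancelled out.  `pod.set_angle_offset` is a hypothetical call."""
    pod.set_angle_offset(yaw_deg=-wx_deg, pitch_deg=-wy_deg)


def follow_object_rotation(pod, obj_wx_deg, obj_wy_deg):
    """Equivalent alternative mentioned in the description: measure the
    object's movement angle relative to the camera instead, and rotate the
    pod by the same angle in the same direction."""
    pod.set_angle_offset(yaw_deg=obj_wx_deg, pitch_deg=obj_wy_deg)
```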
Step 106: judging whether a tracking target image selected by the tracking control end in the image has been received; if so, executing step 107, otherwise returning to step 106 and performing the judgment again. The tracking target image may be selected at the tracking control end in various ways, for example by dragging a mouse or by a touch gesture to frame the tracking target directly in the image, and the framed region may be a rectangle, a circle, an ellipse or another irregular shape.
Step 107: identifying the tracking target image to obtain the tracking target.
In the method, the image acquisition and image transmission of the camera in step 101 continue throughout the whole tracking process, and the tracking control end selects the tracking target by observing the images and records the tracking process and the tracking result.
According to the method, the pod is controlled to adjust its rotation angle in response to the target tracking instruction, the relative rotation of the camera with respect to the photographed object is cancelled out, the image collected by the camera stays relatively stable, and the object in the image does not rapidly leave the frame even if the unmanned aerial vehicle keeps moving rapidly; the tracking target is then selected in the stable image, which provides the user with sufficient selection time and reduces the difficulty of selecting the tracking target.
In this embodiment, in order to calculate the movement angle quickly and accurately and to prevent the original object in the image from leaving the frame before the pod angle has been adjusted, step 104 may specifically include, but is not limited to: calculating, through an optical flow algorithm and using the previous frame image and the current frame image, the movement angle of the camera relative to an object in the image over the time interval from the previous frame to the current frame. Specifically, the method comprises the following steps:
comparing the pixel displacement of the previous frame image and the current frame image through an optical flow algorithm, and calculating a displacement value Ht between the two frames;
calculating the movement angles Wx and Wy of the camera relative to the object in the image over the time interval from the previous frame to the current frame, using the field of view angle Va of the camera, the number Hx of X-axis pixels of the image picture and the number Hy of Y-axis pixels of the image picture, where:
Wx = Ht / Hx * Va;
Wy = Ht / Hy * Va.
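As an illustration of this calculation, a minimal Python sketch is given below. It assumes the optical flow step is realised with OpenCV's phase correlation, which returns a global per-axis pixel displacement between the two frames; the patent does not name a particular algorithm, so this choice, and the per-axis use of the displacement, are assumptions made only for the sketch.

```python
import cv2
import numpy as np

def movement_angle(prev_frame, curr_frame, va_deg):
    """Estimate the camera's movement angles Wx, Wy between two frames.

    Applies Wx = Ht / Hx * Va and Wy = Ht / Hy * Va from the description;
    the pixel displacement is obtained with OpenCV phase correlation, an
    assumed stand-in for the unspecified optical flow algorithm."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Global pixel displacement (dx, dy) between the previous and current frame.
    (dx, dy), _response = cv2.phaseCorrelate(prev_gray, curr_gray)

    hy, hx = prev_gray.shape      # Hy rows (Y-axis pixels), Hx columns (X-axis pixels)
    wx = dx / hx * va_deg         # Wx = Ht / Hx * Va
    wy = dy / hy * va_deg         # Wy = Ht / Hy * Va
    return wx, wy
```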
In addition, in order to realize real-time tracking of the tracking target, the method may further include, after step 107:
the position of the tracking target in the newly acquired image is identified by an image recognition algorithm and the tracking target is kept present in each acquired image by controlling the rotation of the nacelle. The image recognition algorithm may be an existing image recognition algorithm, which is not limited in this embodiment.
The method can further and completely realize the selection and tracking of the tracked target, reduces the difficulty of the user in selecting the tracked target in the high-speed movement of the unmanned aerial vehicle, and brings great progress to the field of image tracking.
Example 2
This embodiment provides a method for selecting a tracking target in image tracking. The method is generally applied at a tracking control end, which is operated by the user to select the tracking target and may be a computer, a remote controller or other ground-based control equipment that can communicate with the unmanned aerial vehicle. As shown in fig. 2, the method comprises the following steps:
step 201: and receiving images acquired by a camera in the unmanned aerial vehicle pod in real time.
Step 202: sending a target tracking instruction to the unmanned aerial vehicle, wherein the target tracking instruction is used for triggering the unmanned aerial vehicle to execute the following steps: controlling the pod to maintain the current angle, calculating the movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to rotate in reverse by the movement angle. The tracking control end then waits for the unmanned aerial vehicle to complete these steps so that a stable image picture is obtained.
Step 203: selecting a tracking target image in the images. The tracking target image may be selected in various ways, for example by dragging a mouse or by a touch gesture to frame the tracking target directly in the image, and the framed region may be a rectangle, a circle, an ellipse or another irregular shape.
Step 204: sending the tracking target image to the unmanned aerial vehicle. After that, the tracking control end waits to receive the image frames of the tracked target returned by the unmanned aerial vehicle.
The method of this embodiment can be used together with the method of embodiment 1 to realize image tracking, for example as shown in fig. 3: the unmanned aerial vehicle 3 transmits the images acquired by the camera to the tracking control end 4 in real time; the tracking control end 4 receives the images in real time, the user determines by observing the images whether a target that needs to be tracked is present, and once it is, sends a target tracking instruction to the unmanned aerial vehicle 3 through the tracking control end 4; after receiving the target tracking instruction, the unmanned aerial vehicle 3 controls the pod to keep the current angle, calculates the movement angle of the camera in the pod at the current angle relative to an object in the image, and controls the pod to rotate in reverse by the movement angle; thanks to this adjustment of the pod of the unmanned aerial vehicle 3, the tracking control end 4 obtains a stable image, the user can select the target to be tracked in the stable image, and the tracking control end 4 sends the selected tracking target image to the unmanned aerial vehicle 3; the unmanned aerial vehicle 3 then identifies the tracking target image to obtain the tracking target and finally tracks it in real time.
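For illustration, a sketch of the tracking-control-end side of this exchange is given below. The `link` object with `recv_frame()`, `send_command()` and `send_image()` methods, the command string and the keyboard trigger are all hypothetical; the patent does not define the data-link protocol between the tracking control end 4 and the unmanned aerial vehicle 3.

```python
import cv2

def control_end_session(link):
    """Steps 201-204 at the tracking control end (sketch, hypothetical link API)."""
    # Step 201: display the live images returned by the pod camera.
    while True:
        frame = link.recv_frame()
        cv2.imshow("uav", frame)
        if cv2.waitKey(1) == ord("t"):   # user sees a target worth tracking
            break

    # Step 202: send the target tracking instruction so the UAV holds the pod
    # angle and counter-rotates it, stabilising the picture.
    link.send_command("TARGET_TRACKING")  # hypothetical command name

    # Step 203: frame-select the tracking target in the stabilised image.
    frame = link.recv_frame()
    x, y, w, h = cv2.selectROI("uav", frame)
    target_image = frame[y:y + h, x:x + w]

    # Step 204: upload the selected tracking target image to the UAV.
    link.send_image(target_image)
    cv2.destroyAllWindows()
```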
Example 3
This embodiment provides an unmanned aerial vehicle, which may be a fixed-wing aircraft. As shown in fig. 4, the unmanned aerial vehicle includes a pod 31 and a device 32 for selecting a target to be tracked in image tracking. The pod 31 is provided with a camera 311; in order to adapt to the high-speed movement of the unmanned aerial vehicle and acquire clear images, the camera 311 may be a high-speed global-shutter camera.
The device 32 for selecting a tracking target in image tracking specifically includes: an image transmission module 321, a pod control module 322, an image calculation module 323, and an image recognition module 324.
The image transmission module 321 is used for transmitting the image acquired by the camera 311 in the unmanned aerial vehicle pod 31 to the tracking control end in real time. The image transmission module 321 is preferably a wireless transmission module, and the wireless transmission module may also be configured to receive the target tracking instruction sent by the tracking control end and the tracking target image selected in the image.
The pod control module 322 is configured to determine whether a target tracking instruction sent by the tracking control end has been received; if so, it controls the pod 31 to maintain the current angle, and if not, it may perform the determination again.
The image calculation module 323 is used for calculating the movement angle of the camera 311 in the pod 31 at the current angle relative to an object in the image after the pod 31 keeps the current angle. The object may be the object subsequently selected as the tracking target, or any object appearing in the image, such as an object with a distinctive color or a relatively large size.
The pod control module 322 is further configured to control the pod 31 to rotate in reverse by the movement angle after the movement angle has been calculated. Reverse rotation means rotating the pod 31 in the direction opposite to the movement angle, so as to cancel out the relative rotation between the camera 311 and the object in the image and keep the picture stable with respect to the object. For example, if the camera 311 at the current angle has moved to the left relative to the object in the image, the pod 31 should be controlled to rotate to the right accordingly. Of course, the image calculation module 323 and the pod control module 322 may instead calculate the movement angle of the object in the image relative to the camera 311 in the pod 31 at the current angle, and control the pod 31 to rotate by that angle in the same direction. The replaced modules follow the same principle and have the same effect as the original ones. When the rotation of the pod 31 is actually controlled, the specific structure and principle of the drive mechanism that rotates the pod 31 can be taken into account, the movement angle is converted into a control quantity for the drive mechanism, and a pod stabilization algorithm can be used during the conversion to ensure that the pod 31 rotates smoothly.
The image recognition module 324 is configured to determine whether a tracking target image selected by the tracking control end in the image has been received; if so, it recognizes the tracking target image to obtain the tracking target, and if not, it may perform the determination again. The tracking target image may be selected at the tracking control end in various ways, for example by dragging a mouse or by a touch gesture to frame the tracking target directly in the image, and the framed region may be a rectangle, a circle, an ellipse or another irregular shape.
In the device 32, the image acquisition and image transmission of the camera 311 continue throughout the whole tracking process, and the tracking control end selects the tracking target by observing the images and records the tracking process and the tracking result.
The device of this embodiment controls the pod 31 to adjust its rotation angle in response to the target tracking instruction and cancels out the relative rotation of the camera 311 with respect to the photographed object, so that the image collected by the camera 311 stays relatively stable and the object in the image does not rapidly leave the frame even if the unmanned aerial vehicle keeps moving rapidly; the tracking target is then selected in the stable image, which provides the user with sufficient selection time and reduces the difficulty of selecting the tracking target.
In this embodiment, in order to calculate the movement angle quickly and accurately and to prevent the original object in the image from leaving the frame before the angle of the pod 31 has been adjusted, the image calculation module 323 may be specifically configured to calculate, through an optical flow algorithm and using the previous frame image and the current frame image, the movement angle of the camera 311 relative to the object in the image over the time interval from the previous frame to the current frame. Specifically, the calculation comprises the following steps:
comparing the pixel displacement of the previous frame image and the current frame image through an optical flow algorithm, and calculating a displacement value Ht between the two frames;
calculating the movement angles Wx and Wy of the camera 311 relative to the object in the image over the time interval from the previous frame to the current frame, using the field of view angle Va of the camera 311, the number Hx of X-axis pixels of the image picture and the number Hy of Y-axis pixels of the image picture, where:
Wx = Ht / Hx * Va;
Wy = Ht / Hy * Va.
In addition, in order to realize real-time tracking of the tracking target, the apparatus may further include a target tracking module 325.
The target tracking module 325 is used to identify the position of the tracking target in each newly acquired image through an image recognition algorithm and keep the tracking target present in each acquired image by controlling the rotation of the pod 31. The image recognition algorithm may be any existing image recognition algorithm, which is not limited in this embodiment.
The device 32 can further and completely realize the selection and tracking of the tracking target, reduces the difficulty of the user in selecting the tracking target in the high-speed movement of the unmanned aerial vehicle, and brings great progress to the field of image tracking.
Example 4
This embodiment provides a device for selecting a tracking target in image tracking. The device is typically used as the tracking control end, which is operated by the user to select the tracking target and may be a computer, a remote controller or other ground-based control equipment that can communicate with the unmanned aerial vehicle. As shown in fig. 5, the device includes: an image receiving module 41, an instruction sending module 42, a target selection module 43 and a target sending module 44.
The image receiving module 41 is used for receiving images acquired by the camera in the unmanned aerial vehicle pod in real time.
The instruction sending module 42 is configured to send a target tracking instruction to the unmanned aerial vehicle, where the target tracking instruction is used to trigger the unmanned aerial vehicle to perform the following operations: controlling the pod to keep the current angle, calculating the movement angle of the camera in the pod at the current angle relative to an object in the image, and controlling the pod to rotate in reverse by the movement angle.
The target selection module 43 is used to select a tracking target image from the images. The tracking target image may be selected in various ways, for example by dragging a mouse or by a touch gesture to frame the tracking target directly in the image, and the framed region may be a rectangle, a circle, an ellipse or another irregular shape.
The target sending module 44 is configured to send the tracking target image to the unmanned aerial vehicle.
The device of this embodiment can be used in cooperation with the device in embodiment 3, the former serving as the tracking control end and the latter as the controller on the unmanned aerial vehicle; for the specific cooperation process, refer to embodiment 2, which is not repeated here.
While specific embodiments of the invention have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and that the scope of the invention is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the spirit and scope of the invention, and these changes and modifications are within the scope of the invention.

Claims (9)

CN201910977313.5A | 2019-10-15 | 2019-10-15 | Unmanned aerial vehicle and method and device for selecting tracking target in image tracking | Pending | CN112672032A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201910977313.5A (CN112672032A, en) | 2019-10-15 | 2019-10-15 | Unmanned aerial vehicle and method and device for selecting tracking target in image tracking

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201910977313.5A (CN112672032A, en) | 2019-10-15 | 2019-10-15 | Unmanned aerial vehicle and method and device for selecting tracking target in image tracking

Publications (1)

Publication Number | Publication Date
CN112672032A (en) | 2021-04-16

Family

ID=75399727

Family Applications (1)

Application Number | Publication | Priority Date | Filing Date | Status
CN201910977313.5A | CN112672032A (en) | 2019-10-15 | 2019-10-15 | Pending

Country Status (1)

Country | Link
CN (1) | CN112672032A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN105518555A (en)* | 2014-07-30 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Target tracking system and method
US20170134631A1 (en)* | 2015-09-15 | 2017-05-11 | SZ DJI Technology Co., Ltd. | System and method for supporting smooth target following
CN105959625A (en)* | 2016-05-04 | 2016-09-21 | 北京博瑞爱飞科技发展有限公司 | Method and device of controlling unmanned plane tracking shooting
WO2018058309A1 (en)* | 2016-09-27 | 2018-04-05 | 深圳市大疆创新科技有限公司 | Control method, control device, electronic device, and aerial vehicle control system
CN110147122A (en)* | 2019-06-14 | 2019-08-20 | 深圳市道通智能航空技术有限公司 | Tracking method and device for a moving target, and unmanned aerial vehicle

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN119854630A (en)* | 2025-03-17 | 2025-04-18 | 成都浩孚科技有限公司 | Photoelectric pod image stabilization and retrace integrated measurement device and measurement method
CN119854630B (en)* | 2025-03-17 | 2025-06-03 | 成都浩孚科技有限公司 | Photoelectric pod image stabilization and retrace integrated measurement device and measurement method


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
