CN119854761A - Unmanned aerial vehicle and ground robot collaborative phenotype acquisition system - Google Patents

Unmanned aerial vehicle and ground robot collaborative phenotype acquisition system

Info

Publication number
CN119854761A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
ground robot
edge computing
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411992639.2A
Other languages
Chinese (zh)
Inventor
李文峰
何继中
黄悦
理绑
王南
付国栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunnan Agricultural University
Original Assignee
Yunnan Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunnan Agricultural University
Priority to CN202411992639.2A
Publication of CN119854761A
Legal status: Pending

Abstract

Translated from Chinese


The present invention relates to the field of agricultural intelligent monitoring technology, and specifically to a phenotype acquisition system in which an unmanned aerial vehicle (UAV) and a ground robot collaborate. The system comprises a UAV, a tracked ground robot, and a portable edge computing device. The UAV carries a high-definition acquisition camera for aerial data collection, while the tracked ground robot carries an anti-shake fisheye camera for ground-level data collection. The portable edge computing device serves as the hub: it communicates with the UAV and the ground robot in real time over a wireless communication module, coordinates their synchronized operation, and receives, processes, and analyzes their data. The system uses a high-performance embedded processor as the main control unit and applies image-processing and machine-learning algorithms to construct a farmland monitoring model. Applied to agricultural monitoring, the invention improves the efficiency and accuracy of data collection and provides richer, more comprehensive information to support agricultural monitoring.

Description

Unmanned aerial vehicle and ground robot collaborative phenotype acquisition system
Technical Field
The invention relates to the technical field of agricultural intelligent monitoring, and in particular to a phenotype acquisition system in which an unmanned aerial vehicle and a ground robot cooperate.
Background
In traditional agricultural monitoring, data acquisition is mainly carried out by manual inspection, ground monitoring stations, or single-UAV operation. Each of these methods has limitations: manual inspection is time-consuming and labor-intensive with limited coverage; ground monitoring stations are constrained by their geographical position and struggle to monitor a wide farmland in real time; and although a UAV operating independently can acquire data from an aerial view, it lacks supplementary ground detail and therefore cannot comprehensively reflect the actual condition of the farmland.
With the continuous development of intelligent technology and agricultural modernization, collaborative operation of unmanned aerial vehicles and ground robots has demonstrated great potential in the field of agricultural monitoring. However, achieving effective coordination and synchronous data acquisition between the two, and efficiently processing, analyzing, and fusing agricultural data from different perspectives, remain major problems for current agricultural monitoring technology.
Disclosure of Invention
To address the problems of existing agricultural monitoring technology, the invention provides an agricultural monitoring system in which an unmanned aerial vehicle and a ground robot synchronously collect, process, and analyze data. Its aim is to realize comprehensive, three-dimensional monitoring of target areas such as farmland and crop growth environments through precise coordination by a portable edge computing device.
The agricultural monitoring system comprises three parts: an unmanned aerial vehicle 1, a crawler-type ground robot 2, and a portable edge computing device 3. The unmanned aerial vehicle 1 carries a high-definition acquisition camera 14 and is responsible for acquiring image and video data of the farmland from an aerial view, such as crop growth conditions and pest and disease damage. The crawler-type ground robot 2 carries an anti-shake fisheye camera 23 for collecting environmental parameters such as soil humidity and temperature, along with detailed data of crop roots, from a ground-level view. The portable edge computing device 3 serves as the hub of the system: it communicates in real time with the unmanned aerial vehicle 1 and the crawler-type ground robot 2 through a wireless communication module, coordinates their synchronous operation, and receives, processes, and analyzes their data.
The phenotype acquisition system in which the UAV and the ground robot cooperate adopts a high-performance embedded processor as the main control unit, responsible for coordination and control of the whole system. During data acquisition, the portable edge computing device sends time synchronization signals to the UAV and the ground robot, ensuring that their data acquisition times are consistent. After acquisition is completed, the data is transmitted to the portable edge computing device in real time through the wireless communication module, and the terminal analyzes, classifies, and fuses the data using image-processing and machine-learning algorithms, thereby constructing a comprehensive, three-dimensional farmland monitoring model.
The invention has several advantages. First, the anti-shake fisheye camera 23 keeps the collected data clear and accurate over complex terrain, and image compression and transmission technology is integrated into the system so that data can be transmitted to the portable edge computing device 3 in real time and losslessly, improving the efficiency and reliability of data collection. Second, the system performs deep analysis and accurate classification of the collected data using image-processing and machine-learning algorithms. Finally, synchronous data collection by the UAV and the ground robot is achieved through precise coordination by the portable edge computing device, constructing a comprehensive, three-dimensional monitoring system for the target area. This monitoring mode not only greatly improves the efficiency and accuracy of data acquisition but also provides richer and more comprehensive information to support subsequent decisions.
Drawings
FIG. 1 is a diagram of the workflow of the phenotype acquisition system in which the unmanned aerial vehicle and the ground robot cooperate, and of the cooperative relationship among its components;
FIG. 2 is a schematic structural diagram of the unmanned aerial vehicle in the phenotype acquisition system;
FIG. 3 is a schematic structural diagram of the crawler-type ground robot in the phenotype acquisition system;
FIG. 4 is a schematic structural diagram of the portable edge computing device of the phenotype acquisition system.
The reference numerals in FIGS. 1-4 are illustrated as follows: 1 - unmanned aerial vehicle; 11 - machine body; 111 - rack; 112 - rotor; 113 - first motor; 114 - energy storage battery; 115 - landing bracket; 12 - navigation communication module; 13 - flight control system; 131 - obstacle avoidance radar; 132 - electronic speed regulator I; 133 - core controller; 14 - high-definition acquisition camera; 2 - crawler-type ground robot; 21 - crawler-type mobile chassis; 211 - chassis box; 212 - chassis upper cover; 213 - crawler; 214 - crawler wheel; 215 - second motor; 22 - navigation communication module; 221 - communication antenna; 222 - communication receiving module; 23 - anti-shake fisheye camera; 231 - spring damper; 232 - balance ring; 233 - fisheye camera; 24 - obstacle avoidance radar; 25 - control lower computer; 26 - core upper computer; 3 - portable edge computing device; 31 - telescopic bracket; 311 - triangular fixing bracket; 312 - bottom rod; 313 - telescopic fixing buckle; 314 - upper rod; 32 - communication antenna; 33 - computing terminal; 331 - terminal box; 332 - terminal control button; 333 - interactive touch screen; 4 - communication between the unmanned aerial vehicle and the terminal; 5 - communication between the ground robot and the terminal.
Detailed Description
The workflow of the system of the present invention is further described in detail below with reference to the accompanying drawings:
Referring to FIGS. 1-4, an embodiment of the invention provides a phenotype acquisition system in which an unmanned aerial vehicle and a ground robot cooperate, comprising the unmanned aerial vehicle 1, a crawler-type ground robot 2, and a portable edge computing device 3, with communication connections established among the three;
The unmanned aerial vehicle 1 comprises a machine body 11, a navigation communication module 12, a flight control system 13 and a high-definition acquisition camera 14, wherein a rotor wing 112, a first motor 113, an energy storage battery 114 and a landing bracket 115 are arranged on the machine body 11, the flight control system 13 comprises an obstacle avoidance radar 131, an electronic speed regulator I132 and a core controller 133, the unmanned aerial vehicle 1 is used for executing an air monitoring task, and images and video data of farmlands are acquired from an air view angle;
The crawler-type ground robot 2 comprises a crawler-type mobile chassis 21, a navigation communication module 22, an anti-shake fisheye camera 23, an obstacle avoidance radar 24, a control lower computer 25 and a core upper computer 26, wherein the crawler-type mobile chassis 21 comprises a chassis box 211, a chassis upper cover 212, a crawler 213, crawler wheels 214 and a second motor 215, the anti-shake fisheye camera 23 comprises a spring damper 231, a balance ring 232 and a fisheye camera 233, the crawler-type ground robot 2 is used for executing a ground monitoring task, and collecting environmental parameters and detailed data of the bottom of crops from a ground view, wherein the environmental parameters comprise soil humidity and temperature;
The portable edge computing device 3 comprises a telescopic bracket 31, a communication antenna 32, and a computing terminal 33. The telescopic bracket 31 comprises a triangular fixing frame 311, a bottom rod 312, a telescopic fixing buckle 313, and an upper rod 314; the computing terminal 33 comprises a terminal box 331, a terminal control button 332, and an interactive touch screen 333. The portable edge computing device 3 serves as the hub of the system, and its core functions are realized through an integrated high-performance wireless communication module. The wireless communication module adopts a wireless communication protocol to ensure a high-speed, low-latency connection between the UAV and the ground robot, and uses dynamic frequency selection and a multiple-input multiple-output (MIMO) antenna array to enhance signal stability and coverage. After the system is started, the portable edge computing device 3 first ensures that the time references of all devices are consistent through a preset time synchronization protocol; second, it dynamically allocates the operating areas and task priorities of each device according to the farmland environment through a centralized scheduling algorithm; finally, it ensures synchronous operation of the UAV and the ground robot by sending command packets in real time, including target positions, action paths, and speed limit information.
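The centralized scheduling step above can be sketched as follows. This is a hypothetical Python illustration: the patent does not disclose the scheduling algorithm itself, only that the edge device assigns operating areas and task priorities and sends command packets containing target positions, action paths, and speed limits. All function names, zone fields, and speed values here are invented.

```python
from dataclasses import dataclass

@dataclass
class Command:
    """Command packet fields named in the text: target position, path, speed limit."""
    target: tuple        # (x, y) target position in field coordinates
    path: list           # waypoint list forming the action path
    speed_limit: float   # speed limit in m/s

def allocate_tasks(field_zones, devices):
    """Assign field zones to devices, highest priority first (round-robin)."""
    zones = sorted(field_zones, key=lambda z: z["priority"], reverse=True)
    plan = {d: [] for d in devices}
    for i, zone in enumerate(zones):
        device = devices[i % len(devices)]
        plan[device].append(Command(
            target=zone["center"],
            path=zone["waypoints"],
            # Illustrative limits: the UAV may fly faster than the tracked robot drives
            speed_limit=2.0 if device == "uav" else 0.8,
        ))
    return plan

plan = allocate_tasks(
    [{"center": (10, 20), "waypoints": [(0, 0), (10, 20)], "priority": 2},
     {"center": (30, 5),  "waypoints": [(0, 0), (30, 5)],  "priority": 1}],
    ["uav", "ground_robot"],
)
```

The edge device would then serialize each `Command` and push it over the wireless link to the corresponding vehicle.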
Specifically, the wireless communication protocol includes protocols related to 5G, Wi-Fi 6, and low-power wide-area network (LPWAN) technologies. The time synchronization protocol includes the Network Time Protocol (NTP).
More specifically, the unmanned aerial vehicle 1 acquires aerial-view data captured by the high-definition acquisition camera 14 and transmits it to the portable edge computing device 3 in real time through the wireless communication module.
More specifically, the portable edge computing device 3 sends a time synchronization signal to the unmanned aerial vehicle 1 and the crawler-type ground robot 2 to ensure that both remain synchronized in data acquisition time.
More specifically, the portable edge computing device 3 analyzes, classifies, and fuses the received data using image-processing and machine-learning algorithms, constructing a comprehensive, three-dimensional farmland monitoring model.
More specifically, the phenotype acquisition system adopts a high-performance embedded processor as the main control unit, responsible for coordination and control of the whole system. The main control unit outputs PWM signals to precisely control the movements of the unmanned aerial vehicle 1 and the crawler-type ground robot 2, and integrates high-precision attitude sensors to acquire their attitude information in real time; combined with a PID control algorithm, this achieves accurate and stable control.
More specifically, the high-definition acquisition camera 14 of the unmanned aerial vehicle 1 and the anti-shake fisheye camera 23 of the crawler-type ground robot 2 both adopt high-resolution sensors to ensure the clarity and accuracy of the data, and integrate image compression and transmission technology so that the data can be transmitted to the portable edge computing device 3 in real time and losslessly.
After the system is started, the portable edge computing device 3 first sends instructions to the unmanned aerial vehicle 1 and the crawler-type ground robot 2 through its built-in wireless communication module, precisely coordinating them to proceed synchronously to a preset target area. After receiving the instruction, the unmanned aerial vehicle 1 quickly takes off and climbs to a preset height, then starts its onboard high-definition acquisition camera 14 to comprehensively acquire images and video of the target area from an aerial view. At the same time, the crawler-type ground robot 2 moves flexibly to the target area on its crawler-type mobile chassis 21 according to the instructions, and starts the anti-shake fisheye camera 23 to perform fine data acquisition from a ground-level view.
In this series of operations, the system employs a high-performance embedded processor, namely a high-performance microcontroller of the STM32 series, as the main control unit of the whole system; it is not directly indicated in the figures but is contained in the control system of the portable edge computing device 3 or the crawler-type ground robot 2. The main control unit is not only responsible for overall coordination and control of the system, but also receives and pre-processes, in real time during motion, the data transmitted back by the UAV and the ground robot through the wireless communication module.
Specifically, the unmanned aerial vehicle 1 acquires aerial-view data captured by the high-definition acquisition camera 14 and transmits it to the portable edge computing device 3 in real time via the wireless communication module 12. After the terminal receives the data, it immediately applies an image processing algorithm to deeply analyze the images and extract key information. Likewise, the ground-view data captured by the ground robot 2 through the anti-shake fisheye camera 23 is transmitted to the terminal 3 through its navigation communication module 22, in a wired or wireless manner, for subsequent processing.
To ensure that the UAV and the ground robot acquire data accurately and synchronously, the system is designed with a dedicated time synchronization mechanism. Before each data acquisition task starts, the portable edge computing device 3 sends a time synchronization signal to the unmanned aerial vehicle 1 and the ground robot 2, ensuring high consistency between the two in data acquisition time.
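The time synchronization protocol named earlier is NTP, whose core clock-offset calculation is standard. The sketch below shows the textbook NTP offset/delay formula applied to one request/response exchange; the timestamps are invented example values, and a real deployment would repeat this exchange and filter the results.

```python
def ntp_offset(t1, t2, t3, t4):
    """Classic NTP clock-offset and round-trip-delay calculation.

    t1: client send time      t2: server (edge device) receive time
    t3: server send time      t4: client receive time
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # estimated client clock error
    delay = (t4 - t1) - (t3 - t2)            # network round-trip delay
    return offset, delay

# Example: the vehicle clock runs 0.5 s behind the edge device's clock
offset, delay = ntp_offset(t1=100.0, t2=100.6, t3=100.7, t4=100.3)
```

After computing the offset, each vehicle would adjust (or timestamp-correct) its local clock before the acquisition task begins.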
In the data processing stage, the portable edge computing device 3 uses its computing power to perform deep fusion and comprehensive analysis of the data from the UAV and the ground robot. By comparing the aerial-view and ground-view data, the system constructs a more complete, three-dimensional model of the target area, providing strong support for subsequent decisions.
At the control level, the main control chip precisely controls the movements of the UAV and the ground robot by outputting PWM signals. The system also integrates high-precision attitude sensors to acquire the attitude information of the UAV and the ground robot in real time. This information serves as a feedback signal and, combined with a PID control algorithm, makes the movements of both vehicles more accurate and stable.
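The PWM-plus-PID loop described here can be illustrated with a minimal discrete PID controller. The gains, time step, and roll-angle example below are invented for illustration; the patent specifies only that attitude-sensor feedback is combined with PID control to drive PWM outputs.

```python
class PID:
    """Discrete PID controller producing a correction that would be mapped
    to a PWM duty-cycle change by the motor driver (mapping not shown)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # finite-difference D term
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative gains and sample time (100 Hz control loop)
pid = PID(kp=1.2, ki=0.1, kd=0.05, dt=0.01)
# Attitude sensor reads 2 degrees of roll; target is level flight (0 degrees)
correction = pid.update(setpoint=0.0, measured=2.0)
```

On the STM32-class controller this loop would run in a fixed-rate timer interrupt, with the correction clamped and written to the PWM compare register.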
In terms of data acquisition technology, the system adopts a high-resolution camera and a fisheye camera to ensure the clarity and accuracy of the data. To improve the efficiency and reliability of data acquisition, the system also integrates image compression and transmission technology, so that data can be transmitted to the portable edge computing device 3 in real time and losslessly.
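As a rough sketch of this lossless compress-and-transmit path, the example below uses Python's zlib as a stand-in codec and a trivial header framing. The patent does not name a specific compression algorithm or packet format, so both are assumptions.

```python
import zlib

def pack_frame(frame_bytes, device_id, timestamp):
    """Losslessly compress an image frame and prepend a small text header."""
    payload = zlib.compress(frame_bytes, level=6)
    header = f"{device_id}|{timestamp}|{len(payload)}".encode()
    # Header contains no newline, so the first b"\n" safely delimits it
    return header + b"\n" + payload

def unpack_frame(packet):
    """Split off the header and restore the original frame bytes."""
    header, payload = packet.split(b"\n", 1)
    device_id, timestamp, _ = header.decode().split("|")
    return device_id, float(timestamp), zlib.decompress(payload)

raw = b"\x00" * 10_000                      # stand-in for camera frame data
pkt = pack_frame(raw, "uav-1", 1735600000.0)
dev, ts, restored = unpack_frame(pkt)
```

Because the codec is lossless, `restored` is byte-identical to the captured frame, which is what the patent's "real time and lossless" transmission requires.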
At the data processing and analysis level, the system applies image-processing and machine-learning algorithms to deeply analyze and accurately classify the acquired data. By comparing the aerial-view and ground-view data, the system can accurately identify different objects and features in the target area, providing a solid basis for subsequent monitoring, analysis, and decision making.
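The comparison of aerial and ground views can be illustrated with a simple late-fusion rule. The feature names, weights, and thresholds below are invented; the patent states only that the two viewpoints are compared and the data classified by image-processing and machine-learning algorithms.

```python
def fuse_observations(aerial, ground):
    """Late fusion of per-zone observations from the two viewpoints."""
    fused = {}
    for zone in aerial:
        a, g = aerial[zone], ground.get(zone, {})
        # Weighted average favours the closer-range ground measurement
        canopy = 0.4 * a["canopy_cover"] + 0.6 * g.get("canopy_cover", a["canopy_cover"])
        # Flag a zone if canopy is sparse or the soil reads dry (invented thresholds)
        stressed = canopy < 0.5 or g.get("soil_moisture", 1.0) < 0.2
        fused[zone] = {"canopy_cover": round(canopy, 3),
                       "label": "stressed" if stressed else "healthy"}
    return fused

result = fuse_observations(
    aerial={"A1": {"canopy_cover": 0.8}, "A2": {"canopy_cover": 0.4}},
    ground={"A1": {"canopy_cover": 0.7, "soil_moisture": 0.35},
            "A2": {"canopy_cover": 0.3, "soil_moisture": 0.15}},
)
```

In the actual system the hand-written rule would be replaced by the trained classifier, but the per-zone join of the two viewpoints is the same.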
In summary, the core advantage of the invention is that synchronous data acquisition by the UAV and the ground robot is realized through precise coordination by the portable edge computing device, thereby constructing a comprehensive, three-dimensional target-area monitoring system. This monitoring mode not only greatly improves the efficiency and accuracy of data acquisition but also provides richer and more comprehensive information to support subsequent decisions.

Claims (7)

The crawler-type ground robot (2) comprises a crawler-type mobile chassis (21), a navigation communication module (22), an anti-shake fisheye camera (23), an obstacle avoidance radar (24), a control lower computer (25) and a core upper computer (26), wherein the crawler-type mobile chassis (21) comprises a chassis box body (211), a chassis upper cover (212), a crawler (213), a crawler wheel (214) and a second motor (215), the anti-shake fisheye camera (23) comprises a spring damper (231), a balance ring (232) and a fisheye camera (233), the crawler-type ground robot (2) is used for executing a ground monitoring task, and collecting environmental parameters and detailed data of the bottom of crops from a ground view angle, wherein the environmental parameters comprise soil humidity and temperature;
The portable edge computing device (3) comprises a telescopic bracket (31), a communication antenna (32) and a computing terminal (33), the telescopic bracket (31) comprises a triangular fixing frame (311), a bottom rod (312), a telescopic fixing buckle (313) and an upper rod (314), the computing terminal (33) comprises a terminal box body (331), a terminal control button (332) and an interactive touch screen (333), the portable edge computing device (3) serves as the hub of the system, the core functions of which are realized through an integrated high-performance wireless communication module, the wireless communication module adopts a wireless communication protocol to ensure a high-speed and low-delay connection between the unmanned aerial vehicle and the ground robot, and meanwhile utilizes dynamic frequency selection technology and a multiple-input multiple-output (MIMO) antenna array to enhance signal stability and coverage range;
CN202411992639.2A, filed 2024-12-31: Unmanned aerial vehicle and ground robot collaborative phenotype acquisition system (Pending) CN119854761A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202411992639.2A (CN119854761A, en) | 2024-12-31 | 2024-12-31 | Unmanned aerial vehicle and ground robot collaborative phenotype acquisition system

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202411992639.2A (CN119854761A, en) | 2024-12-31 | 2024-12-31 | Unmanned aerial vehicle and ground robot collaborative phenotype acquisition system

Publications (1)

Publication Number | Publication Date
CN119854761A | 2025-04-18

Family

ID=95360654

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202411992639.2A (Pending) | CN119854761A (en) | 2024-12-31 | 2024-12-31

Country Status (1)

Country | Link
CN | CN119854761A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN120152006A* | 2025-05-15 | 2025-06-13 | Guizhou University (贵州大学) | A synchronous data perception method, device and equipment based on edge multi-point collaboration


Similar Documents

Publication | Title
US11822353B2 (en) | Simple multi-sensor calibration
CN106802658B | A fully automatic high-precision indoor rapid positioning method
WO2022261678A1 | Systems and methods for configuring a swarm of drones
CN102967297B | Space movable visual sensor array system and image information fusion method
CN119854761A (en) | Unmanned aerial vehicle and ground robot collaborative phenotype acquisition system
CN112965507B | Cluster unmanned aerial vehicle cooperative work system and method based on intelligent optimization
CN112015200B | Agricultural unmanned aerial vehicle group collaborative operation system, collaborative operation method and unmanned aerial vehicle
CN102156481A | Intelligent tracking control method and system for unmanned aerial vehicles
CN109557880A | An ecological inspection system based on an unmanned aerial vehicle
CN110850889B | A UAV autonomous inspection system based on RTK navigation
CN114115289A | An autonomous unmanned swarm reconnaissance system
CN110498039A | An intelligent monitoring system based on a bionic flapping-wing aircraft
CN112162565A | An uninterrupted autonomous tower inspection method based on multi-machine cooperative operation
Zhang et al. | Aerial and ground-based collaborative mapping: an experimental study
CN110187718A | Urban logistics system and method based on the Scrapy framework and a quadrotor aircraft
CN211293749U | A self-propelled field robot for yield measurement in plant-breeding plots
Liang et al. | Design and development of a ground control system for a tethered UAV
EP3896627B1 | Automated work system
CN111476134A | Geological survey data processing system and method based on augmented reality
CN118123861A | A multifunctional bionic cockroach robot
Li et al. | A novel meteorological sensor data acquisition approach based on an unmanned aerial vehicle
CN205952332U | Fixed-wing UAV swing-type oblique photography system
CN212259087U | Multi-view-angle cooperative automatic searching and tracking equipment
CN114526725A | Super-fusion navigation system based on a system-on-chip
JP2023033991A | Autonomous flight survey system and method using an unmanned flying body

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
