Disclosure of Invention
To address the shortcomings of existing agricultural monitoring technology, the invention provides an agricultural monitoring system in which an unmanned aerial vehicle and a ground robot synchronously collect, process and analyze data. Through precise coordination by a portable edge computing device, the system aims to achieve comprehensive, three-dimensional monitoring of target areas such as farmlands and crop growth environments.
The agricultural monitoring system comprises three parts: an unmanned aerial vehicle 1, a crawler-type ground robot 2 and a portable edge computing device 3. The unmanned aerial vehicle 1 is provided with a high-definition acquisition camera 14 and is responsible for acquiring farmland image and video data from an aerial view, such as crop growth conditions and pest and disease damage. The crawler-type ground robot 2 is provided with an anti-shake fisheye camera 23 for collecting, from a ground view, environmental parameters such as soil humidity and temperature together with detailed data on crop roots. The portable edge computing device 3 serves as the center of the system: it communicates in real time with the unmanned aerial vehicle 1 and the crawler-type ground robot 2 through a wireless communication module, coordinates their synchronous operation, and receives, processes and analyzes their data.
The phenotype acquisition system in which the unmanned aerial vehicle and the ground robot cooperate adopts a high-performance embedded processor as the main control unit, responsible for coordination and control of the whole system. During data acquisition, the portable edge computing device sends time synchronization signals to the unmanned aerial vehicle and the ground robot, ensuring that their data acquisition times are consistent. After acquisition is completed, the data is transmitted in real time to the portable edge computing device through the wireless communication module, and the terminal analyzes, classifies and fuses the data using image processing and machine learning algorithms, thereby constructing a comprehensive, three-dimensional farmland monitoring model.
The invention has the following advantages. First, the system adopts the anti-shake fisheye camera 23, which keeps the collected data clear and accurate even over complex terrain; to improve the efficiency and reliability of data collection, image compression and transmission technology is also integrated into the system, so that data can be transmitted to the portable edge computing device 3 in real time and without loss. Second, the system performs deep analysis and accurate classification of the collected data by means of image processing and machine learning algorithms. Finally, synchronous data collection by the unmanned aerial vehicle and the ground robot is achieved through precise coordination by the portable edge computing device, thereby constructing a comprehensive, three-dimensional target area monitoring system. This monitoring mode not only greatly improves the efficiency and accuracy of data acquisition, but also provides richer, more comprehensive information support for subsequent decisions.
Drawings
FIG. 1 is a workflow diagram of the phenotype acquisition system in which the unmanned aerial vehicle and the ground robot cooperate, showing the cooperative relationship among its components;
FIG. 2 is a schematic structural diagram of the unmanned aerial vehicle in the phenotype acquisition system;
FIG. 3 is a schematic structural diagram of the crawler-type ground robot in the phenotype acquisition system;
FIG. 4 is a schematic structural diagram of the portable edge computing device in the phenotype acquisition system.
The reference numerals in fig. 1-4 are illustrated as follows:
1-unmanned aerial vehicle; 11-machine body; 111-rack; 112-rotor; 113-first motor; 114-energy storage battery; 115-landing bracket; 12-navigation communication module; 13-flight control system; 131-obstacle avoidance radar; 132-first electronic speed regulator; 133-core controller; 14-high-definition acquisition camera; 2-crawler-type ground robot; 21-crawler-type mobile chassis; 211-chassis box; 212-chassis upper cover; 213-crawler; 214-crawler wheel; 215-second motor; 22-navigation communication module; 221-communication antenna; 222-communication receiving module; 23-anti-shake fisheye camera; 231-spring damper; 232-balance ring; 233-fisheye camera; 24-obstacle avoidance radar; 25-control lower computer; 26-core upper computer; 3-portable edge computing device; 31-telescopic bracket; 311-triangular fixing bracket; 312-bottom rod; 313-telescopic fixing buckle; 314-upper rod; 32-communication antenna; 33-computing terminal; 331-terminal box; 332-terminal control button; 333-interactive touch screen; 4-communication between the unmanned aerial vehicle and the terminal; 5-communication between the ground robot and the terminal.
Detailed Description
The workflow of the system of the present invention is further described in detail below with reference to the accompanying drawings:
As shown in FIGS. 1-4, an embodiment of the invention provides a phenotype acquisition system in which an unmanned aerial vehicle and a ground robot cooperate, comprising the unmanned aerial vehicle 1, a crawler-type ground robot 2 and a portable edge computing device 3, with communication connections established among the three;
The unmanned aerial vehicle 1 comprises a machine body 11, a navigation communication module 12, a flight control system 13 and a high-definition acquisition camera 14, wherein a rotor 112, a first motor 113, an energy storage battery 114 and a landing bracket 115 are arranged on the machine body 11, and the flight control system 13 comprises an obstacle avoidance radar 131, a first electronic speed regulator 132 and a core controller 133. The unmanned aerial vehicle 1 is used to execute aerial monitoring tasks, acquiring farmland image and video data from an aerial view;
The crawler-type ground robot 2 comprises a crawler-type mobile chassis 21, a navigation communication module 22, an anti-shake fisheye camera 23, an obstacle avoidance radar 24, a control lower computer 25 and a core upper computer 26, wherein the crawler-type mobile chassis 21 comprises a chassis box 211, a chassis upper cover 212, a crawler 213, crawler wheels 214 and a second motor 215, and the anti-shake fisheye camera 23 comprises a spring damper 231, a balance ring 232 and a fisheye camera 233. The crawler-type ground robot 2 is used to execute ground monitoring tasks, collecting from a ground view the environmental parameters, including soil humidity and temperature, and detailed data on crop roots;
The portable edge computing device 3 comprises a telescopic bracket 31, a communication antenna 32 and a computing terminal 33, wherein the telescopic bracket 31 comprises a triangular fixing bracket 311, a bottom rod 312, a telescopic fixing buckle 313 and an upper rod 314, and the computing terminal 33 comprises a terminal box 331, a terminal control button 332 and an interactive touch screen 333. The portable edge computing device 3 serves as the center of the system, and its core functions are realized through an integrated high-performance wireless communication module. The wireless communication module adopts a wireless communication protocol to ensure a high-speed, low-delay connection with the unmanned aerial vehicle and the ground robot, and uses dynamic frequency selection technology together with a multiple-input multiple-output (MIMO) antenna array to enhance signal stability and coverage. After the system is started, the portable edge computing device 3 firstly ensures that the time references of all devices are consistent through a preset time synchronization protocol; secondly, it dynamically distributes the operation areas and task priorities of all devices according to the farmland environment through a centralized scheduling algorithm; finally, it ensures synchronous operation of the unmanned aerial vehicle and the ground robot by sending command packets in real time, including target positions, action paths and speed limit information.
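The command packets described above can be sketched as a small data structure. The field names and values below are illustrative assumptions, not part of the specification; they only show how a packet carrying a target position, an action path and a speed limit might be serialized for the wireless link.

```python
import json
from dataclasses import dataclass, asdict, field

# Hypothetical sketch of the real-time command packet: target position,
# action path (ordered waypoints) and speed limit, as named in the text.
@dataclass
class CommandPacket:
    device_id: str            # e.g. "uav-1" or "ugv-2" (assumed naming)
    target_position: tuple    # (latitude, longitude, altitude_m)
    action_path: list = field(default_factory=list)  # ordered waypoints
    speed_limit_mps: float = 2.0

    def to_json(self) -> str:
        """Serialize for transmission over the wireless link."""
        return json.dumps(asdict(self))

# Assign the aerial and ground units their zones for one survey pass.
uav_cmd = CommandPacket("uav-1", (30.0, 120.0, 50.0),
                        [(30.0, 120.0, 50.0), (30.001, 120.0, 50.0)], 8.0)
ugv_cmd = CommandPacket("ugv-2", (30.0, 120.0, 0.0),
                        [(30.0, 120.0, 0.0)], 1.5)
```

A JSON payload keeps the packet human-readable; a real deployment would likely use a compact binary encoding over the 5G or Wi-Fi 6 link.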
Specifically, the wireless communication protocols include protocols based on 5G, Wi-Fi 6 and low-power wide-area network (LPWAN) technologies. The time synchronization protocol includes the Network Time Protocol (NTP).
Further specifically, the unmanned aerial vehicle 1 acquires aerial-view data captured by the high-definition acquisition camera 14 and transmits it in real time to the portable edge computing device 3 through the wireless communication module.
Further specifically, the portable edge computing device 3 sends a time synchronization signal to the unmanned aerial vehicle 1 and the crawler-type ground robot 2 to ensure that both remain synchronized in data acquisition time.
Further specifically, the portable edge computing device 3 analyzes, classifies and fuses the received data by using an image processing algorithm and a machine learning algorithm, and constructs a comprehensive and three-dimensional farmland monitoring model.
Further specifically, the phenotype acquisition system adopts a high-performance embedded processor as the main control unit responsible for coordination and control of the whole system. The main control unit outputs PWM signals to accurately control the movements of the unmanned aerial vehicle 1 and the crawler-type ground robot 2, and integrates high-precision attitude sensors to acquire their attitude information in real time; this information is combined with a PID control algorithm to realize accurate and stable control.
Further specifically, the high-definition acquisition camera 14 of the unmanned aerial vehicle 1 and the anti-shake fisheye camera 23 of the crawler-type ground robot 2 both adopt high-resolution sensors to ensure the clarity and accuracy of the data, and integrate image compression and transmission technology to ensure that the data can be transmitted to the portable edge computing device 3 in real time and without loss.
After the system is started, the portable edge computing device 3 first sends instructions to the unmanned aerial vehicle 1 and the crawler-type ground robot 2 through its built-in wireless communication module, coordinating them to proceed synchronously to a preset target area. After receiving the instruction, the unmanned aerial vehicle 1 quickly takes off and climbs to a preset height, then starts its on-board high-definition acquisition camera 14 to comprehensively acquire images and video of the target area from an aerial view. At the same time, the crawler-type ground robot 2 moves flexibly to the target area on its crawler-type mobile chassis 21 according to the instructions, and starts the anti-shake fisheye camera 23 to perform fine data acquisition from a ground view.
Throughout this series of operations, the system employs a high-performance embedded processor, i.e. a high-performance microcontroller of the STM32 series, as the main control unit of the whole system; it is not shown directly in the figures, being contained within the control system of the portable edge computing device 3 or the crawler-type ground robot 2. The main control unit is not only responsible for the overall coordination and control of the system, but also receives and pre-processes, in real time through the wireless communication module, the data transmitted back by the unmanned aerial vehicle and the ground robot during their movement.
Specifically, the unmanned aerial vehicle 1 acquires aerial-view data captured by the high-definition acquisition camera 14 and transmits it in real time to the portable edge computing device 3 via the navigation communication module 12. After the terminal receives the data, it immediately applies an image processing algorithm to deeply analyze the images and extract key information. Likewise, the ground-view data captured by the crawler-type ground robot 2 through the anti-shake fisheye camera 23 is transmitted, in a wired or wireless manner, to the portable edge computing device 3 through the navigation communication module 22 for subsequent processing.
To ensure that the unmanned aerial vehicle and the ground robot can acquire data accurately and synchronously, the system is designed with a dedicated time synchronization mechanism. Before each data acquisition task starts, the portable edge computing device 3 sends a time synchronization signal to the unmanned aerial vehicle 1 and the crawler-type ground robot 2, ensuring high consistency in their data acquisition times.
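Since the specification names NTP as the time synchronization protocol, the mechanism above can be illustrated with the standard NTP four-timestamp exchange. This is a minimal sketch of the textbook offset and delay formulas, not the system's actual implementation; the timestamp values are invented for illustration.

```python
# NTP-style clock offset estimate between the terminal and one device:
# t1 = terminal send time, t2 = device receive time,
# t3 = device reply time,  t4 = terminal receive time.
def clock_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """Estimated offset of the device clock relative to the terminal."""
    return ((t2 - t1) + (t3 - t4)) / 2.0

def round_trip_delay(t1: float, t2: float, t3: float, t4: float) -> float:
    """Network round-trip delay, excluding the device's processing time."""
    return (t4 - t1) - (t3 - t2)

# Example: device clock 0.5 s ahead, 0.1 s one-way delay each direction.
offset = clock_offset(10.0, 10.6, 10.6, 10.2)   # -> 0.5
delay = round_trip_delay(10.0, 10.6, 10.6, 10.2)  # -> 0.2
```

The terminal would apply the estimated offset to each device's clock (or to its timestamps) before every acquisition task, which is what keeps the aerial and ground captures aligned in time.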
In the data processing stage, the portable edge computing device 3 uses its powerful computing capability to perform deep fusion and comprehensive analysis of the data from the unmanned aerial vehicle and the ground robot. By comparing the aerial-view and ground-view data, the system constructs a more complete, three-dimensional model of the target area, providing powerful support for subsequent decisions.
At the control level, the main control chip accurately controls the movements of the unmanned aerial vehicle and the ground robot by outputting PWM signals. The system also integrates high-precision attitude sensors to acquire the attitude information of the unmanned aerial vehicle and the ground robot in real time. This information serves as a feedback signal and is combined with a PID control algorithm so that both move more accurately and stably.
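The PID-to-PWM loop described above can be sketched as follows. The gains, mapping and class names are illustrative assumptions; the real firmware would run an equivalent loop on the embedded processor.

```python
# Illustrative PID loop: attitude error from the sensor feeds a PID
# controller whose output is clamped onto a 0..1 PWM duty cycle.
class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint: float, measured: float, dt: float) -> float:
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def to_pwm_duty(output: float, base: float = 0.5) -> float:
    """Map controller output onto a 0..1 PWM duty around a base throttle."""
    return min(1.0, max(0.0, base + output))

# One 50 Hz control step: measured pitch -0.1 rad, setpoint level (0.0).
pid = PID(kp=0.8, ki=0.1, kd=0.05)
duty = to_pwm_duty(pid.update(setpoint=0.0, measured=-0.1, dt=0.02))
```

The attitude sensor closes the loop: each new measurement updates the error, and the resulting duty cycle drives the first motor 113 or second motor 215.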
In terms of data acquisition technology, the system adopts a high-resolution camera and a fisheye camera to ensure the clarity and accuracy of the data. To improve the efficiency and reliability of data acquisition, the system also integrates image compression and transmission technology so that data can be transmitted to the portable edge computing device 3 in real time and without loss.
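The lossless requirement above implies a round trip that reproduces the original frame bit for bit. As a sketch, zlib stands in here for whatever lossless codec the real system uses; the frame data is a synthetic stand-in.

```python
import zlib

# Sketch of lossless frame compression before wireless transmission.
def compress_frame(raw: bytes) -> bytes:
    return zlib.compress(raw, level=6)

def decompress_frame(blob: bytes) -> bytes:
    return zlib.decompress(blob)

frame = bytes(range(256)) * 64           # stand-in for raw image data
blob = compress_frame(frame)
assert decompress_frame(blob) == frame   # lossless round trip
```

Lossless coding trades bandwidth for fidelity; the repetitive structure typical of field imagery is what makes real-time transmission over the wireless link feasible.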
At the data processing and analysis level, the system adopts image processing and machine learning algorithms to perform deep analysis and accurate classification of the acquired data. By comparing aerial-view and ground-view data, the system can accurately identify different objects and features in the target area, providing a solid basis for subsequent monitoring, analysis and decision making.
In summary, the core advantage of the invention is that synchronous data acquisition by the unmanned aerial vehicle and the ground robot is realized through precise coordination by the portable edge computing device, thereby constructing a comprehensive, three-dimensional target area monitoring system. This monitoring mode not only greatly improves the efficiency and accuracy of data acquisition, but also provides richer, more comprehensive information support for subsequent decisions.