CN119034148B - A front-end fire-fighting robot control method and system - Google Patents

A front-end fire-fighting robot control method and system

Info

Publication number
CN119034148B
CN119034148B
Authority
CN
China
Prior art keywords
fire
node
robot
point cloud
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202411440658.4A
Other languages
Chinese (zh)
Other versions
CN119034148A (en)
Inventor
林致胜
李雪峰
刘涛
李腾
赵思聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Construction International Low Carbon Technology Co ltd
China Construction International Urban Construction Co ltd
Jinan Shuhui Information Technology Co ltd
Original Assignee
China Construction International Low Carbon Technology Co ltd
Jinan Shuhui Information Technology Co ltd
China Construction International Urban Construction Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Construction International Low Carbon Technology Co ltd, Jinan Shuhui Information Technology Co ltd and China Construction International Urban Construction Co ltd
Priority to CN202411440658.4A
Publication of CN119034148A
Application granted
Publication of CN119034148B
Status: Active
Anticipated expiration

Abstract

(Abstract translated from Chinese)

The present application discloses a front-end fire-fighting robot control method and system in the field of fire protection. The method includes: deploying multiple smoke sensors and infrared cameras in a target area and connecting them with the fire-fighting robot to form a distributed fire detection network; when a fire is detected by a flame recognition algorithm, sending the positioning information of the fire area to the fire-fighting robot; and, after the robot receives this positioning information, using its onboard lidar sensor to collect environmental information in the target area, applying a simultaneous localization and mapping (SLAM) algorithm to build a three-dimensional point cloud model of the buildings in the area, and using the constructed model for self-localization and navigation path planning. Addressing the poor cross-region operation and navigation capabilities of prior-art fire-fighting robots in complex environments, the present application improves the network collaboration of the component units, the rapid recognition of and response to complex environments, and the navigation capability.

Description

Front-end fire extinguishing robot control method and system
Technical Field
The application relates to the field of fire protection, in particular to a front-end fire extinguishing robot control method and system.
Background
With the continuous advance of industrialization and urbanization in China, large industrial parks and urban construction sites keep increasing. Because these environments are complex and contain numerous combustibles, a fire, once started, is difficult to detect and fight in time and can easily cause serious casualties and property loss. Traditional fire protection relies mainly on fixed fire-alarm facilities and manual firefighting, which suffer from slow response, limited coverage, and low extinguishing efficiency, and can hardly meet the fire-safety demands of today's complex and changeable environments. There is therefore a need to develop intelligent fire-fighting equipment capable of early fire detection, quick response, and autonomous extinguishing, improving fire prevention, control, and handling in complex environments.
In recent years, emerging technologies represented by robotics, the Internet of Things, and artificial intelligence have developed vigorously, offering new ideas for solving fire-safety problems. Some researchers propose using intelligent fire-extinguishing robots to autonomously detect fires and perform extinguishing operations: through environment perception, autonomous navigation, and intelligent decision-making, the robot acquires human-like intelligent behavior and replaces firefighters in high-risk extinguishing tasks. Meanwhile, deploying fire sensors over a wide range to form an Internet of Things, and using their sensing data to command and dispatch the fire-extinguishing robots, can effectively enlarge the fire-monitoring range and improve the response speed and cooperative efficiency of the extinguishing equipment. Combining intelligent fire-extinguishing robots with Internet of Things technology yields a full-chain intelligent fire-fighting mode of sensing, intelligent analysis, and autonomous execution, which is expected to fundamentally raise the level of fire prevention and emergency handling in complex environments.
At present, fire-safety problems at construction sites and in industrial parks are increasingly prominent. Because these environments are complex and changeable, conventional fixed fire-detection and extinguishing facilities have clear shortcomings: site buildings and facilities change continuously with the construction progress, so fixed extinguishing equipment is difficult to deploy flexibly; large parks contain many open areas, making full coverage by fixed equipment costly and whole-area monitoring hard to achieve; and once a fire occurs, the limited response range of fixed devices makes it difficult to fight a large-scale fire effectively.
Existing mobile fire-fighting equipment, such as remote-controlled crawler fire trucks, overcomes the shortcomings of fixed equipment to some extent, but still requires firefighters to operate on site, has low extinguishing efficiency, and carries a risk of personal injury. Some researchers address these problems with intelligent fire-extinguishing robots that autonomously detect fires and perform mobile extinguishing operations. However, most existing robots rely on a single onboard sensor, can hardly perceive complex fire environments fully, have limited navigation capability, and cannot meet the demands of wide-area operation. In addition, lacking information exchange with the environmental monitoring sensor network, the fire information acquired by the robot is incomplete, making early warning and active extinguishing of fires difficult to achieve.
Therefore, several technical bottlenecks must be broken through: intelligent fusion of multi-source heterogeneous sensors and accurate fire-area localization in complex environments; autonomous localization, obstacle-avoidance navigation, and mobile operation of the fire-extinguishing robot in complex environments; and information exchange and cooperative operation scheduling between the fire-fighting Internet of Things and the intelligent robot, so as to realize early warning and active extinguishing of fires.
Disclosure of Invention
Aiming at the poor navigation capability of fire-extinguishing robots in complex environments in the prior art, the application provides a front-end fire-extinguishing robot control method and system. A distributed fire detection network is built; the accurate fire position is obtained by Bayesian inference; mapping, localization, and path planning for the target area are performed according to the obtained position; and closed-loop detection is applied to the planned path. Together these measures improve the path-navigation capability of the fire-extinguishing robot in complex environments.
The aim of the application is achieved by the following technical scheme.
The application provides a front-end fire-extinguishing robot control method, which comprises: arranging a plurality of smoke sensors and infrared cameras in a target area and connecting them with a fire-extinguishing robot by wireless communication to form a distributed fire detection network; when the distributed fire detection network detects a fire through a flame identification algorithm, sending positioning information of the fire area to the fire-extinguishing robot; and, after the robot receives the positioning information, collecting environment information in the target area with the onboard lidar sensor, applying a simultaneous localization and mapping (SLAM) algorithm to construct a three-dimensional point cloud model of the buildings in the area, and using the constructed model for self-localization and navigation path planning.
The fire-extinguishing robot navigates to the fire area along the planned navigation path and extinguishes the fire. Along the planned path, the robot senses the environment with its lidar and vision sensors and automatically navigates to the fire area using the local path planning algorithm DWA (dynamic window approach), based on local environment information and the target position. After reaching the fire area, the visible-light camera and infrared thermal imager on the robot are controlled to collect fire-scene images; the collected visible-light and infrared images are registered and fused; a deep learning target detection algorithm (YOLOv) detects the flame region; the detected flame region is located in the infrared image; flame pixel coordinates are converted into three-dimensional space coordinates according to the internal and external parameters of the infrared imager, yielding fire-source position information; and the extinguishing device is controlled to aim at the fire source according to this position information and extinguish the fire.
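The pixel-to-3D conversion described above can be sketched as a pinhole back-projection, assuming a known range to the flame; the intrinsic values (fx, fy, cx, cy) below are illustrative and not taken from the patent:

```python
def pixel_to_camera_xyz(u, v, depth, fx, fy, cx, cy):
    """Back-project a flame pixel (u, v) with range `depth` (m) into
    3-D camera-frame coordinates using pinhole intrinsics.
    (Illustrative sketch: the patent does not give these parameters.)"""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth

# A flame centred in a 640x480 thermal image, 2 m away, lies on the
# optical axis of the camera:
print(pixel_to_camera_xyz(320, 240, 2.0, 500.0, 500.0, 320.0, 240.0))
# → (0.0, 0.0, 2.0)
```

A full implementation would additionally apply the extrinsic camera-to-robot transform mentioned in the text before commanding the extinguishing device.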
Further, a plurality of smoke sensors and infrared cameras are arranged in the target area and connected with the fire-extinguishing robot by wireless communication to form a distributed fire detection network. The network comprises smoke sensors and infrared cameras installed at different positions in the target area: the smoke sensors detect smoke concentration and the infrared cameras collect infrared images. A LoRa or Wi-Fi wireless communication module is integrated on each smoke sensor and infrared camera, and the installed position of each device is bound to its sensor node in the system, so that the system platform receives each node's data in real time and performs fire analysis and data preprocessing.
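The binding of each device's installed position to its sensor node could look like the following minimal sketch; all identifiers, types, and coordinates are illustrative assumptions, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class SensorNode:
    """One node of the detection network (illustrative fields)."""
    node_id: str
    kind: str          # "smoke" or "ir_camera"
    position: tuple    # (x, y) installed coordinates in the site frame

class NodeRegistry:
    """Binds node IDs to installed positions so incoming readings can be
    georeferenced by the system platform."""
    def __init__(self):
        self._nodes = {}

    def register(self, node: SensorNode):
        self._nodes[node.node_id] = node

    def locate(self, node_id: str):
        """Return the installed position bound to a node ID."""
        return self._nodes[node_id].position

registry = NodeRegistry()
registry.register(SensorNode("S-01", "smoke", (12.0, 4.5)))
registry.register(SensorNode("C-01", "ir_camera", (12.5, 4.0)))
```

With such a binding, a reading tagged only with a node ID is immediately mapped to a physical location for fire analysis.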
Further, when the distributed fire detection network detects a fire through a flame recognition algorithm, positioning information of the fire area is sent to the fire-extinguishing robot. The system platform performs multi-source information fusion on the preprocessed smoke concentration data and infrared image data of all sensor nodes and establishes a fire probability model by Bayesian inference. In this model, the smoke concentration data and infrared image data serve as observations and the occurrence of fire as the hidden variable; the posterior probability of a fire given the observations is computed by Bayesian inference; the fire area is determined from the sensor nodes whose posterior probability exceeds a threshold; and the position information of those nodes is taken as the preliminary positioning information of the fire area and sent to the fire-extinguishing robot, which determines the start point, end point, and navigation path of its walk according to its own position and environment data. After reaching the fire area, the robot locates the final fire source with its own flame detector and performs local navigation.
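The posterior computation can be illustrated with a minimal two-observation Bayes update, assuming the smoke and infrared observations are conditionally independent given the fire state; none of the probability values below come from the patent:

```python
def fire_posterior(p_fire, p_smoke_given_fire, p_smoke_given_clear,
                   p_hot_given_fire, p_hot_given_clear):
    """Posterior P(fire | smoke-high, IR-hot), assuming the two
    observations are conditionally independent given the fire state."""
    num = p_smoke_given_fire * p_hot_given_fire * p_fire
    den = num + p_smoke_given_clear * p_hot_given_clear * (1.0 - p_fire)
    return num / den

# Illustrative numbers: a rare fire (1% prior) becomes highly probable
# once both a high smoke reading and a hot IR region are observed.
post = fire_posterior(0.01, 0.9, 0.05, 0.85, 0.02)
print(round(post, 3))  # → 0.885
```

A node whose posterior exceeds the alarm threshold (say 0.5) would then contribute its position to the preliminary fire localization.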
Further, the system platform performs multi-source information fusion on the preprocessed smoke concentration data and infrared image data of each sensor node and establishes a fire probability model by Bayesian inference, comprising: defining each sensor node as a node in a Markov random field; defining a node's observed quantity as the smoke concentration data and infrared image data collected by that node; determining each node's neighborhood node set according to the spatial distribution of the sensor nodes; correlating a node's observed quantity with those of its neighborhood nodes to construct the spatial correlation between nodes; establishing the Markov random field model based on this spatial correlation, with the fire state as the hidden variable of the field and the observed quantities of all nodes as its observed variables; defining an energy function E(x) in the Markov random field; and obtaining the maximum a posteriori estimate of fire occurrence by minimizing the energy function E(x).
Further, in the Markov random field, an energy function $E(x)$ is defined:

$$E(x) = -\sum_{i}\ln \phi_i(x_i, y_i) \;-\; \sum_{(i,j)}\ln \psi_{ij}(x_i, x_j)$$

where $x$ represents the fire state, $x_i$ the fire state of the $i$-th node, $y$ the observed quantities of all nodes, and $y_i$ the observed quantity of the $i$-th node. $\phi_i(x_i, y_i)$ is the local evidence function of the $i$-th node; it measures the compatibility of the observed quantity $y_i$ with the fire state $x_i$, a larger value meaning the observation supports the state more strongly, and is calculated as

$$\phi_i(x_i, y_i) = \exp\!\left(-\tfrac{1}{2}\,\bigl(y_i - \mu(x_i)\bigr)^{\mathsf T}\,\Sigma(x_i)^{-1}\,\bigl(y_i - \mu(x_i)\bigr)\right)$$

where $\mu(x_i)$ and $\Sigma(x_i)$ respectively represent the observation mean and covariance matrix of the $i$-th node in fire state $x_i$. $\psi_{ij}(x_i, x_j)$ is the spatial correlation function between the $i$-th node and the $j$-th node; it measures the correlation strength of the fire states of adjacent nodes, a larger value meaning the two states are more consistent, and is calculated as

$$\psi_{ij}(x_i, x_j) = \exp\!\left(-\beta_{ij}\, d(x_i, x_j)\right)$$

where $d(x_i, x_j)$ is a distance measure between the fire states $x_i$ and $x_j$ and $\beta_{ij}$ is a correlation intensity parameter. $Z(y)$ is the partition function that normalizes the probability distribution so that the posterior $P(x \mid y)$ sums to 1:

$$Z(y) = \sum_{x}\exp\bigl(-E(x)\bigr), \qquad P(x \mid y) = \frac{\exp\bigl(-E(x)\bigr)}{Z(y)}$$
The energy function comprehensively considers the influence of each node's local observation evidence and of the spatial correlation between nodes on the fire-state estimate: the local evidence function $\phi_i(x_i, y_i)$ measures the compatibility of the observed quantity with the fire state, the spatial correlation function $\psi_{ij}(x_i, x_j)$ measures the correlation strength of the fire states of adjacent nodes, and the partition function $Z(y)$ normalizes the probability distribution. When the energy function is minimized to solve for the fire-state estimate, the local evidence term $-\sum_i \ln\phi_i(x_i, y_i)$ keeps the estimate consistent with the observed data, while the spatial correlation term $-\sum_{(i,j)} \ln\psi_{ij}(x_i, x_j)$ keeps the estimates of adjacent nodes consistent; the two terms constrain each other and act together, allowing the Markov random field model to balance observed evidence against spatial correlation and infer the fire location more accurately. The energy function fully exploits the spatial correlation of the sensor data, improving the robustness and reliability of fire localization in complex environments.
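A minimal numerical sketch of this energy minimization, assuming a three-node chain, scalar smoke readings in place of the full observation vectors, and Gaussian local evidence (all parameter values are illustrative), finds the MAP state by brute force:

```python
import math
from itertools import product

# Toy chain of 3 nodes; states: 0 = no fire, 1 = fire.
# Scalars stand in for the mean vector / covariance matrix of the text.
MU  = {0: 0.1, 1: 0.8}     # expected smoke reading per state
SIG = {0: 0.15, 1: 0.15}
BETA = 1.0                 # pairwise coupling strength beta_ij

def local_evidence(x_i, y_i):
    # phi_i(x_i, y_i): larger = observation supports the state better
    return math.exp(-0.5 * ((y_i - MU[x_i]) / SIG[x_i]) ** 2)

def pairwise(x_i, x_j):
    # psi_ij(x_i, x_j): larger = neighbouring states more consistent
    return math.exp(-BETA * abs(x_i - x_j))

def energy(x, y, edges):
    e = -sum(math.log(local_evidence(xi, yi)) for xi, yi in zip(x, y))
    e -= sum(math.log(pairwise(x[i], x[j])) for i, j in edges)
    return e

y = [0.75, 0.7, 0.12]      # readings: nodes 0 and 1 hot, node 2 cool
edges = [(0, 1), (1, 2)]
# MAP estimate = state vector minimising E(x), brute-forced over 2^3 states
x_map = min(product([0, 1], repeat=3), key=lambda x: energy(x, y, edges))
print(x_map)  # → (1, 1, 0)
```

At realistic network sizes the minimization would use graph-cut or belief-propagation solvers rather than enumeration; the brute force here only demonstrates how the two energy terms trade off.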
Further, the fire-extinguishing robot verifies and optimizes the preliminary positioning information to obtain the final positioning information of the fire area, comprising: optimizing the preliminary positioning information with a particle filtering algorithm; initializing a set of particle samples according to the position distribution of the sensor nodes in the fire area, each particle comprising a candidate fire position, a prior weight, and node observation data; estimating, from the fire position contained in each particle, the smoke concentration distribution and infrared radiation distribution at that particle; comparing the estimated distributions with the observation data collected by the nodes to compute the particle's likelihood probability; multiplying each particle's likelihood by its prior weight to obtain its posterior weight; resampling the particle set according to the posterior weights, retaining particles whose posterior weight exceeds a threshold to obtain a resampled particle set; iterating the posterior-weight computation and resampling on the resampled set until the distribution of the particle set converges or the iteration count reaches a threshold; and taking the weighted average position of the particles in the converged set as the final positioning information.
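The particle-filter refinement can be sketched as follows, under an assumed toy observation model in which smoke concentration decays with distance from the source; the sensor layout, decay law, and noise scale are all illustrative, not from the patent:

```python
import math, random

random.seed(7)

SENSORS = [(0.0, 0.0), (10.0, 0.0), (5.0, 10.0)]  # node positions (m)
TRUE_SOURCE = (5.0, 5.0)

def expected_reading(src, sensor):
    # Assumed smoke model: concentration decays with distance to the source
    return 1.0 / (1.0 + math.dist(src, sensor))

observations = [expected_reading(TRUE_SOURCE, s) for s in SENSORS]

def likelihood(p):
    # Gaussian comparison of predicted vs observed readings
    err = sum((expected_reading(p, s) - z) ** 2
              for s, z in zip(SENSORS, observations))
    return math.exp(-err / (2 * 0.05 ** 2))

# Initialise candidate fire positions uniformly over the 10 m x 10 m site
particles = [(random.uniform(0, 10), random.uniform(0, 10))
             for _ in range(500)]

for _ in range(5):                     # iterate: weight -> resample -> jitter
    w = [likelihood(p) for p in particles]
    total = sum(w)
    w = [wi / total for wi in w]
    particles = random.choices(particles, weights=w, k=len(particles))
    particles = [(x + random.gauss(0, 0.1), y + random.gauss(0, 0.1))
                 for x, y in particles]

est = (sum(p[0] for p in particles) / len(particles),
       sum(p[1] for p in particles) / len(particles))
print(est)
```

With three non-collinear sensors the distance constraints pin down a unique source, so the weighted mean of the surviving particles lands near the true position.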
Further, after the fire-extinguishing robot receives the positioning information of the fire area, it collects environment information in the target area with the onboard lidar sensor, applies the simultaneous localization and mapping (SLAM) algorithm to build a three-dimensional point cloud model of the buildings in the area, and uses the constructed model for self-localization and navigation path planning, comprising: after receiving the positioning information, performing a 360-degree scan of the target area to obtain raw point cloud data of the buildings and obstacles in the area; downsampling the raw point cloud to reduce the data volume and removing ground points with the random sample consensus algorithm RANSAC to obtain preprocessed point cloud data; inputting the preprocessed point cloud into the simultaneous localization and mapping algorithm to obtain the robot's real-time pose in the target area, and building a three-dimensional point cloud map of the area's buildings from that pose; projecting the three-dimensional point cloud map onto a two-dimensional plane represented as a grid map; and, taking the robot's current position as the start point and the fire-area position from the positioning information as the end point, planning a navigation path from start to end on the grid map.
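The ground-removal step can be illustrated with a small pure-Python RANSAC plane fit on a synthetic cloud, assuming a flat ground plane plus elevated "building" points; all sizes and tolerances are illustrative:

```python
import random

random.seed(0)

# Synthetic scan: a flat ground plane (z ~ 0) plus building points (z > 1)
ground = [(random.uniform(0, 20), random.uniform(0, 20),
           random.gauss(0, 0.02)) for _ in range(300)]
building = [(random.uniform(5, 8), random.uniform(5, 8),
             random.uniform(1, 6)) for _ in range(100)]
cloud = ground + building

def fit_plane(p1, p2, p3):
    # Plane through 3 points via cross product; returns (a,b,c,d), ax+by+cz+d=0
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    a, b, c = uy*vz - uz*vy, uz*vx - ux*vz, ux*vy - uy*vx
    norm = (a*a + b*b + c*c) ** 0.5
    if norm < 1e-9:          # degenerate (collinear) sample
        return None
    a, b, c = a/norm, b/norm, c/norm
    return a, b, c, -(a*p1[0] + b*p1[1] + c*p1[2])

def ransac_ground(points, iters=100, tol=0.1):
    """Keep the plane with the most inliers, then drop those inliers."""
    best_inliers = []
    for _ in range(iters):
        plane = fit_plane(*random.sample(points, 3))
        if plane is None:
            continue
        a, b, c, d = plane
        inliers = [p for p in points
                   if abs(a*p[0] + b*p[1] + c*p[2] + d) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    ground_set = set(best_inliers)
    return [p for p in points if p not in ground_set]

non_ground = ransac_ground(cloud)
```

The dominant plane found is the ground, so the returned cloud retains the building points while the flat ground is stripped away, exactly the preprocessing the SLAM front end needs.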
The fire-extinguishing robot determines the start point, end point, and navigation path of its walk according to its own position and the environment data; after reaching the fire area, it locates the final fire source with its own flame detector and performs local navigation.
Further, the preprocessed point cloud data is input into the simultaneous localization and mapping algorithm to obtain the real-time pose of the fire-extinguishing robot in the target area, and a three-dimensional point cloud map of the buildings in the area is built from that pose, comprising: inputting the local point cloud map and the robot pose into the simultaneous localization and mapping algorithm and constructing a factor graph model by a graph-optimization-based method; predicting the robot pose from the lidar data and the robot motion model with an extended Kalman filter to obtain a prior estimate of the pose, and adding the predicted pose to the factor graph as a lidar odometry factor; in the factor graph model, taking the lidar odometry factor, the loop detection factor, and the local map factor as edges to build the geometric constraint relations between nodes in the graph; judging whether the robot's current position revisits a previous one by extracting feature points (corner points and plane points) from the local point cloud map and applying a closed-loop detection algorithm based on appearance and spatial geometric information, and, if a closed loop is detected, adding a closed-loop factor to the factor graph model; constructing a nonlinear least-squares optimization problem on the constructed factor graph and solving for the optimal pose and feature-point positions of each node; taking the optimized robot pose as the real-time localization result; and transforming the robot's local point cloud coordinates according to the optimized pose to obtain the three-dimensional point cloud map of the buildings in the area.
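The effect of a closed-loop factor can be shown on a deliberately tiny 1-D pose graph: three odometry factors plus one loop-closure factor, solved exactly as linear least squares. This is a stand-in for the nonlinear problem in the text; the measurements are invented for illustration:

```python
# Poses x0..x3 along a corridor; x0 is anchored at 0.  Odometry factors
# say each step is 1.0 m; a loop-closure factor measures x3 - x0 = 2.7 m
# (so the odometry has accumulated 0.3 m of drift).  Minimise the sum of
# squared residuals; with equal factor weights this reduces to the 3x3
# normal equations A x = b for (x1, x2, x3).
odometry = [1.0, 1.0, 1.0]        # step measurements u1, u2, u3
loop = 2.7                        # loop-closure measurement x3 - x0

# Residuals: (x1-u1), (x2-x1-u2), (x3-x2-u3), (x3-loop).
# Setting the gradient to zero gives this fixed tridiagonal system:
A = [[2.0, -1.0, 0.0],
     [-1.0, 2.0, -1.0],
     [0.0, -1.0, 2.0]]
b = [odometry[0] - odometry[1],
     odometry[1] - odometry[2],
     odometry[2] + loop]

def solve3(A, b):
    # Gaussian elimination with partial pivoting on a 3x3 system
    m = [row[:] + [bi] for row, bi in zip(A, b)]
    n = 3
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

x1, x2, x3 = solve3(A, b)
print(x1, x2, x3)  # → 0.925 1.85 2.775
```

The optimizer spreads the 0.3 m drift evenly over the three edges (each optimized step becomes 0.925 m), which is exactly the drift-suppression role the closed-loop factor plays in the full factor graph.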
Further, judging whether the robot's current position revisits a previous one by extracting feature points (corner points and plane points) from the local point cloud map with a closed-loop detection algorithm based on appearance and spatial geometric information, and adding a closed-loop factor to the factor graph model if a closed loop is detected, comprises: dividing the local point cloud map into cubic voxels of equal size and taking, in each voxel, the point most like a corner or plane point as that voxel's feature point, yielding the feature point set of the local map; computing a geometric descriptor and a visual descriptor for each feature point in the set; building a bag-of-words model from these descriptors to obtain the bag-of-words representation vector of the local point cloud map; matching the bag-of-words vector of the local map at the robot's current pose node against the vectors of historical pose nodes and computing their similarity; and, if the similarity exceeds a threshold, judging that the robot has revisited a previous position and adding a closed-loop factor between the current pose node and the matched historical pose node.
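The bag-of-words matching step reduces to comparing word histograms. A minimal sketch with a toy vocabulary follows; the word names, data, and threshold are illustrative stand-ins, not real descriptors:

```python
import math
from collections import Counter

def bow_vector(words, vocab):
    """Histogram of quantised feature 'words' over a fixed vocabulary."""
    c = Counter(words)
    return [c[w] for w in vocab]

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

VOCAB = ["corner_a", "corner_b", "plane_a", "plane_b", "edge_a"]

# Feature words quantised from the local map at three pose nodes
current   = ["corner_a", "corner_a", "plane_b", "edge_a", "plane_b"]
revisit   = ["corner_a", "plane_b", "plane_b", "edge_a", "corner_a"]
elsewhere = ["corner_b", "plane_a", "plane_a", "corner_b", "plane_a"]

THRESHOLD = 0.8
sim_loop = cosine_similarity(bow_vector(current, VOCAB),
                             bow_vector(revisit, VOCAB))
sim_none = cosine_similarity(bow_vector(current, VOCAB),
                             bow_vector(elsewhere, VOCAB))
# A closed-loop factor is added only when similarity exceeds the threshold
print(sim_loop > THRESHOLD, sim_none > THRESHOLD)  # → True False
```

When the similarity test passes, a closed-loop factor between the current and matched historical pose nodes is inserted into the factor graph.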
Another aspect of the present application also provides a front-end fire extinguishing robot control system for performing a front-end fire extinguishing robot control method of the present application.
Compared with the prior art, the application has the advantages that:
According to the application, a distributed fire detection network is formed from the smoke sensors and infrared cameras deployed in the target environment. Through multi-source fusion of smoke concentration data and infrared image data, a fire probability model based on a Markov random field is established; this probabilistic graphical model fully accounts for the local observation evidence of the sensor nodes and the spatial correlation between nodes. Solving the Markov random field by Bayesian inference yields the maximum a posteriori estimate of the fire-area position, overcoming the susceptibility of a single sensor to interference in complex environments and improving the accuracy and reliability of fire localization.
According to the application, simultaneous localization and mapping is realized through factor graph optimization using the lidar sensor carried by the fire-extinguishing robot. The preprocessed point cloud data is input into the SLAM algorithm; taking the robot pose and the local point cloud map as nodes, and the lidar odometry, loop detection, and feature matching as edges, a factor graph model is built and the graph optimization problem is solved, so that a three-dimensional geometric map of the target area is built online and the robot pose is tracked in real time, enabling autonomous localization and mapping in an unknown environment. Based on the constructed map, the three-dimensional point cloud is projected onto a two-dimensional plane to build a cost map, and a path search is then performed with the A* algorithm, so that the robot autonomously plans a collision-free navigation path and achieves obstacle-avoidance navigation in complex environments.
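The grid-based path search can be sketched with a standard A* implementation over a small occupancy grid; the map below is an illustrative toy, not the patent's cost map:

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """A* search over a 4-connected occupancy grid (0 = free, 1 = occupied)."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance: admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    tie = count()                   # tie-breaker so the heap never compares cells
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from = {}                  # cell -> parent (doubles as the closed set)
    g_best = {start: 0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue                # already expanded with an equal/better cost
        came_from[cur] = parent
        if cur == goal:             # reconstruct path start -> goal
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_set,
                                   (ng + h(nxt), next(tie), ng, nxt, cur))
    return None

# 0 = free cell, 1 = obstacle (walls in the projected cost map)
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (4, 3))
```

The returned path threads the two wall openings, which is the shortest collision-free route on this grid.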
The application provides a closed-loop detection algorithm based on appearance and geometric information, which is characterized in that feature points are extracted from a local point cloud map, bag-of-words representation is constructed, bag-of-words vector similarity of nodes with different poses is compared, whether a robot reaches a previous position is judged, closed-loop factors are added in a factor graph to apply closed-loop constraint to a graph model, and the drift of an odometer is effectively restrained through global optimization. Meanwhile, the loop detection provides a position priori for the factor graph, and the optimized pose and map are more accurate, so that the fire extinguishing robot can maintain stable and accurate navigation for a long time in a complex environment, and the navigation precision is improved.
According to the application, fire detection is realized by means of distributed multi-sensor fusion, and multi-source heterogeneous sensor data such as vision, infrared, radar and the like are fused, so that the sensing capability of the robot to the environment is enhanced. Meanwhile, the automatic positioning and mapping of the robot are realized by combining with an SLAM algorithm, so that the robot can construct a three-dimensional geometric map of a complex environment on line. The environment sensing and the autonomous positioning navigation are combined, so that the fire extinguishing robot can comprehensively sense the complex fire scene environment, and precise autonomous positioning and navigation are realized by utilizing the scene sensing information. The multi-sensor fusion SLAM improves the environmental adaptability of the fire extinguishing robot and enhances the operation capability of the fire extinguishing robot in complex disaster sites.
Drawings
The application is further described through exemplary embodiments, which are described in detail with reference to the accompanying drawings. The embodiments are non-limiting; in these embodiments, like numerals represent like structures, wherein:
FIG. 1 is an exemplary flow chart of a front-end fire suppression robot control method according to some embodiments of the present application;
Fig. 2 is a schematic view of an application scenario of a front-end fire extinguishing robot control method according to some embodiments of the present application;
FIG. 3 is a block diagram of a fire extinguisher robot according to some embodiments of the present application;
Fig. 4 is a block diagram of another fire extinguisher robot according to some embodiments of the present application.
Detailed Description
Example 1
The method and system provided by the embodiment of the application are described in detail below with reference to the accompanying drawings.
Fig. 1 is an exemplary flowchart of a front-end fire-extinguishing robot control method according to some embodiments of the present application, and Fig. 2 is a schematic view of an application scenario of the method. A plurality of smoke sensors and infrared cameras are disposed in a target area and connected with a fire-extinguishing robot by wireless communication to form a distributed fire detection network. When the network detects a fire through a flame recognition algorithm, positioning information of the fire area is sent to the robot. After receiving the positioning information, the robot collects environment information in the target area with its onboard lidar sensor, applies the simultaneous localization and mapping (SLAM) algorithm to build a three-dimensional point cloud model of the buildings in the area, performs self-localization and navigation path planning with the constructed model, and navigates to the fire area along the planned path.
Fig. 3 is a construction diagram of a fire-extinguishing robot according to some embodiments of the present application, and Fig. 4 is a construction diagram of another such robot. As shown in Figs. 3 and 4, the robot adopts a wheeled or crawler mobile chassis with good trafficability and maneuverability. The body is modular, mainly comprising a sensor module, a fire-extinguishing operation module, and a control and communication module. The sensor module integrates a camera, an infrared thermal imager, a lidar, and the like for fire monitoring and environment sensing; the fire-extinguishing operation module comprises an extinguishing gun and a water cannon that can spray fine water mist or foam to extinguish fires; the control and communication module comprises a main control unit; and the power management module supplies, distributes, and manages the electric energy required for the robot's operation. The structure is compact and reasonable, with the sensors, actuators, and control unit properly arranged for fire-scene operation. The camera collects visible-light images in real time to monitor the fire-scene environment and recognize fire features such as flames and dense smoke; the infrared thermal imager collects infrared images and determines the fire-source position from the high-temperature region of the temperature distribution; a deep-learning-based image recognition algorithm extracts and classifies fire features; the visual and infrared detection results are fused to improve the accuracy of fire judgment; and fire information is uploaded to the monitoring center in real time or announced by voice, display screen, and other alarm means.
The laser radar scans the surrounding environment to obtain point cloud data around the robot and construct an environment map. A SLAM algorithm combines odometer data with the laser point cloud to realize autonomous positioning and map building of the robot. Path planning is performed on the environment map to generate a navigation path from the current position to the target fire source, and a model predictive control (MPC) algorithm performs path tracking: control instructions such as speed and steering are generated in real time from the path curvature and the robot dynamics model, sent to the chassis motors to control the robot's movement, and corrected against laser radar feedback, so that the robot navigates autonomously to the vicinity of the fire source and adjusts the fire extinguishing strategy according to the fire state. The fire extinguishing gun can spray high-pressure water mist, cooling and extinguishing the fire through the heat absorbed by water-mist vaporization and by isolating oxygen, and can also launch fire extinguishing bombs that release a large amount of extinguishing agent to put out the fire rapidly. Visual servo control adjusts the pitch and direction angles of the gun barrel according to camera feedback to aim at the fire position; continuous spraying of extinguishing agent is realized through a water pump and agent supply system; the fire intensity is judged by infrared imaging, and the spraying pressure and flow of the extinguishing agent are adjusted automatically. After the fire is extinguished, spraying stops automatically and the gun returns to its standby position.
The communication antenna adopts a 5G module supporting high-bandwidth, low-latency wireless communication for real-time data transmission between the robot and the monitoring center; where there is no 5G network coverage, the robot can connect to the monitoring center by satellite communication or an ad hoc network. A high-performance industrial personal computer runs the ROS robot operating system to schedule and control each functional module of the robot, and displays the robot's state information, fire parameters and fire extinguishing operation pictures in real time for firefighters to observe. A remote control function is supported, allowing firefighters to manually control the robot's movement and fire extinguishing for greater flexibility, and data recording and storage functions capture data and logs of the operation process for post-event analysis.
Specifically, the sensor nodes are arranged, namely, a smoke sensor and an infrared camera are installed at suitable positions in the target area. The smoke sensor is a photoelectric smoke sensor that judges the smoke concentration by measuring the light intensity attenuation caused by smoke, with a measuring range of 0.1%/m to 10%/m. The infrared camera is an uncooled focal-plane thermal infrared imager with a detection band of 8 to 14 μm and a spatial resolution of 320×240. When arranging the sensors, the environmental characteristics and monitoring blind areas must be fully considered, and full-coverage monitoring of the area is achieved by positioning the sensors reasonably.
Integration of wireless communication modules: a LoRa wireless communication module is integrated on each sensor, with a working frequency of 2.4 GHz and a transmitting power of 4.5 dBm. The communication module is connected to the sensor through a UART interface and is responsible for framing and wirelessly transmitting the data acquired by the sensor. A coordinator node is responsible for setting up and maintaining the entire network, and the sensor node data is transmitted to the robot by means of a repeater.
Fire data preprocessing: the robot's processor reads the response data of the sensor nodes from a receiving buffer and judges whether the frame is complete according to the length field in the frame header. A CRC (cyclic redundancy check) is then performed on the data frame to detect bit errors. If the check passes, the frame payload is unpacked according to the frame type and data type to obtain smoke concentration data or infrared image data. The smoke concentration data is filtered with a clipped-average filtering algorithm: a sliding window stores the last n samples; if a new sampling value exceeds the set upper or lower threshold, it is judged an outlier and discarded, otherwise it is added to the sliding window and the mean of the data in the window is updated as the filtering result. For infrared image data, a median filtering algorithm removes speckle and salt-and-pepper noise: the image is traversed pixel by pixel, and each pixel value is replaced by the median of the pixel values in its surrounding 5×5 neighborhood, which effectively suppresses isolated noise points. The filtered smoke concentration data is normalized by the sensor range to obtain a concentration value in [0, 1], and statistical characteristics such as the mean, variance and peak value of the concentration data are extracted as the basis for subsequent fire judgment. The infrared image data is gray-scale normalized, mapping the original image data into [0, 255]; then a histogram equalization algorithm enhances the image contrast and highlights the high-temperature regions.
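As a minimal sketch of the two filters described above (window length, thresholds and the list-of-lists image representation are illustrative assumptions, not fixed by the text):

```python
from collections import deque
from statistics import median

def clipped_mean_filter(samples, lo, hi, window=5):
    """Sliding-window clipped-average filter: samples outside [lo, hi]
    are rejected as outliers; accepted samples update the window mean."""
    win = deque(maxlen=window)
    out = []
    for s in samples:
        if lo <= s <= hi:            # keep only plausible readings
            win.append(s)
        out.append(sum(win) / len(win) if win else None)
    return out

def median_filter(img, k=5):
    """k x k median filter for a 2-D grayscale image (list of lists),
    clipped at the borders; suppresses isolated salt-and-pepper noise."""
    h, w, r = len(img), len(img[0]), k // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            patch = [img[j][i]
                     for j in range(max(0, y - r), min(h, y + r + 1))
                     for i in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = median(patch)
    return out
```

A reading of 9.9 %/m against a [0, 1] normalized range, for example, is dropped and the previous window mean is reported unchanged.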
Characteristic parameters of the image, such as the hot-spot center position, the high-temperature region area and the average temperature, are extracted in preparation for subsequent fire positioning and temperature estimation.
Establishing a fire probability model: a Markov random field is defined, and each sensor node is mapped to a node in the Markov random field to form a node set V = {1, 2, ..., n}. Each node i and its neighborhood node set Ni form an undirected graph G = (V, E) of the Markov random field, where E is the edge set of the graph, representing the spatial correlation between nodes. The observation yi of each node i consists of two parts, the smoke concentration si and the infrared-image average temperature ti, expressed as a two-dimensional feature vector. The fire state variable xi is the hidden variable of node i, taking the value 0 or 1 to indicate whether a fire occurs at node i. The fire states of all nodes constitute the hidden variable sequence x of the random field.
Specifically, in a specific embodiment of the application, a Markov random field is constructed by uniformly arranging 100 sensor nodes in a 10×10 grid with a node spacing of 5 m, giving the node set V = {1, 2, ..., 100}. Each node i corresponds to a unique grid coordinate (xi, yi), and each node i forms a neighborhood set Ni with its up to 8 surrounding nodes, namely Ni = {j : |xi − xj| ≤ 1, |yi − yj| ≤ 1, j ≠ i}. The inter-node edge set E contains all adjacent node pairs, 342 undirected edges in total, which represent the spatial correlation between nodes. The neighborhood of corner node 1 is N1 = {2, 11, 12}.
The edges connected with node 1 are {(1, 2), (1, 11), (1, 12)}, and the undirected graph G = (V, E) of the whole random field contains 100 nodes and 342 edges, describing the spatial topology of the sensing network. Observation features and hidden variables: the observation feature vector of each node i is yi = (si, ti), where si is the smoke concentration and ti is the infrared-image average temperature; si takes values in [0, 1], the normalized smoke concentration (0 means no smoke, 1 means thick smoke), and ti takes values in [20, 100], the average temperature of the infrared image in degrees centigrade. The hidden variable xi ∈ {0, 1} indicates whether a fire occurs at node i (0 normal, 1 fire), and the hidden variable of the whole random field is X = (x1, x2, ..., x100), the fire state of all nodes. Probability graph model: according to the Markov assumption, the fire state of each node is only related to the states of its neighborhood nodes. Given the observed features Y = {yi}, the joint probability distribution of the fire state X = {xi} can be decomposed as P(X|Y) = (1/Z) ∏i∈V φ(xi, yi) ∏(i,j)∈E ψ(xi, xj), where Z is a normalization factor, φ(xi, yi) is the local evidence function of node i, related to the observation feature yi, and ψ(xi, xj) is the spatial correlation function between nodes i and j, reflecting the state correlation of neighborhood nodes. The local evidence function may be defined as φ(xi, yi) = exp(xi·(w1·si + w2·ti)), where w1 and w2 are weighting coefficients for the smoke concentration and temperature features, which can be set from prior knowledge. The spatial correlation function may be defined as an Ising model, ψ(xi, xj) = exp(β·I[xi = xj]), where β is a spatial correlation strength parameter; β > 0 indicates that neighborhood nodes tend to take the same state.
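The grid neighborhood construction above can be sketched directly; row-major node numbering 1..100 is taken from the text, and the edge count follows from counting 90 horizontal, 90 vertical and 162 diagonal adjacencies in a 10×10 grid with 8-connectivity:

```python
def build_grid_mrf(rows=10, cols=10):
    """8-neighbourhood Markov random field over a rows x cols sensor grid.
    Nodes are numbered 1..rows*cols in row-major order."""
    coord = {r * cols + c + 1: (c, r) for r in range(rows) for c in range(cols)}
    neigh = {i: {j for j, (xj, yj) in coord.items()
                 if j != i
                 and abs(coord[i][0] - xj) <= 1
                 and abs(coord[i][1] - yj) <= 1}
             for i in coord}
    # undirected edges: each unordered adjacent pair counted once
    edges = {frozenset((i, j)) for i in neigh for j in neigh[i]}
    return neigh, edges
```

Corner node 1 gets the three neighbors {2, 11, 12} named in the text, and interior nodes get the full 8-neighborhood.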
Model inference: given the observation features Y, the posterior probability distribution P(X|Y) of the hidden variables X is inferred. A Bayesian approach may be used: the prior probability P(X) is assumed to be uniform, and the likelihood function P(Y|X) is constructed from the local evidence function φ(xi, yi). Because of the complex loop structure of the Markov random field, the posterior probability P(X|Y) is difficult to solve in closed form, so an approximate inference method is needed; common inference algorithms include the mean-field approximation and loopy belief propagation (loopy BP), which realize probability estimation through iterative message passing. Taking the loopy BP algorithm as an example, each node i maintains a marginal distribution bi(xi) representing the probability that node i is in state xi; in each iteration, node i receives messages mj→i(xi) from its neighbor nodes j and updates its marginal distribution as bi(xi) ∝ φ(xi, yi) ∏j∈Ni mj→i(xi). After several rounds of iteration, the marginal distribution bi(xi) converges, yielding the fire probability of each node i. Decision and early warning: from the marginal distribution bi(xi), the posterior probability P(xi = 1|Y) that each node i is on fire is obtained. A fire early-warning threshold δ = 0.7 is set; if P(xi = 1|Y) > δ, node i is judged to be on fire, otherwise it is considered normal. For example, suppose the node fire probabilities at a certain moment are node 1: P(x1 = 1|Y) = 0.82, node 2: P(x2 = 1|Y) = 0.35, node 3: P(x3 = 1|Y) = 0.91; then the fire node set A = {1, 3} is obtained, and the node positions in A are reported to the monitoring center for fire early warning.
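A compact loopy BP sketch on a toy three-node chain illustrates the message-passing update; the evidence tables and β value are illustrative assumptions, with ψ taken as the Ising-style coupling described above:

```python
import math

def loopy_bp(nodes, edges, phi, beta=0.5, iters=20):
    """Loopy belief propagation for a pairwise binary MRF.
    phi[i] = (phi_i(0), phi_i(1)) is the local evidence of node i;
    coupling psi(a, b) = exp(beta) if a == b else 1 (Ising-style)."""
    psi = lambda a, b: math.exp(beta) if a == b else 1.0
    nbr = {i: set() for i in nodes}
    for i, j in edges:
        nbr[i].add(j); nbr[j].add(i)
    msg = {(j, i): [1.0, 1.0] for i in nodes for j in nbr[i]}
    for _ in range(iters):                      # synchronous message updates
        new = {}
        for (j, i) in msg:
            m = [sum(phi[j][xj] * psi(xj, xi)
                     * math.prod(msg[(k, j)][xj] for k in nbr[j] - {i})
                     for xj in (0, 1))
                 for xi in (0, 1)]
            z = sum(m)
            new[(j, i)] = [v / z for v in m]
        msg = new
    beliefs = {}
    for i in nodes:                             # b_i(x) ∝ phi_i(x) * Π m_{j→i}(x)
        b = [phi[i][x] * math.prod(msg[(j, i)][x] for j in nbr[i])
             for x in (0, 1)]
        z = sum(b)
        beliefs[i] = [v / z for v in b]
    return beliefs
```

With strong fire evidence only at node 1, the coupling pulls its neighbor's fire probability above the neutral 0.5.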
Node neighborhood division: according to the spatial coordinates (xi, yi, zi) of the sensor nodes, a Delaunay triangulation algorithm connects the nodes into triangles. After triangulation, the three vertices of each triangle are mutual neighborhood nodes, forming the neighborhood set Ni of node i. An adjacency matrix A is created: if node j is a neighbor of node i, then aij = 1, otherwise aij = 0. The adjacency matrix represents the connection relations of the nodes in graph G.
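Building the adjacency matrix from a triangle list can be sketched as follows; in practice the triangles would come from a Delaunay library (e.g. `scipy.spatial.Delaunay`'s `simplices`), and here the triangle list is simply assumed given:

```python
def adjacency_from_triangles(n, triangles):
    """Build the n x n adjacency matrix A from a triangle list:
    the three vertices of each triangle are mutual neighbours (a_ij = 1)."""
    A = [[0] * n for _ in range(n)]
    for a, b, c in triangles:
        for i, j in ((a, b), (b, c), (a, c)):
            A[i][j] = A[j][i] = 1       # undirected edge
    return A
```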
Defining the observation conditional distribution: for each node i, its observation yi = (si, ti) obeys a Gaussian distribution conditioned on the fire state xi, with probability density function f(yi|xi) = N(yi | μ(xi), Σ(xi)), where μ(xi) and Σ(xi) are the mean vector and covariance matrix under state xi, which can be obtained from statistics of historical data: μ(0) and Σ(0) collect the mean and variance of smoke concentration and temperature when no fire occurs, μ(1) and Σ(1) are the corresponding parameters when a fire occurs, and ρ is the correlation coefficient of smoke concentration and temperature.
Defining the prior distribution of fire states: assuming the prior fire probability of all nodes is the same, the fire state obeys a Bernoulli distribution with parameter π, i.e. P(xi = 1) = π and P(xi = 0) = 1 − π, where π can be set according to the fire risk level of the environment; π = 0.01 indicates a 1% probability of fire occurrence. Defining the joint distribution of the Markov random field: according to the Hammersley-Clifford theorem, the joint distribution P(x, y) of the Markov random field can be decomposed into a product of potential functions, P(x, y) = (1/Z) ∏i φi(xi, yi) ∏(i,j)∈E ψij(xi, xj), where Z is the partition function, φi(xi, yi) is the local potential function of node i, and ψij(xi, xj) is the interaction potential function of nodes i and j. Solving the probability model: substituting the local potential function φi(xi, yi) and the interaction potential function ψij(xi, xj) into the energy function gives E(x) = −Σi log φi(xi, yi) − Σ(i,j)∈E log ψij(xi, xj), with P(x) = (1/Z) exp(−E(x)), where the partition function Z is a normalization constant such that the probability distribution P(x) sums to 1. Since Z is independent of x, it can be ignored when solving for the optimal state x; thus minimizing E(x) is equivalent to maximizing P(x).
Gibbs sampling algorithm: Gibbs sampling approximates the posterior distribution P(x|y) by updating the state of each node one by one to generate a sample sequence of the Markov random field. The sampling steps are as follows. Randomly initialize the hidden variable state X(0) and let the iteration number t = 0. For the t-th iteration, update the state of each node i in turn: calculate the local energy of node i in the two states xi = 0 and xi = 1, Ei(xi) = −log φi(xi, yi) − Σj∈Ni log ψij(xi, xj(t)); then calculate the state transition probability of node i from the energy values, P(xi = 1 | x−i(t), y) = exp(−Ei(1)) / (exp(−Ei(0)) + exp(−Ei(1))), where x−i(t) denotes the states of the nodes other than node i at iteration t. Sample xi(t+1) from the Bernoulli distribution with this probability and update the state of node i. Repeat until the states of all nodes are updated, obtaining a complete state sample X(t+1). Repeating for a preset number of iterations (e.g. 1000) yields the set of state samples {X1, X2, ..., XT}. The marginal probability of node i is estimated from the sampled state sequence as P(xi = 1|y) ≈ (1/T) Σt I(xi(t) = 1), where I(·) is the indicator function and T is the number of iterations, and P(xi = 0|y) = 1 − P(xi = 1|y).
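The sampling loop above can be sketched for a binary pairwise MRF; the per-node evidence energies, the Ising coupling strength β and the toy chain topology in the usage example are illustrative assumptions:

```python
import math
import random

def gibbs_fire_marginals(nodes, nbr, evidence, beta=0.8, iters=2000, seed=0):
    """Gibbs sampling of P(x | y) for a binary pairwise MRF.
    evidence[i] = (E_i(0), E_i(1)) are local energies from the observation
    model; the coupling contributes -beta when neighbouring states agree."""
    rng = random.Random(seed)
    x = {i: rng.randint(0, 1) for i in nodes}
    counts = {i: 0 for i in nodes}
    for _ in range(iters):
        for i in nodes:
            # local energy of the two candidate states of node i
            e = [evidence[i][s] - beta * sum(x[j] == s for j in nbr[i])
                 for s in (0, 1)]
            # conditional P(x_i = 1 | rest) from the energy difference
            p1 = math.exp(-e[1]) / (math.exp(-e[0]) + math.exp(-e[1]))
            x[i] = 1 if rng.random() < p1 else 0
        for i in nodes:                 # accumulate the marginal estimate
            counts[i] += x[i]
    return {i: counts[i] / iters for i in nodes}   # ≈ P(x_i = 1 | y)
```

On a three-node chain with strong fire evidence at one end and strong normal evidence at the other, the estimated marginals separate cleanly.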
Fire area positioning: according to the posterior probability P(xi = 1|y) estimated by Gibbs sampling, a threshold λ ∈ (0, 1) is set to judge whether a fire occurs at node i: if P(xi = 1|y) > λ, node i is judged to be on fire, otherwise it is judged normal. The position coordinates (xi, yi, zi) of all fire nodes are added to a set S, which is the estimated fire area. The Zigbee short address of each node in the fire node set S is obtained by querying a position-address mapping table. A fire confirmation frame is sent to each fire node, requesting it to reply with an acknowledgment; nodes that do not respond are removed from S. The polling node list is then updated: the node addresses in S are deleted from the regular polling list and added to a fire polling list. According to the size of the fire area, the query frequency of the nodes in the fire polling list is dynamically adjusted to realize focused monitoring.
Particle filtering algorithm. Particle initialization: N particles are randomly scattered in the fire area according to its size and shape, and each particle contains the following information: particle number i (1 ≤ i ≤ N), fire position coordinates pi = [xi, yi, zi], prior weight wi = 1/N, and node observations Di = {d1i, d2i, ..., dni}, where dji comprises the smoke concentration and infrared temperature observations of node j under particle i. Particle weight update: for each particle i, the smoke concentration sji and infrared temperature tji at each node j are estimated from its fire position pi using a Gaussian smoke diffusion model and a radiation transfer model, giving the estimated observation data Di'. The estimated observation data Di' is compared with the actual observation data Di, and the likelihood probability is calculated as Pi = exp(−Σj [(sji' − sji)² / (2σs²) + (tji' − tji)² / (2σt²)]), where σs² and σt² are the observation noise variances of smoke concentration and temperature. Multiplying the likelihood probability by the prior weight gives the unnormalized posterior weight of particle i, wi' = Pi × wi; the posterior weights of all particles are then normalized: wi = wi' / Σj wj'.
Particle resampling: according to the normalized posterior weights wi, N new particles are resampled from the N particles by the roulette-wheel method. Particles with higher posterior weights are sampled multiple times, while particles with lower weights may be eliminated, so that the particle samples concentrate in regions of higher posterior probability. After resampling, the weight of every particle is set to 1/N, giving the resampled particle set. Iterative update: the particle set is updated repeatedly until one of the following stopping conditions is met: the particle distribution has converged, i.e. the largest posterior weight exceeds a threshold (e.g. 0.9), or the number of iterations reaches a preset maximum (e.g. 100). Fire positioning: the fire position coordinates of the converged particle set are averaged, weighted by the posterior weights, to obtain the maximum a posteriori estimate of the fire position, p = Σi wi·pi, and p is transmitted as the final positioning coordinates of the fire area to the navigation module of the fire extinguishing robot.
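The resampling and estimation steps can be sketched as follows (a minimal sketch: the roulette-wheel draw is a plain multinomial resampler, and the final estimate is the posterior-weighted mean described above):

```python
import random

def roulette_resample(particles, weights, seed=0):
    """Roulette-wheel (multinomial) resampling: each new particle is drawn
    with probability proportional to its normalised posterior weight."""
    rng = random.Random(seed)
    n = len(particles)
    cum, total = [], 0.0
    for w in weights:                      # cumulative weight wheel
        total += w
        cum.append(total)
    out = []
    for _ in range(n):
        r = rng.random() * total
        k = next(i for i, c in enumerate(cum) if c >= r)
        out.append(particles[k])
    return out, [1.0 / n] * n              # weights reset to 1/N

def map_estimate(particles, weights):
    """Posterior-weighted mean of the particle fire positions."""
    return tuple(sum(w * p[d] for p, w in zip(particles, weights))
                 for d in range(3))
```

High-weight particles are duplicated and zero-weight particles vanish, concentrating the sample where the posterior mass is.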
After the fire extinguishing robot receives the positioning coordinates [x, y, z] of the fire area, they are set as the target point p. The robot chassis is rotated so that the on-board lidar faces the direction from the robot toward p. The laser radar then performs a 360° rotary scan with scanning frequency fs (e.g. 10 Hz) and angular resolution Δθ (e.g. 0.1°). Each time the laser radar completes a scan, the acquired scan point set Di = {p1, p2, ..., pn} is sent to the point cloud processing module. The point cloud processing module receives the scan point set Di, extracts the three-dimensional coordinates [xj, yj, zj] and reflection intensity value ij of each scan point pj, and constructs the raw point cloud data Pi = {[x1, y1, z1, i1], [x2, y2, z2, i2], ..., [xn, yn, zn, in]}; the raw point cloud data Pi from each scan is accumulated to obtain the complete environment point cloud data P, where the number of scans is m = 360°/Δθ. Point cloud downsampling: voxel-grid downsampling is performed on the raw point cloud P to reduce the data volume. The three-dimensional space is divided into a grid of cubes of size d × d × d, d being the voxel side length (e.g. 0.1 m). Each point pj in the point cloud P is traversed and the voxel index [i, j, k] = [⌊xj/d⌋, ⌊yj/d⌋, ⌊zj/d⌋] to which it belongs is calculated. For each non-empty voxel, the coordinate average of all points within the voxel is taken as the representative point of that voxel.
The downsampled point cloud data Qd = {q1, q2, ..., ql} is constructed from all voxel representative points, where l is the number of non-empty voxels and l ≤ n. After downsampling, the point cloud density is greatly reduced and the data size shrinks. Ground point removal: plane fitting is performed on the downsampled point cloud Qd using the RANSAC algorithm to extract the ground point cloud Qg. Three points in Qd are randomly selected and the plane equation they define, ax + by + cz + d = 0, is calculated; the distances from all points in Qd to this plane are computed, and the number ni of points with distance less than a threshold t (e.g. 0.05 m) is counted. This is repeated several times (e.g. 100), and the plane with the largest ni is taken as the ground plane. Points in Qd less than t from the ground plane are marked as ground points, forming the ground point cloud Qg. The ground points are removed from Qd to obtain the non-ground point cloud Q = Qd − Qg, completing the acquisition and preprocessing of the laser radar data and yielding a non-ground point cloud Q representing the environmental obstacles, in preparation for subsequent positioning and mapping. This preprocessing stage combines voxel-grid rasterization with RANSAC plane fitting, which reduces the point cloud density, effectively filters out a large number of redundant ground points, reduces the data volume and highlights the geometric features of the obstacles.
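The voxel downsampling and the ground-point rejection step can be sketched as follows; for brevity, the RANSAC plane search itself is assumed already done, so `remove_ground` only applies the distance test against a given plane:

```python
from collections import defaultdict
from math import floor

def voxel_downsample(points, d=0.1):
    """Voxel-grid downsampling: bucket points into d x d x d cubes and
    replace each non-empty voxel by the centroid of its points."""
    voxels = defaultdict(list)
    for p in points:
        key = (floor(p[0] / d), floor(p[1] / d), floor(p[2] / d))
        voxels[key].append(p)
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in voxels.values()]

def remove_ground(points, plane, t=0.05):
    """Drop points within distance t of the ground plane
    a*x + b*y + c*z + d = 0 (plane assumed found by RANSAC)."""
    a, b, c, d = plane
    norm = (a * a + b * b + c * c) ** 0.5
    return [p for p in points
            if abs(a * p[0] + b * p[1] + c * p[2] + d) / norm >= t]
```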
Simultaneous localization and map construction: the pose (position and orientation) of the fire extinguishing robot is initialized to x0 = [0, 0, 0] and set as the origin of the coordinate system. From the robot's odometer data, a prior estimate xt' of the robot pose xt at time t is predicted, and an odometer factor based on the motion model is established: fo(xt−1, xt) = ‖xt ⊖ (xt−1 ⊕ ut)‖²Σ, where ut is the robot control input from t−1 to t, ⊕ and ⊖ are the composition operators on the Lie group SE(3), and ‖·‖Σ denotes the Mahalanobis distance. The preprocessed point cloud Q is coordinate-transformed to obtain a local point cloud map Lt = T(xt)·Q with xt as the reference frame, where T(·) is the coordinate transformation matrix on SE(3). Corner and plane point features in Lt are extracted, and a closed-loop detection algorithm based on appearance and geometric information judges whether the current pose xt has returned to a previous position xj.
If a closed loop is detected, a closed-loop factor is added to the factor graph: fc(xt, xj) = ‖T(xt)⁻¹T(xj) − ΔT‖²Σ, where ΔT is the relative pose transformation at the closed loop. Based on the local map Lt, stable corner and plane point features are extracted to construct the local map factor fl(xt, Lt) = Σk ‖T(xt)·pk − qk‖²Σ, where pk is a feature point in Lt and qk is the corresponding global map point. The odometer factor, closed-loop factor and local map factor are integrated to construct the factor graph optimization problem X* = argminX Σi ‖ri(X)‖²Σi, and solving this nonlinear least squares problem yields the optimal estimates of the robot poses and feature points. According to the optimized robot pose, the local point cloud map Lt is transformed into the global coordinate system and added to the global map, realizing online map construction.
Path planning with the A* search algorithm. Map rasterization: the global point cloud map is converted into a two-dimensional occupancy grid map to facilitate path searching. The xy-plane is divided into square grids of size d × d, d being the grid side length (e.g. 0.2 m). Each point pi in the global point cloud map is traversed and its grid index [m, n] = [⌊xi/d⌋, ⌊yi/d⌋] is calculated. The number of points in each grid cell is counted: cells with more points than a threshold to (e.g. 5) are marked occupied, cells with fewer points than tf (e.g. 1) are marked free, and the remaining cells are unknown. The value of an occupied cell is set to 1, a free cell to 0 and an unknown cell to 0.5, constructing the two-dimensional occupancy grid map M.
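The rasterization rule can be sketched directly; the grid extent parameter is an assumption for the sketch (in practice it would be derived from the map bounds):

```python
from collections import Counter
from math import floor

def rasterize(points, d=0.2, t_o=5, t_f=1, extent=(3, 3)):
    """Project a point cloud onto the xy-plane and build an occupancy grid:
    cells with > t_o points are occupied (1), < t_f free (0), else unknown (0.5)."""
    counts = Counter((floor(x / d), floor(y / d)) for x, y, *_ in points)
    rows, cols = extent
    grid = [[0.5] * cols for _ in range(rows)]
    for m in range(rows):
        for n in range(cols):
            c = counts.get((n, m), 0)
            grid[m][n] = 1 if c > t_o else (0 if c < t_f else 0.5)
    return grid
```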
A* search: the grid cells corresponding to the robot's current position xt and the target point p are taken as the start point s and end point g, and A* search is executed on the occupancy grid map M. The start point s is added to the OPEN list, and the cost of s is initialized to 0. The following steps are performed in a loop until OPEN is empty or the end point g is found: the node n with the smallest cost is taken from OPEN and added to the CLOSED list. If n is the end point g, the loop exits and the search succeeds. Otherwise the 8 neighboring nodes {ni}, i = 1, ..., 8, of node n are expanded, and for each neighboring node ni its cost value f(ni) = g(ni) + h(ni) is calculated, where g(ni) is the actual cost from the start point s to ni and h(ni) is the estimated cost from ni to the end point g, here the Euclidean distance. If ni is in CLOSED it is skipped; if ni is not in OPEN it is added to OPEN with n as its parent; if ni is already in OPEN, its original cost value is compared with the newly calculated one, the smaller is kept and its parent node updated. If the search succeeds, backtracking from the end point g along the parent pointers generates the optimal path {p0, p1, ..., pk} from s to g, where p0 = s and pk = g.
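The search loop above can be sketched with a heap-backed OPEN list; the grid encoding (1 = occupied, anything else traversable) and the tie-breaking counter are implementation choices of the sketch:

```python
import heapq
import itertools
from math import hypot

def astar(grid, start, goal):
    """A* on an occupancy grid: 8-connected moves, Euclidean heuristic h,
    total cost f = g + h; returns the cell path from start to goal."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: hypot(p[0] - goal[0], p[1] - goal[1])
    tie = itertools.count()                     # heap tie-breaker
    open_q = [(h(start), next(tie), start)]
    g_cost = {start: 0.0}
    parent = {start: None}
    closed = set()
    while open_q:
        _, _, n = heapq.heappop(open_q)
        if n in closed:
            continue
        if n == goal:                           # backtrack along parents
            path = []
            while n is not None:
                path.append(n)
                n = parent[n]
            return path[::-1]
        closed.add(n)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nb = (n[0] + dx, n[1] + dy)
                if (dx, dy) == (0, 0) or nb in closed:
                    continue
                if not (0 <= nb[0] < rows and 0 <= nb[1] < cols):
                    continue
                if grid[nb[0]][nb[1]] == 1:     # occupied cell
                    continue
                ng = g_cost[n] + hypot(dx, dy)  # diagonal step costs sqrt(2)
                if ng < g_cost.get(nb, float("inf")):
                    g_cost[nb] = ng
                    parent[nb] = nb             # placeholder, set below
                    parent[nb] = n
                    heapq.heappush(open_q, (ng + h(nb), next(tie), nb))
    return None                                 # no path found
```

On a small grid with a wall, the returned path detours around the occupied cells and touches only free ones.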
Path optimization and motion control: the global path {pi} obtained by the A* search is locally optimized so that it satisfies the kinematic and dynamic constraints of the robot. The path nodes {pi} are interpolated with a parameterized cubic spline curve to generate a smooth reference trajectory r(t). Based on the reference trajectory r(t), a trajectory tracking control algorithm computes the robot speed v(t) and acceleration a(t) in real time, e.g. of the form v(t) = min(vmax, ‖ṙ(t)‖ + kv‖r(t) − p(t)‖) and a(t) = min(amax, ‖r̈(t)‖ + ka‖ṙ(t) − ṗ(t)‖), where vmax and amax are the maximum speed and acceleration of the robot, kv and ka are tracking-error attenuation coefficients, and p(t) is the real-time position of the robot. The computed speed and acceleration control commands are issued to the robot chassis to drive the robot along the reference trajectory. During motion, laser radar data acquisition and localization mapping continue to execute, updating the robot pose xt and the global point cloud map. When the robot reaches the vicinity of the target point p (e.g. within 0.5 m), it stops, and the current pose xt is sent to the fire extinguishing control module to request the start of the fire extinguishing operation.
Simultaneous localization and mapping algorithm; constructing the factor graph model: the robot pose at time t is denoted xt and the local point cloud map Lt. From the robot's motion model and the pose xt−1 at time t−1, an extended Kalman filter (EKF) predicts xt, giving the prior estimate x̂t. This prior is added to the factor graph as the lidar odometer factor fo(xt−1, xt), representing the robot motion constraint from t−1 to t: fo(xt−1, xt) = ‖xt ⊖ (xt−1 ⊕ ut)‖²Σ, where ut is the robot control input from t−1 to t, ⊕ is the composition operator on the SE(3) Lie group, and ‖·‖Σ represents the Mahalanobis distance. Corner and plane point features in Lt are extracted to construct the local map factor fl(xt, Lt) = Σk ‖T(xt)·pk − qk‖²Σ, where pk is a feature point in Lt, qk is the corresponding feature point in the global map, and T(·) is the coordinate transformation on SE(3). The local map factor fl(xt, Lt) is added to the factor graph, representing the constraint relationship between the robot pose and the local map.
Closed-loop detection: Lt is divided into cubic voxels of equal size, and in each voxel the most salient feature point is extracted as the voxel's representative point, giving the feature point set Pt of Lt. For each feature point pi in Pt, a geometric descriptor gi and a visual descriptor vi are calculated: gi = [xi, yi, zi, αi, βi, γi] and vi = ORB(pi), where xi, yi, zi are the three-dimensional coordinates of pi, αi, βi, γi are the direction angles of the normal vector of pi, and ORB(·) is the ORB feature extraction algorithm. A bag-of-words model is built from the descriptors of all feature points in Pt, yielding the bag-of-words representation vector wt of Lt. The similarity of wt to the bag-of-words representation vectors {wj} of the local maps at historical pose nodes is computed, e.g. the cosine similarity s(wt, wj) = wt·wj / (‖wt‖·‖wj‖). The historical pose node xm with the highest similarity above a threshold τ is selected, and it is judged that the robot's position at time t has returned to its position at time m, i.e. a closed loop is detected. The feature points in Lt and Lm are matched, and the relative pose transformation ΔTmt between xt and xm is calculated. The closed-loop factor fc(xt, xm) = ‖T(xt)⁻¹T(xm) − ΔTmt‖²Σc is added to the factor graph, where T(xt)⁻¹ is the inverse transform on SE(3) and Σc is the covariance matrix of the closed-loop error.
Solving the factor graph optimization problem by nonlinear least squares: all factors are expressed as residual blocks ri(xt, pk), e.g. rl(xt, pk) = T(xt)·pk − qk and rc(xt, xm) = T(xt)⁻¹T(xm) − ΔTmt. Taking the squared norms of the residual blocks as the objective function, the least squares problem is constructed as minX Σi ri(X)ᵀ Σi⁻¹ ri(X), where Σi is the information matrix of the i-th residual block, representing the uncertainty of that residual block.
LM algorithm solution: the estimates of the robot pose xt and feature points pk are initialized and denoted x̂t⁽⁰⁾ and p̂k⁽⁰⁾. The residual vector r and the Jacobian matrix J are computed at the current estimate. The following steps are performed iteratively until the maximum number of iterations is reached or a convergence condition is met. The incremental equation (JᵀΣ⁻¹J + μI)Δx = −JᵀΣ⁻¹r is solved for the increment Δx, where μ is the damping factor of the LM algorithm and I is the identity matrix. The estimate is updated as x̂⁽ⁱ⁺¹⁾ = x̂⁽ⁱ⁾ ⊕ exp(Δx), where exp(·) is the exponential map on the Lie algebra. The residual r⁽ⁱ⁺¹⁾ is computed at the new estimate, and the damping factor μ is adjusted according to the change of the residual: if ‖r⁽ⁱ⁺¹⁾‖ < ‖r⁽ⁱ⁾‖, μ is decreased so that the LM algorithm behaves more like the Gauss-Newton method; otherwise μ is increased so that it behaves more like gradient descent. If ‖Δx‖ < ε or ‖r⁽ⁱ⁺¹⁾‖ < ε, the algorithm is considered converged and the iteration exits; otherwise the next iteration continues.
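The LM iteration (damped normal equations, accept/reject step, adaptive μ) can be sketched on a toy two-parameter curve fit; the model y = a·exp(b·x) and all constants are illustrative assumptions, standing in for the much larger pose-and-landmark problem of the text:

```python
import math

def lm_fit(xs, ys, a=1.0, b=0.0, mu=1e-2, iters=100, eps=1e-10):
    """Levenberg-Marquardt on a toy model y = a*exp(b*x):
    solve (J^T J + mu*I) dx = -J^T r each step and adapt the damping mu."""
    def residuals(a, b):
        return [a * math.exp(b * x) - y for x, y in zip(xs, ys)]
    cost = lambda r: sum(v * v for v in r)
    r = residuals(a, b)
    for _ in range(iters):
        # Jacobian rows: (dr/da, dr/db) for each sample
        J = [(math.exp(b * x), a * x * math.exp(b * x)) for x in xs]
        # damped normal equations (J^T J + mu*I) dx = -J^T r, solved 2x2
        g00 = sum(j[0] * j[0] for j in J) + mu
        g01 = sum(j[0] * j[1] for j in J)
        g11 = sum(j[1] * j[1] for j in J) + mu
        b0 = -sum(j[0] * v for j, v in zip(J, r))
        b1 = -sum(j[1] * v for j, v in zip(J, r))
        det = g00 * g11 - g01 * g01
        da = (b0 * g11 - b1 * g01) / det
        db = (g00 * b1 - g01 * b0) / det
        r_new = residuals(a + da, b + db)
        if cost(r_new) < cost(r):     # accept: behave more like Gauss-Newton
            a, b, r, mu = a + da, b + db, r_new, mu * 0.5
        else:                         # reject: behave more like gradient descent
            mu *= 2.0
        if da * da + db * db < eps:   # convergence on the increment norm
            break
    return a, b
```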
Map update: the robot pose x̂t obtained at the last iteration is taken as the optimal estimate. From x̂t, the robot pose in the global coordinate system is computed, where R(·) and t(·) denote the rotation matrix and translation vector in SE(3), respectively. All points Pt = {pi} in Lt are transformed into the global coordinate system, the transformed point cloud Pt' is fused into the global map, and the octree and voxel occupancy of the map are updated. The optimized robot pose x̂t and the updated global map are output. By constructing the factor graph optimization problem, the various constraints are integrated into one least squares problem solved iteratively with the LM algorithm, which guarantees convergence and improves optimization efficiency. The LM algorithm adaptively interpolates between the Gauss-Newton method and gradient descent by introducing the damping factor, overcoming the shortcomings of both, and is an effective nonlinear optimization algorithm. In each iteration, the local map is coordinate-transformed and fused according to the optimized robot pose, so the map is updated in real time.
Real-time positioning, namely obtaining the optimal pose estimate x̂_t of the robot at time t through the factor graph optimization. x̂_t is taken as the real-time pose of the robot in the global coordinate system, comprising the position coordinates (x, y, z) and the attitude angles (roll, pitch, yaw). The robot's global pose is published to other modules, such as path planning and control, for their real-time use. Global point cloud map construction, namely, from the optimized robot pose x̂_t, constructing the 4×4 homogeneous transformation matrix T = [R t; 0 1], where R is the rotation matrix and t the translation vector of the pose.
Coordinate transformation is applied to each point p_i = (x', y', z') in the local point cloud map L_t to obtain the point p_i' = (x'', y'', z'') in the global coordinate system: p_i' = T × p_i, and the transformed point cloud {p_i'} is spliced into the global point cloud map G. Voxel downsampling is performed on G with voxel side length d (e.g., 0.1 m), merging the points inside each d × d × d voxel into a single point to reduce the point cloud data volume. The downsampled point cloud is smoothed with the moving least squares method to remove noise and outliers, namely: for each point p_i, a neighbor set {q_j} within radius r (e.g., 0.5 m) is selected; the local tangent plane of {q_j} is fitted by least squares to obtain the normal vector n_i at p_i; and p_i is projected along the direction of n_i to yield the smoothed point p_i'. The smoothed global point cloud map G serves as the three-dimensional model of the building.
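The global-frame transformation p' = T × p and the voxel merge above can be sketched in a few lines of NumPy; this is a pure-Python stand-in for a point cloud library's voxel grid filter, with illustrative data, merging each voxel's points into their centroid.

```python
import numpy as np

def transform_points(T, points):
    """Apply a 4x4 homogeneous transform T to an (N, 3) local point cloud,
    as in p' = T x p when splicing local maps into the global frame."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ T.T)[:, :3]

def voxel_downsample(points, d=0.1):
    """Merge all points falling inside the same d x d x d voxel into one
    point (their centroid), reducing the point cloud data volume."""
    keys = np.floor(points / d).astype(int)
    merged = {}
    for key, p in zip(map(tuple, keys), points):
        merged.setdefault(key, []).append(p)
    return np.array([np.mean(ps, axis=0) for ps in merged.values()])

# A pure translation of (1, 0, 0) applied to two points that share one voxel:
T = np.eye(4)
T[0, 3] = 1.0
pts = transform_points(T, np.array([[0.0, 0.0, 0.0], [0.02, 0.0, 0.0]]))
down = voxel_downsample(pts, d=0.1)   # both points land in the same 0.1 m voxel
```

Centroid merging is the simplest choice; a production system would typically use an octree-backed voxel grid as described in the map-updating step.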
Path planning, namely projecting the global point cloud map G onto the xy plane to obtain a two-dimensional occupancy grid map M. A cost map C is constructed with the same size and resolution as M and an initial value of 0, taking the robot's current position as the start point s and the fire-area localization information as the goal point g. For each grid cell c_i in C: if c_i is an occupied cell, C_i = ∞; if c_i is the start s or the goal g, C_i = 0; otherwise C_i = d(c_i, s) + d(c_i, g), where d(·) is the Euclidean distance. On the cost map C, the optimal path from s to g is planned with the A* search algorithm, namely: the OPEN list is initialized and the CLOSED list is empty; s is added to OPEN with f(s) = 0; the step of taking the cell n with the minimum f value out of OPEN and adding it to CLOSED is executed repeatedly until OPEN is empty or g is reached. If n is g, planning succeeds and the search terminates. The 8-neighborhood cells {n_i} of n are expanded, and g(n_i) = g(n) + d(n, n_i) and h(n_i) = d(n_i, g) are computed. For each n_i: if n_i is in CLOSED, it is skipped; if n_i is not in OPEN, it is added to OPEN with f(n_i) = g(n_i) + h(n_i); if n_i is already in OPEN and the new g(n_i) is smaller than the stored value, f(n_i) and the parent pointer are updated. If planning succeeds, the parent pointers are backtracked from g to obtain the optimal path {p_0, p_1, ..., p_k} from s to g. {p_i} is converted into waypoints in the robot's odometry coordinate system as the target path for navigation.
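The OPEN/CLOSED procedure above can be sketched with a heap-backed A* on a small occupancy grid; the 8-neighborhood expansion, Euclidean heuristic, and parent-pointer backtracking follow the text, while the grid data is illustrative.

```python
import heapq
import math
from itertools import count

def astar(grid, start, goal):
    """A* search on an occupancy grid (1 = occupied), 8-neighborhood,
    Euclidean heuristic, as in the OPEN/CLOSED procedure above."""
    h = lambda a: math.dist(a, goal)
    tie = count()                      # tiebreaker so heap never compares nodes
    open_heap = [(h(start), 0.0, next(tie), start, None)]
    parents, closed = {}, set()
    while open_heap:
        f, g, _, n, parent = heapq.heappop(open_heap)
        if n in closed:
            continue
        closed.add(n)
        parents[n] = parent
        if n == goal:                  # planning succeeded: backtrack parent pointers
            path = []
            while n is not None:
                path.append(n)
                n = parents[n]
            return path[::-1]
        for dx in (-1, 0, 1):          # expand the 8-neighborhood of n
            for dy in (-1, 0, 1):
                ni = (n[0] + dx, n[1] + dy)
                if (dx, dy) == (0, 0) or ni in closed:
                    continue
                if not (0 <= ni[0] < len(grid) and 0 <= ni[1] < len(grid[0])):
                    continue
                if grid[ni[0]][ni[1]] == 1:   # occupied cell, cost is infinite
                    continue
                gi = g + math.dist(n, ni)
                heapq.heappush(open_heap, (gi + h(ni), gi, next(tie), ni, n))
    return None

grid = [[0] * 5 for _ in range(5)]
grid[2][1] = grid[2][2] = grid[2][3] = 1     # a wall with gaps at both edges
path = astar(grid, (0, 0), (4, 4))
```

Pushing duplicate heap entries and skipping already-closed nodes on pop is a common lazy-deletion alternative to updating f-values in place.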
During the fire-extinguishing operation, the robot controls its chassis motors to move along the navigation path obtained from the path planning module toward the fire area. While moving, it localizes itself in real time with sensors such as the odometer and lidar, compares the result with the navigation path, and computes the heading deviation and lateral deviation. Based on these deviations, the robot's linear and angular velocities are adjusted using methods such as PID control or model predictive control to achieve path-tracking control. After reaching the fire area, the robot stops moving and enters fire-extinguishing operation mode.
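A minimal sketch of the PID correction loop mentioned above, driving a simulated heading error toward zero; the gains and the single-axis plant model are illustrative, not values from the text.

```python
class PID:
    """Minimal PID loop for the path-tracking step above: the heading error
    between the robot and the navigation path drives the angular-velocity
    command. Gains are illustrative."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Drive a simulated heading error (0.5 rad) toward the path heading (0 rad):
pid = PID(kp=1.5, ki=0.2, kd=0.3)
heading = 0.5                            # initial heading deviation in rad
for _ in range(300):                     # 30 s of control at dt = 0.1 s
    w = pid.update(-heading, dt=0.1)     # angular-velocity command
    heading += w * 0.1                   # simple integrator plant
```

The same structure applies to the lateral deviation; a real controller would run both corrections and saturate the commanded velocities.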
Fire evaluation and extinguishing-strategy selection, namely acquiring multi-source data of the fire area through the onboard flame sensor, thermal infrared imager, visible-light camera, and other sensors. The acquired data is fused, and fire characteristics such as flame area, temperature distribution, and smoke concentration are extracted. Based on the fire characteristics, the severity and type of the fire (e.g., Class A, Class B, Class C fires) are evaluated. In combination with environmental conditions (e.g., space size, ventilation conditions, combustible distribution), appropriate fire-extinguishing media and strategies are selected. The kind of extinguishing medium, the spraying parameters (such as pressure, flow rate, and spraying time), and the extinguishing path are preset for different fire types and severities.
Autonomous fire-extinguishing operation, namely controlling the fire-extinguishing device according to the selected strategy and spraying extinguishing medium at the ignition point and the fire-spread area. Variable-frequency speed regulation automatically adjusts the pressure and flow of the fire pump so that the spraying of the extinguishing medium achieves the best effect. The robot can autonomously plan an extinguishing path, such as a spiral or Z-shaped sweep, according to the fire distribution and environmental characteristics, achieving full-coverage spraying of the fire scene. During extinguishing, residual fire is monitored in real time by the flame sensor and the extinguishing effect is evaluated. If residual fire is found, the robot automatically adjusts the extinguishing path and spraying parameters and applies supplementary spraying to the residual fire until the fire is completely extinguished.
For a large fire scene, multiple fire-extinguishing robots can be deployed to work collaboratively, improving extinguishing efficiency. The robots form a distributed sensing network through wireless communication modules (such as 5G, Wi-Fi, or LoRa) and exchange fire and environment information. A master control node in the network aggregates the detection data of all robots, generates a global fire map, and distributes it to each robot. Based on the global fire map, the robots autonomously allocate extinguishing areas and tasks and coordinate their actions, avoiding repeated spraying and blind-spot omissions. During extinguishing, the robots share extinguishing progress and residual-fire information in real time and dynamically optimize the extinguishing strategy, achieving efficient coordination.
The extinguishing medium is selected for different fire types as follows. a. Class A fires (ordinary solid-material fires): mainly water or water-based extinguishing agents. b. Class B fires (flammable liquid fires): foam agents, dry powder agents, or haloalkane agents (such as heptafluoropropane). c. Class C fires (electrical fires): the power supply is cut off first, then carbon dioxide, haloalkane, or dry powder agents are used. According to the size and structural characteristics of the fire area, appropriate extinguishing-medium reserves and spraying modes are selected. In a narrow space, small-sized, short-range water mist or aerosol agents are preferred. In an open area, a high-flow water cannon or foam cannon can be used for long-distance, wide-range spraying.
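The media-selection rules above amount to a lookup plus an override for confined spaces; a hypothetical sketch (table contents mirror the text, the function and its signature are assumptions):

```python
# Hypothetical lookup reflecting the media-selection rules above; the class
# names and agent lists mirror the text, everything else is illustrative.
EXTINGUISHING_MEDIA = {
    "A": ["water", "water-based agent"],
    "B": ["foam", "dry powder", "heptafluoropropane"],
    "C": ["carbon dioxide", "haloalkane", "dry powder"],  # after cutting power
}

def select_media(fire_class, confined_space):
    """Pick candidate agents for the detected fire class; in narrow spaces
    prefer water mist / aerosol for Class A fires, per the strategy above."""
    agents = EXTINGUISHING_MEDIA.get(fire_class.upper(), [])
    if confined_space and fire_class.upper() == "A":
        return ["water mist", "aerosol"]
    return agents

chosen = select_media("B", confined_space=False)
```

In practice the spraying parameters (pressure, flow, duration) would be stored alongside each agent rather than just the agent name.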
Example 2
The fire detection network and the fire-extinguishing robot are deployed in a 100 m × 100 m warehouse area with the following parameters. Distributed fire detection network: 25 smoke sensors at 20 m spacing, sampling frequency 1 Hz, measuring range 0-10%/ft; 25 infrared cameras at 20 m spacing, sampling frequency 0.5 Hz, wavelength 8-14 μm; Zigbee communication at 2.4 GHz, transmit power 20 dBm, communication range 150 m. Sensor-node location map example: node 1: (-40, -40) ↔ Addr: 0x8e5f; node 2: (-20, -40) ↔ Addr: 0x3a2c. Robot patrol period: 10 s.
Multi-source information fusion and fire localization: the probability of an initial fire detected at (20, 30) is 0.95, exceeding the Bayesian inference threshold of 0.8 and triggering a fire confirmation request. The request is sent to the three adjacent nodes at (-20, 20), (0, 20), and (20, 20); the smoke-concentration observations returned are node 1 (-20, 20): 1.6%, node 2 (0, 20): 2.1%, node 3 (20, 20): 3.5%. From the observations, the parameters of the Markov random field energy function are estimated. Local evidence terms: μ1 = 1.6%, μ2 = 2.1%, μ3 = 3.5%; Σ1 = 0.64, Σ2 = 0.81, Σ3 = 1.44. Spatial correlation term: β = 1.5, d12 = 20, d13 = 40, d23 = 20. The Markov random field energy function is constructed as E(x) = -Σ_i log N(f(x_i) | μ_i, Σ_i) - β Σ_{i<j} exp(-|x_i - x_j| / d_ij), where x is the hidden fire-probability variable, f(·) is the observation equation, and |·| is the Euclidean distance.
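Plugging the example parameters into the energy function above can be sketched directly; the observation equation f(·) is abstracted here by evaluating the node readings under each node's Gaussian, and the helper names are illustrative.

```python
import math

def gauss_logpdf(y, mu, var):
    """Log of the 1-D Gaussian local-evidence term N(. | mu_i, Sigma_i)."""
    return -0.5 * math.log(2 * math.pi * var) - (y - mu) ** 2 / (2 * var)

def mrf_energy(x, obs, mu, var, beta, dist):
    """Energy from the section above: negative log local evidence minus the
    spatial-correlation term over node pairs. x holds per-node fire
    probabilities; obs/mu/var hold the smoke readings and Gaussian
    parameters; dist[(i, j)] are the inter-node spacings d_ij."""
    local = -sum(gauss_logpdf(obs[i], mu[i], var[i]) for i in range(len(x)))
    pairwise = -beta * sum(math.exp(-abs(x[i] - x[j]) / dist[(i, j)])
                           for (i, j) in dist)
    return local + pairwise

# Parameters from the example: three confirming nodes.
mu = [1.6, 2.1, 3.5]
var = [0.64, 0.81, 1.44]
obs = [1.6, 2.1, 3.5]
dist = {(0, 1): 20, (0, 2): 40, (1, 2): 20}
e = mrf_energy([0.2, 0.4, 0.9], obs, mu, var, beta=1.5, dist=dist)
```

Note that the correlation term rewards agreement: an assignment where neighboring fire probabilities are equal has strictly lower energy than one where they are spread apart.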
The energy function E(x) is minimized with a particle filtering algorithm to estimate the posterior distribution of the fire probability, namely: 500 particles are initialized, randomly distributed over a 40 m × 40 m area around the three nodes, and the following steps are iterated 50 times. Prediction: the particle set {x^(i)_{t-1}} from the previous step is propagated to obtain {x̃^(i)_t}. Update: the weight of each particle x̃^(i)_t is computed as w^(i)_t = p(y_t | x̃^(i)_t). Resampling: if N_eff = 1 / Σ (w^(i)_t)² < 250, resampling is performed from {x̃^(i)_t} according to the weights w^(i)_t. Output: the estimate is x̂_t = Σ w^(i)_t · x̃^(i)_t. The iterative solution yields a converged particle distribution: the probability mass within ±5 m of node 1 is about 0.13, that around node 2 about 0.35, and the largest mass concentrates around node 3; the weighted mean of this dominant particle subset is taken as the final estimate of the fire location.
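A self-contained sketch of this sequential importance resampling loop, using the three confirming nodes and their readings from the example; the likelihood model (smoke concentration decaying exponentially with distance to the fire) and all constants in it are assumptions for illustration only.

```python
import math
import random

def particle_filter(nodes, readings, n_particles=500, n_iter=50, seed=0):
    """SIR particle filter as in the steps above: predict (random-walk jitter),
    weight by an assumed distance-decay likelihood, resample when
    N_eff < n_particles / 2, and output the weighted-mean fire position."""
    rng = random.Random(seed)
    # Init: uniform over a 40 m x 40 m region around the confirming nodes.
    particles = [(rng.uniform(-20, 20), rng.uniform(0, 40)) for _ in range(n_particles)]
    weights = [1.0 / n_particles] * n_particles
    for _ in range(n_iter):
        # Predict: small random-walk jitter.
        particles = [(x + rng.gauss(0, 0.5), y + rng.gauss(0, 0.5)) for x, y in particles]
        # Update: weight each particle by agreement with the node readings.
        new_w = []
        for (px, py), w in zip(particles, weights):
            ll = 0.0
            for (nx, ny), c in zip(nodes, readings):
                pred = 5.0 * math.exp(-math.hypot(px - nx, py - ny) / 20.0)
                ll += -(c - pred) ** 2
            new_w.append(w * math.exp(ll))
        total = sum(new_w)
        if total == 0.0:
            weights = [1.0 / n_particles] * n_particles
        else:
            weights = [w / total for w in new_w]
        # Resample if the effective sample size collapses.
        if 1.0 / sum(w * w for w in weights) < n_particles / 2:
            particles = rng.choices(particles, weights=weights, k=n_particles)
            weights = [1.0 / n_particles] * n_particles
    ex = sum(w * p[0] for w, p in zip(weights, particles))
    ey = sum(w * p[1] for w, p in zip(weights, particles))
    return ex, ey

nodes = [(-20, 20), (0, 20), (20, 20)]
readings = [1.6, 2.1, 3.5]          # node smoke concentrations from the example
est = particle_filter(nodes, readings)
```

Because node 3 reports the highest concentration, the posterior mass concentrates on its side of the area, matching the behavior described in the text.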
SLAM positioning and navigation, laser radar data acquisition: the lidar is a Velodyne VLP-16, with 16 lines in the vertical direction and 360° horizontal scanning at a scanning frequency of 40 Hz, i.e., 40 frames of point cloud data per second. Each frame of point cloud comprises 57,600 measurement points; with an angular resolution of 0.25°, each horizontal scan covers 360° / 0.25° = 1,440 points. The robot moves at 0.5 m/s with a data acquisition frequency of 2 Hz, i.e., one frame of data is recorded every 0.5 m; the total trajectory length is 45 m, so 45 m / (0.5 m/frame) = 90 frames of laser data are recorded.
The point cloud data preprocessing comprises: applying voxel grid filtering to each frame of point cloud with a voxel leaf size of 5 cm, reducing each frame to 1/8 of its original size, about 7,200 points; and performing plane fitting with the RANSAC algorithm, with the point-to-plane distance threshold set to 10 cm. Assuming 25% of the points in each frame are ground points, 50 iterations are run to fit the ground well. After the ground points are removed, each frame retains only vertical features such as walls and columns, about 5,400 points.
The factor graph construction comprises: adding a keyframe node every 5 frames, i.e., every 2.5 m, so the 45 m trajectory produces 18 keyframes in total; computing the relative pose transformation between keyframes with the laser odometer as the observation of the odometry factor, the laser odometer measurement noise being assumed to be 5 cm in translation and 0.02 rad in rotation for constructing the covariance matrix; finding loop closures at keyframes 5, 10, and 15 through loop detection, each forming a loop-closure factor with an earlier keyframe (keyframes 1, 4, and 7, respectively); and computing the loop-closure observations by ICP registration, with the registration distance threshold set to 20 cm and the maximum number of iterations set to 100. After ICP registration, the average pose-transformation error at the loop closures is about 8 cm in translation and 0.05 rad in rotation. Data examples: frame 1 point cloud (x, y, z): (-2.1, 0.3, 1.2), (-1.8, 0.5, 0.9), ..., (3.7, -2.4, -0.6). Frame 5 point cloud (x, y, z): (1.2, -0.8, 1.5), (0.9, -1.1, 1.8), .... Frame 1-5 odometry factors (dx, dy, dθ): (2.3, 0.1, 0.05), (2.4, -0.2, -0.02), (2.5, 0.3, 0.11), (2.2, 0.0, -0.07). Frame 5 to frame 1 ICP loop-closure factor (dx, dy, dθ): (9.8, -1.4, -0.12).
The loop detection comprises: extracting 200 SHOT feature points from each of the 18 keyframe point clouds, each feature point carrying a 352-dimensional SHOT descriptor; for example, the SHOT feature set of frame 1 is {s1_1, s1_2, ..., s1_200}, of dimension 200 × 352. K-means clustering over the SHOT features of all keyframes generates 100 visual words, and the feature points of each keyframe are quantized into a 100-dimensional bag-of-words vector giving the number of feature points assigned to each visual word; for example, the bag-of-words vector of frame 1 is {15, 8, 3, ...}.
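The quantization-and-compare step above can be sketched as follows: descriptors are assigned to their nearest visual word, the counts form the bag-of-words vector, and revisited places are detected by vector similarity. The random vocabulary and descriptors are stand-ins (a real system would learn the words by K-means over SHOT descriptors).

```python
import numpy as np

def bow_vector(descriptors, words):
    """Quantize each descriptor to its nearest visual word and count hits,
    producing the bag-of-words histogram used for loop detection above."""
    # Squared distances via (a^2 + b^2 - 2ab), avoiding a huge broadcast.
    d2 = ((descriptors ** 2).sum(1)[:, None]
          + (words ** 2).sum(1)[None, :]
          - 2.0 * descriptors @ words.T)
    hist = np.bincount(d2.argmin(axis=1), minlength=len(words))
    return hist.astype(float)

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
words = rng.normal(size=(100, 352))      # stand-in vocabulary: 100 words, SHOT dim 352
frame1 = rng.normal(size=(200, 352))     # 200 descriptors per keyframe
frame1_again = frame1 + rng.normal(scale=0.001, size=frame1.shape)  # revisited place
frame2 = rng.normal(size=(200, 352))     # unrelated keyframe
v1, v1b, v2 = (bow_vector(f, words) for f in (frame1, frame1_again, frame2))
loop = cosine_similarity(v1, v1b) > cosine_similarity(v1, v2)
```

The revisited frame quantizes to nearly the same histogram, so its similarity to the original frame dominates that of the unrelated frame; in the pipeline, a closure is declared when this similarity exceeds a threshold.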
Factor graph optimization, namely constructing a factor graph comprising 18 keyframe pose nodes, 17 odometry factors, and 3 loop-closure factors (5-1, 10-4, 15-7). The odometry-factor observations are the inter-frame relative poses with measurement covariance diag(0.05², 0.05², 0.02²); the loop-closure observations are obtained by ICP registration with measurement covariance set to diag(0.08², 0.08², 0.05²). The observation of the first loop-closure factor is (-9.85, 1.38, 0.14), representing the pose transformation of frame 5 relative to frame 1. Graph optimization is performed with g2o; the optimized variables are the 18 keyframe poses, for a total of 6 × 18 = 108 degrees of freedom.
The error function to be minimized is E(x) = Σ (z_odom - h_odom(x))ᵀ Σ_odom⁻¹ (z_odom - h_odom(x)) + Σ (z_loop - h_loop(x))ᵀ Σ_loop⁻¹ (z_loop - h_loop(x)), where x is the optimization variable, z an observation, h the observation equation, and Σ a measurement covariance; iterative optimization yields the maximum a posteriori estimate of each keyframe pose. The pose corrections before and after optimization are: frame 1: (-0.08, 0.12, -0.03); frame 2: (-0.15, 0.21, -0.06); ...; frame 18: (0.32, -0.45, 0.12).
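Evaluating this weighted least-squares objective for a planar pose graph can be sketched directly; the (x, y, θ) parameterization, the simple pose-difference observation equation h, and the toy data are simplifying assumptions (the full system optimizes 6-DoF poses in g2o).

```python
import numpy as np

def pose_graph_error(poses, odom, loops, sigma_odom, sigma_loop):
    """Evaluate the error function above: Mahalanobis-weighted residuals
    summed over odometry and loop-closure factors. Poses and observations
    are (x, y, theta); h is the simple pose difference, a planar sketch."""
    def factor_err(i, j, z, info):
        r = poses[j] - poses[i] - z                     # residual h(x) - z
        r[2] = (r[2] + np.pi) % (2 * np.pi) - np.pi     # wrap the angle residual
        return r @ info @ r
    info_o = np.linalg.inv(np.diag(sigma_odom) ** 2)    # Sigma^-1 for odometry
    info_l = np.linalg.inv(np.diag(sigma_loop) ** 2)    # Sigma^-1 for loop closures
    e = sum(factor_err(i, i + 1, z, info_o) for i, z in enumerate(odom))
    e += sum(factor_err(i, j, z, info_l) for (i, j, z) in loops)
    return e

# Three poses on a line, consistent odometry, one consistent loop closure:
poses = np.array([[0.0, 0.0, 0.0], [2.5, 0.0, 0.0], [5.0, 0.0, 0.0]])
odom = [np.array([2.5, 0.0, 0.0]), np.array([2.5, 0.0, 0.0])]
loops = [(0, 2, np.array([5.0, 0.0, 0.0]))]
e = pose_graph_error(poses, odom, loops, [0.05, 0.05, 0.02], [0.08, 0.08, 0.05])
```

A fully consistent graph evaluates to zero error; the optimizer's job is to move the poses until residuals like these are minimized.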
The global map construction comprises: transforming the laser point clouds of the 18 keyframes into the global coordinate system according to the optimized poses and splicing them into a unified dense point cloud map of about one million points; downsampling and voxel-grid filtering the map point cloud to obtain a sparse point cloud map at 10 cm resolution; and projecting the sparse point cloud onto the horizontal plane to generate a two-dimensional occupancy grid map, also at 10 cm resolution. Each cell in the occupancy grid map is assigned an occupancy probability, e.g., cell (25, 18): 0.92, cell (25, 19): 0.45, cell (26, 18): 0.68; cells with occupancy probability greater than 0.5 are treated as obstacles to avoid during navigation.
For path planning, assume the robot's current grid position is (1.5, 1.5) and the target fire position is (22.5, 27.5); the occupancy grid map is 50 × 50 at 10 cm resolution, i.e., an actual environment of 5 m × 5 m. The obstacles in the grid map are mainly distributed over the following areas: (10, 20)-(15, 40), a rectangular obstacle; (30, 5)-(35, 15), a rectangular obstacle; and (20, 25)-(25, 40), an L-shaped obstacle.
The path search is performed with the A* algorithm. Search process: OpenSet {(1, 1)}, CloseSet {}; current node (1, 1), f = 40.79, g = 0, h = 40.79; expand (1, 1) → {(1, 2), (2, 2), ...} and add to OpenSet; ...; current node (22, 27), f = 41.09, g = 41.09, h = 0: the target point is found and the search completes, taking 35 ms in total. The optimal path length is 41 steps, corresponding to an actual path length of 41 × 0.1 = 4.1 m. The optimal path obtained by the A* search is (1, 1) → (2, 2) → ... → (22, 26) → (22, 27).
The path optimization comprises: connecting the 41 path nodes obtained by the A* search in sequence, multiplying the node coordinates by 0.1 to convert them into a coordinate sequence in the actual environment, and smoothing the path with cubic Bezier curves, the control points being P0 (0.1, 0.1), P1 (1.2, 0.2), P2 (2.0, 1.8), ..., P40 (2.2, 2.6), P41 (2.2, 2.7). This generates 41 Bezier segments, which are spliced into a smooth navigation path with a total length of about 45 m.
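Sampling one cubic Bezier segment from four control points can be sketched as follows; the first and last control points from the example are used, with the interior points chosen for illustration.

```python
def cubic_bezier(p0, p1, p2, p3, n=10):
    """Sample n + 1 points on a cubic Bezier segment, as used to smooth
    the A* grid path into a drivable curve. p0..p3 are (x, y) control
    points; the curve interpolates p0 and p3 exactly."""
    pts = []
    for i in range(n + 1):
        t = i / n
        # Bernstein basis coefficients for the cubic curve.
        c0, c1, c2, c3 = (1 - t) ** 3, 3 * (1 - t) ** 2 * t, 3 * (1 - t) * t ** 2, t ** 3
        x = c0 * p0[0] + c1 * p1[0] + c2 * p2[0] + c3 * p3[0]
        y = c0 * p0[1] + c1 * p1[1] + c2 * p2[1] + c3 * p3[1]
        pts.append((x, y))
    return pts

# Smooth one segment between example control points:
seg = cubic_bezier((0.1, 0.1), (1.2, 0.2), (2.0, 1.8), (2.2, 2.7))
```

Splicing consecutive segments whose shared endpoints and tangents match yields the continuous navigation path described above.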
The smoothed path nodes are (0.1, 0.1), (0.15, 0.12), (0.22, 0.18), ..., (2.18, 2.68), (2.20, 2.70). The path execution comprises: discretizing the smoothed path into 100 local target points spaced about 0.45 m apart, which the robot tracks using the wheel odometer and a PID controller. With a robot linear velocity of 0.5 m/s and angular velocity of 0.8 rad/s, the PID parameters are Kp = 1.5, Ki = 0.2, Kd = 0.3. The local path-tracking error statistics are: maximum lateral error 0.15 m, maximum longitudinal error 0.22 m, average lateral error 0.08 m, average longitudinal error 0.12 m. The robot successfully avoids the intermediate obstacles and safely reaches the target fire area (2.25, 2.75) within 90 s.

Claims (6)

1.一种前端灭火机器人控制方法,包括:1. A front-end fire-fighting robot control method, comprising:在目标区域内设置多个烟雾传感器和红外摄像机,烟雾传感器和红外摄像机通过无线通信方式与灭火机器人连接,组成分布式火情探测网络;Multiple smoke sensors and infrared cameras are set up in the target area, and the smoke sensors and infrared cameras are connected to the fire-fighting robot through wireless communication to form a distributed fire detection network;当分布式火情探测网络通过火焰识别算法监测到火灾时,将发生火灾区域的定位信息发送给灭火机器人;When the distributed fire detection network detects a fire through the flame recognition algorithm, the location information of the fire area is sent to the fire-fighting robot;灭火机器人接收到火灾区域的定位信息后,利用搭载的激光雷达传感器采集目标区域内的环境信息,采用同步定位与地图构建SLAM算法,建立区域内建筑物的三维点云模型,利用构建的三维点云模型进行自身定位和导航路径规划;After receiving the positioning information of the fire area, the fire-fighting robot uses the laser radar sensor on board to collect environmental information in the target area, adopts the simultaneous positioning and map construction SLAM algorithm, builds a three-dimensional point cloud model of the buildings in the area, and uses the constructed three-dimensional point cloud model for its own positioning and navigation path planning;灭火机器人根据规划的导航路径,导航到发生火灾的区域,进行灭火;The fire-fighting robot navigates to the area where the fire occurs according to the planned navigation path and extinguishes the fire;组成分布式火情探测网络,包括:The distributed fire detection network consists of:在目标区域的不同位置设置多个烟雾传感器和红外摄像机;其中,烟雾传感器用于检测烟雾浓度,红外摄像机用于采集红外图像;A plurality of smoke sensors and infrared cameras are arranged at different locations in the target area; wherein the smoke sensors are used to detect smoke concentration, and the infrared cameras are used to collect infrared images;在每个烟雾传感器和红外摄像机上集成LoRa或 WIFI无线通信模块,将每个烟雾传感器和红外摄像机的位置信息在系统中绑定;Integrate LoRa or WIFI wireless communication modules on each smoke sensor and infrared camera to bind the location information of each smoke sensor and infrared camera in the system;系统平台接收各个传感器节点的数据并进行火情分析;The system platform receives data from each sensor node and 
performs fire analysis;当分布式火情探测网络通过火焰识别算法监测到火灾时,将发生火灾区域的定位信息发送给灭火机器人,包括:When the distributed fire detection network detects a fire through the flame recognition algorithm, it sends the location information of the fire area to the fire-fighting robot, including:系统平台将预处理后的各个传感器节点的烟雾浓度数据和红外图像数据进行多源信息融合,采用贝叶斯推理方法,建立火情概率模型;The system platform fuses the pre-processed smoke concentration data and infrared image data of each sensor node into multi-source information and uses the Bayesian reasoning method to establish a fire probability model.在火情概率模型中,将烟雾浓度数据和红外图像数据作为观测量,将火情发生的概率作为隐变量,通过贝叶斯推理计算在给定观测量的条件下火情发生的后验概率;In the fire probability model, smoke concentration data and infrared image data are used as observations, and the probability of fire occurrence is used as a latent variable. The posterior probability of fire occurrence under given observations is calculated through Bayesian inference.根据后验概率超过阈值的传感器节点,确定出现火情的区域,将确定区域内传感器节点的位置信息,作为火灾区域的初步定位信息;According to the sensor nodes whose posterior probability exceeds the threshold, the area where the fire occurs is determined, and the location information of the sensor nodes in the determined area is used as the preliminary positioning information of the fire area;将初步定位信息传送给灭火机器人,灭火机器人根据自身位置、环境数据确定行走的起始点及终点及导航路径;The preliminary positioning information is transmitted to the fire-fighting robot, which determines the starting and ending points of the walk and the navigation path based on its own position and environmental data;采用贝叶斯推理方法,建立火情概率模型,包括:The Bayesian reasoning method is used to establish a fire probability model, including:将每个传感器节点定义为马尔科夫随机场中的一个节点,节点的观测量为对应节点采集的烟雾浓度数据和红外图像数据;Each sensor node is defined as a node in the Markov random field, and the observations of the node are the smoke concentration data and infrared image data collected by the corresponding node;根据传感器节点在空间上的分布,确定每个节点的邻域节点集合,将节点的观测量与邻域节点的观测量进行关联,构建节点之间的空间相关性;According to the spatial distribution of sensor nodes, the set of neighboring nodes of each 
node is determined, the observation of the node is associated with the observation of the neighboring nodes, and the spatial correlation between the nodes is constructed;基于节点之间的空间相关性,建立马尔科夫随机场模型,将火情发生的状态定义为随机场的隐变量,将所有节点的观测量定义为随机场的观测变量;Based on the spatial correlation between nodes, a Markov random field model is established, the state of fire occurrence is defined as the hidden variable of the random field, and the observations of all nodes are defined as the observed variables of the random field.在马尔科夫随机场中,定义能量函数In Markov random fields, the energy function is defined as ;采用最小化能量函数,求解马尔科夫随机场中火情发生的最大后验概率估计;Using the minimized energy function , solve the maximum a posteriori probability estimate of fire occurrence in the Markov random field;在马尔科夫随机场中,定义能量函数In Markov random fields, the energy function is defined as : ;其中,x表示火情发生的状态;表示第i个节点的火情状态,y表示所有节点的观测量,表示第i个节点的观测量;Among them, x represents the state of fire occurrence; represents the fire status of the i-th node, y represents the observation value of all nodes, represents the observation value of the i-th node;表示第i个节点的局部证据函数,度量观测量与火情状态的相容性,计算公式为: Represents the local evidence function of the i-th node, measuring the observation Fire status The compatibility is calculated as follows: ;其中,分别表示第i个节点在火情状态xᵢ下的观测量均值和协方差矩阵;in, and They represent the mean and covariance matrix of the observations of the ith node under the fire state xᵢ;表示第i个节点和第j个节点之间的空间相关性函数,度量相邻节点火情状态的相关强度,计算公式为: It represents the spatial correlation function between the i-th node and the j-th node, and measures the correlation strength of the fire status of adjacent nodes. The calculation formula is: ;其中,表示火情状态之间的距离度量,为相关强度参数;in, Indicates fire status and The distance measure between is the correlation strength parameter;表示配分函数,计算公式为: represents the partition function, and the calculation formula is: .2.根据权利要求1所述的前端灭火机器人控制方法,其特征在于:2. 
The front-end fire-fighting robot control method according to claim 1, characterized in that:对初步定位信息进行校验和优化,得到火灾区域的最终定位信息,包括:Verify and optimize the preliminary positioning information to obtain the final positioning information of the fire area, including:采用粒子滤波算法对初步定位信息进行优化,根据火情区域内传感器节点的位置分布,初始化一组粒子样本,每个粒子包含火情位置、粒子先验权重和节点观测数据;The particle filter algorithm is used to optimize the preliminary positioning information. According to the location distribution of sensor nodes in the fire area, a group of particle samples are initialized. Each particle contains the fire location, particle prior weight and node observation data.根据每个粒子包含的火情位置,估计相应粒子处的烟雾浓度分布和红外辐射分布,将粒子处的估计分布与节点采集的观测数据进行比较,计算对应粒子的似然概率;将粒子的似然概率乘以相应的先验权重,得到粒子的后验权重;According to the fire location contained in each particle, the smoke concentration distribution and infrared radiation distribution at the corresponding particle are estimated, the estimated distribution at the particle is compared with the observation data collected by the node, and the likelihood probability of the corresponding particle is calculated; the likelihood probability of the particle is multiplied by the corresponding prior weight to obtain the posterior weight of the particle;根据粒子样本中各个粒子的后验权重,对粒子样本进行重采样,保留后验权重大于阈值的粒子,得到重采样后的粒子样本集合;According to the posterior weight of each particle in the particle sample, the particle sample is resampled, and the particles whose posterior weight is greater than the threshold are retained to obtain the resampled particle sample set;对重采样后的粒子样本集合迭代进行粒子后验权重计算和粒子重采样,更新粒子样本集合,直至粒子样本集合的分布收敛或迭代次数达到阈值;Iteratively perform particle posterior weight calculation and particle resampling on the resampled particle sample set, and update the particle sample set until the distribution of the particle sample set converges or the number of iterations reaches a threshold;对收敛后的粒子样本集合中的各个粒子的火情位置进行加权平均,得到火情位置的最大后验概率估计,作为火情区域的最终定位信息。The fire positions of each particle in the converged particle sample set are weighted averaged to 
obtain the maximum a posteriori probability estimate of the fire position as the final positioning information of the fire area.3.根据权利要求2所述的前端灭火机器人控制方法,其特征在于:3. The front-end fire-fighting robot control method according to claim 2, characterized in that:采用同步定位与地图构建SLAM算法,建立区域内建筑物的三维点云模型,利用构建的三维点云模型进行自身定位和导航路径规划,包括:The SLAM algorithm is used to build a 3D point cloud model of the buildings in the area. The built 3D point cloud model is used for self-positioning and navigation path planning, including:灭火机器人在接收到火灾区域的定位信息后,利用搭载的激光雷达传感器对目标区域进行360°扫描,获取区域内建筑物和障碍物的原始点云数据;After receiving the positioning information of the fire area, the fire-fighting robot uses the laser radar sensor on board to perform a 360-degree scan of the target area and obtain the original point cloud data of buildings and obstacles in the area;对原始点云数据进行下采样以减少数据量,并利用随机采样一致性算法RANSAC去除地面点,获取预处理后的点云数据;The original point cloud data is downsampled to reduce the data volume, and the random sampling consensus algorithm RANSAC is used to remove ground points to obtain the preprocessed point cloud data;将预处理后的点云数据输入到同步定位与地图构建算法中,获取灭火机器人在目标区域内的实时位姿,并根据实时位姿建立区域内建筑物的三维点云地图;The pre-processed point cloud data is input into the simultaneous positioning and mapping algorithm to obtain the real-time position and posture of the fire-fighting robot in the target area, and a three-dimensional point cloud map of the buildings in the area is established based on the real-time position and posture;将三维点云地图投影至二维平面,采用栅格地图的形式表示,以灭火机器人当前位置为起点,火灾区域定位信息为目标点,构建起点到目标点的代价地图;The 3D point cloud map is projected onto a 2D plane and represented in the form of a grid map. The current position of the fire-fighting robot is taken as the starting point and the fire area positioning information is taken as the target point. 
A cost map from the starting point to the target point is constructed. On the cost map, the A* search algorithm is used to plan the navigation path of the fire-fighting robot from the starting point to the target point.

4. The front-end fire-fighting robot control method according to claim 3, characterized in that obtaining the real-time pose of the fire-fighting robot in the target area and building a three-dimensional point cloud map of the buildings in the area from the real-time pose includes:

taking the initial position of the fire-fighting robot as the origin and, according to the robot's real-time pose, applying a coordinate transformation to the pre-processed point cloud data to obtain a local point cloud map referenced to the robot's current pose;

inputting the local point cloud map and the robot pose into a simultaneous localization and mapping (SLAM) algorithm, and constructing a factor graph model using a graph-optimization-based method;

predicting the robot pose with an extended Kalman filter from the lidar data and the robot motion model to obtain a prior estimate of the pose, and adding the predicted pose to the factor graph model as a lidar odometry factor;

in the factor graph model, using the lidar odometry factor, the loop-closure detection factor, and the local map factor as edges of the graph to build the geometric constraint relationships between the graph nodes;

extracting feature points, including corner points and plane points, from the local point cloud map, and using a loop-closure detection algorithm based on appearance and spatial geometric information to determine whether the robot's current position has been visited before; if a loop closure is detected, adding a loop-closure factor to the factor graph model;

formulating, from the constructed factor graph model, a nonlinear least-squares optimization problem and solving it for the optimal pose of each node and the feature point positions in the factor graph model;

taking the optimized robot pose as the robot's real-time localization result and, according to the optimized pose, applying a coordinate transformation to the local point cloud map to obtain the three-dimensional point cloud map of the buildings in the area.

5. The front-end fire-fighting robot control method according to claim 4, characterized in that using a loop-closure detection algorithm based on appearance and spatial geometric information to determine whether the robot's current position has been visited before and, if a loop closure is detected, adding a loop-closure factor to the factor graph model that corresponds to the current pose node and the loop-matched pose node includes:

dividing the local point cloud map into cubic voxels of equal size, and extracting from each voxel the point containing the most corner or plane points as that voxel's feature point, yielding the feature point set of the local point cloud map;

computing a geometric descriptor and a visual descriptor for each feature point in the set, building a bag-of-words model from these descriptors, and obtaining the bag-of-words representation vector of the local point cloud map;

matching the bag-of-words representation vector of the local point cloud map at the robot's current pose node against the bag-of-words representation vectors at historical pose nodes, and computing the similarity between the vectors;

if the similarity exceeds a threshold, determining that the robot's current position has been visited before and that a loop closure is detected;

computing, from the matching relationship between the local point cloud maps, the relative pose transformation between the current pose node and the loop-matched pose node, and adding a loop-closure factor to the factor graph model; the loop-closure factor connects the current pose node and the loop-matched pose node and represents the relative pose constraint between the two nodes.

6. A front-end fire-fighting robot control system, characterized by including: at least one processing unit (PE) configured to execute instructions to implement the front-end fire-fighting robot control method according to any one of claims 1 to 5.
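The claims rely (via claim 3) on A* search over a cost map to plan the robot's route. A minimal grid-based sketch, assuming a 4-connected grid, per-cell entry costs of at least 1, and a Manhattan heuristic; all names and values are illustrative, not from the patent:

```python
import heapq
import itertools

def a_star(cost_map, start, goal):
    """A* over a 2D cost map: cost_map[r][c] is the cost of entering cell
    (r, c); None marks an obstacle. Returns the cheapest path as a list of
    cells, or None if the goal is unreachable."""
    rows, cols = len(cost_map), len(cost_map[0])

    def h(cell):  # Manhattan distance: admissible when every step cost >= 1
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    tie = itertools.count()  # breaks heap ties so cells are never compared
    open_set = [(h(start), next(tie), 0.0, start, None)]
    parents, best_g = {}, {start: 0.0}
    while open_set:
        _, _, g, cell, parent = heapq.heappop(open_set)
        if cell in parents:            # already expanded at a cheaper cost
            continue
        parents[cell] = parent
        if cell == goal:               # walk parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and cost_map[nr][nc] is not None:
                ng = g + cost_map[nr][nc]
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), next(tie), ng, nxt, cell))
    return None
```

On a 3x3 map with the centre cell blocked, the planner routes around the obstacle and returns a five-cell path from corner to corner.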
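The first step of claim 4, mapping pre-processed lidar points into a frame defined by the robot's pose, is a rigid-body transform. A minimal 2-D sketch (a real system would use a full SE(3) pose; the function name and planar simplification are illustrative assumptions):

```python
import math

def transform_points(points, pose):
    """Map lidar points from the robot frame into the map frame.
    pose = (x, y, yaw): robot position and heading in the map frame."""
    x, y, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    # rotate each point by the heading, then translate by the robot position
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in points]
```

For example, a point one metre ahead of a robot standing at (1, 2) and facing +90 degrees lands at (1, 3) in the map frame.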
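Claim 4's factor graph is solved as a nonlinear least-squares problem over the node poses. A toy 1-D pose-graph illustration: odometry edges and one loop-closure edge each contribute a residual x_j - x_i - z, assembled into normal equations and solved directly (in 1-D the problem is linear, so a single Gauss-Newton step is exact; all measurement values are invented):

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def optimize_pose_graph(n_nodes, edges):
    """One Gauss-Newton step for a 1-D pose graph.

    Node 0 is fixed at 0 to remove the gauge freedom; each edge (i, j, z)
    encodes a measurement of x_j - x_i (odometry or loop closure).
    Minimises the sum of (x_j - x_i - z)^2 over the free poses x_1..x_{n-1}."""
    n = n_nodes - 1
    H = [[0.0] * n for _ in range(n)]        # J^T J (information matrix)
    g = [0.0] * n                            # J^T z
    for i, j, z in edges:
        entries = [(i - 1, -1.0), (j - 1, 1.0)]  # Jacobian of the residual
        for a, sa in entries:
            if a < 0:
                continue                     # derivative w.r.t. anchored node 0
            g[a] += sa * z
            for c, sc in entries:
                if c >= 0:
                    H[a][c] += sa * sc
    return [0.0] + solve_linear(H, g)
```

With odometry steps 1.0, 1.1, 0.9 and a loop-closure measurement of 2.8 for the whole traverse, the optimiser spreads the 0.2 m disagreement over the chain instead of dumping it on the last node.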
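Claim 5's loop-closure test reduces to comparing bag-of-words histograms of quantised feature descriptors and thresholding their similarity. A minimal sketch using cosine similarity; the vocabulary, word labels, and the 0.8 threshold are illustrative assumptions, not values from the patent:

```python
import math
from collections import Counter

def bow_vector(words, vocab):
    """Histogram of quantised feature descriptors over a fixed vocabulary."""
    counts = Counter(words)
    return [counts.get(w, 0) for w in vocab]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def detect_loop(current_vec, history, threshold=0.8):
    """Return the index of the best-matching past node whose similarity
    clears the threshold, or None when no loop closure is detected."""
    best_idx, best_sim = None, threshold
    for idx, past_vec in enumerate(history):
        sim = cosine_similarity(current_vec, past_vec)
        if sim >= best_sim:
            best_idx, best_sim = idx, sim
    return best_idx
```

When a match is found, the matched node index would then seed the relative-pose computation that the claim adds to the factor graph as a loop-closure factor.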
CN202411440658.4A2024-10-162024-10-16 A front-end fire-fighting robot control method and systemActiveCN119034148B (en)

Priority Applications (1)

Application Number · Priority Date · Filing Date · Title
CN202411440658.4A (CN119034148B) · 2024-10-16 · 2024-10-16 · A front-end fire-fighting robot control method and system


Publications (2)

Publication Number · Publication Date
CN119034148A (en) · 2024-11-29
CN119034148B (en) · 2025-04-15

Family

ID=93583957

Family Applications (1)

Application Number · Title · Priority Date · Filing Date
CN202411440658.4A (Active) · CN119034148B (en) A front-end fire-fighting robot control method and system · 2024-10-16 · 2024-10-16

Country Status (1)

Country · Link
CN (1) · CN119034148B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication Number · Priority Date · Publication Date · Assignee · Title
CN119356369A (en) * · 2024-12-23 · 2025-01-24 · Mingpai Technology Group Co., Ltd. · Unmanned navigation path planning system and method based on adaptive planning
CN120526528B (en) * · 2025-07-25 · 2025-10-03 · Xiamen Taixing Electromechanical Co., Ltd. · Zero false alarm automatic fire alarm and fire extinguishing system based on multi-condition judgment

Citations (2)

* Cited by examiner, † Cited by third party
Publication Number · Priority Date · Publication Date · Assignee · Title
CN109558879A (en) * · 2017-09-22 · 2019-04-02 · Huawei Technologies Co., Ltd. · A vision SLAM method and apparatus based on point-line features
CN118662830A (en) * · 2024-07-16 · 2024-09-20 · Anhui Normal University · Indoor autonomous fire-fighting linkage system and method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication Number · Priority Date · Publication Date · Assignee · Title
US7245315B2 (en) * · 2002-05-20 · 2007-07-17 · Simmonds Precision Products, Inc. · Distinguishing between fire and non-fire conditions using cameras
WO2020033186A1 (en) * · 2018-08-09 · 2020-02-13 · Cobalt Robotics, Inc. · Security automation in a mobile robot
US11460849B2 (en) * · 2018-08-09 · 2022-10-04 · Cobalt Robotics Inc. · Automated route selection by a mobile robot


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Key Technologies of an Intelligent Inspection and Fire-Extinguishing Robot; Zhang Jianya; China Master's Theses Full-text Database; 2023-02-15; pp. 6, 23-54 *

Also Published As

Publication Number · Publication Date
CN119034148A (en) · 2024-11-29

Similar Documents

PublicationPublication DateTitle
CN119034148B (en) A front-end fire-fighting robot control method and system
Tranzatto et al. Cerberus: Autonomous legged and aerial robotic exploration in the tunnel and urban circuits of the darpa subterranean challenge
CN111932588B (en) A tracking method for airborne UAV multi-target tracking system based on deep learning
Petrlík et al. A robust UAV system for operations in a constrained environment
WO2019138836A1 (en) Information processing device, information processing system, information processing method, and program
Brust et al. A networked swarm model for UAV deployment in the assessment of forest environments
CN116352722A (en) Multi-sensor fused mine inspection rescue robot and control method thereof
CN115267820A (en) Fire scene map construction method and system fusing laser radar/vision/UWB
CN118576940A (en) Intelligent fire extinguishing and detection robot
Mittal et al. Vision-based autonomous landing in catastrophe-struck environments
Kay et al. Semantically informed next best view planning for autonomous aerial 3D reconstruction
Zhang et al. Air-ground collaborative robots for fire and rescue missions: Towards mapping and navigation perspective
Baudoin et al. View-finder: robotics assistance to fire-fighting services and crisis management
Abro et al. Signal strength-based alien drone detection and containment in indoor UAV swarm simulations
Wang et al. A Path Planning Algorithm for UAV 3D Surface Inspection Based on Normal Vector Filtering and Integrated Viewpoint Evaluation
Ali et al. Distributed multi-agent deep reinforcement learning based navigation and control of UAV swarm for wildfire monitoring
Legovich et al. Integration of modern technologies for solving territory patroling problems with the use of heterogeneous autonomous robotic systems
Kim et al. Active object tracking using context estimation: handling occlusions and detecting missing targets
Tan et al. A novel fusion positioning navigation system for greenhouse strawberry spraying robot using LiDAR and ultrasonic tags
Rossi. DAEDALUS - Descent And Exploration in Deep Autonomy of Lava Underground Structures: Open Space Innovation Platform (OSIP) Lunar Caves - System Study
Wang et al. A robust navigation framework for uneven terrain with traversability-focused planning
Zhou et al. Research on Slam and Path Planning Method based on Multi-sensor Fusion in Complex Factory Scene
Wang. Design and Implementation of Indoor and Outdoor Security Patrol Mobile Robot
Mysorewala. Simultaneous robot localization and mapping of parameterized spatio-temporal fields using multi-scale adaptive sampling
CN116740297A (en) Unmanned aerial vehicle-based autonomous indoor facility detection and scene reconstruction exploration and development system

Legal Events

Date · Code · Title · Description

PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
CP03 · Change of name, title or address

Address after: Room 1101, Building 1, Jinfeng International Business Plaza, Gaoxin District, Suzhou, Jiangsu 215163
Patentee after: China Construction International Low Carbon Technology Co.,Ltd.
Country or region after: China
Patentee after: China Construction International Urban Construction Co.,Ltd.
Patentee after: Jinan Shuhui Information Technology Co.,Ltd.

Address before: Building 1-9-901, Jinfeng International Business Plaza, No. 199 Jinfeng Road, Gaoxin District, Suzhou, Jiangsu 215163
Patentee before: China Construction International Urban Construction Co.,Ltd.
Country or region before: China
Patentee before: China Construction International Low Carbon Technology Co.,Ltd.
Patentee before: Jinan Shuhui Information Technology Co.,Ltd.
