
A remote monitoring method and system for warehouse cleaning robots based on the Internet of Things

Info

Publication number
CN119697344A
CN119697344A
Authority
CN
China
Prior art keywords
cleaning robot
snapshot
internet
nodes
generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202411876901.7A
Other languages
Chinese (zh)
Other versions
CN119697344B (en)
Inventor
颜敏
王栋
陈家骏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CHN Energy Jianbi Power Plant
Original Assignee
CHN Energy Jianbi Power Plant
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CHN Energy Jianbi Power Plant
Priority to CN202411876901.7A
Publication of CN119697344A
Application granted
Publication of CN119697344B
Legal status: Active (current)
Anticipated expiration

Abstract

Translated from Chinese


The present invention relates to the field of remote monitoring technology, and in particular to a remote monitoring method and system for a warehouse cleaning robot based on the Internet of Things. The method comprises: collecting video monitoring data in the ship cabin with preset Internet of Things video monitoring equipment and capturing snapshots; defining the use range of the warehouse cleaning robot and dividing it into a number of blocks; and creating nodes in one-to-one correspondence with the blocks, synchronizing the snapshots to the corresponding nodes, extracting the foreground features from the snapshots, and covering the nodes with them. By generating a task chain, the invention greatly improves the fluency and efficiency of cleaning tasks; by calculating edge coordinates, it identifies the positions of residues and optimizes the cleaning path, markedly improving operation accuracy and cleaning efficiency; and by verifying the residues, it further improves the cleaning effect, reduces errors and repeated cleaning, and ensures a more accurate and efficient cleaning process.

Description

Remote monitoring method and system for a warehouse cleaning robot based on the Internet of Things
Technical Field
The invention relates to the technical field of remote monitoring, in particular to a remote monitoring method and system for a warehouse cleaning robot based on the Internet of things.
Background
A transport ship usually uses bucket-wheel reclaimers, grab cranes and other equipment to load and unload cargo. For some special types of cargo, such as high-value rare earths, mouldy grain or coal with a high pollution risk, residues are often left behind during loading and unloading. If these are not cleaned up in time, they cause economic losses, cross-contamination between different cargoes, pollution of the hull, and risks to navigation safety.
Most existing cleaning robots rely on preset path planning: they cannot dynamically cope with sudden environmental changes and, when working in a large space, easily repeat or miss areas. How to use video monitoring equipment to provide cleaning guidance for the cleaning robot is therefore the technical problem to be solved by the invention.
Disclosure of Invention
The invention aims to provide a remote monitoring method and system for a cleaning robot based on the Internet of Things, so as to solve the problem raised in the background art, namely how to provide cleaning guidance for the cleaning robot by means of video monitoring equipment.
In order to achieve the above purpose, the present invention provides the following technical solutions:
a remote monitoring method of a warehouse cleaning robot based on the Internet of things comprises the following steps:
The method comprises the steps of utilizing preset video monitoring equipment of the Internet of things to collect video monitoring data in a ship cabin, intercepting a snapshot, defining a use range of a warehouse cleaning robot, dividing the use range into a plurality of blocks, creating nodes corresponding to the blocks one by one, synchronizing the snapshot to the corresponding nodes, extracting foreground features in the snapshot, covering the nodes, integrating all the nodes, generating a task chain, constructing a position coordinate system, and utilizing an edge detection algorithm to calculate edge coordinates of the foreground features in each node;
Acquiring the use authority of laser generating equipment which is integrated on the warehouse cleaning robot in advance, reading out the reflected light signals at the edge coordinates, sequentially carrying out residue verification on all nodes, and deleting the corresponding nodes from a task chain if the verification is not passed;
positioning a real-time position of the warehouse cleaning robot, taking the real-time position as a starting point, taking a preset position as an end point, generating a moving path, generating a passing point by utilizing the edge coordinates, inserting the passing point into the moving path, and transmitting the moving path to the warehouse cleaning robot;
embedding a monitoring mechanism into the task chain, updating the snapshot, and adjusting the task chain, wherein the adjustment at least comprises merging and splitting.
Further, the step of utilizing the preset video monitoring equipment of the internet of things to collect video monitoring data in the ship cabin, intercepting a snapshot, demarcating a use range of the warehouse cleaning robot, and dividing the use range into a plurality of blocks comprises the following steps:
Acquiring equipment distribution in a ship cabin and dividing a forbidden area;
generating a virtual fence based on the forbidden area, and marking the virtual fence into a snapshot;
judging whether the real-time position exceeds the virtual fence, and if so, triggering a preset alarm mechanism.
Further, the creating a node corresponding to the partition one by one, synchronizing the snapshot to the corresponding node, extracting foreground features in the snapshot, covering the node, integrating all the nodes, and generating a task chain includes:
collecting historical data of residues in the cabin, and performing marking and preprocessing to generate a training set;
creating a target detection model, and optimizing by utilizing the training set;
and inputting the snapshot into a target detection model, and outputting to obtain foreground features and background features.
Further, the step of constructing a position coordinate system and calculating the edge coordinates of the foreground feature in each node by using an edge detection algorithm includes:
determining a reference point, defining an origin and an axis direction, correcting the snapshot, integrating the origin and the axis coordinates, and constructing a coordinate system;
and defining the boundary of the foreground feature in each block by utilizing an edge detection algorithm, and reading out the edge coordinates of the boundary.
Further, the step of acquiring the use authority of the laser generating device pre-integrated on the warehouse cleaning robot, reading out the reflected light signals at the edge coordinates, and sequentially performing residue verification on all the nodes includes:
configuring scanning priority of each block, setting scanning frequency, and clustering the reflected light signals into a smooth part and a rough part;
and judging whether the reflected light signal at the edge coordinates is a rough part or not, and if so, verifying to pass.
Further, the step of locating the real-time position of the cleaning robot, using the real-time position as a starting point and a preset position as an end point to generate a moving path, generating a passing point by using the edge coordinates, inserting the passing point into the moving path, and transmitting the moving path to the cleaning robot includes:
Integrating the starting point, the passing point and the ending point, generating a moving path, and determining deviation factors, wherein the deviation factors at least comprise electric quantity, progress and barriers;
calculating a deviation value of the real-time position and the moving path, generating alarm information if the deviation value is larger than a preset threshold value, and sending the alarm information to a preset terminal.
Further, the step of embedding a monitoring mechanism into the task chain, updating the snapshot, and adjusting the task chain includes:
collecting fusion data of the multi-level internet of things sensor, integrating the fusion data into the task chain, activating the monitoring mechanism, and monitoring the state of the warehouse cleaning robot;
And recording the generation time of the snapshot, updating the snapshot based on a preset frequency, integrating a state monitoring result and the snapshot, and adjusting a task chain.
Further, the invention provides a remote monitoring system for a warehouse cleaning robot based on the Internet of things, the system including:
The computing module is used for acquiring video monitoring data in a ship cabin by using preset video monitoring equipment of the Internet of Things, intercepting a snapshot, defining a use range of the warehouse cleaning robot, dividing the use range into a plurality of blocks, creating nodes corresponding to the blocks one by one, synchronizing the snapshot to the corresponding nodes, extracting foreground features in the snapshot, covering the nodes, integrating all the nodes, generating a task chain, constructing a position coordinate system, and calculating edge coordinates of the foreground features in each node by using an edge detection algorithm;
the deleting module is used for acquiring the use authority of the laser generating equipment which is integrated on the warehouse cleaning robot in advance, reading out the reflected light signals at the edge coordinates, sequentially carrying out residue verification on all the nodes, and deleting the corresponding nodes from the task chain if the verification is not passed;
The sending module is used for locating the real-time position of the warehouse cleaning robot, taking the real-time position as a starting point, taking a preset position as an end point, generating a moving path, generating a passing point by utilizing the edge coordinates, inserting the passing point into the moving path, and sending the moving path into the warehouse cleaning robot;
And the adjusting module is used for embedding a monitoring mechanism into the task chain, updating the snapshot and adjusting the task chain, wherein the adjusting at least comprises merging and splitting.
Further, the computing module includes:
The marking unit is used for obtaining equipment distribution in the ship cabin, dividing a forbidden area, generating a virtual fence based on the forbidden area, and marking the virtual fence into a snapshot;
the judging unit is used for judging whether the real-time position exceeds the virtual fence or not, and if so, triggering a preset alarm mechanism;
The output unit is used for collecting historical data of residues in the cabin, performing labeling and preprocessing, generating a training set, creating a target detection model, optimizing by using the training set, inputting the snapshot into the target detection model, and outputting to obtain foreground features and background features;
The reading unit is used for determining a reference point, defining an origin and an axis direction, correcting the snapshot, integrating the origin and the axis coordinates, constructing a coordinate system, defining the boundary of the foreground feature in each block by using an edge detection algorithm, and reading out the edge coordinates of the boundary.
Further, the deletion module includes:
A clustering unit, configured to configure a scanning priority of each block, set a scanning frequency, and cluster the reflected light signals into a smooth portion and a rough portion;
and the verification unit is used for judging whether the reflected light signal at the edge coordinates is a rough part or not, and if so, the verification unit verifies to pass.
Compared with the prior art, the invention has the beneficial effects that:
By collecting video monitoring data and dividing the use range into a plurality of blocks, the use range can be managed at a fine granularity, greatly improving cleaning precision. By capturing snapshots, a data basis and cleaning guidance are provided for the cleaning robot, and the robot can also be monitored in real time so that emergencies are handled promptly. By generating a task chain, the fluency and efficiency of cleaning tasks are greatly improved. By calculating edge coordinates, the positions of residues can be identified and the cleaning path optimized, markedly improving operation accuracy and cleaning efficiency. By verifying the residues, the cleaning effect is further improved, errors and repeated cleaning are reduced, and the cleaning process is made more accurate and efficient.
Drawings
Fig. 1 is a flow chart diagram of a remote monitoring method of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
fig. 2 is a first sub-flowchart block diagram of a remote monitoring method of a warehouse cleaning robot based on the internet of things, which is provided by an embodiment of the invention;
Fig. 3 is a second sub-flowchart block diagram of a remote monitoring method of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
fig. 4 is a third sub-flowchart block diagram of a remote monitoring method of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
fig. 5 is a fourth sub-flowchart block diagram of a remote monitoring method for a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
fig. 6 is a block diagram of a remote monitoring system of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
Fig. 7 is a block diagram of a calculation module in a remote monitoring system of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
fig. 8 is a block diagram of a deletion module in a remote monitoring system of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
Fig. 9 is a block diagram of a sending module in a remote monitoring system of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention;
fig. 10 is a block diagram of an adjustment module in a remote monitoring system of a warehouse cleaning robot based on the internet of things, which is provided by the embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In embodiment 1, fig. 1 shows a flow of implementation of a remote monitoring method of a warehouse cleaning robot based on the internet of things, which is provided in the embodiment of the invention, and is described in detail below:
S100, acquiring video monitoring data in a ship cabin by using preset video monitoring equipment of the Internet of things, intercepting a snapshot, defining a use range of the warehouse cleaning robot, dividing the use range into a plurality of blocks, creating nodes corresponding to the blocks one by one, synchronizing the snapshot to the corresponding nodes, extracting foreground features in the snapshot, covering the nodes, integrating all the nodes, generating a task chain, constructing a position coordinate system, and calculating edge coordinates of the foreground features in each node by using an edge detection algorithm.
Video monitoring data inside the ship cabin are acquired by the Internet of Things video monitoring equipment arranged on the deck of the transport ship, and key frames are intercepted to generate snapshots. The use range of the cleaning robot in the cabin, i.e. the area that needs cleaning, is determined and divided into a plurality of blocks. A node is created for each block, and the part of the snapshot corresponding to each block is synchronized into its node. Image processing is then used to extract the foreground features, i.e. the foreground part of the snapshot, and the snapshot in each node is covered with that foreground part. All nodes are integrated into a task chain so that every block is cleaned in order. A global position coordinate system is constructed to determine the spatial relation of each block, and an edge detection algorithm (such as Canny or Sobel) processes the foreground features contained in each node to calculate and extract their edge coordinates.
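As an illustration of this step (not part of the patent text), the following minimal Python sketch extracts edge coordinates of foreground features from one block's snapshot with OpenCV's Canny detector; the thresholds and the assumption that the snapshot is a BGR image are illustrative.

```python
# Minimal sketch: edge coordinates of foreground features in one block's
# snapshot via the Canny detector (thresholds are illustrative).
import cv2
import numpy as np

def edge_coordinates(block_snapshot: np.ndarray, low: int = 50, high: int = 150):
    """Return (x, y) pixel coordinates lying on foreground-feature edges."""
    gray = cv2.cvtColor(block_snapshot, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress sensor noise
    edges = cv2.Canny(blurred, low, high)         # binary edge map
    ys, xs = np.nonzero(edges)                    # pixel positions flagged as edges
    return list(zip(xs.tolist(), ys.tolist()))
```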
And S200, acquiring the use authority of laser generating equipment which is integrated on the warehouse cleaning robot in advance, reading out the reflected light signals at the edge coordinates, sequentially carrying out residue verification on all nodes, and deleting the corresponding nodes from the task chain if the verification is not passed.
Laser generating equipment is deployed on the warehouse cleaning robot and its use authority is acquired. After activation, the laser is emitted point by point towards the edge coordinates, and the intensity and pattern of the reflected light signal are recorded. Whether residue exists is judged against a preset signal threshold; if no residue exists in a block, the corresponding node is deleted from the task chain.
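A hedged sketch of this verification pass follows; the node dictionary layout, the read_reflectance() helper and the intensity threshold are assumptions made for illustration, since the patent only specifies that reflected light signals are compared against a preset signal threshold.

```python
# Hedged sketch: drop nodes whose edge coordinates show no residue.
from typing import Callable, List, Tuple

Point = Tuple[float, float]

def verify_nodes(task_chain: List[dict],
                 read_reflectance: Callable[[Point], float],
                 threshold: float = 0.35) -> List[dict]:
    """Keep only nodes where at least one edge coordinate looks 'rough' (residue)."""
    verified = []
    for node in task_chain:
        # Residue scatters the laser, so the returned intensity drops below
        # the smooth-surface threshold (an illustrative assumption).
        if any(read_reflectance(pt) < threshold for pt in node["edge_coords"]):
            verified.append(node)      # verification passed: residue present
        # otherwise the node is deleted from the task chain
    return verified
```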
And S300, positioning the real-time position of the warehouse cleaning robot, taking the real-time position as a starting point, taking a preset position as an end point, generating a moving path, generating a passing point by utilizing the edge coordinates, inserting the passing point into the moving path, and transmitting the moving path to the warehouse cleaning robot.
Video monitoring equipment is used to determine the real-time position of the warehouse cleaning robot in the position coordinate system. A moving path is generated with the real-time position as its starting point and a preset position as its end point, where the preset position may be, for example, the original starting point, a charging position or an end position.
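The sketch below assembles such a moving path under illustrative assumptions: each verified node contributes one passing point (the centroid of its edge coordinates), and the passing points are ordered greedily by distance, a choice the patent does not prescribe.

```python
# Minimal sketch: start -> passing points (one per node) -> preset end point.
import math
from typing import List, Tuple

Point = Tuple[float, float]

def build_moving_path(start: Point, end: Point, nodes: List[dict]) -> List[Point]:
    # centroid of each node's edge coordinates serves as its passing point
    via = [(sum(x for x, _ in n["edge_coords"]) / len(n["edge_coords"]),
            sum(y for _, y in n["edge_coords"]) / len(n["edge_coords"]))
           for n in nodes]
    path, current = [start], start
    while via:                                   # nearest-neighbour ordering
        nxt = min(via, key=lambda p: math.dist(current, p))
        via.remove(nxt)
        path.append(nxt)
        current = nxt
    path.append(end)
    return path
```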
And S400, embedding a monitoring mechanism into the task chain, updating the snapshot, and adjusting the task chain, wherein the adjustment at least comprises merging and splitting.
The Internet of Things video monitoring equipment is called in real time to dynamically monitor the task area. The monitoring mechanism monitors the warehouse cleaning robot in real time to judge whether it deviates from its path, is blocked by obstacles, suffers a mechanical failure, and so on. The snapshot is updated at a preset frequency and the task chain is adjusted; specific adjustments include deleting processed blocks and merging or reordering blocks.
In embodiment 2, fig. 2 shows a flow of implementing the remote monitoring method of the cleaning robot based on the internet of things, where the steps of using a preset video monitoring device of the internet of things to collect video monitoring data in a ship cabin, intercepting a snapshot, defining a use range of the cleaning robot, and dividing the use range into a plurality of blocks are described in detail, as follows:
S101, acquiring equipment distribution in a ship cabin, dividing a forbidden area, generating a virtual fence based on the forbidden area, and marking the virtual fence into a snapshot.
The Internet of Things video monitoring equipment is used to determine the equipment distribution inside the cabin and the forbidden areas for the cleaning robot, which are divided into dynamic forbidden areas (such as areas temporarily blocked by ongoing operations) and static forbidden areas (such as fixed areas around dangerous equipment). A virtual fence is then generated from each forbidden area to prevent the cleaning robot from crossing into it.
S102, judging whether the real-time position exceeds the virtual fence, and if so, triggering a preset alarm mechanism.
If the real-time position of the warehouse cleaning robot crosses the virtual fence, a preset alarm mechanism is triggered. The alarm mechanism sends alarm information to a preset terminal, namely the terminal of the personnel in charge of loading and unloading.
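A minimal sketch of this fence check is given below; the fence is assumed to be stored as a polygon of cabin coordinates, and notify_terminal() stands in for the preset alarm mechanism, both illustrative assumptions.

```python
# Minimal sketch: ray-casting point-in-polygon test for the virtual fence.
from typing import Callable, List, Tuple

Point = Tuple[float, float]

def inside_polygon(p: Point, polygon: List[Point]) -> bool:
    x, y = p
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                           # edge straddles the ray
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def check_fence(position: Point, fences: List[List[Point]],
                notify_terminal: Callable[[str], None]) -> bool:
    """Trigger the alarm if the real-time position lies inside any forbidden area."""
    for fence in fences:
        if inside_polygon(position, fence):
            notify_terminal(f"Cleaning robot entered forbidden area at {position}")
            return True
    return False
```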
In embodiment 3, fig. 2 shows a flow of implementing the remote monitoring method for the warehouse cleaning robot based on the internet of things, where the steps of creating nodes corresponding to the blocks one by one, synchronizing the snapshots to the corresponding nodes, extracting foreground features in the snapshots, covering the nodes, integrating all the nodes, and generating a task chain are described in detail as follows:
And S103, collecting historical data of residues in the cabin, performing marking and preprocessing to generate a training set, creating a target detection model, optimizing by using the training set, inputting the snapshot into the target detection model, and outputting to obtain foreground features and background features.
Snapshots of residues in the cabin taken at historical moments are collected, and the positions, sizes and other attributes of the residues are marked manually. The snapshots are preprocessed, for example by image enhancement and data augmentation, and then all snapshots are integrated to generate a training set.
An existing target detection model framework is selected and initialized, trained with the training set and optimized to obtain the target detection model, which is mainly used to separate the foreground features from the background features in the snapshot.
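The patent does not name a specific detection framework; purely for illustration, the sketch below assumes an off-the-shelf YOLO detector from the ultralytics package, with the labelled historical residue snapshots arranged in that package's usual dataset layout (the file names are hypothetical).

```python
# Hedged sketch: train a residue detector and use its boxes as foreground features.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                  # pretrained weights as a starting point
model.train(data="cabin_residue.yaml",      # labelled historical residue snapshots
            epochs=50, imgsz=640)

results = model.predict("snapshot_block_07.png", conf=0.25)
foreground_boxes = results[0].boxes.xyxy    # detected residue regions (foreground)
# everything outside these boxes is treated as background
```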
In embodiment 4, fig. 2 shows a flow of implementation of the remote monitoring method of the warehouse cleaning robot based on the internet of things, and the steps of constructing the position coordinate system and calculating the edge coordinates of the foreground features in each node by using an edge detection algorithm are described in detail as follows:
S104, determining a reference point, defining an origin and an axis direction, correcting the snapshot, integrating the origin and the axis coordinates, constructing a coordinate system, defining the boundary of the foreground feature in each block by using an edge detection algorithm, and reading out the edge coordinates of the boundary.
A reference point is determined and a fixed point, such as a piece of equipment in the cabin, is selected as the origin of the coordinate system; the directions of the coordinate axes are determined from the reference point and the origin, and the coordinate system is constructed. The snapshot in each block is then processed with an edge detection algorithm (such as Canny or Sobel) to extract its foreground features and delineate their boundaries, and the edge coordinates of the boundaries are read from the coordinate system; together, these edge coordinates describe the exact positions of the residues.
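A minimal sketch of mapping snapshot pixels into the cabin coordinate system follows; it assumes four fixed reference points (for instance, pieces of equipment) whose cabin coordinates are known, with the specific numbers purely illustrative.

```python
# Minimal sketch: correct the snapshot into cabin coordinates with a homography.
import cv2
import numpy as np

pixel_refs = np.float32([[120, 80], [610, 95], [590, 420], [140, 400]])  # in the snapshot
cabin_refs = np.float32([[0, 0], [12, 0], [12, 8], [0, 8]])              # in metres

H = cv2.getPerspectiveTransform(pixel_refs, cabin_refs)

def to_cabin_coords(edge_pixels: np.ndarray) -> np.ndarray:
    """Convert an (N, 2) array of edge pixels into cabin coordinates."""
    pts = edge_pixels.reshape(-1, 1, 2).astype(np.float32)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```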
In embodiment 5, fig. 3 shows a flow of implementing the remote monitoring method of the cleaning robot based on the internet of things, where the steps of acquiring the use authority of the laser generating device pre-integrated on the cleaning robot, reading out the reflected light signal at the edge coordinates, and sequentially performing residue verification on all nodes are described in detail as follows:
And S201, configuring the scanning priority of each block, setting the scanning frequency, and clustering the reflected light signals into a smooth part and a rough part.
The scanning priority of each block is determined; if the area enclosed by the edge coordinates occupies a relatively large proportion of the block, a higher scanning frequency is set.
S202, judging whether the reflected light signal at the edge coordinates is a rough part, and if so, verifying to pass.
The reflected light signals in each block are read and clustered into a smooth portion, i.e. an area without residue, and a rough portion, i.e. an area where residue exists.
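A hedged sketch of this clustering step is shown below using 2-means on the recorded intensities; the assumption that the rough (residue) cluster returns the weaker reflection is illustrative.

```python
# Hedged sketch: split reflected-light intensities into smooth vs. rough clusters.
import numpy as np
from sklearn.cluster import KMeans

def rough_mask(intensities: np.ndarray) -> np.ndarray:
    """Return a boolean mask that is True where the signal looks 'rough' (residue)."""
    km = KMeans(n_clusters=2, n_init=10, random_state=0)
    labels = km.fit_predict(intensities.reshape(-1, 1))
    rough_label = int(np.argmin(km.cluster_centers_))   # weaker reflection cluster
    return labels == rough_label
```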
In embodiment 6, fig. 4 shows a flow of implementing the remote monitoring method of the cleaning robot based on the internet of things provided by the embodiment of the present invention. The steps of locating the real-time position of the cleaning robot, generating a moving path with the real-time position as the starting point and a preset position as the end point, generating passing points from the edge coordinates, inserting them into the moving path, and sending the moving path to the cleaning robot are described in detail as follows:
And S301, integrating the starting point, the passing point and the end point, generating a moving path, and determining deviation factors, wherein the deviation factors at least comprise electric quantity, progress and barriers.
The starting point, the passing points and the end point are integrated to generate a moving path, and the deviation factors along the moving path are read out by the Internet of Things video monitoring equipment.
S302, calculating a deviation value of the real-time position and the moving path, generating alarm information if the deviation value is larger than a preset threshold value, and sending the alarm information to a preset terminal.
The point at which the warehouse cleaning robot starts to deviate from the moving path is marked, and the distance between the real-time position and that point is calculated and defined as the deviation value. The deviation value is monitored in real time; if it exceeds the preset threshold, alarm information is sent to the preset terminal, namely the terminal of the personnel in charge of loading and unloading.
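The sketch below illustrates one way to compute such a deviation value, taking it as the distance from the real-time position to the nearest segment of the moving path; send_alarm() is a placeholder for the preset terminal.

```python
# Minimal sketch: deviation of the real-time position from the moving path.
import math
from typing import Callable, List, Tuple

Point = Tuple[float, float]

def point_segment_distance(p: Point, a: Point, b: Point) -> float:
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.dist(p, a)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.dist(p, (ax + t * dx, ay + t * dy))

def check_deviation(position: Point, path: List[Point], threshold: float,
                    send_alarm: Callable[[str], None]) -> float:
    deviation = min(point_segment_distance(position, a, b)
                    for a, b in zip(path, path[1:]))
    if deviation > threshold:
        send_alarm(f"Cleaning robot off path by {deviation:.2f} m at {position}")
    return deviation
```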
In embodiment 7, fig. 5 shows a flow of implementing the remote monitoring method of the cleaning robot based on the internet of things, where the steps of embedding a monitoring mechanism into the task chain, updating the snapshot, and adjusting the task chain are described in detail as follows:
s401, collecting fusion data of the multi-level internet of things sensor, integrating the fusion data into the task chain, activating the monitoring mechanism, and monitoring the state of the cleaning robot.
Data are collected in real time by the various Internet of Things sensors arranged in the ship cabin (such as temperature and humidity sensors and infrared sensors), the collected data are fused, the running state of the cleaning robot is determined, and faults such as route deviation or entry into a forbidden area are detected.
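A small hedged sketch of what fusing these readings into a state check might look like; the sensor names, thresholds and fault labels are illustrative assumptions.

```python
# Hedged sketch: fuse IoT sensor readings and positioning results into a fault list.
from typing import Dict, List

def monitor_state(readings: Dict[str, float],
                  deviated: bool, in_forbidden_area: bool) -> List[str]:
    faults: List[str] = []
    if readings.get("temperature_c", 25.0) > 60.0:
        faults.append("overheating")
    if readings.get("ir_obstacle_m", 10.0) < 0.3:
        faults.append("obstacle too close")
    if deviated:
        faults.append("route deviation")
    if in_forbidden_area:
        faults.append("entered forbidden area")
    return faults
```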
S402, recording the generation time of the snapshot, updating the snapshot based on a preset frequency, integrating a state monitoring result and the snapshot, and adjusting a task chain.
The snapshot is updated at the preset frequency, and the task chain is adjusted, specifically by deleting processed blocks and merging or reordering the remaining blocks.
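The following hedged sketch shows one adjustment pass over the task chain after a snapshot update: cleaned blocks are deleted, a block with very little residue is merged into its predecessor, and the remainder is reordered by residue area; the merge rule and ordering key are illustrative assumptions.

```python
# Hedged sketch: one adjustment pass over the task chain.
from typing import List

def adjust_task_chain(chain: List[dict], min_area: float = 0.5) -> List[dict]:
    # 1. delete nodes whose block has already been cleaned
    chain = [n for n in chain if not n.get("cleaned", False)]
    # 2. merge a block with too little residue into the previous remaining block
    merged: List[dict] = []
    for node in chain:
        if merged and node["residue_area"] < min_area:
            merged[-1]["edge_coords"].extend(node["edge_coords"])
            merged[-1]["residue_area"] += node["residue_area"]
        else:
            merged.append(node)
    # 3. reorder so blocks with the most residue are cleaned first
    return sorted(merged, key=lambda n: n["residue_area"], reverse=True)
```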
Fig. 6 shows a block diagram of a composition structure of a remote monitoring system of a cleaning robot based on the internet of things, where the remote monitoring system 1 of a cleaning robot based on the internet of things includes:
The computing module 11 is used for acquiring video monitoring data in a ship cabin by using preset video monitoring equipment of the internet of things, intercepting a snapshot, defining a use range of the warehouse cleaning robot, dividing the use range into a plurality of blocks, creating nodes corresponding to the blocks one by one, synchronizing the snapshot to the corresponding nodes, extracting foreground features in the snapshot, covering the nodes, integrating all the nodes, generating a task chain, constructing a position coordinate system, and calculating edge coordinates of the foreground features in each node by using an edge detection algorithm;
the deleting module 12 is configured to acquire a use right of a laser generating device pre-integrated on the warehouse cleaning robot, read out a reflected light signal at the edge coordinate, sequentially perform residue verification on all nodes, and delete a corresponding node from a task chain if the verification is not passed;
A sending module 13, configured to locate a real-time position of the cleaning robot, generate a moving path with the real-time position as a starting point and a preset position as an end point, generate a passing point by using the edge coordinates, insert the passing point into the moving path, and send the moving path to the cleaning robot;
and the adjustment module 14 is used for embedding a monitoring mechanism into the task chain, updating the snapshot and adjusting the task chain, wherein the adjustment at least comprises merging and splitting.
Fig. 7 shows a block diagram of a composition structure of a remote monitoring system of a warehouse cleaning robot based on the internet of things, where the computing module 11 includes:
A marking unit 111, configured to obtain equipment distribution in a cabin, divide a forbidden area, generate a virtual fence based on the forbidden area, and mark the virtual fence into a snapshot;
a judging unit 112, configured to judge whether the real-time position exceeds the virtual fence, and if so, trigger a preset alarm mechanism;
the output unit 113 is used for collecting historical data of residues in the cabin, performing labeling and preprocessing, generating a training set, creating a target detection model, optimizing by using the training set, inputting the snapshot into the target detection model, and outputting to obtain foreground features and background features;
And a reading unit 114, configured to determine a reference point, define an origin and an axis direction, correct the snapshot, integrate the origin and the axis coordinates, construct a coordinate system, and define a boundary of the foreground feature in each block by using an edge detection algorithm, and read edge coordinates of the boundary.
Fig. 8 shows a block diagram of a composition structure of a remote monitoring system of a warehouse cleaning robot based on the internet of things, where the deletion module 12 includes:
a clustering unit 121 configured to configure a scanning priority of each block, and set a scanning frequency, and cluster the reflected light signals into a smooth portion and a rough portion;
and a verification unit 122, configured to determine whether the reflected light signal at the edge coordinates is a rough portion, and if so, verify passing.
Fig. 9 shows a block diagram of a composition structure of a remote monitoring system of a warehouse cleaning robot based on the internet of things, where the sending module 13 includes:
a deviation unit 131 for integrating the starting point, the passing point and the end point, generating a moving path, and determining deviation factors, wherein the deviation factors at least include an electric quantity, a progress and an obstacle;
and the sending unit 132 is configured to calculate a deviation value between the real-time position and the moving path, generate alarm information if the deviation value is greater than a preset threshold, and send the alarm information to a preset terminal.
Fig. 10 shows a block diagram of a composition structure of a remote monitoring system of a warehouse cleaning robot based on the internet of things, where the adjusting module 14 includes:
The activating unit 141 is configured to collect fusion data of the multi-level internet of things sensor, integrate the fusion data into the task chain, activate the monitoring mechanism, and perform state monitoring on the cleaning robot;
And an updating unit 142, configured to record the generation time of the snapshot, update the snapshot based on a preset frequency, integrate the status monitoring result and the snapshot, and adjust the task chain.
The calculation module 11 is mainly used for completing step S100, the deletion module 12 is mainly used for completing step S200, the transmission module 13 is mainly used for completing step S300, and the adjustment module 14 is mainly used for completing step S400;
The marking unit 111 is mainly used for completing step S101, the judging unit 112 is mainly used for completing step S102, the output unit 113 is mainly used for completing step S103, and the reading unit 114 is mainly used for completing step S104;
the clustering unit 121 is mainly used for completing step S201, and the verification unit 122 is mainly used for completing step S202;
The deviating unit 131 is mainly used for completing step S301, and the transmitting unit 132 is mainly used for completing step S302;
The activating unit 141 is mainly used for completing step S401, and the updating unit 142 is mainly used for completing step S402.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not every possible combination of these technical features is described; nevertheless, as long as a combination contains no contradiction, it should be considered to fall within the scope of this description.
The foregoing examples illustrate only a few embodiments of the invention and are described in detail herein without thereby limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (10)

CN202411876901.7A | 2024-12-19 | 2024-12-19 | A remote monitoring method and system for warehouse clearance robots based on the Internet of Things | Active | CN119697344B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202411876901.7A (CN119697344B) | 2024-12-19 | 2024-12-19 | A remote monitoring method and system for warehouse clearance robots based on the Internet of Things

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202411876901.7A (CN119697344B) | 2024-12-19 | 2024-12-19 | A remote monitoring method and system for warehouse clearance robots based on the Internet of Things

Publications (2)

Publication Number | Publication Date
CN119697344A (en) | 2025-03-25
CN119697344B (en) | 2025-09-16

Family

ID=95030953

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202411876901.7A (Active, CN119697344B) | A remote monitoring method and system for warehouse clearance robots based on the Internet of Things | 2024-12-19 | 2024-12-19

Country Status (1)

Country | Link
CN (1) | CN119697344B (en)


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN112739244A (en)* | 2018-07-13 | 2021-04-30 | 美国iRobot公司 | Mobile Robot Cleaning System
CN109984684A (en)* | 2019-04-11 | 2019-07-09 | 云鲸智能科技(东莞)有限公司 | Cleaning control method, device, clean robot and storage medium
CN113503877A (en)* | 2021-06-22 | 2021-10-15 | 深圳拓邦股份有限公司 | Robot partition map establishing method and device and robot
CN114603561A (en)* | 2022-04-13 | 2022-06-10 | 开封大学 | Intelligent robot vision sensor control system and method
CN114837252A (en)* | 2022-05-19 | 2022-08-02 | 瑞诺(济南)动力科技有限公司 | Automatic cabin cleaning method, equipment and medium in cabin
CN115236628A (en)* | 2022-07-26 | 2022-10-25 | 中国矿业大学 | A method for detecting residual cargo in carriages based on lidar
WO2024212441A1 (en)* | 2023-04-13 | 2024-10-17 | 深圳乐动机器人股份有限公司 | Region division method, apparatus and computer-readable storage medium
CN116277072A (en)* | 2023-05-04 | 2023-06-23 | 杭州萤石软件有限公司 | Task object processing method and system, camera equipment and mobile robot
CN116673958A (en)* | 2023-06-26 | 2023-09-01 | 山东新一代信息产业技术研究院有限公司 | Granary cleaning robot based on artificial intelligence
CN118500380A (en)* | 2024-05-17 | 2024-08-16 | 深圳市普渡科技有限公司 | Map updating method and device, robot and storage medium
CN118244771A (en)* | 2024-05-29 | 2024-06-25 | 武汉理工大学 | Asynchronous cooperative control method and device for multi-cabin cleaning robot

Also Published As

Publication number | Publication date
CN119697344B (en) | 2025-09-16


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
