Disclosure of Invention
To address the problems of existing drug dispensing robots, the invention provides a drug dispensing robot platform that is based on a multi-sensor fusion system and a dynamic capture database and that realizes modeling of the robot's working environment and spatial positioning of the arm end through a neural network model, thereby effectively reducing the difficulty of human-computer interaction and avoiding excessive human intervention. The scheme of the drug dispensing robot platform is as follows:
a drug dispensing robot platform, comprising: a robot body, which comprises the mechanical structure, electronic circuits, and controller of the robot; a sensor array, arranged on the robot body and used for sensing environmental data of the robot body; a sensing system, used for receiving and transmitting the environmental data sensed by the sensor array; and a cloud platform, used for adjusting the control strategy of the robot and/or modifying the functions of the robot based on the environmental data transmitted by the sensing system.
Preferably, the robot forms interactive feedback data based on the environmental data transmitted by the sensing system, and uses this feedback to control the motion trajectory and the start-stop state of the robot body.
Preferably, the perception system periodically transmits the environment data to the cloud platform.
Preferably, the cloud platform is connected with the robot body through a wireless network.
Preferably, the sensor array comprises one or more of a visual sensor, an acoustic sensor, and a laser ranging sensor.
Preferably, the robot is provided with a motion decoupling model for mapping the joint space of the robot to a three-dimensional space.
Preferably, the motion decoupling model comprises a native built-in model and an environment learning model; the native built-in model models and measures the working space of the robot to realize basic modeling of the robot motion, and the environment learning model rapidly models the working environment of the robot by importing cloud data for a set of typical working environments.
Preferably, the environment learning model enables the robot to perform deep learning and training for a specific environment through neural network techniques.
Preferably, the robot body is a two-arm robot.
Preferably, each arm of the two-arm robot has 7 degrees of freedom.
According to the technical scheme, the embodiment of the invention has the following advantages:
the embodiments of the invention are based on the sensor array and a dynamic capture database, and realize modeling of the working environment of the robot body and spatial positioning of the robot arm end through learning and training of a neural network model. Specifically, the database only needs drag-teaching iterations over the working-space motion of the mechanical arm to form motion capture records; these records are compared against the neural network motion model of the robot, the database is gradually corrected, and, through a large amount of training, motion positioning over the entire working space of the mechanical arm is gradually perfected. Therefore, the embodiments of the invention can realize online or offline programming of the robot through drag teaching, greatly reduce the difficulty of deploying the robot in human-robot collaboration scenarios, and effectively reduce medical costs.
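For illustration only, since the embodiments do not prescribe an implementation, the drag-teaching correction loop described above could be sketched roughly as follows; `capture_db`, `nn_model`, and the joint/pose representations are hypothetical names introduced purely for this sketch.

```python
# Illustrative sketch only: one possible form of the drag-teaching correction loop.
# All interfaces and names are assumptions, not the patented implementation.
import numpy as np

def drag_teaching_iteration(capture_db, nn_model, recorded_joints, measured_pose, lr=0.05):
    """Add one drag-taught sample and nudge the motion model toward it.

    capture_db      -- list of (joint_vector, end_effector_pose) samples
    nn_model        -- any regressor exposing predict()/update() (assumed API)
    recorded_joints -- joint angles captured while the arm is dragged
    measured_pose   -- end-effector pose measured by the sensor array
    """
    capture_db.append((recorded_joints, measured_pose))

    # Compare the motion-capture sample with the model's current prediction.
    predicted_pose = nn_model.predict(recorded_joints)
    error = np.linalg.norm(measured_pose - predicted_pose)

    # Correct the model toward the captured sample; repeated drag-teaching
    # passes gradually cover the whole working space of the arm.
    nn_model.update(recorded_joints, measured_pose, learning_rate=lr)
    return error
```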
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic diagram of the logic architecture of an integrated platform of a drug dispensing robot according to an embodiment of the present invention. In this embodiment, the integrated drug dispensing robot platform includes a robot body 10, a sensor array 40, a sensing system 30, and a cloud platform 20.
The robot body 10 mainly includes the mechanical structure, electronic circuits, and controller of the robot. In a preferred embodiment, the robot body 10 is a dual-arm robot with 7 degrees of freedom per arm. Fig. 2 is a schematic diagram of the mechanical structure of the dual-arm drug dispensing robot according to this embodiment. The robot body 10 includes a base 105, a body 102, a head 101, and two arms (103 and 104) on both sides of the body. The two arms carry out the drug dispensing work through accurate positioning and control. The dual-arm robot is controlled by a single master control system; compared with several single-arm robots, the hardware expenditure on microprocessors and sensors is reduced, so the cost is effectively lowered. The dual-arm robot also reduces the communication load of multi-robot cooperative work, and the coordination of the two arms during operation is greatly improved, thereby effectively improving the safety of the robot.
The integrated platform places the following requirements on the electromechanical composition of the dual-arm robot: an integrated joint design effectively reduces the occupied space, and the smaller volume and lighter body make the robot easy to deploy and move. In this embodiment, each arm of the dual-arm robot has 7 degrees of freedom, and by simulating the arm design of the dual-arm robot, collision avoidance against obstacles in complex environments, optimal-path trajectory planning, minimum-energy trajectory planning, and the like can be realized. In long-term use, the 7-degree-of-freedom arm shows better flexibility and dexterity than a 6-degree-of-freedom arm. The dual-arm robot uses a high-performance master control unit for centralized control of the control system, supports cloud scheduling, and allows an operator to remotely monitor and control the working state of the robot through portable terminals such as smartphones and tablet computers.
The sensor array 40 is disposed on the robot body 10 and is mainly used for sensing environmental data of the robot body 10. The sensor array 40 includes one or more of a visual sensor, an acoustic sensor, and a laser ranging sensor. Preferably, the sensor array 40 further includes mechanical sensors, tactile sensors, and the like.
The sensing system 30 is used to receive and transmit the environmental data sensed by the sensor array 40. The sensing system 30 and the robot body 10 exchange interactive feedback: the sensing system 30 feeds the environmental data back to the robot body 10, and the robot body 10 transmits its motion data to the sensing system 30. The motion of the robot body 10 and the operation of the sensor array 40 can proceed in parallel, and the robot body 10 can acquire the environment interaction feedback data transmitted by the sensing system 30 to control its motion trajectory and start-stop state. Because the integrated drug dispensing robot platform adopts a sensing system based on multi-sensor fusion, the robot body can be rapidly deployed and safely controlled, and an operator can use the robot in a completely relaxed state, which a traditional robot control platform cannot achieve.
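As a minimal sketch of this bidirectional exchange, and assuming hypothetical `sensing_system` and `controller` interfaces that the embodiment does not specify, the feedback loop could look roughly like this:

```python
# Sketch of the interactive feedback loop between sensing system and robot body.
# Method names and message fields are illustrative assumptions.
import time

def feedback_loop(sensing_system, controller, period_s=0.01):
    """Environment data flows to the robot; motion data flows back."""
    while controller.is_active():
        env = sensing_system.latest_environment()        # fused sensor snapshot
        if env.get("workspace_clear", True):
            controller.follow_trajectory()               # continue planned motion
        else:
            controller.hold()                            # start-stop state: pause
        sensing_system.report_motion(controller.joint_state())  # feedback to sensing system
        time.sleep(period_s)                             # supervisory rate, ~100 Hz
```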
The cloud platform 20 adjusts the control strategy of the robot body 10 and/or modifies the functions of the robot based on the environmental data (i.e., state information) transmitted by the sensing system 30. Based on the cloud platform 20, an operator can remotely control the robot and, in combination with the sensing system, carry out deployment, modification, logout, and similar operations on the robot's working process in an unattended state, thereby truly realizing human-robot cooperation. The cloud platform 20 serves as the master control component: it collects control-system state data and environmental data, modifies control strategies in a targeted manner, and provides the function of directly manipulating the robot through a portable terminal. The robot body 10 interfaces with the cloud service through a wireless network, which effectively ensures the security and privacy of the intelligent terminal and interconnects the robot integration platform. In a preferred embodiment, cloud data transmission can be based on secure encrypted communication, which effectively ensures that the operator retains secure control of the robot body. Based on cloud interconnection, the state and control information of the robot can be pushed to the intelligent terminal, effectively realizing intelligent and convenient use of the robot.
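The embodiment only states that cloud transmission can use secure encrypted communication, without fixing a protocol. One common way to authenticate a pushed control command, shown purely as an illustrative sketch with hypothetical keys and field names, is an HMAC over the serialized message:

```python
# Illustration only: HMAC-authenticated command push between cloud and robot body.
# The key, message layout, and transport are assumptions, not the patented protocol.
import hmac, hashlib, json

SHARED_KEY = b"replace-with-provisioned-secret"   # hypothetical pre-shared key

def sign_command(command: dict) -> dict:
    """Cloud side: serialize the command and attach an authentication tag."""
    payload = json.dumps(command, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def verify_command(message: dict) -> bool:
    """Robot side: accept the command only if the tag matches."""
    expected = hmac.new(SHARED_KEY, message["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])
```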
The sensing system 30 combines sensor data such as vision, touch, and distance with a cloud database through a multi-sensor fusion system to realize deep sensing and learning of the working environment by the robot body, effectively improving the safety and flexibility of the robot at work. The sensing system 30 can also collect user usage data, detect user usage habits, and provide product usage experience and cautions through the cloud platform, which helps protect user rights and maximize the usage value of the robot platform. In a preferred embodiment, the environmental data is periodically sent to the cloud platform 20 through the sensing system 30 so that the cloud can modify and update the connected robot body.
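As one hedged example of multi-sensor fusion (the embodiment does not fix an algorithm), redundant distance estimates from different sensors can be combined by inverse-variance weighting; the sensor names and noise figures below are illustrative assumptions:

```python
# Sketch of inverse-variance fusion of redundant sensor estimates.
def fuse_estimates(estimates):
    """estimates: list of (value, variance) pairs, e.g. from vision and laser ranging."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Example: vision estimates 0.52 m (noisier), laser ranging estimates 0.50 m (more precise);
# the fused value leans toward the lower-variance laser measurement.
distance, variance = fuse_estimates([(0.52, 0.004), (0.50, 0.0005)])
```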
In this embodiment, the robot body 10 is provided with a motion decoupling model for mapping the joint space of the robot into three-dimensional space. The drug dispensing robot integration platform uses the dynamic capture database to perform dual modeling of the robot body and the external environment, which is mainly divided into two units: a native built-in model and an environment learning model. The native built-in model models and measures the working space of the robot, calibrating the range of the motion space and the positions of frequently used regions through multi-sensor fusion, so as to realize basic modeling of the robot motion. The environment learning model rapidly models the working environment of the robot by importing cloud data for a set of typical working environments, and also supports deep learning of a specific environment through neural network techniques. After neural network training, the robot can operate more freely. The basic model of the robot's perceived environment space is trained through the neural network, and the complexity of the learning model can be customized according to the expected learning period.
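To make the joint-space-to-Cartesian mapping concrete, the following hedged sketch trains a small feed-forward network on (joint angles, end-effector position) pairs gathered, for example, from drag teaching; the architecture, framework, and data source are illustrative assumptions rather than the patented model:

```python
# Illustrative sketch: regress the end-effector position from 7 joint angles.
# A tiny MLP stands in for the patent's (unspecified) neural network model.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(7, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 3),            # x, y, z of the arm end in the workspace frame
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(joint_angles, measured_xyz):
    """joint_angles: (N, 7) tensor; measured_xyz: (N, 3) tensor from drag teaching."""
    optimizer.zero_grad()
    loss = loss_fn(model(joint_angles), measured_xyz)
    loss.backward()
    optimizer.step()
    return loss.item()
```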
Fig. 3 is a schematic view of the environment learning model of the integrated platform of the dual-arm drug dispensing robot according to an embodiment of the present invention. In actual development, a functional interface is provided for adjusting the complexity of the model according to the expected deployment time, and the deployment conditions of the robot can be set both locally and from the cloud.
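Such a complexity-versus-deployment-time interface could, for instance, simply select the size of the environment learning model from a time budget; the thresholds and model parameters below are hypothetical, offered only as a sketch:

```python
# Hypothetical sketch: pick an environment-learning model size from the expected
# deployment (training) time budget, settable locally or from the cloud.
def select_model_complexity(deploy_hours: float) -> dict:
    if deploy_hours < 1:
        return {"hidden_layers": 1, "units": 32}    # quick deployment, coarse model
    if deploy_hours < 8:
        return {"hidden_layers": 2, "units": 64}
    return {"hidden_layers": 4, "units": 128}       # long training window, finer model
```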
In the embodiment of the invention, the drug dispensing robot platform adopts a more convenient drag-control mode, assisted by the sensing system, which simplifies the operation of the robot and ensures its working safety. Visual compensation in the sensor array allows high-precision correction of the robot's working space, ensuring the motion accuracy of the drug dispensing process and further ensuring the safety of operation. The robot is equipped with force sensors and vision sensors, can protect people by recognizing the human body and sensing touch, and can avoid obstacles or stop suddenly when necessary, effectively improving its safety performance. The robot can also learn its working space intelligently and find commonly used, reasonable working positions, reducing the probability of directly or indirectly injuring a human body; it is therefore particularly suitable for hospital-type application scenarios where high safety is required.
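A minimal sketch of the avoidance-and-emergency-stop behaviour described above is given below; the detector, force-sensor, and controller interfaces, as well as the numeric thresholds, are assumptions introduced for illustration:

```python
# Sketch of the safety behaviour: slow down near a detected person,
# stop immediately on unexpected contact force. Interfaces are assumptions.
FORCE_LIMIT_N = 15.0     # hypothetical contact-force threshold
SLOWDOWN_DIST_M = 0.8    # hypothetical human-proximity threshold

def safety_step(vision, force_sensor, controller):
    if force_sensor.max_contact_force() > FORCE_LIMIT_N:
        controller.emergency_stop()                  # sudden stop on contact
    elif vision.human_distance() < SLOWDOWN_DIST_M:
        controller.scale_speed(0.2)                  # avoid: slow down near a person
    else:
        controller.scale_speed(1.0)                  # normal operation
```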
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.