CN109202861B - Medicament allotment robot integration platform - Google Patents

Medicament allotment robot integration platform

Info

Publication number
CN109202861B
Authority
CN
China
Prior art keywords
robot
platform
environment
model
robot body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710519673.1A
Other languages
Chinese (zh)
Other versions
CN109202861A (en)
Inventor
孙若怀
邹风山
刘晓帆
梁亮
赵彬
钱益舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Siasun Robot and Automation Co Ltd
Original Assignee
Shenyang Siasun Robot and Automation Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Siasun Robot and Automation Co Ltd
Priority to CN201710519673.1A
Publication of CN109202861A
Application granted
Publication of CN109202861B
Legal status: Active (Current)
Anticipated expiration

Abstract

The embodiment of the invention discloses a medicament dispensing robot platform comprising: a robot body, which includes the mechanical structure, electronic circuitry and controller of the robot; a sensor array, arranged on the robot body and used to sense environmental data around the robot body; a sensing system, which receives and transmits the environmental data sensed by the sensor array; and a cloud platform, which adjusts the control strategy of the robot and/or modifies the functions of the robot based on the environmental data transmitted by the sensing system. Based on a multi-sensor fusion system and a dynamic capture database, the embodiment realizes modeling of the robot's working environment and spatial positioning of the arm end through a neural network model, effectively reducing the difficulty of human-computer interaction and avoiding excessive human intervention.

Description

Medicament allotment robot integration platform
Technical Field
The invention relates to the fields of intelligent manufacturing and automatic control, in particular to a robot integration platform for dispensing medicaments, and more particularly to a dual-arm medicament dispensing robot integration platform.
Background
With the wide application of intelligent robot technology in leading-edge fields such as precision manufacturing and aerospace, the technology is also gradually being applied in civil fields such as industry, education and medical services. Existing intelligent robots in these civil fields are often limited by factors such as the rigidity of the robot body and the complexity of programming and control; they cannot support good human-computer interaction, and they cannot reach the goal of industrial upgrading in which robots replace manual work and fully free human labor.
At present, robots in the field of medical services (such as medicament dispensing robots) are mainly based on traditional industrial robot platforms: a robot kinematics model is applied to a vertical multi-joint serial robot to establish a mapping between the robot joints and the user's space, thereby realizing position control of each joint. The disadvantage of this approach is that a control system designed on such a framework sacrifices flexibility of control and convenience of human-computer interaction while ensuring positioning accuracy. Moreover, with this structural design, the control system's perception of the external environment depends entirely on the operator's experience.
Therefore, there is a need for an intelligent robot platform that combines positioning accuracy, flexible control and easy human-computer interaction, so that precise medicament dispensing can be achieved without much intervention by a doctor or nurse, freeing experienced medical staff for more valuable work.
Disclosure of Invention
Aiming at the problems of existing medicament dispensing robots, the invention provides a medicament dispensing robot platform that, based on a multi-sensor fusion system and a dynamic capture database, realizes modeling of the robot's working environment and spatial positioning of the arm end through a neural network model, thereby effectively reducing the difficulty of human-computer interaction and avoiding excessive human intervention. The scheme of the medicament dispensing robot platform is as follows:
a medicament deployment robot platform, comprising: the robot comprises a robot body, a control unit and a control unit, wherein the robot body comprises a mechanical structure, an electronic circuit and a controller of the robot; the sensor array is arranged on the robot body and used for sensing environmental data of the robot body; a sensing system to receive and transmit environmental data sensed by the sensor array; and the cloud platform is used for adjusting the control strategy of the robot or/and modifying the function of the robot based on the environment data transmitted by the sensing system.
Preferably, the robot forms interactive feedback data based on the environmental data transmitted by the sensing system and uses it to control the motion trajectory and start-stop state of the robot body.
Preferably, the perception system periodically transmits the environment data to the cloud platform.
Preferably, the cloud platform is connected with the robot body through a wireless network.
Preferably, the sensor array comprises one or more of a visual sensor, an acoustic sensor and a laser distance sensor.
Preferably, the robot is provided with a motion decoupling model for mapping the joint space of the robot to a three-dimensional space.
Preferably, the motion decoupling model comprises a native built-in model and an environment learning model: the native built-in model models and measures the robot's workspace to provide basic modeling of robot motion, while the environment learning model rapidly models the robot's working environment by importing cloud data for a set of typical working environments.
Preferably, the environment learning model supports the robot to perform deep learning and training on a specific environment through a neural network technology.
Preferably, the robot body is a two-arm robot.
Preferably, each arm of the two-arm robot has 7 degrees of freedom.
According to the technical scheme above, the embodiments of the invention have the following advantages:
The embodiment of the invention is based on the sensor array and a dynamic capture database, and realizes modeling of the robot body's working environment and spatial positioning of the robot arm end through learning and training of a neural network model. Specifically, drag-teaching iterations over the arm's workspace motions are recorded as motion captures in the database; the database is then compared against the robot's neural-network motion model and gradually corrected, and with a large amount of training the motion positioning of the entire arm workspace is progressively perfected. The embodiment therefore enables online or offline programming of the robot through drag teaching, greatly reduces the difficulty of applying robots to human-robot collaboration, and effectively reduces medical costs.
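The iterative correction described above can be illustrated with a toy stand-in: captured (joint angle, end position) pairs form the database, and a tiny linear model takes the place of the neural-network motion model, being gradually corrected against the captures. The workspace relation, learning rate and update rule are all invented for the illustration; they are not from the patent.

```python
import random

random.seed(0)

# Capture database: drag-taught (joint angle, end position) pairs.
# The workspace relation y = 2*q + 0.5 is a made-up ground truth.
capture_db = [(q, 2.0 * q + 0.5) for q in (0.0, 0.25, 0.5, 0.75, 1.0)]

# A simple linear model stands in for the neural-network motion model;
# it is corrected iteratively against the captured motions.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    q, y = random.choice(capture_db)   # one drag-teaching iteration
    err = (w * q + b) - y              # model prediction vs. captured motion
    w -= lr * err * q                  # gradual correction of the model
    b -= lr * err

print(round(w, 2), round(b, 2))  # converges toward the workspace relation
```

The same compare-and-correct loop, scaled up to a real network and a full motion-capture database, is what lets drag teaching progressively cover the whole arm workspace.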
Drawings
Fig. 1 is a schematic diagram of a logic architecture of an integrated platform of a drug dispensing robot according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a mechanical structure of a dual-arm drug dispensing robot according to an embodiment of the present invention;
fig. 3 is a schematic view of an environment learning model of an integrated platform of a dual-arm drug dispensing robot according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic diagram of the logic architecture of an integrated platform of a drug dispensing robot according to an embodiment of the present invention. In this embodiment, the integrated pharmaceutical dispensing robot platform includes a robot body 10, a sensor array 40, a sensing system 30, and a cloud platform 20.
The robot body 10 mainly includes the mechanical structure, electronic circuitry and controller of the robot. In a preferred embodiment, the robot body 10 is a dual-arm robot with 7 degrees of freedom per arm. Fig. 2 is a schematic diagram of the mechanical structure of the dual-arm drug dispensing robot according to this embodiment. The robot body 10 includes a base 105, a body 102, a head 101, and two arms (103 and 104) on both sides of the body. The two arms carry out medicament dispensing through accurate positioning and control. Because both arms are driven by the same master control system, hardware expenditure on microprocessors and sensors is lower than for multiple single-arm robots, effectively reducing cost. The dual-arm design also relieves the communication pressure of multi-robot cooperation and greatly improves the coordination between the two arms during work, thereby effectively improving the robot's safety.
The integrated platform places the following requirements on the electromechanical design of the dual-arm robot. An integrated joint design effectively reduces the occupied space, and the smaller volume and lighter body make the robot easy to deploy and move. In this embodiment each arm has 7 degrees of freedom, modeled on the human arm, which enables collision avoidance against obstacles in complex environments, optimal path trajectory planning, minimum-energy trajectory planning, and the like. In long-term use, 7 degrees of freedom show better flexibility and dexterity than 6. The dual-arm robot uses a high-performance main control unit for centralized control, supports cloud scheduling, and allows an operator to remotely monitor and control the robot's working state through portable terminals such as smartphones and tablets.
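The advantage of 7 degrees of freedom over 6 comes from kinematic redundancy: an end-effector pose has 6 coordinates, so a 7-joint arm keeps one free degree of freedom for obstacle avoidance or energy-optimal postures. The simplest analogue is a planar 3-link arm reaching a 2-D point. The sketch below (hypothetical unit link lengths, not from the patent) shows two distinct joint configurations reaching the same end point.

```python
from math import cos, sin, isclose, pi

L = (1.0, 1.0, 1.0)  # hypothetical link lengths

def fk(q):
    """Forward kinematics of a planar 3-link arm: joint angles -> (x, y)."""
    x = y = a = 0.0
    for qi, li in zip(q, L):
        a += qi                  # accumulate joint angles along the chain
        x += li * cos(a)
        y += li * sin(a)
    return x, y

# Two distinct joint configurations that reach the same end point (2, 1):
qa = (0.0, pi / 2, -pi / 2)      # bend at the second joint
qb = (pi / 2, -pi / 2, 0.0)      # bend at the first joint instead
pa, pb = fk(qa), fk(qb)
assert all(isclose(u, v, abs_tol=1e-9) for u, v in zip(pa, pb))
print(pa)  # both configurations reach (2, 1)
```

A whole one-parameter family of configurations reaches this point; a redundant arm's planner can choose among them to dodge obstacles or minimize energy, which a non-redundant arm cannot do.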
The sensor array 40 is disposed on the robot body 10 and is mainly used to sense environmental data around the robot body 10. The sensor array 40 includes one or more of a visual sensor, an acoustic sensor and a laser distance sensor. Preferably, the sensor array 40 further includes mechanical sensors, tactile sensors, and the like.
The sensing system 30 is used to receive and transmit the environmental data sensed by the sensor array 40. The sensing system 30 feeds the environmental data back to the robot body 10 interactively, while the robot body 10 transmits its motion data to the sensing system 30. The motion of the robot body 10 and the sensing by the sensor array 40 proceed in parallel, and the robot body 10 uses the environment interaction feedback data transmitted by the sensing system 30 to control its motion trajectory and start-stop state. Because the integrated platform adopts a sensing system based on multi-sensor fusion, the robot body can be rapidly deployed and safely controlled, and an operator can use the robot in a completely relaxed state, a condition a traditional robot control platform cannot meet.
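One way to picture the start-stop control described above is a per-cycle safety gate on the sensed obstacle distance. The thresholds and the hysteresis rule below are invented for illustration; the patent does not specify them.

```python
STOP_M = 0.3     # assumed emergency-stop distance
CLEAR_M = 0.6    # assumed distance at which motion may resume

def control_step(distance_m: float, moving: bool) -> bool:
    """One feedback cycle: decide the start-stop state from sensed distance."""
    if distance_m < STOP_M:
        return False                      # stop near an obstacle or person
    if not moving and distance_m > CLEAR_M:
        return True                       # resume once the space is clear
    return moving

moving, states = True, []
for d in (1.0, 0.5, 0.2, 0.2, 0.7, 1.5):  # simulated distance readings
    moving = control_step(d, moving)
    states.append(moving)
print(states)  # [True, True, False, False, True, True]
```

The gap between the stop and resume thresholds (hysteresis) keeps the robot from oscillating between stopped and moving when a reading hovers near a single threshold.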
The cloud platform 20 adjusts the control strategy of the robot body 10 and/or modifies the functions of the robot based on the environmental data (i.e., state information) transmitted by the sensing system 30. Through the cloud platform 20, an operator can control the robot remotely; combined with the sensing system, deployment, modification and decommissioning of the robot's working process can be carried out in an unattended state, truly realizing human-robot cooperation. As the master control component, the cloud platform 20 collects control system state data and environmental data, modifies control strategies in a targeted manner, and allows the robot to be manipulated directly through a portable terminal. The robot body 10 connects to the cloud service through a wireless network, which effectively protects the security and privacy of the intelligent terminal while interconnecting the robot integration platform. In a preferred embodiment, cloud data transmission is based on secure encrypted communication, effectively ensuring that the operator's control of the robot body is secure. Based on this cloud interconnection, robot status updates can be pushed to the intelligent terminal, supporting intelligent and careful use of the robot.
The sensing system 30 combines sensor data such as vision, touch and distance with a cloud database through a multi-sensor fusion system, realizing deep sensing and learning of the working environment by the robot body and effectively improving the safety and flexibility of the robot at work. The sensing system 30 can also collect usage data and detect user habits, and the cloud platform can then provide product usage experience and cautions, helping to protect the user's rights and maximize the value of the robot platform. In a preferred embodiment, the environmental data is periodically sent to the cloud platform 20 through the sensing system 30, so that the cloud can modify and update the connected robot body.
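The patent does not specify a fusion algorithm. One common rule compatible with the multi-sensor description is inverse-variance weighting, which combines several distance estimates while trusting the less noisy sensors more; the estimates and variances below are hypothetical.

```python
# Hypothetical distance estimates (metres) and noise variances per sensor.
readings = {
    "laser":  (1.00, 0.01),
    "vision": (1.10, 0.04),
    "sound":  (0.80, 0.25),
}

# Inverse-variance weighting: each estimate is weighted by 1/variance.
weights = {name: 1.0 / var for name, (_, var) in readings.items()}
fused = sum(weights[n] * est for n, (est, _) in readings.items()) / sum(weights.values())
print(round(fused, 3))  # 1.013 -- dominated by the low-noise laser estimate
```

For Gaussian noise this weighting is the minimum-variance unbiased combination, which is why the fused value sits close to the laser estimate rather than the noisy acoustic one.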
In this embodiment the robot body 10 is provided with a motion decoupling model for mapping the joint space of the robot into three-dimensional space. The drug dispensing robot integration platform uses the dynamic capture database to model both the robot body and the external environment, divided into two units: a native built-in model and an environment learning model. The native built-in model models and measures the robot's workspace, calibrating the range of the motion space and the positions of frequently used regions through multi-sensor fusion, so as to realize basic modeling of robot motion. The environment learning model rapidly models the robot's working environment by importing cloud data for a set of typical working environments, and also supports deep learning of a specific environment through neural network technology. After neural network training, the robot can operate more freely: the basic model of the robot's perceived environment space is trained through the neural network, and the complexity of the learning model can be customized according to the expected learning period.
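The two-unit structure can be sketched as a base model plus a learned environment residual. The linear base mapping, the constant-offset residual, and the sample data below are all hypothetical; they only illustrate how an environment learning model can correct a native built-in model.

```python
def native_model(q: float) -> float:
    """Native built-in model: idealised base mapping, joint -> end position."""
    return 2.0 * q

class EnvironmentModel:
    """Environment learning model: a residual correction fitted from data."""
    def __init__(self) -> None:
        self.offset = 0.0
    def fit(self, samples) -> None:
        # Average residual between observed positions and the base model.
        self.offset = sum(y - native_model(q) for q, y in samples) / len(samples)
    def predict(self, q: float) -> float:
        return native_model(q) + self.offset

# Observations shifted by a constant environment effect (e.g. a tool offset).
obs = [(0.0, 0.3), (0.5, 1.3), (1.0, 2.3)]
env = EnvironmentModel()
env.fit(obs)
print(round(env.predict(0.25), 3))  # 0.8: base 0.5 plus learned offset 0.3
```

Replacing the constant offset with a neural network gives the deep-learning variant the patent describes, while the native model continues to supply the physically grounded baseline.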
Fig. 3 is a schematic view of the environment learning model of the integrated platform of the dual-arm drug dispensing robot according to an embodiment of the present invention. In actual research and development, a functional interface is provided for adjusting the model's complexity according to the expected deployment time, and the robot's deployment conditions can be set both locally and from the cloud.
In the embodiment of the invention, the medicament dispensing robot platform adopts a convenient drag-teaching control mode, assisted by the sensing system, which simplifies the robot's operation and ensures its working safety. Visual compensation from the sensor array allows high-precision correction of the robot's workspace, ensuring motion accuracy during drug preparation and further ensuring the safety of the operating process. Equipped with force and vision sensors, the robot can protect people by recognizing human bodies and sensing touch, and can take evasive action or stop immediately when necessary, effectively improving safety performance. The robot can also intelligently learn its workspace and find commonly used, reasonable working positions, reducing the probability of direct or indirect injury to humans; it is therefore particularly suitable for hospital scenarios, where safety requirements are high.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (8)

CN201710519673.1A, filed 2017-06-30: Medicament allotment robot integration platform (Active, granted as CN109202861B)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201710519673.1A | 2017-06-30 | 2017-06-30 | Medicament allotment robot integration platform

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201710519673.1A | 2017-06-30 | 2017-06-30 | Medicament allotment robot integration platform

Publications (2)

Publication Number | Publication Date
CN109202861A (en) | 2019-01-15
CN109202861B (en) | 2021-11-23

Family

ID=64960961

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201710519673.1A (Active) | CN109202861B (en) | 2017-06-30 | 2017-06-30

Country Status (1)

Country | Link
CN | CN109202861B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN115890656A* | 2022-10-25 | 2023-04-04 | China Telecom Corp Ltd | Warehousing logistics robot control method, system, electronic device and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN100498600C* | 2007-09-18 | 2009-06-10 | Hunan University | Large condenser underwater operation environment two-joint robot control method
WO2013009887A1* | 2011-07-11 | 2013-01-17 | Board of Regents of the University of Nebraska | Robotic surgical devices, systems and related methods
CN105127997B* | 2015-08-10 | 2017-04-05 | 深圳百思拓威机器人技术有限公司 | Pharmacists' intelligent robot system and its control method
CN106695840A* | 2017-03-09 | 2017-05-24 | Harbin University of Science and Technology | Remote monitored robot based on instruction navigation

Also Published As

Publication number | Publication date
CN109202861A (en) | 2019-01-15

Similar Documents

Publication | Title
Zhou et al. | IoT-enabled dual-arm motion capture and mapping for telerobotics in home care
CN103192390B | Control system of humanoid robot
CN113829343B | Real-time multitasking and multi-man-machine interaction system based on environment perception
Li et al. | A dexterous hand-arm teleoperation system based on hand pose estimation and active vision
Dean-Leon et al. | Whole-body active compliance control for humanoid robots with robot skin
CN108127673A | A kind of contactless robot man-machine interactive system based on Multi-sensor Fusion
CN110039547A | A kind of human-computer interaction terminal and method of flexible mechanical arm remote operating
CN103832504B | Bionic foot-type robot comprehensive simulation method
WO2017115385A2 | System and method for operating and controlling a hyper configurable humanoid robot to perform multiple applications in various work environments
CN104440926A | Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect
CN114800535A | Robot control method, mechanical arm control method, robot and control terminal
Garrido et al. | Modular design and control of an upper limb exoskeleton
CN113183147B | Large-area coverage electronic skin system with remote proximity sense
Krupke et al. | Prototyping of immersive HRI scenarios
CN105014672A | A wearable robot control system for assisting the disabled
CN104002307A | Wearable rescue robot control method and system
Yang et al. | Sensor fusion-based teleoperation control of anthropomorphic robotic arm
CN112171672B | System and method for monitoring and controlling movement behaviors of insect robot
CN109202861B | Medicament allotment robot integration platform
Ai et al. | Master-slave control technology of isomeric surgical robot for minimally invasive surgery
CN104097208B | A kind of multiplex's industry mechanical arm controller based on double-deck CPG
CN103213143A | Multi-element touch sense interactive perceiving system with temperature perceiving function
CN112631148A | Exoskeleton robot platform communication protocol and online simulation control system
CN111309152A | A human-computer flexible interaction system and method based on intent recognition and impedance matching
Kogawa et al. | Development of a remote-controlled drone system by using only eye movements: Design of a control screen considering operability and microsaccades

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
