Disclosure of Invention
In view of the above, it is necessary to provide a head position identification system, an intraoperative control system, and a control method that address the problem of improving the safety of surgical operating equipment.
A head position identification system includes a head frame, a connecting frame, a visual marker, a visual tracking device, and a central control device. The head frame is adapted to be fixed on the head of a patient. One end of the connecting frame is connected to the head frame. The visual marker is connected to the other end of the connecting frame and is used for generating a first optical signal. The visual marker is arranged within the lighting range of the visual tracking device. The visual tracking device is used for collecting the first optical signal and converting it into a first position data signal. The central control device is connected to the visual tracking device. The central control device is used for acquiring the first position data signal in real time and obtaining the head displacement from the first position data signal.
In one embodiment, the connecting frame comprises a plurality of connecting rods connected end to end in sequence.
In one embodiment, two adjacent connecting rods are pivotally connected.
In one embodiment, the visual marker includes a bracket and an optical marker. The bracket is connected to the end of the connecting frame that is far away from the head frame. The bracket includes at least one branch, and the optical marker is disposed on at least one branch. The optical marker is used for generating the first optical signal.
An intraoperative control system includes the head position identification system of any of the embodiments described above, and further includes a robotic arm, a second position identification device, and a controller. The second position identification device is arranged on the robotic arm and is used for generating a second optical signal for identifying the position of the robotic arm. The visual tracking device is further configured to collect the second optical signal and convert it into a second position data signal. The central control device is further configured to acquire the second position data signal in real time and to generate travel path information from the head displacement and the second position data signal. The central control device and the robotic arm are respectively connected to the controller. The central control device is used for outputting the travel path information to the controller. The controller is used for controlling the robotic arm to reach a target position according to the travel path information.
In one embodiment, the intraoperative control system further comprises a mobile cart. The central control device and the controller are accommodated in the mobile cart. The robotic arm is arranged on the mobile cart.
In one embodiment, the head frame is adapted to be coupled to a patient bed, and the mobile cart is coupled to the head frame.
A method of controlling an intraoperative control system as in any one of the embodiments above, comprising:
S100, controlling the visual tracking device to collect the first optical signal and the second optical signal. The visual tracking device converts the first optical signal into a first position data signal and converts the second optical signal into a second position data signal.
S200, controlling the central control device to collect the first position data signal and the second position data signal. The central control device generates travel path information from the first position data signal and the second position data signal.
S300, controlling the central control device to output the travel path information to the controller. The controller controls the robotic arm to reach the target position according to the travel path information.
In one embodiment, the step of generating the travel path information from the first position data signal and the second position data signal in S200 includes:
S210, the central control device determines whether the control state of the controller is the automatic state or the manual state.
S220, if the controller is in the manual state, the central control device prohibits, via the controller, the robotic arm from moving.
S230, the central control device updates the target position information according to the first position data signal, generates the travel path information according to the target position information and the second position data signal, and transmits the travel path information to the controller.
In one embodiment, after S210, the method further includes:
S211, if the controller is in the automatic state, the central control device stops the robotic arm via the controller.
S212, the central control device updates the target position information according to the first position data signal, generates the travel path information according to the updated target position information and the second position data signal, and sends the travel path information to the controller.
S213, the controller controls the robotic arm to reach the target position according to the travel path information.
In one embodiment, after the step of the central control device acquiring the first position data signal and the second position data signal in S200, the method further includes:
S201, the central control device obtains the real-time displacement of the head according to the first position data signal and the previous first position data signal; if the real-time displacement is greater than a preset threshold, S210 is executed.
In the head position identification system provided by the embodiments of the present application, the visual marker is connected to the head frame through the connecting frame. The head frame is adapted to be fixed on the head of a patient, so the visual marker forms a rigid connection with the patient's head. When the head is displaced, the visual marker moves with it, and the position of the first optical signal changes synchronously. The visual tracking device collects the first optical signal and converts it into a first position data signal, from which the central control device obtains the head displacement. Because light propagates through air almost instantaneously, there is negligible delay between the patient's head movement and the first optical signal. Measuring displacement with an optical signal therefore improves the precision of the head position identification system. The head position identification system provides accurate head displacement information for the operating equipment and thereby improves the safety of the operating equipment.
Detailed Description
In order to make the aforementioned objects, features, and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may, however, be embodied in many forms other than those described herein, and those skilled in the art can make similar modifications without departing from the spirit of the application; it is therefore not intended to be limited to the embodiments disclosed below.
Ordinal terms such as "first" and "second" are used herein only to distinguish the objects being described and carry no sequential or technical meaning. The terms "connected" and "coupled", unless otherwise indicated, include both direct and indirect connections (couplings). In the description of the present application, it is to be understood that the terms "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like indicate orientations or positional relationships based on those shown in the drawings. They are used only for convenience and simplicity of description, do not indicate or imply that the devices or elements referred to must have a particular orientation or be constructed and operated in a particular orientation, and are therefore not to be considered as limiting the present application.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the two features are in direct contact, or that they are in indirect contact through an intervening medium. Moreover, a first feature being "on", "over", or "above" a second feature may mean that the first feature is directly or obliquely above the second feature, or may simply mean that the first feature is at a higher level than the second feature. A first feature being "under", "below", or "beneath" a second feature may mean that the first feature is directly or obliquely below the second feature, or may simply mean that the first feature is at a lower level than the second feature.
Referring to figs. 1, 2 and 3, a head position identification system 20 according to an embodiment of the present application includes a head frame 210, a connecting frame 220, a visual marker 230, a visual tracking device 30, and a central control device 50. The head frame 210 is adapted to be secured to the head of a patient. One end of the connecting frame 220 is connected to the head frame 210. The visual marker 230 is connected to the other end of the connecting frame 220 and is used to generate a first optical signal. The visual marker 230 is disposed within the lighting range of the visual tracking device 30. The visual tracking device 30 is configured to collect the first optical signal and convert it into a first position data signal. The central control device 50 is connected to the visual tracking device 30 and is configured to acquire the first position data signal in real time and obtain the head displacement from it.
In the head position identification system 20 provided by the embodiments of the present application, the visual marker 230 is connected to the head frame 210 via the connecting frame 220. The head frame 210 is secured to the head of the patient, so the visual marker 230 forms a rigid connection with the patient's head. When the head is displaced, the visual marker 230 moves together with it, and the position of the first optical signal changes synchronously. The visual tracking device 30 collects the first optical signal and converts it into a first position data signal, from which the central control device 50 obtains the head displacement. Because light propagates through air almost instantaneously, there is negligible delay between the patient's head movement and the first optical signal, so measuring displacement optically improves the precision of the head position identification system 20. The head position identification system 20 thus provides accurate head displacement information for the operating device and improves its safety.
During surgery, the patient lies on the operating bed. The connecting frame 220 keeps the visual marker 230 away from the head, so the movement of head surgical instruments is not hindered: the head position identification system 20 leaves sufficient space for the instruments, facilitating real-time head surgery. At the same time, even though the visual marker 230 is kept away from the head, the head position identification system 20 can monitor the head displacement in real time throughout the entire operation. In one embodiment, the central control device 50 is further configured to update the head position information according to the first position data signal.
The central control device 50 may be a computer, a CPU, a central control unit, or a remote control unit.
The visual marker 230 is a visual tracking marker matched to the visual tracking device 30.
In one embodiment, there is at least one visual marker 230 and at least one visual tracking device 30, and at least one visual tracking device 30 is connected to the central control device 50.
In one embodiment, there are a plurality of visual markers 230 and a plurality of visual tracking devices 30. The plurality of visual markers 230 respectively generate a plurality of first optical signals, which may be collected by a plurality of the visual tracking devices 30 or by a single visual tracking device 30.
In one embodiment, the central control device 50 derives the head displacement from the first position data signal. The head displacement is a vector, comprising both the direction and the magnitude of the movement.
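As a minimal illustration, the displacement derivation can be sketched as follows (the function name and coordinate convention are assumptions for illustration only, not part of the application):

```python
import math

def head_displacement(prev_pos, curr_pos):
    """Derive the head displacement between two consecutive first position
    data signals, each given as (x, y, z) coordinates in millimetres.

    Returns the displacement as a direction vector together with its
    magnitude, matching the description of the displacement as a vector.
    """
    vector = tuple(c - p for p, c in zip(prev_pos, curr_pos))
    magnitude = math.sqrt(sum(d * d for d in vector))
    return vector, magnitude
```

For example, `head_displacement((0, 0, 0), (3, 4, 0))` yields the vector `(3, 4, 0)` with magnitude `5.0`.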
The visual tracking device 30 is connected to the central control device 50 for information transfer, by either a wireless or a wired connection. The wireless connection includes Bluetooth, WIFI, a cellular network, or the like. Referring also to fig. 4, in one embodiment, the connecting frame 220 includes a plurality of connecting rods 221 connected end to end in sequence.
In one embodiment, the maximum diameter of the connecting rod 221 distal from the head frame 210 is greater than the maximum diameter of the connecting rod 221 proximal to the head frame 210, which ensures the stability of the connecting frame 220.
In one embodiment, the pivotal connection between two adjacent connecting rods 221 facilitates adjusting the position of the visual marker 230.
The visual marker 230 may be a marker point or a marker array. The marker point may be an active light emitter or a passive reflector, and may have a regular structure such as a sphere, a sheet, or a square, or an irregular structure. The marker array may be one or more of a polygonal array, a cross array, or another irregular array. Optical markers are disposed on the marker array; they may likewise be active emitters or passive reflectors, with regular structures such as spheres, sheets, or squares, or with irregular structures. Neither the number nor the kind of the optical markers is limited.
In one embodiment, the visual marker 230 includes a bracket 231 and an optical marker 232. The bracket 231 is connected to the end of the connecting frame 220 away from the head frame 210. The bracket 231 comprises at least one branch 201, on which the optical marker 232 is disposed. The optical marker 232 is used to generate the first optical signal.
In one embodiment, the bracket 231 includes one branch 201, and one optical marker 232 is disposed at the middle or end of the branch 201. One end of the connecting frame 220 is connected to the middle or end of the branch 201.
In one embodiment, the bracket 231 includes one branch 201 on which a plurality of the optical markers 232 are arranged. The plurality of optical markers 232 may be distributed along the branch 201, or may be concentrated at its middle or end. One end of the connecting frame 220 is connected to the middle or end of the branch 201.
In one embodiment, the bracket 231 includes a plurality of branches 201. The plurality of branches 201 may be arranged in one or more of a polygonal array, a cross array, or another irregular array, or may be arranged irregularly. A plurality of optical markers 232 are respectively disposed on the plurality of branches 201, and the number of optical markers 232 on each branch 201 may be the same or different.
In any of the above embodiments, at least one of the branches 201 may be left without an optical marker 232.
In one embodiment, the visual marker 230 includes a cross-shaped bracket and four optical markers 232, which are light-reflecting spheres. The middle of the cross-shaped bracket is fixedly connected to the end of the connecting frame 220 far away from the head frame 210. The four optical markers 232 are correspondingly arranged at the ends of the cross-shaped bracket and are within the light collection range of the visual tracking device 30.
As long as any one of the four optical markers 232 can normally generate the first optical signal, displacement monitoring is ensured, which increases the reliability of the visual marker 230.
When the head of the human body 100 moves, it drives the head frame 210, the connecting frame 220, the cross-shaped bracket, and the optical markers 232 to move. Because the four optical markers 232 occupy different positions, the light they reflect comes from different positions; the first optical signal is the light reflection position signal of the optical markers 232. The visual tracking device 30 collects the light reflection position signals of the four optical markers 232 and uploads the corresponding data signals to the central control device 50.
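One simple way to reduce the reflected positions of the visible markers to a single head position is to average them; a real tracking device would instead solve for a full rigid-body pose. A minimal sketch under that simplifying assumption (the function name is illustrative, not from the application):

```python
def marker_centroid(marker_positions):
    """Average the 3-D positions of the optical markers 232 that the visual
    tracking device 30 could see. Monitoring still works as long as at
    least one marker reflects normally.
    """
    if not marker_positions:
        raise ValueError("no optical marker visible to the tracking device")
    n = len(marker_positions)
    # Average each coordinate axis over the visible markers.
    return tuple(sum(axis) / n for axis in zip(*marker_positions))
```

With only two of the four spheres visible, `marker_centroid([(0, 0, 0), (2, 0, 0)])` still yields a usable position, `(1.0, 0.0, 0.0)`.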
In the above embodiments, the head frame 210 may be replaced by other supports to adapt the head position identification system 20 to other surgical scenarios.
Referring to fig. 5, an intraoperative control system 10 including a head position identification system 20 according to any of the embodiments is provided. The intraoperative control system 10 further includes a robotic arm 410, a second position identification device 420, and a controller 430.
The second position identification device 420 is disposed on the robotic arm 410 and is used to generate a second optical signal that identifies the position of the robotic arm 410. The visual tracking device 30 is further configured to collect the second optical signal and convert it into a second position data signal. The central control device 50 is further configured to collect the second position data signal in real time and to generate the travel path information according to the head displacement and the second position data signal. The central control device 50 and the robotic arm 410 are respectively connected to the controller 430. The central control device 50 is configured to output the travel path information to the controller 430, and the controller 430 is configured to control the robotic arm 410 to reach a target position according to the travel path information.
The central control device 50 and the robotic arm 410 are respectively connected to the controller 430 by wireless or wired connections; the wireless connection includes Bluetooth, WIFI, a cellular network, or the like. From start to finish, the surgical procedure may involve both manual and automated operation of the robotic arm 410. In automated operation, the end of the robotic arm 410 is brought to a target location along a planned path; the surgeon then inserts the surgical instrument through the adapter at the end of the robotic arm 410 and into the patient's head to perform surgery. Throughout the procedure, the robotic arm 410 functions as a positioning aid, addressing the problems of inaccurate manual positioning and hand tremor for the physician.
The second position identification device 420 is disposed on the robotic arm 410, so when the robotic arm 410 moves or rotates, the second position identification device 420 moves in synchronization with it. The displacement and real-time position of the robotic arm 410 can therefore be monitored by monitoring the displacement of the second position identification device 420.
The intraoperative control system 10 provided by the embodiments of the present application simultaneously monitors the movement of the head of the human body and the movement of the robotic arm 410. It updates the travel path according to the first position data signal and the second position data signal, which reduces the operation error caused by movement of the head or of the robotic arm 410 and improves the accuracy and safety of the intraoperative control system 10.
In one embodiment, the intraoperative control system 10 further includes a mobile cart 60. The central control device 50 and the controller 430 are housed in the mobile cart 60, and the robotic arm 410 is disposed on the mobile cart 60.
The mobile cart 60 is used to coarsely adjust the position of the robotic arm 410. Before surgery, the operator pushes the mobile cart 60 into place so that the surgical site lies within the operating range of the robotic arm 410.
Referring also to fig. 6, in one embodiment, the head frame 210 is adapted to be fixedly connected to a patient bed, and the mobile cart 60 is fixedly connected to the head frame 210, so as to substantially fix the head frame 210 and establish a rigid connection between the robotic arm 410 and the head frame 210.
In one embodiment, the head frame 210 is fixedly connected only to the patient bed and is not connected to the mobile cart 60, so that a collapse of the patient bed cannot injure the head and cervical vertebrae of the human body.
In one embodiment, the intraoperative control system 10 further includes a display device 70 electrically connected to the central control device 50. The display device 70 is used for displaying the movement position information of the human head, the movement information or travel path information of the robotic arm 410, and the like.
In one embodiment, the display device 70 is fixedly disposed on the mobile cart 60, so that an operator can obtain the relevant information in time.
The display device 70 includes an LED display screen, a monitor, or another display apparatus.
In one embodiment, the intraoperative control system 10 further includes an alarm device 80 connected to the central control device 50. The alarm device 80 includes an audible alarm, a photoelectric alarm, a screen reminder, or the like.
The alarm device 80 is disposed on the mobile cart 60 so that the operator can promptly notice alarm information. The alarm device 80 and the display device 70 are respectively connected to the central control device 50 by wireless or wired connections; the wireless connection includes Bluetooth, WIFI, a cellular network, or the like.
Referring to fig. 7, the present embodiment provides a control method of the intraoperative control system 10. The intraoperative control system 10 comprises a head frame 210, a connecting frame 220, a visual marker 230, a visual tracking device 30, a robotic arm 410, a second position identification device 420, a controller 430, and a central control device 50. The head frame 210 is adapted to be secured to the head of a patient; one end of the connecting frame 220 is secured to the head frame 210; the visual marker 230 is fixedly connected to the other end of the connecting frame 220 and is used for generating a first optical signal; the second position identification device 420 is fixed to the robotic arm 410 and is used for generating a second optical signal; the visual tracking device 30, the controller 430, and the robotic arm 410 are respectively connected to the central control device 50. The control method includes:
S100, controlling the visual tracking device 30 to collect the first optical signal and the second optical signal. The visual tracking device 30 converts the first optical signal into the first position data signal and the second optical signal into the second position data signal.
S200, controlling the central control device 50 to collect the first position data signal and the second position data signal. The central control device 50 generates travel path information from the first position data signal and the second position data signal.
S300, controlling the central control device 50 to output the travel path information to the controller 430. The controller 430 controls the robotic arm 410 to reach a target position according to the travel path information.
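The three steps above can be sketched as one control cycle, with the hardware hidden behind callables (all names here are illustrative assumptions, not interfaces defined in the application):

```python
def control_cycle(collect_signals, to_position, generate_path, drive_arm):
    """Run one pass of S100-S300.

    collect_signals - returns the (first, second) optical signals (S100)
    to_position     - converts an optical signal to a position data signal
    generate_path   - central control device 50: builds travel path info (S200)
    drive_arm       - controller 430: moves robotic arm 410 along the path (S300)
    """
    first_optical, second_optical = collect_signals()   # S100: collect signals
    first_pos = to_position(first_optical)              # ... and convert each
    second_pos = to_position(second_optical)
    path = generate_path(first_pos, second_pos)         # S200: plan the path
    drive_arm(path)                                     # S300: execute it
    return path
```

In a real system this cycle would repeat continuously so that the path is regenerated whenever either position data signal changes.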
The embodiments of the present application provide a control method for the intraoperative control system 10 that simultaneously monitors the movement of the head and the movement of the robotic arm 410. The control method updates the travel path according to the first position data signal and the second position data signal, which reduces the operation error caused by movement of the head or of the robotic arm 410 and improves the accuracy and safety of the intraoperative control system 10.
In one embodiment, before S100, the control method further includes:
S010, adjusting the visual tracking device 30 so that the visual marker 230 and the second position identification device 420 are within the lighting range of the visual tracking device 30.
S010 positions the visual tracking device 30.
In one embodiment, when the robotic arm 410 is used to perform a procedure on a human body, the surgeon may first manually make a coarse adjustment of the position of the robotic arm 410 to bring it within the range of automated operation. After the manual coarse adjustment is completed, the intraoperative control system 10 automatically plans the path and performs the fine operation.
In one embodiment, the step of generating the travel path information from the first position data signal and the second position data signal in S200 includes:
S210, the central control device 50 determines whether the control state of the controller 430 is the automatic state or the manual state.
S220, if the controller 430 is in the manual state, the central control device 50 prohibits, via the controller 430, the robotic arm 410 from moving.
S230, the central control device 50 updates the target position information according to the first position data signal, generates the travel path information according to the target position information and the second position data signal, and sends the travel path information to the controller 430.
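Under the assumption that the replanning operations are exposed as callables (illustrative names only, not from the application), the manual-state branch S220-S230 might look like:

```python
def on_manual_state(prohibit_motion, update_target, generate_path, send_path,
                    first_pos, second_pos):
    """Handle detected head motion while the controller 430 is in the
    manual state (steps S220-S230)."""
    # S220: via the controller, forbid the robotic arm from moving while
    # the travel path is being recomputed.
    prohibit_motion()
    # S230: update the target from the head's new position, replan from the
    # arm's current position, and hand the new path to the controller.
    target = update_target(first_pos)
    path = generate_path(target, second_pos)
    send_path(path)
    return path
```

Note that in this branch the controller only receives the replanned path; the arm stays still until the doctor acts (S240).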
In one embodiment, after S230, the control method further includes:
S240, the central control device 50 controls the controller 430 to place the robotic arm 410 in a standby state, in which the robotic arm 410 waits for the doctor's operation.
In one embodiment, after S210, the control method further includes:
S211, if the controller 430 is in the automatic state, the central control device 50 stops the robotic arm 410 via the controller 430.
S212, the central control device 50 updates the target position information according to the first position data signal, generates the travel path information according to the updated target position information and the second position data signal, and sends the travel path information to the controller 430.
S213, the controller 430 controls the robotic arm 410 to reach the target position according to the travel path information.
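The automatic-state branch S211-S213 can be sketched in the same illustrative style (callable names are assumptions, not defined in the application); unlike the manual branch, the controller drives the arm along the replanned path at once:

```python
def on_automatic_state(stop_arm, update_target, generate_path, move_arm,
                       first_pos, second_pos):
    """Handle detected head motion while the controller 430 is in the
    automatic state (steps S211-S213)."""
    # S211: stop the robotic arm before the path is recomputed.
    stop_arm()
    # S212: update the target from the new head position and regenerate the
    # travel path from the arm's current position, then send it on.
    target = update_target(first_pos)
    path = generate_path(target, second_pos)
    # S213: the controller drives the arm along the new path to the target.
    move_arm(path)
    return path
```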
In one embodiment, after the step of the central control device 50 acquiring the first position data signal and the second position data signal in S200, the control method further includes:
S201, the central control device 50 obtains the real-time displacement of the head according to the first position data signal and the previous first position data signal; if the real-time displacement is greater than a predetermined threshold, S210 is executed.
In one embodiment, the predetermined threshold is 0.3 mm, which ensures the safety of the human body while still allowing the operation to proceed smoothly.
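The S201 comparison against the 0.3 mm threshold can be sketched as follows (positions in millimetres; the function name is an illustrative assumption):

```python
import math

def head_moved_beyond(prev_pos, curr_pos, threshold_mm=0.3):
    """S201: compare the real-time head displacement between two consecutive
    first position data signals against the preset threshold. Returning True
    means S210 (the state check and replanning) should be executed.
    """
    displacement = math.dist(prev_pos, curr_pos)  # Euclidean distance in mm
    return displacement > threshold_mm
```

A 0.2 mm drift is ignored, while a 0.5 mm drift triggers replanning.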
The intraoperative control system described above may also be applied to other surgical sites. The position identification device may be provided at the site to be operated. The control method is used for controlling the intraoperative control system to perform other surgical site operations.
Although the individual steps in the above flowcharts are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time but may be performed at different times, and whose order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least a portion of the sub-steps or stages of other steps.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-described examples merely represent several embodiments of the present application and are not to be construed as limiting the scope of the claims. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.