CN111407406B - Head position identification system, intraoperative control system and control method - Google Patents

Head position identification system, intraoperative control system and control method

Info

Publication number
CN111407406B
CN111407406B (application CN202010243839.3A; also published as CN111407406A)
Authority
CN
China
Prior art keywords
data signal
position data
head
central control
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010243839.3A
Other languages
Chinese (zh)
Other versions
CN111407406A (en)
Inventor
汪全全
李盛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Original Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority to CN202010243839.3A
Publication of CN111407406A
Publication of CN111407406B
Application granted
Legal status: Active
Anticipated expiration

Abstract

The application relates to a head position identification system, an intraoperative control system, and a control method. The visual identifier of the head position identification system is fixed to the head frame through the connecting frame. The head frame is adapted to be secured to the head of a patient, so the visual identifier forms a rigid connection with the patient's head. When the head is displaced, the visual identifier moves with it, and the position of the first optical signal changes synchronously. The visual tracking device collects the first optical signal and converts it into a first position data signal. The central control device acquires the first position data signal and derives the head displacement from it. Because light propagates through air at high speed, measuring the displacement optically improves the precision of the head position identification system. The system thus provides accurate head displacement information to the surgical operating equipment and improves its safety.

Description

Head position identification system, intraoperative control system and control method
Technical Field
The present application relates to the field of medical technology, and in particular, to a head position identification system, an intraoperative control system, and a control method.
Background
Conventional surgical operating devices cannot detect movement of the patient's head during use. Such equipment must therefore establish a rigid connection with the patient's head, and the head cannot move or rotate relative to the device throughout the surgical procedure.
If the surgical operating equipment or the operating table shifts or collapses during the procedure, the patient can be seriously injured. How to improve the safety of surgical operating equipment is therefore an urgent problem.
Disclosure of Invention
In view of the above, it is necessary to provide a head position identification system, an intraoperative control system, and a control method, which are directed to the problem of how to improve the safety of an operating device.
A head position identification system includes a head frame, a connecting frame, a visual identifier, a visual tracking device, and a central control device. The head frame is adapted to be fixed to the head of a patient. One end of the connecting frame is connected to the head frame, and the visual identifier is connected to the other end. The visual identifier is used for generating a first optical signal and is arranged within the lighting range of the visual tracking device. The visual tracking device is used for collecting the first optical signal and converting it into a first position data signal. The central control device is connected to the visual tracking device and is used for acquiring the first position data signal in real time and obtaining the head displacement from it.
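A minimal sketch of the signal flow just described: the first position data signal (the visual identifier's 3-D coordinates) goes in, and the head displacement comes out. This is an illustration only; all class and function names are assumptions, not part of the patent.

```python
# Sketch of the pipeline: the visual tracking device yields a first position
# data signal, and the central control device derives the head displacement.
# All names here are illustrative assumptions, not the patent's API.

class CentralControlDevice:
    def __init__(self, reference_position):
        # Position of the visual identifier at the start of surgery.
        self.reference = tuple(reference_position)

    def head_displacement(self, position_data_signal):
        """Displacement vector of the head relative to the reference."""
        return tuple(p - r for p, r in zip(position_data_signal, self.reference))


# The identifier starts at the origin and is later observed 2 mm along x:
central = CentralControlDevice((0.0, 0.0, 0.0))
displacement = central.head_displacement((2.0, 0.0, 0.0))
```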
In one embodiment, the connecting frame comprises a plurality of connecting rods connected end to end in sequence.
In one embodiment, two adjacent links are pivotally connected.
In one embodiment, the visual identifier includes a bracket and an optical marker. The bracket is connected to the end of the connecting frame away from the head frame and includes at least one branch. The optical marker is disposed on at least one branch and is used for generating the first optical signal.
An intraoperative control system includes the head position identification system of any of the above embodiments, and further includes a mechanical arm, a second position identification device, and a controller. The second position identification device is arranged on the mechanical arm and is used for generating a second optical signal that identifies the position of the mechanical arm. The visual tracking device is further configured to collect the second optical signal and convert it into a second position data signal. The central control device is further configured to acquire the second position data signal in real time and to generate travel path information according to the head displacement and the second position data signal. The central control device and the mechanical arm are respectively connected to the controller. The central control device outputs the travel path information to the controller, and the controller controls the mechanical arm to reach a target position according to the travel path information.
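One way to picture how travel path information could combine the head displacement with the arm's position: the surgical target moves rigidly with the head, so the planned target is shifted by the head displacement before the path to it is computed. The function names below are illustrative assumptions, not the patent's stated algorithm.

```python
# Illustrative sketch: update the target by the head displacement, then plan
# a straight-line travel vector from the arm to the updated target.

def shift_target(target, head_displacement):
    """Update the target position so it follows the patient's head."""
    return tuple(t + d for t, d in zip(target, head_displacement))

def travel_path(arm_position, target):
    """Travel vector from the arm's current position to the target."""
    return tuple(t - a for t, a in zip(target, arm_position))


# Head drifted 1 mm along y; arm currently at the origin:
target = shift_target((10.0, 5.0, 0.0), (0.0, 1.0, 0.0))
path = travel_path((0.0, 0.0, 0.0), target)
```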
In one embodiment, the intraoperative control system further comprises a mobile cart. The central control device and the controller are accommodated in the mobile cart, and the mechanical arm is arranged on it.
In one embodiment, the head frame is adapted to be coupled to a patient bed, and the mobile cart is coupled to the head frame.
A method of controlling an intraoperative control system as in any one of the embodiments above, comprising:
S100, controlling the visual tracking device to collect the first optical signal and the second optical signal. The visual tracking device converts the first optical signal into a first position data signal and converts the second optical signal into a second position data signal.
S200, controlling the central control device to acquire the first position data signal and the second position data signal. The central control device generates travel path information from the first position data signal and the second position data signal.
S300, controlling the central control device to output the travel path information to the controller. The controller controls the mechanical arm to reach the target position according to the travel path information.
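The three steps S100–S300 form one acquire/plan/actuate cycle. The sketch below wires stand-in components together to show that cycle; every class and method name is an illustrative assumption, not the patent's implementation.

```python
# One S100-S200-S300 cycle with stand-in components (names are assumptions).

class VisualTracker:
    """S100: collects both optical signals and converts them to position data."""
    def acquire(self):
        # Stand-in data: head marker at the origin, arm marker 5 mm along x.
        return (0.0, 0.0, 0.0), (5.0, 0.0, 0.0)

class CentralControl:
    """S200: generates travel path information from the two position signals."""
    def __init__(self, target):
        self.target = target  # planned target in the head's reference frame

    def plan(self, head_pos, arm_pos):
        # Shift the target by the head position, then aim from the arm to it.
        goal = tuple(t + h for t, h in zip(self.target, head_pos))
        return tuple(g - a for g, a in zip(goal, arm_pos))

class Controller:
    """S300: drives the mechanical arm along the received path."""
    def __init__(self):
        self.last_path = None

    def move_arm(self, path):
        self.last_path = path

def control_cycle(tracker, central, controller):
    first_pos, second_pos = tracker.acquire()   # S100
    path = central.plan(first_pos, second_pos)  # S200
    controller.move_arm(path)                   # S300
    return path


controller = Controller()
path = control_cycle(VisualTracker(), CentralControl((10.0, 0.0, 0.0)), controller)
```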
In one embodiment, the step of generating the travel path information from the first position data signal and the second position data signal in S200 includes:
S210, the central control device judges whether the control state of the controller is the automatic state or the manual state.
S220, if the controller is in the manual state, the central control device prohibits the mechanical arm from moving through the controller.
S230, the central control device updates the target position information according to the first position data signal, generates the travel path information according to the target position information and the second position data signal, and sends the travel path information to the controller.
In one embodiment, after S210, the method further includes:
S211, if the controller is in the automatic state, the central control device stops the mechanical arm through the controller.
S212, the central control device updates the target position information according to the first position data signal, generates the travel path information according to the updated target position information and the second position data signal, and sends the travel path information to the controller.
S213, the controller controls the mechanical arm to reach the target position according to the travel path information.
In one embodiment, after the step of the central control device acquiring the first position data signal and the second position data signal in S200, the method further includes:
S201, the central control device obtains the real-time displacement of the head according to the first position data signal and the previous first position data signal; if the real-time displacement is greater than a predetermined threshold, S210 is executed.
In the head position identification system provided by the embodiments of the present application, the visual identifier is connected to the head frame through the connecting frame, and the head frame is fixed to the head of the patient, so the visual identifier forms a rigid connection with the patient's head. When the head is displaced, the visual identifier moves with it, and the position of the first optical signal changes synchronously. The visual tracking device collects the first optical signal and converts it into a first position data signal. The central control device acquires the first position data signal and derives the head displacement from it. Because light propagates through air at high speed, there is effectively no time lag between the head movement and the first optical signal, and measuring the displacement optically improves the precision of the head position identification system. The system thus provides accurate head displacement information to the surgical operating equipment and improves its safety.
Drawings
FIG. 1 is a schematic structural view of the head position identification system and the intraoperative control system provided in one embodiment of the present application;
FIG. 2 is a communication diagram of the head position identification system provided in one embodiment of the present application;
FIG. 3 is a schematic diagram of a partial structure of the head position identification system provided in an embodiment of the present application;
FIG. 4 is a communication diagram of the intraoperative control system provided in one embodiment of the present application;
FIG. 5 is a schematic structural diagram of the intraoperative control system provided in another embodiment of the present application;
FIG. 6 is a flow chart of a control method of the intraoperative control system provided in one embodiment of the present application;
FIG. 7 is a flowchart of a control method of the intraoperative control system provided in another embodiment of the present application.
Reference numerals:
Intraoperative control system 10
Human body 100
Head position identification system 20
Head frame 210
Connecting frame 220
Connecting rod 221
Visual identifier 230
Bracket 231
Branch 201
Optical marker 232
Visual tracking device 30
Robotic arm 410
Second position identification device 420
Controller 430
Central control device 50
Mobile cart 60
Display device 70
Alarm device 80
Detailed Description
In order to make the aforementioned objects, features, and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. Numerous specific details are set forth in order to provide a thorough understanding of the present application. The application can, however, be embodied in many forms other than those described here, and those skilled in the art can make similar modifications without departing from its spirit; it is therefore not limited to the embodiments disclosed below.
The numbering of the components as such, e.g., "first", "second", etc., is used herein for the purpose of describing the objects only, and does not have any sequential or technical meaning. The term "connected" and "coupled" when used in this application, unless otherwise indicated, includes both direct and indirect connections (couplings). In the description of the present application, it is to be understood that the terms "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present application and for simplicity in description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, are not to be considered as limiting the present application.
In this application, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may be directly contacting the first and second features or indirectly contacting the first and second features through intervening media. Also, a first feature "on," "over," and "above" a second feature may be directly or diagonally above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the first feature, or may simply mean that the first feature is at a lesser elevation than the second feature.
Referring to FIGS. 1, 2, and 3, a head position identification system 20 according to an embodiment of the present application includes a head frame 210, a connecting frame 220, a visual identifier 230, a visual tracking device 30, and a central control device 50. The head frame 210 is adapted to be secured to the head of a patient. One end of the connecting frame 220 is connected to the head frame 210, and the visual identifier 230 is connected to the other end. The visual identifier 230 is used to generate a first optical signal and is disposed within the lighting range of the visual tracking device 30. The visual tracking device 30 is configured to collect the first optical signal and convert it into a first position data signal. The central control device 50 is connected to the visual tracking device 30; it acquires the first position data signal in real time and obtains the head displacement from it.
In the embodiments of the present application, the visual identifier 230 of the head position identification system 20 is connected to the head frame 210 via the connecting frame 220, and the head frame 210 is secured to the head of the patient, so the visual identifier 230 forms a rigid connection with the patient's head. When the head is displaced, the visual identifier 230 moves with it, and the position of the first optical signal changes synchronously. The visual tracking device 30 collects the first optical signal and converts it into a first position data signal. The central control device 50 acquires the first position data signal and derives the head displacement from it. Because light propagates through air at high speed, measuring the displacement optically improves the precision of the head position identification system 20. The system thus provides accurate head displacement information to the surgical operating equipment and improves its safety.
During an operation, the patient lies on the bed. The connecting frame 220 keeps the head position identification system 20 away from the head, so it does not obstruct the movement of head surgical instruments and leaves sufficient working space for real-time head surgery. At the same time, even though it is far from the head, the head position identification system 20 can monitor the head displacement in real time throughout the operation. In one embodiment, the central control device 50 is further configured to update the head position information according to the first position data signal.
The central control device 50 includes a computer, a CPU, a central control unit, or a remote control unit.
The visual identifier 230 is a visual tracking marker matched to the visual tracking device 30.
In one embodiment, there is at least one visual identifier 230 and at least one visual tracking device 30, and at least one visual tracking device 30 is connected to the central control device 50.
In one embodiment, there are a plurality of visual identifiers 230 and a plurality of visual tracking devices 30. The plurality of visual identifiers 230 respectively generate a plurality of first optical signals, which may be collected by a plurality of the visual tracking devices 30 or by a single visual tracking device 30.
In one embodiment, the central control device 50 derives the head displacement from the first position data signal. The head displacement is a vector, including both the direction and the distance of the movement.
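The displacement vector described above carries both a direction and a distance. A minimal way to compute both from two successive first position data signals is shown below; the function name is an illustrative assumption, not the patent's API.

```python
import math

# Illustrative sketch: head displacement as a vector (per-axis components)
# together with its magnitude, from two successive position data signals.

def head_displacement(prev_signal, curr_signal):
    vector = tuple(c - p for p, c in zip(prev_signal, curr_signal))
    distance = math.sqrt(sum(v * v for v in vector))
    return vector, distance


# Marker moved 3 mm along x and 4 mm along y:
vector, distance = head_displacement((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))
```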
The visual tracking device 30 is connected to the central control device 50 for information transfer, by either a wireless or a wired connection. The wireless connection includes Bluetooth, WIFI, a cellular network, or the like.
Referring also to FIG. 4, in one embodiment, the connecting frame 220 includes a plurality of connecting rods 221 connected end to end in sequence.
In one embodiment, the maximum diameter of the connecting rod 221 distal from the head frame 210 is greater than the maximum diameter of the connecting rod 221 proximal to the head frame 210, to ensure the stability of the connecting frame 220.
In one embodiment, the rotational connection between two adjacent connecting rods 221 facilitates adjusting the position of the visual identifier 230.
The visual identifier 230 may be a marked point or an array of marks. A marked point may be an active light emitter or a passive reflector, and may have a regular structure such as a sphere, a sheet, or a square, or an irregular structure. A mark array may be one or more of a polygon array, a cross array, or another irregular array. Optical marks are disposed on the mark array; they may likewise be active emitters or passive reflectors, with regular (sphere, sheet, square) or irregular structures. Neither the number nor the kind of the optical marks is limited.
In one embodiment, the visual identifier 230 includes a bracket 231 and an optical marker 232. The bracket 231 is connected to the end of the connecting frame 220 away from the head frame 210 and includes at least one branch 201. The optical marker 232 is disposed on at least one branch 201 and is used to generate the first optical signal.
In one embodiment, the bracket 231 includes one branch 201. One optical marker 232 is disposed at the middle or the end of the branch 201, and one end of the connecting frame 220 is connected to the middle or the end of the branch 201.
In one embodiment, the bracket 231 includes one branch 201 on which a plurality of optical markers 232 are arranged. The optical markers 232 may be dispersed along the branch 201 or concentrated at its middle or end. One end of the connecting frame 220 is connected to the middle or the end of the branch 201.
In one embodiment, the bracket 231 includes a plurality of branches 201, which may be arranged in one or more of a polygonal array, a cross array, or another irregular array, or arranged irregularly. A plurality of optical markers 232 are respectively disposed on the branches 201; the number of optical markers 232 on each branch 201 may be the same or different.
In the above embodiments, at least one of the branches 201 is not provided with an optical marker 232.
In one embodiment, the visual identifier 230 includes a cross-shaped bracket and four optical markers 232, which are light-reflecting spheres. The middle of the cross-shaped bracket is fixedly connected to the end of the connecting frame 220 away from the head frame 210, and the four optical markers 232 are correspondingly disposed at the ends of the cross-shaped bracket, within the light collection range of the visual tracking device 30.
As long as one of the four optical markers 232 can normally generate the first optical signal, displacement monitoring is ensured, which increases the reliability of the visual identifier 230.
When the head of the human body 100 moves, it drives the head frame 210, the connecting frame 220, the cross-shaped bracket, and the optical markers 232 to move. Because the four optical markers 232 occupy different positions, the light they reflect comes from different positions; the first optical signal is the light reflection position signal of the optical markers 232. The visual tracking device 30 collects the light reflection position signals of the four optical markers 232 and uploads the corresponding data signals to the central control device 50.
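One common way to turn the reflection position signals of several markers into a head motion estimate is to fit the rigid rotation R and translation t that map the markers' reference positions to their current ones (the Kabsch/SVD method). This is an illustrative assumption, not the algorithm stated in the patent.

```python
import numpy as np

# Illustrative sketch: estimate the rigid transform (R, t) between two sets
# of marker positions, so that markers_now ≈ R @ markers_ref + t.

def rigid_transform(markers_ref, markers_now):
    P = np.asarray(markers_ref, dtype=float)
    Q = np.asarray(markers_now, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered marker sets.
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Guard against reflections so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t


# Four markers translated by 2 mm along x, with no rotation:
ref = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
now = [(x + 2.0, y, z) for x, y, z in ref]
R, t = rigid_transform(ref, now)
```

With at least three non-collinear markers the transform is fully determined, which is one reason an array of four reflective spheres is convenient.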
In the above embodiments, the head frame 210 may be replaced by another support to adapt the head position identification system 20 to other surgical scenarios.
Referring to FIG. 5, an intraoperative control system 10 including the head position identification system 20 of any of the above embodiments is provided. The intraoperative control system 10 further includes a robotic arm 410, a second position identification device 420, and a controller 430.
The second position identification device 420 is disposed on the robotic arm 410 and is used to generate a second optical signal that identifies the position of the robotic arm 410. The visual tracking device 30 is further configured to collect the second optical signal and convert it into a second position data signal. The central control device 50 is further configured to acquire the second position data signal in real time and generate travel path information from the head displacement and the second position data signal. The central control device 50 and the robotic arm 410 are respectively connected to the controller 430. The central control device 50 outputs the travel path information to the controller 430, which controls the robotic arm 410 to reach a target position according to the travel path information.
The central control device 50 and the robotic arm 410 are respectively connected to the controller 430, by either a wired or a wireless connection; the wireless connection includes Bluetooth, WIFI, a cellular network, or the like. From start to finish, the surgical procedure may involve both manual and automated operation of the robotic arm 410. In automated operation, the end of the robotic arm 410 reaches a target location along a planned path; the surgeon then inserts the surgical instrument through the adapter at the end of the robotic arm 410 and into the patient's head to operate. Throughout the procedure, the robotic arm 410 serves as a positioning aid, addressing the problems of inaccurate manual positioning and hand tremor.
The second position identification device 420 is disposed on the robotic arm 410 and moves in synchronization with it as the arm moves or rotates, so the displacement and real-time position of the robotic arm 410 can be monitored by monitoring the displacement of the second position identification device 420.
The intraoperative control system 10 provided by the embodiments of the present application simultaneously monitors the movement of the patient's head and the movement of the robotic arm 410, and updates the travel path according to the first position data signal and the second position data signal. This reduces operation errors caused by movement of the head or of the robotic arm 410 and improves the accuracy and safety of the intraoperative control system 10.
In one embodiment, the intraoperative control system 10 further includes a mobile cart 60. The central control device 50 and the controller 430 are housed in the mobile cart 60, and the robotic arm 410 is disposed on it.
The mobile cart 60 is used to coarsely adjust the position of the robotic arm 410; before surgery, the operator pushes the cart to bring the robotic arm 410 within its operating range.
Referring also to FIG. 6, in one embodiment, the head frame 210 is adapted to be fixedly connected to the patient bed, and the mobile cart 60 is fixedly connected to the head frame 210, so as to substantially fix the head frame 210 and establish a rigid connection between the robotic arm 410 and the head frame 210.
In one embodiment, the head frame 210 is fixedly connected only to the patient bed and not to the mobile cart 60, to prevent the patient's head and cervical vertebrae from being injured if the patient bed collapses.
In one embodiment, the intraoperative control system 10 further includes a display device 70 electrically connected to the central control device 50. The display device 70 is used for displaying the movement position information of the patient's head, the movement information of the robotic arm 410, the travel path information, and the like.
In one embodiment, the display device 70 is fixedly disposed on the mobile cart 60, so that the operator can obtain the relevant information in time.
The display device 70 includes an LED display screen, a monitor, a display device, or the like.
In one embodiment, the intraoperative control system 10 further includes an alarm device 80 connected to the central control device 50. The alarm device 80 includes an audible alarm, a photoelectric alarm, an on-screen reminder, or the like.
The alarm device 80 is disposed on the mobile cart 60, so that the operator can notice alarm information in time. The alarm device 80 and the display device 70 are respectively connected to the central control device 50, by either a wired or a wireless connection; the wireless connection includes Bluetooth, WIFI, a cellular network, or the like.
Referring to FIG. 7, the present embodiment provides a control method of the intraoperative control system 10, wherein the intraoperative control system 10 includes a head frame 210, a connecting frame 220, a visual identifier 230, a visual tracking device 30, a robotic arm 410, a second position identification device 420, a controller 430, and a central control device 50. The head frame 210 is adapted to be secured to the head of a patient; the connecting frame 220 is secured at one end to the head frame 210, and the visual identifier 230 is fixedly connected to its other end. The visual identifier 230 is used for generating a first optical signal; the second position identification device 420 is fixed to the robotic arm 410 and is used for generating a second optical signal. The visual tracking device 30, the controller 430, and the robotic arm 410 are respectively connected to the central control device 50. The control method includes:
S100, controlling the visual tracking device 30 to collect the first optical signal and the second optical signal. The visual tracking device 30 converts the first optical signal into the first position data signal and the second optical signal into the second position data signal.
S200, controlling the central control device 50 to acquire the first position data signal and the second position data signal. The central control device 50 generates travel path information from the first position data signal and the second position data signal.
S300, controlling the central control device 50 to output the travel path information to the controller 430. The controller 430 controls the robotic arm 410 to reach a target position according to the travel path information.
The embodiments of the present application provide a control method of the intraoperative control system 10 that monitors the movement of the head and the movement of the robotic arm 410 simultaneously and updates the travel path according to the first position data signal and the second position data signal. This reduces operation errors caused by movement of the patient's head or of the robotic arm 410 and improves the accuracy and safety of the intraoperative control system 10.
In one embodiment, before S100, the control method further includes:
S010, adjusting the visual tracking device 30 so that the visual identifier 230 and the second position identification device 420 are within the lighting range of the visual tracking device 30.
S010 positions the visual tracking device 30.
In one embodiment, when performing a procedure on a human body with the robotic arm 410, the surgeon may first manually make a coarse adjustment of the arm's position to bring the robotic arm 410 within the range of automated operation. After the manual coarse adjustment is completed, the intraoperative control system 10 automatically plans the path and performs the fine operation.
In one embodiment, the step of generating the travel path information from the first position data signal and the second position data signal in S200 includes:
S210, the central control device 50 determines whether the control state of the controller 430 is the automatic state or the manual state.
S220, if the controller 430 is in the manual state, the central control device 50 prohibits the robotic arm 410 from moving through the controller 430.
S230, the central control device 50 updates the target position information according to the first position data signal, generates the travel path information according to the target position information and the second position data signal, and sends the travel path information to the controller 430.
In one embodiment, after S230, the control method further includes:
S240, the central control device 50 controls the controller 430 to place the robot arm 410 in a standby state, in which the robot arm 410 waits for the doctor's operation.
In one embodiment, after S210, the control method further includes:
S211, if the controller 430 is in the automatic state, the central control device 50 stops the robot arm 410 through the controller 430.
S212, the central control device 50 updates the target position information according to the first position data signal, generates the travel path information according to the updated target position information and the second position data signal, and sends the travel path information to the controller 430.
S213, the controller 430 controls the robot arm 410 to reach the target position according to the travel path information.
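The branch on the control state (S220-S240 for the manual state, S211-S213 for the automatic state) can be summarised in a sketch. The state names and command strings below are hypothetical, chosen only to make the two branches explicit; they do not appear in the application.

```python
# Hypothetical sketch of the S210 state check described above.
# Manual state: S220 prohibit motion, S230 replan and send, S240 standby.
# Automatic state: S211 stop, S212 replan and send, S213 resume motion.

MANUAL, AUTOMATIC = "manual", "automatic"

def on_head_displacement(state):
    """Return the ordered commands the central control device 50 would
    issue to the controller 430 after a significant head displacement."""
    if state == MANUAL:
        return ["prohibit_motion",      # S220
                "send_updated_path",    # S230
                "standby"]              # S240: wait for the doctor
    elif state == AUTOMATIC:
        return ["stop_arm",             # S211
                "send_updated_path",    # S212
                "move_to_target"]       # S213
    raise ValueError(f"unknown control state: {state}")

print(on_head_displacement(MANUAL))
print(on_head_displacement(AUTOMATIC))
```

Note that both branches regenerate and send the travel path; they differ only in whether the arm then moves automatically or waits for the doctor.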
In one embodiment, after the step in S200 in which the central control device 50 collects the first position data signal and the second position data signal, the control method further includes:
S201, the central control device 50 obtains a real-time displacement of the head from the first position data signal and the previous first position data signal; if the real-time displacement is greater than a predetermined threshold, S210 is executed.
In one embodiment, the predetermined threshold is 0.3 mm, which ensures both the safety of the patient and the smooth progress of the operation.
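The S201 check can be sketched as follows. The Euclidean-distance displacement model and all names are assumptions made for the example; the application does not specify how the displacement is computed.

```python
# Sketch of the S201 safety check: compare the latest first position data
# signal with the previous one and trigger S210 only when the real-time
# displacement exceeds the 0.3 mm threshold stated above.
import math

THRESHOLD_MM = 0.3  # predetermined threshold from the embodiment above

def head_moved_significantly(prev_signal, curr_signal, threshold=THRESHOLD_MM):
    """Euclidean displacement between consecutive first position data
    signals (coordinates assumed to be in millimetres)."""
    return math.dist(prev_signal, curr_signal) > threshold  # Python 3.8+

# Small tracking jitter stays below the threshold; a larger head shift
# exceeds it and would dispatch to the S210 state check.
print(head_moved_significantly((0.0, 0.0, 0.0), (0.1, 0.1, 0.1)))
print(head_moved_significantly((0.0, 0.0, 0.0), (0.3, 0.2, 0.1)))
```

A per-axis or windowed filter would also satisfy the described behaviour; the point is only that replanning is gated on a fixed displacement threshold.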
The intraoperative control system described above may also be applied to other surgical sites. In that case, the position identification device may be provided at the site to be operated on, and the control method is used to control the intraoperative control system to perform the operation at that site.
Although the individual steps in the above flowcharts are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages; these sub-steps or stages are not necessarily performed at the same moment but may be performed at different times, and their order of execution is not necessarily sequential, as they may be performed in turn or in alternation with other steps or with at least a portion of the sub-steps or stages of other steps.
The technical features of the embodiments described above may be combined arbitrarily. For the sake of brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The examples above merely represent several embodiments of the present application and are not to be construed as limiting the scope of the claims. It should be noted that a person skilled in the art may make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

a second position identification device (420) disposed on the robot arm (410), the second position identification device (420) being configured to generate a second optical signal for identifying the position of the robot arm (410), the visual tracking device (30) being further configured to acquire the second optical signal, the visual tracking device (30) being further configured to convert the second optical signal into a second position data signal, the central control device (50) being further configured to acquire the second position data signal in real time and generate travel path information according to the head displacement and the second position data signal, the central control device (50) being further configured to update the travel path information according to the first position data signal and the second position data signal;
7. A method for controlling an intraoperative control system, characterized in that the intraoperative control system (10) comprises a head frame (210), a connecting frame (220), a visual identifier (230), a visual tracking device (30), a mechanical arm (410), a second position identifier (420), a controller (430) and a central control device (50), wherein the head frame (210) is used for being fixed on the head of a patient, one end of the connecting frame (220) is connected to the head frame (210), the visual identifier (230) is connected with the other end of the connecting frame (220), the visual identifier (230) is used for generating a first light signal, the second position identifier (420) is arranged on the mechanical arm (410), the second position identifier (420) is used for generating a second light signal, the visual tracking device (30), the controller (430) and the mechanical arm (410) are respectively connected with the central control device (50), the control method comprises the following steps:
CN202010243839.3A | 2020-03-31 | 2020-03-31 | Head position identification system, intraoperative control system and control method | Active | CN111407406B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202010243839.3A | 2020-03-31 | 2020-03-31 | Head position identification system, intraoperative control system and control method


Publications (2)

Publication Number | Publication Date
CN111407406A (en) | 2020-07-14
CN111407406B (en) | 2022-04-26

Family

ID=71485348

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202010243839.3A (Active; CN111407406B) | Head position identification system, intraoperative control system and control method | 2020-03-31 | 2020-03-31

Country Status (1)

Country | Link
CN (1) | CN111407406B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101257844A (en) * | 2005-04-29 | 2008-09-03 | 范德比特大学 | Systems and methods for providing access to the cochlea in vivo using image navigation
CN103445929A (en) * | 2013-09-10 | 2013-12-18 | 河南科技大学第一附属医院 | Operation fixing cap
CN103735317A (en) * | 2013-12-18 | 2014-04-23 | 宁波市全灵医疗设备股份有限公司 | Navigation device in orthopedics department and preparation method of navigation device
CN105050527A (en) * | 2013-03-15 | 2015-11-11 | 圣纳普医疗(巴巴多斯)公司 | Intelligent positioning system and methods therefore
CN105193458A (en) * | 2005-12-28 | 2015-12-30 | Pt稳定股份公司 | Method and system for compensating a self-caused displacement of tissue
CN105208958A (en) * | 2013-03-15 | 2015-12-30 | 圣纳普医疗(巴巴多斯)公司 | Systems and methods for navigation and simulation of minimally invasive therapy
CN106580470A (en) * | 2016-10-18 | 2017-04-26 | 南京医科大学附属口腔医院 | System and method for head positioning on basis of binocular vision
CN106687063A (en) * | 2014-08-13 | 2017-05-17 | 株式会社高永科技 | Tracking system and tracking method using same
CN108201470A (en) * | 2016-12-16 | 2018-06-26 | 上海铂联医疗科技有限公司 | A kind of autonomous type tooth-implanting robot system and its device and method
CN109330687A (en) * | 2018-11-26 | 2019-02-15 | 上海术凯机器人有限公司 | A kind of surgical robot system
CN109571412A (en) * | 2019-01-15 | 2019-04-05 | 北京华晟经世信息技术有限公司 | A kind of mechanical arm independent navigation mobile system and method
CN109864806A (en) * | 2018-12-19 | 2019-06-11 | 江苏集萃智能制造技术研究所有限公司 | The Needle-driven Robot navigation system of dynamic compensation function based on binocular vision
CN110494095A (en) * | 2017-04-20 | 2019-11-22 | 直观外科手术操作公司 | System and method for constraining virtual reality surgery systems
CN110650703A (en) * | 2017-05-05 | 2020-01-03 | 斯科皮斯有限公司 | Surgical Navigation System

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN104083217B (en) * | 2014-07-03 | 2016-08-17 | 北京天智航医疗科技股份有限公司 | A kind of surgery positioning device and robotic surgical system
JP6657933B2 (en) * | 2015-12-25 | 2020-03-04 | ソニー株式会社 | Medical imaging device and surgical navigation system
US10631935B2 (en) * | 2016-10-25 | 2020-04-28 | Biosense Webster (Israel) Ltd. | Head registration using a personalized gripper
KR102019482B1 (en) * | 2017-07-31 | 2019-09-06 | 경북대학교 산학협력단 | Optical tracking system and controlling method thereof
CN107440797B (en) * | 2017-08-21 | 2020-04-03 | 刘洋 | Registration and registration system and method for surgical navigation
CN108042218A (en) * | 2017-12-05 | 2018-05-18 | 北京军秀咨询有限公司 | A kind of neurosurgery patient head mark automatic vision positioner and method
CN107970060A (en) * | 2018-01-11 | 2018-05-01 | 上海联影医疗科技有限公司 | Surgical robot system and its control method
CN108705536A (en) * | 2018-06-05 | 2018-10-26 | 雅客智慧(北京)科技有限公司 | Dental robot path planning system and method based on visual navigation
CN109692050B (en) * | 2018-12-26 | 2020-05-22 | 雅客智慧(北京)科技有限公司 | Calibration and tracking method and device for dental implant navigation operation

Also Published As

Publication numberPublication date
CN111407406A (en)2020-07-14

Similar Documents

Publication | Title
AU2020399817B2 (en) | Navigation surgery system and registration method therefor, electronic device, and support apparatus
JP6840815B2 (en) | Surgical robot automation with tracking markers
CN216021360U (en) | A surgical navigation system
US11717350B2 (en) | Methods for robotic assistance and navigation in spinal surgery and related systems
JP6714737B2 (en) | Surgical robot system and related method for monitoring target trajectory deviation
CN110652359B (en) | Surgical robot system
US7894872B2 (en) | Computer assisted orthopaedic surgery system with light source and associated method
JP6704034B2 (en) | Surgical robot system with retractor
WO2020151598A1 (en) | Surgery robot system and use method therefor
CN111317572A (en) | Surgical robot automation with tracking markers
JP2018011938A (en) | Surgical robotic automation with tracking markers
JP6751461B2 (en) | Surgical robot automation with tracking markers
JP6894466B2 (en) | Systems and methods related to robotic guidance in surgery
CN114431960A (en) | A method for identifying and segmenting anatomical structures from cone-beam CT images
CN109549706A (en) | A kind of surgical operation auxiliary system and its application method
JP6979049B2 (en) | Robot systems and related methods that provide co-registration using natural standards
JP2021003552A (en) | Surgical robotic automation with tracking markers
JP7082090B2 (en) | How to tune virtual implants and related surgical navigation systems
JP2021041166A (en) | Surgical robot automation with tracking markers
CN113262049B (en) | System and method for determining the optimal 3-dimensional position and orientation of an imaging device for imaging a patient's skeleton
CN111407406B (en) | Head position identification system, intraoperative control system and control method
JP2018108344A (en) | System and method for measuring depth of instruments
WO2025137057A1 (en) | Hybrid optical-inertial bone tracking
CN209826970U (en) | Surgical operation auxiliary system
CN110636797B (en) | Device and method for determining positioning data of an X-ray image acquisition device on a mobile patient support unit

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
