Disclosure of Invention
The following is a summary of the subject matter described in detail herein. This summary is not intended to limit the scope of the claims.
The present application aims to solve, at least to some extent, one of the technical problems in the related art. To that end, embodiments of the application provide a surgical positioning and guiding method, device, equipment and medium that can accurately reflect a guiding workpiece in an image in real time.
An embodiment of a first aspect of the present application is a surgical positioning and guiding method applied to a surgical positioning and guiding device, the surgical positioning and guiding device including a positioning workpiece and a guiding workpiece, the positioning workpiece including a plurality of reflective balls, the guiding workpiece including a positioning ball, the reflective balls and the positioning ball forming an optical module, the surgical positioning and guiding method including the steps of:
acquiring a conversion relation between a coordinate system of the optical module and a coordinate system of the preoperative image;
acquiring coordinates of a first vector, coordinates of a second vector, and an angle by which the first vector rotates around the second vector, wherein the first vector is a vector from the origin of the coordinate system of the optical module to the positioning ball, and the second vector is a normal vector of a projection plane formed by the plurality of reflective balls;
determining coordinate information of the guiding workpiece under the coordinate system of the optical module according to the coordinates of the first vector, the coordinates of the second vector and the rotation angle of the first vector around the second vector;
according to the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image, converting the coordinate information of the guiding workpiece under the coordinate system of the optical module to obtain the coordinate information of the guiding workpiece under the coordinate system of the preoperative image; and
displaying the guiding workpiece in the preoperative image according to the coordinate information of the guiding workpiece under the coordinate system of the preoperative image.
In certain embodiments of the first aspect of the present application, the acquiring a conversion relationship between a coordinate system of the optical module and a coordinate system of the preoperative image includes:
acquiring first coordinate values of a plurality of auxiliary reflective balls of an auxiliary tool under the coordinate system of the optical module and second coordinate values of the auxiliary reflective balls under the coordinate system of the preoperative image;
and determining a conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image according to the first coordinate value and the second coordinate value.
In certain embodiments of the first aspect of the present application, the determining a conversion relationship between a coordinate system of the optical module and a coordinate system of the preoperative image according to the first coordinate value and the second coordinate value includes:
determining a plurality of first coordinate values in a preset time period as target coordinate values;
determining a respiratory cycle according to the maximum value of the target coordinate value;
and sampling the first coordinate value and the second coordinate value in stages based on the respiratory cycle, so as to determine the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image.
In certain embodiments of the first aspect of the present application, the determining of the coordinate information of the guiding workpiece under the coordinate system of the optical module according to the coordinates of the first vector, the coordinates of the second vector and the rotation angle of the first vector around the second vector is expressed by the following formula: P' = P·cosθ + (N×P)·sinθ + N(N·P)(1 - cosθ), where P is the first vector, N is the second vector, θ is the rotation angle of the first vector around the second vector, and P' is the coordinate information of the guiding workpiece under the coordinate system of the optical module.
In certain embodiments of the first aspect of the present application, the coordinates of the first vector are denoted as (px, py, pz), the coordinates of the second vector are denoted as (ax, ay, az), and the coordinate information of the guiding workpiece under the coordinate system of the optical module is denoted as (px', py', pz'), where px' = px*cosθ + (ay*pz - az*py)*sinθ + ax*(ax*px + ay*py + az*pz)*(1 - cosθ), py' = py*cosθ + (az*px - ax*pz)*sinθ + ay*(ax*px + ay*py + az*pz)*(1 - cosθ), and pz' = pz*cosθ + (ax*py - ay*px)*sinθ + az*(ax*px + ay*py + az*pz)*(1 - cosθ).
An embodiment of the second aspect of the application is a surgical positioning and guiding device that applies the surgical positioning and guiding method described above, wherein the surgical positioning and guiding device comprises a positioning workpiece and a guiding workpiece, the positioning workpiece comprises a plurality of reflective balls located on a projection plane, the guiding workpiece comprises a positioning ball, and the reflective balls and the positioning ball form an optical module.
In certain embodiments of the second aspect of the present application, the surgical positioning guide device further includes a mounting member, the positioning workpiece is disposed on one side of the mounting member, a probe is disposed at one end of the mounting member, the guiding workpiece is movably connected to one end of the mounting member, and a display screen for displaying images is disposed on the mounting member.
In certain embodiments of the second aspect of the present application, the guiding workpiece further comprises a first clamping jaw, a second clamping jaw, a first adjusting clamping piece, a second adjusting clamping piece, a locking piece for locking the guiding workpiece, and a return spring for returning the guiding workpiece to a preset position, wherein the first clamping jaw is connected with the first adjusting clamping piece, the second clamping jaw is connected with the second adjusting clamping piece, the first adjusting clamping piece and the second adjusting clamping piece fit together, the return spring is located inside the first adjusting clamping piece and the second adjusting clamping piece, and the locking piece is connected with the return spring through a gear.
An embodiment of the third aspect of the application is an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the surgical localization guidance method as described above when executing the computer program.
An embodiment of the fourth aspect of the present application is a computer-readable storage medium storing computer-executable instructions for performing the surgical localization guidance method as described above.
The above technical scheme has at least the following advantages: through the cooperation between the reflective balls of the positioning workpiece and the positioning ball of the guiding workpiece, the real-time position of the guiding workpiece can be determined intelligently, accurately and quickly, and the guiding path of the guiding workpiece is displayed on a digital human body in real time. A doctor can thus clearly grasp the real-time posture of the guiding workpiece and clearly judge whether the puncture path is wrong, which solves the problem of blind puncture, improves the precision and quality of the puncture operation, reduces the difficulty and risk of the operation through the surgical positioning and guiding device, and also reduces the dependence of the operation on professional experience and ability.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
It should be noted that although functional block division is performed in a device diagram and a logic sequence is shown in a flowchart, in some cases, the steps shown or described may be performed in a different order than the block division in the device, or in the flowchart. The terms first, second and the like in the description, in the claims and in the above-described figures, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
Embodiments of the present application will be further described below with reference to the accompanying drawings.
Embodiments of the present application provide a surgical positioning and guiding device.
Referring to fig. 2, the surgical positioning and guiding device includes a mounting member 20, a positioning workpiece 10, and a guiding workpiece 50; the positioning workpiece 10 includes a plurality of reflective balls 11 on a projection plane, the guiding workpiece 50 includes a positioning ball 55, and the reflective balls 11 and the positioning ball 55 constitute an optical module.
Specifically, the positioning workpiece 10 of the embodiment of the present application includes four reflective balls 11, and of course, in other embodiments, the number of reflective balls 11 may be other numbers, for example, 3, and the number of reflective balls 11 may range from 3 to 8.
The guide workpiece 50 of the embodiment of the present application includes one positioning ball 55, but of course, in other embodiments, the number of positioning balls 55 may be other numbers, such as 2, and the number of positioning balls 55 may be greater than or equal to one.
The guiding workpiece 50 includes a positioning ball 55, defined as positioning ball P; the positioning ball 55 is disposed on the guiding workpiece 50 for attitude calculation and real-time tracking of the guiding workpiece 50. The positioning workpiece 10 includes four reflective balls 11, defined as reflective ball A, reflective ball B, reflective ball C, and reflective ball D, respectively. The line connecting reflective ball A and reflective ball B is parallel to the initial direction of the guiding workpiece 50, i.e. the line AB is parallel to OP.
The reflective balls 11 can be detected and tracked in real time by an optical sensing device. The optical sensing device is an optical positioning and navigation system with extremely high precision and a non-interpolated measurement rate of 335 Hz. It consists of two cameras; it captures real-time video images through the cameras, identifies and tracks the reflective balls 11, reflective planes and infrared lamps in the real-time video images, observes reflective and/or active fiducial points (infrared lamps), and calculates the positions of the reflective balls by triangulation. When several fiducial points are fixed to one marker, the system can determine its six-degree-of-freedom (x, y, z, α, β, γ) data. The positioning volume reaches 520 mm × 80 mm × 95 mm. The system is compatible with passive image-guided surgical tools.
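The triangulation principle mentioned above can be illustrated with a short, non-authoritative sketch. The function below is a hypothetical linear (DLT) triangulation using NumPy; the projection matrices P1 and P2 and the pixel coordinates are assumed inputs for illustration only and are not values given in this application.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one reflective ball from two cameras.

    P1, P2 : 3x4 camera projection matrices of the two cameras.
    x1, x2 : (u, v) pixel coordinates of the ball in each camera image.
    Returns the estimated 3D position in the optical coordinate system.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```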
The positioning workpiece 10 is provided on one side of the mounting member 20. One end of the mounting member 20 is provided with a probe 30; specifically, the probe 30 is a B-mode ultrasound probe.
One end of the mounting member 20 is also movably connected with the guiding workpiece 50; the guiding workpiece 50 is mounted on the mounting member 20 through a rotating shaft and can rotate around the rotating shaft.
The other end of the mounting member 20 is provided with a mount 21, through which the surgical positioning and guiding device may be mounted on a robotic arm.
The mounting member 20 is provided with a display screen 40 for displaying images, such as the preoperative image and a digitized, imaged representation of the guiding workpiece 50.
Referring to fig. 3 and 4, in particular, the guiding workpiece 50 further includes a first clamping jaw 51, a second clamping jaw 52, a first adjusting clamping piece 53, a second adjusting clamping piece 54, a locking member 56 for locking the guiding workpiece 50, and a return spring for returning the guiding workpiece 50 to a preset position. The first clamping jaw 51 is connected to the first adjusting clamping piece 53, the second clamping jaw 52 is connected to the second adjusting clamping piece 54, the first adjusting clamping piece 53 and the second adjusting clamping piece 54 fit together, the return spring is located inside the first adjusting clamping piece 53 and the second adjusting clamping piece 54, and the locking member 56 is connected to the return spring through a gear.
The mounting member 20 is provided with an electric jaw; the surgical positioning and guiding device is mounted on the electric jaw through the first clamping jaw 51 and the second clamping jaw 52, and the guiding workpiece 50 is opened and closed by driving the electric jaw, so that after guiding is completed the guiding workpiece 50 can be opened and separated from the puncture instrument.
The locking member 56 is a locking screw. By tightening the locking member 56, the guiding workpiece 50 is locked so that it is fixed in position relative to the mounting member 20; by loosening the locking member 56, the guiding workpiece 50 becomes movable. After being moved, the guiding workpiece 50 can be automatically reset to the preset position by the return spring. This makes the use of the surgical positioning and guiding device more flexible.
Referring to fig. 1, the surgical positioning and guiding device can implement the following surgical positioning and guiding method, which includes, but is not limited to, the following steps:
Step S100, acquiring a preoperative image, and planning a surgical path according to the preoperative image to obtain a planned path;
Step S200, obtaining a conversion relation between a coordinate system of the optical module and a coordinate system of the preoperative image;
Step S300, acquiring coordinates of a first vector, coordinates of a second vector and an angle of rotation of the first vector around the second vector;
Step S400, determining coordinate information of the guiding workpiece 50 under the coordinate system of the optical module according to the coordinates of the first vector, the coordinates of the second vector and the rotation angle of the first vector around the second vector;
Step S500, according to the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image, converting the coordinate information of the guiding workpiece 50 under the coordinate system of the optical module to obtain the coordinate information of the guiding workpiece 50 under the coordinate system of the preoperative image;
Step S600, displaying the guiding workpiece 50 in the preoperative image according to the coordinate information of the guiding workpiece 50 in the coordinate system of the preoperative image;
Step S700, calibrating the current path according to the planned path and adjusting the position of the guiding workpiece 50.
For step S100, a preoperative image of the patient is acquired by a CT apparatus. The surgical path is planned from the preoperative image by an experienced physician to obtain the planned path. Alternatively, the preoperative image is input into a neural-network-based path planning model, and the path planning model automatically and intelligently plans the surgical path from the preoperative image to obtain the planned path.
Of course, in other embodiments, the preoperative image may be obtained by Magnetic Resonance Imaging (MRI), Positron Emission Computed Tomography (PET), Digital Subtraction Angiography (DSA), or 2D/3D endoscopic imaging.
For step S200, in order to achieve real-time positioning and synchronization of the guiding workpiece 50, the spatial positional relationship between the optical module and the preoperative image needs to be unified.
Obtaining a conversion relationship between a coordinate system of the optical module and a coordinate system of the preoperative image, including but not limited to the following steps:
acquiring first coordinate values of a plurality of auxiliary reflective balls 11 of an auxiliary tool under the coordinate system of the optical module and second coordinate values of the auxiliary reflective balls under the coordinate system of the preoperative image; and
determining a conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image according to the first coordinate values and the second coordinate values.
Specifically, the auxiliary tool is a flexible structural tool; in this embodiment it comprises five auxiliary reflective balls 11, which are fixed to the surface of the manikin along a fixed axial direction.
The coordinate system of the optical module is defined as C_ottm and the coordinate system of the preoperative image as C_icnm. The position information of each auxiliary reflective ball 11 under the coordinate system of the optical module, i.e. the first coordinate value, is acquired in turn and denoted as Pj (j = 1, 2, ..., 5), and the position information of each auxiliary reflective ball 11 under the coordinate system of the preoperative image, i.e. the second coordinate value, is acquired in turn and denoted as Pi (i = 1, 2, ..., 5).
The first coordinate values and the second coordinate values satisfy the following relation: P_icnm = R_ottm-icnm × P_ottm + T_ottm-icnm, where R_ottm-icnm is a 3×3 rotation matrix and T_ottm-icnm is a 3×1 translation vector; solving this relation yields the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image.
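As an informal illustration (not the exact algorithm of this application) of how the rotation matrix R_ottm-icnm and translation vector T_ottm-icnm could be estimated from the five corresponding point pairs, the following sketch uses the standard SVD-based least-squares rigid fit; the function name and the use of NumPy are assumptions.

```python
import numpy as np

def fit_rigid_transform(p_ottm, p_icnm):
    """Least-squares rigid fit such that p_icnm ≈ R @ p_ottm + T.

    p_ottm, p_icnm : (N, 3) arrays of corresponding auxiliary-ball
    coordinates in the optical-module and preoperative-image frames.
    Returns the 3x3 rotation R and 3-vector translation T.
    """
    c_o = p_ottm.mean(axis=0)            # centroid in the optical frame
    c_i = p_icnm.mean(axis=0)            # centroid in the image frame
    H = (p_ottm - c_o).T @ (p_icnm - c_i)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = c_i - R @ c_o
    return R, T
```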
In addition, considering that in practical clinical applications the abdomen exhibits respiratory motion of varying amplitude, a dynamic periodic registration method is adopted to unify spatial positions within each respiratory phase, so as to reduce registration errors as much as possible.
Determining a conversion relationship between the coordinate system of the optical module and the coordinate system of the preoperative image according to the first coordinate value and the second coordinate value, including but not limited to the following steps:
determining a plurality of first coordinate values in a preset time period as target coordinate values;
determining a respiratory cycle according to the maximum values of the target coordinate values; and
sampling the first coordinate values and the second coordinate values in stages based on the respiratory cycle to determine the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image.
Specifically, since a normal human respiratory cycle is approximately 3 to 5 s, in order to ensure the accuracy of the respiratory cycle, a plurality of first coordinate values of the auxiliary reflective balls 11 within a 20 s period are recorded as target coordinate values, and one respiratory cycle is determined according to the maxima of the target coordinate values.
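A minimal sketch of how one respiratory cycle could be estimated from such a 20 s recording is given below; it assumes the coordinate trace is sampled at a known rate and that SciPy's peak detector may be used (both are assumptions, not details given in this application).

```python
import numpy as np
from scipy.signal import find_peaks

def respiratory_period(trace, sample_rate_hz):
    """Estimate the respiratory period from a recorded coordinate trace.

    trace : 1-D array of one coordinate of an auxiliary reflective ball,
            recorded over roughly 20 s.
    Returns the mean interval in seconds between successive maxima.
    """
    # Normal breathing is roughly 3-5 s per cycle, so require peaks to be
    # at least 2 s apart to reject small fluctuations.
    peaks, _ = find_peaks(trace, distance=int(2 * sample_rate_hz))
    intervals = np.diff(peaks) / sample_rate_hz
    return float(intervals.mean())
```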
The first coordinate values and the second coordinate values are sampled in stages based on the respiratory cycle to obtain the corresponding conversion relations TM^k_ottm-icnm (k = 1, 2, ..., N), and for different respiratory phases the corresponding TM^k_ottm-icnm is used to unify the spatial positions, i.e. [P_icnm, 1] = TM^k_ottm-icnm × [P_ottm, 1], where TM^k_ottm-icnm is a 4×4 spatial transformation matrix.
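The phase-wise application of these 4×4 matrices can be pictured with the following hypothetical sketch; the dictionary of per-phase matrices and the phase index k are illustrative names only.

```python
import numpy as np

def to_image_frame(p_ottm, TM_by_phase, phase_k):
    """Map a point from the optical-module frame to the preoperative-image
    frame using the 4x4 matrix registered for respiratory phase k.

    TM_by_phase : dict mapping phase index k to a 4x4 ndarray TM_ottm-icnm.
    """
    TM = TM_by_phase[phase_k]
    p_h = np.append(np.asarray(p_ottm, dtype=float), 1.0)  # homogeneous coords
    return (TM @ p_h)[:3]
```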
Combining the above conversion calculations enables the guiding workpiece 50 to be displayed simultaneously, in real time, in the coordinate system of the preoperative image.
For steps S300 and S400, regarding the real-time calculation of the end pose of the guiding workpiece 50: in order to reduce calculation errors caused by the structure, in the default initial state the line connecting two reflective balls 11 of the positioning workpiece 10 (reflective ball A and reflective ball B) is aligned with the end of the guiding workpiece 50, that is, AB ∥ OP; the real-time pose of the end of the guiding workpiece 50 is then calculated from the position information of the positioning ball 55.
Because the positioning ball 55 is fixed relative to the guiding workpiece 50, the relative positional relationship between the positioning ball 55 and the guiding workpiece 50 in the initial state can be obtained; that is, the positioning ball 55 is projected onto the projection plane, and the angular relationship in the initial state between the guiding workpiece 50 and the line connecting the projection point of the positioning ball 55 with the diagnosis point can be obtained, the diagnosis point being calibrated directly via the end of the robotic arm. Each time the positioning ball 55 moves, the real-time position of the guiding workpiece 50 can be calculated directly from the angle by which the projection-point connecting line rotates around the normal vector of the projection plane.
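One plausible way to obtain such a rotation angle is to project the positioning-ball vector onto the projection plane and measure the signed angle between its initial and current projections about the plane normal; the sketch below illustrates this under that assumption and is not presented as the exact procedure of this application.

```python
import numpy as np

def rotation_angle_about_normal(p_init, p_now, n):
    """Signed angle (radians) by which the projection of the positioning-ball
    vector has rotated about the plane normal n since the initial state."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    # Project both vectors onto the plane with normal n.
    a = np.asarray(p_init, dtype=float) - np.dot(p_init, n) * n
    b = np.asarray(p_now, dtype=float) - np.dot(p_now, n) * n
    # atan2 of the (cross . n) and dot products gives a signed angle about n.
    return float(np.arctan2(np.dot(np.cross(a, b), n), np.dot(a, b)))
```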
The coordinates of a first vector, the coordinates of a second vector, and the angle by which the first vector rotates around the second vector are acquired, wherein the first vector is the vector from the origin of the coordinate system of the optical module to the positioning ball 55 and is defined as P, and the second vector is the normal vector of the projection plane formed by the plurality of reflective balls 11 and is defined as N.
Referring to fig. 5 and 6, the first vector P is rotated by an angle θ around the second vector N to obtain a new vector P'. The coordinate information of the guiding workpiece 50 in the coordinate system of the optical module is then determined from the coordinates of the first vector, the coordinates of the second vector, and the angle of rotation of the first vector around the second vector, and is expressed by the following formula: P' = P·cosθ + (N×P)·sinθ + N(N·P)(1 - cosθ), where θ is the angle of rotation of the first vector around the second vector and P' is the coordinate information of the guiding workpiece 50 in the coordinate system of the optical module.
The coordinates of the first vector are denoted (px, py, pz), the coordinates of the second vector are denoted (ax, ay, az), and the coordinate information of the guiding workpiece 50 in the coordinate system of the optical module is denoted (px', py', pz'). Then N×P = (ay*pz - az*py, az*px - ax*pz, ax*py - ay*px) and N·P = ax*px + ay*py + az*pz, so that px' = px*cosθ + (ay*pz - az*py)*sinθ + ax*(ax*px + ay*py + az*pz)*(1 - cosθ), py' = py*cosθ + (az*px - ax*pz)*sinθ + ay*(ax*px + ay*py + az*pz)*(1 - cosθ), and pz' = pz*cosθ + (ax*py - ay*px)*sinθ + az*(ax*px + ay*py + az*pz)*(1 - cosθ).
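The componentwise expressions above are an expansion of Rodrigues' rotation formula; a compact vector-form sketch in NumPy (assuming N is, or is normalized to, a unit vector) is shown below for illustration only.

```python
import numpy as np

def rotate_about_normal(P, N, theta):
    """Rodrigues rotation of the first vector P about the unit normal N.

    Implements P' = P*cos(theta) + (N x P)*sin(theta) + N*(N.P)*(1 - cos(theta)),
    which expands to the componentwise expressions given above.
    """
    P = np.asarray(P, dtype=float)
    N = np.asarray(N, dtype=float)
    N = N / np.linalg.norm(N)          # treat N as a unit normal vector
    return (P * np.cos(theta)
            + np.cross(N, P) * np.sin(theta)
            + N * np.dot(N, P) * (1.0 - np.cos(theta)))
```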
For step S500, the coordinate information of the guiding workpiece 50 in the coordinate system of the preoperative image is obtained by converting the coordinate information of the guiding workpiece 50 in the coordinate system of the optical module according to the conversion relationship between the coordinate system of the optical module and the coordinate system of the preoperative image.
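Combining the previous sketches, a hypothetical end-to-end usage for steps S400 and S500 (reusing the illustrative functions rotate_about_normal and to_image_frame defined above, with first_vector, plane_normal, theta, TM_by_phase and phase_k as assumed inputs) could look like:

```python
# Step S400: pose of the guiding workpiece in the optical-module frame.
p_prime = rotate_about_normal(first_vector, plane_normal, theta)

# Step S500: convert to the preoperative-image frame using the 4x4 matrix
# registered for the current respiratory phase.
p_image = to_image_frame(p_prime, TM_by_phase, phase_k)
```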
For step S600, the guiding workpiece 50 is displayed in the preoperative image according to the coordinate information of the guiding workpiece 50 in the coordinate system of the preoperative image, and is shown on the display screen 40 to provide the doctor with an intuitive view.
For step S700, the guiding workpiece 50 displayed in the preoperative image on the display screen 40 provides a guide-line extension function; the current path is checked against the planned path to verify whether the current puncture path is correct, so that the physician can conveniently adjust the position of the guiding workpiece 50.
Through the above embodiments, by the cooperation of the reflective balls 11 of the positioning workpiece 10 and the positioning ball 55 of the guiding workpiece 50, the real-time position of the guiding workpiece 50 can be determined intelligently, accurately and quickly, and the guiding path of the guiding workpiece 50 is displayed on the digital human body in real time. The doctor can thus clearly grasp the real-time posture of the guiding workpiece 50 and clearly judge whether the puncture path is wrong, which solves the problem of blind puncture, improves the precision and quality of the puncture operation, reduces the difficulty and risk of the operation through the surgical positioning and guiding device, and also reduces the dependence of the operation on professional experience and ability.
The embodiment of the application provides electronic equipment. The electronic device comprises a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the surgical positioning guidance method as described above when executing the computer program.
The electronic equipment can be any intelligent terminal including a tablet personal computer, a vehicle-mounted computer and the like.
Generally, for the hardware structure of the electronic device, the processor may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application-Specific Integrated Circuit (ASIC), or one or more integrated circuits, etc., to execute related programs so as to implement the technical solutions provided by the embodiments of the present application.
The memory may be implemented in the form of Read-Only Memory (ROM), static storage, dynamic storage, or Random Access Memory (RAM). The memory may store an operating system and other application programs. When the technical solutions provided in the embodiments of the present application are implemented by software or firmware, the relevant program code is stored in the memory and invoked by the processor to execute the methods of the embodiments of the present application.
The input/output interface is used for realizing information input and output.
The communication interface is used for realizing communication interaction between the device and other devices, and communication may be realized in a wired mode (such as USB or a network cable) or in a wireless mode (such as a mobile network, Wi-Fi, or Bluetooth).
The bus transfers information between the various components of the device, such as the processor, memory, input/output interfaces, and communication interfaces. The processor, memory, input/output interface and communication interface are communicatively coupled to each other within the device via a bus.
Embodiments of the present application provide a computer-readable storage medium. The computer readable storage medium stores computer executable instructions for performing the surgical localization guidance method as described above.
It should be appreciated that the method steps in embodiments of the present invention may be implemented or carried out by computer hardware, a combination of hardware and software, or by computer instructions stored in non-transitory computer-readable memory. The method may use standard programming techniques. Each program may be implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Furthermore, the program can be run on a programmed application specific integrated circuit for this purpose.
Furthermore, the operations of the processes described herein may be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The processes (or variations and/or combinations thereof) described herein may be performed under control of one or more computer systems configured with executable instructions, and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications), by hardware, or combinations thereof, collectively executing on one or more processors. The computer program includes a plurality of instructions executable by one or more processors.
Further, the method may be implemented in any type of computing platform operatively connected to a suitable device, including, but not limited to, a personal computer, a smart phone, a mainframe, a workstation, a network or distributed computing environment, a separate or integrated computer platform, or a platform in communication with a charged particle tool or other imaging device, and so forth. Aspects of the invention may be implemented in machine-readable code stored on a non-transitory storage medium or device, whether removable or integrated into a computing platform, such as a hard disk, an optically readable and/or writable storage medium, RAM, ROM, etc., such that it is readable by a programmable computer and, when read by the computer, configures and operates the computer to perform the processes described herein. Further, the machine-readable code, or portions thereof, may be transmitted over a wired or wireless network. When such media include instructions or programs that, in conjunction with a microprocessor or other data processor, implement the steps described above, the invention described herein includes these and other different types of non-transitory computer-readable storage media. The invention also includes the computer itself when programmed according to the methods and techniques described herein.
The computer program can be applied to the input data to perform the functions described herein, thereby converting the input data to generate output data that is stored to the non-volatile memory. The output information may also be applied to one or more output devices such as a display. In a preferred embodiment of the invention, the transformed data represents physical and tangible objects, including specific visual depictions of physical and tangible objects produced on a display.
Although embodiments of the present application have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the spirit and scope of the application as defined by the appended claims and their equivalents.
While the preferred embodiment of the present application has been described in detail, the present application is not limited to the embodiments, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present application, and the equivalent modifications or substitutions are intended to be included in the scope of the present application as defined in the appended claims.