CN116158850B - Surgical positioning and guiding method, device, equipment and medium - Google Patents

Surgical positioning and guiding method, device, equipment and medium

Info

Publication number
CN116158850B
CN116158850B
Authority
CN
China
Prior art keywords
vector
coordinate system
workpiece
guiding
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310141894.5A
Other languages
Chinese (zh)
Other versions
CN116158850A (en)
Inventor
王澄
李迟迟
朱光宇
滕皋军
陆骊工
陈晓东
罗皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Hengle Medical Technology Co ltd
Original Assignee
Zhuhai Hengle Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Hengle Medical Technology Co ltd
Priority to CN202310141894.5A
Publication of CN116158850A
Application granted
Publication of CN116158850B
Legal status: Active (current)
Anticipated expiration

Links

Classifications

Landscapes

Abstract

Translated from Chinese


The embodiments of the present application provide a surgical positioning and guidance method, apparatus, equipment, and medium. The method determines the coordinate information of the guide workpiece in the coordinate system of the optical module from the coordinates of a first vector, the coordinates of a second vector, and the angle through which the first vector rotates around the second vector; converts that coordinate information into the coordinate system of the preoperative image according to the conversion relation between the two coordinate systems; and then displays the guide workpiece in the preoperative image. Through the cooperation of the reflective balls of the positioning workpiece and the positioning ball of the guide workpiece, the real-time position of the guide workpiece can be determined intelligently, accurately, and quickly, and the guiding path of the guide workpiece can be displayed in real time on the digital human body, thereby improving the accuracy and quality of the surgery and reducing its difficulty and risk.

Description

Surgical positioning and guiding method, device, equipment and medium
Technical Field
Embodiments of the present application relate to, but are not limited to, the medical arts, and more particularly, to surgical positioning and guidance methods, devices, apparatus, and media.
Background
Image-guided surgical navigation technology provides the doctor with strong visual navigation information based on various kinds of medical image information and augmented-reality technology. Existing surgical navigation systems are simple in structure and function: they can only perform navigation and positioning mechanically and cannot cope with complex surgical environments. For example, the positioning and guiding device cannot simulate and display a guiding path on a digital human body of the patient in real time, nor show the doctor in real time whether a puncture along that path reaches the target lesion; the interventional doctor cannot adjust the guiding angle based on clinical experience and the patient's breathing; and after the doctor adjusts the positioning angle, the system does not know the puncture result at the tip of the positioning and guiding device. The clinical problem of blind puncture by interventional doctors therefore remains unsolved.
Disclosure of Invention
The following is a summary of the subject matter described in detail herein. This summary is not intended to limit the scope of the claims.
The application aims to solve, at least to some extent, one of the technical problems in the related art. To this end, the embodiments of the application provide a surgical positioning and guiding method, device, equipment, and medium that can accurately reflect a guiding workpiece in an image in real time.
An embodiment of a first aspect of the present application is a surgical positioning and guiding method applied to a surgical positioning and guiding device, the surgical positioning and guiding device including a positioning workpiece and a guiding workpiece, the positioning workpiece including a plurality of reflective balls, the guiding workpiece including a positioning ball, the reflective balls and the positioning ball forming an optical module, the surgical positioning and guiding method including the steps of:
acquiring a conversion relation between a coordinate system of the optical module and a coordinate system of the preoperative image;
acquiring coordinates of a first vector, coordinates of a second vector, and an angle of rotation of the first vector around the second vector, wherein the first vector is the vector from the origin of the coordinate system of the optical module to the positioning ball, and the second vector is the normal vector of a projection plane formed by the plurality of reflective balls;
determining coordinate information of the guiding workpiece under the coordinate system of the optical module according to the coordinates of the first vector, the coordinates of the second vector, and the angle of rotation of the first vector around the second vector;
converting, according to the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image, the coordinate information of the guiding workpiece under the coordinate system of the optical module to obtain the coordinate information of the guiding workpiece under the coordinate system of the preoperative image;
and displaying the guiding workpiece in the preoperative image according to the coordinate information of the guiding workpiece under the coordinate system of the preoperative image.
In certain embodiments of the first aspect of the present application, the acquiring a conversion relationship between a coordinate system of the optical module and a coordinate system of the preoperative image includes:
acquiring first coordinate values of a plurality of auxiliary reflective balls of an auxiliary tool under the coordinate system of the optical module and second coordinate values of the auxiliary reflective balls under the coordinate system of the preoperative image;
and determining a conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image according to the first coordinate value and the second coordinate value.
In certain embodiments of the first aspect of the present application, the determining a conversion relationship between a coordinate system of the optical module and a coordinate system of the preoperative image according to the first coordinate value and the second coordinate value includes:
determining a plurality of first coordinate values within a preset time period as target coordinate values;
determining a respiratory cycle according to the maximum value of the target coordinate values;
and sampling the first coordinate values and the second coordinate values in stages based on the respiratory cycle so as to determine the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image.
In certain embodiments of the first aspect of the present application, the determining of the coordinate information of the guiding workpiece under the coordinate system of the optical module according to the coordinates of the first vector, the coordinates of the second vector, and the angle of rotation of the first vector around the second vector is expressed by the formula P′ = P·cos θ + (N × P)·sin θ + N(N · P)(1 − cos θ), where P is the first vector, N is the second vector, θ is the angle of rotation of the first vector around the second vector, and P′ is the coordinate information of the guiding workpiece under the coordinate system of the optical module.
In certain embodiments of the first aspect of the present application, the coordinates of the first vector are denoted (px, py, pz), the coordinates of the second vector are denoted (ax, ay, az), and the coordinate information of the guiding workpiece under the coordinate system of the optical module is denoted (px′, py′, pz′), where
px′ = px·cos θ + (ay·pz − az·py)·sin θ + ax·(ax·px + ay·py + az·pz)·(1 − cos θ),
py′ = py·cos θ + (az·px − ax·pz)·sin θ + ay·(ax·px + ay·py + az·pz)·(1 − cos θ),
pz′ = pz·cos θ + (ax·py − ay·px)·sin θ + az·(ax·px + ay·py + az·pz)·(1 − cos θ).
An embodiment of the second aspect of the application is a surgical positioning and guiding device, which applies the surgical positioning and guiding method as described above, wherein the surgical positioning and guiding device comprises a positioning workpiece and a guiding workpiece, the positioning workpiece comprises a plurality of reflecting balls positioned on a projection plane, the guiding workpiece comprises positioning balls, and the reflecting balls and the positioning balls form an optical module.
In certain embodiments of the second aspect of the present application, the surgical positioning guide device further includes a mounting member, the positioning workpiece is disposed on one side of the mounting member, a probe is disposed at one end of the mounting member, the guiding workpiece is movably connected to one end of the mounting member, and a display screen for displaying images is disposed on the mounting member.
In certain embodiments of the second aspect of the present application, the guiding workpiece further comprises a first clamping jaw, a second clamping jaw, a first adjusting clamping piece, a second adjusting clamping piece, a locking piece for locking the guiding workpiece, and a return spring for returning the guiding workpiece to a preset position, wherein the first clamping jaw is connected with the first adjusting clamping piece, the second clamping jaw is connected with the second adjusting clamping piece, the first adjusting clamping piece and the second adjusting clamping piece are plugged together, the return spring is located inside the first adjusting clamping piece and the second adjusting clamping piece, and the locking piece is connected with the return spring through a gear.
An embodiment of the third aspect of the application is an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the surgical localization guidance method as described above when executing the computer program.
An embodiment of the fourth aspect of the present application is a computer-readable storage medium storing computer-executable instructions for performing the surgical localization guidance method as described above.
The technical scheme has the following advantages: through the matching of the reflective balls of the positioning workpiece and the positioning ball of the guiding workpiece, the real-time position of the guiding workpiece can be determined intelligently, accurately, and quickly, and the guiding path of the guiding workpiece is displayed on a digital human body in real time. The doctor can thus clearly grasp the real-time posture of the guiding workpiece and judge whether the puncture path is wrong, which solves the blind-puncture problem, improves the precision and quality of the puncture operation, reduces the difficulty and risk of the operation, and also reduces the operation's dependence on professional experience and ability.
Drawings
The accompanying drawings are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate and do not limit the application.
FIG. 1 is a step diagram of a surgical positioning guidance method provided by an embodiment of the present application;
FIG. 2 is a block diagram of a surgical positioning guide provided by an embodiment of the present application;
FIG. 3 is a structural view of the guide workpiece;
FIG. 4 is a structural view of the guide workpiece from another direction;
FIG. 5 is a schematic view of the surgical positioning guide adjusting the position of the guide workpiece;
FIG. 6 is a schematic view of the guide workpiece in a position being adjusted.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
It should be noted that although functional block division is performed in a device diagram and a logic sequence is shown in a flowchart, in some cases, the steps shown or described may be performed in a different order than the block division in the device, or in the flowchart. The terms first, second and the like in the description, in the claims and in the above-described figures, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
Embodiments of the present application will be further described below with reference to the accompanying drawings.
Embodiments of the present application provide a surgical positioning guide.
Referring to fig. 2, the surgical positioning guide includes a mounting member 20, a positioning work 10, and a guide work 50, the positioning work 10 including a plurality of reflecting balls 11 on a projection plane, the guide work 50 including a positioning ball 55, the reflecting balls 11 and the positioning ball 55 constituting an optical module.
Specifically, the positioning workpiece 10 of the embodiment of the present application includes four reflective balls 11, and of course, in other embodiments, the number of reflective balls 11 may be other numbers, for example, 3, and the number of reflective balls 11 may range from 3 to 8.
The guide workpiece 50 of the embodiment of the present application includes one positioning ball 55, but of course, in other embodiments, the number of positioning balls 55 may be other numbers, such as 2, and the number of positioning balls 55 may be greater than or equal to one.
The guide workpiece 50 includes a positioning ball 55, defined as a positioning ball P, and the positioning ball 55P is disposed on the guide workpiece 50 for attitude calculation and real-time tracking of the guide workpiece 50. The positioning workpiece 10 includes four reflective balls 11, defined as reflective ball a, reflective ball B, reflective ball C, and reflective ball D, respectively. Wherein the line of the reflective ball a and the reflective ball B is parallel to the initial position of the guiding workpiece 50, i.e. the line AB is parallel to OP.
The reflective balls 11 can be detected and tracked in real time by an optical sensing device. The optical sensing device is an optical positioning and navigation system with extremely high precision and a non-interpolated measurement rate of 335 Hz. The device consists of two cameras; it captures real-time video images through the cameras, identifies and tracks the reflective balls 11, reflective planes, and infrared lamps in those images, observes reflective and/or active fiducial points (infrared lamps) simultaneously, and calculates the positions of the reflective balls by triangulation. When several fiducial points are fixed to a marker, the system can determine its 6-degree-of-freedom (x, y, z, α, β, γ) data. The positioning volume reaches 520 mm × 80 mm × 95 mm. The system is compatible with passive image-guided surgical tools.
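The two-camera triangulation mentioned above can be sketched as follows. This is a minimal illustration, not the tracker vendor's actual algorithm: the function name `triangulate`, the linear (DLT) formulation, and the use of normalized (intrinsics-free) camera matrices are all assumptions for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from two camera views.

    P1, P2 : 3x4 camera projection matrices.
    x1, x2 : 2-D observations of the same reflective ball in each view.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous point X: x * (P[2] @ X) = P[0] @ X, etc.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares homogeneous solution is the right singular
    # vector associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With calibrated cameras and matched detections of a ball in both images, the returned point is its position in the optical module's frame.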
The positioning workpiece 10 is provided on one side of the mounting member 20. One end of the mounting member 20 is provided with a probe 30; specifically, the probe 30 is a B-mode ultrasound probe.
One end of the mounting member 20 is also movably connected with a guide workpiece 50, the guide workpiece 50 is mounted on the mounting member 20 through a rotating shaft, and the guide workpiece 50 can rotate around the rotating shaft.
The other end of the mounting member 20 is provided with a mount 21, through which the surgical positioning guide may be mounted on a robotic arm.
The mount 20 is provided with a display screen 40 for displaying images, such as preoperative images and digitized and imaged guide workpieces 50.
Referring to figs. 3 and 4, in particular, the guide workpiece 50 further includes a first jaw 51, a second jaw 52, a first adjustment jaw 53, a second adjustment jaw 54, a locking member 56 for locking the guide workpiece 50, and a return spring for returning the guide workpiece 50 to a preset position. The first jaw 51 is connected to the first adjustment jaw 53, the second jaw 52 is connected to the second adjustment jaw 54, the first adjustment jaw 53 and the second adjustment jaw 54 are plugged together, the return spring is located inside the first and second adjustment jaws, and the locking member 56 is connected to the return spring through a gear.
The mounting member 20 is provided with an electric jaw; the surgical positioning guide is mounted on the electric jaw through the first clamping jaw 51 and the second clamping jaw 52, and the opening and closing of the guide workpiece 50 are driven by the electric jaw, so that after guidance is completed the guide workpiece 50 opens and separates from the puncture instrument.
The locking member 56 is a locking screw: tightening it locks the guide workpiece 50 so that its position is fixed relative to the mounting member 20, and loosening it allows the guide workpiece 50 to move. After being moved, the guide workpiece 50 can be automatically returned to the preset position by the return spring. This makes the use of the surgical positioning guide more flexible.
Referring to fig. 1, the surgical positioning guide device can implement the following surgical positioning guide method, including but not limited to the following steps:
step S100, acquiring a preoperative image, and planning an operative path according to the preoperative image to obtain a planned path;
Step S200, obtaining a conversion relation between a coordinate system of the optical module and a coordinate system of the preoperative image;
Step S300, acquiring coordinates of a first vector, coordinates of a second vector and an angle of rotation of the first vector around the second vector;
Step S400, determining coordinate information of the guiding workpiece 50 under the coordinate system of the optical module according to the coordinates of the first vector, the coordinates of the second vector and the rotation angle of the first vector around the second vector;
step S500, according to the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image, converting the coordinate information of the guiding workpiece 50 under the coordinate system of the optical module to obtain the coordinate information of the guiding workpiece 50 under the coordinate system of the preoperative image;
step S600, displaying the guiding workpiece 50 in the preoperative image according to the coordinate information of the guiding workpiece 50 in the coordinate system of the preoperative image;
step S700, calibrating the current path according to the planned path and adjusting the position of the guiding workpiece 50.
For step S100, a preoperative image of the patient is acquired by a CT apparatus. The operation path is planned according to the preoperative image by an experienced physician to obtain the planned path. Alternatively, the preoperative image is input into a neural-network-based path-planning model, which intelligently and automatically plans the operation path according to the preoperative image to obtain the planned path.
Of course, in other embodiments, the preoperative image may be obtained by magnetic resonance imaging (MRI), positron emission tomography (PET), digital subtraction angiography (DSA), or 2D/3D endoscopic imaging.
For step S200, in order to achieve real-time positioning synchronization of the guiding workpiece 50, spatial positional relationship between the optical module and the preoperative image needs to be unified.
Obtaining a conversion relationship between a coordinate system of the optical module and a coordinate system of the preoperative image, including but not limited to the following steps:
Acquiring a first coordinate value of a plurality of auxiliary reflecting balls 11 of an auxiliary tool under the coordinate system of the optical module and a second coordinate value of the auxiliary reflecting balls under the coordinate system of the preoperative image;
And determining a conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image according to the first coordinate value and the second coordinate value.
In particular, the auxiliary tool is a flexible construction tool, which in this embodiment comprises 5 auxiliary light-reflecting balls 11, which are fixed to the surface of the manikin in a fixed axial direction.
Defining the coordinate system of the optical module as C_ottm and the coordinate system of the preoperative image as C_icnm, the position of each auxiliary reflective ball 11 under the coordinate system of the optical module, i.e. the first coordinate value, is acquired in turn and denoted Pj (j = 1, 2, …, 5), and the position of each auxiliary reflective ball 11 under the coordinate system of the preoperative image, i.e. the second coordinate value, is acquired in turn and denoted Pi (i = 1, 2, …, 5).
The first and second coordinate values satisfy the relation P_icnm = R_ottm-icnm · P_ottm + T_ottm-icnm, where R_ottm-icnm is a 3×3 rotation matrix and T_ottm-icnm is a 3×1 translation vector; solving this relation yields the conversion relation between the coordinate system of the optical module and the coordinate system of the preoperative image.
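The rotation R and translation T in the relation above can be recovered from the paired auxiliary-ball coordinates by a least-squares rigid fit. The sketch below uses the Kabsch algorithm, a standard solver for this kind of point-set registration; the patent does not name a solver, so this choice, like the function name `rigid_transform`, is an illustrative assumption.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform so that dst ≈ R @ src + t (Kabsch).

    src, dst : (N, 3) corresponding points, e.g. the auxiliary-ball
    coordinates in the optical-module and preoperative-image frames.
    Returns R (3x3 rotation matrix) and t (translation 3-vector).
    """
    c_src = src.mean(axis=0)
    c_dst = dst.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper rotation (reflection)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

With five or more non-collinear auxiliary balls, the fit is over-determined and averages out per-ball measurement noise.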
In addition, considering that in practical clinical application, respiratory motion with different amplitudes exists in the abdomen, in order to reduce registration errors as much as possible, a dynamic periodic registration method is required to be adopted for spatial position unification of respiratory segments.
Determining a conversion relationship between the coordinate system of the optical module and the coordinate system of the preoperative image according to the first coordinate value and the second coordinate value, including but not limited to the following steps:
determining a plurality of first coordinate values in a preset time period as target coordinate values;
determining a respiration period according to the maximum value of the target coordinate value;
the first coordinate value and the second coordinate value are sampled in stages based on the breathing cycle to determine a conversion relationship between the coordinate system of the optical module and the coordinate system of the preoperative image.
Specifically, since the normal human breathing cycle is approximately 3-5s, in order to ensure the accuracy of the breathing cycle, a plurality of first coordinate values of the auxiliary light reflecting ball 11 within a 20s period are recorded as target coordinate values, and one breathing cycle is determined according to the maximum value of the target coordinate values.
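The cycle estimate described above can be sketched as follows; the simple neighbor-comparison peak picking and the helper name `respiratory_period` are assumptions for illustration, not the patent's specified procedure.

```python
import numpy as np

def respiratory_period(z, dt):
    """Estimate the breathing period from one tracked coordinate trace.

    z  : 1-D array of one coordinate of an auxiliary ball sampled over
         the recording window (e.g. 20 s).
    dt : sampling interval in seconds.
    Returns the mean peak-to-peak interval in seconds.
    """
    # A sample is a local maximum if it exceeds both neighbours.
    peaks = [i for i in range(1, len(z) - 1)
             if z[i] > z[i - 1] and z[i] >= z[i + 1]]
    if len(peaks) < 2:
        raise ValueError("need at least two peaks to estimate a period")
    return float(np.mean(np.diff(peaks)) * dt)
```

A 20 s window at a few-second period yields several peaks, which is why averaging the peak-to-peak intervals is robust against a single mis-detected maximum.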
The first coordinate values and the second coordinate values are sampled in stages based on the respiratory cycle to obtain the corresponding conversion relations TM_ottm-icnm,k (k = 1, 2, …, N), and the TM_ottm-icnm,k corresponding to the current respiratory phase is used for spatial position unification, namely [P_icnm, 1] = TM_ottm-icnm,k · [P_ottm, 1], where TM_ottm-icnm,k is a 4×4 spatial transformation matrix.
Combining the above transformation calculations enables real-time synchronous display of the guide workpiece 50 in the coordinate system of the preoperative image.
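Applying a phase-specific 4×4 transform of the kind described above amounts to one homogeneous matrix-vector product per tracked point. The helper names below are illustrative assumptions.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation R and translation t into a 4x4 transform TM."""
    TM = np.eye(4)
    TM[:3, :3] = R
    TM[:3, 3] = t
    return TM

def apply_phase_transform(p_ottm, phase_transforms, phase_k):
    """Compute [P_icnm, 1] = TM_k · [P_ottm, 1] for respiratory phase k.

    phase_transforms : mapping from phase index k to its 4x4 matrix TM_k.
    """
    ph = np.append(p_ottm, 1.0)          # homogeneous coordinates
    return (phase_transforms[phase_k] @ ph)[:3]
```

Keeping one matrix per respiratory phase and selecting by the current phase index is what lets the registration follow the breathing motion without refitting on every frame.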
For steps S300 and S400, regarding the real-time calculation of the tip pose of the guide workpiece 50: to reduce calculation errors due to the structure, the two reflective balls 11 (reflective ball A and reflective ball B) of the positioning workpiece 10 in the default initial state are aligned with the tip of the guide workpiece 50, that is, AB ∥ OP; the real-time pose of the tip of the guide workpiece 50 is then calculated from the position information of the positioning ball 55.
Because the positioning ball 55 is fixed relative to the guide workpiece 50, the relative positional relationship between the two in the initial state can be obtained: the positioning ball 55 is projected onto the projection plane, and the angular relationship in the initial state between the guide workpiece 50 and the line connecting the projection point of the positioning ball 55 with the diagnosis point can then be determined; the diagnosis point itself can be calibrated directly through the end of the mechanical arm. Each time the positioning ball 55 moves, the real-time position of the guide workpiece 50 can be calculated directly from the angle through which the projection-point connecting line rotates around the normal vector of the projection plane.
The method comprises the steps of obtaining the coordinates of a first vector, the coordinates of a second vector, and the angle of rotation of the first vector around the second vector, wherein the first vector, defined as P, is the vector from the origin of the coordinate system of the optical module to the positioning ball 55, and the second vector, defined as N, is the normal vector of the projection plane formed by the plurality of reflective balls 11.
Referring to figs. 5 and 6, the first vector P is rotated by an angle θ around the second vector N to obtain a new vector P′. The coordinate information of the guide workpiece 50 in the coordinate system of the optical module is then determined according to the coordinates of the first vector, the coordinates of the second vector, and the angle of rotation, and is expressed by the formula P′ = P·cos θ + (N × P)·sin θ + N(N · P)(1 − cos θ), where θ is the angle of rotation of the first vector around the second vector and P′ is the coordinate information of the guide workpiece 50 in the coordinate system of the optical module.
The coordinates of the first vector are denoted (px, py, pz), the coordinates of the second vector are denoted (ax, ay, az), and the coordinate information of the guide workpiece 50 in the coordinate system of the optical module is denoted (px′, py′, pz′). With N × P = (ay·pz − az·py, az·px − ax·pz, ax·py − ay·px) and N · P = ax·px + ay·py + az·pz, there are
px′ = px·cos θ + (ay·pz − az·py)·sin θ + ax·(ax·px + ay·py + az·pz)·(1 − cos θ),
py′ = py·cos θ + (az·px − ax·pz)·sin θ + ay·(ax·px + ay·py + az·pz)·(1 − cos θ),
pz′ = pz·cos θ + (ax·py − ay·px)·sin θ + az·(ax·px + ay·py + az·pz)·(1 − cos θ).
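The vector-form and component-form expressions above are the Rodrigues rotation formula, which can be checked numerically; the function name below is an assumption for illustration.

```python
import numpy as np

def rotate_about_axis(p, n, theta):
    """Rodrigues rotation: rotate vector p about unit axis n by angle theta.

    Implements P' = P cosθ + (N × P) sinθ + N (N · P)(1 − cosθ).
    """
    p = np.asarray(p, dtype=float)
    n = np.asarray(n, dtype=float)
    return (p * np.cos(theta)
            + np.cross(n, p) * np.sin(theta)
            + n * np.dot(n, p) * (1.0 - np.cos(theta)))
```

A quick sanity check: rotating (1, 0, 0) by 90° about the z axis gives (0, 1, 0), and expanding `np.cross(n, p)` and `np.dot(n, p)` component by component reproduces the px′, py′, pz′ expressions above.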
For step S500, the coordinate information of the guiding workpiece 50 in the coordinate system of the preoperative image is obtained by converting the coordinate information of the guiding workpiece 50 in the coordinate system of the optical module according to the conversion relationship between the coordinate system of the optical module and the coordinate system of the preoperative image.
For step S600, the guiding workpiece 50 is displayed in the preoperative image according to the coordinate information of the guiding workpiece 50 in the coordinate system of the preoperative image, and displayed through the display screen 40, to provide an intuitive viewing angle for the doctor.
For step S700, the guide workpiece 50 in the preoperative image on the display screen 40 is used, together with a guide-line extension function, to check the current path against the planned path and verify whether the current puncture path is correct, so that the physician can conveniently adjust the position of the guide workpiece 50.
Through the above embodiment, by the cooperation of the reflective balls 11 of the positioning workpiece 10 and the positioning ball 55 of the guide workpiece 50, the real-time position of the guide workpiece 50 can be determined intelligently, accurately, and quickly, and the guiding path of the guide workpiece 50 is displayed on the digital human body in real time. The doctor can thus clearly grasp the real-time posture of the guide workpiece 50 and judge whether the puncture path is wrong, which solves the blind-puncture problem, improves the precision and quality of the puncture operation, reduces the difficulty and risk of the operation, and also reduces the operation's dependence on professional experience and ability.
The embodiment of the application provides electronic equipment. The electronic device comprises a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the surgical positioning guidance method as described above when executing the computer program.
The electronic equipment can be any intelligent terminal including a tablet personal computer, a vehicle-mounted computer and the like.
Generally, for the hardware structure of the electronic device, the processor may be implemented by a general-purpose CPU (central processing unit), a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits, etc., which execute related programs so as to implement the technical solution provided by the embodiments of the present application.
The memory may be implemented in the form of read-only memory (Read Only Memory, ROM), static storage, dynamic storage, or random access memory (Random Access Memory, RAM). The memory may store an operating system and other application programs; when the technical solutions provided in the embodiments of the present disclosure are implemented by software or firmware, the relevant program code is stored in the memory and invoked by the processor to execute the method of the embodiments of the present disclosure.
The input/output interface is used for realizing information input and output.
The communication interface is used for realizing communication interaction between the device and other devices, and can realize communication in a wired mode (such as USB, network cable and the like) or in a wireless mode (such as mobile network, WIFI, bluetooth and the like).
The bus transfers information between the various components of the device, such as the processor, memory, input/output interfaces, and communication interfaces. The processor, memory, input/output interface and communication interface are communicatively coupled to each other within the device via a bus.
Embodiments of the present application provide a computer-readable storage medium. The computer-readable storage medium stores computer-executable instructions for performing the surgical positioning and guiding method described above.
It should be appreciated that the method steps in embodiments of the present invention may be implemented or carried out by computer hardware, by a combination of hardware and software, or by computer instructions stored in non-transitory computer-readable memory. The method may use standard programming techniques. Each program may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Furthermore, the program can be run on an application-specific integrated circuit programmed for this purpose.
Furthermore, the operations of the processes described herein may be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The processes (or variations and/or combinations thereof) described herein may be performed under control of one or more computer systems configured with executable instructions, and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications), by hardware, or combinations thereof, collectively executing on one or more processors. The computer program includes a plurality of instructions executable by one or more processors.
Further, the method may be implemented in any type of computing platform operatively connected to a suitable computer, including, but not limited to, a personal computer, a smart phone, a mainframe, a workstation, a network or distributed computing environment, or a separate or integrated computer platform, or in communication with a charged-particle tool or other imaging device. Aspects of the invention may be implemented in machine-readable code stored on a non-transitory storage medium or device, whether removable or integrated into a computing platform, such as a hard disk, an optically read and/or written storage medium, RAM, ROM, and the like, such that it is readable by a programmable computer and, when read by the computer, configures and operates the computer to perform the processes described herein. Further, the machine-readable code, or portions thereof, may be transmitted over a wired or wireless network. When such media include instructions or programs that, in conjunction with a microprocessor or other data processor, implement the steps described above, the invention described herein includes these and other different types of non-transitory computer-readable storage media. The invention also includes the computer itself when programmed according to the methods and techniques described herein.
The computer program can be applied to the input data to perform the functions described herein, thereby converting the input data to generate output data that is stored to the non-volatile memory. The output information may also be applied to one or more output devices such as a display. In a preferred embodiment of the invention, the transformed data represents physical and tangible objects, including specific visual depictions of physical and tangible objects produced on a display.
Although embodiments of the present application have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the spirit and scope of the application as defined by the appended claims and their equivalents.
While the preferred embodiment of the present application has been described in detail, the present application is not limited to the embodiments, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present application, and the equivalent modifications or substitutions are intended to be included in the scope of the present application as defined in the appended claims.

Claims (10)

CN202310141894.5A | 2023-02-20 | 2023-02-20 | Surgical positioning and guiding method, device, equipment and medium | Active | CN116158850B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202310141894.5A | 2023-02-20 | 2023-02-20 | Surgical positioning and guiding method, device, equipment and medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202310141894.5A | 2023-02-20 | 2023-02-20 | Surgical positioning and guiding method, device, equipment and medium

Publications (2)

Publication Number | Publication Date
CN116158850A (en) | 2023-05-26
CN116158850B (en) | 2025-09-16

Family

ID=86411020

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202310141894.5A (Active, CN116158850B (en)) | Surgical positioning and guiding method, device, equipment and medium | 2023-02-20 | 2023-02-20

Country Status (1)

Country | Link
CN (1) | CN116158850B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108348294A (en)* | 2015-11-24 | 2018-07-31 | 思想外科有限公司 | Active Robotic Nail Deployment for Total Knee Arthroplasty
CN111870344A (en)* | 2020-05-29 | 2020-11-03 | 中山大学肿瘤防治中心(中山大学附属肿瘤医院、中山大学肿瘤研究所) | Preoperative navigation method, system and terminal equipment

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
TWI435705B (en)* | 2008-11-20 | 2014-05-01 | Been Der Yang | Surgical position device and image guided navigation system using the same
CN112006779B (en)* | 2020-09-27 | 2024-05-03 | 安徽埃克索医疗机器人有限公司 | Precision detection method of surgical navigation system
CN112641512B (en)* | 2020-12-08 | 2023-11-10 | 北京信息科技大学 | Spatial registration method applied to preoperative robot planning
EP4346609A4 (en)* | 2021-05-26 | 2024-08-28 | Beyeonics Surgical Ltd. | System and method for verification of conversion of locations between coordinate systems
CN113331948B (en)* | 2021-05-28 | 2022-12-09 | 浙江德尚韵兴医疗科技有限公司 | Interventional operation robot system, calibration device and calibration method
CN113400325B (en)* | 2021-06-23 | 2022-03-25 | 四川锋准机器人科技有限公司 | A method of navigation and positioning of dental implant robot
CN114366144B (en)* | 2022-01-13 | 2025-04-18 | 杭州柳叶刀机器人有限公司 | Oral image positioning navigation method and system
CN114795496B (en)* | 2022-05-16 | 2024-12-06 | 北京埃克索医疗科技发展有限公司 | A passive surgical robot navigation and positioning system
CN115105175B (en)* | 2022-06-30 | 2024-12-24 | 上海诺生医疗科技有限公司 | Puncture navigation system, method, device, storage medium and puncture device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN108348294A (en)* | 2015-11-24 | 2018-07-31 | 思想外科有限公司 | Active Robotic Nail Deployment for Total Knee Arthroplasty
CN111870344A (en)* | 2020-05-29 | 2020-11-03 | 中山大学肿瘤防治中心(中山大学附属肿瘤医院、中山大学肿瘤研究所) | Preoperative navigation method, system and terminal equipment

Also Published As

Publication number | Publication date
CN116158850A (en) | 2023-05-26

Similar Documents

Publication | Title
CN110876643B (en) | Medical operation navigation system and method
CN106308946B (en) | A kind of augmented reality devices and methods therefor applied to stereotactic surgery robot
US20230000565A1 | Systems and methods for autonomous suturing
CN112043382B (en) | Surgical navigation system
CN107753105B (en) | Surgical robot system for positioning operation and control method thereof
CN107753106B (en) | Surgical robot for positioning surgery and its control method
US20220054199A1 | Robotic surgery systems and surgical guidance methods thereof
EP2953569B1 (en) | Tracking apparatus for tracking an object with respect to a body
CN113940755A | A surgical-image-integrated surgical planning and navigation method
JP6475324B2 (en) | Optical tracking system and coordinate system matching method of optical tracking system
JP2002186603A (en) | Coordinate transformation method for object guidance
EP4018957A1 (en) | Systems and methods for surgical port positioning
US20250082184A1 | Techniques for controlling an imaging device
CN116549109A | Medical navigation method
US20220022964A1 | System for displaying an augmented reality and method for generating an augmented reality
CN116158850B (en) | Surgical positioning and guiding method, device, equipment and medium
CN116831729A | Instrument prompting method and system under surgical robot endoscope vision
CN117770958A | Tracer orientation positioning method and device, electronic equipment and medium
WO2023114136A1 | Dynamic 3D scanning robotic laparoscope
Busam et al. | Markerless inside-out tracking for interventional applications
US20250288361A1 | Generating imaging pose recommendations
US20250268685A1 | Method for carrying out patient registration on a medical visualization system, and medical visualization system
US20230210627A1 | Three-dimensional instrument pose estimation
Park et al. | A method for fluoroscopy based navigation system to assist needle insertion concerning reduced radiation exposure for endoscopic disc surgery
WO2025173000A1 | Multi-arm robotic systems and methods for calibrating and verifying calibration of the same

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
CB02 | Change of applicant information
  Country or region after: China
  Address after: Room 101, building 1, No. 36, Doukou Road, Guangdong Macao cooperative traditional Chinese medicine science and Technology Industrial Park, Hengqin New District, Zhuhai City, Guangdong Province 519000
  Applicant after: Zhuhai Hengle Medical Technology Co.,Ltd.
  Address before: Room 101, building 1, No. 36, Doukou Road, Guangdong Macao cooperative traditional Chinese medicine science and Technology Industrial Park, Hengqin New District, Zhuhai City, Guangdong Province 519000
  Applicant before: Zhuhai Hengle Medical Technology Co.,Ltd.
  Country or region before: China
GR01 | Patent grant
