US9468412B2 - System and method for accuracy verification for image based surgical navigation - Google Patents

System and method for accuracy verification for image based surgical navigation

Info

Publication number
US9468412B2
Authority
US
United States
Prior art keywords
surgical instrument
instrument tip
location
navigation
surgical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US11/767,281
Other versions
US20080319311A1 (en)
Inventor
Mohamed Ali Hamadeh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Stryker European Operations Holdings LLC
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by General Electric Co
Priority to US11/767,281
Assigned to GENERAL ELECTRIC COMPANY (assignment of assignors interest; assignor: HAMADEH, MOHAMED ALI)
Publication of US20080319311A1
Application granted
Publication of US9468412B2
Assigned to STRYKER EUROPEAN HOLDINGS I, LLC (assignment of assignors interest; assignor: GENERAL ELECTRIC COMPANY)
Assigned to STRYKER EUROPEAN OPERATIONS HOLDINGS LLC (change of name; assignor: STRYKER EUROPEAN HOLDINGS III, LLC)
Assigned to STRYKER EUROPEAN HOLDINGS III, LLC (nunc pro tunc assignment; assignor: STRYKER EUROPEAN HOLDINGS I, LLC)
Assigned to STRYKER EUROPEAN OPERATIONS HOLDINGS LLC (change of address)
Status: Active
Adjusted expiration


Abstract

Certain embodiments of the present invention provide for a system and method for assessing the accuracy of a surgical navigation system. The method may include acquiring an X-ray image that captures a surgical instrument. The method may also include segmenting the surgical instrument in the X-ray image. In an embodiment, the segmenting may be performed using edge detection or pattern recognition. The distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip may be computed. The distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip may be compared with a threshold value. If the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip is greater than the threshold value, the user may be alerted.

Description

BACKGROUND OF THE INVENTION
The present invention generally relates to a system and method for improving the navigation accuracy of an electromagnetic navigation system for use with medical applications. Particularly, the present invention relates to a system and method for verifying the accuracy of a surgical navigation system.
Electromagnetic type navigation systems are useful in numerous applications. One application of particular use is in medical applications, and more specifically, image guided surgery. Typical image guided surgical systems acquire a set of images of an operative region of a patient's body and track a surgical tool or instrument in relation to one or more sets of coordinates. At the present time, such systems have been developed or proposed for a number of surgical procedures such as brain surgery and arthroscopic procedures on the knee, wrist, shoulder or spine, as well as certain types of angiography, cardiac or other interventional radiological procedures and biopsies. Such procedures may also involve preoperative or intra-operative x-ray images being taken to correct the position or otherwise navigate a tool or instrument involved in the procedure in relation to anatomical features of interest. For example, such tracking may be useful for the placement of an elongated probe, radiation needle, fastener or other article in tissue or bone that is internal or is otherwise positioned so that it is difficult to view directly.
An electromagnetic tracking system may be used in conjunction with an x-ray system. For example, an electromagnetic tracking system may be used in conjunction with a C-arm fluoroscope. The C-arm fluoroscope may utilize an x-ray source at one end of the C-arm and an x-ray detector, or camera, at the other end of the C-arm. The patient may be placed between the x-ray source and the x-ray detector. X-rays may pass from the x-ray source, through the patient, to the x-ray detector where an image is captured. The electromagnetic tracking system may generate an electromagnetic field between the ends of the C-arm so tracking may continue during a surgical procedure.
Currently, an operating surgeon can assess the accuracy of the surgical navigation system in a subjective manner. A surgeon may visually compare the “predicted location” of the navigation instrument with the “actual location” of the same instrument on the intra-operative navigated images acquired during surgery. In current fluoroscopic navigation systems, this process may be referred to as “confirmation shots.”
For example, a surgeon may track the "predicted location" of the surgical instrument being used on the navigated fluoroscope image. The surgeon may set up the surgical instrument trajectory defining the surgical planning for the procedure being performed. By leaving the surgical instrument in the planned trajectory in the field of view of the C-arm, the navigation system may display the predicted location of the surgical instrument on the navigated fluoroscope image. The surgeon may then acquire an X-ray image, also called a confirmation shot, and capture it to the navigation screen. The predicted location of the surgical instrument is then superimposed on the navigated image, so that both the predicted location and the actual location, materialized by the instrument's radiographic shadow, are visible on the navigated image.
By comparing the predicted location of the surgical instrument superimposed on the navigated image with the actual location, which may be shown as a shadow of the surgical instrument, the surgeon may assess the overall system accuracy. If the predicted location matches the actual location on the navigated image, the surgeon may determine the navigation system is accurate. If there is a visible difference between the predicted and actual instrument locations, the surgeon can conclude that the system error is high enough to make the surgical navigation inaccurate and, as a consequence, unsuitable for use.
One disadvantage of the technique described above is that it provides a subjective method by which the surgeon judges whether the navigation system is accurate enough to use. The determination of whether a navigation system is sufficiently accurate may therefore vary from surgeon to surgeon. Also, a user who relies on the navigation system while knowing it has some degree of inaccuracy may not have confidence in the navigation system. This may compromise the effectiveness of the navigation system during the surgery.
Accordingly, a system and method are needed to better assess the accuracy of an instrument navigation system. Such a system and method may automate the verification procedure for assessing the accuracy of the instrument navigation system.
SUMMARY OF THE INVENTION
Certain embodiments of the present invention may include a method for assessing the accuracy of a surgical navigation system. The method may include acquiring an X-ray image that captures a surgical instrument. The method may also include segmenting the surgical instrument in the X-ray image. In an embodiment, the segmenting of the surgical instrument in the X-ray image may be performed using edge detection or pattern recognition. The method may also include computing the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip. The distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip may be compared with a threshold value. If the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip is greater than the threshold value, a user may be alerted. The computation of the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip may be the navigation error in a 3D navigation volume. The navigation error in a 3D navigation volume may be computed by computing a back-projection line that stems from the pixel coordinates of the actual location of the surgical instrument tip and computing the distance between the back-projection line and the 3D coordinates of the predicted location of the surgical instrument tip. The computation of the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip may be the navigation error in a 2D image. The navigation error in a 2D image may be computed by projecting 3D coordinates of the predicted location of the surgical instrument tip onto an image and computing the distance between the predicted projection and the actual projection of the surgical instrument tip on the image.
Certain embodiments of the present invention may include a system for assessing the accuracy of a surgical navigation system. The system may include an x-ray unit for acquiring an X-ray image that captures a surgical instrument. The system may also include a computer unit comprising computer software that segments the surgical instrument in the X-ray image and that computes the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip. The computer unit may further comprise computer software for comparing the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip with a threshold value. The computer unit may further comprise computer software for alerting a user if the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip is greater than the threshold value.
Certain embodiments of the present invention may include a computer readable medium having a set of instructions for execution by a computer. The set of instructions may include an acquisition routine for acquiring an X-ray image that captures a surgical instrument. The set of instructions may also include a segmentation routine for segmenting the surgical instrument in the X-ray image. In an embodiment, the segmentation routine may include edge detection. The set of instructions may also include a computation routine for computing the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip. The set of instructions may also include a comparison routine for comparing the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip with a threshold value. The set of instructions may also include an alerting routine for alerting a user if the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip is greater than the threshold value. The computation routine for computing the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip may be the navigation error in a 3D navigation volume. The set of instructions may include a set of instructions for computing the navigation error in a 3D navigation volume. The set of instructions for computing the navigation error in a 3D navigation volume may include a first computation routine for computing a back-projection line that stems from the pixel coordinates of the actual location of the surgical instrument tip and a second computation routine for computing the distance between the back-projection line and the 3D coordinates of the predicted location of the surgical instrument tip. 
The computation routine for computing the distance between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip may be the navigation error in a 2D image. The set of instructions for computing the navigation error in a 2D image may include a first computation routine for projecting 3D coordinates of the predicted location of the surgical instrument tip onto an image and a second computation routine for computing the distance between the predicted projection and the actual projection of the surgical instrument tip on the image.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a system that may be used for image guided surgery in accordance with an embodiment of the present invention.
FIG. 2 illustrates a graphic illustrating the predicted instrument location and the actual instrument location.
FIG. 3 illustrates an image illustrating the contrast between patient anatomy and a surgical instrument.
FIG. 4 illustrates an image after edge detection segmentation has been performed.
FIG. 5 illustrates a graphic showing the computation of the 3D error in navigation volume.
FIG. 6 illustrates a graphic showing the computation of the 2D error in an image.
FIG. 7 illustrates a method for assessing the accuracy of a surgical navigation system in accordance with an embodiment of the present invention.
FIG. 8 illustrates a method for computing the navigation error in a 3D navigation volume.
FIG. 9 illustrates a method for computing the navigation error in a 2D image.
The foregoing summary, as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, certain embodiments are shown in the drawings. It should be understood, however, that the present invention is not limited to the arrangements and instrumentality shown in the attached drawings.
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 illustrates a system 100 that may be used for image guided surgery in accordance with an embodiment of the present invention. The system 100 illustrates, as an example of a medical imaging unit, a C-arm unit 110. The medical imaging unit, however, may be other medical imaging equipment, such as an ultrasound unit, for example. Accordingly, any medical imaging equipment may be used.
The C-arm unit 110 is connected to a computer unit 120. The connection between the C-arm unit 110 and the computer unit 120 may be wired or wireless. The computer unit 120 may be any equipment or software that permits electronic medical images, such as x-rays, ultrasound, CT, MRI, EBT, MR, or nuclear medicine for example, to be electronically acquired, stored, or transmitted for viewing and operation. The computer unit 120 may receive input from a user. The computer unit 120 represents, in general, equipment and software. The actual physical computer units may be separate units, part of a single unit, a computer system, or part of a computer system.
The computer unit 120 may be connected to other devices via an electronic network. The connection of the computer unit 120 to an electronic network is illustrated by line 140. The connection between the network 140 and the computer unit 120 may be wired or wireless. The computer unit 120 may also be connected to a display unit 130. The connection between the computer unit 120 and the display unit 130 may be wired or wireless. The display unit 130 may be a single display unit or multiple display units. Additionally, the display unit 130 may be a two-dimensional display unit or a three-dimensional display unit, for example. Accordingly, any display unit may be used in accordance with the present invention.
Element 105 represents a patient and element 107 represents a table on which the patient is lying. Elements 150, 160, and 170 are electronic sensors that may identify their location with reference to a reference frame and with reference to each other. Although three sensors 150-170 are shown, any number of sensors may be used. The sensors 150-170 are generally in electronic communication with the computer unit 120. Element 112 represents an x-ray source and element 115 represents an x-ray detector. The x-ray detector 115 may be, for example, an image intensifier or flat panel detector. The electronic communication may be over a wire or may be transmitted in a wireless fashion. The components of the system 100 may be single units, separate units, may be integrated in various forms, and may be implemented in hardware and/or in software.
FIG. 2 illustrates a graphic 200 illustrating the predicted instrument location and the actual instrument location. The graphic 200 illustrates a "confirmation shot" that shows the predicted instrument location 210 superimposed onto the actual instrument location 220 as part of an X-ray of a patient's anatomy. In the embodiment as shown on the graphic 200, the actual instrument location 220 may be illustrated as a shadow on the graphic 200. The graphic 200 is only an example and the actual instrument location 220 and the predicted instrument location 210 may be represented in a manner different than that shown in the graphic 200.
FIG. 3 illustrates an image 300 illustrating the contrast between patient anatomy and a surgical instrument. The image 300 illustrates an X-ray image showing the surgical instrument 320. The surgical instrument 320 represents the actual location of the surgical instrument. In an embodiment, the image 300 may represent a "confirmation shot" prior to superposition of the predicted instrument location.
In an embodiment, the image 300 may be segmented by computer software to identify the surgical instrument 320. In an embodiment, computer software may segment and extract the tip of the surgical instrument 320 from the image 300. The segmentation of the surgical instrument 320 may be performed by edge detection or pattern recognition algorithms to achieve an accurate localization of the tip of the surgical instrument 320 within the image 300. The computer software may utilize the contrast between the anatomical structure of the image 300 and the surgical instrument 320.
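The description above does not fix a single segmentation algorithm, so as a toy illustration only, the contrast between the dark radiopaque instrument and the brighter anatomy could be exploited with a simple intensity threshold. The function name, threshold value, and 5x5 image below are all hypothetical; a real system would use the edge-detection or pattern-recognition approaches just described.

```python
# Toy sketch of tip extraction by intensity thresholding.
# NOT the patented algorithm; all names and values are illustrative.

def segment_tip(image, threshold):
    """Return (row, col) of the assumed tip: the deepest pixel darker
    than `threshold` (the radiopaque instrument appears dark on X-ray)."""
    instrument = [(r, c)
                  for r, row in enumerate(image)
                  for c, value in enumerate(row)
                  if value < threshold]
    if not instrument:
        return None  # no instrument shadow found
    return max(instrument, key=lambda rc: rc[0])  # largest row index = tip

# 5x5 toy image: bright anatomy (~200) with a dark shaft (~20) down column 2.
image = [
    [200, 200,  20, 200, 200],
    [200, 200,  20, 200, 200],
    [200, 200,  20, 200, 200],
    [200, 200,  20, 200, 200],
    [200, 200, 200, 200, 200],
]
print(segment_tip(image, threshold=100))  # (3, 2): tip at row 3, column 2
```

The "deepest dark pixel" rule is only a stand-in for proper tip localization; its sole purpose is to show where a pixel coordinate for the actual tip location comes from.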
FIG. 4 illustrates an image 400, which is the image 300 after edge detection segmentation has been performed. The image 400 shows the surgical instrument shadow 420. FIG. 4 is only an example, and other types of segmentation may be performed to obtain the surgical instrument shadow.
Once the computer software has located the actual location of the tip of the surgical instrument 320 as illustrated, for example, in FIG. 4, the computer software may compute the navigation error between the actual location of the surgical instrument and the predicted location of the surgical instrument. In an embodiment, in order to evaluate the navigation error, two quantities may be computed: the 3D error in the navigation volume and the 2D error in the navigated image.
FIG. 5 illustrates a graphic 500 showing the computation of the 3D error in the navigation volume. As shown in the graphic 500, the coordinates 520 (ua, va) denote the pixel coordinates 520 of the actual location of the surgical instrument tip, computed using image segmentation in the image. The coordinates 530 (up, vp) denote the pixel coordinates 530 of the predicted location of the surgical instrument tip in the image. The back-projection line 510 L(ua, va) stemming from the pixel coordinates 520 of the actual location of the surgical instrument tip 520 in the image is computed using camera calibration parameters. In an embodiment, the computation of the back-projection line stemming from an image pixel (u, v) is performed once the camera calibration parameters have been computed. The camera calibration is a process that aims at estimating the X-ray image formation model. The estimation enables the system to compute the 2D projection of a 3D point or, inversely, to compute the back-projection line associated with a 2D pixel within the calibrated image. The output of the camera calibration process is generally referred to as camera parameters or projection parameters. The parameters enable the passage from a 3D point in space to its 2D projection and the computation, for a 2D pixel, of the back-projection line associated with that pixel. The back-projection line 510 L(ua, va) is computed in the tracker coordinates system.
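Under an ideal pinhole model (an assumption; a real fluoroscope calibration model may also include distortion terms), the back-projection line for a pixel can be sketched as an origin plus a direction. The intrinsic parameters `fx`, `fy`, `cx`, `cy` are hypothetical stand-ins for the calibrated camera parameters described above.

```python
# Sketch of back-projecting a pixel (u, v) into a 3D ray, assuming an ideal
# pinhole model with the camera center at the origin of the tracker frame.
# fx, fy (focal lengths in pixels) and cx, cy (principal point) would come
# from the camera calibration step described in the text.

def back_project(u, v, fx, fy, cx, cy):
    """Return (origin, direction) of the line L(u, v) in camera coordinates."""
    origin = (0.0, 0.0, 0.0)                         # pinhole center
    direction = ((u - cx) / fx, (v - cy) / fy, 1.0)  # points into the scene
    return origin, direction

origin, direction = back_project(256.0, 256.0, fx=1000.0, fy=1000.0,
                                 cx=256.0, cy=256.0)
print(direction)  # (0.0, 0.0, 1.0): the principal ray
```

A pixel at the principal point back-projects to the principal ray; any other pixel yields a ray tilted in proportion to its offset from (cx, cy).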
Also shown in the graphic 500 is the surgical instrument in the navigation volume 540. The distance between the back-projection line 510 L(ua, va) and the 3D coordinates of the predicted location of the surgical instrument tip in the tracker coordinates (X, Y, Z)t 550 is computed. The 3D coordinates of the surgical instrument tip (X, Y, Z)t 550 in the tracker coordinates system are known because the surgical instrument may be mounted onto an electromagnetic receiver and its tip has previously been calibrated. For example, the surgical instrument tip may be calibrated using fixed-points calibration.
The distance between the back-projection line 510 L(ua, va) and the 3D coordinates of the surgical instrument tip (X, Y, Z)t 550 is the navigation error in the 3D navigation volume 560. The navigation error in the 3D navigation volume 560 may be computed as follows:
E3d = Distance[L(ua, va), (X, Y, Z)t]  Equation 1
where E3d is the navigation error in the 3D navigation volume. In Equation 1, the Distance function used to compute the 3D error is the distance between the line L(ua, va) and the 3D point (X, Y, Z)t. The distance between the line L(ua, va) and the 3D point (X, Y, Z)t may be geometrically defined as the length of the line segment that stems from the 3D point and is perpendicular to the line L(ua, va). This distance is measured in the plane defined by the line L(ua, va) and the 3D point (X, Y, Z)t.
The mathematical formula for the distance between the line L(ua, va) and the 3D point (X, Y, Z)t may depend on the parameters used to define the 3D line. For example, the distance between the 3D point (x0, y0, z0) and the line L defined as the line that passes through (x1, y1, z1) with its orientation defined by the vector (a, b, c)t is given by:
D[L(ua, va), (X, Y, Z)t] = sqrt{[(y0 − y1)c − (z0 − z1)b]² + [(z0 − z1)a − (x0 − x1)c]² + [(x0 − x1)b − (y0 − y1)a]²} / sqrt(a² + b² + c²)  Equation 2
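Equations 1 and 2 amount to the standard point-to-line distance, which can be written compactly via the cross product of (p0 − p1) with the line direction. A minimal sketch (the function name and test points are illustrative):

```python
import math

# Sketch of Equations 1-2: distance from the predicted tip p0 = (x0, y0, z0)
# to the back-projection line through p1 = (x1, y1, z1) with direction (a, b, c).

def point_line_distance(p0, p1, v):
    """Return |(p0 - p1) x v| / |v|, i.e. the 3D navigation error E3d."""
    dx, dy, dz = (p0[i] - p1[i] for i in range(3))
    a, b, c = v
    cross = (dy * c - dz * b,   # |y0-y1  z0-z1; b  c|
             dz * a - dx * c,   # |z0-z1  x0-x1; c  a|
             dx * b - dy * a)   # |x0-x1  y0-y1; a  b|
    return math.sqrt(sum(t * t for t in cross)) / math.sqrt(a * a + b * b + c * c)

# A point one unit off a line along the x-axis is at distance 1.0.
print(point_line_distance((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```

Each component of the cross product is one of the 2x2 determinants in Equation 2, and the denominator is the norm of the direction vector (a, b, c).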
FIG. 6 illustrates a graphic 600 showing the computation of the 2D error in an image. The 2D error in the image may also be called the 2D projection error. As shown in the graphic 600, the 3D coordinates of the predicted location of the surgical instrument tip (X, Y, Z)t are projected onto the image using the camera projection parameters and performing a projection operation. In an embodiment, the computation of the back-projection line stemming from an image pixel (u, v) is performed once the camera calibration parameters have been computed. The camera calibration is a process that aims at estimating the X-ray image formation model. The estimation enables the system to compute the 2D projection of a 3D point or, inversely, to compute the back-projection line associated with a 2D pixel within the calibrated image. The output of the camera calibration process is generally referred to as camera parameters or projection parameters. The parameters enable the passage from a 3D point in space to its 2D projection and the computation, for a 2D pixel, of the back-projection line associated with that pixel. The projection of the 3D coordinates of the predicted location of the surgical instrument yields the predicted 2D instrument location on the image. The coordinates 630 (up, vp) denote the pixel coordinates 630 of the predicted location of the surgical instrument tip in the image. The coordinates 620 (ua, va) denote the pixel coordinates 620 of the actual location of the surgical instrument tip, computed using image segmentation in the image.
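The forward projection used here can likewise be sketched under the same pinhole assumption; the intrinsics `fx`, `fy`, `cx`, `cy` are again hypothetical stand-ins for the calibrated projection parameters, not values from the patent.

```python
# Sketch of projecting the predicted 3D tip (expressed in the camera/tracker
# frame, with z > 0 toward the detector) onto the image, assuming an ideal
# pinhole model. Real projection parameters come from camera calibration.

def project(point, fx, fy, cx, cy):
    """Return the pixel coordinates (u, v) of a 3D point (x, y, z)."""
    x, y, z = point
    return (fx * x / z + cx, fy * y / z + cy)

print(project((50.0, 0.0, 500.0), fx=1000.0, fy=1000.0, cx=256.0, cy=256.0))
# (356.0, 256.0)
```

This is the inverse companion of the back-projection sketch: projecting a 3D point yields the pixel (up, vp) that is then compared against the segmented pixel (ua, va).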
The distance between the predicted projection 630 (up, vp) and the actual projection 620 (ua, va) of the surgical instrument tip on the image is the 2D navigation error 660 in the image. The 2D navigation error 660 in the image may also be called the 2D navigation error in the confirmation shot. The 2D navigation error 660 may be computed as follows:
E2d = D = sqrt[(up − ua)² + (vp − va)²]  Equation 3
The distance D is the difference in 2D image space between the predicted location of the surgical instrument tip 630 and the actual location of the surgical instrument tip 620. The distance D is equal to the 2D navigation error 660, E2d.
Accordingly, the navigation error in the 3D navigation volume 560 and the 2D navigation error 660 may be computed and quantified. In accordance with an embodiment of the present invention, the quantified error values may be compared with a threshold error value. In an embodiment, if one of the quantified error values is greater than the associated threshold error value, the user may be alerted that the navigation system is inaccurate. It should be noted that it is not necessary that both the navigation error in the 3D navigation volume 560 and the 2D navigation error 660 be used. In an embodiment, one of the values may be used. In another embodiment, both of the values may be used. The embodiments of the present invention free the surgeon from determining whether the surgical navigation system is accurate.
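Putting Equation 3 and the threshold comparison together, the automated check might look like the following sketch (the pixel values and the 2-pixel threshold are illustrative; the patent does not specify threshold values):

```python
import math

# Sketch of the automated accuracy check: compute the 2D error (Equation 3)
# and alert if any quantified error exceeds its associated threshold.

def navigation_error_2d(predicted, actual):
    """Equation 3: E2d = sqrt((up - ua)^2 + (vp - va)^2)."""
    (up, vp), (ua, va) = predicted, actual
    return math.sqrt((up - ua) ** 2 + (vp - va) ** 2)

def needs_alert(errors, thresholds):
    """Return True if any quantified error exceeds its threshold."""
    return any(e > t for e, t in zip(errors, thresholds))

e2d = navigation_error_2d(predicted=(130.0, 118.0), actual=(127.0, 114.0))
print(e2d)                        # 5.0 pixels
print(needs_alert([e2d], [2.0]))  # True -> warn the user
```

The same `needs_alert` comparison accommodates either error alone or both together, matching the statement that one or both of the quantified values may be used.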
FIG. 7 illustrates a method 700 for assessing the accuracy of a surgical navigation system in accordance with an embodiment of the present invention. At step 710, an X-ray image that captures the surgical instrument is acquired. In an embodiment, the X-ray image may be a confirmation shot. At step 720, the X-ray image acquired at step 710 may be segmented. The segmenting may include image processing in order to segment and extract the tip of the surgical instrument in the X-ray image. In an embodiment, the segmentation of the surgical instrument may be performed using an edge detection or pattern recognition algorithm to achieve accurate localization of the surgical instrument tip within the X-ray image. At step 730, the navigation error between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip is computed. In an embodiment, the navigation error may be the navigation error in a 3D navigation volume. In an embodiment, the navigation error may be the 2D navigation error. At step 740, the navigation error may be compared to a threshold value. If the navigation error is greater than the threshold value, at step 750 a user may be alerted that the surgical navigation system is operating with insufficient accuracy.
FIG. 8 illustrates a method 800 for computing the navigation error in a 3D navigation volume. At step 810, a back-projection line that stems from the pixel coordinates of the actual location of the surgical instrument tip is computed. The back-projection line is computed in the tracker coordinates system. In an embodiment, the computation of the back-projection line stemming from an image pixel (u, v) is performed once the camera calibration parameters have been computed. The camera calibration is a process that aims at estimating the X-ray image formation model. The estimation enables the computation of the 2D projection of a 3D point or, inversely, of the back-projection line associated with a 2D pixel within the calibrated image. The output of the camera calibration process is generally referred to as camera parameters or projection parameters. The parameters enable the passage from a 3D point in space to its 2D projection and the computation, for a 2D pixel, of the back-projection line associated with that pixel.
At step 820, the distance between the back-projection line and the 3D coordinates of the predicted location of the surgical instrument tip is computed. The 3D coordinates of the surgical instrument tip in the tracker coordinates system are known because the surgical instrument may be mounted onto an electromagnetic receiver and its tip has previously been calibrated. For example, the surgical instrument tip may be calibrated using fixed-points calibration. In an embodiment, the distance between the back-projection line and the 3D coordinates of the surgical instrument tip is the navigation error in the 3D navigation volume.
FIG. 9 illustrates a method 900 for computing the navigation error in a 2D image. At step 910, the 3D coordinates of the predicted location of the surgical instrument tip are projected onto an image. In an embodiment, the 3D coordinates of the predicted location of the surgical instrument tip are projected onto the image using the camera projection parameters. The projection of the 3D coordinates of the predicted location of the surgical instrument yields the predicted 2D instrument location on the image.
At step 920, the distance between the predicted projection and the actual projection of the surgical instrument tip on the image is computed. In an embodiment, the distance between the predicted projection and the actual projection of the surgical instrument tip on the image is the 2D navigation error in the image. The 2D navigation error in the image may also be called the 2D navigation error in the confirmation shot.
The system and method 700 described above may be carried out as part of a computer-readable storage medium including a set of instructions for a computer. The set of instructions may include an acquisition routine for acquiring an X-ray image that captures the surgical instrument. In an embodiment, the X-ray image may be a confirmation shot. The set of instructions may also include a segmentation routine for segmenting the acquired X-ray image. The segmenting may include image processing in order to segment and extract the tip of the surgical instrument in the X-ray image. In an embodiment, the segmentation of the surgical instrument may be performed using an edge detection or pattern recognition algorithm to achieve accurate localization of the surgical instrument tip within the X-ray image. The set of instructions may also include a computation routine for computing the navigation error between the predicted location of the surgical instrument tip and the actual location of the surgical instrument tip. In an embodiment, the navigation error may be the navigation error in a 3D navigation volume. In an embodiment, the navigation error may be the 2D navigation error. The set of instructions may also include a comparison routine for comparing the navigation error with a threshold value. The set of instructions may also include an alerting routine for alerting a user, if the navigation error is greater than the threshold value, that the surgical navigation system may be operating with insufficient accuracy.
The system and method 800 described above may be carried out as part of a computer-readable storage medium including a set of instructions for a computer. The set of instructions may include a first computation routine for computing a back-projection line that stems from the pixel coordinates of the actual location of the surgical instrument tip. In an embodiment, the back-projection line is computed in the tracker coordinates system. The set of instructions may also include a second computation routine for computing the distance between the back-projection line and the 3D coordinates of the predicted location of the surgical instrument tip. The 3D coordinates of the surgical instrument tip in the tracker coordinates system are known because the surgical instrument may be mounted onto an electromagnetic receiver and its tip has previously been calibrated. For example, the surgical instrument tip may be calibrated using fixed-points calibration. In an embodiment, the distance between the back-projection line and the 3D coordinates of the surgical instrument tip is the navigation error in the 3D navigation volume.
The system and method 900 described above may be carried out as part of a computer-readable storage medium including a set of instructions for a computer. The set of instructions may include a first computation routine for projecting the 3D coordinates of the predicted location of the surgical instrument tip onto an image. In an embodiment, the 3D coordinates of the predicted location of the surgical instrument tip are projected onto the image using camera projection parameters. The projection of the 3D coordinates of the predicted location of the surgical instrument yields the predicted 2D instrument location on the image. The set of instructions may also include a second computation routine for computing the distance between the predicted projection and the actual projection of the surgical instrument tip on the image. In an embodiment, the distance between the predicted projection and the actual projection of the surgical instrument tip on the image is the 2D navigation error in the image. The 2D navigation error in the image may also be called the 2D navigation error in the confirmation shot.
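The projection and 2D distance computation can be sketched with a pinhole model; the 3x4 camera matrix below is a toy example standing in for the calibrated camera projection parameters, and the numeric values are assumptions for illustration only.

```python
import numpy as np

def project_point(P, X):
    """Project a 3D tip location X (tracker coordinates) to pixel
    coordinates using a 3x4 camera projection matrix P."""
    u = P @ np.append(np.asarray(X, float), 1.0)   # homogeneous coords
    return u[:2] / u[2]

def error_2d(P, predicted_tip_3d, imaged_tip_px):
    """Pixel distance between the projected predicted tip and the
    segmented (actual) tip location in the confirmation shot."""
    return float(np.linalg.norm(project_point(P, predicted_tip_3d)
                                - np.asarray(imaged_tip_px, float)))

# Toy pinhole camera: 1000 px focal length, principal point (320, 240);
# these parameters are assumptions for the example.
P = np.array([[1000.0,    0.0, 320.0, 0.0],
              [   0.0, 1000.0, 240.0, 0.0],
              [   0.0,    0.0,   1.0, 0.0]])
e = error_2d(P, [0.0, 0.0, 500.0], [322.0, 240.0])
# the predicted tip projects to (320, 240), so e is 2.0 pixels
```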
While the invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (20)

The invention claimed is:
1. A method for assessing an accuracy of a surgical navigation system, the method comprising:
acquiring an X-ray image capturing a surgical instrument tip;
tracking a tracked location of the surgical instrument tip;
segmenting the surgical instrument tip in the X-ray image to determine an imaged location of the surgical instrument tip; and
computing a distance between the tracked location of the surgical instrument tip and the imaged location of the surgical instrument tip to form a navigation error of the surgical navigation system.
2. The method of claim 1, comprising comparing the distance between the tracked location of the surgical instrument tip and the imaged location of the surgical instrument tip with a threshold value.
3. The method of claim 2, comprising alerting a user if the navigation error of the surgical navigation system is greater than the threshold value.
4. The method of claim 1, wherein the computing the distance between the tracked location of the surgical instrument tip and the imaged location of the surgical instrument tip comprises:
computing a back-projection line stemming from the imaged location of the surgical instrument tip; and
computing a distance between the back-projection line and the tracked location of the surgical instrument tip.
5. The method of claim 1, wherein the computing the distance between the tracked location of the surgical instrument and the imaged location of the surgical instrument tip comprises:
projecting 3D coordinates of the tracked location of the surgical instrument tip onto an image to form a tracked projection; and
computing a distance between the tracked projection of the surgical instrument tip and an imaged projection of the surgical instrument tip on the image.
6. The method of claim 1, wherein the tracking the tracked location of the surgical instrument tip comprises tracking the tracked location of the surgical instrument tip with an electromagnetic tracking system.
7. The method of claim 1, wherein the tracking the tracked location of the surgical instrument tip comprises tracking the tracked location of the surgical instrument tip according to a planned trajectory of the surgical instrument tip.
8. A system for assessing an accuracy of a surgical navigation system, the system comprising:
an X-ray unit configured to acquire an X-ray image capturing a surgical instrument tip;
a tracking system configured to track a tracked location of the surgical instrument tip; and
a computer unit configured to segment the surgical instrument tip in the X-ray image to form an imaged location of the surgical instrument tip, the computer unit configured to compute the distance between the tracked location of the surgical instrument tip and the imaged location of the surgical instrument tip to form a navigation error of the surgical navigation system.
9. The system of claim 8, wherein the computer unit is configured to compare the navigation error of the surgical navigation system with a threshold value.
10. The system of claim 9, wherein the computer unit is configured to alert a user if the navigation error of the surgical navigation system is greater than the threshold value.
11. The system of claim 8, wherein the tracking system comprises an electromagnetic tracking system.
12. The system of claim 8, wherein the tracking system is configured to track the tracked location of the surgical instrument tip according to a planned trajectory of the surgical instrument tip.
13. A non-transitory computer readable medium having a set of instructions for execution by a computer, the set of instructions comprising:
an acquisition routine for acquiring an X-ray image capturing a surgical instrument tip;
a segmentation routine for segmenting the surgical instrument tip in the X-ray image to determine an imaged location of the surgical instrument tip; and
a computation routine for computing the distance between a tracked location of the surgical instrument tip from a tracking system and the imaged location of the surgical instrument tip to form a navigation error of the surgical navigation system.
14. The set of instructions of claim 13, comprising a comparison routine to compare the navigation error of the surgical navigation system with a threshold value.
15. The set of instructions of claim 14, comprising an alerting routine to alert a user if the navigation error of the surgical navigation system is greater than the threshold value.
16. The set of instructions of claim 13, wherein the computation routine computes the navigation error of the surgical navigation system in a 3D navigation volume.
17. The set of instructions of claim 16, comprising:
a first computation routine for computing a back-projection line stemming from pixel coordinates of the imaged location of the surgical instrument tip; and
a second computation routine for computing a distance between the back-projection line and pixel coordinates of the tracked location of the surgical instrument tip.
18. The set of instructions of claim 13, comprising:
a first computation routine for projecting 3D coordinates of the tracked location of the surgical instrument tip onto an image; and
a second computation routine for computing a distance between the tracked projection and an imaged projection of the surgical instrument tip on the image.
19. The set of instructions of claim 13, wherein the tracking system comprises an electromagnetic tracking system.
20. The set of instructions of claim 13, wherein the tracked location of the surgical instrument tip is tracked according to a planned trajectory of the surgical instrument tip.
US11/767,281 | priority 2007-06-22 | filed 2007-06-22 | System and method for accuracy verification for image based surgical navigation | Active, expires 2034-07-30 | US9468412B2 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US11/767,281 (US9468412B2) | 2007-06-22 | 2007-06-22 | System and method for accuracy verification for image based surgical navigation


Publications (2)

Publication Number | Publication Date
US20080319311A1 (en) | 2008-12-25
US9468412B2 (en) | 2016-10-18

Family

ID=40137222

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US11/767,281 (US9468412B2; Active, expires 2034-07-30) | System and method for accuracy verification for image based surgical navigation | 2007-06-22 | 2007-06-22

Country Status (1)

Country | Link
US (1) | US9468412B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US11350995B2 (en) | 2016-10-05 | 2022-06-07 | Nuvasive, Inc. | Surgical navigation systems and methods
US11612440B2 (en) | 2019-09-05 | 2023-03-28 | Nuvasive, Inc. | Surgical instrument tracking devices and related methods

Families Citing this family (169)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8190238B2 (en)*2005-12-092012-05-29Hansen Medical, Inc.Robotic catheter system and methods
US8219178B2 (en)2007-02-162012-07-10Catholic Healthcare WestMethod and system for performing invasive medical procedures using a surgical robot
US10653497B2 (en)2006-02-162020-05-19Globus Medical, Inc.Surgical tool systems and methods
US10893912B2 (en)2006-02-162021-01-19Globus Medical Inc.Surgical tool systems and methods
US10357184B2 (en)2012-06-212019-07-23Globus Medical, Inc.Surgical tool systems and method
US20090036900A1 (en)2007-02-022009-02-05Hansen Medical, Inc.Surgery methods using a robotic instrument system
US9403020B2 (en)2008-11-042016-08-02Nevro CorporationModeling positions of implanted devices in a patient
US9642555B2 (en)*2008-11-202017-05-09Medtronic, Inc.Subcutaneous lead guidance
US9254123B2 (en)2009-04-292016-02-09Hansen Medical, Inc.Flexible and steerable elongate instruments with shape control and support elements
WO2011044421A1 (en)2009-10-082011-04-14C. R. Bard, Inc.Spacers for use with an ultrasound probe
CN103228219B (en)2010-08-092016-04-27C·R·巴德股份有限公司 Support and Covering Structures for Ultrasound Probe Heads
US8805519B2 (en)2010-09-302014-08-12Nevro CorporationSystems and methods for detecting intrathecal penetration
US8965482B2 (en)*2010-09-302015-02-24Nevro CorporationSystems and methods for positioning implanted devices in a patient
US20120191079A1 (en)2011-01-202012-07-26Hansen Medical, Inc.System and method for endoluminal and translumenal therapy
US9308050B2 (en)2011-04-012016-04-12Ecole Polytechnique Federale De Lausanne (Epfl)Robotic system and method for spinal and other surgeries
US20130030363A1 (en)2011-07-292013-01-31Hansen Medical, Inc.Systems and methods utilizing shape sensing fibers
AU2013211937B2 (en)2012-01-252016-07-28Nevro CorporationLead anchors and associated systems and methods
US8676331B2 (en)2012-04-022014-03-18Nevro CorporationDevices for controlling spinal cord modulation for inhibiting pain, and associated systems and methods, including controllers for automated parameter selection
EP2861153A4 (en)*2012-06-152016-10-19Bard Inc C RApparatus and methods for detection of a removable cap on an ultrasound probe
US10874466B2 (en)2012-06-212020-12-29Globus Medical, Inc.System and method for surgical tool insertion using multiaxis force and moment feedback
US10231791B2 (en)2012-06-212019-03-19Globus Medical, Inc.Infrared signal based position recognition system for use with a robot-assisted surgery
US11864745B2 (en)2012-06-212024-01-09Globus Medical, Inc.Surgical robotic system with retractor
US12004905B2 (en)2012-06-212024-06-11Globus Medical, Inc.Medical imaging systems using robotic actuators and related methods
US12262954B2 (en)2012-06-212025-04-01Globus Medical, Inc.Surgical robotic automation with tracking markers
EP2863827B1 (en)2012-06-212022-11-16Globus Medical, Inc.Surgical robot platform
US12329593B2 (en)2012-06-212025-06-17Globus Medical, Inc.Surgical robotic automation with tracking markers
US11896446B2 (en)2012-06-212024-02-13Globus Medical, IncSurgical robotic automation with tracking markers
US20150032164A1 (en)2012-06-212015-01-29Globus Medical, Inc.Methods for Performing Invasive Medical Procedures Using a Surgical Robot
US10646280B2 (en)2012-06-212020-05-12Globus Medical, Inc.System and method for surgical tool insertion using multiaxis force and moment feedback
US10758315B2 (en)2012-06-212020-09-01Globus Medical Inc.Method and system for improving 2D-3D registration convergence
US10799298B2 (en)2012-06-212020-10-13Globus Medical Inc.Robotic fluoroscopic navigation
US11857266B2 (en)2012-06-212024-01-02Globus Medical, Inc.System for a surveillance marker in robotic-assisted surgery
US10842461B2 (en)2012-06-212020-11-24Globus Medical, Inc.Systems and methods of checking registrations for surgical systems
US11793570B2 (en)2012-06-212023-10-24Globus Medical Inc.Surgical robotic automation with tracking markers
US11607149B2 (en)2012-06-212023-03-21Globus Medical Inc.Surgical tool systems and method
US11317971B2 (en)2012-06-212022-05-03Globus Medical, Inc.Systems and methods related to robotic guidance in surgery
US11298196B2 (en)2012-06-212022-04-12Globus Medical Inc.Surgical robotic automation with tracking markers and controlled tool advancement
US11974822B2 (en)2012-06-212024-05-07Globus Medical Inc.Method for a surveillance marker in robotic-assisted surgery
US11395706B2 (en)2012-06-212022-07-26Globus Medical Inc.Surgical robot platform
US11786324B2 (en)2012-06-212023-10-17Globus Medical, Inc.Surgical robotic automation with tracking markers
US11116576B2 (en)2012-06-212021-09-14Globus Medical Inc.Dynamic reference arrays and methods of use
US10350013B2 (en)2012-06-212019-07-16Globus Medical, Inc.Surgical tool systems and methods
US10624710B2 (en)2012-06-212020-04-21Globus Medical, Inc.System and method for measuring depth of instrumentation
US12310683B2 (en)2012-06-212025-05-27Globus Medical, Inc.Surgical tool systems and method
US11864839B2 (en)2012-06-212024-01-09Globus Medical Inc.Methods of adjusting a virtual implant and related surgical navigation systems
US11399900B2 (en)2012-06-212022-08-02Globus Medical, Inc.Robotic systems providing co-registration using natural fiducials and related methods
US11045267B2 (en)2012-06-212021-06-29Globus Medical, Inc.Surgical robotic automation with tracking markers
US11253327B2 (en)2012-06-212022-02-22Globus Medical, Inc.Systems and methods for automatically changing an end-effector on a surgical robot
US11589771B2 (en)2012-06-212023-02-28Globus Medical Inc.Method for recording probe movement and determining an extent of matter removed
US11857149B2 (en)2012-06-212024-01-02Globus Medical, Inc.Surgical robotic systems with target trajectory deviation monitoring and related methods
US10136954B2 (en)2012-06-212018-11-27Globus Medical, Inc.Surgical tool systems and method
US12220120B2 (en)2012-06-212025-02-11Globus Medical, Inc.Surgical robotic system with retractor
US11963755B2 (en)2012-06-212024-04-23Globus Medical Inc.Apparatus for recording probe movement
US20140148673A1 (en)2012-11-282014-05-29Hansen Medical, Inc.Method of anchoring pullwire directly articulatable region in catheter
US20140277334A1 (en)2013-03-142014-09-18Hansen Medical, Inc.Active drives for robotic catheter manipulators
US9326822B2 (en)2013-03-142016-05-03Hansen Medical, Inc.Active drives for robotic catheter manipulators
US9408669B2 (en)2013-03-152016-08-09Hansen Medical, Inc.Active drive mechanism with finite range of motion
US20140276936A1 (en)2013-03-152014-09-18Hansen Medical, Inc.Active drive mechanism for simultaneous rotation and translation
US9265935B2 (en)2013-06-282016-02-23Nevro CorporationNeurological stimulation lead anchors and associated systems and methods
US9480860B2 (en)*2013-09-272016-11-01Varian Medical Systems, Inc.System and methods for processing images to measure multi-leaf collimator, collimator jaw, and collimator performance utilizing pre-entered characteristics
US9283048B2 (en)2013-10-042016-03-15KB Medical SAApparatus and systems for precise guidance of surgical tools
US9241771B2 (en)2014-01-152016-01-26KB Medical SANotched apparatus for guidance of an insertable instrument along an axis during spinal surgery
WO2015121311A1 (en)2014-02-112015-08-20KB Medical SASterile handle for controlling a robotic surgical system from a sterile field
US11033182B2 (en)2014-02-212021-06-153Dintegrated ApsSet comprising a surgical instrument
US10046140B2 (en)2014-04-212018-08-14Hansen Medical, Inc.Devices, systems, and methods for controlling active drive systems
EP3134022B1 (en)2014-04-242018-01-10KB Medical SASurgical instrument holder for use with a robotic surgical system
CN106999248B (en)2014-06-192021-04-06Kb医疗公司Systems and methods for performing minimally invasive surgery
US10357257B2 (en)2014-07-142019-07-23KB Medical SAAnti-skid surgical instrument for use in preparing holes in bone tissue
US10765438B2 (en)2014-07-142020-09-08KB Medical SAAnti-skid surgical instrument for use in preparing holes in bone tissue
EP3226781B1 (en)2014-12-022018-08-01KB Medical SARobot assisted volume removal during surgery
US10013808B2 (en)2015-02-032018-07-03Globus Medical, Inc.Surgeon head-mounted display apparatuses
WO2016131903A1 (en)2015-02-182016-08-25KB Medical SASystems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
US9789321B2 (en)2015-04-032017-10-17Nevro Corp.Couplings for implanted leads and external stimulators, and associated systems and methods
CN108024806B (en)2015-07-212022-07-013D集成公司Cannula assembly kit, trocar assembly kit, sleeve assembly, minimally invasive surgical system and method thereof
US11020144B2 (en)2015-07-212021-06-013Dintegrated ApsMinimally invasive surgery system
US10058394B2 (en)2015-07-312018-08-28Globus Medical, Inc.Robot arm and methods of use
US10646298B2 (en)2015-07-312020-05-12Globus Medical, Inc.Robot arm and methods of use
US10080615B2 (en)2015-08-122018-09-25Globus Medical, Inc.Devices and methods for temporary mounting of parts to bone
JP6894431B2 (en)2015-08-312021-06-30ケービー メディカル エスアー Robotic surgical system and method
US10034716B2 (en)2015-09-142018-07-31Globus Medical, Inc.Surgical robotic systems and methods thereof
DK178899B1 (en)2015-10-092017-05-083Dintegrated ApsA depiction system
US9771092B2 (en)2015-10-132017-09-26Globus Medical, Inc.Stabilizer wheel assembly and methods of use
US11058378B2 (en)2016-02-032021-07-13Globus Medical, Inc.Portable medical imaging system
US10842453B2 (en)2016-02-032020-11-24Globus Medical, Inc.Portable medical imaging system
US10117632B2 (en)2016-02-032018-11-06Globus Medical, Inc.Portable medical imaging system with beam scanning collimator
US11883217B2 (en)2016-02-032024-01-30Globus Medical, Inc.Portable medical imaging system and method
US10448910B2 (en)2016-02-032019-10-22Globus Medical, Inc.Portable medical imaging system
US10866119B2 (en)2016-03-142020-12-15Globus Medical, Inc.Metal detector for detecting insertion of a surgical device into a hollow tube
EP3241518B1 (en)2016-04-112024-10-23Globus Medical, IncSurgical tool systems
US10463439B2 (en)2016-08-262019-11-05Auris Health, Inc.Steerable catheter with shaft load distributions
US11241559B2 (en)2016-08-292022-02-08Auris Health, Inc.Active drive for guidewire manipulation
US11039893B2 (en)2016-10-212021-06-22Globus Medical, Inc.Robotic surgical systems
JP7233841B2 (en)2017-01-182023-03-07ケービー メディカル エスアー Robotic Navigation for Robotic Surgical Systems
EP3351202B1 (en)2017-01-182021-09-08KB Medical SAUniversal instrument guide for robotic surgical systems
JP7583513B2 (en)2017-01-182024-11-14ケービー メディカル エスアー Universal instrument guide for robotic surgical systems, surgical instrument system
US10980999B2 (en)2017-03-092021-04-20Nevro Corp.Paddle leads and delivery tools, and associated systems and methods
US11071594B2 (en)2017-03-162021-07-27KB Medical SARobotic navigation of robotic surgical systems
US20180289432A1 (en)2017-04-052018-10-11Kb Medical, SaRobotic surgical systems for preparing holes in bone tissue and methods of their use
US11135015B2 (en)2017-07-212021-10-05Globus Medical, Inc.Robot surgical platform
US11357548B2 (en)2017-11-092022-06-14Globus Medical, Inc.Robotic rod benders and related mechanical and motor housings
EP3492032B1 (en)2017-11-092023-01-04Globus Medical, Inc.Surgical robotic systems for bending surgical rods
US11794338B2 (en)2017-11-092023-10-24Globus Medical Inc.Robotic rod benders and related mechanical and motor housings
US11134862B2 (en)2017-11-102021-10-05Globus Medical, Inc.Methods of selecting surgical implants and related devices
US20190254753A1 (en)2018-02-192019-08-22Globus Medical, Inc.Augmented reality navigation systems for use with robotic surgical systems and methods of their use
JP7225259B2 (en)2018-03-282023-02-20オーリス ヘルス インコーポレイテッド Systems and methods for indicating probable location of instruments
WO2019191423A1 (en)2018-03-292019-10-03Nevro Corp.Leads having sidewall openings, and associated systems and methods
US10573023B2 (en)2018-04-092020-02-25Globus Medical, Inc.Predictive visualization of medical imaging scanner component movement
US11337742B2 (en)2018-11-052022-05-24Globus Medical IncCompliant orthopedic driver
US11278360B2 (en)2018-11-162022-03-22Globus Medical, Inc.End-effectors for surgical robotic systems having sealed optical components
US11602402B2 (en)2018-12-042023-03-14Globus Medical, Inc.Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11744655B2 (en)2018-12-042023-09-05Globus Medical, Inc.Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11918313B2 (en)2019-03-152024-03-05Globus Medical Inc.Active end effectors for surgical robots
US11806084B2 (en)2019-03-222023-11-07Globus Medical, Inc.System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11382549B2 (en)2019-03-222022-07-12Globus Medical, Inc.System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11419616B2 (en)2019-03-222022-08-23Globus Medical, Inc.System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11571265B2 (en)2019-03-222023-02-07Globus Medical Inc.System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US20200297357A1 (en)2019-03-222020-09-24Globus Medical, Inc.System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11317978B2 (en)2019-03-222022-05-03Globus Medical, Inc.System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11045179B2 (en)2019-05-202021-06-29Global Medical IncRobot-mounted retractor system
US11628023B2 (en)2019-07-102023-04-18Globus Medical, Inc.Robotic navigational system for interbody implants
US11571171B2 (en)2019-09-242023-02-07Globus Medical, Inc.Compound curve cable chain
US12396692B2 (en)2019-09-242025-08-26Globus Medical, Inc.Compound curve cable chain
US12329391B2 (en)2019-09-272025-06-17Globus Medical, Inc.Systems and methods for robot-assisted knee arthroplasty surgery
US11890066B2 (en)2019-09-302024-02-06Globus Medical, IncSurgical robot with passive end effector
US12408929B2 (en)2019-09-272025-09-09Globus Medical, Inc.Systems and methods for navigating a pin guide driver
US11864857B2 (en)2019-09-272024-01-09Globus Medical, Inc.Surgical robot with passive end effector
US11426178B2 (en)2019-09-272022-08-30Globus Medical Inc.Systems and methods for navigating a pin guide driver
US11510684B2 (en)2019-10-142022-11-29Globus Medical, Inc.Rotary motion passive end effector for surgical robots in orthopedic surgeries
US11992373B2 (en)2019-12-102024-05-28Globus Medical, IncAugmented reality headset with varied opacity for navigated robotic surgery
US12133772B2 (en)2019-12-102024-11-05Globus Medical, Inc.Augmented reality headset for navigated robotic surgery
US12220176B2 (en)2019-12-102025-02-11Globus Medical, Inc.Extended reality instrument interaction zone for navigated robotic
US12064189B2 (en)2019-12-132024-08-20Globus Medical, Inc.Navigated instrument for use in robotic guided surgery
US11464581B2 (en)2020-01-282022-10-11Globus Medical, Inc.Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11382699B2 (en)2020-02-102022-07-12Globus Medical Inc.Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US12414752B2 (en)2020-02-172025-09-16Globus Medical, Inc.System and method of determining optimal 3-dimensional position and orientation of imaging device for imaging patient bones
US11207150B2 (en)2020-02-192021-12-28Globus Medical, Inc.Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
US11253216B2 (en)2020-04-282022-02-22Globus Medical Inc.Fixtures for fluoroscopic imaging systems and related navigation systems and methods
US11153555B1 (en)2020-05-082021-10-19Globus Medical Inc.Extended reality headset camera system for computer assisted navigation in surgery
US11510750B2 (en)2020-05-082022-11-29Globus Medical, Inc.Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11382700B2 (en)2020-05-082022-07-12Globus Medical Inc.Extended reality headset tool tracking and control
US11317973B2 (en)2020-06-092022-05-03Globus Medical, Inc.Camera tracking bar for computer assisted navigation during surgery
US12070276B2 (en)2020-06-092024-08-27Globus Medical Inc.Surgical object tracking in visible light via fiducial seeding and synthetic image registration
US11382713B2 (en)2020-06-162022-07-12Globus Medical, Inc.Navigated surgical system with eye to XR headset display calibration
US11877807B2 (en)2020-07-102024-01-23Globus Medical, IncInstruments for navigated orthopedic surgeries
US11793588B2 (en)2020-07-232023-10-24Globus Medical, Inc.Sterile draping of robotic arms
US11737831B2 (en)2020-09-022023-08-29Globus Medical Inc.Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11523785B2 (en)2020-09-242022-12-13Globus Medical, Inc.Increased cone beam computed tomography volume length without requiring stitching or longitudinal C-arm movement
US12076091B2 (en)2020-10-272024-09-03Globus Medical, Inc.Robotic navigational system
US11911112B2 (en)2020-10-272024-02-27Globus Medical, Inc.Robotic navigational system
US11941814B2 (en)2020-11-042024-03-26Globus Medical Inc.Auto segmentation using 2-D images taken during 3-D imaging spin
US11717350B2 (en)2020-11-242023-08-08Globus Medical Inc.Methods for robotic assistance and navigation in spinal surgery and related systems
US12161433B2 (en)2021-01-082024-12-10Globus Medical, Inc.System and method for ligament balancing with robotic assistance
US12150728B2 (en)2021-04-142024-11-26Globus Medical, Inc.End effector for a surgical robot
US12178523B2 (en)2021-04-192024-12-31Globus Medical, Inc.Computer assisted surgical navigation system for spine procedures
US11857273B2 (en)2021-07-062024-01-02Globus Medical, Inc.Ultrasonic robotic surgical navigation
US11439444B1 (en)2021-07-222022-09-13Globus Medical, Inc.Screw tower and rod reduction tool
US12213745B2 (en)2021-09-162025-02-04Globus Medical, Inc.Extended reality systems for visualizing and controlling operating room equipment
US12238087B2 (en)2021-10-042025-02-25Globus Medical, Inc.Validating credential keys based on combinations of credential value strings and input order strings
US12184636B2 (en)2021-10-042024-12-31Globus Medical, Inc.Validating credential keys based on combinations of credential value strings and input order strings
US20230368330A1 (en)2021-10-202023-11-16Globus Medical, Inc.Interpolation of medical images
US20230165639A1 (en)2021-12-012023-06-01Globus Medical, Inc.Extended reality systems with three-dimensional visualizations of medical image scan slices
US11911115B2 (en)2021-12-202024-02-27Globus Medical Inc.Flat panel registration fixture and method of using same
US12103480B2 (en)2022-03-182024-10-01Globus Medical Inc.Omni-wheel cable pusher
US12048493B2 (en)2022-03-312024-07-30Globus Medical, Inc.Camera tracking system identifying phantom markers during computer assisted surgery navigation
US12394086B2 (en)2022-05-102025-08-19Globus Medical, Inc.Accuracy check and automatic calibration of tracked instruments
US12161427B2 (en)2022-06-082024-12-10Globus Medical, Inc.Surgical navigation system with flat panel registration fixture
US12226169B2 (en)2022-07-152025-02-18Globus Medical, Inc.Registration of 3D and 2D images for surgical navigation and robotic guidance without using radiopaque fiducials in the images
US20240020840A1 (en)2022-07-152024-01-18Globus Medical, Inc.REGISTRATION OF 3D and 2D IMAGES FOR SURGICAL NAVIGATION AND ROBOTIC GUIDANCE WITHOUT USING RADIOPAQUE FIDUCIALS IN THE IMAGES
US12318150B2 (en)2022-10-112025-06-03Globus Medical Inc.Camera tracking system for computer assisted surgery navigation

Citations (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5389101A (en)* | 1992-04-21 | 1995-02-14 | University Of Utah | Apparatus and method for photogrammetric surgical localization
US5603318A (en)* | 1992-04-21 | 1997-02-18 | University Of Utah Research Foundation | Apparatus and method for photogrammetric surgical localization
US6129668A (en)* | 1997-05-08 | 2000-10-10 | Lucent Medical Systems, Inc. | System and method to determine the location and orientation of an indwelling medical device
US6470207B1 (en)* | 1999-03-23 | 2002-10-22 | Surgical Navigation Technologies, Inc. | Navigational guidance via computer-assisted fluoroscopic imaging
US6484049B1 (en)* | 2000-04-28 | 2002-11-19 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system
US6560354B1 (en)* | 1999-02-16 | 2003-05-06 | University Of Rochester | Apparatus and method for registration of images to physical space using a weighted combination of points and surfaces
US20030208122A1 (en)* | 2000-03-01 | 2003-11-06 | Melkent Anthony J. | Multiple cannula image guided tool for image guided procedures
US20050096589A1 (en)* | 2003-10-20 | 2005-05-05 | Yehoshua Shachar | System and method for radar-assisted catheter guidance and control
US20050203386A1 (en)* | 2004-03-11 | 2005-09-15 | Siemens Aktiengesellschaft | Method of calibrating an X-ray imaging device
US20050281385A1 (en)* | 2004-06-02 | 2005-12-22 | Johnson Douglas K | Method and system for improved correction of registration error in a fluoroscopic image
US20060058616A1 (en)* | 2003-02-04 | 2006-03-16 | Joel Marquart | Interactive computer-assisted surgery system and method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US11350995B2 (en) | 2016-10-05 | 2022-06-07 | Nuvasive, Inc. | Surgical navigation systems and methods
US11612440B2 (en) | 2019-09-05 | 2023-03-28 | Nuvasive, Inc. | Surgical instrument tracking devices and related methods

Also Published As

Publication number | Publication date
US20080319311A1 (en) | 2008-12-25

Similar Documents

Publication | Title
US9468412B2 (en) | System and method for accuracy verification for image based surgical navigation
US20220047247A1 (en) | Apparatus and method for real-time tracking of tissue structures
US10405825B2 (en) | System and method for automatically determining calibration parameters of a fluoroscope
US7711406B2 (en) | System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging
US8165372B2 (en) | Information processing apparatus for registrating medical images, information processing method and program
US8682413B2 (en) | Systems and methods for automated tracker-driven image selection
US8694075B2 (en) | Intra-operative registration for navigated surgical procedures
US9320569B2 (en) | Systems and methods for implant distance measurement
US9715739B2 (en) | Bone fragment tracking
US8131031B2 (en) | Systems and methods for inferred patient annotation
US20080300477A1 (en) | System and method for correction of automated image registration
US20080300478A1 (en) | System and method for displaying real-time state of imaged anatomy during a surgical procedure
US20080119712A1 (en) | Systems and Methods for Automated Image Registration
US20080119725A1 (en) | Systems and Methods for Visual Verification of CT Registration and Feedback
CN106999247A (en) | For performing the trace labelling supporting structure of navigation surgical procedures and using its surface registration method
US11127153B2 (en) | Radiation imaging device, image processing method, and image processing program
US8126536B2 (en) | Method and apparatus for determining the frontal plane of the pelvic bone
Wang et al. | Intra-op measurement of the mechanical axis deviation: an evaluation study on 19 human cadaver legs
US20140316256A1 (en) | Display Of An Acquired Cine Loop For Procedure Navigation
US20240398375A1 (en) | Spatial registration method for imaging devices
US8067726B2 (en) | Universal instrument calibration system and method of use
US9477686B2 (en) | Systems and methods for annotation and sorting of surgical images
US20240216073A1 (en) | Method and device for generating an uncertainty map for guided percutaneous procedures
KR20190123857A (en) | Apparatus for image overlay and method for the same
US11432898B2 (en) | Tracing platforms and intra-operative systems and methods using same

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:GENERAL ELECTRIC COMPANY, NEW YORK

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAMADEH, MOHAMED ALI;REEL/FRAME:019470/0536

Effective date:20070621

STCF | Information on status: patent grant

Free format text:PATENTED CASE

AS | Assignment

Owner name:STRYKER EUROPEAN HOLDINGS I, LLC, MICHIGAN

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAL ELECTRIC COMPANY;REEL/FRAME:046020/0621

Effective date:20171206

MAFP | Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:4

AS | Assignment

Owner name:STRYKER EUROPEAN HOLDINGS III, LLC, DELAWARE

Free format text:NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:STRYKER EUROPEAN HOLDINGS I, LLC;REEL/FRAME:056969/0771

Effective date:20210219

Owner name:STRYKER EUROPEAN OPERATIONS HOLDINGS LLC, MICHIGAN

Free format text:CHANGE OF NAME;ASSIGNOR:STRYKER EUROPEAN HOLDINGS III, LLC;REEL/FRAME:056969/0893

Effective date:20190226

MAFP | Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:8

AS | Assignment

Owner name:STRYKER EUROPEAN OPERATIONS HOLDINGS LLC, MICHIGAN

Free format text:CHANGE OF ADDRESS;ASSIGNOR:STRYKER EUROPEAN OPERATIONS HOLDINGS LLC;REEL/FRAME:069730/0754

Effective date:20241217

