WO2025191487A1 - System for tracking of an instrument - Google Patents

System for tracking of an instrument

Info

Publication number
WO2025191487A1
Authority
WO
WIPO (PCT)
Prior art keywords
implant
pose
tracking device
tracking
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2025/052622
Other languages
French (fr)
Inventor
Patrick A. Helm
Osvaldo Andres Barrera
Anthony B. Ross
Bradley JACOBSEN
Yvan Paitel
Joseph Brannan
Joshua Blauer
Andrew Wald
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medtronic Navigation Inc
Original Assignee
Medtronic Navigation Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medtronic Navigation Inc
Publication of WO2025191487A1
Legal status: Pending

Abstract

Disclosed is a system for assisting in guiding and performing a procedure on a subject. The system may determine a pose of a member without a tracking device associated therewith. The system may further determine a pose of a tracked instrument relative to the member.

Description

SYSTEM FOR TRACKING OF AN INSTRUMENT
FIELD
[0001] This application claims the benefit of U.S. Provisional Patent Application Serial No. 63/564,042, filed 12 March 2024, the entire content of which is incorporated herein by reference.
[0002] The present disclosure relates to a tracking and navigation system and related instruments.
BACKGROUND
[0003] This section provides background information related to the present disclosure which is not necessarily prior art.
[0004] An instrument can be navigated relative to a subject for performing various procedures. For example, the subject can include a patient on which a surgical procedure is being performed. During a surgical procedure, an instrument can be tracked in an object or subject space. In various embodiments, the subject space can be a patient space defined by a patient. The location of the instrument that is tracked can be displayed on a display device relative to an image of the patient.
[0005] An instrument may be placed in the subject, such as a human patient. The instrument may be positioned for a selected period of time. After the selected period of time, the instrument may be selected to be removed. The position of the instrument must be determined to allow removal thereof.
SUMMARY
[0006] This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
[0007] According to various embodiments, an imaging system may be used to acquire image data of a subject. The imaging system may include an ultrasound imaging system that includes an ultrasound (US) probe that generally includes an ultrasound transducer to emit and receive ultrasound frequencies. It is understood, however, that the imaging system may include separate components that emit and receive ultrasound frequencies. Further, the imaging system may include a system that acquires image data with other modalities, such as x-ray, magnetic resonance, etc.
[0008] Two or more images may be registered to one another. The two images may be intraoperative or pre- and intra-operative or any selected images. Further, more than two images may be registered. The images may be a real time image and a prior acquired image. The registration may be in real time to illustrate a current pose of a portion of the subject relative to a prior acquired image. Both the real time image and the prior acquired image may be selectively segmented.
[0009] An instrument may be placed in the subject. For example, the instrument may be or include an implant. The implant may be placed in the subject for a selected period. Following the selected period, the instrument may be removed. The instrument may need to be located within the subject due to movement of the instrument, movement of the subject, or other relative motion between the two.
[0010] Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
DRAWINGS
[0011] The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
[0012] Fig. 1 is a diagrammatic view illustrating an overview of a robotic system and a navigation system, according to various embodiments;
[0013] Fig. 2 is a schematic view of an implant pose determination system and retrieval system navigation, according to various embodiments;
[0014] Fig. 3 is a schematic view of an implant pose determination system and retrieval system navigation, according to various embodiments;
[0015] Fig. 4 is a schematic view of an implant pose determination system and retrieval system navigation, according to various embodiments;
[0016] Fig. 5 is a schematic view of an implant pose determination system and retrieval system navigation, according to various embodiments; and
[0017] Fig. 6 is a flowchart of a process for implant retrieval.
[0018] Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
[0019] Example embodiments will now be described more fully with reference to the accompanying drawings.
[0020] The subject disclosure is directed to an exemplary embodiment of a surgical procedure on a subject, such as a human patient. It is understood, however, that the systems and methods described herein are merely exemplary and not intended to limit the scope of the claims included herein. In various embodiments, it is understood that the systems and methods may be incorporated into and/or used on non-animate objects. The systems may be used to, for example, image and register coordinate systems between two systems for use on manufacturing systems, maintenance systems, and the like. For example, automotive assembly may use one or more robotic systems including individual coordinate systems that may be registered together for coordinated or concerted actions. Accordingly, the exemplary illustration of a surgical procedure herein is not intended to limit the scope of the appended claims.
[0021] Various members or portions thereof may also be tracked relative to the subject. For example, a tracking system may be incorporated into a navigation system to allow tracking and navigation of one or more instruments (which may be the members) that may be tracked relative to the subject. The subject may also be tracked. The navigation system may include one or more tracking systems that track various portions, such as tracking devices, associated with instruments. The tracking system may include a localizer that is configured to determine the pose of the tracking device in a navigation system coordinate system. The pose may include any number of degrees of freedom, such as a three-dimensional location (e.g., x, y, z) and an orientation (e.g., yaw, pitch, and roll). Techniques, systems, or processes to determine the navigation system coordinate system may include those described in various references including U.S. Pat. Nos. 8,737,708; 9,737,235; 8,503,745; 8,175,681; and 11,135,025; all incorporated herein by reference. In particular, a localizer may be able to track an object within a volume relative to the subject. The navigation volume, in which a device may be tracked, may include or be referred to as the navigation coordinate system or navigation space. A determination or correlation between two coordinate systems may allow for or also be referred to as a registration between two coordinate systems.
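The pose described above, a three-dimensional location plus yaw, pitch, and roll, is commonly packed into a single 4x4 homogeneous transform for use in navigation computations. The sketch below is illustrative only; the function name, the Z-Y-X Euler-angle convention, and the use of NumPy are assumptions, not part of this disclosure:

```python
import numpy as np

def pose_matrix(x, y, z, yaw, pitch, roll):
    """Build a 4x4 homogeneous transform from a 6-DOF pose.

    Angles are in radians. Rotation order is an assumed convention:
    Z (yaw), then Y (pitch), then X (roll).
    """
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx  # combined orientation
    T[:3, 3] = [x, y, z]      # 3-D location
    return T
```

A localizer reading for a tracking device would then be one such matrix per tracked device per sample.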
[0022] In various embodiments, the first coordinate system, which may be a robotic coordinate system, may be registered to a second coordinate system, which may be the navigation coordinate system defined by or relative to the tracking system of the navigation system. Accordingly, coordinates in one coordinate system may then be transformed (e.g., mapped) to a different or second coordinate system due to a registration. Registration may allow for the use of two coordinate systems and/or the switching between two coordinate systems. For example, during a procedure, a first coordinate system may be used for a first portion or a selected portion of a procedure and a second coordinate system may be used during a second portion of a procedure. Further, two coordinate systems may be used to perform or track a single portion of a procedure, such as for verification and/or collection of additional information.
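The transformation of coordinates from one registered coordinate system to another, and the switching back, can be sketched as multiplication by a homogeneous transform and by its inverse. The helper names and NumPy usage below are illustrative assumptions, not the method of this disclosure:

```python
import numpy as np

def map_point(T_b_from_a, p_a):
    """Map a 3-D point from coordinate system A into system B using the
    registration transform T_b_from_a (a 4x4 homogeneous matrix)."""
    p_h = np.append(np.asarray(p_a, dtype=float), 1.0)  # homogeneous form
    return (T_b_from_a @ p_h)[:3]

def invert(T):
    """Invert a rigid transform, allowing the switch back from B to A."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti
```

For example, a point reported in a robotic coordinate system could be mapped into the navigation coordinate system with `map_point`, and back again with the inverted transform.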
[0023] Image data may be acquired for use and/or to generate images of selected portions of a subject. The images may be displayed for viewing by a user, such as a surgeon. In various embodiments, superimposed on at least a portion of the image may be a graphical representation of a tracked portion or member, such as an instrument. The graphical representation may be generated (e.g., by a processor module executing instructions) entirely as a graphic that represents the instrument. According to various embodiments, the graphical representation may be superimposed on the image at an appropriate pose due to registration of an image space (also referred to as an image coordinate system) to a subject space. A method to register a subject space defined by a subject to an image space may include those disclosed in U.S. Pat. Nos. 8,737,708; 9,737,235; 8,503,745; and 8,175,681; all incorporated herein by reference.
[0024] During a selected procedure, the first coordinate system may be registered to the subject space or subject coordinate system due to a selected procedure, such as imaging of the subject. In various embodiments, the first coordinate system may be registered to the subject by imaging the subject with a fiducial portion that is fixed relative to the first member or system, such as a robotic system or other instrument. The known position of the fiducial relative to any portion, such as the robotic system or the subject, may be used to register the subject space relative to any coordinate system in which the fiducial may be determined (e.g., by imaging or detecting (e.g., touching)). A registration of a second coordinate system may allow for tracking of additional elements not fixed to a first portion, such as a robot that has a known coordinate system.
[0025] The tracking of an instrument during a procedure, such as a surgical or operative procedure, allows for navigation of a procedure. The navigation may be used to determine a pose of one or more portions, such as an instrument. The pose may include any number of degrees of freedom, such as a three-dimensional location (e.g., x, y, z) and an orientation (e.g., yaw, pitch, and roll). When image data is used to define an image space it can be correlated or registered to a physical space defined by a subject, such as a patient. According to various embodiments, therefore, the patient defines a patient space in which an instrument can be tracked and navigated. The image space defined by the image data can be registered to the patient space defined by the patient. The registration can occur with the use of fiducials that can be identified in the image data and in the patient space.
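One common way to compute such a registration from paired fiducials identified in both the image data and the patient space is a least-squares rigid fit, often called the Kabsch or SVD method. The sketch below is illustrative only and is not asserted to be the method of this disclosure:

```python
import numpy as np

def register_fiducials(image_pts, patient_pts):
    """Estimate the rigid transform (R, t) mapping patient-space fiducials
    onto their image-space counterparts: image ~= R @ patient + t.

    A least-squares fit via SVD (Kabsch method), assumed for illustration.
    """
    P = np.asarray(patient_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Given at least three non-collinear fiducials visible in both spaces, the returned transform is the translation map between patient space and image space.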
[0026] Fig. 1 is a diagrammatic view illustrating an overview of a procedure room or arena. In various embodiments, the procedure room may include a surgical suite in which may be placed a robotic system 20 and a navigation system 26 that can be used for various procedures. The robotic system 20 may include a Mazor X™ robotic guidance system, sold by Mazor Robotics Ltd. having a place of business in Israel and/or Medtronic, Inc. having a place of business in Minnesota, USA, and/or as disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference. The robotic system 20 may be used to assist in guiding a selected instrument, such as drills, screws, etc., relative to a subject 30. In addition or alternatively, the robotic system 20 may hold and/or move various instruments, such as an imaging system that may be an ultrasound (US) probe 33. The US probe 33 may be moved to achieve acquisition of selected image data.
[0027] The robotic system 20 may include a mount 34 that fixes a portion, such as a robotic base 38, relative to the subject 30. The robotic system 20 may include one or more arms 40 that are moveable or pivotable relative to the subject 30, such as including an end effector 44. The end effector 44 may be any appropriate portion, such as a tube, guide, or passage member. Affixed to and/or in place of the end effector may be the imaging system that may be the US probe 33.
[0028] The end effector 44 may be moved relative to the base 38 with one or more motors. The position of the end effector 44 may be known or determined relative to the base 38 with one or more encoders at one or more joints, such as a wrist joint 48 and/or an elbow joint 52 of the robotic system 20. A robotic processor module 53 may be used to control (e.g., execute instructions to) move and determine a pose of the end effector, such as relative to the base 38. As discussed above, the robotic system 20, including the various portions, may be operated to move each portion relative to the base 38. The pose of the base 38 may be known in a coordinate system, such as the patient space of the patient 30 and/or the image coordinate system, due to a registration as discussed above and as exemplarily disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference.
[0029] The navigation system 26 can be used to track the pose of one or more tracking devices. The tracking devices may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, a tool tracking device 66, and/or a US probe tracking device 81. Any one or more of the tracking devices may be generally referred to as a tracking device herein. The US tracking device 81 may be used to track the US probe 33, and the imaging system tracking device 62 may be used to track an imaging system 80, as discussed herein, during image data acquisition. This tracking while acquiring image data may assist in registration, such as allowing an automatic registration.
[0030] A tool or moveable member 68 may be any appropriate tool, also referred to as an instrument, such as a drill, forceps, or other tool operated by a user 72. The tool 68 may also include an implant, such as a spinal implant or orthopedic implant. It should further be noted that the navigation system 26 may be used to navigate any type of instrument, implant, or delivery system, including guide wires, arthroscopic systems, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc. Moreover, the instruments may be used to navigate or map any region of the body. The navigation system 26 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
[0031] The imaging device or system 80 may be an additional or alternative imaging system that may be used to acquire pre-, intra-, or post-operative or real-time image data of a subject, such as the subject 30. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject. In the example shown, the imaging device 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado, USA. The imaging device 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed. The imaging device 80 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference, or any appropriate portions thereof. It is further appreciated that the imaging device 80 may include in addition or alternatively a fluoroscopic C-arm. Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling-mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc. Other appropriate imaging devices can also include MRI, CT, ultrasound, etc. The various imaging systems may include or use one or more imaging modalities, such as x-ray, magnetic resonance, ultrasound, positron emission tomography (PET) scans, combinations thereof, etc.
[0032] The position of the imaging system 33, 80, and/or portions therein, such as the image capturing portion, can be precisely known relative to any other portion of the imaging device 33, 80. The imaging device 33, 80, according to various embodiments, can know and/or recall precise coordinates relative to a fixed or selected coordinate system. For example, the robotic system 20 may know or determine its position and position the US probe 33 at a selected pose. According to various embodiments, the US probe 33 may emit or transmit ultrasound waves in a selected pattern or plane 35 (Fig. 4), as is understood by one skilled in the art, with a selected or known shape. The ultrasound probe may be a transducer and/or include both transmission and receiving portions. An echo from the transmitted signal in the plane may be received at the receiver portion. Via the transmitted signal in the plane, image data is acquired in a field of view to generate images, also referred to as sonograms when the images are generated based on ultrasound data. Also, ultrasound signals may be received from other portions or members. The image data collection plane and/or its pose may be determined or known relative to the US tracking device 81, as discussed above. Similarly, the imaging system 80 may also position the imaging portions at a selected pose. This can allow the imaging system 80 to know its position relative to the patient 30 or other references. In addition, as discussed herein, the precise knowledge of the position of the image capturing portion can be used in conjunction with a tracking system to determine the position of the image capturing portion and the image data relative to the tracked subject, such as the patient 30.
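The chain of poses described above, from a calibrated image plane to the probe tracking device to the localizer, can be illustrated as a composition of homogeneous transforms. All names, and the simple pixel-scaling model with the image plane at z = 0, are hypothetical conventions for illustration only:

```python
import numpy as np

def us_pixel_to_navigation(u, v, scale_mm, T_tracker_from_image,
                           T_nav_from_tracker):
    """Map an ultrasound image pixel (u, v) into navigation-space coordinates.

    scale_mm converts pixels to millimetres; the image plane is taken as
    z = 0 in the probe's calibrated image frame (an assumed convention).
    T_tracker_from_image would come from a calibration process or jig;
    T_nav_from_tracker from the localizer's current reading of the probe
    tracking device. Both are 4x4 homogeneous matrices.
    """
    p_image = np.array([u * scale_mm, v * scale_mm, 0.0, 1.0])
    return (T_nav_from_tracker @ T_tracker_from_image @ p_image)[:3]
```

With this chain, every pixel of a sonogram acquires a pose in the navigation coordinate system, which is what allows ultrasound image data to be registered to other tracked members.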
[0033] Herein, reference to the imaging system 33 may refer to any appropriate imaging system, unless stated otherwise. Thus, the US probe 33 as the imaging system is merely exemplary regarding the subject disclosure. As one skilled in the art will understand, generally the US probe 33 may emit a US wave in a plane and receive an echo relative to any portions engaged by the wave. The echo received at the US probe 33, or other appropriate receiver, may be used to generate image data and may be used to generate a US image, also referred to as a sonogram. The pose (e.g., distance from a selected portion of the US probe 33 and/or the tracking device 81) may be determined or predetermined and saved for recall with a calibration process and/or jig, such as that disclosed in U.S. Pat. Nos. 7,831,082; 8,320,653; and 9,138,204, all incorporated herein by reference.
[0034] The imaging device 80 can be tracked with a tracking device 62. Also, the tracking device 81 can be associated directly with the US probe 33. The US probe 33 may, therefore, be directly tracked with a navigation system as discussed herein. In addition or alternatively, the US probe 33 may be positioned and tracked with the robotic system 20. Regardless, image data defining an image space acquired of the patient 30 can, according to various embodiments, be registered (e.g., manually, inherently, or automatically) relative to an object space. The object space can be the space defined by the patient 30 in the navigation system 26.
[0035] The patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or tracker 58. Alternatively, or in addition thereto, the patient 30 may be fixed within navigation space defined by the navigation system 26 to allow for registration. As discussed further herein, registration of the image space to the patient space or subject space allows for navigation of the instrument 68 with the image data. When navigating the instrument 68, a position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84. An additional and/or alternative display device 84’ may also be present to display an image. Various tracking systems, such as one including an optical localizer 88 or an electromagnetic (EM) localizer 92 can be used to track the instrument 68.
[0036] More than one tracking system can be used to track the instrument 68 in the navigation system 26. According to various embodiments, these tracking systems can include an electromagnetic (EM) tracking system having the EM localizer 94, an optical tracking system having the optical localizer 88, and/or other appropriate tracking systems not illustrated, such as an ultrasound tracking system. One or more of the tracking systems can be used to track selected tracking devices, as discussed herein, sequentially or simultaneously. It will be understood, unless discussed otherwise, that a tracking device can be a portion trackable with a selected tracking system. A tracking device need not refer to the entire member or structure to which the tracking device is affixed or associated.
[0037] The position of the patient 30 relative to the imaging device 33 can be determined by the navigation system 26. The position of the imaging system 33 may be determined, as discussed herein. The patient 30 can be tracked with the dynamic reference frame 58, as discussed further herein. Accordingly, the position of the patient 30 relative to the imaging device 33 can be determined.
[0038] Image data acquired from the imaging system 33, or any appropriate imaging system, can be acquired at and/or forwarded from an image device controller 96, which may include a processor module, to a navigation computer and/or processor system 102 that can be a part of a controller or work station 98 having the display 84 and a user interface 106. The processor system 102 may be a processor module, as discussed herein, including integral memory or a communication system to access external memory for executing instructions, and/or operated as a specific integrated circuit (e.g., ASIC). It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the work station 98. The work station 98 can provide facilities for displaying the image data as an image 108 on the display 84, saving, digitally manipulating, or printing a hard copy image of the received image data. The user interface 106, which may be a keyboard, mouse, touch pen, touch screen, or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, via the image device controller 96, or adjust the display settings of the display 84. The work station 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and/or three-dimensional image data, which may be used to generate one or more two-dimensional or three-dimensional images.
[0039] With continuing reference to FIG. 1, the navigation system 26 can further include any one or more tracking systems, such as the tracking system including either or both of the electromagnetic (EM) localizer 94 and/or the optical localizer 88. The tracking systems may include a controller and interface portion 110. The controller 110 can be connected to the processor portion 102, which can include a processor included within a computer. The EM tracking system may include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado; or can be the EM tracking system described in U.S. Pat. Nos. 7,751,865; 5,983,126; 5,913,820; or 5,592,939; all of which are herein incorporated by reference. It will be understood that the navigation system 26 may also be or include any appropriate tracking system, including a STEALTHSTATION® TREON® or S7™ tracking system having an optical localizer, which may be used as the optical localizer 88, and sold by Medtronic Navigation, Inc. of Louisville, Colorado. Other tracking systems include acoustic, radiation, radar, etc. The navigation system 26 and/or tracking system may be a hybrid system that includes components from multiple tracking systems. The tracking systems can be used according to generally known or described techniques in the above incorporated references. Details will not be included herein except when needed to clarify selected operation of the subject disclosure.
[0040] Wired or physical connections can interconnect the tracking systems, imaging device 80, etc. Alternatively, various portions, such as the instrument 68, may employ a wireless communications channel, such as that disclosed in U.S. Pat. No. 6,474,341, herein incorporated by reference, as opposed to being coupled directly to the controller 110. Also, the tracking devices 62, 66, 54 can generate a field and/or signal that is sensed by the localizer(s) 88, 94.
[0041] Various portions of the navigation system 26, such as the instrument 68, and others as will be described in detail below, can be equipped with at least one, and generally multiple, of the tracking devices 66. The instrument can also include more than one type or modality of tracking device 66, such as an EM tracking device and/or an optical tracking device. The instrument 68 can include a graspable or manipulable portion at a proximal end, and the tracking devices may be fixed near the manipulable portion of the instrument 68.
[0042] According to various embodiments, the navigation system 26 can be used to track the instrument 68 relative to the patient 30. The instrument 68 can be tracked with the tracking system, as discussed above. Image data of the patient 30, or an appropriate subject, can be used to assist the user 72 in guiding the instrument 68. The image data, which may include one or more image data or images, may be registered to the patient 30. The image data defines an image space that is registered to the patient space defined by the patient 30. The registration can be performed as discussed herein, automatically, manually, or combinations thereof.
[0043] Generally, registration allows a translation map to be generated of the physical location of the instrument 68 relative to the image space of the image data. The translation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image 108. A graphical representation 68i, also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
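The translation map described above can be illustrated as mapping a tracked patient-space position into image coordinates for drawing the icon. The axial-slice convention, the pixel scaling, and all names below are assumptions for illustration only:

```python
import numpy as np

def instrument_display_position(T_image_from_patient, p_patient, mm_per_pixel):
    """Place an instrument icon on an image from its tracked patient-space pose.

    T_image_from_patient is the translation map produced by registration.
    The icon's pixel position is taken from the in-plane (x, y) image
    coordinates and the slice selection from z; axial slicing and a uniform
    mm_per_pixel are assumed conventions, not part of this disclosure.
    """
    p_h = np.append(np.asarray(p_patient, dtype=float), 1.0)
    x, y, z = (T_image_from_patient @ p_h)[:3]
    return (x / mm_per_pixel, y / mm_per_pixel), z  # (pixel coords), depth
```

Redrawing the icon each time the localizer reports a new pose is what makes the graphical representation follow the physical instrument on the display.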
[0044] With continuing reference to Fig. 1 , a subject registration system or method can use the tracking device 58. The tracking device 58 may include portions or members 120 that may be trackable, but may also act as or be operable as a fiducial assembly. The fiducial assembly 120 can include a clamp or other fixation portion 124 and the imageable fiducial body 120. It is understood, however, that the members 120 may be separate from the tracking device 58. The fixation portion 124 can be provided to fix any appropriate portion, such as a portion of the anatomy. As illustrated in Fig. 1 , the fiducial assembly 120 can be interconnected with a portion of a spine 126 such as a spinous process 130. The fixation portion 124 can be interconnected with a spinous process 130 in any appropriate manner. For example, a pin or a screw can be driven into the spinous process 130. Fiducial portions may also include one or more portions of the subject that may be imaged, such as boney portions.
[0045] As illustrated in Fig. 1 , the US probe 33 may be positioned relative to the subject 30, such as by the robotic system 20. As discussed herein, therefore, the robotic system 20 may move the US probe 33 to a selected position relative to the subject 30. According to various embodiments, the US probe 33 may be positioned relative to the subject in any appropriate manner. For example, the user 72 or a second user may move and/or hold the US probe 33.
[0046] In various embodiments, an instrument 160 may be in or may be positioned in the subject 30 in a selected manner. The instrument 160, as noted above, may include an implant. According to various embodiments, the implant may include a member or system that is positioned within the subject for a period of time. For example, the implant 160 may include a system, such as a pacemaker system. The pacemaker system may include various portions such as an internal power source, a signal transmitting or transmission system, a sensing system, a processor module, a stimulation electrode, a memory module, or other appropriate portions. The implant 160 may operate to perform various procedures, receive selected information, and/or transmit selected information. In various embodiments, the implant 160 may be a pacemaker such as a MICRA® pacemaker sold by Medtronic, Inc. having a place of business in Minnesota. The pacemaker may be a member that is positioned within the subject, such as in or near a heart 164 of the subject 30. The implant 160 may not have a physical construct that extends from the perimeter of the member 160, such as a lead connection or other member. Further, the implant 160 may have a selected dimension, such as an external geometry that may be contained within a volume of about 2 cubic centimeters (cm3) to about 10 cm3. While the implant 160 may be a pacemaker, it is understood that the implant 160 may be any appropriate implant, such as a spinal disc replacement, vertebral screw, bone anchor, or the like. Further, the implant need not be, or need not only be, a medical device or system. The implant 160 is generally positioned within the subject 30 without an external or extending connection or member, such as a lead or tether, for example, a lead that may extend from an implantable cardiac device to an anchor positioned in the heart 164.
[0047] Once positioned in the subject 30, such as in the heart 164 of the subject 30, the implant 160 may be at a selected pose. Nevertheless, the subject 30 may then leave an area, such as the surgical suite as illustrated in Fig. 1, for a period of time. At a selected time, such as to replace the implant 160, recharge the implant 160, or for other reasons, the implant 160 may be selected to be removed from the subject 30. No physical member may extend from the implant 160 to an exterior of the subject 30, and the implant 160 may be positioned substantially in soft tissue in the subject 30, so the exact pose of the implant 160 may not be known. Thus, the pose of the implant 160 may be determined with a selected pose determination system, as discussed further herein. The pose determination system may be positioned near the subject 30, such as touching a surface of the subject 30 and/or able to interact with the implant 160 within the subject 30. Further, the implant 160 need not be a previously positioned member, but may be any appropriate member that is selected or is within the subject 30.
[0048] With reference to Fig. 2, the subject 30 may be positioned for retrieval of the implant 160 from a selected portion of the subject 30, such as the heart 164. The implant 160 may be retrieved with any appropriate instrument, such as an implant capturing or retrieval system 170 that may include an extension portion or member 174 that extends into the subject 30, such as through a vasculature (e.g., vein or artery) to the heart 164. The capturing system 170 may further include a capture end or portion 178 that may have a capture member 182. The capturing system 170 may be tracked with one or more of the tracking systems, such as via a tracking device 186. The tracking device 186 may be positioned near the capture member 182 at the capture end 178. Thus, the capture portion 182 may be tracked with the navigation system 26, as discussed above. The retrieval system 170 may be moved through the subject 30 in an appropriate manner, such as with a guided catheter. Therefore, the image 108 may display the appropriate portion of the subject, including the current pose of the retrieval portion 182. A pose of the capture portion 182 may be determined by tracking the tracking device 186 and illustrated as a graphical representation 182i superimposed on the image 108. The graphical representation 182i may be generated by the navigation system based upon a tracked pose of the tracking device 186 to determine or illustrate a pose of the retrieval member 182. One skilled in the art will understand that the graphical representation may be displayed in any appropriate manner.
[0049] The retrieval system 170 may be any appropriate retrieval system, such as the EnSnare® or the OneSnare® vascular capture or retrieval devices or systems sold by Merit Medical Systems having a place of business in Utah, or the INDY OTW® vascular capture or retrieval devices or systems sold by Cook Medical Technologies, LLC, having a place of business in Indiana. Accordingly, the capture device 170 may include the vascular capture retrieval device as noted above that may be augmented with the tracking device 186. The retrieval system 170 may, therefore, be navigated through the subject 30. Any appropriate retrieval system 170, however, may be used according to various embodiments.
[0050] The implant 160, however, may not include a tracking device associated therewith, at least not one that is tracked or navigated directly with the navigation system 26. The implant 160, however, may be able to transmit or generate various signals or pulses. For example, the implant 160 may be a pacing implant and therefore may generate a pacing stimulation pulse. The implant 160, therefore, may emit a signal 196 (illustrated schematically) into the subject 30. The signal 196 may be a pacing signal that may be less than an amplitude or power necessary to capture and pace tissue, such as the heart 164. Alternatively or additionally, the implant 160 may also transmit a signal, such as a radio signal, including a Bluetooth® radio signal. The implant 160 may also include a transmission system to transmit a signal with selected characteristics, such as those discussed herein. [0051] Any one or more of these signals may be transmitted by the implant 160, indicated by the signal marking 196, and may be captured or sensed at various portions that are positioned relative to the subject 30. The receiving portion may also be referred to as an implant or member pose determination system. The pose determination system may be positioned near the subject, as discussed herein. Near the subject may be in or in contact with the subject, or relative to the subject so as to receive the signal 196 and allow for a determination of a pose of the implant 160.
[0052] In various embodiments, the capture system 170 may include a signal sensing portion with the tracking device 186 or at another appropriate portion of the system 170. In various embodiments, the tracking device 186 or portion with the tracking device 186 may sense the signal 196 from the implant 160 to attempt to locate or identify a pose of the implant 160.
[0053] Additionally or alternatively, signal sensors may be positioned near, such as in contact with, the subject 30. The signal sensors may include one or more signal sensors such as a first signal sensor 200, a second signal sensor 204, and a third signal sensor 206. Each of the signal sensors 200, 204, 206 may sense the signal 196. The sensed signal may then be used to triangulate a pose of the implant 160 relative to each of the signal sensors 200-206. The triangulation may be based upon generally known triangulation techniques and may be performed by executing instructions with a selected processor module, such as the processor module 102. In various embodiments, a signal strength sensed at each of the signal sensors 200-206, a relative delay in time to one or more of the signal sensors 200-206, or the like may be used to triangulate the pose of the implant 160 relative to the signal sensors 200-206. For example, if the signal is sound based, triangulation may include time-of-flight triangulation, especially if the location of the sensors is well known by using the EM navigation system. With a selected number of sensors, such as three or more, there may be some averaging, which could be based on signal strength to each sensor. In addition or as an alternative to triangulation, other techniques may include directional radio antenna techniques, phase shift analysis, etc.
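The time-of-flight triangulation described above may be sketched in simplified form as follows. This is a purely illustrative, non-limiting example and not part of the disclosed system: the function names, the two-dimensional simplification, and the assumed speed of sound in soft tissue are hypothetical; a practical implementation would work in three dimensions with additional sensors and error averaging.

```python
def trilaterate_2d(sensors, distances):
    """Solve for a 2D source position from three known sensor positions and
    measured source-to-sensor distances. The three circle equations are
    linearized by subtracting the first equation from the other two, giving
    a 2x2 linear system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = sensors
    d1, d2, d3 = distances
    # Coefficients of A @ [x, y] = b after pairwise subtraction.
    a11, a12 = 2.0 * (x1 - x2), 2.0 * (y1 - y2)
    a21, a22 = 2.0 * (x1 - x3), 2.0 * (y1 - y3)
    b1 = d2**2 - d1**2 + x1**2 - x2**2 + y1**2 - y2**2
    b2 = d3**2 - d1**2 + x1**2 - x3**2 + y1**2 - y3**2
    det = a11 * a22 - a12 * a21  # nonzero as long as sensors are not collinear
    x = (b1 * a22 - a12 * b2) / det
    y = (a11 * b2 - b1 * a21) / det
    return x, y

# Hypothetical assumed propagation speed for a sound-based signal in soft
# tissue (a commonly cited approximate value, in meters per second).
SPEED_OF_SOUND_TISSUE = 1540.0

def position_from_time_of_flight(sensors, flight_times_s):
    """Convert times of flight to distances, then trilaterate."""
    distances = [t * SPEED_OF_SOUND_TISSUE for t in flight_times_s]
    return trilaterate_2d(sensors, distances)
```

With sensors at known tracked poses, as described above for the signal sensors 200-206, the same arithmetic yields the implant position relative to those sensors.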
[0054] The pose of the implant 160 relative to the signal sensors 200-206 may also then be known to the navigation system 26, such as via signal sensor tracking devices associated with one or more of the signal sensors 200-206. For example, a first signal sensor tracking device 210 may be associated with the first signal sensor 200, a second signal sensor tracking device 212 may be associated with the second signal sensor 204, and a third signal sensor tracking device 214 may be associated with the third signal sensor 206. The tracking system, included with the navigation system 26, may know the pose of each of the signal sensors 200-206 due to the respective related signal sensor tracking devices 210-214.
[0055] Due to the registration of the navigation system coordinate system to the image coordinate system, a graphical representation of the implant 160, such as the graphical representation 160i, may also be illustrated relative to the image 108. Again, the graphical representation may be a generated graphic (e.g., icon) that is superimposed on the image 108. Therefore, the display device 84 may be viewed, including both the image 108 and the graphical representations 182i and 160i, to understand the respective poses of the implant 160 and the capture portion 182.
[0056] Thus, the user 72, or any appropriate user, may manipulate the retrieval system 170 to move the capture portion 182 relative to the implant 160. The capture portion 182 may then be used to manipulate and capture the implant 160 to allow removal, or explantation, of the implant 160.
[0057] This system, according to various embodiments, allows the implant 160 to be positioned within the subject without an included tracking device, such as an EM tracking device that is tracked with the EM localizer 94. Nevertheless, the pose of the implant 160 may still be determined without having a specific tracking device attached. The one or more signal sensors 200-206 may be positioned relative to the subject to sense the signal from the implant 160 to triangulate the pose of the implant 160 relative to one or more of the signal sensors 200-206. The signal sensors 200-206 may be tracked with the one or more tracking devices 210-214 to allow for navigation or tracking of the signal sensors 200-206. Once the signal sensors 200-206 are tracked, the triangulated pose of the implant 160 may be known and navigated as well in the navigation coordinate system. Therefore, the pose of the retrieval portion 182 may be known relative to the implant 160. The respective poses of the implant 160 and the retrieval member 182 may be displayed on the display device 84, as discussed above.
[0058] The retrieval portion 182 may be navigated to the implant 160. In various embodiments, as noted above, a robotic system, such as the robotic system 20 may be used to manipulate the retrieval system 170. For example, a depth and/or steering of a steering catheter of the retrieval system 170 may be used to position the retrieval portion 182 relative to the implant 160 based upon the navigation information.
[0059] In various embodiments, with reference to Fig. 3, various portions are included with the same reference numerals as used above and are reintroduced here. The implant 160, however, may be formed of or include a material that will resonate when excited by a selected signal or field. Additionally or alternatively, a member or feature may be included that may be excited by an excitation field.
[0060] An excitation system may include an emitter 220. The excitation system may also be referred to as a pose determination system. The emitter 220 may be positioned near the subject in a manner to cause an interaction as noted below. The emitter 220, therefore, may contact the subject 30 or otherwise be placed near the subject 30.
[0061] The emitter 220 may emit a selected field or signal that is schematically illustrated as the emission 224. The emission 224 may be any appropriate emission that may cause the implant 160 and/or at least a portion thereof to resonate. For example, the emitter 220 may emit a radio frequency that causes a resonance or ringing of a selected portion of the implant 160, such as or similar to a radio frequency identification (RFID) system. The emitter 220 may, however, emit any appropriate signal, which may be a magnetic field, an electrical field, a combination of the magnetic and electrical fields, an ultrasound signal, or other appropriate signals. Regardless, the emission 224 may cause an excitation of the implant 160 and/or a portion thereof.
[0062] An excitation of the implant 160 may cause the implant 160 to emit a ringing or resonance signal or field, schematically illustrated as emission 228. The signal from the implant 160 may be generated by any appropriate portion of the implant 160. For example, an exterior casing of the implant 160 may be formed to resonate or emit when a selected field is sensed or received at the exterior casing. Further or alternatively, a member, such as a separate excitation member, may be included with or attached to the implant 160. Regardless, the implant 160 or a portion thereof may emit the signal 228.
[0063] The signal 228 may be caused by the emission 224 of the excitation system 220. Further, the excitation system 220, or any other appropriate portion, such as the signal sensors 200-206 (if included), may receive this signal 228. The signal received at the signal sensors 200-206 and/or the emitter 220 may be used to determine the pose of the implant 160. As discussed above, the signal received at the signal sensors 200-206 may be used to triangulate the pose of the implant 160. Further, the signal may be received at the excitation source 220 from one or more excitation portions on the implant 160 to allow for a determination of a pose of the implant 160 relative to the excitation source 220. Again, as discussed above, any appropriate processor module, such as the processor module 102, may execute appropriate instructions to triangulate or otherwise locate, as discussed above, the pose of the implant 160 relative to one or more of the signal sensors 200-206 and/or the excitation source 220. Further, the navigation system 26 may know the pose of any one or more of the signal sensors 200-206 or the excitation source 220 via an associated tracking device, such as the tracking devices 210-214 and/or a tracking device 232 associated with the excitation source 220.
[0064] As the navigation system 26 is able to determine the tracked pose of the tracking devices 210-214, 232 associated with one or more of the respective receiving portions, the navigation system may determine the pose of the implant 160 relative to the patient 30. As discussed above, the determined pose of the implant 160 relative to the respective receiving portions may be determined based upon the emitted signal 228 from the implant 160. The tracked pose of the respective receivers may be determined based upon the tracking devices 210-214, 232. Therefore, a pose of the implant 160 may be determined by the navigation system 26 without a tracking device tracked by the navigation system 26 connected to or included with the implant 160.
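The mapping described above — from the implant pose determined relative to a tracked receiver into the navigation coordinate system — may be illustrated as a composition of rigid transforms. The sketch below is a simplified, hypothetical illustration only: for brevity it uses pure translations in homogeneous 4x4 form, whereas a full implementation would also carry rotation, and the variable names are not from the disclosure.

```python
def mat_mul(a, b):
    """Product of two 4x4 homogeneous transform matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """Homogeneous 4x4 transform representing a pure translation."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# T_nav_sensor: tracked pose of a receiving portion (e.g., a signal sensor)
# in the navigation frame, known via its tracking device.
# T_sensor_implant: pose of the implant relative to that receiving portion,
# determined from the implant's emitted signal. Values are illustrative.
T_nav_sensor = translation(100.0, 50.0, 20.0)
T_sensor_implant = translation(5.0, -3.0, 12.0)

# Composing the two yields the implant pose directly in the navigation
# coordinate system, with no tracking device on the implant itself.
T_nav_implant = mat_mul(T_nav_sensor, T_sensor_implant)
```

The same composition applies whichever pose determination system is used, provided its receiving portion is tracked in the navigation space.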
[0065] The determined pose of the implant 160 may be displayed on the display device 84, for example as a graphical representation 160i. Further, the pose of the implant retrieving portion 182 may also be displayed as a graphical representation 182i. Whether displayed or not, the relative poses of the implant retrieval portion 182 and the implant 160 in the navigation coordinate space may be determined by the navigation system due to the tracking of the respective tracking devices, as discussed above. Therefore, the navigation system 26 may be used to determine the relative pose of the implant 160 and the retrieval portion 182 to allow for navigation of the retrieval portion 182 relative to and to the implant 160 to allow for removal thereof. Thus, the excitation of a portion of the implant 160 with the excitation system 220 may be used to determine the pose of the implant 160 when the implant 160 is within the subject 30. The excitation source 220 may be any appropriate excitation source and does not require a separate individual tracking device in the implant 160.
[0066] According to various embodiments, as illustrated in Fig. 4, various portions are included with the same reference numerals as used above and are reintroduced here. The pose of the implant 160, however, may not be known, as also discussed above. The implant 160 may be formed of a selected material that is selectively or appropriately echogenic, such as being able to be clearly and efficiently imaged with the US probe 33. The US probe 33 may include or be a US transducer. In various embodiments, a housing may house the US transducer. Nevertheless, the US transducer may emit a US signal and receive a reflected US signal or any appropriate US signal.
[0067] The US probe 33 may be moved relative to the subject 30. As discussed above, the plane of ultrasound 35 may be emitted by the US probe 33. The US emissions may interact with the implant 160 such that the implant may be imaged with the ultrasound probe 33, as is generally understood in the art. Further, the US probe 33 may be operated as an echolocating system that interacts with the material of the implant 160 to provide a reflection that is operable to allow for a determination of a pose of the implant 160 relative to the US probe 33, rather than only imaging the implant 160. [0068] The pose of the implant 160 may be determined in the plane 35 emitted by the US probe 33. Therefore, the pose of the implant 160 relative to the US probe 33 may be determined. A determination of the pose of the implant 160 relative to the US probe 33 may be based upon analysis of the image data generated or collected at the US probe 33 based upon the emitted plane 35, a received signal at the US probe 33 based upon the interaction of the US signal and the implant 160, or other appropriate determinations. In various embodiments, a pose may be predicted or evaluated based on an expected or predicted pose with the image to assist in the determination. An expected pose may be based on previous or other (e.g., past) poses in other subjects that have been evaluated with the same or similar implant. Further, a preoperative image (e.g., CT or MRI) could alternatively or also be used, if merged with the ultrasound signal, to help determine the pose.
[0069] The pose of the US probe 33 may be determined with the US probe tracking device 81 by the navigation system 26. As discussed above, the US probe tracking device 81 may be tracked with the appropriate tracking system, such as via the EM localizer 94. This allows the pose of the US probe 33 to be determined with the navigation system 26. The pose of the capture portion 182 may also be determined with the tracking device 186. Therefore, as the pose of the implant 160 is determined relative to the US probe 33, such as by the interaction with the US emission 35, and the pose of the US probe 33 may be determined via the US probe tracking device 81, a relative pose of the implant 160 and the capture portion 182 may be determined. [0070] The navigation system 26 may then, similar to that discussed above, allow for the generation of the graphical representation of the implant capture portion 182 as the graphical representation 182i and a graphical representation of the implant 160 as the graphical representation 160i to be displayed on the display device 84 relative to the image 108. The relative poses may be displayed on the display device 84, if selected, but this is not required. Nevertheless, the navigation system 26, once the pose of the implant 160 is known relative to the implant capture portion 182, may allow for navigation of the capture portion 182 relative to the implant 160. Thus, the US probe 33 may be used to generate a determination of a pose of the implant 160 by imaging the implant 160 and/or receiving a signal regarding the interaction of the implant 160, or a portion thereof, with the US emission 35.
[0071] It is understood that the US emission 35 from the US probe 33 is an exemplary non-ionizing emission. The emission may, however, be from any appropriate imaging system that may image the implant 160. The US probe 33, however, provides an efficient system that may be moved relative to the subject 30 in an appropriate manner, such as by the user 72, the robotic system 20, or other appropriate portion. The US emission 35 may interact with the implant 160 to allow for a determination of a pose of the implant 160 relative to the US probe 33. Tracking the US probe 33 with the US probe tracking device 81 allows the navigation system to determine a pose of the US probe 33. Therefore, the known determined pose of the implant 160 relative to the US probe 33 may be used by the navigation system to determine a pose of the implant 160 in a navigation space and/or an image space or coordinate system, as discussed above.
[0072] In various embodiments, as illustrated in Fig. 5, various portions are included with the same reference numerals as used above and are reintroduced here. The EM localizer 94 may be positioned relative to the subject 30. As is understood by one skilled in the art, the localizer 94 may generate a field, which may be an electro-magnetic field, as schematically illustrated as the field 240. Thus, the EM localizer may be positioned near the subject 30, such as to generate the field in an appropriate portion of the subject 30.
[0073] The tracking devices, including those discussed above, and the tracking device 186 associated with the retrieval system 170 may sense the emitted field 240 from the EM localizer 94. The pose of the tracking device 186 may be determined in the emitted field 240 due to a characteristic sensed at one or more conductive coils at the tracking device 186 based upon the emission 240. The field 240 may be a time varying and/or multiple frequency field that may be generated at one or more orientations. The tracking of the tracking device 186 in the field 240 may occur as discussed above.
[0074] The field 240, however, may be distorted by distorting members that are within the field 240. As illustrated in Fig. 5, it is understood that the selected volume of the subject 30 is within the field 240 emitted by the EM localizer 94. For example, a volume 244 may be the volume in which the field 240 is emitted and sensed for determination of navigation by the navigation system 26. The implant 160 may be formed of and/or include a selected portion that may interfere with the field 240 in the volume 244.
[0075] The distortion generated by the distorting portion of the implant 160 may distort the field 240 in a selected or known manner. For example, the implant 160 may have a casing and/or a distortion portion 250 associated therewith that has a particular geometry. The geometry of the distortion portion 250 may distort the field 240 in a known manner at a known orientation and location, also referred to as a pose. Therefore, the field 240 emitted by the localizer 94 into the volume 244 may be distorted by the distortion portion 250 in a known or predetermined manner. The predetermined manner may be stored for recall, such as in a lookup table.
[0076] The distorted field may be sensed by the EM localizer 94. The EM localizer 94, as is understood by one skilled in the art, includes one or more coils that may generate the field 240. The coils may also be operated as a receiver or an antenna to receive the distorted field. Alternatively and/or additionally, a distortion sensor 260 may also be provided relative to the subject 30, such as near the subject 30, to sense the distorted field. The distortion sensor 260 may also be tracked with a tracking device 264. Therefore, the distorted field may be sensed at the EM localizer 94 and/or the distortion sensor 260. The sensed distorted field may be interpreted or used to determine the pose of the implant 160 in the volume 244 based upon the known or predetermined distortion caused by the distortion portion 250. When the localizer 94 senses the distorted field, the pose of the localizer 94 is known in the navigation space. The distortion sensor tracking device 264 may, however, be tracked in the navigation space to allow for determination of a pose of the distortion sensor 260 in the navigation space. This may allow the pose of the implant 160 to be determined relative to the EM localizer 94 and/or the distortion sensor 260 based upon the sensed distorted field. The related position of the implant 160 in the navigation space may then be determined based upon the position of the localizer 94, which may also be understood as the origin or used to determine the origin, such as relative to a subject DRF, and/or the tracked pose of the distortion sensor 260 with the distortion detector tracking device 264.
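The lookup-table recall of a predetermined distortion described above may be sketched as a nearest-signature match. The example below is a hypothetical illustration only: the pose parameterization, the signature vectors (e.g., per-coil amplitude deviations), and all values are assumptions introduced for the sketch, not part of the disclosure.

```python
def match_distortion(measured, signature_table):
    """Return the candidate implant pose whose stored field-distortion
    signature is closest, by least squared error, to the measured
    distortion of the emitted field."""
    def sq_err(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(signature_table, key=lambda pose: sq_err(signature_table[pose], measured))

# Hypothetical lookup table mapping a pose (x, y, z, angle) of the
# distortion portion to the distortion signature it is predetermined to
# produce in the sensed field.
SIGNATURES = {
    (0.0, 0.0, 50.0, 0.0):   [0.10, 0.02, 0.01],
    (10.0, 0.0, 50.0, 0.0):  [0.06, 0.05, 0.01],
    (10.0, 5.0, 50.0, 90.0): [0.04, 0.05, 0.04],
}
```

In practice such a table would be far denser, or replaced by interpolation or a fitted model over the characterized distortion responses.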
[0077] Again, this allows for the navigation of the implant capture portion 182 relative to the implant 160. The respective poses may be displayed on the display device 84 such as respective graphical representations 182i, 160i either together and/or relative to the image 108. This allows the implant capture portion 182 to be navigated relative to the implant 160 to allow for explantation or capture thereof.
[0078] The poses, separately and relative to one another, of at least the implant 160 and the retrieval portion 182 may be determined with the navigation system 26, as discussed above. Accordingly, a process 300 may be used to navigate the retrieval portion 182 to the implant 160. This may allow for capture, repositioning, and/or removal (also referred to as explantation) of the implant 160. Again, the implant 160 may be any appropriate member within the subject 30.
[0079] The process 300, as illustrated in Fig. 6, may begin in a start block 304. After starting the process 300 in block 304, an association of an implant pose determination system with the subject may be made in block 308. The association of the pose determination system with the subject in block 308 may include positioning the subject 30 in a selected area, such as an operating theater or in a navigation space as defined by the navigation system 26. Further, the association may include positioning the pose determination system near the subject 30. Positioning the pose determination system near the subject 30 may include contacting the subject with at least a portion of the pose determination system, placing the pose determination system to allow a signal or field to affect the implant within the subject, or similar placement.
[0080] The implant pose determination system may include one or more of the determination systems, according to various embodiments, including those discussed above. For example, the pose determination system may include the emission of the signal 196 from the implant 160 (e.g., as illustrated in Fig. 2), emission or reflection of the signal 228 from the implant due to excitation from the excitation source 220 (e.g., as illustrated in Fig. 3), imaging or echolocating the implant 160 (e.g., as illustrated in Fig. 4), determining a predetermined distortion in an emitted field (e.g., as illustrated in Fig. 5), or other appropriate pose determination system.
[0081] Once the determination system is positioned relative to the subject 30, a determination of the pose of the implant may be made in block 320. The determination of the pose in block 320 may be made relative to the pose determination system. For example, as discussed above, the pose of the implant may be determined relative to the signal sensors 200-206. The determination of the pose of the implant relative to the signal sensors may be based upon instructions that are executed with a processor system, such as the processor system 102. The processor system 102 may be incorporated or included with the navigation system 26 or any other appropriate processor system or module. Nevertheless, the pose of the implant may be determined relative to a portion of the pose determination system or a part of the pose determination system.
[0082] The pose determination system may be tracked in the navigation system in block 324. This may include tracking one or more tracking devices associated with portions of the pose determination system. Generally, a pose of the portion of the pose determination system may be determined to allow for a mapping or translation of the determined pose of the implant relative to the pose determination system into the navigation space. Thus, the tracking of the pose determination system allows the navigation system 26 to determine the pose of the implant within the navigation system, such as in a navigation coordinate system, without including one or more tracking devices with the implant 160 directly. Including a tracking device with the implant 160 may include attaching a tracking device to the implant, including a tracking device within the implant, or other direct connection (e.g., attachment) of a tracking device to the implant 160.
[0083] The implant 160 may be determined to be retrieved or captured by tracking a retrieval system in the navigation space in block 328. Tracking the retrieval system may include tracking a tracking device associated with the retrieval system, as discussed above. For example, the tracking device 186 may be used to track the capture portion 182. Thus, a pose of the retrieval system may be determined in the navigation space by tracking the tracking device associated with the retrieval system, such as the tracking device 186.
[0084] Tracking the pose determination system to allow for a translation of a pose of the implant into the navigation space and tracking the retrieval system in the navigation space, in the respective blocks 324 and 328, may allow for navigation of the retrieval system relative to the implant 160. This may allow for navigating or guiding movement of the retrieval system to the implant 160. However, in various embodiments, a display of the respective poses may be selected.
[0085] Accordingly, an optional display or image navigation sub-process 340 may be used. In the image sub-process 340, a registration of navigation space to image space may be made in block 344. The navigation space may be defined by one or more of the tracking systems, and the image space may be defined by one or more images, such as the image 108. Therefore, the two spaces or coordinate systems may be registered to one another, as discussed above, in block 344. A display of a pose of the retrieval system and the implant may then be made in block 346. The pose of the retriever and the implant may also be displayed on an image in block 348. The display of the pose of the retriever and/or the implant on the image in block 348 may include a superimposition of a graphical representation on the image of the subject 30. The graphical representation may be a representation that is entirely generated by a processor system, such as based upon a model of the implant and/or the retrieval system. Thus, the graphical representation may not be a direct image capture (e.g., generated based upon image data) of the respective implant and/or retriever.
[0086] Regardless of whether a display is made in the sub-process 340, the retrieval system may be navigated to the implant in block 360. The navigation of the retrieval system to the implant in block 360 may include manual movement, automatic movement, or a combination thereof, of the retrieval system to the implant. As discussed above, the implant capture portion 182 may be moved relative to the implant 160, such as via a guided catheter or other appropriate delivery system.
[0087] A determination of whether navigation is to continue may be made in block 364. For example, if the retrieval system has not yet captured the implant, a determination that the navigation should continue may be made in block 364 and a yes path 368 may be followed. If the yes path 368 is followed, the process 300 may iterate or loop to the determination of a pose of the implant in block 320. This allows the process 300 to continue until a determination that the navigation should not continue, such as when the retrieval system has appropriately captured the implant.
[0088] If a determination is made that navigation should not continue, a no path 372 may be followed to end the process in block 376. The process may end in block 376; however, other appropriate procedures may continue, such as removal of the implant, manipulation or movement of the implant, or other appropriate processes. Thus, the process 300 may allow for determination of a pose of the implant within the subject without including a tracking device on the implant. Further, the process 300 may allow for navigation of a retrieval portion relative to the implant for various purposes. The process 300 may be performed, according to various embodiments, to navigate a retrieval system to capture or retrieve an implant for various purposes.
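The iteration of process 300 through blocks 320, 324, 328, 360, and 364 may be sketched as a closed loop. The sketch below is hypothetical and simplified: the stub classes merely stand in for a pose determination system and a tracked retriever (the class and method names are illustrative assumptions, not part of the disclosure), and pose is reduced to position only.

```python
import math

class StubPoseSystem:
    """Hypothetical stand-in for any of the pose determination systems
    (signal triangulation, excitation, ultrasound, or field distortion),
    reporting the implant position in the navigation space."""
    def __init__(self, implant_pos):
        self.implant_pos = implant_pos

    def locate_implant(self):
        return self.implant_pos

class StubRetriever:
    """Hypothetical tracked retrieval system that can step toward a target."""
    def __init__(self, pos, step_mm=2.0):
        self.pos = list(pos)
        self.step_mm = step_mm

    def step_toward(self, target):
        d = math.dist(self.pos, target)
        if d == 0.0:
            return
        step = min(self.step_mm, d)  # clamp so the step never overshoots
        self.pos = [p + step * (t - p) / d for p, t in zip(self.pos, target)]

def navigate_retrieval(pose_system, retriever, max_iters=1000, tol_mm=1.0):
    """Loop sketched from process 300: re-determine the implant pose
    (block 320), compare against the tracked retriever pose (blocks 328
    and 364), and move the retriever toward the implant (block 360) until
    capture distance is reached (the 'no' path 372 ending at block 376)."""
    for _ in range(max_iters):
        implant = pose_system.locate_implant()
        if math.dist(retriever.pos, implant) <= tol_mm:
            return True  # capture achieved; navigation ends
        retriever.step_toward(implant)
    return False  # navigation did not converge within the iteration budget
```

Re-determining the implant pose on every iteration mirrors the loop back to block 320 along the yes path 368, which accommodates implant motion (e.g., cardiac motion) between iterations.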
[0089] Examples
[0090] Example 1 - A system to detect a pose of an implant in a subject, comprising: a navigation system including a tracking system; a tracking device configured to be tracked with the tracking system, wherein a pose of the tracking device is operable to be determined by the navigation system; a pose determination system configured to be positioned near the subject; wherein the tracking device is associated with the pose determination system such that the navigation system is operable to determine a pose of the pose determination system in a navigation coordinate system; wherein the pose determination system is operable to interact with the implant.
[0091] Example 2 - The system of Example 1 , further comprising: the implant positioned within the subject.
[0092] Example 3 - The system of Example 2, further comprising: an implant retrieval system; a retrieval system tracking device associated with the implant retrieval system; wherein the navigation system is configured to track the retrieval system tracking device to determine a pose of at least a portion of the implant retrieval system.
[0093] Example 4 - The system of Example 3, wherein the pose determination system is operable to determine a pose of the implant relative to the pose determination system. [0094] Example 5 - The system of Example 4, wherein the navigation system is configured to determine the pose of the implant retrieval system relative to the implant based on the navigation system being configured to track the retrieval system tracking device to determine the pose of at least the portion of the implant retrieval system in the navigation coordinate system, the pose determination system is operable to determine the pose of the implant relative to the pose determination system, and the navigation system is operable to determine the pose of the pose determination system in the navigation coordinate system.
[0095] Example 6 - The system of Example 5, further comprising: a display device configured to display an implant graphical representation representative of the determined pose of the implant and an implant retrieval system graphical representation representative of the determined pose of the implant retrieval system.
[0096] Example 7 - The system of Example 4, wherein the pose determination system further comprises: a pose determination processor; a first signal sensor; and a second signal sensor; wherein the tracking device includes a first tracking device and a second tracking device; wherein the first tracking device is associated with the first signal sensor and the second tracking device is associated with the second signal sensor; wherein each of the first signal sensor and the second signal sensor is configured to receive a signal from the implant; wherein the pose determination processor is configured to execute instructions to triangulate the pose of the implant relative to the first signal sensor and the second signal sensor.
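The triangulation of Example 7 can be illustrated as a least-squares intersection of two sensing rays, assuming each tracked signal sensor reports a direction toward the implant's emitted signal; the ray-based measurement model and the function name `triangulate` are illustrative assumptions:

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Least-squares intersection of two sensing rays.

    p1, p2 : positions of the first and second signal sensors
    d1, d2 : unit direction vectors toward the received implant signal
    Returns the point minimizing the summed squared distance to both rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in ((p1, d1), (p2, d2)):
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Sensors 40 mm apart, both receiving a signal from a source 50 mm deep.
source = np.array([0.0, 0.0, 50.0])
p1 = np.array([-20.0, 0.0, 0.0])
p2 = np.array([20.0, 0.0, 0.0])
d1 = (source - p1) / np.linalg.norm(source - p1)
d2 = (source - p2) / np.linalg.norm(source - p2)

estimate = triangulate(p1, d1, p2, d2)
print(estimate)  # close to [0, 0, 50]
```

With noisy measurements the two rays generally do not intersect, which is why the least-squares midpoint rather than an exact intersection is computed.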
[0097] Example 8 - The system of Example 7, wherein the pose determination system further comprises an excitation member configured to emit a field that excites at least a portion of the implant to emit the signal.
[0098] Example 9 - The system of Example 4, wherein the pose determination system further comprises: a pose determination processor; and an ultrasound transducer; wherein the tracking device is associated with the ultrasound transducer; wherein the ultrasound transducer is configured to emit an ultrasound signal to interact with the implant; wherein the pose determination processor is configured to execute instructions to determine the pose of the implant relative to the ultrasound transducer based on the interaction of the ultrasound signal with the implant.
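For the ultrasound arrangement of Example 9, the basic range to a reflecting implant is a time-of-flight computation. A sketch, assuming a nominal soft-tissue sound speed of 1540 m/s (the constant and function name are illustrative assumptions):

```python
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, a common soft-tissue assumption

def echo_range(round_trip_time_s, c=SPEED_OF_SOUND_TISSUE):
    """One-way range to a reflector from the round-trip time of its echo.

    The ultrasound pulse travels to the implant and back, so the
    one-way distance is half of the total acoustic path.
    """
    return c * round_trip_time_s / 2.0

# An echo returning after 65 microseconds corresponds to roughly 50 mm depth.
depth_m = echo_range(65e-6)
print(round(depth_m * 1000, 2), "mm")
```

Combining such a range with the tracked pose of the transducer (and the beam direction) would place the echo in the navigation coordinate system.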
[0099] Example 10 - The system of Example 9, further comprising: a display device configured to display an image of the implant based on the interaction of the ultrasound signal with the implant.
[0100] Example 11 - The system of Example 4, wherein the tracking system further comprises an electromagnetic emitting system operable to emit an electromagnetic field; wherein the implant is configured to interfere with the electromagnetic field, creating a distortion; wherein the navigation system further includes a navigation processor, wherein the navigation processor is configured to execute instructions to evaluate the distortion to determine the pose of the implant.

[0101] Example 12 - The system of Example 11, wherein the evaluation of the distortion to determine the pose of the implant includes recalling a predetermined distortion of the implant in the emitted electromagnetic field.
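Examples 11 and 12 describe matching an observed field distortion against a recalled, predetermined distortion. A toy sketch of such template matching, in which the signature model, noise level, and function name are illustrative assumptions:

```python
import numpy as np

def best_matching_pose(measured, library):
    """Return the candidate pose whose predetermined distortion signature
    best matches the measured one (smallest sum-of-squares residual)."""
    return min(library, key=lambda pose: float(np.sum((measured - library[pose]) ** 2)))

# Toy library: precomputed distortion signatures for three candidate
# implant depths (mm), sampled at eight field measurement points.
samples = np.linspace(0.0, 1.0, 8)
library = {depth: depth * np.exp(-samples) for depth in (10.0, 20.0, 30.0)}

# "Measured" distortion: the 20 mm signature plus a little sensor noise.
rng = np.random.default_rng(0)
measured = 20.0 * np.exp(-samples) + rng.normal(0.0, 0.1, size=8)

match = best_matching_pose(measured, library)
print(match)  # → 20.0
```

A practical library would index signatures over full six-degree-of-freedom poses rather than a single depth; the nearest-signature search is the same.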
[0102] Example 13 - A method of determining a pose of an implant in a subject, comprising: tracking a tracking device with a tracking system; determining a pose of the tracking device by a navigation system based on the tracking of the tracking device; positioning a pose determination system near the subject; associating the tracking device with the pose determination system; determining a pose of the pose determination system in a navigation coordinate system based on the determined pose of the tracking device; and operating the pose determination system to interact with the implant.
[0103] Example 14 - The method of Example 13, further comprising: tracking a retrieval system tracking device associated with an implant retrieval system; and determining a pose of at least a portion of the implant retrieval system with the navigation system based on tracking the retrieval system tracking device.
[0104] Example 15 - The method of Example 14, further comprising: determining a pose of the implant relative to the pose determination system.
[0105] Example 16 - The method of Example 15, further comprising: determining the pose of the implant retrieval system relative to the implant based at least on (1) determining the pose of at least the portion of the implant retrieval system in the navigation coordinate system, (2) determining the pose of the implant relative to the pose determination system, and (3) determining the pose of the pose determination system in the navigation coordinate system.

[0106] Example 17 - The method of Example 16, further comprising: displaying an implant graphical representation representative of the determined pose of the implant and an implant retrieval system graphical representation representative of the determined pose of the implant retrieval system.
[0107] Example 18 - The method of Example 15, further comprising: providing the pose determination system with a first signal sensor and a second signal sensor; receiving a signal from the implant at the first signal sensor; receiving the signal from the implant at the second signal sensor; providing the tracking device as a first tracking device and a second tracking device; associating the first tracking device with the first signal sensor and the second tracking device with the second signal sensor; and triangulating the pose of the implant relative to the first signal sensor and the second signal sensor.
[0108] Example 19 - The method of Example 15, further comprising: providing the pose determination system with an ultrasound transducer; associating the tracking device with the ultrasound transducer; operating the ultrasound transducer to emit an ultrasound signal to interact with the implant; and determining the pose of the implant relative to the ultrasound transducer based on the interaction of the ultrasound signal with the implant.
[0109] Example 20 - The method of Example 15, further comprising: providing the tracking system with an electromagnetic emitting system; operating the electromagnetic emitting system to emit an electromagnetic field; and evaluating a distortion caused by the implant to determine the pose of the implant.

[0110] Example embodiments are provided so that this disclosure will be thorough, and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth, such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms, and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
[0111] Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. The term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules. The term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above. The term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules. The term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.

[0112] The apparatuses and methods described in this application may be partially or fully implemented by a processor (also referred to as a processor module) that may include a special purpose computer (i.e., created by configuring a processor) and/or a general purpose computer to execute one or more particular functions embodied in computer programs. The computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may include a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services and applications, etc.
[0113] The computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler; (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language); etc. As examples only, source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, ASP (active server pages), Perl, Javascript®, HTML5, Ada, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
[0114] Wireless communications described in the present disclosure can be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008. In various implementations, IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
[0115] The terms processor, processor module, module, and ‘controller’ may be used interchangeably herein (unless specifically noted otherwise) and each may be replaced with the term ‘circuit.’ Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
[0116] Instructions may be executed by one or more processors or processor modules, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” or “processor module” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.

[0117] The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.

Claims

CLAIMS

What is claimed is:
1. A system to detect a pose of an implant in a subject, comprising: a navigation system including a tracking system; a tracking device configured to be tracked with the tracking system, wherein a pose of the tracking device is operable to be determined by the navigation system; a pose determination system configured to be positioned near the subject; wherein the tracking device is associated with the pose determination system such that the navigation system is operable to determine a pose of the pose determination system in a navigation coordinate system; wherein the pose determination system is operable to interact with the implant; wherein the pose determination system is operable to determine a pose of the implant relative to the pose determination system.
2. The system of Claim 1, further comprising: an implant retrieval system; a retrieval system tracking device associated with the implant retrieval system; wherein the navigation system is configured to track the retrieval system tracking device to determine a pose of at least a portion of the implant retrieval system.
3. The system of Claim 2, wherein the pose determination system further comprises: a pose determination processor; a first signal sensor; and a second signal sensor; wherein the tracking device includes a first tracking device and a second tracking device; wherein the first tracking device is associated with the first signal sensor and the second tracking device is associated with the second signal sensor; wherein each of the first signal sensor and the second signal sensor is configured to receive a signal from the implant; wherein the pose determination processor is configured to execute instructions to triangulate the pose of the implant relative to the first signal sensor and the second signal sensor.
4. The system of Claim 3, wherein the pose determination system further comprises an excitation member configured to emit a field that excites at least a portion of the implant to emit the signal.
5. The system of Claim 2, wherein the pose determination system further comprises: a pose determination processor; and an ultrasound transducer; wherein the tracking device is associated with the ultrasound transducer; wherein the ultrasound transducer is configured to emit an ultrasound signal to interact with the implant; wherein the pose determination processor is configured to execute instructions to determine the pose of the implant relative to the ultrasound transducer based on the interaction of the ultrasound signal with the implant.
6. The system of Claim 2, wherein the tracking system further comprises an electromagnetic emitting system operable to emit an electromagnetic field; wherein the implant is configured to interfere with the electromagnetic field, creating a distortion; wherein the navigation system further includes a navigation processor, wherein the navigation processor is configured to execute instructions to evaluate the distortion to determine the pose of the implant.
7. The system of Claim 6, wherein the evaluation of the distortion to determine the pose of the implant includes recalling a predetermined distortion of the implant in the emitted electromagnetic field.
8. A method of determining a pose of an implant in a subject, comprising: tracking a tracking device with a tracking system; determining a pose of the tracking device by a navigation system based on the tracking of the tracking device; positioning a pose determination system near the subject; associating the tracking device with the pose determination system; determining a pose of the pose determination system in a navigation coordinate system based on the determined pose of the tracking device; and operating the pose determination system to interact with the implant.
9. The method of Claim 8, further comprising: tracking a retrieval system tracking device associated with an implant retrieval system; and determining a pose of at least a portion of the implant retrieval system with the navigation system based on tracking the retrieval system tracking device.
10. The method of Claim 9, further comprising: determining a pose of the implant relative to the pose determination system.
11. The method of Claim 10, further comprising: determining the pose of the implant retrieval system relative to the implant based at least on (1) determining the pose of at least the portion of the implant retrieval system in the navigation coordinate system, (2) determining the pose of the implant relative to the pose determination system, and (3) determining the pose of the pose determination system in the navigation coordinate system.
12. The method of Claim 11, further comprising: displaying an implant graphical representation representative of the determined pose of the implant and an implant retrieval system graphical representation representative of the determined pose of the implant retrieval system.
13. The method of Claim 10, further comprising: providing the pose determination system with a first signal sensor and a second signal sensor; receiving a signal from the implant at the first signal sensor; receiving the signal from the implant at the second signal sensor; providing the tracking device as a first tracking device and a second tracking device; associating the first tracking device with the first signal sensor and the second tracking device with the second signal sensor; and triangulating the pose of the implant relative to the first signal sensor and the second signal sensor.
14. The method of Claim 10, further comprising: providing the pose determination system with an ultrasound transducer; associating the tracking device with the ultrasound transducer; operating the ultrasound transducer to emit an ultrasound signal to interact with the implant; and determining the pose of the implant relative to the ultrasound transducer based on the interaction of the ultrasound signal with the implant.
15. The method of Claim 10, further comprising: providing the tracking system with an electromagnetic emitting system; operating the electromagnetic emitting system to emit an electromagnetic field; and evaluating a distortion caused by the implant to determine the pose of the implant.
PCT/IB2025/052622 | Priority date: 2024-03-12 | Filing date: 2025-03-12 | System for tracking of an instrument | Pending | WO2025191487A1 (en)

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US202463564042P | 2024-03-12 | 2024-03-12 |
US63/564,042 | 2024-03-12 | |

Publications (1)

Publication Number | Publication Date
WO2025191487A1 (en) | 2025-09-18

Family

ID=95154880

Family Applications (1)

Application Number | Title | Status | Publication
PCT/IB2025/052622 | System for tracking of an instrument | Pending | WO2025191487A1 (en)

Country Status (1)

Country | Link
WO | WO2025191487A1 (en)


Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5913820A (en) | 1992-08-14 | 1999-06-22 | British Telecommunications Public Limited Company | Position location system
US5592939A (en) | 1995-06-14 | 1997-01-14 | Martinelli; Michael A. | Method and system for navigating a catheter probe
US5983126A (en) | 1995-11-22 | 1999-11-09 | Medtronic, Inc. | Catheter location system and method
US6474341B1 (en) | 1999-10-28 | 2002-11-05 | Surgical Navigation Technologies, Inc. | Surgical communication and power system
US7831082B2 (en) | 2000-06-14 | 2010-11-09 | Medtronic Navigation, Inc. | System and method for image based sensor calibration
US8320653B2 (en) | 2000-06-14 | 2012-11-27 | Medtronic Navigation, Inc. | System and method for image based sensor calibration
US6940941B2 (en) | 2002-02-15 | 2005-09-06 | Breakaway Imaging, Llc | Breakable gantry apparatus for multidimensional x-ray based imaging
US7188998B2 (en) | 2002-03-13 | 2007-03-13 | Breakaway Imaging, Llc | Systems and methods for quasi-simultaneous multi-planar x-ray imaging
US7108421B2 (en) | 2002-03-19 | 2006-09-19 | Breakaway Imaging, Llc | Systems and methods for imaging large field-of-view objects
US7001045B2 (en) | 2002-06-11 | 2006-02-21 | Breakaway Imaging, Llc | Cantilevered gantry apparatus for x-ray imaging
US7106825B2 (en) | 2002-08-21 | 2006-09-12 | Breakaway Imaging, Llc | Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US7751865B2 (en) | 2003-10-17 | 2010-07-06 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation
US8175681B2 (en) | 2008-12-16 | 2012-05-08 | Medtronic Navigation Inc. | Combination of electromagnetic and electropotential localization
US20120220860A1 (en) * | 2008-12-16 | 2012-08-30 | Medtronic Navigation, Inc. | Combination of Electromagnetic and Electropotential Localization
US9737235B2 (en) | 2009-03-09 | 2017-08-22 | Medtronic Navigation, Inc. | System and method for image-guided navigation
US8503745B2 (en) | 2009-05-13 | 2013-08-06 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject
US8737708B2 (en) | 2009-05-13 | 2014-05-27 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject
US9138204B2 (en) | 2011-04-29 | 2015-09-22 | Medtronic Navigation, Inc. | Method and apparatus for calibrating and re-aligning an ultrasound image plane to a navigation tracker
US20200222127A1 (en) * | 2019-01-10 | 2020-07-16 | Medtronic Navigation, Inc. | System and Method for Registration Between Coordinate Systems and Navigation
US11135025B2 (en) | 2019-01-10 | 2021-10-05 | Medtronic Navigation, Inc. | System and method for registration between coordinate systems and navigation

Similar Documents

Publication | Title
US9579161B2 (en) | Method and apparatus for tracking a patient
US8165658B2 (en) | Method and apparatus for positioning a guide relative to a base
US8734466B2 (en) | Method and apparatus for controlled insertion and withdrawal of electrodes
US8010177B2 (en) | Intraoperative image registration
US20140275987A1 (en) | Integrated Navigation Array
JP2008126075A (en) | System and method for visual verification of ct registration and feedback
WO2008133615A1 (en) | Method and apparatus for controlled insertion and withdrawal of electrodes
WO2008130354A1 (en) | Intraoperative image registration
EP2432388B1 (en) | System for cardiac lead placement
WO2025191487A1 (en) | System for tracking of an instrument
US20250177054A1 (en) | System and method for navigation
US20240341728A1 (en) | System And Method For Imaging And Registration For Navigation
US20250177052A1 (en) | System and method for navigation
US20250177053A1 (en) | System and method for navigation
US20240415577A1 (en) | System And Method For Navigation
WO2025114869A1 (en) | System for navigation
US20240277415A1 (en) | System and method for moving a guide system
WO2024215866A1 (en) | System and method for imaging and registration for navigation
WO2025088433A1 (en) | System and method for navigation
WO2025158394A1 (en) | System and method for imaging
WO2025088617A1 (en) | Path planning with collision avoidance
WO2024257036A1 (en) | System and method for navigation
WO2024231740A1 (en) | Medical instrument navigation system registration probe with depth-finding or imaging capabilities
