SYSTEM AND METHOD FOR NAVIGATION
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of and priority to U.S. Provisional Patent Application No. 63/516,006 filed July 27, 2023. This application includes subject matter related to that disclosed in US Pat. App. No. 63/515,888 filed July 27, 2023. The entire disclosures of the above applications are incorporated herein by reference.
FIELD
[0002] The subject disclosure is related generally to a tracking and navigation system, and particularly to tracking using one or more tracking systems and determining a relationship therebetween.
BACKGROUND
[0001] This section provides background information related to the present disclosure which is not necessarily prior art.
[0002] An instrument can be navigated relative to a subject for performing various procedures. For example, the subject can include a patient on which a surgical procedure is being performed. During a surgical procedure, an instrument can be tracked in a physical space which may also be referred to as an object or subject space. In various embodiments, the subject space can be a patient space defined by a patient. The location of the instrument that is tracked can be displayed on a display device relative to an image of the patient or a view of the patient.
[0003] The position of the patient can be determined with a tracking system. Generally, a patient is registered to the image via tracking an instrument relative to the patient to generate a transformation (e.g., translational and rotational) map between the subject or object space (e.g., patient space) and the image space. This often requires time during a surgical procedure for a user, such as a surgeon, to identify one or more points in the subject space and correlate them with corresponding, often identical, points in the image space.
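By way of a non-limiting, hypothetical sketch (the function names and the least-squares method below are illustrative assumptions, not the claimed registration), such a transformation map may be computed from paired points with a rigid least-squares fit such as the Kabsch method:

```python
import numpy as np

def rigid_registration(subject_pts, image_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping
    subject-space points onto corresponding image-space points (Kabsch)."""
    P = np.asarray(subject_pts, dtype=float)  # N x 3, subject space
    Q = np.asarray(image_pts, dtype=float)    # N x 3, image space
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_bar).T @ (Q - q_bar)           # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_bar - R @ p_bar
    return R, t                               # image_pt ~= R @ subject_pt + t
```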
[0004] After registration, the position of the instrument can be appropriately displayed on the display device while tracking the instrument. The position of the instrument relative to the subject can be displayed as a graphical representation, sometimes referred to as an icon, on the display device.
SUMMARY
[0005] This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
[0006] According to various embodiments, an imaging system may be used to acquire image data of a subject. The imaging system may include an ultrasound (US) imaging system that includes a US probe that generally includes an ultrasound transducer to emit and receive ultrasound frequencies. It is understood, however, that the imaging system may include separate components that emit and receive ultrasound frequencies.
[0007] According to various embodiments, the US probe may be moved relative to a subject, such as by a user and/or with a robotic system. The US probe may be moved relative to the subject in any appropriate manner, however. Further, the US probe may be held relative to the subject with an appropriate holder or mount. Regardless, various objects may be placed relative to the subject, such as in or near the subject space. The various objects may be formed of various materials such as conductive materials including metal or metal alloys.
[0008] An object formed of conductive materials, also referred to as an interfering object, may have currents induced therein due to fields that are formed near the object. The conductive materials may also be referred to as interfering materials. A field, such as an electromagnetic field from a tracking system, may induce currents within the conductive materials, such as eddy currents. The conductive objects or materials may thereafter emit fields that are not emitted by the tracking system. The fields that are emitted by the conductive objects or materials may interfere with the field emitted by the electromagnetic tracking system. Therefore, the various objects formed of conductive materials may form interfering fields. The interfering fields may interfere with tracking of a tracking device.
[0009] A tracking or navigation system may include at least two tracking systems. The two tracking systems may operate individually and in concert to ensure a selected accuracy of a navigation of a selected tracking device. The tracking systems may work individually for a selected time or volume for a local navigation. Two tracking systems may work in concert to ensure a global accuracy of a navigation of a tracking device. The two tracking systems may include, for example, optical, visual, electromagnetic, or inertial tracking systems, or combinations thereof.
[0010] The two or more tracking systems may be used to ensure global accuracy and precision over a period of time and/or volume. Further, the two or more tracking systems may be registered to one another and/or to a subject. Thus, the two or more tracking systems may be operable to reduce a tracking error over a large volume and/or period of time.
[0011] Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
DRAWINGS
[0012] The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
[0013] Fig. 1 is a diagrammatic view illustrating an overview of a robotic system and a navigation system, according to various embodiments;
[0014] Fig. 2 is a representation of two or more tracking systems to track one or more objects, according to various embodiments;
[0015] Fig. 3 is a representation of two or more tracking systems to track one or more objects, according to various embodiments;
[0016] Fig. 4 is a representation of a procedure theater with two or more tracking systems to track one or more objects, according to various embodiments;
[0017] Fig. 5 is a representation of two or more tracking systems to track one or more objects, according to various embodiments;
[0018] Fig. 6 is a representation of two or more tracking systems to track one or more objects, according to various embodiments;
[0019] Fig. 7 is a representation of two or more tracking systems to track one or more objects, according to various embodiments;
[0020] Fig. 8 is a flowchart of a method for tracking an object, according to various embodiments; and
[0021] Fig. 9 is a flowchart of a method for tracking an object, according to various embodiments.
[0022] Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
[0023] Example embodiments will now be described more fully with reference to the accompanying drawings.
[0024] The subject disclosure is directed to an exemplary embodiment of a surgical procedure on a subject, such as a human patient. It is understood, however, that the system and methods described herein are merely exemplary and not intended to limit the scope of the claims included herein. In various embodiments, it is understood that the systems and methods may be incorporated into and/or used on non-animate objects. The systems may be used, for example, to register two or more coordinate systems for use on manufacturing systems, maintenance systems, and the like. For example, automotive assembly may use one or more robotic systems including individual coordinate systems that may be registered together for coordinated or concerted actions. Accordingly, the exemplary illustration of a surgical procedure herein is not intended to limit the scope of the appended claims.
[0025] Discussed herein, according to various embodiments, one or more tracking systems may be used to track a selected tracking device. According to various embodiments, at least one tracking system may operate by emitting an electromagnetic (EM) field from a localizer, also referred to as an EM localizer. The EM field may be emitted from one or more coils that may be oriented relative to an origin point. The field may be a largely magnetic field. The field may be constant or varying in time and/or frequency. In an EM tracking system, a tracking device may include one or more coils that operate as sensors to sense the field. The field may generate a current within the coil of the tracking device. Based on the sensed current, a determination of a position and orientation (also referred to collectively as a “pose”) of the tracking device may be made.
[0026] According to various embodiments, at least one tracking system may operate by an optical system, which may emit or receive selected radiation, such as with optical sensors. The optical sensors may sense visible or other types of light, such as infrared (IR). For example, two or more sensors may be positioned to triangulate a position of an optical or visual tracking device. As discussed herein, a visual sensor may sense visible wavelengths of light (e.g., about 350 to about 800 nanometers) and optical sensors may sense non-visible wavelengths (e.g., IR at about 700 nanometers to about 1 millimeter). The tracking device may emit and/or reflect such wavelengths.
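As a non-limiting, hypothetical sketch (the camera projection matrices and names are illustrative assumptions, not the claimed tracking system), a marker position may be triangulated from two calibrated sensors using a direct linear transform:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker seen by two calibrated
    cameras with 3x4 projection matrices P1, P2 and pixel coords uv1, uv2."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # least-squares null vector of A
    X = Vt[-1]
    return X[:3] / X[3]           # marker position in the localizer frame
```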
[0027] According to various embodiments, at least one tracking system may operate by sensing motion, such as an accelerometer measuring a component or vector acceleration, or a gyroscope measuring a component or vector orientation. According to various embodiments, these sensors may be referred to as inertial sensors. Exemplary sensors may include a MPU-9250 Nine-Axis (Gyro + Accelerometer + Compass) MEMS MotionTracking™ Device sold by InvenSense. Such sensors may sense motion in at least three degrees of freedom. Thus, two or more sensors may be used to sense motion in at least six degrees of freedom, including translation and rotation. Further, various sensors may include sensors to sense a selected degree of freedom.
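By way of a non-limiting, hypothetical sketch (names are illustrative assumptions), inertial samples may be integrated by naive strapdown dead reckoning; small sensor biases accumulate through the two integrations, which is the drift discussed below:

```python
import numpy as np

def dead_reckon(gyro, accel, dt, gravity=np.array([0.0, 0.0, -9.81])):
    """Naive strapdown integration of gyro (rad/s) and accelerometer (m/s^2)
    samples at interval dt; bias errors accumulate, illustrating drift."""
    R = np.eye(3)                       # orientation of the sensor body
    v = np.zeros(3)                     # velocity in the world frame
    p = np.zeros(3)                     # position in the world frame
    for w, a in zip(gyro, accel):
        # first-order rotation update from the angular-rate vector
        wx = np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + wx * dt)
        v = v + (R @ a + gravity) * dt  # rotate specific force, remove gravity
        p = p + v * dt
    return R, p
```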
[0028] Various conditions may adversely affect accuracy and/or precision of one or more of the tracking systems. For example, various materials are conductive, such as conductive polymers, metal or metal alloys, or other materials. Objects or items may be formed with these materials. If an item formed with these materials is also in or near the field generated by the EM localizer, a current may be formed or induced in the object. In this instance, the object may be referred to as an interfering object. When a current is induced in the interfering object, a field may also be produced. A field produced due to the induced current in the interfering object may also be referred to as an interfering field. These interfering fields may alter the field sensed by the tracking device such that the tracking device is not always sensing only the EM field generated by the EM localizer. Inertial sensors may drift over time. Further, optical or visual sensors may require a line of sight. Thus, at any given instant one or more tracking systems may be adversely affected by a given condition.
[0029] Regardless, various portions may be tracked relative to the subject. For example, a tracking system may be incorporated into a navigation system that includes one or more instruments that may be tracked relative to the subject. The navigation system may include one or more tracking systems that track various portions, such as tracking devices, associated with instruments. The tracking system may include a localizer that is configured to, alone or in combination with a processor, determine the pose of a tracking device in a navigation system coordinate system. Determination of the navigation system coordinate system may be performed as described in various references including US Pat. No. 8,737,708; US Pat. No. 9,737,235; US Pat. No. 8,503,745; and US Pat. No. 8,175,681; all incorporated herein by reference. In particular, a localizer may be able to track an object within a volume relative to the subject. The navigation volume, in which a device may be tracked, may include or be referred to as the navigation coordinate system or navigation space. A determination or correlation between two coordinate systems may allow for or also be referred to as a registration between two coordinate systems.
[0030] Furthermore, images may be acquired of selected portions of a subject. The images may be displayed for viewing by a user, such as a surgeon. The images may have superimposed on at least a portion of the image a graphical representation (e.g., icon) of a tracked portion or member, such as an instrument. The images may have a coordinate system and define an image space. According to various embodiments, the graphical representation may be superimposed on the image at an appropriate position due to registration of an image space (also referred to as an image coordinate system) to a subject space (also referred to as a subject or physical coordinate system). A method to register a subject space defined by a subject (including a physical space relative thereto and/or inclusive thereof) to an image space may include those disclosed in US Pat. No. 8,737,708; US Pat. No. 9,737,235; US Pat. No. 8,503,745; and US Pat. No. 8,175,681; all incorporated herein by reference.
[0031] As discussed herein, image space to subject space registration may not be required. An imaging system (e.g., a US probe) may be tracked, as may an instrument that is separate or second relative to the imaging system. Because the image coordinate space moves with the tracked imaging system, and the instrument is tracked with the same or a correlated tracking system, a pose of the instrument within the image may be known based on the correlated tracking systems.
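As a non-limiting, hypothetical sketch (the transform names and the image-plane calibration term are illustrative assumptions), the pose of the instrument in the image coordinate space may be obtained by chaining 4x4 homogeneous transforms of the two tracked devices, without a separate image-to-subject registration:

```python
import numpy as np

def instrument_in_image(T_loc_probe, T_probe_image, T_loc_instr):
    """Pose of a tracked instrument expressed in the ultrasound image frame,
    using only tracked poses (4x4 homogeneous transforms).
      T_loc_probe:   probe tracking device in the localizer frame
      T_probe_image: calibrated image plane in the probe frame
      T_loc_instr:   instrument tracking device in the localizer frame"""
    T_loc_image = T_loc_probe @ T_probe_image       # image plane in localizer frame
    return np.linalg.inv(T_loc_image) @ T_loc_instr # instrument in image frame
```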
[0032] Nevertheless, image to subject registration may occur. For example, during a selected procedure, a coordinate system may be registered to the subject space or subject coordinate system due to a selected procedure, such as imaging of the subject. In various embodiments, the first coordinate system may be registered to the subject by imaging the subject with a fiducial portion that is fixed relative to the first member or system, such as the robotic system. The known position of the fiducial relative to the robotic system may be used to register the subject space relative to the robotic system due to the image of the subject including the fiducial portion. Thus, the position of the robotic system or a portion thereof, such as the end effector, may be known or determined relative to the subject. Registration of a second coordinate system to the robotic coordinate system may then allow for tracking of additional elements, not fixed to the robot, relative to a position determined or tracked by the robot.
[0033] The tracking of the instrument during a procedure, such as a surgical or operative procedure, allows for navigation of the instrument during the procedure and may allow a navigated procedure. When image data is used to define an image space it can be correlated or registered to a physical space defined by a subject, such as a patient as discussed herein. According to various embodiments, therefore, the patient defines a patient space in which an instrument can be tracked and navigated. The image space defined by the image data can be registered to the patient space defined by the patient. The registration can occur with the use of fiducials that can be identified in the image data and in the patient space.
[0034] Fig. 1 is a diagrammatic view illustrating an overview of a procedure room or arena. In various embodiments, the procedure room may include a surgical suite in which may be placed a robotic system 20 and a navigation system 26 that can be used for various procedures. The robotic system 20 may include a Mazor X™ robotic guidance system, sold by Medtronic, Inc. The robotic system 20 may be used to assist in guiding a selected instrument, such as drills, screws, etc., relative to a subject 30. Additionally or alternatively, the robotic system 20 may hold and/or move an imaging system, such as an ultrasound (US) probe 33. The robotic system 20 may include a mount 34 that fixes a portion, such as a robotic base 38, relative to the subject 30. The robotic system 20 may include one or more arms 40 that are moveable or pivotable relative to the subject 30, such as including an end effector 44. The robotic system 20 may further include a controller 22 that may include one or more of a processor and/or memory system. The controller 22 may be included with the robotic system 20 and/or in communication therewith. The controller 22 may access and/or receive instructions that are executed to move the end effector 44 and/or other portions of the arm 40. Thus, the robotic system 20 may be controlled manually by the user 72 and/or automatically, such as by executing the instructions with the controller 22, and/or combinations thereof.
[0035] The end effector 44 may be any appropriate portion, such as a tube, guide, or passage member. Affixed to and/or in place of the end effector may be the imaging system that may be the US probe 33. The end effector 44 may be moved relative to the base 38 with one or more motors. The position of the end effector 44 may be known or determined relative to the base 38 with one or more encoders at one or more joints, such as a wrist joint 48 and/or an elbow joint 52 of the robotic system 20. One or more portions of the robotic system 20 may be formed of conductive materials.
[0036] The navigation system 26 can be used to track the location of one or more tracking devices and/or determine and/or illustrate a pose thereof. Tracking devices may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, an imaging system or second imaging system tracking device 81, and/or an instrument or tool tracking device 66. A tool or instrument 68 may be any appropriate moveable member such as a drill, forceps, catheter, tube, scalpel, or other tool moved or controlled by a user 72. The tool 68 may also or alternatively include an implant, such as a spinal implant or orthopedic implant. It should further be noted that the navigation system 26 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc. Moreover, the instruments may be used to navigate or map any region of the body. The navigation system 26 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
[0037] An additional or alternative imaging system 80 may be used to acquire pre-, intra-, or post-operative or real-time image data of a subject, such as the subject 30. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject. In the example shown, the imaging system 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Colorado, USA. The imaging system 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed and/or enclosed. The imaging system 80 can include those disclosed in US Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference, or any appropriate portions thereof. It is further appreciated that the imaging system 80 may include in addition or alternatively a fluoroscopic C-arm. Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc. Other appropriate imaging devices can also include MRI, CT, ultrasound, etc.
[0038] The position of the imaging system 33, 80, and/or portions therein such as the image capturing portion, can be known (e.g., precisely, such as within about 0.1 millimeters (mm) to about 5 mm) relative to any other portion of the imaging device 33, 80 and/or within the navigable domain. The navigable domain or navigation space may be the physical space in which any one or more tracking system may track a tracking device. The imaging device 33, 80, according to various embodiments, can know and/or recall precise coordinates relative to a fixed or selected coordinate system. For example, the robotic system 20 may know or determine its position and position the US probe 33 at a selected pose. Similarly, the imaging system 80 may also position the imaging portions at a selected pose. This can allow the imaging system 80 to know its position relative to the patient 30 or other references. In addition, as discussed herein, the precise knowledge of the position of the image capturing portion can be used in conjunction with a tracking system to determine the position of the image capturing portion and the image data relative to the tracked subject, such as the patient 30. In other words, the imaging system tracking device 62, 81 may be used and/or operable to determine a pose of the imaging system 33, 80 at a selected time such as during image data acquisition.
[0039] Herein, reference to the imaging system 33 may refer to any appropriate imaging system, unless stated otherwise. Thus, the US probe 33 as the imaging system is merely exemplary regarding the subject disclosure. As one skilled in the art will understand, generally the US probe 33 may emit a US wave in a plane and receive an echo relative to any portions engaged by the wave. The received echo at the US probe 33, or other appropriate receiver, may be used to generate image data and may be used to generate a US image, also referred to as a sonogram.
[0040] The imaging device 80 can be tracked with the tracking device 62. Also, the tracking device 81 can be associated directly with the US probe 33. The US probe 33 may, therefore, be directly tracked with a navigation system 26 as discussed herein. In addition or alternatively, the US probe 33 may be positioned and tracked with the robotic system 20. Regardless, image data defining an image space acquired of the patient 30 can, according to various embodiments, be registered (e.g., manually, inherently, or automatically) relative to an object space. The object space can be the space defined by a patient 30 in the navigation system 26.
[0041] The patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or tracker 58. Alternatively, or in addition thereto, the patient 30 may be fixed within navigation space defined by the navigation system 26 to allow for and/or maintain registration such as to the image space of the image 108. According to various embodiments, registration of the image space to the patient space or subject space may allow for navigation of the instrument 68 with the image data. According to various embodiments, navigation of the instrument need not require image to subject registration, as discussed herein. When navigating the instrument 68, a position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84, such as with a graphical representation 68i, 68i’. An additional and/or alternative display device 84’ may also be present to display an image. Various tracking systems, such as one including an optical localizer 88, a visual localizer 91, or an electromagnetic (EM) localizer 94, or other appropriate tracking system (e.g., inertial) can be used to track the instrument 68.
[0042] More than one tracking system can be used to track the instrument 68 or other portion, such as the US probe 33 with the tracking device 81 in the navigation system 26. According to various embodiments, these can include an electromagnetic (EM) tracking system having the EM localizer 94 and/or an optical tracking system having the optical localizer 88. Either or both of the tracking systems can be used to track selected tracking devices, as discussed herein. It will be understood, unless discussed otherwise, that a tracking device can be a portion trackable with a selected tracking system. A tracking device need not refer to the entire member or structure to which the tracking device is affixed or associated.
[0043] The position of the patient 30 relative to the imaging device 33 can be determined by the navigation system 26. The position of the imaging system 33 may be determined, as discussed herein. The patient 30 can be tracked with the dynamic reference frame (or DRF) 58, as discussed further herein. Accordingly, the position of the patient 30 relative to the imaging device 33 can be determined.
[0044] Image data acquired from the imaging system 33, or any appropriate imaging system, can be acquired at and/or forwarded from an image device controller 96, which may include a processor module, to a navigation computer and/or processor module (also referred to as a processor) 102 that can be a part of a controller or work station 98 having the display 84 and a user interface 106. Further, a memory 103, of any appropriate type, may be accessed by the processor 102. It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the work station 98. The work station 98, which may be any appropriate computing system, can provide facilities for displaying the image data as an image 108 on the display 84, saving, digitally manipulating, or printing a hard copy image of the received image data. The user interface 106, which may be a keyboard, mouse, touch pen, touch screen, virtual interfaces, or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, 33, via the image device controller 96, or adjust the display settings of the display 84. The work station 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and three-dimensional image data.
[0045] With continuing reference to FIG. 1, the navigation system 26 can further include one or more tracking systems, including either or both of the electromagnetic (EM) localizer 94 and/or the optical localizer 88. The tracking systems may include a controller and interface portion 110. The controller 110 can be connected to the processor portion 102, which can include a processor included within a computer. The EM tracking system may include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado; or can be the EM tracking system described in US Patent Application Serial No. 10/941,782, filed Sept. 15, 2004, and entitled "METHOD AND APPARATUS FOR SURGICAL NAVIGATION"; US Patent No. 5,913,820, entitled “Position Location System,” issued June 22, 1999; and US Patent No. 5,592,939, entitled “Method and System for Navigating a Catheter Probe,” issued January 14, 1997; all of which are herein incorporated by reference. It will be understood that the navigation system 26 may also be or include any appropriate tracking system, including a STEALTHSTATION® TREON® or S7™ tracking system having an optical localizer, that may be used as the optical localizer 88, and sold by Medtronic Navigation, Inc. of Colorado. Other tracking systems include acoustic, radiation, radar, etc. The tracking systems can be used according to generally known or described techniques in the above incorporated references. Details will not be included herein except when needed to clarify selected operation of the subject disclosure.
[0046] Wired or physical connections can interconnect the tracking systems, imaging device 80, etc. Alternatively, various portions, such as the instrument 68, may employ a wireless communications channel, such as that disclosed in US Patent No. 6,474,341, entitled “Surgical Communication Power System,” issued November 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the controller 110. Also, the tracking devices such as 62, 66, 54 can generate a field and/or signal that is sensed by the localizer(s) 88, 94.
[0047] Various portions of the navigation system 26, such as the instrument 68, and others as will be described in detail below, can be equipped with at least one, and generally multiple, of the tracking devices 66. The instrument can also include more than one type or modality of tracking device 66, such as an EM tracking device and/or an optical tracking device. The instrument 68 can include a graspable or manipulable portion at a proximal end and the tracking devices may be fixed near the manipulable portion of the instrument 68.
[0048] An additional representative or alternative localization and tracking system is set forth in US Patent No. 5,983,126, entitled “Catheter Location System and Method,” issued November 9, 1999, which is hereby incorporated by reference. The navigation system 26 may be a hybrid system that includes components from various tracking systems.
[0049] According to various embodiments, the navigation system 26 can be used to track any appropriate portion such as the US probe 33 and/or the instrument 68 relative to the patient 30. The instrument 68 can be tracked with the tracking system, as discussed above. Image data of the patient 30, or an appropriate subject, can be used to assist the user 72 in guiding the instrument 68. The image data may or may not be registered to the patient 30. For example, as discussed herein, the US probe 33 is tracked and generates the image data. Thus, the image data need not be registered to the subject to display a pose of the tracked instrument 68 relative to the image data generated with the tracked US probe 33. The image data defines the image space that is registered to the patient space defined by the patient 30. The registration can be performed as discussed herein, automatically, manually, or combinations thereof. The registration can include the process and the final transformation (including a translation and rotation) map. Generally, registration includes determining points in the image data and the subject space and determining a transformation map therebetween. Once done, the image space is registered to the subject space, or any two or more coordinate spaces.
[0050] Generally, registration also allows a transformation map to be generated of a tracked physical pose of the instrument 68 relative to the image space of the image data. The transformation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image data 108. The graphical representation 68i, also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
[0051] With continuing reference to Fig. 1, a subject registration system or method can use the tracking device 58. The tracking device 58 may include portions or members 120 that may be trackable, but may also act as or be operable as a fiducial assembly. The fiducial assembly 120 can include a clamp or other fixation portion 124 and the imageable fiducial body 120. It is understood, however, that the members 120 may be separate from the tracking device 58. The fixation portion 124 can be provided to fix any appropriate portion, such as a portion of the anatomy. As illustrated in Fig. 1, the fiducial assembly 120 can be interconnected with a portion of a spine 126 such as a spinous process 130. The fixation portion 124 can be interconnected with a spinous process 130 in any appropriate manner. For example, a pin or a screw can be driven into the spinous process 130. Further, the tracking device 58 may be operable to track with one or more tracking systems or modalities, such as an EM tracking system or an optical tracking system.
[0052] As illustrated in Fig. 1, the imaging device 33 may include the US probe 33 that may be positioned relative to the subject 30, such as by the robotic system 20 and/or the surgeon 72. In various embodiments, the surgeon 72 may operate the robotic system 20 and/or hold the US probe 33 separate therefrom. As discussed herein, therefore, the robotic system 20 may move the US probe 33 to a selected position relative to the subject 30. According to various embodiments, the imaging system may be positioned relative to the subject in any appropriate manner.
[0053] Further, as is understood by one skilled in the art, the image data acquired with one or more ultrasound arrays 125 of the US probe 33 may be registered in the navigation system such as disclosed in US Patent No. 7,085,400 and US Patent No. 9,138,204, both incorporated herein by reference. The image data acquired with the respective ultrasound arrays 125 may be of the subject 30. As the ultrasound arrays 125 are registered to the subject 30 using the navigation system 26, the image data acquired of the subject 30, such as of a heart 127 and/or other subject portion such as a vertebra, may also have its pose determined in navigation space within the navigation system 26. The image data may be used to generate a specific image portion such as an image of the heart 127i. The image 127i may be a reconstruction based on the image data from the US probe 33. The graphical representation 68i may be represented relative to the image 108 and/or portions thereof such as the image of the heart 127i. Further, the graphical representation 68i may be superimposed on the reconstruction and/or the image. The reconstruction may include additional data (e.g., atlas or population data) and may also be referred to as a model. The graphical representation may be superimposed on the model.
[0054] By tracking the US probe 33 with the tracking device 81 in the patient space, the pose of the imaging plane 129 may be determined. Therefore, a plurality of the discrete images collected at each pose of the imaging plane 129, due to the pose of the US probe 33, may be combined in a selected manner to achieve a three-dimensional image. A pose of the imaging plane 129 relative to the tracking device 81 may be determined in an appropriate manner, such as via mechanical and electromechanical nominal dimensions or with a calibration system that may include a calibration jig or other appropriate calibration system. The imaging system 33 may be tracked, as discussed above. Various tracking systems may include and/or require calibration of the imaging system. Thus, the pose of the tracking device 81 relative to a plane of the US imaging system may be determined and/or known. Various systems and methods are disclosed in US Patent Nos. 6,379,302; 6,669,635; 6,968,224; 7,085,400; 7,831,082; 8,320,653; 8,811,662; and 9,138,204, all of which are incorporated herein by reference.
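As a non-limiting, hypothetical sketch (the function and parameter names are illustrative assumptions, not the claimed compounding method), tracked two-dimensional slices may be combined by mapping each pixel through the probe pose and the image-plane calibration into a common three-dimensional frame:

```python
import numpy as np

def compound_slices(slices, poses, T_probe_image, spacing):
    """Scatter tracked 2D ultrasound slices into a common 3D frame.
    slices: list of (H, W) intensity arrays; poses: matching 4x4 probe poses
    in the navigation frame; T_probe_image: image-plane calibration in the
    probe frame; spacing: (sy, sx) pixel size in mm."""
    pts, vals = [], []
    for img, T_loc_probe in zip(slices, poses):
        T = T_loc_probe @ T_probe_image        # image plane in navigation frame
        ys, xs = np.nonzero(img >= 0)          # indices of every pixel
        # pixel (x, y) lies in the image plane at z = 0
        homog = np.stack([xs * spacing[1], ys * spacing[0],
                          np.zeros_like(xs, dtype=float),
                          np.ones_like(xs, dtype=float)])
        pts.append((T @ homog)[:3].T)          # 3D positions of the pixels
        vals.append(img[ys, xs])               # corresponding intensities
    return np.vstack(pts), np.concatenate(vals)
```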
[0055] According to various embodiments, the ultrasound probe may emit or transmit ultrasound waves in a selected pattern or plane. The plane may have any appropriate shape, as is understood by one skilled in the art. The plane is generally able to acquire data in a field of view to generate images, also referred to as sonograms when the images are generated based on ultrasound data.
[0056] As noted above, the navigation system 26 may include a plurality of tracking systems. For example, an optical tracking system can include the optical localizer 88 to track one or more optical tracking devices. The EM localizer 94 may be used to track one or more EM tracking devices. In various embodiments, tracking devices can include multiple portions that may be tracked by both of the tracking systems, such as the patient tracking device or DRF 58, including the optical portions 120 and an EM portion 121. In addition, or alternatively, the respective tracking systems may track respective or different tracking devices on more than one portion, such as the patient tracking device 58, or the ultrasound tracking device 81, or the imager tracking device 62.
[0057] In various embodiments, the respective tracking systems may operate most efficiently in different conditions as is understood by one skilled in the art. For example, the EM tracking system including the EM localizer 94 may transmit fields that may be distorted by various items such as the housing of the US probe 33, the housing of the imaging system 80, or other systems. Further, a line of sight from the optical localizer 88 may be blocked by opaque portions, such as the gantry 82 of the imaging system 80, a portion of the user 72, or even of the subject 30. Accordingly, it may be selected to operate or receive tracking information from only a single one of the tracking systems at a time to track various portions. Alternatively, it may be selected to combine received tracking information from more than one tracking system at a time and weight the tracking information according to determined and/or pre-determined individual system tracking quality metrics, as in the illustrative sketch below. Nevertheless, the coordinate systems of the plurality of tracking systems may be registered to one another, also referred to as correlated, such that a position in one may be transformed to a position in another, such as through registration. Further, the coordinate systems of the tracking systems may be registered to a coordinate system of the image 108 and/or the imaging system may be tracked with at least one of the respective tracking systems. Thus, the tracked position of any portion may be displayed regardless of the tracking system used to track it. Further, the determined position of any instrument may be translated to any other tracking system.
[0058] The various tracking systems may be used to register or confirm a pose of an instrument between the different tracking systems. For example, if the EM tracking system having the EM localizer 94 is generating a field that is distorted, the optical tracking system 88 may be used to confirm a position of a tracked instrument by confirming a position of the EM localizer 94 in the coordinate space of the optical localizer 88 or other appropriate localizer. Further, it may be understood that a distortion may be local such that a local change of pose may be determined with the EM tracking system, even if the global position is not known for the tracking device, as discussed further herein according to various embodiments. Therefore, a plurality of tracking systems may be used to confirm a pose of a tracking device tracked with another or a different tracking system and/or confirm a pose over a short volume or time even if a distortion or interference occurs.
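As a non-limiting, hypothetical sketch of the weighted combination discussed above (the names, validity gating, and inverse-variance-style weighting are illustrative assumptions), estimates of the same tracking device from several systems may be gated and fused:

```python
import numpy as np

def fuse_positions(positions, weights, valid):
    """Fuse position estimates of one tracking device from several tracking
    systems. 'valid' flags drop occluded (optical) or distorted (EM) systems;
    remaining estimates are averaged by quality weights (e.g., 1/variance)."""
    P = np.asarray(positions, dtype=float)                        # K x 3
    w = np.asarray(weights, dtype=float) * np.asarray(valid, dtype=float)
    if w.sum() == 0:
        raise ValueError("no tracking system currently usable")
    w = w / w.sum()                                               # normalize
    return (w[:, None] * P).sum(axis=0)                           # weighted mean
```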
[0059] Further, it is understood that various other tracking systems may be used. For example, a vision tracking system may include one or more optical sensors that are able to identify a shape or portion of a tracking device, including an outline of an instrument itself. Further, tracking systems may include inertial tracking systems. In an inertial tracking system a drift may occur over time and therefore a secondary tracking system may confirm or reconfirm a pose of a tracking device having an inertial sensor. According to various embodiments, therefore, various tracking systems may be used concurrently and/or individually as discussed further herein.
[0060] The system, including the navigation system 26 discussed above, may include the various portions including the EM localizer 94 and/or the optical or visual localizer including the visual cameras 93 or the optical localizer cameras 88. As discussed above, the visual and/or optical cameras may be included in a spaced apart component, such as the optical localizer as illustrated in Fig. 1, and/or included on a user wearable system such as the augmented reality system 91. It is understood that the discussion herein regarding the EM localizer 94 and an EM tracking system, and the optical or visual localizer 93 and an optical or visual tracking system 88, may be associated with any appropriate and related tracking systems. Therefore, the discussion herein of the optical tracking system 93 and the EM tracking system with the EM localizer 94 is merely exemplary and various features related thereto may be associated or relate to other systems. For example, as disclosed above, an inertial tracking system may be used to track an instrument by sensing movement. However, a drift in an inertial tracker may occur over time and use of the visual tracking system 93 may be advantageous and/or useful for confirming or resetting a pose of a tracking device. Therefore, the exemplary systems discussed herein are understood to relate to other systems, unless specifically indicated otherwise.
[0061] As disclosed above, one or more tracking systems may be used to track one or more objects, including at least various portions thereof. For example, as illustrated in Fig. 2, the optical tracking system 88 and/or the visual tracking system 93 may track various tracking devices, including those discussed above and herein. An optical or visual tracking device 170 may be associated with the EM localizer 94 and tracked by at least one of the optical tracking system 88 and/or the visual tracking system 93. For example, the optical tracking device 170 may be fixed to an exterior housing or casing of the EM localizer 94 such that it is viewable or within a line of sight of either or both of the optical localizer 88 and/or the visual localizer 93. Discussion herein relating to only one of the optical or visual localizers 88, 93 will be understood to refer to both unless specifically identified otherwise. Accordingly, the optical tracking device 170 may also be referred to as a visual tracking device that is tracked by the visual localizer 93. An EM tracking device 174 also may be associated with the optical localizer 88 and/or an EM tracking device 176 may be associated with the visual localizer 93. Accordingly, each of the optical localizer 88 and the visual localizer 93 may be tracked with the EM localizer 94 in the EM tracking system. Further, various instruments such as the US Probe 33 may have a visual tracking device 180 associated therewith. The visual tracking device 180 may be tracked with the visual localizer 93, or any appropriate localizer as discussed above.
[0062] The visual tracking devices 170, 180 may be provided in any appropriate shape, configuration, or geometry, with markings or the like that may be visually or optically identified by the respective tracking systems, including the visual localizer 93. For example, the visual tracking device 170 may include one or more icons or features 184a, 184b, 184c, and 184d. The visual tracking device 180 may also include one or more markings 188a, 188b, 188c, and 188d. The markings 184, 188 may be identical to one another other than in size, shape, and/or configuration. Further, the markings may be any appropriate type of markings such as an alternating color checkerboard, geometric shapes, or the like. The visual tracking devices 170, 180 may be distinguishable from one another due to various features thereof, such as a configuration of the respective markings 184, 188 and/or positions relative to one another. For example, the visual tracking device 170 may include the marking 184b that is an arrow. In the visual tracking device 170 the arrow 184b may be pointed toward the marking 184c. In the visual tracking device 180 the marking 188b may also be an arrow but pointing to the marking 188a. Therefore, the navigation system 26 may differentiate between the two visual tracking devices 170, 180.
[0063] Further, various other visual distinctions between the visual tracking devices, such as 170, 180, may include a geometry of the markings 184, 188. For example, each of the markings 184, 188 may include respective centers 184a' (and in sequence thereafter) and 188a’ (and in sequence thereafter). The centers 184a' and 188a’ may be positioned at various distances relative to the other markings. For example, the center 184a’ may be a distance 192 from a center 184d’ of the marking 184d. The center 188a’ may be a second distance 196 from the center 188d' of the marking 188d. Therefore, the distinctive or different configurations may also be used to separately identify the various visual tracking devices 170, 180.
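As a non-limiting, hypothetical sketch of distinguishing devices by marking geometry (the signature values, names, and tolerance are purely illustrative assumptions), detected marking centers may be matched to a known device by comparing inter-center distances:

```python
import numpy as np

# Hypothetical signatures: sorted inter-marking center distances (mm) for each
# visual tracking device; the values below are illustrative only.
SIGNATURES = {"device_170": [30.0, 42.0, 51.0, 60.0, 72.0, 85.0],
              "device_180": [28.0, 40.0, 55.0, 63.0, 70.0, 90.0]}

def identify_device(centers, tol=1.5):
    """Match detected marking centers (N x 3) to a known device by comparing
    the sorted pairwise-distance signature within a tolerance (mm)."""
    c = np.asarray(centers, dtype=float)
    d = sorted(np.linalg.norm(c[i] - c[j])
               for i in range(len(c)) for j in range(i + 1, len(c)))
    for name, sig in SIGNATURES.items():
        if len(sig) == len(d) and max(abs(a - b) for a, b in zip(d, sig)) < tol:
            return name
    return None  # no known device matches the detected geometry
```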
[0064] The visual localizer 93 may have in its field of view 200 both the visual tracking device 170 and the visual tracking device 180. The field of view 200 of the visual localizer 93 may therefore be able to view both of the visual tracking devices 170, 180 and allow for a determination of a pose of each within at least the visual localizer 93 navigation space. Further, a determination, such as with the navigation system 26, of a respective or relative pose between the two visual tracking devices 170, 180 may be made.
[0065] For example, as illustrated in Fig. 1, the visual tracking device 180 associated with the US Probe 33 may be within the field of view of the visual localizer 93, as may the visual tracking device 170 associated with the EM localizer 94. Therefore, the navigation system 26 may determine a pose of each and/or between the US probe 33 and the EM localizer 94. The pose of either or both of the EM localizer 94 and the US probe 33 may also be determined within the coordinate or navigation space of the visual localizer 93 due to the visual tracking devices 170, 180 associated with the respective portions.
[0066] It is understood that a visual tracking device may also be associated with any other portion, such as the robotic system 20, the optical localizer 88, the imaging system 80, and/or the tool 68. For example, the tool 68 may include a visual tracking device 204 associated therewith. As discussed herein, however, the tool 68 may only include a single tracking device 66 and the tracking device 66 may be an EM tracking device. As discussed further herein, the instrument 68 may be within an image data or imaging plane of the US probe 33. As the instrument 68 has the tracking device 66 associated therewith, which may be an EM tracking device, the pose of the imaging plane emitted or generated by the US probe 33 may be correlated to the tracked pose of the instrument 68, with the EM tracking device 66, at least because the visual localizer 93 is able to view the visual tracking device 170 associated with the EM localizer 94 and/or the visual tracking device 180 associated with the US probe 33.
[0067] Turning reference to Fig. 3, the visual localizer 93 may be used by the user 72 to view one or more of the visual trackers, such as the visual tracker 170 associated with the EM localizer 94 and the visual tracker 180 associated with the US Probe 33. The US Probe 33 may be used to generate image data in an image plane 129 of the subject 30. As illustrated in Fig. 3, the visual tracker 180 may be within a field of view of the visual localizer 93. As discussed above, the visual localizer 93 may be associated with respective systems, such as the augmented reality or virtual reality headset 91 that may be worn by the user 72. It is also understood, however, that the optical localizer 88 may be provided to track selected elements, even including the visual tracking devices 170, 180. In addition, the optical localizer 88 may have a visual tracking device 95 associated therewith. The visual tracking device 95 may include features and markings similar to those discussed above, allowing for tracking of the optical localizer 88 with the visual localizer 93.
[0068] The visual localizer 93 may define a coordinate space within the field of view of the visual localizer 93. The coordinate space of the visual localizer 93 may use the cameras of the visual localizer 93 as an origin and/or any other appropriate portion. The various visual tracking devices, 170, 180, 95 may, therefore, be tracked within the single coordinate space of the visual localizer 93.
[0069] As illustrated in Fig. 3, the EM localizer 94 may be positioned relative to the subject 30 and various portions thereof, such as the instrument 68. The instrument 68 may include the tracking device 66, which may be an EM tracking device. The instrument 68 may have a first portion 220 that is exterior to the subject 30 and a second portion 224 that is interior to the subject 30, such as extending through an incision 226. The incision 226 may be formed in the subject in any appropriate manner. The instrument 68 may be a selected instrument, such as an ablation device, a catheter, or the like. Nevertheless, the internal portion 224 may be positioned within the subject 30 and the EM tracking device 66 may be associated therewith. Therefore, the EM tracking system, including the EM localizer 94, may be used to track the pose of the EM tracking device 66 and the instrument 68, including the internal portion 224, therewith. The US Probe 33 may generate the imaging plane 129. However, as illustrated in Fig. 3, the US Probe 33 is not tracked with the EM localizer 94 in the EM tracking system and neither is the subject 30. The tracked pose of the instrument 68, therefore, may not be immediately determined with the visual localizer 93 and/or displayed with the display 84. Nevertheless, the EM localizer 94 may be tracked with the visual tracking device 170. The visual tracking device 170 tracked in the coordinate space of the visual localizer 93 allows for a registration or correlation of the coordinate space of the EM localizer 94 and the visual localizer 93. In other words, a transformation of the coordinate space or navigation space of the EM localizer 94 can be made to the coordinate space or navigation space of the visual localizer 93. The position of the visual tracking device 170 on the EM localizer 94 may be known by the navigation system 26. Therefore, the emitted fields from the EM localizer 94 and their respective points may be transformed to a coordinate space of the visual localizer 93. This transformation is based upon the known pose of the visual tracking device 170 associated with the EM localizer 94 and the visual localizer 93. Therefore, a tracked pose of the EM tracking device 66 associated with the instrument 68 may also be transformed through the coordinate space or navigation space of the visual localizer 93. The visual localizer 93 is also able to track the visual tracking device 180 associated with the US Probe 33. Therefore, the pose of the image plane 129 may also be known in the coordinate space of the visual localizer 93. Thus, the navigation system 26 may be able to execute instructions to transform a pose of the EM tracking device 66 into the coordinate space of the visual localizer 93. Thus, the pose of the imaging plane 129 in the visual localizer coordinate space may be transformed or registered to the pose of the EM tracking device 66 on the instrument 68. Thus, the image generated or reconstructed with the image data from the image plane 129 may be registered to the pose of the instrument 68. This may be particularly helpful if the instrument 68 is not within the image plane of the US Probe 33. For example, the instrument 68 may be a distance from the image plane 129, such as the instrument 68' illustrated in phantom. While the instrument 68' is positioned outside or away from the image plane 129, the display device 84 may display the image 108 and a relative pose of the instrument 68' even if it is not in the image plane 129, due to the registration of the coordinate space of the EM localizer 94 and the coordinate space of the visual localizer 93.
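As a non-limiting, hypothetical sketch of the coordinate-space chaining described above (the transform names are illustrative assumptions), the EM-tracked instrument pose may be carried through the visual localizer frame to the image plane:

```python
import numpy as np

def instrument_in_image_via_chain(T_vis_emloc, T_emloc_instr,
                                  T_vis_probe, T_probe_image):
    """Chain two tracking systems through the visual localizer frame.
      T_vis_emloc:   EM localizer pose from its visual tracking device
      T_emloc_instr: EM-tracked instrument pose in the EM localizer frame
      T_vis_probe:   US probe pose from its visual tracking device
      T_probe_image: image-plane calibration in the probe frame
    Returns the instrument pose expressed in the image-plane frame."""
    T_vis_instr = T_vis_emloc @ T_emloc_instr   # instrument in visual frame
    T_vis_image = T_vis_probe @ T_probe_image   # image plane in visual frame
    return np.linalg.inv(T_vis_image) @ T_vis_instr
```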
[0070] With continuing reference to Fig. 3 and additional reference to Fig. 4, the visual tracking system 93 has within its field of view 200 both the visual tracking device 170 associated with the EM localizer 94 and the visual tracking device 180 associated with the US Probe 33. The US Probe 33 emits or generates the image plane 129, which allows for the collection of image data and the generation or reconstruction of the image 127i on the display device 84. As illustrated in Fig. 4, however, the instrument 68 is not within the image plane 129. Nevertheless, the EM tracking device 66 is able to track the instrument 68 within the subject 30. Therefore, the tracked and determined pose of the instrument may be displayed as the graphical representation 68i. As discussed above, the EM localizer 94 may generate the field so that it allows for tracking the EM tracking device 66. The common coordinate space of the visual localizer 93, due to tracking the US Probe 33 with the visual tracking device 180 and the EM localizer 94 with the visual tracking device 170, allows the correlated tracked pose of the instrument 68 to be displayed on the display device 84 relative to the image 127i generated with the image data from the US Probe 33.
[0071] As disclosed above, the navigation system 26, such as via tracking of the tracking devices, may be able to track or determine a pose of various portions, such as the US probe 33 that generates the image plane 129 and the instrument 68, to allow for generation of a graphical representation 68i relative to a displayed image, such as the image 127i. When the instrument is generally on or in the same area as the image portion, the graphical representation may be superimposed on the image. However, as is also illustrated in Fig. 4, the graphical representation 68i may be displayed at a distance from the image that represents a distance of the instrument 68 away from the portion being imaged by the US probe 33 in the image plane 129. Therefore, the graphical representation may be displayed at a determined pose relative to the image 127i and displayed on the display device 84 even when not in a common space such that it would be superimposed on the image 127i.
[0072] Turning reference to Fig. 5, the EM localizer 94 may optionally not include the visual tracking device 170. The EM localizer 94, however, may generally emit an EM field, as disclosed above, to allow for tracking and navigation of various tracking devices. For example, the US probe 33 may include the tracking device 81 that may be an EM tracking device. The EM tracking device 81 may be associated with the US probe 33 in an appropriate manner, such as fixed to a surface thereof, enclosed within a housing thereof, embedded in a housing or structure thereof, or in any other appropriate manner. Nevertheless, the EM tracking device 81 may be provided to track a pose of the US probe 33 and, therefore, also the pose of the imaging plane 129. The US probe 33 may also include the visual tracking device 180, as discussed above. Therefore, at least two tracking devices that may be tracked in two different tracking modalities (e.g., EM and visual) may be associated with the US probe 33. Thus, both of the tracking systems may be used to track and determine the pose of the US probe 33 and its related imaging plane 129.
[0073] The instrument 68 may also include the EM tracking device 66, as disclosed above. Further, the EM tracking device 66 may be associated with an inserted portion 234 that is a portion of the instrument 68 within the subject 30. The inserted portion 234 of the instrument 68 may therefore not be viewable by the visual tracking system 93 and/or the user 72. Therefore, the EM tracking system with the EM localizer 94 may be used to track both the US probe 33 and the instrument 68 with the respective tracking devices 81, 66.
[0074] As disclosed above, however, various items or portions may generate distorting fields. For example, the US probe 33 may include a housing 250 or other structure, such as an internal structure or support, formed of a conducting material, and the conducting material may have fields induced therein. Therefore, the induced fields in the US probe 33 may generate distorting fields, also referred to as interfering fields, that may interfere with an accuracy of tracking of the various EM tracking devices, for example globally (e.g., across an entire navigation space). For example, the visual tracking system 93 may be able to track the various visual tracking devices 170, 180 relative to the subject in all areas within the field of view of the visual localizer 93. However, a registration of the EM localizer 94 relative to the visual localizer 93 may not be accurate for determining a pose of the US probe 33 within all of the areas of the field of view of the visual localizer 93 due to the local distortion.
[0075] As discussed further herein, when local distortion occurs, a determination may be made of whether a local accuracy and precision of the EM tracking device 81 is usable for locally tracking various elements, such as the tracking device 81 associated with the US probe 33 and/or the tracking device 66 associated with the instrument 68. The distortion may be a local distortion that allows for accurate tracking of the tracking devices within the local distortion even if the tracking is not globally accurate, such as within the navigation space of the visual tracking system 93.
[0076] Further, the distortion may be determined, and the visual tracking device 180 may be tracked during a check or revalidation with the visual localizer 93. If the visual tracking device 180 is tracked with the visual localizer 93, the global pose of the US probe 33 may then be reaffirmed and/or confirmed with at least a second tracking system. Therefore, the registration of the EM localizer 94 relative to the visual localizer 93 may be reaffirmed once the visual localizer 93 tracks the visual tracking device 180. In this instance, the effect of any distortion or interfering fields may be accounted for, and the locally tracked pose of the EM tracking device 81 may again be globally accurate. Therefore, including the two tracking devices 81, 180 with the US probe 33 may assist in overcoming or accounting for distortion that may distort the field sensed by the EM tracking device 81 and/or the EM tracking device 66.
[0077] Various distortions in the EM tracking field may be determined based upon various measurements. For example, a phase shift of the sensed EM field may be used to determine that a distortion or interfering field is occurring. For example, the time-changing fields emitted by the EM localizer 94 may have a known phase shift when being sensed. If the phase shift changes from the known or predetermined phase shift, a determination of distortion may occur. Further, the EM tracking device 81 may include a plurality of tracking portions, such as a plurality of conductive coils. The conductive coils have a known and predetermined geometry relative to one another. If the sensed relative geometry of the individual coils within the EM tracking device 81 changes, a determination of distortion may occur. At these times, the determined distortion may be used to estimate a navigation error or error limit and to evaluate whether a reconfirmation of a global tracking pose is useful or needed, such as with the visual tracking system 93. Alternatively or additionally, a locally precise tracking of the EM tracking device, such as the tracking device 81 and/or the tracking device 66, may be used for a selected or known local area or volume, which may also be referred to as a tracking constraint.
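By way of non-limiting illustration, a minimal sketch of such a distortion check is shown below (Python with NumPy assumed); the function name and tolerance values are hypothetical and would be characterized for a given system.

```python
import numpy as np

# Hypothetical tolerances; actual values would be characterized per system.
PHASE_SHIFT_TOLERANCE_DEG = 2.0
GEOMETRY_TOLERANCE_MM = 0.5

def distortion_detected(measured_phase_deg, expected_phase_deg,
                        measured_coil_positions, nominal_coil_positions):
    """Flag distortion if the sensed phase shift deviates from the known
    value, or if the sensed relative geometry of the sensor coils deviates
    from their predetermined (rigid) geometry."""
    if abs(measured_phase_deg - expected_phase_deg) > PHASE_SHIFT_TOLERANCE_DEG:
        return True
    # Compare pairwise coil distances to the nominal geometry.
    measured = np.asarray(measured_coil_positions, dtype=float)
    nominal = np.asarray(nominal_coil_positions, dtype=float)
    for i in range(len(nominal)):
        for j in range(i + 1, len(nominal)):
            d_meas = np.linalg.norm(measured[i] - measured[j])
            d_nom = np.linalg.norm(nominal[i] - nominal[j])
            if abs(d_meas - d_nom) > GEOMETRY_TOLERANCE_MM:
                return True
    return False
```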
[0078] Turning reference to Fig. 6, the US probe 33 may be associated with the visual tracking device 180 and the EM tracking device 81. Accordingly, the US probe 33 may be tracked with both of the EM localizer 94 and the visual localizer 93. Further, the instrument 68 may include the EM tracking device 66, particularly at the inserted portion 234 of the instrument 68. As discussed above, therefore, the instrument 68 may be tracked with the EM localizer 94. As also discussed above, the visual localizer 93 may be used to confirm or reconfirm a pose of the US probe 33 in the global space of the navigation space.
[0079] In various embodiments, however, the image plane 129 may also be used to reconfirm or calibrate the tracked pose of the tracking device 66 and/or the instrument 68. As illustrated in Fig. 6, the display device 84 may display the image 127i that is generated with the image data from the image plane 129. As discussed above, the image 127i may be a reconstruction, model, direct image data, or the like. In various embodiments, the image 127i may be a reconstruction, such as a three-dimensional model, based upon collected image data. Regardless, the image 127i may be generated based on the image plane 129.
[0080] As discussed above, the graphical representation 68i of the instrument 68 may also be displayed on the display device 84. When the instrument 68 is within the image plane, adjacent to it, or overlaying the portion being imaged, the graphical representation 68i may be displayed relative to and/or superimposed on a portion of the image 127i.
[0081] As displayed on the display device 84, if the instrument 68 is within the image plane 129, an image 68x of the instrument 68 may also be displayed. If the image 68x of the instrument is displayed, a position of the instrument 68 may be determined relative to the US probe 33 based upon the image data collected in the image plane 129 with the US probe 33.
[0082] According to various embodiments, the tracked pose of the US probe 33 may be undistorted whereas the tracked pose of the instrument 68 may be distorted, particularly if distortion occurs relative to the EM tracking device 66. However, the imaged portion 68x of the instrument may also be displayed with the image 127i due to the image plane 129. As discussed above, the image plane 129 may be registered or calibrated relative to the US probe 33. Therefore, the pose of the instrument 68 may be determined based upon the image 68x of the instrument 68. For example, a distance 260 may be a value, such as a transformation value including a translation, rotation, or other difference. The transformation value is the value between the image position 68x and the tracked pose of the instrument 68 illustrated by the position of the graphical representation 68i. The position of the graphical representation 68i may be based upon tracking of the tracking device 66. However, if the two do not overlay one another, an error may be determined. The pose of the instrument in the image at the image 68x may be determined by segmenting the image, according to appropriate systems, to identify the image portion 68x of the instrument 68. The transformation value 260 may be used to correct or update a global pose of the tracking device 66 related to the instrument 68. The transformation value 260 may, as disclosed above, be a translation, rotation, or other value.
[0083] Therefore, the image pose of the instrument 68, as collected in the image plane 129, may be determined and viewed with the display device 84. The image pose of the image portion 68x may be based on the segmentation disclosed above, such as identifying a portion of the image that matches a known shape of the instrument 68. The selected processor system may execute instructions to determine the transformation value 260. After the value is determined, the navigation system 26 may update the EM localizer 94 and related tracking system to correct for or determine a global pose of the EM tracking device 66.
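By way of non-limiting illustration, the transformation value 260 may be sketched as a pose difference between two rigid transforms, assuming both the image-derived pose and the tracked pose are expressed as 4x4 homogeneous matrices in a common navigation space; the function names below are hypothetical.

```python
import numpy as np

def transformation_value(T_image_instrument, T_tracked_instrument):
    """Pose difference (cf. transformation value 260) between the instrument
    pose segmented from the US image and the pose reported by the EM tracking
    device, both as 4x4 homogeneous transforms in the same space."""
    # T_delta maps the tracked pose onto the image-derived pose.
    T_delta = T_image_instrument @ np.linalg.inv(T_tracked_instrument)
    translation_mm = np.linalg.norm(T_delta[:3, 3])
    # Rotation angle recovered from the trace of the rotation block.
    cos_angle = (np.trace(T_delta[:3, :3]) - 1.0) / 2.0
    rotation_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return T_delta, translation_mm, rotation_deg

def correct_global_pose(T_tracked, T_delta):
    """Apply the correction so subsequent tracked poses align with the
    image-derived pose."""
    return T_delta @ T_tracked
```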
[0084] According to various embodiments, the tracked pose of the instrument 68 may be undistorted whereas the tracked pose of the US probe 33 may be distorted, particularly if distortion occurs relative to the EM tracking device 81. However, the imaged portion 68x of the instrument may also be displayed with the image 127i due to the image plane 129. Here the system may determine a transformation value 260 and may correct for or determine a global pose of the EM tracking device 81.
[0085] Therefore, the image data collected with the US probe 33 may also be used to update or confirm a global pose of one or more of the tracking devices. For example, as discussed above, the instrument 68 may be imaged with the US probe 33 and the image plane 129. The US probe 33 may be tracked with the visual tracking device 180 and/or the EM tracking device 81. Therefore, the pose of the image 68x of the instrument 68 within the image plane 129, which is actual image data collected with the US probe 33, may be displayed. The image 68x of the instrument may be determined such as by segmenting the image data acquired with the US probe 33, or any appropriate imaging system. The pose of the image portion 68x may, therefore, be determined within the image data and relative to the imaging system, such as the US probe 33, as discussed above (e.g., by calibration of the image plane 129 relative to the US probe 33 and/or a tracking device associated therewith). This may be displayed relative to the tracked pose of the instrument as displayed or represented by the graphical representation 68i. The difference may be the transformation value 260.
[0086] According to various embodiments, the transformation value is a pose difference between the image pose 68x of the instrument and the tracked pose 68i of the instrument. The navigation system may compare the transformation value 260 with estimated navigation errors or error limits and relevant navigation constraints. The system may then notify the user of differences and/or request a correction of registration. Further, the value 260 may be useful to check that the image plane is placed in the correct position. Alternatively or additionally, the value 260 could be added to the transformation for the image plane to correct its pose. A maximum or threshold value could be set for the transformation 260 to improve accuracy.
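By way of non-limiting illustration, a comparison of the transformation value against selected error limits might be sketched as follows; the limit values and the chosen actions are illustrative assumptions rather than disclosed behavior.

```python
def evaluate_transformation(translation_mm, rotation_deg,
                            max_translation_mm=2.0, max_rotation_deg=2.0):
    """Compare the transformation value 260 against selected error limits
    (the thresholds here are illustrative) and return a suggested action."""
    if translation_mm <= max_translation_mm and rotation_deg <= max_rotation_deg:
        return "ok"                    # within limits; no action needed
    if translation_mm <= 2 * max_translation_mm:
        return "correct_image_plane"   # add value 260 to the image-plane transform
    return "notify_user"               # request a correction of registration
```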
[0088] Turning reference to Fig. 7, the one or more tracking systems, including the EM localizer 94 and the visual localizer 93, may be in the surgical or operating theater, as discussed above. Further, the US probe 33 may include one or more of the tracking devices, such as the visual tracking device 180 and the EM tracking device 81. Further, the instrument 68 may be used to assist in performing a procedure on the subject 30. The instrument 68 may include the EM tracking device 66 on the inserted portion 234 that is inserted within the subject 30. Thus, precise tracking of the instrument may be performed with the EM localizer 94. As disclosed above, however, various distortions may occur.
[0089] In addition to the tracking devices associated with the various instruments, such as the US probe 33 and/or the instrument 68, a tracking assembly 262, also referred to as a DRF, may be associated with the subject 30. Similar to the DRF 58 discussed above, the DRF 262 may be associated with the subject 30 in a selected manner. For example, the DRF 262 may be adhered to the subject 30 at a selected location, such as near the heart 127. Therefore, portions of the subject 30 may be tracked by one or more of the tracking systems near or adjacent to where the image data is collected with the US probe 33.
[0090] The DRF 262 may include one or a plurality of tracking devices. For example, the DRF 262 may include a visual tracking device 264 and an EM tracking device 268. It is understood that only one of the two tracking devices, or any appropriate number of tracking devices, may be associated with the DRF 262. However, the inclusion of two or more tracking devices allows the DRF 262 to be tracked with two or more tracking systems. The tracking with the two or more tracking systems may be substantially simultaneous, according to various embodiments, when at least the two tracking devices 264, 268 are associated with the single DRF 262.
[0091] Further, the inclusion of the two or more tracking devices 264, 268 with the single DRF 262 may lead to various advantages and efficiencies. The two tracking devices 264, 268 may be at known geometries relative to one another and/or at known geometries relative to a point defined by the DRF 262. For example, the DRF 262 may include a divot or registration mark or point 272. In various embodiments, the instrument 68 may be registered within the navigation spaces of the EM localizer 94 and the visual localizer 93 by associating or touching a portion of the instrument 68 to the registration point 272, which is at a known geometry relative to the two tracking devices 264, 268.
[0092] Further, the known or selected single point relative to the DRF 262 may be used as a registration point to correlate or register the navigation space of the EM localizer 94 and the navigation space of the visual localizer 93. The registration may be a single registration or may be updated at an appropriate time. For example, the visual localizer 93 may move or be moved, and the viewing of the visual tracking device 264 by the visual localizer 93 may be used to confirm or determine a pose of the DRF 262. The pose of the EM tracking device 268 relative to the visual tracking device 264 may be known (e.g., predetermined and stored in a memory system that is recalled and/or measured by the user 72) to allow for a determination of a common tracked point or pose in the EM localizer 94 navigation space. Thus, registration between two or more tracking systems in the navigation system 26 may be performed with the DRF 262, including the two or more tracking devices.
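For illustration only, this registration may be sketched under the convention that T_a_b maps coordinates from frame b to frame a; the frame names and function below are hypothetical (Python with NumPy assumed).

```python
import numpy as np

def register_em_to_visual(T_visual_world_vis264, T_em_world_em268, T_vis264_em268):
    """Register the EM navigation space to the visual navigation space using
    a DRF carrying a visual tracking device (264) and an EM tracking device
    (268) at a known relative geometry T_vis264_em268."""
    # Pose of the EM device in the visual world via the known rigid offset.
    T_visual_world_em268 = T_visual_world_vis264 @ T_vis264_em268
    # Resulting registration maps EM-world coordinates to visual-world coordinates.
    return T_visual_world_em268 @ np.linalg.inv(T_em_world_em268)
```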
[0093] Further, the DRF 262 associated with the subject 30 may be used to assist in determining or confirming a pose of the US probe 33 and/or the instrument 68 relative to a known point on the subject 30. According to various embodiments, the DRF 262 may be positioned at a known or predetermined pose. For example, the DRF 262 may be positioned at a known and determinable or recognizable feature, such as an anatomical feature, including a suprasternal notch, xiphoid process, or other selected point. The known point 276 may be used to assist in calibrating and/or correlating the various spaces. For example, the US probe 33 may be moved to image the predetermined or known point 276 in the subject 30. The position of the point 276 may be viewed in an image on the display device, such as the image 280. The selected portion 276 may be identified in the image 280, such as at 276i in the image 280. Therefore, the tracked pose of the US probe 33, such as with the visual tracking device 180 and/or the EM tracking device 81, may be calibrated relative to the imaged pose of the selected portion 276 of the subject 30 that is viewable or identified in the image 280.
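Purely as an illustrative sketch, such a landmark calibration check might compute a residual between the known point 276 and its imaged counterpart 276i; the names below are hypothetical, and the image-plane calibration T_probe_image is assumed to be given.

```python
import numpy as np

def landmark_residual(p_landmark_world, T_world_probe, T_probe_image, p_landmark_image):
    """Residual between a known subject landmark (e.g., point 276, in
    world/navigation coordinates) and the same landmark identified in the US
    image (276i, in image-plane coordinates). A small residual tends to
    confirm the probe's tracked pose and the image-plane calibration."""
    p_h = np.append(np.asarray(p_landmark_image, dtype=float), 1.0)
    p_from_image = (T_world_probe @ T_probe_image @ p_h)[:3]
    return float(np.linalg.norm(p_from_image - np.asarray(p_landmark_world)))
```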
[0094] Therefore, as discussed above, the various tracking devices that are associated with one another, including with single portions, may be used to assist in correlating various navigation spaces. Further, the different tracking devices may include different features and/or advantages relative to one another to allow for confirmation, precision, update speed, and the like. As discussed further herein, therefore, the various tracking devices may be tracked with two or more tracking systems to allow for performing a procedure on the subject 30 with a selected confidence and/or selected efficiency, including a precise and accurate navigation of an instrument, including two or more instruments, during the selected procedure.
[0095] As discussed above, various systems may be used to track different items or portions. According to various embodiments, the visual tracking system 93 may be used to track one or more items and the EM localizer 94 may be used to track one or more items. According to various embodiments, a single item, such as the US probe 33, may include tracking devices 180, 81 for use with both of the tracking systems, respectively.
[0096] Further, however, it is understood that other tracking systems may also be used, such as optical tracking systems, which may use cameras to track one or more objects that may be identified with a selected wavelength (e.g., infrared (IR)), inertial tracking systems, or other appropriate tracking systems. The various systems may be used to allow for correction and/or determination of an accurate or trustworthy pose of one or more items.
[0097] Selected configurations may allow for registration between a plurality of the tracking systems. The various systems, as discussed above, and their respective operation modalities and/or configurations may be used to track and determine a pose of a selected object or item. Selected objects or items may include one or more of an instrument, including the instrument 68 and/or the US probe 33. Discussion herein of an object or item may be understood to refer to any appropriate tracked portion. The tracked object or item need not be a specific item, according to various embodiments.
[0098] A process 300 for navigation of an object is illustrated in Fig. 8 and may be used to evaluate an output to determine the pose of one or more items. With initial reference to Fig. 8, the process 300 may begin in START block 310. The process 300 may then receive first tracking data from a first tracking system in block 314 and receive second tracking data from a second tracking system in block 318. The received tracking datum or data may be a first instance of tracking data, such as at a first time. As discussed herein, the process 300 may be iterated, such that the tracking data received in blocks 314, 318 may become a second instance when compared to the first instance. Thus, the tracking data may be collected over time (e.g., of a procedure) and/or movement (e.g., of the object being tracked). As discussed further herein, various systems may be used to evaluate selected tracking data and will be described in greater detail herein.
[0099] The two received tracking data may be from two tracking systems that operate in different modalities, or in the same modality but separated from one another. The received first tracking data may be any appropriate first tracking data, such as tracking data received from the visual localizer 93, an optical localizer 88, or other appropriate tracking system. According to various embodiments, the first tracking system may also include an EM or inertial tracking system. The second tracking data received in block 318 may be any appropriate tracking data such as EM tracking data, inertial tracking data, or the like. Further, according to various embodiments, the second tracking data may also include, or include only, visual tracking data or the like.
[00100] Nevertheless, the process 300 may include receiving or acquiring at least first and second tracking data that may be from different tracking systems or from two tracking systems. According to various embodiments, the two tracking systems may be identical tracking systems and/or alternative tracking systems. Particularly when alternative tracking systems are used, a determination or validation or weighting of each of the tracking systems relative to one another may be made. This may assist in determining a trustworthiness or accuracy of one or more of the tracking data received.
[00101] As discussed above, each of the different types of tracking data or tracking systems may have advantages and/or disadvantages relative to a selected application. For example, the visual or optical tracking systems may require a line of sight, which may be blocked when a user blocks the line of sight, the subject 30 blocks the line of sight, an object in an operating theater blocks the line of sight, or other opaque items are moved into an area; further, a brightness of a selected area may not be sufficient for accurate viewing by one or more of the tracking systems. An EM tracking system may have distortion created by interfering objects or fields relative to a tracked item. The interfering fields may be identified as disclosed above, such as due to a change in tracked geometry of a tracking device having two or more tracking portions and/or a determined or identified phase shift, or the like. The EM localizer 94, therefore, may be untrustworthy or inaccurate for a selected area or volume, or outside of a selected (e.g., local) area or volume. An inertial tracking system may be accurate or precise for a selected time, but may drift over a period of time. For example, an inertial tracking system may be accurate and precise globally, such as after registration, for a selected period of time including fractions of a second to minutes, such as about 5 seconds, about 10 seconds, about 30 seconds, or any appropriate time within a selected range such as about 1 second to about 10 seconds, including the various values within the range. Therefore, reference between the first tracking data from block 314 and the second tracking data from block 318 may be useful for determining a trustworthiness or accuracy of a selected determined pose.
[00102] Accordingly, a determination of an accuracy or trustworthiness of the first tracking data may be made in block 322. Similarly, a determination of an accuracy or trustworthiness of the second tracking data may be made in block 326. The determination of the accuracy or the trustworthiness of the first and second tracking data in the respective blocks 322, 326 may be based upon the respective tracking systems. For example, a determination of a blocking of a line of sight of the visual tracking system may be used to determine the accuracy or trustworthiness of the first tracking data in block 322.
[00103] The determination of a blocking of a line of sight may include an update rate of a viewing or determination of a tracked visual tracking device, such as the visual tracking device 180. For example, the tracking system may be configured or selected to update the viewing or determination of a pose of the visual tracking device 180 at a selected rate, such as 10 times a second, 100 times a second, or any appropriate rate. A determination of accuracy may be based upon the rate of updating, such as whether a certain number of recent updates have been made or have not been able to be made due to a lack of viewing of the tracking device.
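By way of non-limiting illustration, update-rate-based trustworthiness might be tracked with a small monitor such as the following sketch; the window length, class name, and expected rate are illustrative assumptions.

```python
class LineOfSightMonitor:
    """Estimate visual-tracking trustworthiness from the rate of successful
    pose updates; a run of missed updates suggests a blocked line of sight."""

    def __init__(self, window=20):
        self.window = window      # number of recent update attempts considered
        self.successes = []

    def record(self, pose_seen: bool):
        """Record whether the tracking device was seen on this update attempt."""
        self.successes.append(pose_seen)
        self.successes = self.successes[-self.window:]

    def accuracy_score(self) -> float:
        """Fraction of recent update attempts that saw the tracking device."""
        if not self.successes:
            return 0.0
        return sum(self.successes) / len(self.successes)
```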
[00104] Further determinations of accuracy, and/or of whether the tracking device is absent from a field of view, may include various parameters, including those noted above and herein. According to various examples, visual pattern matching accuracy may scale with the number of matched pattern corners, so that partially blocked patterns may be identified and may reduce accuracy. A visual pattern may be any appropriate pattern, such as a checkerboard design. According to various examples, visual pattern matching accuracy may scale with the pattern viewing angle, so that large viewing angles (e.g., greater than a selected threshold, such as greater than about 30 degrees) may reduce accuracy. According to various examples, optical tracker matching accuracy may scale with the number of matched markers, so that complete or partial blocking of a selected number of markers out of a set or predetermined number may reduce accuracy. According to various examples, optical marker matching accuracy may scale with the shape of matched markers, so that a partially blocked marker may reduce accuracy. A sketch of such scoring appears after the following paragraphs.
[00105] The determination of accuracy or trustworthiness of the second tracking data may be made in a similar manner and/or in a different manner. For example, as disclosed above, in the EM tracking system, if a phase shift is determined to be greater than a selected amount, a determination of accuracy or trustworthiness may be made based thereon. Further, distortion of a known geometry of the EM tracking device may also be used. According to various embodiments, EM sensor signal matching accuracy may scale with distortions, so that distortions causing signal error may reduce accuracy.
[00106] The inertial tracking system may have a known or determinable drift. Therefore, inaccuracy of the inertial tracking system may be based upon any elapsed time from a prior confirmation or registration of the inertial tracking system.
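By way of non-limiting illustration, the scoring sketch referenced above might scale accuracy with matched corners, viewing angle, and matched markers as follows; the specific scaling functions are assumptions rather than a disclosed method.

```python
def visual_pattern_accuracy(matched_corners, total_corners,
                            viewing_angle_deg, max_angle_deg=30.0):
    """Scale accuracy with the fraction of matched pattern corners and
    penalize viewing angles beyond a selected threshold."""
    corner_score = matched_corners / total_corners
    if viewing_angle_deg <= max_angle_deg:
        angle_penalty = 1.0
    else:
        angle_penalty = max(0.0, 1.0 - (viewing_angle_deg - max_angle_deg) / max_angle_deg)
    return corner_score * angle_penalty

def optical_marker_accuracy(matched_markers, expected_markers):
    """Scale accuracy with the number of fully matched optical markers."""
    return matched_markers / expected_markers
```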
[00107] After the determined accuracy or trustworthiness of both the first tracking data and the second tracking data have been made in the respective blocks 322, 326, a determination may be made in block 330 of whether the accuracy or trustworthiness of both the first tracking data and the second tracking data is greater than a threshold. If a determination is made that either or both accuracies or trustworthiness are below a selected threshold, a NO path 334 may be followed to a process block 338, as discussed further herein. The threshold accuracy or trustworthiness may be any appropriate factor or constraint, for example, whether accuracy is within 0.1 mm, 1 mm, 2 mm, or another selected distance. Thresholds may also be a time that has passed since a calibration or validation. Thresholds may be based on a selected required or desired accuracy.
[00108] If the accuracies are determined to be greater than a selected threshold, a YES path 350 may be followed. The threshold accuracies may be determined relative to the respective tracking systems, the respective robustness of the different tracking devices, or the like. For example, the trustworthiness of the visual tracking system may be based upon the number of updates and the most recent update of the tracked pose of the respective tracking device, such as the visual tracking device 180. The accuracy or threshold for the EM tracking system may be determined based upon an unexpected deviation or change of phase and/or determined geometry.
[00109] Nevertheless, the threshold may be compared to the determined accuracies, and if the determined accuracies are greater than the threshold, the YES path 350 may be followed. Thereafter, a recalling of weights for both the first tracking data and the second tracking data may be made in block 354. The recalling of the weights in block 354 may be optional, as may be the application of the recalled weights in block 356. According to various embodiments, the weights for each of the received first and second tracking data from blocks 314 and 318 may range between 0% and 100% and any portion thereof. For example, the weight for the first tracking data may be 10% and the weight for the second tracking data may be 90%. The weights may also include a 0% weight for either the first or second tracking data and a 100% weight for the other of the first and second tracking data.
[00110] The determined weights may be based upon and/or changed based on the determined accuracies in blocks 322 and 326, respectively. For example, a higher determined accuracy for the second tracking data may increase the weight applied thereto. Similarly, a higher determined accuracy or trustworthiness for the first tracking data may also result in a higher recalled weight applied thereto. According to various embodiments, a standard or nominal weight may be applied to the first and second tracking data, such as 50% each, 40% for the first tracking data and 60% for the second tracking data, 60% for the first tracking data and 40% for the second tracking data, or any appropriate nominal weighting. The recalled weights may be predetermined based upon the possible accuracies and trustworthiness. Accordingly, the recalled weights may be stored in a lookup table and recalled based upon the determined accuracies and trustworthiness of the respective first and second tracking data in blocks 322, 326.
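By way of non-limiting illustration, such a lookup table might be organized as follows; the accuracy classes and weight values below are hypothetical.

```python
# Illustrative lookup table mapping coarse accuracy classes of the first
# (e.g., visual) and second (e.g., EM) tracking data to recalled weights.
WEIGHT_TABLE = {
    ("high", "high"): (0.5, 0.5),   # nominal weighting
    ("high", "low"):  (0.9, 0.1),
    ("low", "high"):  (0.1, 0.9),
    ("high", "none"): (1.0, 0.0),
    ("none", "high"): (0.0, 1.0),
}

def recall_weights(first_class: str, second_class: str):
    """Recall predetermined weights (cf. block 354), falling back to a
    nominal 50/50 weighting for unlisted combinations."""
    return WEIGHT_TABLE.get((first_class, second_class), (0.5, 0.5))
```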
[00111] After determining the weights, they may be applied in block 356, optionally, as discussed above. Therefore, once the weights are determined in block 354, they may be applied in block 356. As discussed above, the weights may include zero weight for at least one of the first or second tracking data and/or may include a 50/50 weight. Therefore, whether the weights are recalled and applied or not, a determination of a pose of the tracked object may be made in block 360.
[00112] A determination of a pose of the tracked object may be based upon the received first and second tracking data from blocks 314, 318 and/or only one of the two. For example, the system may determine that the trustworthiness or accuracy of either or both of the first or second tracking data is greater than a selected threshold and use only one of the two tracking systems. According to various embodiments, however, weights may be applied to allow for a determined pose to be based on a combination of the first and second tracking data at all times. Thus, the determined pose of the tracked object made in block 360 may be based upon the received first and second tracking data from blocks 314, 318, as discussed above.
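By way of non-limiting illustration, a weighted combination in block 360 might blend the positions reported by the two systems as sketched below; a full implementation would also blend orientation (e.g., by quaternion interpolation), which is omitted here for brevity.

```python
import numpy as np

def fuse_tracked_position(position_1, position_2, w1, w2):
    """Blend the object positions reported by the first and second tracking
    systems using the recalled weights; weights need not sum to one and are
    normalized here."""
    p1 = np.asarray(position_1, dtype=float)
    p2 = np.asarray(position_2, dtype=float)
    return (w1 * p1 + w2 * p2) / (w1 + w2)
```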
[00113] The determined pose may then be output in block 364. The output may include a display, an output of instructions to move the robotic arm 20, instructions for the user to move the instrument 68, or any other appropriate output. The output of block 364 may be based upon the determined pose of the tracked object in block 360. According to various embodiments, the determined pose may be displayed on the display device 84 as the graphical representation 68i. The graphical representation may illustrate the tracked pose of the instrument 68 for viewing by the user 72. According to various embodiments, the determined pose of the object may also be used to control movement with the robotic arm. Controlling movement may include providing output to the robotic arm system processor, as discussed above, to move the robotic arm end effector 44 to a selected pose based upon the current tracked pose.
[00114] The process 300, after outputting the determined pose in block 364, may then either receive an input or determine whether a procedure has ended in block 370. If the procedure has not ended a NO path 374 may be followed to restart the process 300. Accordingly, the process 300 may be iterative to continually update the tracked pose or receive data of a tracked pose of an object, such as the instrument 68.
[00115] If the procedure is determined to have ended in block 370, a YES path 380 may be followed to end the process 300 in block 384. The process 300, therefore, may cease tracking or cease receiving information regarding the tracked pose of the instrument or object in block 384.
[00116] Thus, the tracked pose of an object, such as the instrument 68, may be made by receiving tracking data from at least two tracking systems in blocks 314 and 318, respectively, to output a determined pose of the object in block 364, as discussed above. Therefore, the tracking of instruments may be based upon tracking data of more than one tracking system, as discussed above.
[00117] As discussed above, the process 300 may be used to output a determined pose of an object. According to various embodiments, however, as discussed above, a determination of an accuracy or trustworthiness in a global sense of the first and second tracking data may be made in block 330, and if the accuracy does not reach a threshold, the NO path will be followed to block 338.
[00118] Block 338 may be part of a process 400 illustrated in Fig. 9. The process 400 may allow for tracking of an object, such as the instrument 68, when an accuracy in a global sense is determined to be below a selected threshold. Generally, a lack of accuracy in the global sense may be the determination that a tracked pose of an object in an entire navigation space is not accurate enough to perform a selected procedure. However, accuracy may be determined to be appropriate at least within selected constraints, as discussed further herein.
[00119] According to the process 400, the determination of whether both the first and second tracking data are below a threshold may be made in block 338. If a determination is made that both are not below a threshold, a NO path 410 may be followed to a determination block of whether the first tracking data is below a threshold in block 414.
[00120] If it is determined that the first tracking data is not below a threshold, a NO path 418 may be followed to correct or update the second tracking data with the first tracking data in block 422. In correcting or updating the second tracking data with the first tracking data in block 422, a determined pose of the object with the first tracking data may be used to define a point or pose within the navigation space of the second tracking data, or second tracking navigation space. The accurate or trustworthy first tracking data may be used to update or define a current pose of the tracked object in the second tracking navigation space to allow for the continued tracking of the object with the second tracking data based upon the updated or corrected pose. Therefore, both of the first and second tracking systems may continue to be used once the second tracking data is updated with the first tracking data in block 422. According to various embodiments, the correction or updating of the second tracking data (e.g., to generate or output updated second tracking data) with the first tracking data in block 422 occurs only when the first tracking data is not below the threshold. In other words, the first tracking data has met the threshold needed to update or correct the second tracking data.
[00121] Further, the corrected and/or updated second tracking data may be used as an input or to reinitiate the process 300. Therefore, the second tracking data corrected or updated with the first tracking data may allow for a continued receiving of the first and second tracking data in blocks 314, 318, as discussed above. Thus, the process 400 may allow for a continued tracking of an object with both the first and second tracking systems once the second tracking data is corrected or updated with the first tracking data.
[00122] If both the first and second tracking data are determined to be below a threshold in block 338, and/or if the first tracking data is determined to be below a threshold in block 414, the YES path 440 may be followed. In following the YES path 440, a recall of the last accurate or trustworthy first tracking data may be made in block 444. The last accurate or trustworthy tracking data from the first tracking system may be a last tracked pose of a visual tracking device, such as the visual tracking device 180, that meets a selected update rate or other parameter. Other parameters may include pattern matching metrics that may include fit errors as percentages, lengths, or areas. The last accurate or trustworthy first tracking data may be recalled from a temporary storage of first tracking data, a positioning of the tracked object in a selected location, or the like. A sketch of this branching appears below.
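By way of non-limiting illustration, the branching sketch referenced above might read as follows; the argument and return structures are hypothetical.

```python
def handle_low_accuracy(first_ok, first_pose, last_trusted_first_pose):
    """Branching of process 400 when global accuracy falls below threshold
    (NO path 334): either correct the second tracking data with trustworthy
    first tracking data (block 422) or recall the last trusted first pose as
    a local reference or origin (blocks 444, 448)."""
    if first_ok:
        # Block 422: redefine the object's current pose in the second
        # tracking space using the first system's trusted pose, then
        # continue with both systems (reinitiating process 300).
        return {"mode": "continue_both", "corrected_pose": first_pose}
    # Blocks 444/448: set the last trusted pose as the local origin and
    # continue constrained tracking with the second system only.
    return {"mode": "constrained_second_only", "origin": last_trusted_first_pose}
```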
[00123] Regardless of the process, the recalled last accurate or trustworthy first tracking data may be set as a reference or origin point for the second tracking system in block 448. The setting of the recalled last accurate or trusted first tracking data as a reference or origin point may include identifying a point relative to a tracked portion of a navigation space, such as the subject 30, the DRF 262, etc. For example, the last accurate or trustworthy first tracking data may be a point within the subject, such as at a portion of the heart 127. The identified point or portion of the subject may be identified or used as the reference point, and the object may be repositioned to the last recalled accurate or trustworthy point. This allows the tracked object, such as the instrument 68, to have an accurate or precise local origin or initial point.
[00124] As discussed above, the EM tracking system may encounter distortion for various reasons. Nevertheless, a distorted EM field may be used for precise and accurate tracking under selected constraints, such as within a constrained volume. Further, as discussed above, an inertial tracking system may have a drift over a selected period of time. However, within a constrained period of time, the local movement or pose of the inertially tracked object may be precisely and accurately determined. Therefore, the identification of an origin or reference point in block 448 may allow for the definition of constraints for tracking with a second tracking system in block 452.
[00125] Regarding the above, the instrument 68, including the inserted portion 234 that is within the subject 30, may be tracked with only the EM localizer 94. Therefore, the navigation may need to be stopped if a distortion occurs and the EM tracking device 66 cannot be trusted. However, if a local point can be determined, such as being set in block 448, a determination of constraints for tracking with a second tracking system may be made in block 452. Constraints may include a selected area or volume or a maximum amount of movement from the set reference point. For example, the constraint may include tracking allowed only within 1 centimeter, 2 centimeters, 5 centimeters, 10 centimeters, or any value from the reference point.
[00126] The constraints may be determined in block 452. The constraints may be based upon a sensed amount of distortion, recalled parameters based upon a type or amount of distortion that is sensed, or the like. According to various examples, the constraints may depend on the amount of distortion in the volume around the reference point; if the distortion is not above a selected threshold (e.g., not too high), a larger allowable area around the reference point may be used, or the user may be able to set the constraint if, for their application, tracking is acceptable near the reference point. According to various examples, a time constraint may be determined by time-integrating characterized accelerometer uncertainties to specified distance constraints. According to various examples, a pose constraint may be determined by correlating measured geometry errors and their pose derivatives to previously characterized pose errors and specified distance or angle constraints.
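By way of non-limiting illustration, one reading of the time-integration example is that if the characterized accelerometer uncertainty is treated as a worst-case constant acceleration error a, the position drift grows as 0.5*a*t^2, so the allowed tracking time for a distance constraint d is t = sqrt(2*d/a). This drift model is an illustrative assumption; a minimal sketch follows.

```python
import math

def time_constraint_from_drift(distance_constraint_m, accel_uncertainty_ms2):
    """Determine a time constraint (cf. block 452) by integrating a
    characterized accelerometer uncertainty treated as a worst-case
    constant acceleration error: drift = 0.5 * a * t**2, so tracking is
    allowed until the drift reaches the distance constraint."""
    return math.sqrt(2.0 * distance_constraint_m / accel_uncertainty_ms2)

# Example: a 2 mm constraint with a 0.02 m/s^2 uncertainty allows ~0.45 s.
t_max = time_constraint_from_drift(0.002, 0.02)
```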
[00127] After the determination of constraints in block 452, tracking of the object with the second tracking system may occur in block 456. As disclosed above, the instrument 68 may only be tracked with the EM tracking system, and therefore even local tracking with only the second tracking system may occur according to the process 400. A determination may be made in block 460 of whether tracking is continuing within the constraints. As discussed above, the constraints may include a time, a volume of movement, only one or both of these, or another appropriate constraint. If the object is determined to be within the constraint, such as within a maximum distance of movement, a determination that the tracking is maintained within the constraints may be made and a YES path 464 may be followed to continue tracking the object with the second tracking system.
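By way of non-limiting illustration, the constraint check might be sketched as follows; the constraint values and function name are illustrative.

```python
import numpy as np

def within_constraints(current_position, origin, elapsed_s,
                       max_distance_m=0.05, max_time_s=None):
    """Decide whether tracking remains within the determined constraints:
    a maximum movement from the reference point (e.g., 5 cm here) and,
    optionally, a maximum elapsed time. Exceeding either constraint would
    correspond to following the NO path 470."""
    moved = np.linalg.norm(np.asarray(current_position, dtype=float)
                           - np.asarray(origin, dtype=float))
    if moved > max_distance_m:
        return False
    if max_time_s is not None and elapsed_s > max_time_s:
        return False
    return True
```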
[00128] If a determination is made that the tracking is not within the constraints, in other words has exceeded the constraints (e.g., in time or distance), a NO path 470 may be followed. For example, as disclosed above, a determination may be made that the object has moved a distance greater than the constraint, such as greater than 10 centimeters, and therefore the NO path 470 may be followed to stop tracking in block 474. In stopping tracking, the pose of the instrument may be removed from the display 84, a visual warning may be provided to the user, an audible warning may be provided to the user, a haptic or touch feedback may be provided to the user, or any appropriate feedback and/or combination of feedback may be provided. According to various embodiments, for example, the graphical representation of the instrument may be removed from the display device 84. Further, the tracking system may no longer provide instructions for movement of a selected instrument, such as maintaining or holding the robotic arm 40 in a selected pose and not allowing or not performing instructions for movement.
[00129] After stopping tracking in block 474, a determination of whether tracking is selected may be made in block 478. If no tracking is selected, a NO path 480 may be followed to end the process in END block 484. In ending the process 400 in block 484, navigation may be ended and a procedure may be performed or finished without tracking and/or a procedure may be ended.
[00130] If tracking is selected in block 478, a YES path 488 may be followed to reinitialize tracking in block 492 in an appropriate manner, such as instructing a user to move an instrument, move a localizer, reset or reinitialize the tracking systems, reregister the tracking systems, or other appropriate instruction or movement. Restarting tracking, however, may require further input from a user and/or system to maintain or continue tracking. Nevertheless, once reinitialized, the process may also use the input or reinitialization from block 492 to reset or reinitialize the process 300 as illustrated in Fig. 8.
[00131] According to various embodiments, therefore, the navigation system 26 may be used to navigate or track an object with two or more tracking systems to allow for a continuous and accurate or trustworthy tracking of an object. Using the two or more tracking systems, a pose of an object may be determined and a tracked procedure may continue or be maintained even if one or more tracking systems is determined to be untrustworthy, according to the processes and systems discussed above. Further, the processes, such as the process or method 300, 400, may be carried out substantially automatically with the processor system, as disclosed above. Therefore, the processes 300, 400 may be used by the navigation system 26 and the various processors thereof, and/or accessed thereby, to allow for navigation or tracking of a selected object.
[00132] Example embodiments are provided so that this disclosure will be thorough and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth, such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms, and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
[00133] Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. The term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules. The term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above. The term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules. The term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
[00134] The apparatuses and methods described in this application may be partially or fully implemented by a processor (also referred to as a processor module) that may include a special purpose computer (i.e., created by configuring a processor) and/or a general purpose computer to execute one or more particular functions embodied in computer programs. The computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may include a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services and applications, etc.
[00135] The computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler; (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language), etc. As examples only, source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, ASP (active server pages), Perl, JavaScript®, HTML5, Ada, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
[00136] Communications may include wireless communications described in the present disclosure, which can be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008. In various implementations, IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
[00137] A processor, processor module, module or ‘controller’ may be used interchangeably herein (unless specifically disclosed otherwise) and each may be replaced with the term ‘circuit.’ Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
[00138] Instructions may be executed by one or more processors or processor modules, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” or “processor module” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements. The processor or processors may operate entirely automatically and/or substantially automatically. In automatic operation, the processor may execute instructions based on received input and execute further instructions in light thereof. Thus, various outputs may be made without further or any manual (e.g., user) input.
[00139] The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the invention, and all such modifications are intended to be included within the scope of the invention.