CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application No. 62/595,822, filed on Dec. 7, 2017. This application includes subject matter similar to that disclosed in U.S. patent application Ser. No. ______, filed concurrently herewith (Attorney Docket No. 5074X-000041). The entire disclosures of each of the above applications are incorporated herein by reference.
FIELD
The subject disclosure relates to performing a procedure, and particularly to registering an image space to a real or subject space.
BACKGROUND
During a selected procedure, a user may acquire images of a subject that are based upon image data acquired of the subject. Generally, the image data may be acquired using various imaging techniques or systems, and the image data may be reconstructed for viewing by the user on a display device, such as a flat panel or flat screen, cathode ray tube, or the like, that is positioned away from a region of operation. The region of operation may be relative to a subject, such as a human patient, for performing a selected procedure. For example, a sinus procedure may be performed and images of a subject's sinuses may be displayed on a display device that does not overlay the subject.
The procedure may further include directly viewing at least a portion of a region of interest or operation, such as with an endoscope. An endoscope may position a camera at a selected location, such as within a nasal or an accessible sinus cavity. An endoscope may have limited range of movement and/or field of view at various locations within selected anatomical areas.
SUMMARY
This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
In various embodiments, an instrument may be positioned relative to a portion of a subject for performing a procedure. A subject may include a living subject or a non-living subject. In various embodiments, a living subject may include a human and the procedure may be performed relative to a nasal passage and/or sinus cavity. For example, a balloon sinus dilation procedure may occur, such as one performed with a NuVent® EM Balloon Sinus Dilation System, sold by Medtronic, Inc. having a place of business in Minnesota. It is understood that the dilation of a sinus need not be performed with an electromagnetic (EM) navigated instrument; however, the dilation of sinuses with an inflatable instrument may be performed with instruments that include various inflation and expansion features, such as those of the NuVent® sinus dilation system.
In various embodiments, a procedure may occur in a region of operation of a subject. The region of operation may be a specific or limited area or volume on which a procedure is being performed or relative to which a procedure is being performed. The region of operation may also be referred to as a region of interest or include a region of interest therein. In various embodiments, for example, the sinuses of a human subject may be operated on, such as for performing debridement, dilation, or other appropriate procedures. The procedure may occur within a region of operation while a region of interest may include an entire head of the patient or cranium.
The operation performed on the sinus may generally be a low invasive or non-open procedure. In the low invasive procedure, various natural body cavities, such as nasal passages, are used to access the sinuses. Upon access to the sinuses, therefore, the operating end of an instrument may not be visible to a user.
The instrument that is being used to perform the procedure may include a tracking device configured or operable to be tracked by a tracking system. In various embodiments, the tracking system may include a visual or optical tracking system that tracks, such as by viewing or recording, the tracking device on the instrument. A navigation or tracking system, including a processor system, may then determine the position of the operating end relative to the tracking device based upon known and/or predetermined geometric relationships between the operating end and the tracking device.
The tracking system may include one or more cameras or optical tracking devices positioned relative to a display device. The display device may include a transparent or semi-transparent viewscreen. The viewscreen may be positioned relative to a user, such as allowing the user to view the subject through the viewscreen. The viewscreen may be mounted to a structure that allows the user to wear the viewscreen relatively close to the user's eyes such that the viewscreen fills all or a substantial portion of a field of view of the user. A displayed image may then be displayed on the viewscreen to allow the user to view the image while also viewing the subject. The tracked location of the instrument, or at least a portion thereof such as the operating end, may also be displayed using the viewscreen. Accordingly, cameras may be associated with the device worn by the user to allow for a determination of a location of the instrument relative to the region of operation and/or a region of interest including the region of operation, and to superimpose the tracked location of the instrument on an image or augment the user's view of the subject by displaying the tracked location of the instrument.
The system may include a viewing portion or system and a tracked portion. The viewing portion may view a real object (e.g. a subject) and displayed images. The viewing system, therefore, may be an augmented reality viewing system. In addition thereto, or alternatively thereto, viewing systems may include a view screen or monitor separate from or spaced away from the subject. In addition, images may be captured in real time with selected imaging systems, such as an endoscope. An endoscope may be positioned relative to the subject, such as within a nasal passage and/or sinus, and the images acquired with the endoscope may be displayed simultaneously with views of the subject that are acquired prior to an operative procedure. Therefore the viewing system may be used to display and view real time and pre-acquired images. Both types of images may be registered to the subject with various techniques, such as those described further herein.
Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
DRAWINGS
The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
FIG. 1 is an environmental view of a user and a viewing system;
FIG. 2A is a real/physical world view of a region of interest including an instrument, according to various embodiments;
FIG. 2B is a real/physical world view of a region of interest including instruments, according to various embodiments;
FIG. 3A is a view point of a user viewing the real/physical world and at least one displayed icon, according to various embodiments;
FIG. 3B is a view point of a user viewing the real/physical world and at least one displayed icon, according to various embodiments;
FIG. 3C is a display of a view point of a user viewing at least one displayed icon and a real time image with a display device, according to various embodiments; and
FIG. 4 is a flowchart illustrating an operation of a display system.
Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
Example embodiments will now be described more fully with reference to the accompanying drawings.
With reference to FIG. 1, in an operating theatre or operating room 10, a user, such as a surgeon 12, can perform a procedure on a subject, such as a patient 14, which may lay on or be supported by a patient bed or support 15. The patient 14 may define a patient longitudinal axis 14L. To assist in performing the procedure, the user 12 can use an imaging system 16 to acquire image data of the patient 14 to allow a selected system to generate or create images to assist in performing a procedure. The imaging system 16 may include any appropriate imaging system, such as a computed tomography (CT) imager, the O-Arm® imaging system sold by Medtronic, Inc., and/or a NewTom® VGi evo cone beam imager sold by NewTom having a place of business in Verona, Italy.
A model (such as a three-dimensional (3D) image) can be generated using the image data. The generated model may be displayed as an image 18 on a display device 20. In addition or alternatively, projection images (e.g. 2D x-ray projections) as captured with the imaging system 16 may be displayed. Furthermore, an augmented viewscreen (AV) or display device 21 may be provided to or used by the user 12. The AV 21 may be worn by the user 12, as discussed further herein. Further, the AV 21 may also be referred to as a viewing system that is an integrated system or a portion of a system for viewing various items, as discussed herein.
Either or both of the display device 20 or the augmented viewscreen 21 can be part of and/or connected to a processor system 22 that includes an input device 24 (input devices may include a keyboard, a mouse, a microphone for verbal inputs, and inputs from cameras) and a processor 26, which can include one or more processors or microprocessors incorporated with the processing system 22 along with selected types of non-transitory and/or transitory memory 27. A connection 28 can be provided between the processor 26 and the display device 20 or the augmented viewscreen 21 for data communication to allow driving the display device 20 to display or illustrate the image 18.
The imaging system 16, as discussed above, can include an O-Arm® imaging system sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colo., USA. The imaging system 16, including the O-Arm® imaging system, or other appropriate imaging systems may be in use during a selected procedure, such as the imaging systems described in U.S. Pat. Nos. 8,238,631; 9,411,057; and 9,807,860; all incorporated herein by reference. The imaging system 16 may be used to acquire image data of the patient 14 prior to or during use of the AV 21.
The imaging system 16, when, for example, including the O-Arm® imaging system, may include a mobile cart 30 that includes a controller and/or control system 32. The control system may include a processor 33a and a memory 33b (e.g. a non-transitory memory). The memory 33b may include various instructions that are executed by the processor 33a to control the imaging system, including various portions of the imaging system 16.
An imaging gantry 34 in which a source unit 36 and a detector 38 are positioned may be connected to the mobile cart 30. The gantry 34 may be O-shaped or toroid shaped, wherein the gantry 34 is substantially annular and includes walls that form a volume in which the source unit 36 and detector 38 may move. The mobile cart 30 can be moved from one operating theater to another and the gantry 34 can move relative to the cart 30, as discussed further herein. This allows the imaging system 16 to be mobile and moveable relative to the subject 14, thus allowing it to be used in multiple locations and with multiple procedures without requiring a capital expenditure or space dedicated to a fixed imaging system. The control system may include the processor 33a, which may be a general purpose processor or a specific application processor, and the memory system 33b (e.g. a non-transitory memory such as a spinning disk or solid state non-volatile memory). For example, the memory system may include instructions to be executed by the processor to perform functions and determine results, as discussed herein.
The source unit 36 may be an x-ray emitter that can emit x-rays through the patient 14 to be detected by the detector 38. As is understood by one skilled in the art, the x-rays emitted by the source 36 can be emitted in a cone and detected by the detector 38. The source/detector unit 36/38 is generally diametrically opposed within the gantry 34. The detector 38 can move in a 360° motion around the patient 14 within the gantry 34 with the source 36 remaining generally 180° opposed (such as with a fixed inner gantry or moving system) to the detector 38. Also, the gantry 34 can move isometrically relative to the subject 14, which can be placed on the patient support or table 15, generally in the direction of arrow 40 as illustrated in FIG. 1. The gantry 34 can also tilt relative to the patient 14, as illustrated by arrows 42, move longitudinally along the line 44 relative to the longitudinal axis 14L of the patient 14 and the cart 30, and can move up and down generally along the line 46 relative to the cart 30 and transversely to the patient 14, to allow for positioning of the source/detector 36/38 relative to the patient 14. The imaging device 16 can be precisely controlled to move the source/detector 36/38 relative to the patient 14 to generate precise image data of the patient 14. The imaging device 16 can be connected with the processor 26 via connection 50, which can include a wired or wireless connection or physical media transfer from the imaging system 16 to the processor 26. Thus, image data collected with the imaging system 16 can be transferred to the processing system 22 for navigation, display, reconstruction, etc.
The source 36 may be any appropriate x-ray source, such as a multiple power x-ray source. It is understood, however, that the imaging system 16 may be any appropriate imaging system, such as a magnetic resonance imaging (MRI) system, C-arm x-ray imaging system, computed tomography (CT) imaging system, etc. The image data and/or images acquired with the selected imaging system, however, may be displayed on one or more of the display devices 20, 21.
It is further understood that the imaging system 16 may be operated to acquire image data and/or images prior to performing a procedure on the patient 14. For example, images may be acquired and studied to diagnose and/or plan a procedure for the patient 14. Thus, the user 12 that performs a procedure on the patient 14 need not use the imaging system 16 in the same room as the procedure being performed.
According to various embodiments, the imaging system 16 can be used with a tracking system and navigation system, including various portions as discussed herein, operable to track a location of the imaging device 16 and/or other portions. The tracking system may include a localizer and/or digitizer, including either or both of an optical localizer 60 and an electromagnetic localizer 62, that can be used to generate a field and/or receive and/or send a signal within a navigation domain relative to the patient 14. A navigated space or navigational domain relative to the patient 14 can be registered to the image 18. Correlation, as understood in the art, is to allow registration of a navigation space defined within the navigational domain and an image space defined by the image 18.
In various embodiments, a patient tracker or dynamic reference frame 64 can be connected to the patient 14 to allow for dynamic tracking and maintenance of registration of the patient 14 to the image 18. The patient tracking device or dynamic registration device 64 allows for images to be registered and then used for a selected procedure. In various embodiments, the localizers 60, 62 may track the patient tracker. Further, communication lines 74 may be provided between various features, such as the localizers 60, 62, the imaging system 16, an interface system 76, and the processor system 22, which may be a navigation processor system. In various embodiments, the communication system 74 may be wired, wireless, or use a physical media transfer system (e.g. read/write to a memory chip).
Further, the gantry 34 can include a tracking device, such as an optical tracking device 82 or an electromagnetic tracking device 84, to be tracked, such as with one or more of the optical localizer 60 or electromagnetic localizer 62. Accordingly, the imaging device 16 can be tracked relative to the patient 14, as can the instrument 66, to allow for initial registration, automatic registration, or continued registration of the patient 14 relative to the image 18. Registration and navigated procedures are discussed in the above-noted U.S. Pat. No. 8,238,631, incorporated herein by reference.
One skilled in the art will understand that the instrument 66 may be any appropriate instrument, such as a ventricular or vascular stent, spinal implant, neurological stent or stimulator, ablation device, dilator, or the like. The instrument 66 can be an interventional instrument or can include or be an implantable device. Tracking the instrument 66 allows for viewing a location (including x, y, z position and orientation) of the instrument 66 relative to the patient 14 with use of the registered image 18 without direct viewing of the instrument 66 within the patient 14.
With continuing reference to FIG. 1 and additional reference to FIG. 2A, the patient 14, in addition to and/or alternatively to the patient tracker 64, may include one or more patient markers or trackers 100 (referenced herein as 100 together with a lowercase letter). The patient trackers 100 may include various features, such as being opaque or imageable with various imaging systems, such as X-ray or MRI systems. The trackers 100, according to various embodiments, may generally be visible or captured in the image data acquired with the imaging system 16. Thus, the patient markers 100 may be identifiable in an image or image data of the subject 14.
In various embodiments, the patient markers 100 may be identified substantially automatically by the processor system 26 and/or 33a, or any other appropriate imaging processor or processor system. The markers 100 may include a selected and/or unique geometry that may be identified in the image or image data. Various techniques may be used to segment and identify the markers 100 in the selected image or image data. It is also understood, however, that the user 12 may identify the markers in the image data, such as by selecting portions in the image and identifying the portions as the markers 100 with one or more of the user inputs 24.
The markers 100 may be positioned on the patient 14 in any appropriate manner. For example, the markers 100 may be adhered to the patient 14, such as with a self-adhesive backing, an appropriate glue or adhesive material added to the marker 100 and/or the patient 14, or other appropriate mechanism. Further, or in addition thereto, the markers 100 (e.g. the markers 100d and 100e) may be fixed to a bony structure of the patient 14. The markers 100 may be formed or provided as screws or have threaded portions that allow them to be threaded and fixedly positioned into a bone structure, such as a cranium of the patient 14.
Regardless of the connection technique, the markers 100 are positioned on the patient 14 at a selected time. Generally, the markers 100 are positioned on the patient 14 prior to imaging, such as acquiring image data or images of the patient 14 with the imaging system 16. Therefore, when the imaging system 16 is operated to acquire images of the patient 14, the markers 100 are positioned on the patient 14 and will appear in any acquired image data or images.
The markers 100 may, therefore, be used as fiducial points. It is understood that the patient 14 may also include various physical and anatomical fiducial points, such as a tip of the nose, corner of an eye, earlobe, or the like. Nevertheless, the fiducial points, whether provided by the markers 100 and/or anatomical fiducial points of the patient 14, may be used for registration of the images acquired with the imaging system 16. It is understood by one skilled in the art that images acquired of the patient 14 may define an image space. The image space may be of a region of operation or procedure (RO) and/or an area greater than, but at least including, the RO, such as a region of interest (ROI). It is understood that a procedure may occur in a specific area in a region of interest. Nevertheless, coordinates of the image data or space may be correlated or registered to the patient 14 in a physical or real space. The markers 100 may be used to register the image space to the patient and/or navigation space as defined by the patient 14 and/or a region of operation within or relative to the patient 14.
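The disclosure does not mandate a particular algorithm for correlating the image space with the patient or navigation space. For purposes of illustration only, the following is a minimal sketch of one common point-based approach, assuming paired fiducial coordinates are available (the markers 100 located both in the image data and in patient space) and using a singular-value-decomposition rigid fit; the function names and the use of NumPy are illustrative assumptions rather than part of the disclosed system.

```python
import numpy as np

def register_point_sets(image_pts, patient_pts):
    """Estimate the rigid transform (R, t) mapping image-space fiducial
    coordinates onto patient/navigation-space coordinates.

    image_pts, patient_pts: (N, 3) arrays of paired marker positions,
    e.g. markers segmented from the pre-acquired image and the same
    markers localized on the patient.  Uses the SVD-based (Kabsch) fit.
    """
    img_c = image_pts.mean(axis=0)
    pat_c = patient_pts.mean(axis=0)
    H = (image_pts - img_c).T @ (patient_pts - pat_c)      # covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                     # rotation, reflection excluded
    t = pat_c - R @ img_c                                  # translation
    return R, t

def image_to_patient(R, t, pts):
    """Map image-space points (e.g. a segmented sinus surface) into patient space."""
    return pts @ R.T + t
```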
The instrument 66, as noted above, may be any appropriate instrument. The instrument 66 may include a dilator instrument that includes a distal end 110 that may be positioned within a portion of the patient 14, such as through a nasal passage 112 and into one or more sinuses of the patient 14, such as the frontal sinus. The instrument 66 may further include a proximal end 114 having a connector 116 to which an inflation system (not illustrated) may be connected, such as at a connector or a nipple. The connector 116 may allow for material to be passed through the instrument 66, such as through a handle 120, into a balloon or other expandable portion 122 at or near the distal end 110 of the instrument 66. Inflation of the balloon 122 may, as is generally understood by one skilled in the art, dilate or expand a portion of the sinus.
In various embodiments, the user 12 may grasp the instrument 66, such as with a hand 130. The user 12 may then move the instrument 66 relative to the patient 14, such as within the patient 14, for example to move the instrument distal end 110 into a sinus, such as a maxillary sinus 134. It is understood, however, that various instruments, including the instrument 66 according to various configurations, may also be used to access one or more portions of other sinuses of the patient 14, such as a frontal sinus 136 and/or a sphenoid sinus 138.
The instrument 66 may further include a tracking device 146. The tracking device 146 may be affixed to the instrument 66, such as to the elongated handle portion 120. Generally, the tracking device 146 is substantially fixed relative to the handle 120 such that movement of the handle 120 by the user 12, such as with the user's hand 130, moves the handle 120 and the tracking device 146. According to various embodiments, the tracking device 146 is also rigidly fixed in space relative to the distal end 110 of the instrument 66. Accordingly, knowing the position (e.g. location and/or orientation) of the tracking device 146 allows the position of the distal end 110 to be known. Further, the distal end 110 may extend along an axis 110a that is at an angle 150 relative to an axis 120a of the handle 120. Accordingly, the tracking device 146 may have a known or predetermined position and geometry relative to the distal end 110 and may be tracked to determine the position of the distal tip 110 relative to the tracking device 146 that is affixed to the handle 120.
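Because the geometric relationship between the tracking device 146 and the distal end 110 is fixed and predetermined, the tip location can be recovered by applying that stored offset to the tracked pose of the tracking device. The sketch below is a hypothetical illustration of this step only; the offset value, function names, and the assumption that the tracker pose is reported as a rotation matrix and translation in the navigation frame are placeholders, not values from the disclosure.

```python
import numpy as np

# Hypothetical, pre-calibrated offset of the distal tip 110 expressed in the
# local coordinate frame of the tracking device 146 (millimeters).  In practice
# this geometry would be predetermined and stored, e.g., in the memory 27.
TIP_OFFSET_IN_TRACKER = np.array([0.0, 35.0, 120.0])

def distal_tip_position(R_tracker, t_tracker, tip_offset=TIP_OFFSET_IN_TRACKER):
    """Return the tip position in the navigation (camera) frame.

    R_tracker: 3x3 rotation of the tracking device 146 in the navigation frame.
    t_tracker: 3-vector position of the tracking device 146 in that frame.
    Because the tip is rigidly fixed relative to the tracker, its position is
    the constant offset rotated into the navigation frame plus the tracker
    position.
    """
    return R_tracker @ tip_offset + t_tracker
```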
In addition to the instrument 66 that may be used by the user 12, additional instruments may also be used relative to the subject 14. For example, as illustrated in FIG. 2B, the user 12 may include or operate an imaging system 137. The imaging system 137 may include a distal image capturing portion 139, such as a camera lens or camera. The imaging instrument 137 may generally be understood to be an endoscope, such as an EVIS EXERA III endoscope, sold by Olympus America. The endoscope 137 may be positioned relative to the subject 14 by the user 12, according to various generally known techniques and embodiments. The image gathering portion 139 may image various portions of the subject 14, such as internal portions of the subject 14 including the sinus cavity 134 or other appropriate portions, including the nasal passage 112. As discussed further herein, the endoscope 137 may capture images substantially in real time, such as during positioning of the instrument 66 within the subject 14. The real time images captured with the endoscope 137 may be displayed on various display devices or view systems, such as the display device 20 and/or the AV 21. Therefore, the endoscope 137 may be used to capture images at the imaging portion or end 139 and display the images according to generally known techniques. The images may be transmitted through various systems, such as wireless or wired transmission systems, to the processing system 22 for display on the selected display or viewing systems, including the display device 20 and/or the AV 21. The signal from the endoscope 137 may be a digital signal and/or an analogue signal and may be transmitted directly from the endoscope and/or through the interface system 76. Regardless, the images acquired at the imaging portion 139 of the endoscope 137 may be viewed by the user 12 and/or any other appropriate individual. Further, the images may be captured and recorded for various purposes.
The endoscope 137, as discussed above, may be used to acquire images of the subject 14. To assist in acquiring the images or in performing a procedure, the position of the endoscope, particularly the position of the images being acquired, may be determined by one or more tracking devices. For example, an endoscope tracking device 147 may be incorporated onto the endoscope 137, similar to the tracking device 146 connected to the instrument 66, as discussed above. The tracking device 147 may include one or more viewable markers or portions 147a, similar to the markers 146a on the tracking device 146. As discussed herein, therefore, the tracking device 147 may be viewed or imaged with the AV 21 to be tracked by the selected tracking system as discussed further herein. The tracking device 147 may be used to determine the position of the end 139 capturing images to assist in determining a location of the image within the patient 14. The endoscope 137 may be positioned within the subject 14, such as in the sinus cavity 134 that is not directly viewable by the user 12. Additional and/or alternative tracking devices may include an end tracking device 143 that may be positioned or incorporated into the endoscope 137 at or near the image capture end 139. The tracking device 143 may be similar to the tracking device 64, discussed above. The tracking device 143 may be an optical tracking device, EM tracking device, ultrasound tracking device, or other appropriate tracking device. As discussed further herein, registration of the instrument 137, such as with the tracking device 143, and the patient or subject tracker 64 may be used to assist in registering and maintaining registration of the endoscope 137 relative to the subject 14.
The user 12 may also have and/or use the alternative or augmented viewscreen or viewing system 21 for use during the procedure. The AV 21 may be an appropriate device that includes at least one viewscreen and generally two viewscreens, including a first viewscreen 160 and a second viewscreen 162. The viewscreens may be fixed to a frame member 166 that may have one or more temple members 168 to allow the AV 21 to be worn by the user 12 in a similar manner to eyeglasses. Therefore, the viewscreens 160, 162 may be positioned generally in front of, respectively, both eyes 172 and 174 of the user 12. In this manner, images may be displayed on one or both of the viewscreens 160, 162 to allow the user 12 to view images. The AV 21 may include one or more various devices and systems, such as the Hololens® wearable computer peripherals sold by Microsoft Corporation, R-9 Smartglasses wearable computer peripherals sold by Osterhout Design Group, having a place of business in San Francisco, Calif., and/or DAQRI Smart Glasses® wearable computer peripherals sold by DAQRI, having a place of business in Los Angeles, Calif.
In various embodiments, the viewscreens 160, 162 may also be substantially transparent except for the portion displaying an image (e.g. an icon or rendering). Therefore, the user 12 may view the patient 14 and any image displayed by the viewscreens 160, 162. Moreover, due to the two viewscreens 160, 162 displaying selected images, the display may be perceived as substantially stereoscopic and/or three-dimensional by the user 12 relative to the patient 14. As discussed further herein, therefore, the user 12 may view the patient 14 and an image when performing a procedure.
The AV 21 may also include one or more cameras, such as a first camera 180 and a second camera 182. The two cameras 180, 182 may be used to view the region of interest, such as a head of the patient 14. As illustrated in FIG. 2A, the user 12 may view substantially a head portion and a neck portion of a patient when performing a procedure in a region of operation, such as in a sinus of the patient 14. Therefore, the cameras 180, 182 may also view the patient 14 for various purposes, as discussed further herein. Moreover, the cameras may view other objects in the region of interest, such as the tracking device 146 on the instrument 66 and/or the markers 100. The tracking device 146 may include one or more viewable markers or portions 146a that are viewable by the cameras 180, 182 to be used to determine a perspective or view of the tracking device 146 by the AV 21.
While the use of two cameras 180, 182 is disclosed and discussed herein to view and determine the location of the tracking device 146 and/or markers 100, it is understood by one skilled in the art that only one camera, such as only one of the cameras 180, 182, may be required for tracking, as discussed herein. Based on various features (e.g. shapes, images, etc.) on the tracking device 146, the tracking device 147, and/or the markers 100, a single camera, such as the camera 180, may be used to determine the location (i.e. x, y, z coordinates and orientation) relative to the camera 180 and/or relative to other trackable items. For example, the camera 180 may be used to determine the relative location of the tracking device 146 (and therefore the instrument 66) relative to the markers 100. Further, the camera 180 may be placed at any location relative to the user 12, such as on a head of the user 12 separate from the AV 21. The camera 180, however, may still remain in communication with the processor system 22 for display of various images on one or more of the viewscreens 160, 162.
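The disclosure does not state how a single camera recovers position and orientation; one conventional possibility is a perspective-n-point (PnP) solution against the known feature layout on the tracking device. The following minimal sketch assumes the pattern corners have already been detected in the camera image and that the camera intrinsics are known; the pattern geometry and the use of OpenCV are illustrative assumptions only.

```python
import cv2
import numpy as np

# Known 3D layout of four pattern corners on the tracking device, in the
# device's own coordinate frame (millimeters).  The actual pattern geometry on
# the tracking device 146 is not specified in the disclosure; placeholders only.
object_points = np.array([[0, 0, 0],
                          [20, 0, 0],
                          [20, 20, 0],
                          [0, 20, 0]], dtype=np.float64)

def tracker_pose_from_single_view(image_points, camera_matrix, dist_coeffs):
    """Estimate rotation and translation of the tracker relative to camera 180.

    image_points: (4, 2) pixel coordinates of the detected pattern corners.
    camera_matrix, dist_coeffs: intrinsic calibration of the camera.
    """
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose could not be estimated from this view")
    R, _ = cv2.Rodrigues(rvec)   # convert rotation vector to a 3x3 matrix
    return R, tvec.reshape(3)
```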
The AV 21, including the cameras 180, 182, may therefore view the markers 100 on the patient 14 in combination with the tracking device 146 on the instrument 66. The markers 100 on the patient 14 may be viewed by the cameras 180, 182 and may be identified by the user 12 and/or substantially automatically by executing instructions on a processor system, such as by executing instructions with the processor 26.
As discussed above, the processor system 22 may have access to instructions, such as those saved on the memory 27, to assist in identifying the markers 100 in an image. The cameras 180, 182 may have a field of view that includes the region of interest, including the head of the patient 14 and the markers 100. The instructions, which may be included in selected software, may identify the markers 100 in a viewed image, such as by segmentation of the image and identifying a selected shape, density, color, or the like of the markers 100.
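The segmentation technique is left open in the disclosure. As one illustrative possibility only, the sketch below thresholds a camera frame by color and reports the centroid of each connected blob as a candidate marker location; the color bounds, minimum blob size, and function names are placeholder assumptions.

```python
import numpy as np
from scipy import ndimage

def find_marker_centroids(rgb_frame, lower=(0, 150, 0), upper=(80, 255, 80),
                          min_area=50):
    """Return pixel centroids of candidate markers in an RGB camera frame.

    Assumes the markers 100 have a distinctive color; `lower`/`upper` are
    placeholder per-channel bounds and `min_area` rejects small noise blobs.
    """
    lower = np.array(lower)
    upper = np.array(upper)
    mask = np.all((rgb_frame >= lower) & (rgb_frame <= upper), axis=-1)
    labels, n = ndimage.label(mask)                    # connected components
    centroids = []
    for blob in range(1, n + 1):
        ys, xs = np.nonzero(labels == blob)
        if xs.size >= min_area:
            centroids.append((xs.mean(), ys.mean()))   # (u, v) pixel coordinates
    return centroids
```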
Once the markers 100 are identified, images acquired with the imaging system 16 may be registered, such as with the processor system 22, by relating the markers 100 identified in those images to the markers 100 in the field of view of the cameras 180, 182 of the AV 21. A registration may occur by matching the identified markers 100 in the image data acquired by the imaging device 16 and the markers 100 in the field-of-view image acquired with the cameras 180, 182 of the AV 21. The markers 100 are generally maintained in the same position on the patient during acquisition of image data with the imaging system 16 and when in the field of view of the cameras 180, 182 during the procedure. Registration may also occur due to the tracking device 143 on the endoscope 137 and the patient tracker or dynamic reference frame 64. As discussed above, registration may occur using various registration techniques, such as those disclosed in U.S. Pat. No. 8,238,631, incorporated herein by reference. Registration may be made by tracking the tracking device 143 (associated with the endoscope 137) and/or the patient tracker 64. Further, the AV 21 may include a tracking device 149 that may allow the AV 21 to be tracked in a same or similar tracking system and/or frame of reference relative to the subject or dynamic reference frame 64 and/or the tracker 143 on the endoscope 137. Thus, the AV 21 may be registered relative to the endoscope 137 and/or the subject 14. Thus, an image acquired with the endoscope 137 may be displayed relative to, or for viewing by, the user 12 relative to a view of the user 12, as discussed further herein.
As discussed above, with reference to FIG. 2A and FIG. 2B, image data and/or images acquired with the imaging system 16 while the markers 100 are connected to the patient 14 will include data of the markers 100. The markers 100 may be identified in the image data or images acquired with the imaging system 16, as discussed above. The markers 100 may also be viewed by the cameras 180, 182 associated with the AV 21. The cameras 180, 182 may also view the patient 14 to identify or assist in identifying anatomical fiducial markers. Nevertheless, the markers 100 identified in the image data acquired with the imaging system 16 may also be identified in images acquired with the cameras 180, 182. It is understood by one skilled in the art that the images acquired with the cameras 180, 182 may be any appropriate type of images, such as color images, infrared light images, or the like. Nevertheless, the markers 100 identified in the image data acquired with the imaging system 16 may be matched to the locations of the markers viewed with the cameras 180, 182 to register the space or field of view viewed by the AV 21 to the image space of images acquired with the imaging system 16. As discussed further herein, therefore, images or portions of images acquired with the imaging system 16 may be displayed with the viewscreens 160, 162 so as to appear superimposed on the patient 14.
The subject 14 may also be registered relative to currently acquired images or real time images acquired with the endoscope 137 due to the subject tracker 64 and the tracker 143 on the endoscope 137. Registration may be made to the subject 14, such as with the fiducial markers 100d and/or other fiducial features. Thus, the real time images acquired with the endoscope 137 may also be registered to the pre-acquired images. Further, the pre-acquired images may be registered to the instrument 66, such as with the cameras 180, 182, and/or to the AV 21 via the AV 21 tracker 149. Thus, images acquired with the endoscope 137 may be registered to pre-acquired images of the subject 14 for viewing by the user 12, such as with the AV 21.
The user 12 may view the subject 14 through the AV 21, as illustrated in FIG. 3A and FIG. 3B. The user 12 may view or have a field of view 21f, as illustrated by dashed lines in FIG. 3A and FIG. 3B. The field of view 21f may represent a view by the user 12 through the AV 21 when viewing the subject 14, or any area viewed through the AV 21. Thus, the user 12 may view the subject 14, which is real or physical, and the view may also be augmented by graphical representations (also referred to herein as icons) that are displayed by the AV 21. The icons or graphical representations may be displayed in the field of view 21f for viewing by the user 12 when viewing the field of view through the AV 21.
In addition to the icons, as discussed further herein, additional images or image areas 153 may be displayed within the AV field of view 21f to be viewed by the user 12. The supplemental viewing areas 153 may be used to display various images or information for use or viewing by the user 12. For example, the real time images acquired by the endoscope 137 may be displayed in the auxiliary or augmented viewing area 153. Thus, the user 12 may view the subject 14 in the field of view 21f, the various graphical representations as discussed further herein, and additional images (e.g. endoscopic images) in the auxiliary or additional viewing area 153. The user 12 may also select or choose information to be displayed in the auxiliary display area 153, such as pre-acquired images, the real time images from the endoscope, or other appropriate information.
With additional reference to FIG. 3A and FIG. 3B, for example, sinuses such as the maxillary sinus 134 may be displayed as a maxillary sinus icon 134′. FIG. 3A is an illustration of a point of view of the user 12 viewing the patient 14 through the AV 21 and various portions that are displayed with the viewscreens 160, 162. As discussed above, therefore, the maxillary sinus icon 134′ may be displayed for viewing by the user 12 as if the user 12 could see into the patient 14 and view the maxillary sinus 134. The maxillary icon 134′ may be a graphical rendering of the image data or an artificially created icon to represent the maxillary sinus.
It may also be selected to illustrate other portions of the anatomy of the patient 14, such as the frontal sinus 136 and one or more of the sphenoid sinuses 138. The user 12 may also view any real world object, such as the patient 14 and/or the markers 100 affixed to the patient 14. The user may also view other real world portions, such as the patient support 15. Therefore, the user 12 may view both features superimposed on the patient 14 due to the viewscreens 160, 162 and items in the real world by viewing through transparent portions of the viewscreens 160, 162.
Further, the cameras 180, 182 may view the tracking device 146. By viewing the tracking device 146, the cameras 180, 182 may be used to determine the position of the tracking device 146 relative to the markers 100. The markers 100 are placed on the patient 14 to identify locations of the patient 14. The known position of the instrument tracking device 146 relative to one or more of the markers 100 allows for a determination of the position of a portion of the instrument 66 relative to the tracking device 146 and the patient 14. As discussed above, the distal end 110 of the instrument may be at a known and fixed position relative to the tracking device 146. The known and fixed relative position (e.g. the geometry) of the distal end 110 relative to the tracking device 146 may, therefore, be stored in the memory 27 or other appropriate memory.
The tracked location of the tracking device 146 may be determined by triangulating the location of the tracking device 146 based on a "view" of the tracking device with one or more of the cameras 180, 182. The processor system 22 may execute instructions, as generally understood in the art, to then determine the position of the distal end 110 and/or the working portion, such as the inflatable member 122, and an instrument icon may be illustrated to include or illustrate the various portions relative to the patient 14 by displaying it on the viewscreens 160, 162. The instructions that are executed by the processor system 22 may include instructions stored and recalled from the memory 27. The instructions may include those that are based on an algorithm to triangulate the location of the viewed portion, such as the tracking device 146, based on separate views from the two cameras 180, 182. The separate views may be used to generate signals from the two cameras 180, 182 (e.g. including image data) and the signals may be transmitted to the processor system 22. Triangulation of the location of the tracking device 146 may be based on a known distance between the two cameras 180, 182 and each separate view captured by each of the two cameras 180, 182.
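As one illustration of the triangulation described above, the following sketch assumes a rectified stereo pair (image planes parallel and a known baseline between the cameras 180, 182) and a feature, such as a marker on the tracking device 146, that has already been matched in both views; the disclosure does not limit the system to this particular formulation, and the parameter names are assumptions.

```python
import numpy as np

def triangulate_rectified(u_left, v_left, u_right, focal_px, cx, cy, baseline_mm):
    """Triangulate a 3D point from a rectified stereo pair (e.g. cameras 180, 182).

    u_left, v_left:   pixel coordinates of the feature in the left image.
    u_right:          pixel column of the same feature in the right image.
    focal_px, cx, cy: shared focal length and principal point, in pixels.
    baseline_mm:      known distance between the two cameras.

    Depth follows from the disparity: Z = f * B / (u_left - u_right).
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    Z = focal_px * baseline_mm / disparity
    X = (u_left - cx) * Z / focal_px
    Y = (v_left - cy) * Z / focal_px
    return np.array([X, Y, Z])   # in the left camera's coordinate frame
```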
Accordingly, the viewscreens 160, 162 may include a graphical representation, also referred to as an icon 110′, of the distal end and/or an icon 122′ of the inflatable portion of the instrument 66. The icon, such as the distal icon 110′, may be illustrated on one or more of the viewscreens 160, 162 such that it appears to be superimposed on or displayed relative to the patient 14. It is understood that an icon of more than a portion of the instrument 66 may be used; therefore, an instrument icon 66′ may be illustrated as the entire instrument, including all portions of the instrument 66.
The AV 21 may be in communication with the processor system 22 and/or may include onboard processing and/or other communication features to communicate with other processor systems. Accordingly, the view of the region of interest, such as the head of the patient 14, by the cameras 180, 182 of the AV 21 may be transmitted to the processor system 22. Due at least to the spacing apart of the cameras 180, 182, a triangulation may be determined for each viewed point in space, such as the markers 100 and/or the tracking device 146, relative to the cameras 180, 182. A relative location of the tracking device 146 to one or more of the markers 100 may be determined, such as by executing instructions with the processor system 22.
The processor system 22, receiving images from one or both of the two cameras 180, 182, may process and determine the distance between the various tracked, or any viewed, portions, such as the markers 100 and the tracking device 146. The processor system 22, therefore, executing instructions accessed in the memory 27, may then provide to the viewscreens 160, 162 the selected and/or appropriate image portions, such as the instrument icon 66′ or portions thereof and/or other imaged features, such as icons representing the sinuses including the frontal sinus 136′ or other appropriate portions from the image data. The registration of the pre-acquired images, such as those acquired with the imaging system 16, based upon the tracked location of the tracking device 146 and/or the markers 100, may be based upon known registration techniques, such as those disclosed in U.S. Pat. No. 8,238,631, incorporated herein by reference. The registration may be substantially automatic and/or based upon identification of fiducial markers, such as the markers 100, in the images 18 and/or the markers 100 on the patient 14.
The user 12, therefore, may view both the patient 14 and other features, such as the instrument icon 66′, relative to the patient 14 via the viewscreens 160, 162. The cameras 180, 182 may provide all of the tracking information relative to the user 12 and the patient 14 for determining a location of various portions of the instrument 66, such as the distal tip 110, for displaying them with the viewscreens 160, 162. The perception of the user 12 may be that the instrument 66 is viewable relative to the patient 14 even though it is within the patient 14. Further, the image data acquired with the imaging system 16 may be displayed as features, such as icons, with the viewscreens 160, 162 relative to the patient 14. Again, the perception by the user 12 may be that the various portions, such as the sinuses 134, 136, 138, are viewable by the user 12 due to the AV 21. Accordingly, as illustrated in FIG. 3A, the view of the patient 14 may be augmented to illustrate features that are otherwise un-viewable by the user 12 with the user's 12 regular vision. In other words, the user 12 may view the patient 14, as illustrated in FIG. 3A, in physical space, and a representation of an area within the patient 14, such as with the icons or renderings discussed above. This view may also be 3D and change in perspective as the user moves relative to the patient 14 and/or the instrument 66.
The patient 14 may be viewed through the viewscreens 160, 162, as specifically illustrated in FIG. 3A. The various icons, such as the maxillary icon 134′ and the sphenoid icon 138′, may be displayed relative to the icon 66′ of the instrument 66. The icons may have various and selected opacities and/or cutaways for viewing of the instrument icon 66′ relative to the anatomy icons, such as the sphenoid icon 138′. Accordingly, the user 12 viewing the field of view including the icons, such as the sphenoid icon 138′ and the instrument icon 66′, may see both icons simultaneously. Moreover, the user 12 may perceive a position of the instrument 66 within the selected sinus, such as the sphenoid sinus 138, by viewing the instrument icon 66′ and the sinus icon, such as the sphenoid sinus icon 138′, substantially simultaneously. In other words, the opacity of various icons, such as the sinus icons, may be selected to provide a transparent view so that the instrument icon can be viewed within, or as if it is within, the selected anatomical portion. This allows the user 12 to view the patient 14 and the icons of the instrument and the anatomy substantially simultaneously and as if present on the patient 14, as illustrated in FIG. 3A.
In addition to, or alternatively thereto, the various icons may be displayed at a position away from the patient 14. For example, as illustrated in FIG. 3B, the instrument icon 66′ may be displayed away from the patient 14, although at a tracked and determined location relative to an anatomical portion icon, such as the sphenoid icon 138′. It may be selected to illustrate only those anatomical portions that are interacting with, or have been passed through by, the instrument 66; therefore, all icons need not necessarily be shown. It is also understood that various pathways, such as an icon pathway 139′ (see FIG. 3A and FIG. 3B), may be illustrated between various portions of the anatomy, such as through the nasal passage 112, even when the instrument 66 is within the nasal passage 112 and obscured from a non-augmented view of the user 12. Therefore, as illustrated in FIG. 3B, it is understood that the displayed portions of the anatomy that are represented or based upon the image data acquired of the patient 14 may be displayed at a location away from the respective and relative physical location on the patient 14. Accordingly, the icons, such as the sinus icons, may be displayed at a distance away from the patient 14. This may allow the user 12 to have a more and/or substantially unobstructed view of the patient 14 while also being able to view the relative location of the instrument 66 relative to selected anatomical portions.
Moreover, the viewscreens 160, 162 may be used to display other images, such as an endoscopic image that may be acquired substantially simultaneously and in real time, if selected. That is, the user 12 may place an endoscope in the nasal passage 112 as well, and one or more of the viewscreens 160, 162 may display the endoscope view. Accordingly, it is understood that the user 12 may position an endoscope through the nasal passage 112 with the instrument 66 to provide a real time, endoscopic point of view, which also may be displayed on the viewscreens 160, 162 relative to selected icons, such as the sinus icons and/or the instrument icon 66′.
Moreover, it is understood that various images may be displayed on both of the viewscreens 160 and 162 or on only one of the viewscreens 160, 162. It will be understood that images displayed on the two viewscreens 160, 162 may be substantially similar, but altered to allow for a perception of depth and/or three-dimensionality of the selected portions, such as of the sinuses and/or the instrument 66, whether based upon the image data and/or icons, by the user 12. Accordingly, the viewscreens 160, 162 may present identical images, substantially different images, an image on only one viewscreen, or similar images offset to provide a perception of depth for viewing by the user 12.
As discussed above, the auxiliary image 153 may show or illustrate the position of the instrument 66, such as a distal end image 110″ illustrated in the auxiliary image 153. The auxiliary image 153 may be the real time image acquired with the endoscope 137, as discussed above. The distal end image 110″, therefore, may also be a real time image of the instrument 66. The auxiliary image 153 may also display a surface of the anatomy, such as within the sinuses, for viewing by the user 12 during a selected procedure. Therefore, the field of view 21f may allow the user 12 to view the subject 14, graphical representations of instruments displayed relative to and/or superimposed on the subject 14, pre-acquired images of the subject displayed relative thereto and/or superimposed on the subject 14, and/or auxiliary images, such as real time images of the instrument 66. Thus, the user 12 may select which images to view in the field of view 21f. It is understood that any of the images or graphical representations may also be displayed on various other display devices, such as the display device 20. The display device 20 may also display the graphical representations of the locations of the instrument 66, pre-acquired images, and real time images, either alone or in combination.
With reference to FIG. 3C, the field of view may also be displayed with the display device 20. In various embodiments, the view of the endoscope may be displayed as an auxiliary view 153′ on the display 20. The icons 110′ and the portions of the anatomy, such as the sinus 134′, may also be displayed with the display device 20. The graphical representations may be substantially three-dimensional (3D) when displayed on the display device 20. Thus, the field of view display 21f may be substantially reproduced on the display device 20, though the patient 14 may not be displayed; rather, only the acquired images, such as the sinus 134′, and the images that are acquired in real time, such as with the endoscope 137, may be shown. The display device 20, it is understood, may be mobile and positioned for a best view by the user 12.
Whether displayed on the display 20 and/or in the field of view 21f with the AV 21, the display of the graphical representations (e.g. the sinus 134′ and the instrument 110′) may be from the point of view of the user 12. Thus, as the user 12 moves relative to the subject 14, the display in the field of view 21f and/or on the display device 20 may be altered to provide a display for the user 12 as if the user 12 were looking within the subject 14 from the selected position. In other words, if the user 12 moved to a position at a head of the subject 14 looking inferiorly, rather than superiorly, the display of the graphical representations would be altered to match the position of the user 12 relative to the subject 14. The position of the user 12 may be determined, in various embodiments, by the tracker 149 and/or the views of the imaging devices 180, 182 associated with the AV 21.
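The disclosure does not specify how the viewpoint-dependent display is computed. A minimal sketch of one possible transform chain is given below, assuming the image-to-patient registration (as sketched earlier) and a tracked pose of the AV 21 in patient space are available; all names are illustrative assumptions.

```python
import numpy as np

def anatomy_in_user_view(anatomy_pts_image, R_reg, t_reg, R_av, t_av):
    """Re-express pre-acquired anatomy (e.g. a segmented sinus 134') in the
    coordinate frame of the user's headset for the current viewpoint.

    anatomy_pts_image: (N, 3) points in image space.
    R_reg, t_reg:      image-space -> patient-space registration.
    R_av, t_av:        tracked pose of the AV 21 expressed in patient space
                       (e.g. from the tracker 149 or the cameras 180, 182).
    As the user moves, R_av/t_av change and the rendered graphics follow.
    """
    pts_patient = anatomy_pts_image @ R_reg.T + t_reg    # into patient space
    pts_av = (pts_patient - t_av) @ R_av                  # into headset space
    return pts_av
```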
With continuing reference to FIGS. 1-3B and additional reference to FIG. 4, a method of using the AV 21 by the user 12 to assist in performing a procedure, such as a procedure relative to one or more of the sinuses including the frontal sinus 136, is described in the flowchart 210. Generally, the process may start in the start block 214. After initiating the process in start block 214, acquisition of image data or images of the subject 14 may be performed. Acquisition of the image data or images may be performed in any appropriate manner. For example, images of the subject 14 may be acquired and stored on a memory system, such as the memory system 27 and/or the memory system 33b, at a selected time prior to performing a procedure, such as when the patient 14 is prepared for introduction of the instrument 66 into the patient 14. The images may be acquired with a selected imaging system, such as a CT scanner and/or an MRI, and saved in an appropriate format, such as raw data and/or reconstructed images. The reconstructed images may include images that have been rendered in a three-dimensional manner for viewing by the user 12 with various display devices, such as the display device 20 and/or the AV 21. Further, various portions of the image or image data may be segmented, such as segmenting the sinuses, including the frontal sinus 136, from the other image data. Moreover, the identification of the markers 100 in the image or image data may be performed, such as by a processor, including the processor 26 and/or 33a. Segmenting the various portions of the anatomy, such as the frontal sinus 136, and/or identifying the markers 100 may be performed using various segmentation techniques. Segmentation techniques may include those incorporated in various imaging and navigation systems, such as the FUSION™ navigation system sold by Medtronic, Inc.
Image data may also be acquired substantially during or immediately prior to a procedure, such as with the imaging system 16, which may be used substantially intraoperatively (e.g. when the patient is prepared for the procedure). The various portions of the image or image data may be segmented, as discussed above, but rather than being stored on the memory prior to the procedure for a selected period of time, the data may be transferred substantially in real time to the processor system 22 for use during the procedure. Nevertheless, it may be understood that the image data may be stored for a selected period of time, such as to analyze and/or process the image data or images for use during the procedure.
The acquisition of the image data may be optional, as preparing images for display by the AV 21 and/or use during a procedure is not required. For example, as discussed herein, the system including the AV 21 and the processor system 22 may track the markers 100 and the tracking device 146 to represent the position of the instrument without image data of the patient 14. In various embodiments, the images may be accessed by the processor system 22 for display with the AV 21, as discussed above.
The AV 21 may be placed for the procedure in block 224. Placement of the AV 21 for the procedure may include placing the AV 21 on the user 12, as discussed further herein. Moreover, the AV 21 may be placed in communication with the processor system 22, such as for providing processing ability to track the patient 14, such as with the markers 100, and/or the instrument 66, such as with the tracking device 146. The AV 21 may therefore view the region of interest in block 226 and the user may confirm being able to view the region of interest in block 228. In viewing the region of interest, the cameras 180, 182 may be able to view at least the portion of the patient 14 on which a procedure is to occur, such as generally the head region. As discussed above, the region of operation may be substantially unviewable by the user 12 through various external tissues of the patient 14. Therefore, the region of the operation may include the sinuses, such as the frontal sinus 136, and the region of interest may include the entire head of the patient 14. Accordingly, the cameras 180, 182 may be positioned to view the region of interest and the user 12 may confirm viewing the region of interest through the viewscreens 160, 162. The viewscreens 160, 162 are substantially transparent when no icons are displayed on a portion of the viewscreens 160, 162.
Once the AV 21 has the view of the region of interest, a recording of the region of interest with the AV 21 cameras may be performed in block 232. Recording of the region of interest in block 232 may allow for collection of images with the cameras 180, 182 (although it is understood that more than two or fewer than two cameras may be used). The recording of the region of interest may include imaging at least a portion of the patient 14 in an ongoing manner, such as during the entire procedure. Imaging the region of interest of the patient 14 may include imaging the markers 100 and/or other fiducial points or portions of the patient 14. Accordingly, the recorded region of interest may include identifying patient markers in block 236.
Identifying the patient markers may include segmenting image data recorded of the region of interest in block 232 to identify the patient markers 100 in the image. The identified patient markers 100 may be displayed as an icon with the viewscreens 160, 162, such as with an icon 100a′, which may include a three-dimensional cube positioned over the marker 100a on the patient 14 when viewed by the user 12, as illustrated in FIG. 3A. Nevertheless, identifying the patient markers in block 236 may not require or provide for the display of the icon 100a′, but may simply be performed to identify the marker and thereby identify the region of the patient 14 by the processor system 22, such as for identification of a location of the instrument 66, or a portion thereof, such as the distal end 110, as discussed further herein.
Identifying the markers 100 on the patient 14 allows the processor system, such as a portion of the navigation system, to track the patient 14 when the markers 100 are within the field of view of the cameras of the AV 21. The markers 100 may include portions that are identifiable in the image acquired with the cameras 180, 182, such as a color, pattern, shape, etc. Further, the markers 100 may include features that are identifiable for determining a position, including a pose, location, and orientation of the marker relative to the AV 21. Therefore, the patient 14 may be tracked relative to the AV 21 worn by the user 12.
In block 240, the instrument tracker 146 may be identified. The instrument tracker 146 may include portions that are identifiable in the image acquired with the cameras 180, 182, such as a color, pattern, shape, etc. Further, the instrument tracker 146 may include features that are identifiable for determining a position, including a pose, location, and orientation of the tracker relative to the AV 21. For example, the tracking device 146 may include a pattern that is viewable by the cameras 180, 182. The pattern on the tracking device 146 may be substantially or entirely unique from different perspectives relative to the tracking device. Thus, the viewed pattern on the tracking device 146 may be used to determine the position of the instrument tracker 146 and, therefore, the instrument 66.
The instrument tracker 146 may be fixed to the instrument 66, as discussed above. A geometric relationship between various portions of the instrument 66, such as the distal tip 110 and/or an operating portion 122, may be predetermined and entered for processing by the processor system 22. In various embodiments, the geometry may be saved in the memory 27 and recalled due to automatic identification of the instrument 66 (e.g. by viewing the instrument with the cameras 180, 182) and/or entering of the identification of the instrument by the user 12. Nevertheless, the AV 21 may be used to view the tracking device 146 to determine a position, including a location (e.g. three-dimensional coordinates) and an orientation in various degrees of freedom (e.g. three degrees of freedom). The tracked position of the instrument 66 may be used by the processing system 22 for various purposes.
For example, as illustrated in FIG. 3A and performed in block 244, the instrument icon 66′ may be displayed with the AV 21, such as being displayed on one or more of the viewscreens 160, 162. The viewscreens 160, 162 may be substantially transparent save for the portions illustrating the icons. The icon 66′ and/or portions of the instrument, such as the distal tip icon 110′ and/or the operating portion icon 122′, may be illustrated on the viewscreens 160, 162 relative to the patient 14. The user 12 may then view the patient 14 and the icon or icons through the viewscreens 160, 162. Accordingly, the user 12 may view a position of at least a portion of the instrument 66 relative to the patient 14, including a portion of the patient 14.
A subject portion icon may selectively or alternatively be displayed in block 248. For example, as illustrated in FIG. 3A and/or FIG. 3B, the frontal sinus icon 136′ may be displayed. The frontal sinus icon 136′ may be displayed relative to the instrument icon 66′ and the patient 14. Therefore, the user 12 may view the patient 14, the instrument icon 66′, and the frontal sinus icon 136′. Due to the tracking of the markers 100 on the patient 14, the relative position of the instrument 66 may be displayed on the viewscreens 160, 162 with the instrument icon 66′. Further, the relative position of the subject portion, such as the frontal sinus 136, may be displayed due to registration of the pre-acquired image to the patient using the markers 100, as discussed above.
Again, as the AV 21 is able to track the patient 14 due to the markers 100, the relative positions of the instrument 66 and the subject portions, such as the frontal sinus 136, may be updated in substantially real time and displayed on the viewscreens 160, 162 for viewing by the user 12 along with the subject 14. It is understood that the icons, such as the instrument icon 66′ and the subject portion icon 136′, may be generated and displayed on the viewscreens 160, 162 while the user is able to view the patient 14 through the viewscreens 160, 162 in real time and in physical space. It is further understood that the icons may be displayed on only one or both of the viewscreens 160, 162, as selected by the user 12.
As discussed above, the user 12 may also select to have real time images displayed, optionally, in block 252. The real time images may be images acquired with the endoscope 137, as discussed above and as generally understood by one skilled in the art. The real time images may include surfaces, such as internal surfaces, of the subject 14. Further, or in addition thereto, the images may include displays or images of the instrument 66, such as the distal end display 110″ of the instrument 66. The user 12 may select to have the auxiliary image 153 displayed in the field of view 21f or on any appropriate display, such as the display device 20. The user 12 may also select to have the auxiliary display 153 turned off or not displayed such that the user 12 views only the subject 14 and selected augmented reality portions, such as the graphical representations or icons discussed above. It is further understood that the user 12 may select to have graphical representations displayed in the auxiliary display area 153 and the real time images displayed superimposed or displayed relative to the subject 14 in the field of view 21f. As discussed above, the images acquired with the endoscope 137 may be registered relative to the subject 14 due to the selected fiducial portions and/or markers on the subject and the patient or subject tracker 64. Thus, the user 12 may view icons relative to the instrument 66, icons relative to selected sinuses or internal portions based in part on the pre-acquired images (e.g. MRI and/or CT images), and real time images acquired with the endoscope 137 or other appropriate imaging system.
In block 262, a procedure may be performed by viewing the subject 14 and selected icons, such as the instrument icon 66′ and/or the frontal sinus icon 136′, with the AV 21. It is further understood that other appropriate icons may be displayed, such as the maxillary sinus icon 134′ and/or the sphenoid sinus icon 138′. Moreover, additional instrument icons may also be displayed due to various tracking devices associated with instruments. Further, different instruments may have different geometries that may also be entered and/or recalled prior to displaying an icon on the display device, including the AV 21. The method 210 may then end in block 272, including various other procedures, such as various staunching and/or closing procedures.
Example embodiments are provided so that this disclosure will be thorough, and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.