WO2025042718A1 - Atlas-based planning and navigation for medical procedures - Google Patents

Atlas-based planning and navigation for medical procedures

Info

Publication number
WO2025042718A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
reference images
image
computing
patient anatomy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/042641
Other languages
French (fr)
Inventor
Runze HAN
Hui Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc
Publication of WO2025042718A1
Status: Pending
Anticipated expiration

Abstract

Systems and methods of visualizing patient anatomy while planning or executing a medical procedure are described. The systems and methods are directed to labeling anatomical regions of interest with ill-defined or poorly visible boundaries based on reference images with labels for those regions. The example methods include obtaining an image of patient anatomy and obtaining a plurality of reference images with labels of one or more anatomical regions of interest (ROIs). The example methods further include computing transformations between the image of patient anatomy and the reference images. Still further, the example methods include generating sets of one or more labels of ROIs within the image of patient anatomy. Still further, the example methods include displaying a graphical user interface (GUI) depicting the anatomical ROIs (computed based on the reference images) within the patient anatomy.

Description

ATLAS-BASED PLANNING AND NAVIGATION FOR MEDICAL PROCEDURES

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to and the benefit of the filing date of U.S. Provisional Patent Application No. 63/520,488, entitled "ATLAS-BASED PLANNING AND NAVIGATION FOR MEDICAL PROCEDURES," filed on August 18, 2023, the entire contents of which are hereby expressly incorporated herein by reference.

FIELD

[0002] Disclosed examples relate to planning and/or navigating minimally invasive medical procedures and, more specifically, to identifying low-visibility anatomical regions based on labeled reference images.

BACKGROUND

[0003] Minimally invasive medical techniques are intended to reduce the amount of tissue that is damaged during medical procedures, thereby reducing patient recovery time, discomfort, and harmful side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions, physicians may insert minimally invasive medical instruments (including surgical, diagnostic, therapeutic, and/or biopsy instruments) to reach a target tissue location. One such minimally invasive technique is to use a flexible and/or steerable elongate device, such as a flexible catheter, that can be inserted into anatomic passageways and navigated toward a region of interest (ROI) within the patient anatomy.

[0004] Intraoperative imaging can greatly aid in planning and navigating a minimally invasive procedure, for example, by accurate determination of a position, orientation, and/or pose of a flexible elongate device within the patient anatomy and with respect to one or more ROIs. During certain procedures, identifying ROIs within pre-operative or intraoperative images requires time-consuming physician action. The process of identifying ROIs may be limited by the speed and quality of rendering of anatomical landmarks as a physician navigates the image space. For example, certain ROIs may have boundaries that are not clearly visible in images, and a physician may need to identify and/or label such ROIs using anatomical landmarks. However, accurate and fast identification of ROIs based on anatomical landmarks within pre-operative and intraoperative images remains a challenge using current techniques.

SUMMARY

[0005] The following presents a simplified summary of various examples described herein and is not intended to identify key or critical elements or to delineate the scope of the claims.

[0006] In some examples, one or more non-transitory, computer-readable media store instructions that, when executed by one or more processors, cause the one or more processors to obtain an image representative of a portion of patient anatomy and to obtain a plurality of reference images with labels indicative of respective sets of anatomical regions of interest (ROIs). The instructions may further cause the one or more processors to compute one or more transformations between the image representative of the portion of patient anatomy and one or more of the plurality of reference images and to generate, based at least in part on the one or more transformations, one or more sets of labels indicative of anatomical ROIs within the image representative of the portion of patient anatomy.
Still further, the instructions may cause the one or more processors to cause a display device to display a graphical user interface (GUI) depicting at least one of the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.

[0007] In other examples, a system for visualizing patient anatomy during a medical procedure comprises a display device, one or more processors, and one or more non-transitory, computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to obtain an image representative of a portion of patient anatomy and to obtain a plurality of reference images with labels indicative of respective sets of anatomical regions of interest (ROIs). The instructions may further cause the one or more processors to compute one or more transformations between the image representative of the portion of patient anatomy and one or more of the plurality of reference images and to generate, based at least in part on the one or more transformations, one or more sets of labels indicative of anatomical ROIs within the image representative of the portion of patient anatomy. Still further, the instructions may cause the one or more processors to cause the display device to display a graphical user interface (GUI) depicting at least one of the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.

[0008] Still in other examples, a method of visualizing patient anatomy during a medical procedure comprises obtaining, by one or more processors, an image representative of a portion of patient anatomy and obtaining, by the one or more processors, a plurality of reference images with labels indicative of respective sets of anatomical regions of interest (ROIs). The method further comprises computing, by the one or more processors, one or more transformations between the image representative of the portion of patient anatomy and one or more of the plurality of reference images and generating, by the one or more processors and based at least in part on the one or more transformations, one or more sets of labels indicative of anatomical ROIs within the image representative of the portion of patient anatomy. Still further, the method comprises causing, by the one or more processors, a display device to display a graphical user interface (GUI) depicting at least one of the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.

[0009] It is to be understood that both the foregoing general description and the following detailed description are illustrative and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. In that regard, additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.

BRIEF DESCRIPTIONS OF THE DRAWINGS

[0010] FIG. 1A depicts an example system for navigating during a medical procedure within an operating environment.

[0011] FIG. 1B is a simplified diagram of a flexible elongate device disposed within an anatomical structure.
[0012] FIG. 2 schematically illustrates an image representative of patient anatomy and a plurality of reference images with labels indicative of respective sets of anatomical regions of interest (ROIs).

[0013] FIG. 3 schematically illustrates separation between reference images and labels, e.g., for computation and storage.

[0014] FIG. 4A schematically illustrates a visualization of patient anatomy with ROIs depicted as collections of labeled voxels.

[0015] FIG. 4B schematically illustrates a visualization of patient anatomy with ROIs depicted as ellipsoid regions.

[0016] FIG. 5A schematically depicts an example sequence of operations to generate a set of ROI labels in an image based on labeled reference images.

[0017] FIG. 5B schematically depicts an example sequence of operations to register an image with a reference image.

[0018] FIG. 6 is a flow diagram of a method for visualizing, during a medical procedure and based on a set of labeled reference images, one or more ROIs within patient anatomy.

[0019] FIG. 7 is a simplified diagram of a medical system according to some examples.

[0020] FIG. 8A is a simplified diagram of a medical instrument system according to some examples.

[0021] FIG. 8B is a simplified diagram of a medical instrument including a medical tool within an elongate device according to some examples.

[0022] FIGS. 9A and 9B are simplified diagrams of side views of a patient coordinate space including a medical instrument mounted on an insertion assembly according to some examples.

[0023] Examples of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating examples of the present disclosure and not for purposes of limiting the same.

DETAILED DESCRIPTION

[0024] In the following description, specific details are set forth describing some examples consistent with the present disclosure. Numerous specific details are set forth in order to provide a thorough understanding of the examples. It will be apparent, however, to one skilled in the art that some examples may be practiced without some or all of these specific details. The specific examples disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure. In addition, to avoid unnecessary repetition, one or more features shown and described in association with one example may be incorporated into other examples unless specifically described otherwise or if the one or more features would make an example non-functional. In some instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the examples.

[0025] This disclosure describes various instruments and portions of instruments in terms of their state in three-dimensional space. As used herein, the term "position" refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates).
As used herein, the term "orientation" refers to the rotational placement of an object or a portion of an object (e.g., one or more degrees of rotational freedom such as roll, pitch, and yaw). As used herein, the term "pose" refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of the object in at least one degree of rotational freedom (e.g., up to six total degrees of freedom). As used herein, the term "shape" refers to a set of poses, positions, and/or orientations measured along an object. As used herein, the term "distal" refers to a position that is closer to a procedural site, and the term "proximal" refers to a position that is farther from the procedural site. Accordingly, the distal portion or distal end of an instrument is closer to a procedural site than a proximal portion or proximal end of the instrument when the instrument is being used as designed to perform a procedure.

[0026] This disclosure generally relates to systems and methods that facilitate user (e.g., physician) planning of, and/or user navigation during, a medical procedure, such as an endoluminal medical procedure. These systems and methods can provide more precise and more automated visualizations of regions of interest (ROIs) within pre-operative and intraoperative images of patient anatomy. More specifically, these systems and methods generate labels for one or more ROIs within an imaged portion of patient anatomy based at least in part on labeled reference images. The labeled reference images can be thought of as an atlas for navigating patient anatomy. By themselves or in combination with visualization of a flexible elongate device (e.g., a catheter) within intraoperative images, the techniques of this disclosure can reduce the need for manual user intervention, reduce procedure time (e.g., by removing the manual labeling of ROIs), and improve procedure accuracy.

[0027] Certain medical procedures, such as lymph node station biopsy within the thoracic region, may require practitioners (e.g., physicians) to navigate to anatomical ROIs that do not have distinct boundaries and/or whose boundaries are not clearly visible in pre-operative or intraoperative images. Depending on the imaging modality, ROIs with poorly defined boundaries and/or poor visibility in images may include lymph node biopsy stations, the pleura, etc. When an ROI is not clearly defined and/or clearly visible, the practitioner may need to navigate based on anatomical landmarks. For example, a biopsy ROI may typically be located a third of the way between one anatomical landmark and another. Generally, the ROI may be in a more complicated spatial relationship with respect to multiple landmarks. Consequently, determining the ROI may be a time-consuming process, requiring the practitioner to carefully go through multiple slices of three-dimensional intraoperative images. The systems and methods of the disclosure can aid the practitioner in finding the ROI quickly and accurately.

[0028] An example system may obtain an image of a portion of patient anatomy and compute transformations between the obtained image and reference images. The reference images may include labels for the ROIs (e.g., lymph node biopsy stations). In some examples, the labels may be manually generated by experts for the ROIs.
After computing the transformations, the system may use them to map the ROIs from the reference images to the obtained image of the portion of patient anatomy. Because each reference image may result in a different ROI mapping, the system may generate the final estimate of the ROI using a weighted average of the ROIs obtained from each of the reference images. The system may select the weights in a variety of ways. For example, the system may evaluate the quality of the transformation for each of the reference images (e.g., how closely the transformation matches the reference to the data, the smoothness of the transformation, etc.) and determine the corresponding weight based on the quality of the transformation. When the transformation is of high quality, the reference may be assigned a higher weight. Additionally or alternatively, the system may assign the weights based on knowledge of the patient and metadata associated with the reference images. For example, each reference image may be labeled with subject gender, subject size, subject age, a stage of the respiration cycle during which the reference image was obtained, etc. The system may then weight the reference images at least in part in accordance with the similarity of the labels of the reference images to the respective labels of the newly obtained image. It should be noted that some of the weights may be assigned a zero value. In some examples, the analysis of the labels results in assigning the zero value to a reference image, eliminating the need to compute a transformation to that reference image.

[0029] In some examples, the system may combine (e.g., average, morph, etc.) at least some of the reference images prior to computing transformations, thereby reducing the number of transformations and the total time required to compute them. The system may still use the individual reference images in determining the ROI by weighting the contribution of each reference, as described above.

[0030] The example system may compute transformations in a variety of ways. In one example method, the system may use a two-step process for computing a transformation. The first step may include computing a discrete remapping of pixels or voxels in the obtained image onto pixels or voxels of the individual or combined reference images. For example, the system may evaluate a transformation quality function for each of a set of possible shifts and select the multi-axis shift corresponding to the best transformation. The first step may then serve as an initial point for an optimization step, wherein the system minimizes a cost function for a deformable transformation. The deformable transformation may be formulated in a variety of ways, for example, using splines. The cost function may include a combination of different ways of evaluating differences between images as well as regularization terms. The optimization algorithm may include gradient descent, a machine learning model (e.g., a transformer network), or any other suitable technique. The system may use a graphics processing unit (GPU) and/or any other suitable architecture to compute the transformations.
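As one illustration of the weighting described in paragraphs [0028] and [0029], the following minimal sketch assigns each reference image a weight from a registration-quality score and the similarity of its metadata to the patient's; the function name, tag encoding, and 0.5 cutoff are illustrative assumptions and do not come from the disclosure.

```python
import numpy as np

def reference_weights(patient_tags, reference_tags, registration_quality):
    """Hypothetical per-reference weighting ([0028]).

    patient_tags: dict of metadata for the new image (e.g., age, sex,
    respiration phase); reference_tags: one such dict per reference image;
    registration_quality: per-reference scores in [0, 1].
    """
    weights = []
    for tags, quality in zip(reference_tags, registration_quality):
        # Fraction of metadata fields that match, a crude similarity score.
        similarity = np.mean([tags.get(k) == v for k, v in patient_tags.items()])
        # A very dissimilar reference gets weight zero, so no transformation
        # needs to be computed for it at all, as the text notes.
        weights.append(0.0 if similarity < 0.5 else similarity * quality)
    total = sum(weights)
    return [w / total if total > 0 else 0.0 for w in weights]
```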
[0031] The system may be configured to generate a graphical user interface (GUI) to display the imaged portion of patient anatomy along with the computed ROIs within the image. In some examples, displaying the computed ROIs may include depicting a contour (e.g., an ellipsoid) indicating the ROI within the displayed image. Additionally or alternatively, pixels or voxels of the displayed images may be colorized to indicate the ROIs.
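As a toy illustration of the voxel colorization just mentioned, the sketch below blends an ROI mask into one grayscale slice for display; the 8-bit assumption, color, and blend factor are arbitrary choices, not prescribed by the disclosure.

```python
import numpy as np

def colorize_roi(slice_gray, roi_mask, color=(255, 64, 64), alpha=0.4):
    """Blend an ROI mask into a grayscale 2-D slice ([0031]).

    slice_gray: (H, W) uint8 slice; roi_mask: (H, W) boolean ROI membership.
    Returns an (H, W, 3) RGB image with the ROI tinted semi-transparently.
    """
    rgb = np.repeat(slice_gray[..., None], 3, axis=-1).astype(float)
    overlay = np.array(color, dtype=float)
    rgb[roi_mask] = (1 - alpha) * rgb[roi_mask] + alpha * overlay
    return rgb.astype(np.uint8)
```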
[0032] Besides increasing procedure speed, automating ROI labeling may improve procedure accuracy in several ways. The automatic labeling of ROIs may be more accurate than human landmark-based navigation. Additionally, by allowing the operator (e.g., physician) to focus on navigation to a labeled ROI, without having to switch to identifying nearby anatomical landmarks, the systems and methods of the present disclosure may preserve the operator's attention, leading to improved procedure accuracy.

[0033] The techniques described in the disclosure may be efficiently implemented in hardware and software. The example image processing techniques are computationally efficient and can improve with improvements in imaging. That is, improvements in image resolution may directly translate into improved accuracy of the described techniques. At least in part, the improvements in accuracy directly due to improvements in image quality stem from the fact that the techniques described in this disclosure might not rely on training of machine learning models. Additionally, when accuracy can be sacrificed or when image quality is reduced, the techniques described in the disclosure can be implemented using fewer computational operations, thus making aspects of the techniques adjustable based on hardware capabilities and speed requirements.

[0034] FIG. 1A depicts an example system 100 for planning before and/or navigating during a medical procedure within an operating environment 101. The system 100 may obtain images from a portion of the operating environment 101 disposed within a field of view F (approximately demarcated by dashed lines) of an imaging unit 110. To that end, the system 100 may be in communicative connection with the imaging unit 110. The system 100 includes a processing unit 120 and a display unit 130 in communicative connection with each other. Although in FIG. 1A the imaging unit 110 is depicted as being distinct from the system 100, in other examples, the system 100 may include the imaging unit 110. In any case, one or more processors of the processing unit 120 of the system 100 may be configured to receive images from the imaging unit 110. Throughout the disclosure, the descriptions of example operations performed by the processing unit 120 below are to be understood to be executed by the one or more processors of the processing unit 120. In some examples, the one or more processors may include hardware specifically configured (e.g., hardwired or programmable) to carry out at least a portion of the example operations described in this disclosure. Additionally or alternatively, the one or more processors may be configured to carry out at least a portion of the example operations described in this disclosure by carrying out a set of software instructions. To that end, the system may include or be communicatively connected to a tangible, non-transitory, computer-readable medium. The medium may store instructions which, when executed by the processing unit 120, may perform the example operations described below. For example, the instructions may cause the processing unit 120 to perform image processing operations on the images received from the imaging unit 110. Furthermore, the instructions may cause the processing unit 120 to cause the display unit 130 to display, via a graphical user interface, information based on the processing of images received from the imaging unit (e.g., by sending the information, or sending data representing the entire graphical user interface including the information, to the display unit 130).

[0035] The processing unit 120 may be communicatively connected to a database 150. The database 150 may include reference images and associated labels as related to the techniques of this disclosure, discussed in more detail in reference to FIG. 3. The database 150 may be structured or unstructured and may be implemented with data stored in one or more storage media (e.g., hard disk drives, solid state drives, magnetic tapes, etc.). The database 150 may store data locally within the operating environment 101 and/or within a facility within which the operating environment 101 is disposed. Additionally or alternatively, the data of the database 150 may be stored in the cloud and/or distributed among multiple locations.

[0036] An operator (e.g., a physician, another medical practitioner, or a fully automated robotic surgery system) of a medical system may use the information displayed at the display unit 130 to perform a medical procedure (e.g., endoscopy, biopsy, pharmacological treatment, and/or ablation). The medical procedure may require the operator to control a flexible elongate device 140 inserted through an opening O (e.g., a natural orifice, a surgical incision, etc.) into an anatomical structure A of a patient P disposed at a table T. For example, the medical procedure may include navigating the flexible elongate device 140 (indicated with solid lines outside and dashed lines inside the patient P) toward an ROI R within the anatomical structure A with the aid of information displayed at the display unit 130. The region R, for example, may be a designated procedure site for examination, biopsy, surgery, or another treatment. In the particular context of the techniques of this disclosure, the region R may be an anatomical region without distinct or clearly visible boundaries (e.g., a lymph node biopsy station, the pleura, etc.). During the medical procedure, the operator may navigate, in sequence, to a plurality of ROIs.

[0037] FIG. 1B is a simplified diagram of the flexible elongate device disposed within the anatomical structure A. FIG. 1B is included to give an expanded and more detailed view of a portion of the operating environment 101 disposed within the field of view F. The anatomical structure A may be a lung of the patient P. The flexible elongate device 140 may be inserted into and navigated by the operator toward the region R, for example, for the purpose of investigating or treating a pathology, or for the purpose of sampling tissue (e.g., a biopsy) in the region R. The techniques described in the present disclosure can facilitate the navigation process by generating and displaying timely and accurate identification and labeling of one or more ROIs (e.g., ROI R) within the patient anatomy A based on labeled reference images (e.g., stored in the database 150).
[0038] Returning to FIG. 1A, the imaging unit 110 may be configured to image at least a portion of the operating environment using one or more imaging modalities including, for example, infrared, terahertz, X-ray, computed tomography (CT) (e.g., CBCT), positron-emission tomography (PET), optical coherence tomography (OCT), magnetic resonance imaging (MRI), fluoroscopy, sonography, or any other suitable modality. Furthermore, the system 100 may obtain images (e.g., reference images) from any suitable number of imaging units and from any suitable combination of modalities.

[0039] The images generated by the imaging unit 110 may be two-dimensional (2D) or volumetric, three-dimensional (3D) images. 3D images may include a collection of voxels on a regular grid, point clouds, or a set of 2D slices that collectively represent a three-dimensional volume. Each 2D slice of a 3D image, or a whole 2D image, may include a collection of pixels. Each pixel may represent a point on a planar or curved surface within the field of view F. One or more processors may be configured to transform 3D images into a set of 2D images, convert 3D images from point clouds to voxels of a regular grid, or re-grid or interpolate from one 2D or 3D grid to another one, e.g., of higher or lower resolution or a different orientation. The imaging unit 110 and/or the processing unit 120 may include at least some of the one or more processors configured to perform the operations described above.

[0040] The processing unit 120 of the system 100 may obtain one or more images representative of a portion of anatomy of the patient P, e.g., including the anatomical structure A and the ROI R disposed therein. In some examples, the processing unit 120 may request the imaging unit 110 to generate the one or more images. In other examples, the imaging unit 110 may continuously generate (and, possibly, buffer or store) images of the scene within the field of view F. Generally, the processing unit 120 may cooperate with the imaging unit 110 to obtain images in the appropriate format for further processing.

[0041] The processing unit 120 of the system 100 may perform operations on unlabeled images (e.g., images without labels identifying specific ROIs) obtained from the imaging unit 110 and labeled reference images (e.g., stored in the database 150) to generate new ROI labels. In some examples, the system 100 may add the newly labeled images to the reference images (e.g., stored in the database 150). To generate the new ROI labels for an unlabeled image, the processing unit 120 may compute one or more transformations between the respective one or more reference images and the unlabeled image. The transformations may generate a mapping between pixels or voxels of the unlabeled image and the one or more reference images. Based on the generated transformations, the processing unit 120 may, in a sense, transfer or remap the labels from the reference images to the previously unlabeled image. When using multiple reference images, the processing unit 120 need not weigh the contributions of labels from each of the reference images equally. The newly generated labels may be a weighted amalgamation of labels from reference images mapped onto the previously unlabeled image using the transformations computed by the processing unit 120. The operations to generate labels based on the reference images are described in more detail with reference to FIGS. 5A and 5B.
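To make the remapping in paragraph [0041] concrete, the sketch below warps a reference label volume into the patient image grid through a dense displacement field; the function name and the displacement-field representation are assumptions for illustration, since the disclosure leaves the transformation representation open.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def remap_labels(reference_labels, displacement, order=0):
    """Transfer a reference label volume into patient-image space ([0041]).

    reference_labels: 3-D integer array of ROI labels in reference space.
    displacement: array of shape (3, D, H, W); voxel (i, j, k) of the patient
    image reads from reference coordinates (i, j, k) + displacement[:, i, j, k].
    Assumes, for simplicity, that both grids have the same shape.
    """
    grid = np.indices(reference_labels.shape).astype(float)
    coords = grid + displacement
    # order=0 keeps labels as nearest-neighbor values instead of blending them.
    return map_coordinates(reference_labels, coords, order=order, mode="nearest")
```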
[0042] FIG. 2 schematically illustrates an image 200a representative of patient anatomy and a plurality of reference images 200b-d with respective labeled ROIs 211-218a, 211-218b, and 211-218c which, for example, may be indicative of lymph node biopsy stations. In the discussion below, the ROIs 211-218a, 211-218b, and 211-218c may be referred to as lymph node stations 211-218a, 211-218b, and 211-218c, with the understanding that the techniques discussed may be generalized to other ROIs. Images 200a-d further include respective sets of anatomical landmarks 221-229a, 221-229b, 221-229c, and 221-229d. To one of ordinary skill in the art, the corresponding anatomical landmarks (e.g., landmarks 221-229a, 221-229b, 221-229c, and 221-229d) in images 200a-d may be readily apparent. In practice, identifying landmarks in reference images and labeling ROIs (e.g., ROIs 211-218a, 211-218b, and 211-218c) based on the landmarks may take a qualified physician several hours. The process may involve changing view angles and scrolling through multiple slices of three-dimensional images. The speed and accuracy of rendering the different views in the visualization system may further limit the process.

[0043] The images 200a-d may be indicative of chest cavities of four patients. In some examples, one or more of the reference images 200b-d may correspond to the same patient as the image 200a. For example, a pre-operative image of a patient may be labeled and saved as a reference. The intraoperative images of the patient may be different from the pre-operative reference images of the patient due to physiological changes in the patient, differences in the point of the breathing cycle when an image is taken, changes in imaging modality, etc. Generally, the techniques of this disclosure are applicable to examples where any number (including zero) of the reference images 200b-d correspond to the same patient as the image 200a.

[0044] Anatomical landmarks (e.g., landmarks 221a-229d) may include tracheas 221a-d, bones 222a-d (e.g., clavicles, sternum), veins 223a-d (e.g., superior vena cava, subclavian vein), arteries 224a-d (e.g., aorta, pulmonary artery), lungs 225a-d, heart chambers 226a-d and 227a-d, bronchial branches 228a-d and 229a-d, and/or any other suitable landmarks. Although the exact morphology of landmarks can vary from image to image (e.g., from patient to patient, or between different instances for the same patient), identifying the landmarks in different images is one way that a physician may identify, for example, the lymph node stations 211-218a, 211-218b, and 211-218c. For example, a practitioner may identify lymph node stations 211a, b and 212a, b with respect to clavicles, a trachea, and lungs; lymph node stations 213a, b and 214a, b with respect to clavicles, trachea, and veins; lymph node stations 215a, b and 216a, b with respect to heart chambers, arteries, and trachea; etc. Generally, the automated techniques of the present disclosure may use a set of reference images (e.g., images 200b-d) with the ROIs identified and labeled by practitioners as described above to identify corresponding ROIs in a new image (e.g., 200a), as described in detail below with reference to FIGS. 5A and 5B.

[0045] FIG. 3 schematically illustrates separation between reference images and labels, e.g., for computation and storage. A practitioner may use a visualization system to generate a labeled reference image 310.
To that end, a practitioner may label pixels or voxels within the reference image 310 or place geometric shapes indicative of ROIs within the image. A system (e.g., the system 100) may separate the labeled reference image 310 into an unlabeled reference image 320 and associated labels 330. The labels 330 may be stored as another image or in any other suitable data structure. For example, the labels 330 may be stored as lists of pixels or voxels associated with particular labels, as geometric shapes with associated parameters and coordinates in the coordinate system of the image 320, or in any other suitable manner. The reference image 320 with labels removed may be stored in an image portion 352 of a database 350 (which may be the database 150 in FIG. 1A). The labels 330 may be stored in a label portion 354 of the database 350.

[0046] FIG. 4A schematically illustrates a visualization of patient anatomy with ROIs (e.g., ROIs 410, 420) depicted as collections of labeled voxels displayed with an anatomical structure 430, which may be, for example, a branched bronchial structure. FIG. 4B schematically illustrates a visualization of patient anatomy with ROIs (e.g., ROIs 460, 470) depicted as ellipsoid regions and displayed with the anatomical structure 430. The ROIs 410, 420, 460, 470 may be ROIs 211-218a and 211-218b or other suitable ROIs. During a procedure, a system (e.g., the system 100) may display (e.g., via the display unit 130) one or more anatomical ROIs (e.g., ROIs 410, 420, 460, 470) overlaid with a pre-operative or intraoperative image. In some examples, the system may process (e.g., by the processing unit 120) the pre-operative or intraoperative image to segment, identify, and/or highlight certain aspects or structures of anatomy (e.g., structure 430) and/or a flexible elongate device disposed within the FOV of the image. The system may update the image at a suitable frame rate, such as 0.2, 0.5, 1, 2, 5, 10, 20, or 50 frames per second (fps) or any other suitable rate. The system may update the displayed ROIs (e.g., ROIs 410, 420, 460, 470) in conjunction with the displayed images, at the same rate as the pre-operative or intraoperative image or at a different rate. For example, the system may update the displayed ROIs (e.g., ROIs 410, 420, 460, 470) for every 2, 5, 10, or any other suitable number of frames of the displayed pre-operative or intraoperative image. The system may update the displayed ROIs (e.g., ROIs 410, 420, 460, 470) in response to patient movement including, for example, breathing, heartbeat, and/or other voluntary or involuntary movements. In some examples, the system may update the displayed ROIs according to the techniques described in more detail with respect to FIGS. 5A and 5B.

[0047] A practitioner may select which ROIs (e.g., ROIs 410, 420, 460, 470) the system displays at a given time. Furthermore, the practitioner may select which ROI(s) (e.g., ROIs 410, 420, 460, 470) the system updates in response to patient movement. In some examples, updating one or more ROIs while keeping other ROIs statically displayed may reduce the computational load on the system and result in more accurate and/or faster rendering of the selected ROI(s). The practitioner may select for display, select for motion update, highlight, and/or colorize certain ROIs using a GUI generated at the display (e.g., display unit 130).
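One way to picture the image/label separation of FIG. 3, together with the two label forms discussed next (voxel collections and geometric shapes), is the following minimal sketch; the class and field names are illustrative only, and the disclosure does not prescribe any particular schema.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class EllipsoidROI:
    """Geometric ROI label: centroid, semi-axis lengths, axis directions."""
    centroid: np.ndarray      # shape (3,)
    semi_axes: np.ndarray     # shape (3,)
    axis_dirs: np.ndarray     # shape (3, 3), unit vectors as columns

@dataclass
class ReferenceEntry:
    """One atlas entry, image data and labels stored separately (FIG. 3)."""
    image: np.ndarray                                   # unlabeled volume (320)
    voxel_labels: dict[str, np.ndarray] = field(default_factory=dict)    # ROI name -> (N, 3) voxel indices
    shape_labels: dict[str, EllipsoidROI] = field(default_factory=dict)  # ROI name -> geometric label
    tags: dict[str, str] = field(default_factory=dict)  # e.g., {"sex": "F", "phase": "inhale"}
```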
[0048] In some examples, a practitioner may choose to switch from ROIs 410, 420 displayed as pixels or voxels to ROIs 460, 470 displayed as ellipsoids or other geometric representations. ROIs displayed as ellipsoids (or other geometric shapes) may be easier for practitioners to see and/or use. For example, a practitioner may have an easier time navigating to a perceived center of an elliptical ROI to perform a tissue biopsy than to the center of an ROI rendered as a colorized collection of voxels. The system may generate an elliptical representation (or another geometric shape or collection of shapes) of an ROI from the ROI voxels using one or more techniques. For example, the system (e.g., the system 100 using the processing unit 120) may estimate a centroid, axes lengths, and axes directions of an elliptical ROI by first computing the centroid of the collection of voxels (or pixels) associated with the ROI and computing moments of inertia with respect to the centroid along image axes. The system may further compute an eigenvalue decomposition of the inertia matrix and use the eigenvalues to compute ellipsoid axes lengths and the eigenvectors to compute the directions of the axes, as sketched below. Additionally or alternatively, the system may use one or more fitting algorithms, maximizing an overlap between the collection of voxels in an ROI and the geometric shapes defining the ROI. Still in other examples, the system may use a machine learning (ML) model to generate geometric representations of ROIs based on labeled-voxel representations.
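A minimal sketch of the moment-based ellipsoid fit just described, assuming the voxels of one ROI are given as an (N, 3) index array; the name fit_ellipsoid and the uniform-solid scaling between eigenvalues and semi-axis lengths are illustrative assumptions.

```python
import numpy as np

def fit_ellipsoid(voxel_coords):
    """Estimate ellipsoid centroid, semi-axis lengths, and axis directions
    from the voxels of one ROI, per the moment-based approach of [0048]."""
    centroid = voxel_coords.mean(axis=0)
    centered = voxel_coords - centroid
    # Second-moment matrix of the voxel cloud about its centroid.
    moments = centered.T @ centered / len(voxel_coords)
    eigvals, eigvecs = np.linalg.eigh(moments)  # ascending eigenvalues
    # For a uniformly filled ellipsoid the variance along a semi-axis of
    # length a is a^2 / 5, so a = sqrt(5 * eigenvalue) (assumed convention).
    semi_axes = np.sqrt(5.0 * eigvals)
    return centroid, semi_axes, eigvecs  # eigvecs columns are axis directions
```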
[0049] It should be noted that ROI representations as collections of voxels and/or geometric shapes need not apply only to ROI visualizations. In some examples, a practitioner may choose to label one or more of the ROIs in a reference image using voxel-by-voxel annotations or using geometric shapes. Furthermore, some labeled reference images may be generated using the automatic labeling techniques of this disclosure. As a result, labels for reference images stored in a database (e.g., databases 150, 350) may be stored in a non-uniform manner, with some labels as voxel collections and others as geometric shapes. The techniques of this disclosure, as described in more detail with reference to FIGS. 5A, 5B, and 6, may generate new labels based on the different types of labels in reference images. For example, the labels of reference images may be indicative of patient gender, patient age, patient weight, patient size, imaging modality, and/or a point in a breathing cycle.

[0050] FIG. 5A schematically depicts an example sequence 500 of operations 510-560, a workflow, or a pipeline to generate a set of ROI labels in an image based on labeled reference images. In some examples, additional operations may be included, some of the operations 510-560 may be omitted, and/or at least some of the operations 510-560 may be performed in parallel or in a different sequence than indicated in FIG. 5A. The operations may be performed by the processing unit 120 of the system 100.

[0051] Operation 510 may include generation of references based on labeled reference images stored in a database 571 (which may be the database 150 or 350). In some examples, the generation of references may be based at least in part on an unlabeled image 572. The generated references may include reference images 573 and respective labels 574 generated based at least in part on reference images and labels obtained from the database 571. The generated reference images 573 may include a subset of the images stored in the database 571 and/or combinations of subsets of reference images obtained from the database 571. For example, at least some of the reference images 573 may be averages or weighted averages of subsets of images obtained from the database 571. In some examples, operation 510 may include selecting a subset of images from the database 571 based on tags associated with the images stored in the database 571 and with the unlabeled image 572. Such tags may be indicative of patient gender, patient age, patient weight, patient size, imaging modality, and/or a point in a breathing cycle. Additionally or alternatively, tags may be indicative of the quality or reliability of labels in the reference images. For example, when the unlabeled image 572 is associated with a juvenile patient (and has a corresponding tag), operation 510 may select, by filtering based on appropriate tags, reference images from the database 571 associated with juvenile patients. Operation 510 may include generating a new reference image by averaging images with tags corresponding to the tags of the unlabeled image 572. Furthermore, operation 510 may include generating new reference images from reference images in the database 571 by using a weighted average and giving smaller weights to those images that have tag values different from those of the unlabeled image 572.

[0052] It should be noted that a system may perform at least portions of operation 510 prior to the planning stage of an operation. For example, operation 510 may include performing transformations on images stored in the database 571, analogous to the transformations described with reference to operation 520, to improve alignment and/or registration of reference images in the database 571 with each other. Performing registration of reference images in the database 571 in advance of an operation, or of a planning stage of the operation, may save considerable time in performing the sequence 500. The mutually registered images from the database 571 may be combined by operation 510 (e.g., using weighted averaging) with minimal loss of information due to blurring and/or other deleterious effects of combining unregistered images.

[0053] In some examples, the reference images in the database 571 may be generated by different imaging modalities and/or at different resolutions. The images may be preprocessed by operation 510 to be of similar style and/or resolution as the unlabeled image. To that end, operation 510 may include interpolation, re-gridding, gamma-correction, style transfer, and/or any other suitable preprocessing steps.

[0054] Operation 510 may mirror at least some of the transformations performed on the reference images 573 onto the respective labels 574. For example, remapping (e.g., to register images with each other) and/or combining (e.g., weighted averaging) of reference images 573 may likewise be performed by the system on their respective labels 574. As discussed above with reference to FIGS. 4A and 4B, some labels stored in the database 571 may be in the form of labeled pixels or voxels while others may be in the form of geometric shapes. Operation 510 may be configured to combine the labels in a uniform manner, for example, by first converting voxel labels to geometric shapes or vice versa.
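As a concrete illustration of operation 510's tag filtering and weighted averaging (paragraphs [0051] and [0052]), the sketch below combines co-registered atlas volumes into a single reference; it assumes the references were mutually registered in advance, and the helper name and weight normalization are illustrative.

```python
import numpy as np

def build_reference(entries, patient_tags):
    """Combine mutually registered atlas images into one reference volume.

    entries: list of (image, tags) pairs from the database; patient_tags:
    metadata of the unlabeled image 572. References whose tags share nothing
    with the patient's get weight zero and drop out of the average.
    """
    images, weights = [], []
    for image, tags in entries:
        similarity = np.mean([tags.get(k) == v for k, v in patient_tags.items()])
        if similarity > 0:
            images.append(image.astype(float))
            weights.append(similarity)
    w = np.asarray(weights) / np.sum(weights)
    # Weighted average; assumes the images are already aligned ([0052]).
    return np.tensordot(w, np.stack(images), axes=1)
```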
[0055] Operation 520 may include registration of the unlabeled image 572 to the reference images 573. The reference images 573 may include some or all of the images stored in the database 571 and/or combinations of the images stored in the database 571, as discussed above. In some examples, the reference images 573 may be combined into a single reference image to reduce computation during the planning or the execution of a medical procedure (e.g., a lymph node station biopsy). Registration in operation 520 is discussed in more detail with reference to FIG. 5B. Generally, the registration operation 520 generates mappings 575 between pixels or voxels of the unlabeled image 572 and each of the reference images 573. In some examples, a mapping with respect to an n-th reference image may be represented by a linear mapping. That is, for every voxel index $(i, j, k)$, the mapped value is a linear combination of voxel values $v_n(i', j', k')$ in the n-th reference image, with coefficients $c^{(n)}_{(i,j,k),(i',j',k')}$ defining the mapping:

$$u_n(i, j, k) = \sum_{i', j', k'} c^{(n)}_{(i,j,k),(i',j',k')} \, v_n(i', j', k').$$

The coefficients of the mapping may be represented as a matrix, as a continuous function which can be evaluated at voxel indices, or as a combination of the two. In some examples, the mapping may be represented by a deformed grid, where the mapped value of a reference image at each index $(i, j, k)$ within the unlabeled image 572 is an interpolation (nearest neighbor, weighted neighbors, convolution interpolation, etc.) of the points of the corresponding deformed grid.

[0056] In some examples, operation 520 may compute the quality of registration between the unlabeled image 572 and each of the reference images 573. The quality metric may be indicative of the overall quality of registration, as computed, for example, from a correlation between a given reference image and the unlabeled image, correlations between extracted features of the unlabeled image and a given reference image, etc. Additionally or alternatively, operation 520 may generate local quality-of-registration maps (e.g., for each voxel index $(i, j, k)$) for registration of the unlabeled image 572 to each of the reference images 573. Operation 520 may compute quality-of-registration maps using local cross-correlations $C_n(i, j, k)$ for an n-th reference image. For example, for each voxel, operation 520 may compute a cross-correlation in the neighborhood (e.g., 7x7x7, 9x9x9, 11x11x11, or any other suitable neighborhood) of the voxel.

[0057] Operation 530 may include mapping labels 574 from reference images onto the grid of the unlabeled image 572 according to the mappings 575 generated by the system in operation 520. For a given ROI, labels from the reference labels 574 may have binary values. That is, each pixel or voxel may be labeled as belonging to the ROI or not belonging to the ROI. Because remapping may produce label values between 0 and 1, operation 530 may include a thresholding operation, mapping the values to either 0 or 1 for each reference label set corresponding to a reference image. In other examples, operation 530 may preserve the non-binary voxel values.

[0058] Operation 540 may include fusion of the labels 576 resulting from mapping the labels 574, based on registering the respective reference images 573 to the unlabeled image, to produce a fused set of labels 577. Operation 540 may be omitted in the examples in which the reference images 573 are combined into a single reference image, resulting in a single set of labels generated by operation 530. In some examples, operation 540 may fuse labels based on weights that depend on tags (as described above), the overall quality of registration, or the local quality of registration at each voxel. For example, operation 540 may compute weights for each reference image and each voxel as

$$w_n(i, j, k) = \frac{\exp\big(C_n(i, j, k)\big)}{\sum_m \exp\big(C_m(i, j, k)\big)}.$$
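A minimal sketch of this per-voxel softmax fusion; the array layout and the helper name fuse_labels are assumptions, and the local cross-correlation maps are taken as given from operation 520.

```python
import numpy as np

def fuse_labels(mapped_labels, local_xcorr):
    """Softmax-weighted label fusion across references ([0058]).

    mapped_labels: (N, D, H, W) binary ROI labels already remapped into the
    patient image grid, one per reference; local_xcorr: (N, D, H, W) local
    cross-correlation maps from operation 520. Returns per-voxel probability
    of belonging to the ROI; threshold at 0.5 to quantize, as in the text.
    """
    w = np.exp(local_xcorr)
    w /= w.sum(axis=0, keepdims=True)       # weights sum to one at each voxel
    return (w * mapped_labels).sum(axis=0)  # continuous value in [0, 1]
```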
That is, the weight of the contribution of an n-th reference image to a fused label at index $(i, j, k)$ is a normalized exponent of the local cross-correlation: the exponent divided by the sum of the exponents of the local cross-correlations for all registered reference images. Thus, the sum of all weights for each given voxel is unity, but each voxel may have differently weighted contributions from the different reference images. The fused set of labels may be quantized for each voxel and each ROI as either zero (not belonging to the ROI) or one (belonging to the ROI). Alternatively, continuous label values for each voxel may indicate probabilities of belonging to each ROI.

[0059] An optional operation 550 may include reshaping of ROI labels, as discussed above with reference to FIGS. 4A and 4B. Operation 550 may include generating ellipsoid labels 578 based on the fused labels 577. The continuous label values produced by the fusion operation 540 may contribute to the reshaped labels 578 by contributing to the computed centroids and moment-of-inertia matrices, as discussed above with reference to FIGS. 4A and 4B.

[0060] Operation 560 includes displaying a labeled image 579 by combining the unlabeled image 572 of the patient with the fused labels 577 and/or reshaped labels 578 on a GUI rendered on a display device (e.g., included in the display unit 130). A practitioner may use the GUI to choose which labels are displayed (e.g., which lymph node biopsy station), the appearance of the labels (e.g., shape, color, transparency, contour contrast), the update rate of the labels (e.g., with the breathing cycle), etc.

[0061] It should be noted that the sequence 500 of operations may be adapted to generating or updating labels in moving images based on one or more static reference images. For example, the one or more static reference images may be generated during the planning of a medical procedure. The movement of anatomy (e.g., due to breathing) during the procedure may necessitate updating the labels for each deformation due to motion.

[0062] FIG. 5B schematically depicts an example subsequence of operations 522, 524, and 526 of the operation 520 to register an image with a reference image. Operation 522 may include application of feature extraction algorithms to extract features in the unlabeled image 572 and in the reference images 573. For example, when applied to a chest cavity image, operation 522 may segment out lungs, airways, the heart, bone structure, arteries and veins, etc. Furthermore, operation 522 may apply symmetric signed distance transforms (SSDTs) or other suitable distance transforms to the segmented features. Additionally or alternatively, operation 522 may include generating modality independent neighborhood descriptors (MINDs) for the reference images 573 and the unlabeled image 572.
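For instance, a signed distance transform of a segmented structure can be sketched as follows with SciPy's Euclidean distance transform; the sign convention (positive outside, negative inside) is an assumption, and this stands in for whichever symmetric variant operation 522 actually uses.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask):
    """Signed distance transform of a segmented structure ([0062]).

    mask: boolean 3-D array from feature extraction (e.g., a lung segment).
    Positive outside the structure, negative inside, near zero at the boundary.
    """
    outside = distance_transform_edt(~mask)  # distance to the structure
    inside = distance_transform_edt(mask)    # distance to the background
    return outside - inside
```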
[0063] Operations 524 and 526 may include, respectively, discrete registration and registration by a deformable transformation. The discrete registration operation 524 may generate, for each pixel or voxel of the unlabeled image 572, a set of discrete shifts in two or three dimensions to map to a pixel or voxel in each of the reference images 573. The system may implement the discrete registration by evaluating a loss function for each shift and selecting the shift for which the loss is lowest. For example, a suitably sized neighborhood (3x3x3, 5x5x5, 7x7x7, 9x9x9, 11x11x11, 13x13x13, or any other suitable neighborhood) of the unlabeled image or its corresponding feature field (computed by operation 522) may be cross-correlated with similarly sized neighborhoods in a reference image or its corresponding feature fields, at the same voxel index as well as at voxel indices shifted by up to 3, 4, 5, 6, 7, 8, 9, or any other suitable number in each of the three directions. That is, for shifts between -6 and 6 in each dimension, the system may compute $13^3 = 2197$ correlation values and/or loss function values. Selecting the shift for a given voxel may include selecting a minimal loss for the voxel index. For example, the minimal loss may be computed as

$$L(i, j, k) = \min_{(d_i, d_j, d_k) \in S} \Big[ \big\| \mathrm{MIND}\big(N(i, j, k)\big) - \mathrm{MIND}\big(N_{\mathrm{ref}}(i + d_i, j + d_j, k + d_k)\big) \big\|^2 + \big\| \mathrm{SSDT}\big(N(i, j, k)\big) - \mathrm{SSDT}\big(N_{\mathrm{ref}}(i + d_i, j + d_j, k + d_k)\big) \big\|^2 \Big],$$

where $N(i, j, k)$ is a neighborhood of $(i, j, k)$ in the unlabeled image, $N_{\mathrm{ref}}(i + d_i, j + d_j, k + d_k)$ is a neighborhood in the reference image shifted with respect to $(i, j, k)$, and $S$ is
the set of discrete shifts. Thereby, the discrete registration produces a map from each pixel or voxel in the unlabeled image to the "closest" pixel or voxel of the reference image. In a sense, the mapping is a linear mapping, as described above, where the matrix of coefficients mapping between voxels of images represented as vectors has a single 1, and the rest zeros, in a given row (or column). The advantages of the discrete registration computation as described above include the ability to handle large shifts and the avoidance of local minima in the loss function.

[0064] Operation 526 may include computing a continuous transformation to refine the discrete transformation computed in operation 524. An example loss function for computing a continuous transformation may be defined similarly to the loss function for the discrete transformation, but with the addition of regularization terms:

$$L'(i, j, k) = \big\| \mathrm{MIND}\big(N(i, j, k)\big) - \mathrm{MIND}\big(N_{\mathrm{ref}}(i + \delta_i, j + \delta_j, k + \delta_k)\big) \big\|^2 + \big\| \mathrm{SSDT}\big(N(i, j, k)\big) - \mathrm{SSDT}\big(N_{\mathrm{ref}}(i + \delta_i, j + \delta_j, k + \delta_k)\big) \big\|^2 + \lambda_1 \big\| \nabla (\delta_i, \delta_j, \delta_k) \big\|^2 + \lambda_2 \, R_J(\delta_i, \delta_j, \delta_k),$$

where $\lambda_1$ is a regularization parameter for a smoothness constraint on the transformation and $\lambda_2$ is a Jacobian determinant regularization parameter, weighting a term $R_J$ that keeps the transformation mapping diffeomorphic (minimizing folding or tearing). It should be noted that $(\delta_i, \delta_j, \delta_k)$ are real (not restricted to integer) shifts. The system may use gradient descent, Adam, and/or any other suitable optimization technique to determine the continuous transformation in operation 526.
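The following sketch implements the exhaustive discrete search of operation 524 over scalar feature volumes; the box filter stands in for the neighborhood comparison, the feature channels (e.g., an SSDT volume) are assumed precomputed, and the continuous refinement of operation 526 (starting from best_shift and minimizing the regularized loss with a gradient-based optimizer) is left out.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def discrete_shift_search(feat_img, feat_ref, radius=6, nbhd=7):
    """Exhaustive per-voxel search over discrete shifts ([0063]).

    feat_img, feat_ref: scalar 3-D feature volumes on a common grid. For each
    integer shift in [-radius, radius]^3, the voxelwise squared difference is
    averaged over an nbhd^3 neighborhood, and the lowest-loss shift is kept
    per voxel. Brute force by design; with radius=6 this evaluates
    13**3 = 2197 candidate shifts, which a GPU implementation would batch.
    """
    best_loss = np.full(feat_img.shape, np.inf)
    best_shift = np.zeros((3,) + feat_img.shape, dtype=int)
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            for dk in range(-radius, radius + 1):
                shifted = np.roll(feat_ref, (di, dj, dk), axis=(0, 1, 2))
                loss = uniform_filter((feat_img - shifted) ** 2, size=nbhd)
                better = loss < best_loss
                best_loss[better] = loss[better]
                for axis, d in enumerate((di, dj, dk)):
                    best_shift[axis][better] = d
    return best_shift, best_loss
```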
[0065] FIG. 6 is a flow diagram of a method 600 of visualizing patient anatomy during a planning and/or execution stage of a medical procedure. The method 600 may be implemented using, for example, the system 100 as described above. Generally, the method 600 facilitates identifying and labeling ROIs with poor visibility and/or poorly defined boundaries (e.g., the pleura, thoracic lymph node stations, etc.) disposed within an anatomical structure (e.g., a chest cavity). Generally, the method 600 may be executed by one or more processors (e.g., the processing unit 120) of a system (e.g., the system 100). More specifically, in some examples, the system implementing the method 600 may obtain a pre-operative image of patient anatomy and label ROIs in the pre-operative image based on reference images (e.g., of other patients) with labeled ROIs. To that end, the system may compute transformations between the reference images and the pre-operative image and map labeled ROIs onto the pre-operative image using the computed transformations. In other examples, the system implementing the method 600 may obtain and label an intraoperative image of the patient anatomy using the techniques of the disclosure. For labeling an intraoperative image, labeled pre-operative images may serve as references. Still further, combining the techniques of the disclosure for pre-operative and intraoperative image labeling, the system may use an "atlas" of labeled reference images (e.g., of reference patients) to label one or more pre-operative images of a patient. During the procedure, the system (or a different system) may label an intraoperative image based on labeled pre-operative images serving as at least some of the references. In summary, the techniques may apply to labeling pre-operative images (e.g., for planning a procedure) and/or intraoperative images.

[0066] At block 610, the method 600 includes obtaining, by the one or more processors (e.g., the processing unit 120) of the system (e.g., the system 100), an image representative of a portion of patient anatomy (e.g., an unlabeled image). An imaging unit (e.g., the imaging unit 110) may generate the images obtained by the system by infrared, terahertz, X-ray, computed tomography (CT) (e.g., CBCT), positron-emission tomography (PET), optical coherence tomography (OCT), magnetic resonance imaging (MRI), fluoroscopy, sonography, or any other suitable modality. The system may obtain images from any suitable number of imaging units and from any suitable combination of modalities. At least one imaging unit may be included in the system. In some examples, an imaging unit for obtaining pre-operative images may be distinct from an imaging unit for obtaining intraoperative images.

[0067] At block 620, the method 600 includes obtaining, by the one or more processors (e.g., the processing unit 120) of the system (e.g., the system 100), a plurality of reference images with labels indicative of respective sets of anatomical ROIs. In some examples, the system may obtain the reference images and labels directly from a database (e.g., the database 150 or 350). In other examples, the system may combine reference images from the database, as described above with reference to FIG. 5A.
[0068] At block 630, the method 600 includes computing one or more transformations between the image representative of the portion of patient anatomy and one or more of the reference images, as described, for example, with reference to FIGS. 5A and 5B. In some examples, the method includes combining reference images prior to computing the transformations with respect to the unlabeled image. In such examples, the system may compute the transformations among the reference images to facilitate combining the reference images. Furthermore, the system may combine reference images based on a variety of tags, as described above. Computing the transformation may include first computing a discrete transformation and then computing a continuous transformation. Computing the discrete transformation may include finding, for each voxel, a discrete shift that minimizes a loss function by searching exhaustively through a set of discrete shifts within a chosen range. The system may compute the loss function at different voxels at least in part by computing differences (between a reference image and the unlabeled image) in image neighborhoods around the respective voxels and/or in segmented image field neighborhoods around the respective voxels. Computing the continuous transformation may include minimizing a local (e.g., in the neighborhood of a pixel or voxel of interest), regularized (as described with reference to FIG. 5B) loss function with respect to continuous shifts using gradient descent and/or convex Adam optimization techniques.

[0069] At block 640, the method 600 includes generating, based at least in part on the one or more transformations computed at block 630, one or more sets of labels indicative of anatomical ROIs within the image representative of the portion of patient anatomy. The system may compute the one or more labels based at least in part on the label fusion techniques and/or label reshaping techniques described above (e.g., with reference to FIG. 5A). Label fusion may be based on weighted sums of labels associated with the reference images, and the weights may vary from voxel to voxel, as described above.

[0070] At block 650, the method 600 includes causing the display device (e.g., display unit 130) to display a GUI depicting at least one of the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy. A practitioner may choose the parameters of the displayed ROIs, such as color, representative geometric contour (e.g., an ellipsoid) or colorized voxels, transparency level, refresh rate, etc. The practitioner may also choose which ROIs to display during which parts of the procedure. For example, the practitioner may choose to display only a subset of thoracic lymph node stations to focus on navigating a biopsy tool to the regions in question.

[0071] FIGS. 7-9B depict diagrams of a medical system that may be used, in some examples, for visualizing and manipulating a medical instrument that includes a flexible elongate device in the vicinity of ROIs within patient anatomy according to any of the methods and systems described above. For example, each reference above to the "system" may refer to a system (e.g., system 700) discussed below, or to a subsystem thereof.

[0072] FIG. 7 is a simplified diagram of a medical system 700 according to some examples. The medical system 700 may include at least portions of the system 100 described with reference to FIG. 1.
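As a purely illustrative sketch of the per-voxel weighted label fusion described at block 640 above (and not a required implementation), the following example fuses reference ROI labels that have already been warped into the patient image space; deriving each per-voxel weight from local image agreement, and the names, patch size, and threshold, are assumptions made for the example.

```python
# Illustrative sketch only: per-voxel weighted fusion of warped reference labels.
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_labels(patient_img, warped_refs, warped_labels, patch=5, threshold=0.5):
    """patient_img: (D,H,W) array; warped_refs: list of (D,H,W) reference images
    warped into patient space; warped_labels: matching list of binary ROI masks.
    Returns a fused binary ROI mask for the patient image."""
    weights = []
    total = np.zeros(patient_img.shape, dtype=float)
    for ref in warped_refs:
        # Local mean absolute difference over a patch; low difference -> high weight.
        mad = uniform_filter(np.abs(patient_img - ref).astype(float), size=patch)
        w = 1.0 / (1e-6 + mad)
        weights.append(w)
        total += w
    fused = np.zeros(patient_img.shape, dtype=float)
    for w, lab in zip(weights, warped_labels):
        fused += (w / total) * lab        # weights vary from voxel to voxel
    return fused >= threshold             # final ROI label
```

Each reference thus votes most strongly where its warped image locally resembles the patient image, which is one way the voxel-varying weights mentioned above can arise.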
The medical system 700 may be suitable for use in, for example, surgical, diagnostic (e.g., biopsy), or therapeutic (e.g., ablation, electroporation, etc.) procedures. While some examples are provided herein with respect to such procedures, any reference to medical or surgical instruments and medical or surgical methods is non-limiting. The systems, instruments, and methods described herein may be used for animals, human cadavers, animal cadavers, portions of human or animal anatomy, non-surgical diagnosis, as well as for industrial systems, general or special purpose robotic systems, general or special purpose teleoperational systems, or robotic medical systems.

[0073] As shown in FIG. 7, medical system 700 may include a manipulator assembly 702 that controls the operation of a medical instrument 704 in performing various procedures on a patient (e.g., patient P on table T, as in FIG. 1). The medical instrument 704 may include the flexible elongate device 140 of FIG. 1 and/or devices 240, 340, 440 of FIGS. 2A-4E. Medical instrument 704 may extend into an internal site within the body of patient P via an opening in the body of patient P. The manipulator assembly 702 may be teleoperated, non-teleoperated, or a hybrid teleoperated and non-teleoperated assembly with one or more degrees of freedom of motion that may be motorized and/or one or more degrees of freedom of motion that may be non-motorized (e.g., manually operated). The manipulator assembly 702 may be mounted to and/or positioned near the patient table T. A master assembly 706 allows an operator O (e.g., a surgeon, a clinician, a physician, or other user, as described above) to control the manipulator assembly 702. In some examples, the master assembly 706 allows the operator O to view the procedural site or other graphical or informational displays. In some examples, the manipulator assembly 702 may be excluded from the medical system 700 and the instrument 704 may be controlled directly by the operator O. In some examples, the manipulator assembly 702 may be manually controlled by the operator O. Direct operator control may include various handles and operator interfaces for hand-held operation of the medical instrument 704.

[0074] The master assembly 706 may be located at a surgeon's console which is in proximity to (e.g., in the same room as) the patient table T on which patient P is located, such as at the side of the patient table T. In some examples, the master assembly 706 is remote from the patient table T, such as in a different room or a different building from the patient table T. The master assembly 706 may include one or more control devices for controlling the manipulator assembly 702. The control devices may include any number of a variety of input devices, such as joysticks, trackballs, scroll wheels, directional pads, buttons, data gloves, trigger-guns, hand-operated controllers, voice recognition devices, motion or presence sensors, and/or the like.

[0075] The manipulator assembly 702 supports the medical instrument 704 and may include a kinematic structure of links that provide a set-up structure. The links may include one or more non-servo controlled links (e.g., one or more links that may be manually positioned and locked in place) and/or one or more servo controlled links (e.g., one or more links that may be controlled in response to commands, such as from a control system 712).
The manipulator assembly 702 may include a plurality of actuators (e.g., motors) that drive inputs on the medical instrument 704 in response to commands, such as from the control system 712. The actuators may include drive systems that move the medical instrument 704 in various ways when coupled to the medical instrument 704. For example, one or more actuators may advance medical instrument 704 into a naturally or surgically created anatomic orifice. Actuators may control articulation of the medical instrument 704, such as by moving the distal end (or any other portion) of medical instrument 704 in multiple degrees of freedom. These degrees of freedom may include three degrees of linear motion (e.g., linear motion along the X, Y, Z Cartesian axes) and three degrees of rotational motion (e.g., rotation about the X, Y, Z Cartesian axes). One or more actuators may control rotation of the medical instrument about a longitudinal axis. Actuators can also be used to move an articulable end effector of medical instrument 704, such as for grasping tissue in the jaws of a biopsy device and/or the like, or may be used to move or otherwise control tools (e.g., imaging tools, ablation tools, biopsy tools, electroporation tools, etc.) that are inserted within the medical instrument 704.

[0076] The control system 712 may include at least portions of the processing unit 120. Additionally or alternatively, the control system 712 may be in communicative connection with the processing unit 120. In some examples, the output of the processing unit 120 according to the techniques described above may cause the control system 712 to autonomously (without input from the operator O) control certain movements of the medical instrument 704.

[0077] The medical system 700 may include a sensor system 708 with one or more sub-systems for receiving information about the manipulator assembly 702 and/or the medical instrument 704. Such sub-systems may include a position sensor system (e.g., that uses electromagnetic (EM) sensors or other types of sensors that detect position or location); a shape sensor system for determining the position, orientation, speed, velocity, pose, and/or shape of a distal end and/or of one or more segments along a flexible body of the medical instrument 704; a visualization system (e.g., using a color imaging device, an infrared imaging device, an ultrasound imaging device, an x-ray imaging device, a fluoroscopic imaging device, a computed tomography (CT) imaging device, a magnetic resonance imaging (MRI) imaging device, or some other type of imaging device) for capturing images, such as from the distal end of medical instrument 704 or from some other location; and/or actuator position sensors such as resolvers, encoders, potentiometers, and the like that describe the rotation and/or orientation of the actuators controlling the medical instrument 704. The sensor system 708 may include at least portions of the imaging unit 110 of FIG. 1.

[0078] The medical system 700 may include a display system 710 for displaying an image or representation of the procedural site and the medical instrument 704. Display system 710 and master assembly 706 may be oriented so operator O can control medical instrument 704 and master assembly 706 with the perception of telepresence. The display system 710 may include at least portions of the display unit 130.
[0079] In some examples, the medical instrument 704 may include a visualization system, which may include an image capture assembly that records a concurrent or real-time image of a procedural site and provides the image to the operator O through one or more displays of display system 710. The image capture assembly may include various types of imaging devices (e.g., imaging unit 110). The concurrent image may be, for example, a two-dimensional image or a three-dimensional image captured by an endoscope positioned within the anatomical procedural site. In some examples, the visualization system may include endoscopic components that may be integrally or removably coupled to medical instrument 704. Additionally or alternatively, a separate endoscope, attached to a separate manipulator assembly, may be used with medical instrument 704 to image the procedural site. The visualization system may be implemented as hardware, firmware, software, or a combination thereof, which interacts with or is otherwise executed by one or more computer processors, such as of the control system 712.

[0080] Display system 710 may also display an image of the procedural site and medical instruments, which may be captured by the visualization system. In some examples, the medical system 700 provides a perception of telepresence to the operator O. For example, images captured by an imaging device at a distal portion of the medical instrument 704 may be presented by the display system 710 to provide the perception of being at the distal portion of the medical instrument 704 to the operator O. The input to the master assembly 706 provided by the operator O may move the distal portion of the medical instrument 704 in a manner that corresponds with the nature of the input (e.g., the distal tip turns right when a trackball is rolled to the right) and results in a corresponding change to the perspective of the images captured by the imaging device at the distal portion of the medical instrument 704. As such, the perception of telepresence for the operator O is maintained as the medical instrument 704 is moved using the master assembly 706. The operator O can manipulate the medical instrument 704 and hand controls of the master assembly 706 as if viewing the workspace in substantially true presence, simulating the experience of an operator that is physically manipulating the medical instrument 704 from within the patient anatomy.

[0081] In some examples, the display system 710 may present virtual images of a procedural site that are created using image data recorded pre-operatively (e.g., prior to the procedure performed by the medical instrument system 200) or intra-operatively (e.g., concurrent with the procedure performed by the medical instrument system 200), such as image data created using computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like. The virtual images may include two-dimensional, three-dimensional, or higher-dimensional (e.g., including, for example, time-based or velocity-based information) images. In some examples, one or more models are created from pre-operative or intra-operative image data sets, and the virtual images are generated using the one or more models.
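As one hedged illustration of creating such a model from an image data set, the sketch below extracts an iso-surface from a CT volume using scikit-image's marching cubes; the function name, the airway-oriented threshold value, and the voxel spacing are assumptions made for the sketch rather than details of the disclosure.

```python
# Illustrative sketch only: build a displayable surface model from a CT volume.
import numpy as np
from skimage import measure

def surface_model(ct_volume: np.ndarray, level: float = -500.0,
                  spacing=(1.0, 1.0, 1.0)):
    """Return (vertices, faces) of the iso-surface at `level` (e.g., Hounsfield
    units near an air/tissue boundary); `spacing` is the voxel size in mm."""
    verts, faces, _normals, _values = measure.marching_cubes(
        ct_volume, level=level, spacing=spacing)
    return verts, faces
```

The resulting vertex/face arrays can be handed to any standard rendering pipeline to produce the virtual images described above.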
[0082] In some examples, for purposes of image-guided medical procedures, display system 710 may display a virtual image that is generated based on tracking the location of medical instrument 704. For example, the tracked location of the medical instrument 704 may be registered (e.g., dynamically referenced) with the model generated using the pre-operative or intra-operative images, with different portions of the model corresponding with different locations of the patient anatomy. As the medical instrument 704 moves through the patient anatomy, the registration is used to determine portions of the model corresponding with the location and/or perspective of the medical instrument 704, and virtual images are generated using the determined portions of the model. This may be done to present the operator O with virtual images of the internal procedural site from viewpoints of medical instrument 704 that correspond with the tracked locations of the medical instrument 704.

[0083] The display system 710 may display, along with images of patient anatomy and the medical instrument 704, labeled ROIs, according to the techniques described above with reference to FIGS. 1A-6.

[0084] The medical system 700 may also include the control system 712, which may include processing circuitry (e.g., the processing unit 120) that implements some or all of the methods or functionality discussed herein. The control system 712 may include at least one memory and at least one processor for controlling the operations of the manipulator assembly 702, the medical instrument 704, the master assembly 706, the sensor system 708, and/or the display system 710. Control system 712 may include instructions (e.g., a non-transitory machine-readable medium storing the instructions) that, when executed by the at least one processor, configure the at least one processor to implement some or all of the methods or functionality discussed herein. While the control system 712 is shown as a single block in FIG. 7, the control system 712 may include two or more separate data processing circuits, with one portion of the processing being performed at the manipulator assembly 702, another portion of the processing being performed at the master assembly 706, and/or the like. In some examples, the control system 712 may include other types of processing circuitry, such as application-specific integrated circuits (ASICs) and/or field-programmable gate arrays (FPGAs). The control system 712 may be implemented using hardware, firmware, software, or a combination thereof.

[0085] In some examples, the control system 712 may receive feedback from the medical instrument 704, such as force and/or torque feedback. Responsive to the feedback, the control system 712 may transmit signals to the master assembly 706. In some examples, the control system 712 may transmit signals instructing one or more actuators of the manipulator assembly 702 to move the medical instrument 704. In some examples, the control system 712 may transmit informational displays regarding the feedback to the display system 710 for presentation or perform other types of actions based on the feedback.

[0086] The control system 712 may include a virtual visualization system to provide navigation assistance to operator O when controlling the medical instrument 704 during an image-guided medical procedure.
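A minimal sketch of the dynamic referencing in paragraph [0082], under the assumption that a prior registration step has produced a 4x4 homogeneous transform from the tracker frame to the model frame, might look as follows; the matrix name and the viewpoint usage are illustrative only.

```python
# Illustrative sketch only: map a tracked instrument tip into the model frame.
import numpy as np

def to_model_frame(p_sensor_xyz, T_model_from_sensor):
    """p_sensor_xyz: (3,) tip position reported by the tracking system;
    T_model_from_sensor: (4,4) homogeneous registration transform."""
    p = np.append(np.asarray(p_sensor_xyz, dtype=float), 1.0)  # homogeneous point
    return (np.asarray(T_model_from_sensor) @ p)[:3]

# A virtual camera placed at to_model_frame(tip, T) can then render the model
# from the instrument's tracked viewpoint, as described for display system 710.
```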
Virtual navigation using the virtual visualization system may be based upon an acquired pre-operative or intra-operative dataset of anatomic passageways of the patient P. The control system 712 or a separate computing device may convert the recorded images, using programmed instructions alone or in combination with operator inputs, into a model of the patient anatomy. The model may include a segmented two-dimensional or three-dimensional composite representation of a partial or an entire anatomic organ or anatomic region. An image data set may be associated with the composite representation. The virtual visualization system may obtain sensor data from the sensor system 708 that is used to compute an (e.g., approximate) location of the medical instrument 704 with respect to the anatomy of patient P. The sensor system 708 may be used to register and display the medical instrument 704 together with the pre-operatively or intra-operatively recorded images. For example, PCT Publication WO 2016/191298 (published December 1, 2016 and titled “Systems and Methods of Registration for Image Guided Surgery”), which is incorporated by reference herein in its entirety, discloses example systems.

[0087] During a virtual navigation procedure, the sensor system 708 may be used to compute the (e.g., approximate) location of the medical instrument 704 with respect to the anatomy of patient P. The location can be used to produce both macro-level (e.g., external) tracking images of the anatomy of patient P and virtual internal images of the anatomy of patient P. The system may include one or more electromagnetic (EM) sensors, fiber optic sensors, and/or other sensors to register and display a medical instrument together with pre-operatively recorded medical images. For example, U.S. Patent No. 8,900,131 (filed May 13, 2011 and titled “Medical System Providing Dynamic Registration of a Model of an Anatomic Structure for Image-Guided Surgery”), which is incorporated by reference herein in its entirety, discloses example systems.

[0088] Medical system 700 may further include operations and support systems (not shown) such as illumination systems, steering control systems, irrigation systems, and/or suction systems. In some examples, the medical system 700 may include more than one manipulator assembly and/or more than one master assembly. The exact number of manipulator assemblies may depend on the medical procedure and space constraints within the procedural room, among other factors. Multiple master assemblies may be co-located or they may be positioned in separate locations. Multiple master assemblies may allow more than one operator to control one or more manipulator assemblies in various combinations.

[0089] FIG. 8A is a simplified diagram of a medical instrument system 800 according to some examples. The medical instrument system 800 includes a flexible elongate device 802 (e.g., device 140, 240, 340, and/or 440), also referred to as elongate device 802, a drive unit 804, and a medical tool 826, which collectively are an example of a medical instrument 704 of a medical system 700. The medical system 700 may be a teleoperated system, a non-teleoperated system, or a hybrid teleoperated and non-teleoperated system, as explained with reference to FIG. 7. A visualization system 831, tracking system 830, and navigation system 832 are also shown in FIG. 8A and are example components of the control system 712 of the medical system 700.
In some examples, the medical instrument system 800 may be used for non-teleoperational exploratory procedures or in procedures involving traditional manually operated medical instruments, such as endoscopy. The medical instrument system 800 may be used to gather (e.g., measure) a set of data points corresponding to locations within anatomic passageways of a patient, such as patient P.

[0090] The elongate device 802 is coupled to the drive unit 804. The elongate device 802 includes a channel 821 through which the medical tool 826 may be inserted. The elongate device 802 navigates within patient anatomy to deliver the medical tool 826 to a procedural site. The elongate device 802 includes a flexible body 816 having a proximal end 817 and a distal end 818. In some examples, the flexible body 816 may have an approximately 3 mm outer diameter. Other flexible body outer diameters may be larger or smaller.

[0091] Medical instrument system 800 may include the tracking system 830 for determining the position, orientation, speed, velocity, pose, and/or shape of the flexible body 816 at the distal end 818 and/or of one or more segments 824 along flexible body 816, as will be described in further detail below. The tracking system 830 may include one or more sensors and/or imaging devices (e.g., imaging unit 110). The flexible body 816, such as the length between the distal end 818 and the proximal end 817, may include multiple segments 824. The tracking system 830 may be implemented using hardware, firmware, software, or a combination thereof. In some examples, the tracking system 830 is part of control system 712 shown in FIG. 7. The tracking system 830 may implement at least some of the techniques described with reference to FIGS. 1A-6 and, to that end, may include at least portions of or be in communicative connection with the processing unit 120 of FIG. 1A.

[0092] Tracking system 830 may track the distal end 818 and/or one or more of the segments 824 of the flexible body 816 using a shape sensor 822. The shape sensor 822 may include an optical fiber aligned with the flexible body 816 (e.g., provided within an interior channel of the flexible body 816 or mounted externally along the flexible body 816). In some examples, the optical fiber may have a diameter of approximately 800 μm. In other examples, the diameter may be larger or smaller. The optical fiber of the shape sensor 822 may form a fiber optic bend sensor for determining the shape of flexible body 816. Optical fibers including Fiber Bragg Gratings (FBGs) may be used to provide strain measurements in structures in one or more dimensions. Various systems and methods for monitoring the shape and relative position of an optical fiber in three dimensions, which may be applicable in some examples, are described in U.S. Patent Application Publication No. 2006/0013523 (filed July 13, 2005 and titled “Fiber optic position and shape sensing device and method relating thereto”); U.S. Patent No. 7,772,541 (filed on March 12, 2008 and titled “Fiber Optic Position and/or Shape Sensing Based on Rayleigh Scatter”); and U.S. Patent No. 8,773,650 (filed on Sept. 2, 2010 and titled “Optical Position and/or Shape Sensing”), which are all incorporated by reference herein in their entireties.
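The disclosure does not prescribe a particular algorithm for turning strain readings into a shape; the sketch below shows one simple possibility in which per-segment pitch and yaw bend angles (assumed to be derived upstream from FBG strain measurements) are chained into a 3D centerline for the flexible body. The segment length and rotation convention are assumptions of the sketch.

```python
# Illustrative sketch only: chain per-segment bends into a centerline polyline.
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def shape_from_bends(pitch, yaw, seg_len_mm=5.0):
    """pitch, yaw: per-segment bend angles in radians; returns (N+1, 3) points."""
    R = np.eye(3)
    p = np.zeros(3)
    points = [p.copy()]
    for a, b in zip(pitch, yaw):
        R = R @ rot_x(a) @ rot_y(b)                    # accumulate orientation
        p = p + R @ np.array([0.0, 0.0, seg_len_mm])   # step along local z-axis
        points.append(p.copy())
    return np.array(points)
```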
Sensors in some examples may employ other suitable strain sensing techniques, such as Rayleigh scattering, Raman scattering, Brillouin scattering, and fluorescence scattering.

[0093] In some examples, the shape of the flexible body 816 may be determined using other techniques. For example, a history of the position and/or pose of the distal end 818 of the flexible body 816 can be used to reconstruct the shape of flexible body 816 over an interval of time (e.g., as the flexible body 816 is advanced or retracted within a patient anatomy). In some examples, the tracking system 830 may alternatively and/or additionally track the distal end 818 of the flexible body 816 using a position sensor system 820. Position sensor system 820 may be a component of an EM sensor system, with the position sensor system 820 including one or more position sensors. Although the position sensor system 820 is shown as being near the distal end 818 of the flexible body 816 to track the distal end 818, the number and location of the position sensors of the position sensor system 820 may vary to track different regions along the flexible body 816. In one example, the position sensors include conductive coils that may be subjected to an externally generated electromagnetic field. Each coil of position sensor system 820 may produce an induced electrical signal having characteristics that depend on the position and orientation of the coil relative to the externally generated electromagnetic field. The position sensor system 820 may measure one or more position coordinates and/or one or more orientation angles associated with one or more portions of flexible body 816. In some examples, the position sensor system 820 may be configured and positioned to measure six degrees of freedom, e.g., three position coordinates X, Y, Z and three orientation angles indicating pitch, yaw, and roll of a base point. In some examples, the position sensor system 820 may be configured and positioned to measure five degrees of freedom, e.g., three position coordinates X, Y, Z and two orientation angles indicating pitch and yaw of a base point. Further description of a position sensor system, which may be applicable in some examples, is provided in U.S. Patent No. 6,380,732 (filed August 11, 1999 and titled “Six-Degree of Freedom Tracking System Having a Passive Transponder on the Object Being Tracked”), which is incorporated by reference herein in its entirety.

[0094] In some examples, the tracking system 830 may alternatively and/or additionally rely on a collection of pose, position, and/or orientation data stored for a point of an elongate device 802 and/or medical tool 826 captured during one or more cycles of alternating motion, such as breathing. This stored data may be used to develop shape information about the flexible body 816. In some examples, a series of position sensors (not shown), such as EM sensors like the sensors in position sensor system 820 or some other type of position sensors, may be positioned along the flexible body 816 and used for shape sensing. In some examples, a history of data from one or more of these position sensors taken during a procedure may be used to represent the shape of elongate device 802, particularly if an anatomic passageway is generally static.

[0095] FIG. 8B is a simplified diagram of the medical tool 826 within the elongate device 802 according to some examples.
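As a hedged sketch of using a position history captured across breathing cycles (paragraph [0094]), the following example keeps only samples recorded near a chosen respiratory phase, so that the retained positions describe the body shape in a phase-consistent way; the phase signal, the tolerance, and the function name are assumptions made for the example.

```python
# Illustrative sketch only: select phase-consistent position samples.
import numpy as np

def phase_consistent_samples(positions, phases, target_phase=0.0, tol=0.05):
    """positions: (N,3) recorded points for a device location; phases: (N,)
    respiratory phase in [0, 1). Returns samples recorded near target_phase."""
    phases = np.asarray(phases, dtype=float)
    diff = np.abs(phases - target_phase)
    diff = np.minimum(diff, 1.0 - diff)        # wrap-around phase distance
    return np.asarray(positions)[diff <= tol]
```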
The flexible body 816 of the elongate device 802 may include the channel 821 sized and shaped to receive the medical tool 826. In some examples, the medical tool 826 may be used for procedures such as diagnostics, imaging, surgery, biopsy, ablation, illumination, irrigation, suction, electroporation, etc. Medical tool 826 can be deployed through channel 821 of flexible body 816 and operated at a procedural site within the anatomy. Medical tool 826 may be, for example, an image capture probe, a biopsy tool (e.g., a needle, grasper, brush, etc.), an ablation tool (e.g., a laser ablation tool, radio frequency (RF) ablation tool, cryoablation tool, thermal ablation tool, heated liquid ablation tool, etc.), an electroporation tool, and/or another surgical, diagnostic, or therapeutic tool. In some examples, the medical tool 826 may include an end effector having a single working member such as a scalpel, a blunt blade, an optical fiber, an electrode, and/or the like. Other types of end effectors may include, for example, forceps, graspers, scissors, staplers, clip appliers, and/or the like. Other end effectors may further include electrically activated end effectors such as electrosurgical electrodes, transducers, sensors, and/or the like.

[0096] The medical tool 826 may be a biopsy tool used to remove sample tissue or a sampling of cells from a target anatomic location. In some examples, the biopsy tool is a flexible needle. The biopsy tool may further include a sheath that can surround the flexible needle to protect the needle and the interior surface of the channel 821 when the biopsy tool is within the channel 821. The medical tool 826 may be an image capture probe that includes a distal portion with a stereoscopic or monoscopic camera that may be placed at or near the distal end 818 of flexible body 816 for capturing images (e.g., still or video images). The captured images may be processed by the visualization system 831 for display and/or provided to the tracking system 830 to support tracking of the distal end 818 of the flexible body 816 and/or one or more of the segments 824 of the flexible body 816. The image capture probe may include a cable for transmitting the captured image data that is coupled to an imaging device at the distal portion of the image capture probe. In some examples, the image capture probe may include a fiber-optic bundle, such as a fiberscope, that couples to a more proximal imaging device of the visualization system 831. The image capture probe may be single-spectral or multi-spectral, for example, capturing image data in one or more of the visible, near-infrared, infrared, and/or ultraviolet spectrums. The image capture probe may also include one or more light emitters that provide illumination to facilitate image capture. In some examples, the image capture probe may use ultrasound, x-ray, fluoroscopy, CT, MRI, or other types of imaging technology.

[0097] In some examples, the image capture probe is inserted within the flexible body 816 of the elongate device 802 to facilitate visual navigation of the elongate device 802 to a procedural site and then is replaced within the flexible body 816 with another type of medical tool 826 that performs the procedure.
In some examples, the image capture probe may be within the flexible body 816 of the elongate device 802 along with another type of medical tool 826 to facilitate simultaneous image capture and tissue intervention, such as within the same channel 821 or in separate channels. A medical tool 826 may be advanced from the opening of the channel 821 to perform the procedure (or some other functionality) and then retracted back into the channel 821 when the procedure is complete. The medical tool 826 may be removed from the proximal end 817 of the flexible body 816 or from another optional instrument port (not shown) along flexible body 816.

[0098] In some examples, the elongate device 802 may include integrated imaging capability rather than utilize a removable image capture probe. For example, the imaging device (or fiber-optic bundle) and the light emitters may be located at the distal end 818 of the elongate device 802. The flexible body 816 may include one or more dedicated channels that carry the cable(s) and/or optical fiber(s) between the distal end 818 and the visualization system 831. Here, the medical instrument system 800 can perform simultaneous imaging and tool operations.

[0099] In some examples, the medical tool 826 is capable of controllable articulation. The medical tool 826 may house cables (which may also be referred to as pull wires), linkages, or other actuation controls (not shown) that extend between its proximal and distal ends to controllably bend the distal end of medical tool 826, such as discussed herein for the flexible elongate device 802. The medical tool 826 may be coupled to a drive unit 804 and the manipulator assembly 702. In these examples, the elongate device 802 may be excluded from the medical instrument system 800 or may be a flexible device that does not have controllable articulation. Steerable instruments or tools, applicable in some examples, are further described in detail in U.S. Patent No. 7,316,681 (filed on Oct. 4, 2005 and titled “Articulated Surgical Instrument for Performing Minimally Invasive Surgery with Enhanced Dexterity and Sensitivity”) and U.S. Patent No. 9,259,274 (filed Sept. 30, 2008 and titled “Passive Preload and Capstan Drive for Surgical Instruments”), which are incorporated by reference herein in their entireties.

[0100] The flexible body 816 of the elongate device 802 may also or alternatively house cables, linkages, or other steering controls (not shown) that extend between the drive unit 804 and the distal end 818 to controllably bend the distal end 818 as shown, for example, by dashed line depictions 819 of the distal end 818 in FIG. 8A. In some examples, at least four cables are used to provide independent up-down steering to control a pitch of the distal end 818 and left-right steering to control a yaw of the distal end 818. In these examples, the flexible elongate device 802 may be a steerable catheter. Examples of steerable catheters, applicable in some examples, are described in detail in PCT Publication WO 2019/018736 (published Jan. 24, 2019 and titled “Flexible Elongate Device Systems and Methods”), which is incorporated by reference herein in its entirety.
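For intuition only, a constant-curvature, small-angle model can relate antagonistic cable displacements to the distal-tip pitch and yaw described above; the cable offset radius r and the linear model are assumptions of this sketch, not parameters disclosed herein.

```python
# Illustrative sketch only: four-cable steering under a constant-curvature model.
def tip_angles(d_up, d_down, d_left, d_right, r=2.0):
    """Cable displacements in mm (positive = pulled); r is the cable offset from
    the neutral axis in mm. Returns (pitch, yaw) in radians."""
    pitch = (d_up - d_down) / (2.0 * r)    # differential up/down pull bends pitch
    yaw = (d_right - d_left) / (2.0 * r)   # differential left/right pull bends yaw
    return pitch, yaw
```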
[0101] In examples where the elongate device 802 and/or medical tool 826 are actuated by a teleoperational assembly (e.g., the manipulator assembly 702), the drive unit 804 may include drive inputs that removably couple to and receive power from drive elements, such as actuators, of the teleoperational assembly. In some examples, the elongate device 802 and/or medical tool 826 may include gripping features, manual actuators, or other components for manually controlling the motion of the elongate device 802 and/or medical tool 826. The elongate device 802 may be steerable or, alternatively, the elongate device 802 may be non-steerable with no integrated mechanism for operator control of the bending of distal end 818. In some examples, one or more channels 821 (which may also be referred to as lumens), through which medical tools 826 can be deployed and used at a target anatomical location, may be defined by the interior walls of the flexible body 816 of the elongate device 802.

[0102] In some examples, the medical instrument system 800 (e.g., the elongate device 802 or medical tool 826) may include a flexible bronchial instrument, such as a bronchoscope or bronchial catheter, for use in examination, diagnosis, biopsy, and/or treatment of a lung. The medical instrument system 800 may also be suited for navigation and treatment of other tissues, via natural or surgically created connected passageways, in any of a variety of anatomic systems, including the colon, the intestines, the kidneys and kidney calices, the brain, the heart, the circulatory system including vasculature, and/or the like.

[0103] The information from the tracking system 830 may be sent to the navigation system 832, where the information may be combined with information from the visualization system 831 and/or pre-operatively obtained models to provide the physician, clinician, surgeon, or other operator with real-time position information. The tracking system 830, the navigation system 832, and the visualization system 831 may cooperatively implement, at least partially, the functionality of the system 100 in implementing the techniques described with reference to FIGS. 1A-6. In some examples, the real-time position information may be displayed on the display system 710 for use in the control of the medical instrument system 800. In some examples, the navigation system 832 may utilize the position information as feedback for positioning medical instrument system 800. Various systems for using fiber optic sensors to register and display a surgical instrument with surgical images, applicable in some examples, are provided in U.S. Patent No. 8,900,131 (filed May 13, 2011 and titled “Medical System Providing Dynamic Registration of a Model of an Anatomic Structure for Image-Guided Surgery”), which is incorporated by reference herein in its entirety.

[0104] FIGS. 9A and 9B are simplified diagrams of side views of a patient coordinate space including a medical instrument mounted on an insertion assembly according to some examples. As shown in FIGS. 9A and 9B, a surgical environment 900 may include the patient P positioned on the patient table T. Patient P may be stationary within the surgical environment 900 in the sense that gross patient movement is limited by sedation, restraint, and/or other means. Cyclic anatomic motion, including respiration and cardiac motion, of patient P may continue.
Within surgical environment 900, a medical instrument 904 is used to perform a medical procedure which may include, for example, surgery, biopsy, ablation, illumination, irrigation, suction, or electroporation. The medical instrument 904 may also be used to perform other types of procedures, such as a registration procedure to associate the position, orientation, and/or pose data captured by the sensor system 708 to a desired (e.g., anatomical or system) reference frame. The medical instrument 904 may be, for example, the medical instrument 704. In some examples, the medical instrument 904 may include an elongate device 910 (e.g., a catheter) coupled to an instrument body 912. Elongate device 910 may be the elongate device 140 of FIG. 1. Elongate device 910 includes one or more channels sized and shaped to receive a medical tool.

[0105] Elongate device 910 may also include one or more sensors (e.g., components of the sensor system 708). In some examples, a shape sensor 914 may be fixed at a proximal point 916 on the instrument body 912. The proximal point 916 of the shape sensor 914 may be movable with the instrument body 912, and the location of the proximal point 916 with respect to a desired reference frame may be known (e.g., via a tracking sensor or other tracking device). The shape sensor 914 may measure a shape from the proximal point 916 to another point, such as a distal end 918 of the elongate device 910. The shape sensor 914 may be aligned with the elongate device 910 (e.g., provided within an interior channel or mounted externally). In some examples, the shape sensor 914 may include optical fibers used to generate shape information for the elongate device 910.

[0106] In some examples, position sensors (e.g., EM sensors) may be incorporated into the medical instrument 904. A series of position sensors may be positioned along the flexible elongate device 910 and used for shape sensing. Position sensors may be used alternatively to the shape sensor 914 or with the shape sensor 914, such as to improve the accuracy of shape sensing or to verify shape information.

[0107] Elongate device 910 may house cables, linkages, or other steering controls that extend between the instrument body 912 and the distal end 918 to controllably bend the distal end 918. In some examples, at least four cables are used to provide independent up-down steering to control a pitch of distal end 918 and left-right steering to control a yaw of distal end 918. The instrument body 912 may include drive inputs that removably couple to and receive power from drive elements, such as actuators, of a manipulator assembly.

[0108] The instrument body 912 may be coupled to an instrument carriage 906. The instrument carriage 906 may be mounted to an insertion stage 908 that is fixed within the surgical environment 900. Alternatively, the insertion stage 908 may be movable but have a known location (e.g., via a tracking sensor or other tracking device) within surgical environment 900. Instrument carriage 906 may be a component of a manipulator assembly (e.g., manipulator assembly 702) that couples to the medical instrument 904 to control insertion motion (e.g., motion along an insertion axis A) and/or motion of the distal end 918 of the elongate device 910 in multiple directions, such as yaw, pitch, and/or roll.
The instrument carriage 906 or insertion stage 908 may include actuators, such as servomotors, that control motion of instrument carriage 906 along the insertion stage 908.

[0109] A sensor device 920, which may be a component of the sensor system 708, may provide information about the position of the instrument body 912 as it moves relative to the insertion stage 908 along the insertion axis A. The sensor device 920 may include one or more resolvers, encoders, potentiometers, and/or other sensors that measure the rotation and/or orientation of the actuators controlling the motion of the instrument carriage 906, thus indicating the motion of the instrument body 912. In some examples, the insertion stage 908 has a linear track as shown in FIGS. 9A and 9B. In some examples, the insertion stage 908 may have a curved track or a combination of curved and linear track sections.

[0110] FIG. 9A shows the instrument body 912 and the instrument carriage 906 in a retracted position along the insertion stage 908. In this retracted position, the proximal point 916 is at a position L0 on the insertion axis A. The location of the proximal point 916 may be set to a zero value and/or other reference value to provide a base reference (e.g., corresponding to the origin of a desired reference frame) to describe the position of the instrument carriage 906 along the insertion stage 908. In the retracted position, the distal end 918 of the elongate device 910 may be positioned just inside an entry orifice of patient P. Also in the retracted position, the data captured by the sensor device 920 may be set to a zero value and/or other reference value (e.g., I=0). In FIG. 9B, the instrument body 912 and the instrument carriage 906 have advanced along the linear track of insertion stage 908, and the distal end 918 of the elongate device 910 has advanced into patient P. In this advanced position, the proximal point 916 is at a position L1 on the insertion axis A. In some examples, the rotation and/or orientation of the actuators measured by the sensor device 920, indicating movement of the instrument carriage 906 along the insertion stage 908, and/or one or more position sensors associated with the instrument carriage 906 and/or the insertion stage 908 may be used to determine the position L1 of the proximal point 916 relative to the position L0. In some examples, the position L1 may further be used as an indicator of the distance or insertion depth to which the distal end 918 of the elongate device 910 is inserted into the passageway(s) of the anatomy of patient P.

[0111] One or more components of the examples discussed in this disclosure, such as control system 712, may be implemented in software for execution on one or more processors of a computer system. The software may include code that, when executed by the one or more processors, configures the one or more processors to perform various functionalities as discussed herein. The code may be stored in a non-transitory computer readable storage medium (e.g., a memory, magnetic storage, optical storage, solid-state storage, etc.). The computer readable storage medium may be part of a computer readable storage device, such as an electronic circuit, a semiconductor device, a semiconductor memory device, a read only memory (ROM), a flash memory, an erasable programmable read only memory (EPROM), a floppy diskette, a CD-ROM, an optical disk, a hard disk, or other storage device.
The code may be downloaded via computer networks such as the Internet, an intranet, etc. for storage on the computer readable storage medium. The code may be executed by any of a wide variety of centralized or distributed data processing architectures. The programmed instructions of the code may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein. The components of the computing systems discussed herein may be connected using wired and/or wireless connections. In some examples, the wireless connections may use wireless communication protocols such as Bluetooth, near-field communication (NFC), Infrared Data Association (IrDA), home radio frequency (HomeRF), IEEE 802.11, Digital Enhanced Cordless Telecommunications (DECT), and wireless medical telemetry service (WMTS).

[0112] Various general-purpose computer systems may be used to perform one or more processes, methods, or functionalities described herein. Additionally or alternatively, various specialized computer systems may be used to perform one or more processes, methods, or functionalities described herein. In addition, a variety of programming languages may be used to implement one or more of the processes, methods, or functionalities described herein.

[0113] While certain examples have been described above and shown in the accompanying drawings, it is to be understood that such examples are merely illustrative and that the scope of the disclosure is not limited to the specific constructions and arrangements shown and described, since various other alternatives, modifications, and equivalents will be appreciated by those with ordinary skill in the art.

Claims

WHAT IS CLAIMED:
1. One or more non-transitory, computer readable media storing instructions that, when executed by one or more processors, cause the one or more processors to:
obtain an image representative of a portion of patient anatomy;
obtain a plurality of reference images with labels indicative of respective sets of anatomical regions of interest (ROIs);
compute one or more transformations between the image representative of the portion of patient anatomy and one or more of the plurality of reference images;
based on the one or more transformations, generate one or more sets of labels indicative of anatomical ROIs within the image representative of the portion of patient anatomy; and
cause a display device to display a graphical user interface (GUI) depicting at least one of the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.
2. The computer readable media of claim 1 wherein the portion of patient anatomy includes at least a portion of a thoracic region and the anatomical ROIs include at least one thoracic lymph node station.
3. The computer readable media of claim 2 wherein the at least one thoracic lymph node station includes two or more thoracic lymph node stations.
4. The computer readable media of any one of claims 1-3 wherein the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy depicted in the GUI includes a geometric contour.
5. The computer readable media of claim 4 wherein the geometric contour is an ellipsoid.
6. The computer readable media of any one of claims 1-5 wherein the GUI further depicts the portion of patient anatomy with colorized pixels or voxels based on the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.
7. The computer readable media of any one of claims 1-6 wherein computing the one or more transformations includes: a discrete registration by search, and a deformable transformation by optimization.
8. The computer readable media of any one of claims 1-7 wherein computing the one or more transformations includes computing at least one of: B-spline or P-spline transforms.
9. The computer readable media of any one of claims 1-8 wherein computing the one or more transformations is based at least in part on a machine learning model.
10. The computer readable media of claim 9 wherein the machine learning model is a shifted windows transformer.
11. The computer readable media of any one of claims 1-10 wherein computing the one or more transformations is based at least in part on computing a cost function, the cost function including at least one of: a mean absolute error (MAE), Dice loss, and/or smoothness regularizations.
12. The computer readable media of any one of claims 1-11 wherein at least one of the plurality of reference images is computed based on a combination of at least two of the plurality of reference images.
13. The computer readable media of claim 12 wherein the combination of at least two of the plurality of reference images is a weighted average of the at least two of the plurality of reference images.
14. The computer readable media of claim 13 wherein computing at least some component weights for the weighted average is based at least in part on at least one of: patient gender, patient age, patient weight, patient size, imaging modality, and/or point in a breathing cycle of a respective reference image of the at least two of the plurality of reference images.
15. The computer readable media of claim 13 or 14 wherein computing at least some component weights for the weighted average is based at least in part on a metric indicative of quality of transformation between a respective reference image and at least one of: (i) the image representative of a portion of patient anatomy, and/or (ii) another one of the reference images.
16. The computer readable media of any one of claims 1-15 wherein at least one of the plurality of reference images has an associated tag indicative of patient gender, patient age, patient weight, patient size, imaging modality, and/or point in a breathing cycle.
17. The computer readable media of any one of claims 1-16 wherein the one or more processors include a graphical processing unit configured to at least in part compute the one or more transformations.
18. A system for visualizing patient anatomy during a medical procedure, the system comprising:
a display device;
one or more processors; and
one or more non-transitory, computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to:
obtain an image representative of a portion of patient anatomy;
obtain a plurality of reference images with labels indicative of respective sets of one or more anatomical regions of interest (ROIs);
compute one or more transformations between the image representative of the portion of patient anatomy and one or more of the plurality of reference images;
based at least in part on the one or more transformations, generate one or more sets of labels indicative of anatomical ROIs within the image representative of the portion of patient anatomy; and
cause the display device to display a graphical user interface (GUI) depicting at least one of the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.
19. The system of claim 18 wherein the portion of patient anatomy includes at least a portion of a thoracic region and the anatomical ROIs include at least one thoracic lymph node station.
20. The system of claim 19 wherein the at least one thoracic lymph node station includes two or more thoracic lymph node stations.
21. The system of any one of claims 18-20 wherein the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy depicted in the GUI includes a geometric contour.
22. The system of claim 21 wherein the geometric contour is an ellipsoid.
23. The system of any one of claims 18-22 wherein the GUI further depicts the portion of patient anatomy with colorized pixels or voxels based on the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.
24. The system of any one of claims 18-23 wherein computing the one or more transformations includes: a discrete registration by search, and a deformable transformation by optimization.
25. The system of any one of claims 18-24 wherein computing the one or more transformations includes computing at least one of: B-spline or P-spline transforms.
26. The system of any one of claims 18-25 wherein computing the one or more transformations is based at least in part on a machine learning model.
27. The system of claim 26 wherein the machine learning model is a shifted windows transformer.
28. The system of any one of claims 18-27 wherein computing the one or more transformations is based at least in part on computing a cost function, the cost function including at least one of: a mean absolute error (MAE), Dice loss, and/or smoothness regularizations.
29. The system of any one of claims 18-28 wherein at least one of the plurality of reference images is computed based on a combination of at least two of the plurality of reference images.
30. The system of claim 29 wherein the combination of at least two of the plurality of reference images is a weighted average of the at least two of the plurality of reference images.
31. The system of claim 30 wherein computing at least some component weights for the weighted average is based at least in part on at least one of: patient gender, patient age, patient weight, patient size, imaging modality, and/or point in a breathing cycle of a respective reference image of the at least two of the plurality of reference images.
32. The system of claim 30 or 31 wherein computing at least some component weights for the weighted average is based at least in part on a metric indicative of quality of transformation between a respective reference image and at least one of: (i) the image representative of a portion of patient anatomy, and/or (ii) another one of the reference images.
33. The system of any one of claims 18-32 wherein at least one of the plurality of reference images has an associated tag indicative of patient gender, patient age, patient weight, patient size, imaging modality, and/or point in a breathing cycle.
34. The system of any one of claims 18-33 wherein the one or more processors include a graphical processing unit configured to at least in part compute the one or more transformations.
35. A method of visualizing patient anatomy during a medical procedure, the method comprising:
obtaining, by one or more processors, an image representative of a portion of patient anatomy;
obtaining, by the one or more processors, a plurality of reference images with labels indicative of respective sets of one or more anatomical regions of interest (ROIs);
computing, by the one or more processors, one or more transformations between the image representative of the portion of patient anatomy and one or more of the plurality of reference images;
generating, by the one or more processors and based at least in part on the one or more transformations, one or more sets of labels indicative of anatomical ROIs within the image representative of the portion of patient anatomy; and
causing a display device to display a graphical user interface (GUI) depicting at least one of the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.
36. The method of claim 35 wherein the portion of patient anatomy includes at least a portion of a thoracic region and the anatomical ROIs include at least one thoracic lymph node station.
37. The method of claim 36 wherein the at least one thoracic lymph node station includes two or more thoracic lymph node stations.

38. The method of any one of claims 35-37 wherein the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy depicted in the GUI includes a geometric contour.

39. The method of claim 38 wherein the geometric contour is an ellipsoid.

40. The method of any one of claims 35-39 wherein the GUI further depicts the portion of patient anatomy with colorized pixels or voxels based on the one or more sets of labels indicative of the anatomical ROIs within the image representative of the portion of patient anatomy.

41. The method of any one of claims 35-40 wherein computing the one or more transformations includes: a discrete registration by search, and a deformable transformation by optimization.

42. The method of any one of claims 35-41 wherein computing the one or more transformations includes computing at least one of: B-spline or P-spline transforms.

43. The method of any one of claims 35-42 wherein computing the one or more transformations is based at least in part on a machine learning model.

44. The method of claim 43 wherein the machine learning model is a shifted windows transformer.

45. The method of any one of claims 35-44 wherein computing the one or more transformations is based at least in part on computing a cost function, the cost function including at least one of: a mean absolute error (MAE), Dice loss, and/or smoothness regularizations.

46. The method of any one of claims 35-45 wherein at least one of the plurality of reference images is computed based on a combination of at least two of the plurality of reference images.

47. The method of claim 46 wherein the combination of at least two of the plurality of reference images is a weighted average of the at least two of the plurality of reference images.

48. The method of claim 47 wherein computing at least some component weights for the weighted average is based at least in part on at least one of: patient gender, patient age, patient weight, patient size, imaging modality, and/or point in a breathing cycle of a respective reference image of the at least two of the plurality of reference images.

49. The method of claim 47 or 48 wherein computing at least some component weights for the weighted average is based at least in part on a metric indicative of quality of transformation between a respective reference image and at least one of: (i) the image representative of a portion of patient anatomy, and/or (ii) another one of the reference images.

50. The method of any one of claims 35-49 wherein at least one of the plurality of reference images has an associated tag indicative of patient gender, patient age, patient weight, patient size, imaging modality, and/or point in a breathing cycle.

51. The method of any one of claims 35-50 wherein the one or more processors include a graphical processing unit configured to at least in part compute the one or more transformations.
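Finally, by way of non-limiting illustration of the weighted-average combination recited in claims 13-15, 29-32, and 46-49, the following NumPy sketch fuses co-registered reference label maps using component weights formed from a metadata-similarity score and a registration-quality metric; the function name, score values, and threshold are assumptions of the sketch and not part of any claim.

```python
# Illustrative sketch only: blending co-registered reference label maps with
# component weights (claims 30-32 / 47-49). Assumes NumPy; inputs hypothetical.
import numpy as np

def blend_reference_labels(label_maps, metadata_similarity, registration_quality):
    """Fuse binary ROI masks from K references already warped into patient space.

    label_maps:           list of K boolean arrays, each of shape (D, H, W)
    metadata_similarity:  K scores derived from tags such as patient age, weight,
                          size, imaging modality, breathing-cycle point (claim 48)
    registration_quality: K scores, e.g. post-registration similarity between a
                          warped reference and the patient image (claim 49)
    """
    w = np.asarray(metadata_similarity) * np.asarray(registration_quality)
    w = w / w.sum()                                     # normalized component weights
    stacked = np.stack(label_maps).astype(np.float64)   # shape (K, D, H, W)
    fused = np.tensordot(w, stacked, axes=1)            # voxel-wise weighted average
    return fused >= 0.5                                 # consensus ROI mask

# Hypothetical call: three references with comparable metadata similarity,
# one poorly registered reference down-weighted by its quality score.
masks = [np.zeros((4, 4, 4), dtype=bool) for _ in range(3)]
fused = blend_reference_labels(masks, [0.9, 0.9, 0.9], [0.95, 0.90, 0.40])
```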
Application PCT/US2024/042641, priority date 2023-08-18, filed 2024-08-16: Atlas-based planning and navigation for medical procedures (status: Pending; published as WO2025042718A1 (en)).

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US202363520488P | 2023-08-18 | 2023-08-18
US 63/520,488 | 2023-08-18

Publications (1)

Publication Number | Publication Date
WO2025042718A1 (en) | 2025-02-27

Family

ID=92792665

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
PCT/US2024/042641 (Pending; published as WO2025042718A1 (en)) | Atlas-based planning and navigation for medical procedures | 2023-08-18 | 2024-08-16

Country Status (1)

Country | Link
WO | WO2025042718A1 (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US7316681B2 (en) | 1996-05-20 | 2008-01-08 | Intuitive Surgical, Inc. | Articulated surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US6380732B1 (en) | 1997-02-13 | 2002-04-30 | Super Dimension Ltd. | Six-degree of freedom tracking system having a passive transponder on the object being tracked
US20060013523A1 (en) | 2004-07-16 | 2006-01-19 | Luna Innovations Incorporated | Fiber optic position and shape sensing device and method relating thereto
US7772541B2 (en) | 2004-07-16 | 2010-08-10 | Luna Innovations Incorporated | Fiber optic position and/or shape sensing based on Rayleigh scatter
US9259274B2 (en) | 2008-09-30 | 2016-02-16 | Intuitive Surgical Operations, Inc. | Passive preload and capstan drive for surgical instruments
US8773650B2 (en) | 2009-09-18 | 2014-07-08 | Intuitive Surgical Operations, Inc. | Optical position and/or shape sensing
US8900131B2 (en) | 2011-05-13 | 2014-12-02 | Intuitive Surgical Operations, Inc. | Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery
WO2016191298A1 (en) | 2015-05-22 | 2016-12-01 | Intuitive Surgical Operations, Inc. | Systems and methods of registration for image guided surgery
US20200170623A1 (en) | 2017-05-24 | 2020-06-04 | Body Vision Medical Ltd. | Methods for using radial endobronchial ultrasound probes for three-dimensional reconstruction of images and improved target localization *
WO2019018736A2 (en) | 2017-07-21 | 2019-01-24 | Intuitive Surgical Operations, Inc. | Flexible elongate device systems and methods

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Han R et al., "Deformable registration of MRI to intraoperative cone-beam CT of the brain using a joint synthesis and registration network," Progress in Biomedical Optics and Imaging, SPIE, vol. 12034, 4 April 2022, p. 1203407, DOI: 10.1117/12.2611783 *
Jiang Wei et al., "Medical images fusion by using weighted least squares filter and sparse representation," Computers & Electrical Engineering, vol. 67, 1 April 2018, pp. 252-266, DOI: 10.1016/j.compeleceng.2018.03.037 *
Olteanu Luiza A. M. et al., "Evaluation of Deformable Image Coregistration in Adaptive Dose Painting by Numbers for Head-and-Neck Cancer," International Journal of Radiation Oncology Biology Physics, vol. 83, no. 2, 8 July 2011, pp. 696-703, DOI: 10.1016/j.ijrobp.2011.07.037 *
Simonyan Karen et al., "Immediate Structured Visual Search for Medical Images," Lecture Notes in Computer Science, Springer, Berlin, Heidelberg, 18 September 2011, pp. 288-296 *

Similar Documents

Publication | Title
US20240041531A1 (en) | Systems and methods for registering elongate devices to three-dimensional images in image-guided procedures
US12121204B2 (en) | Systems and methods of registration for image guided surgery
US11864856B2 (en) | Systems and methods of continuous registration for image-guided surgery
US11080902B2 (en) | Systems and methods for generating anatomical tree structures
US20210259783A1 (en) | Systems and Methods Related to Registration for Image Guided Surgery
EP3791362B1 (en) | Systems and methods related to registration for image guided surgery
WO2025042718A1 (en) | Atlas-based planning and navigation for medical procedures
US12373942B2 (en) | Systems and methods for progressive registration
WO2025029781A1 (en) | Systems and methods for segmenting image data
WO2024163533A1 (en) | Elongate device extraction from intraoperative images
WO2025054381A1 (en) | Style transfer for intraoperative imaging
WO2025171214A2 (en) | Directional filter and/or translational constraints for coordinate registration

Legal Events

Code 121 (EP): The EPO has been informed by WIPO that EP was designated in this application.
Ref document number: 24772075
Country of ref document: EP
Kind code of ref document: A1


[8]ページ先頭

©2009-2025 Movatter.jp