WO2008035271A2 - Device for registering a 3D model

Device for registering a 3d model

Info

Publication number
WO2008035271A2
Authority
WO
WIPO (PCT)
Prior art keywords
model
spatial coordinates
coordinates
imaging device
examination apparatus
Prior art date
Application number
PCT/IB2007/053740
Other languages
French (fr)
Other versions
WO2008035271A3 (en)
Inventor
Heinrich Schulz
Jochen Kruecker
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V., Philips Intellectual Property & Standards GmbH
Publication of WO2008035271A2
Publication of WO2008035271A3

Abstract

The invention relates to an examination apparatus comprising an imaging device, for example an ultrasonic scanner (3), for generating sectional images (S) of a body volume, and a localization device (4) for determining the spatial coordinates (r_I) of the imaging device (3) with respect to a reference frame (x, y, z). Optionally, the spatial coordinates (r_M, r_C) of artificial markers (M) attached to the examined object (2) and/or of an instrument such as a catheter (6) can also be determined. Object features (A) ("anatomical markers") can be identified in the sectional images (S), and a registration between the reference frame (x, y, z) and a previously acquired 3D model (V) of the examined body region can be determined by a data processing device (10) based on the acquired data.

Description

Device for registering a 3D model
The invention relates to an examination apparatus, a method, and a record carrier that allow live sectional images of an object to be registered with a previously acquired 3D model.
Surgery and minimally invasive therapy require reliable, precise navigation along predefined paths to predetermined target points. Many kinds of interventions are therefore guided based on pre-operative high-resolution 3D images taken of the region of interest. US 2005/0027193 A1 describes in this respect a method that allows the automatic merging of two-dimensional ("2D") X-ray projections generated with a C-arm X-ray device with pre-operatively generated three-dimensional ("3D") images. By fixing a tool plate to the C-arm and by localizing said plate, projections can be matched with the 3D images independently of the actual C-arm position and without repeated use of markers attached to the patient.
Against this background, it was an object of the present invention to provide means that allow navigation during an examination procedure with higher accuracy using a pre-operatively generated 3D model.
This objective is achieved by an examination apparatus according to claim 1, by a method according to claim 10, and by a record carrier according to claim 11. Preferred embodiments are disclosed in the dependent claims.
The examination apparatus according to the present invention is intended for examining an object, particularly for (minimally invasive) diagnostic or therapeutic interventions on patients. The apparatus comprises the following components:
a) An imaging device for generating a sectional image of the object. The image will typically be two-dimensional, but it may also be three-dimensional. By definition, each pixel/voxel of a "sectional image" uniquely corresponds to one point of the object (in contrast to projections, where each pixel corresponds to a line through the object). The imaging device may typically be a mobile, e.g. hand-held, device that allows a flexible selection of the plane of the generated sectional image with respect to the object.
b) A localization device for determining spatial coordinates of the imaging device with respect to a given reference frame. The "spatial coordinates" may in general comprise the spatial position and/or orientation of selected points. In the case of the imaging device, the spatial coordinates will typically comprise the spatial positions of three selected points on the imaging device. If not otherwise stated, the term "spatial coordinates" will in the following always refer to the given reference frame.
c) A data processing device, e.g. a microcomputer or workstation, for registering the reference frame with a given 3D model of the object based on the spatial coordinates and model coordinates of at least one object feature ("anatomical marker") that appears in the sectional image and in the 3D model. In this respect, a "3D model" is understood, as usual, as a three-dimensional set of data, more precisely as a mapping of three-dimensional model coordinates x', y', z' to associated image values, for example grey values, color values, or membership values (e.g. describing whether the point with the model coordinates belongs to bone, tissue or the like).
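To make the registration in component c) concrete, the following minimal sketch shows one standard way such a registration could be computed from paired landmark coordinates. It is a Python/NumPy illustration using the Kabsch (orthogonal Procrustes) method; the function name and the example numbers are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

def rigid_registration(spatial_pts, model_pts):
    """Estimate a rotation R and translation t mapping spatial coordinates r
    (reference frame x, y, z) onto model coordinates r' (frame x', y', z'),
    i.e. r' ~ R @ r + t, from at least three paired, non-collinear landmarks
    (Kabsch / orthogonal Procrustes)."""
    p = np.asarray(spatial_pts, dtype=float)   # (N, 3) points in the reference frame
    q = np.asarray(model_pts, dtype=float)     # (N, 3) corresponding model points
    p_mean, q_mean = p.mean(axis=0), q.mean(axis=0)

    # Cross-covariance of the centred point sets
    H = (p - p_mean).T @ (q - q_mean)
    U, _, Vt = np.linalg.svd(H)

    # Enforce a proper rotation (det(R) = +1, no reflection)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Toy check: three landmarks related by a pure translation of (1, -2, 0.5)
r_spatial = np.array([[10.0, 5.0, 3.0], [12.0, 7.5, 2.0], [8.0, 9.0, 4.5]])
r_model = r_spatial + np.array([1.0, -2.0, 0.5])
R, t = rigid_registration(r_spatial, r_model)
print(np.round(R, 3), np.round(t, 3))   # R ~ identity, t ~ [1, -2, 0.5]
```

With at least three non-collinear correspondences, the same routine can serve for anatomical markers, artificial markers, or a mixture of both.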
The described examination apparatus has the advantage that it allows an improved matching of measured spatial coordinates with a previously acquired 3D model as it exploits anatomical markers, which appear in the sectional image and which can be chosen from the immediate surroundings of the region of interest. It should be noted in this respect that the spatial coordinates of an object feature appearing in a sectional image can uniquely be determined (which is not the case with projection images). Therefore, each object feature provides a maximal amount of information for the desired registration between reference frame and 3D model.
According to a preferred embodiment of the invention, the examination apparatus further comprises at least one marker that is attached to the object, wherein said marker is called an "artificial marker" in the following to distinguish it from the aforementioned object features, which can be considered "natural" or "anatomical" markers. The artificial marker may for example be a piece of metal (e.g. in the shape of a circle or a cross) which can be attached to the skin of a patient and shows up in X-ray images with high contrast. Typically, a set of several artificial markers is used. The spatial coordinates of the artificial marker(s) with respect to a given reference frame can be determined with the localization device. Moreover, the registering of the reference frame with the given 3D model of the object in the data processing device can additionally be based on the spatial coordinates and model coordinates of the at least one artificial marker. This improves the accuracy of the registering procedure and provides initial values for a coarse registration as long as object features have not yet been registered.

In another preferred embodiment of the apparatus, the data processing device comprises a "landmark determination module" for determining the spatial coordinates of the at least one object feature based on the sectional image and on the spatial coordinates of the imaging device. Monitoring the spatial coordinates of the imaging device allows said imaging device to be moved during an intervention while keeping full control over the spatial position of the generated sectional images. This fact is exploited by the landmark determination module for calculating the spatial coordinates of an identified object feature (a sketch of such a computation follows this passage).

The object features that are used for the registration process may in principle be determined automatically by the data processing device. In a preferred embodiment of the invention, the data processing device is however connected to an input unit, e.g. a keyboard, a mouse or another pointing device, for interactively determining the object feature(s) in the sectional image and/or in the 3D model. Thus the expert knowledge of a human operator can be exploited to identify and correctly locate characteristic object features (e.g. vessel bifurcations or bone structures) in the available images.

The examination apparatus may further comprise an instrument that can be navigated within the object, wherein the spatial coordinates of said instrument can be determined by the localization device. The navigation of an instrument like a needle or a catheter in the body of a patient is a typical task in minimally invasive surgery. As the spatial coordinates of this instrument can be determined, the position of the instrument can be identified and visualized in the 3D model, too, due to the available registration between the reference frame and the 3D model. It is therefore possible to track the movement of the instrument with high accuracy in the 3D model.
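As a rough illustration of what such a landmark determination module might compute, the sketch below maps a feature picked in the 2D sectional image into the reference frame using the tracked pose of the imaging device. The calibration transform, the pixel spacing, and all names are illustrative assumptions and not taken from the patent.

```python
import numpy as np

def image_point_to_world(pixel_uv, pixel_spacing_mm, T_probe_to_world, T_image_to_probe):
    """Map a feature selected in the sectional image to spatial coordinates.

    pixel_uv:          (u, v) pixel indices of the object feature A in the image S
    pixel_spacing_mm:  (du, dv) physical size of one pixel in mm
    T_probe_to_world:  4x4 pose of the tracked imaging device (from the localization device)
    T_image_to_probe:  4x4 calibration transform from image plane to probe marker,
                       obtained once in a prior calibration step
    Returns the 3D coordinates r_A in the reference frame x, y, z.
    """
    u, v = pixel_uv
    du, dv = pixel_spacing_mm
    # Point in the image plane, expressed in mm, homogeneous coordinates
    p_image = np.array([u * du, v * dv, 0.0, 1.0])
    p_world = T_probe_to_world @ T_image_to_probe @ p_image
    return p_world[:3]

# Illustrative use: identity calibration, probe translated 50 mm along x
T_cal = np.eye(4)
T_probe = np.eye(4); T_probe[0, 3] = 50.0
r_A = image_point_to_world((120, 80), (0.2, 0.2), T_probe, T_cal)
print(r_A)   # -> [74., 16., 0.]
```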
The localization device can in principle be realized by any suitable system that allows the required spatial coordinates to be measured with sufficient accuracy. Suitable localization devices may operate, for example, based on magnetic, electromagnetic, optical or acoustical measurements. They often use "active" markers which not only passively appear in images (e.g. as a contrast in an X-ray projection) but actively generate data or signals that allow their spatial position and/or orientation to be determined. The active markers may particularly measure and/or emit signals themselves. One example of an active marker is a magnetic field sensor that can measure the magnitude and orientation of an external (spatially or temporally inhomogeneous) magnetic field, wherein said measurements allow the spatial position of the marker with respect to the generator of the magnetic field to be inferred. In another embodiment, an active marker may be a source of electromagnetic and/or acoustical radiation, e.g. of near-infrared (NIR) light or ultrasound, wherein the position of this source can be determined by stereoscopic methods from the intersection of at least two independent lines of sight.
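For the stereoscopic case mentioned last, the marker position can be estimated as the point closest, in a least-squares sense, to two or more (generally skew) lines of sight. The sketch below assumes the sensor origins and viewing directions are already known; it is an illustration of the general principle, not a description of any specific localization product.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares intersection of lines of sight.

    origins:    (N, 3) sensor/receiver positions
    directions: (N, 3) direction vectors towards the active marker
    Returns the 3D point minimising the summed squared distance to all lines.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane orthogonal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two sensors looking at a marker located at roughly (1, 2, 3)
origins = [[0, 0, 0], [10, 0, 0]]
directions = [[1, 2, 3], [-9, 2, 3]]
print(triangulate(origins, directions))   # -> approximately [1. 2. 3.]
```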
The imaging device may in principle be any device that allows sectional images of the object under examination to be generated in real time. Preferably, the imaging device comprises a 2D or 3D ultrasonic scanner, which is a compact device that may readily be used by a physician during an intervention.
The 3D model which is registered with the reference frame may originate from any suitable source, for example from theoretical constructions or from statistical data pools. Preferably, however, the 3D model is acquired beforehand from the particular object under examination by a further imaging device, e.g. a Computed Tomography (CT) scanner or a Magnetic Resonance Imaging (MRI) scanner. If an artificial marker is used, it should already be present at its final location on the object during the acquisition of the 3D model. Generating 3D images of a body volume with a CT or MRI scanner is a typical diagnostic step prior to a surgical intervention, and therefore the associated 3D models are usually available without additional effort.
The examination apparatus may further optionally comprise a display device, e.g. a monitor, for displaying the sectional image generated by the imaging device, the 3D model and/or images derived therefrom (e.g. sections calculated from the 3D model or overlays of the sectional image and the 3D model).

The invention further relates to a method for registering a reference frame with a 3D model of an object, the method comprising the following steps (an illustrative software sketch of these steps is given below):
a) Generating a sectional image of the object with an imaging device, for example an ultrasonic scanner.
b) Measuring the spatial coordinates, with respect to the reference frame, of the imaging device and optionally also of at least one artificial marker that is attached to the object.
c) Determining the spatial coordinates of at least one object feature in the sectional image.
d) Identifying the model coordinates (i.e. coordinates with respect to a model-based reference frame) of the object feature and, if applicable, of the artificial marker in the 3D model.
e) Calculating the desired registration between the reference frame and the 3D model based on the spatial and model coordinates of the at least one object feature and optionally of the artificial marker.

The method comprises in general form the steps that can be executed with an examination apparatus of the kind described above. Therefore, reference is made to the preceding description for more information on the details, advantages and improvements of that method.

Finally, the invention comprises a record carrier, for example a floppy disk, a hard disk, or a compact disc (CD), on which a computer program for registering a reference frame with a 3D model of an object is stored, wherein said program is adapted to execute a method of the aforementioned kind.
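As referenced in the list above, the following sketch shows how steps a) to e) could fit together in software. It reuses the rigid_registration and image_point_to_world helpers sketched earlier in this text; every callable passed in (image acquisition, probe tracking, interactive feature picking) is a hypothetical placeholder, not an interface defined by the patent.

```python
import numpy as np

def register_reference_frame(acquire_image, track_probe, pick_feature_in_image,
                             pick_feature_in_model, n_features=4,
                             pixel_spacing_mm=(0.2, 0.2), T_image_to_probe=np.eye(4)):
    """Illustrative orchestration of steps a) to e).

    All callables are stand-ins for the imaging device, the localization device
    and the interactive feature selection; image_point_to_world() and
    rigid_registration() are the helpers sketched earlier.
    """
    spatial_pts, model_pts = [], []
    for _ in range(n_features):
        image = acquire_image()                    # a) generate sectional image S
        T_probe_to_world = track_probe()           # b) probe pose from the localization device
        uv = pick_feature_in_image(image)          # c) pick object feature A in the image
        r_a = image_point_to_world(uv, pixel_spacing_mm,
                                   T_probe_to_world, T_image_to_probe)
        r_a_model = pick_feature_in_model()        # d) same feature in the 3D model V
        spatial_pts.append(r_a)
        model_pts.append(r_a_model)
    # e) registration between reference frame and 3D model
    return rigid_registration(np.array(spatial_pts), np.array(model_pts))
```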
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter. These embodiments will be described by way of example with the help of the accompanying single drawing which shows schematically an examination apparatus according to the present invention.
The Figure shows in particular a patient 2 lying on a table 1 during a minimally invasive intervention comprising the navigation of a catheter 6 through the vessel system of the patient 2. A movable ultrasonic scanner 3 is used to generate a two-dimensional sectional image S of the body, wherein the plane of said sectional image is indicated by a dotted line and wherein the data of the generated sectional image S are communicated to a data processing device 10 (e.g. a workstation).
The examination apparatus further comprises a localization device 4 which allows the spatial coordinates - with respect to a given reference frame x, y, z - of various components to be determined:
The spatial coordinates r_I (typically including both position and orientation) of at least one marker I on the imaging device 3, allowing the position and orientation of the imaging device, and thus of the generated sectional images, to be determined, provided that the imaging device 3 has been calibrated beforehand. If an ultrasonic scanner is used as imaging device 3, it is usually the head of this scanner that is tracked.
The spatial coordinates r_C of a marker C fixed at the tip of the catheter 6, allowing the position and orientation of said tip to be determined.
The spatial coordinates of the tip of a pointer 5, wherein said pointer may particularly be moved to artificial (e.g. X-ray opaque) markers M attached to the skin of the patient 2 for determining their spatial coordinates r_M.
The spatial coordinates r_I, r_M, r_C measured by the localization device 4 are also transferred to the data processing device 10.
The data processing device 10 further comprises a storage 13 in which a 3D model V of the examined body region is stored. Said 3D model V may for example have been generated pre-operatively with a CT scanner 30 or an MRI device.
The scenario described up to now and similar situations in surgery and minimally invasive therapy require a reliable, precise navigation of the catheter or other instruments along predefined paths to predetermined target points. These interventions may therefore be guided based on the pre-operative high-resolution 3D models. The 3D models are, however, not acquired in real time (at least when high resolution is required), and their use for imaging during an intervention is thus very limited. Proper spatial registration of pre-operative images in combination with a navigation system, i.e. the localization device 4, would make it possible to use the images much more effectively during an intervention.
A spatial registration that is solely based on the spatial coordinates of the artificial markers M (determined via the pointer 5 and the localization device 4) and the image of these markers in the 3D model typically suffers from inaccuracy caused by various kinds of motion that change the spatial relationship of the markers M to each other.
Furthermore, the localization device 4 itself has limited accuracy, and this accuracy may vary strongly within the region of interest. For these reasons the markers M should be positioned so that they surround the region of interest, thereby averaging out errors at the marker positions. Since markers are typically placed on the patient's skin only, such an ideal situation can, however, hardly be achieved.
In order to improve the accuracy of the registration between the reference frame x, y, z and the 3D model V (i.e. the model-based reference frame x', y', z'), it is proposed here to make use both of artificial markers M attached to the patient and of natural or anatomical markers A that are present in the examined object and that can be identified both on the sectional images S and in the 3D model V. The Figure shows in this respect the bifurcation A of a vessel as an exemplary anatomical marker. The data processing device 10 is preferably coupled to input devices like a keyboard 21 and a mouse 22 via which a physician can select object features as anatomical markers in the sectional image S and/or the 3D model V. A landmark determination module 11 (which may be realized by dedicated hardware, by software, or by a mixture of both) can then calculate the spatial coordinates r_A of the anatomical marker(s) A with respect to the reference frame x, y, z by taking the actual coordinates r_I of the imaging device 3 into consideration.
As described above, the spatial coordinates r_M of the artificial markers M can be measured directly by the localization device 4 (using the pointer 5). A registration module 12 can therefore determine the required registration between the reference frame x, y, z and the 3D model V, i.e. the model-based coordinate frame x', y', z', or, in other words, the mapping between spatial coordinates r and model coordinates r'. Once this registration is known, the spatial coordinates r_C of the tip of the catheter, which are directly measured by the localization device 4, may be visualized in a representation of the 3D model V on a monitor 23.
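Once the registration module has determined a rotation R and translation t of the kind sketched earlier, mapping the tracked catheter-tip coordinates into model coordinates for display is a single affine transform. The helper below is a brief illustration; the function name and the numbers are assumed, not taken from the patent.

```python
import numpy as np

def to_model_coordinates(r_spatial, R, t):
    """Map spatial coordinates r (reference frame x, y, z), as measured by the
    localization device, to model coordinates r' = R @ r + t of the 3D model V."""
    return R @ np.asarray(r_spatial, dtype=float) + t

# Example: show the tracked catheter tip r_C in the model frame on the monitor
R, t = np.eye(3), np.array([1.0, -2.0, 0.5])   # assumed output of the registration module
r_C = np.array([15.0, 42.0, 7.0])              # assumed tip coordinates from the tracker
print(to_model_coordinates(r_C, R, t))         # -> [16., 40., 7.5]
```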
In summary, it is proposed to identify anatomical landmarks A in the sectional live images S, identify the corresponding location in the pre-operative 3D model V, and use these points as additional markers for the registration. By doing so a marker set adapted to the region of interest can be obtained in successive steps leading to improved registration accuracy.
Using a spatially tracked live (real-time) imaging modality in addition to the pre-operative 3D image makes it possible to establish, improve, and/or verify the registration between physical space/localization system space and the pre-operative image. The live image is used to identify the current physical location of common anatomical landmarks visible in both (live and pre-operative) images. These landmarks can either be used by themselves to establish a registration, or can be combined with the positions of external fiducial markers M to improve the spatial registration accuracy of fiducial-marker-based registration procedures.

The workstation 10 should offer the possibility to navigate on the captured images S and V, to identify points in both images, and to compute registration transformations mapping one set of coordinates onto another. The workstation 10 will then allow the computation of an initial registration based on the coordinates r'_M of artificial markers M identified in the pre-operative 3D image V and the corresponding coordinates r_M identified with the pointer device 5. The workstation 10 will further allow improvement or substitution of the initial registration by identifying landmarks A in the live image S, converting the live-image coordinates of the identified landmarks to tracking system coordinates r_A and mapping those to corresponding coordinates r'_A in the pre-operative 3D image V, in combination with or replacing the coordinates identified for the initial registration. For verification of the registration, the workstation 10 should offer a side-by-side or overlay display of the live image S and the corresponding section of the pre-operative image V.
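The combination of external fiducials M and anatomical landmarks A described above can be expressed as a single point-based registration over the pooled correspondences, optionally weighted. The following sketch extends the Kabsch computation from earlier to weighted points; the weights and the function name are illustrative assumptions, not part of the patent.

```python
import numpy as np

def combined_registration(fid_spatial, fid_model, lm_spatial, lm_model,
                          w_fiducial=1.0, w_landmark=2.0):
    """Pool fiducial-marker (M) and anatomical-landmark (A) correspondences
    into one weighted rigid registration (weighted Kabsch).

    The weights are illustrative only: landmarks close to the region of
    interest might be trusted more than fiducials on the skin.
    """
    p = np.vstack([np.asarray(fid_spatial, float), np.asarray(lm_spatial, float)])
    q = np.vstack([np.asarray(fid_model, float), np.asarray(lm_model, float)])
    w = np.concatenate([np.full(len(fid_spatial), w_fiducial),
                        np.full(len(lm_spatial), w_landmark)])

    # Weighted centroids and weighted cross-covariance
    p_mean = np.average(p, axis=0, weights=w)
    q_mean = np.average(q, axis=0, weights=w)
    H = (w[:, None] * (p - p_mean)).T @ (q - q_mean)

    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Re-running such a routine each time a new landmark A has been identified corresponds to the successive refinement of the marker set described in the summary above.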
The invention can be applied in a variety of clinical procedures comprising minimally invasive surgery, interventional radiology, and catheter examinations. In particular, needle-based procedures such as biopsies and ablations, laparoscopic procedures, EP procedures and stenting can be improved using the proposed registration method.
Finally it is pointed out that in the present application the term "comprising" does not exclude other elements or steps, that "a" or "an" does not exclude a plurality, and that a single processor or other unit may fulfill the functions of several means. The invention resides in each and every novel characteristic feature and each and every combination of characteristic features. Moreover, reference signs in the claims shall not be construed as limiting their scope.

Claims

CLAIMS:
1. Examination apparatus for examining an object (2), comprising
a) an imaging device (3) for generating a sectional image (S) of the object (2);
b) a localization device (4) for determining the spatial coordinates (r_I) of the imaging device (3) with respect to a reference frame (x, y, z);
c) a data processing device (10) for registering the reference frame (x, y, z) with a 3D model (V) of the object (2) based on the spatial coordinates (r_A) and model coordinates (r'_A) of at least one object feature (A) that appears in the sectional image (S) and in the 3D model (V).
2. Examination apparatus according to claim 1, characterized in that it comprises at least one artificial marker (M) that can be attached to the object (2), and that the registering of the data processing device (10) is further based on the spatial coordinates (r_M) and model coordinates (r'_M) of said artificial marker (M).
3. The examination apparatus according to claim 1, characterized in that the data processing device (10) comprises a landmark determination module (11) for determining the spatial coordinates (r_A) of the at least one object feature (A) based on the sectional image (S) and on the spatial coordinates (r_I) of the imaging device (3).
4. The examination apparatus according to claim 1, characterized in that it comprises an input unit (21, 22) that is coupled to the data processing device (10) for interactively determining the at least one object feature (A) in the sectional image (S) and/or in the 3D model (V).
5. The examination apparatus according to claim 1, characterized in that it comprises an instrument (6) that can be moved within the object (2), wherein the spatial coordinates (r_C) of the instrument can be determined by the localization device (4).
6. The examination apparatus according to claim 1, characterized in that the localization device (4) operates based on magnetic, electromagnetic, optical or acoustical measurements.
7. The examination apparatus according to claim 1, characterized in that the imaging device comprises an ultrasonic scanner (3).
8. The examination apparatus according to claim 1, characterized in that the 3D model (V) is acquired by a further imaging device, preferably a CT scanner (30) or an MRI scanner.
9. The examination apparatus according to claim 1, characterized in that it comprises a display device (23) for displaying the sectional image (S), the 3D model (V) and/or images derived therefrom.
10. A method for registering a reference frame (x, y, z) with a 3D model (V) of an object (2), comprising
a) generating a sectional image (S) of the object (2) with an imaging device (3);
b) measuring the spatial coordinates (r_I) of the imaging device (3);
c) determining the spatial coordinates (r_A) of at least one object feature (A) from the sectional image (S);
d) identifying the model coordinates (r'_A) of the object feature (A) in the 3D model (V);
e) calculating the registration between the reference frame (x, y, z) and the 3D model (V) based on the spatial coordinates (r_A) and the model coordinates (r'_A) of the object feature (A).
11. A record carrier on which a computer program for registering a reference frame (x, y, z) with a 3D model (V) of an object (2) is stored, said program being adapted to execute a method according to claim 10.
PCT/IB2007/053740 (priority date 2006-09-20, filing date 2007-09-17): Device for registering a 3d model, WO2008035271A2 (en)

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
EP06120972 | 2006-09-20
EP06120972.2 | 2006-09-20

Publications (2)

Publication Number | Publication Date
WO2008035271A2 (en) | 2008-03-27
WO2008035271A3 (en) | 2008-11-06

Family

ID=39200920

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
PCT/IB2007/053740 (WO2008035271A2 (en)) | Device for registering a 3d model | 2006-09-20 | 2007-09-17

Country Status (1)

Country | Link
WO (1) | WO2008035271A2 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2009150647A3 (en)* | 2008-06-11 | 2010-03-18 | Dune Medical Devices Ltd. | Double registration
WO2013140315A1 (en)* | 2012-03-23 | 2013-09-26 | Koninklijke Philips N.V. | Calibration of tracked interventional ultrasound
US9257220B2 | 2013-03-05 | 2016-02-09 | Ezono AG | Magnetization device and method
US9459087B2 | 2013-03-05 | 2016-10-04 | Ezono AG | Magnetic position detection system
US9597008B2 | 2011-09-06 | 2017-03-21 | Ezono AG | Imaging probe and method of obtaining position and/or orientation information
JP2017217510A (en)* | 2013-03-15 | | The Cleveland Clinic Foundation | Method and system to facilitate intraoperative positioning and guidance
KR20190076806A (en)* | 2017-12-22 | 2019-07-02 | 한국기술교육대학교 산학협력단 | Method for workspace modeling based on virtual wall using 3d scanner and system thereof
US10434278B2 | 2013-03-05 | 2019-10-08 | Ezono AG | System for image guided procedure
US11132801B2 | 2018-02-02 | 2021-09-28 | Centerline Biomedical, Inc. | Segmentation of three-dimensional images containing anatomic structures
US11150776B2 | 2018-02-02 | 2021-10-19 | Centerline Biomedical, Inc. | Graphical user interface for marking anatomic structures
US11393110B2 | 2019-04-04 | 2022-07-19 | Centerline Biomedical, Inc. | Spatial registration of tracking system with an image using two-dimensional image projections
US11538574B2 | 2019-04-04 | 2022-12-27 | Centerline Biomedical, Inc. | Registration of spatial tracking system with augmented reality display

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
DE19919907C2 (en)* | 1999-04-30 | 2003-10-16 | Siemens AG | Method and device for catheter navigation in three-dimensional vascular tree images

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9008756B2 | 2008-06-11 | 2015-04-14 | Dune Medical Devices Ltd. | Mapping system and method for mapping a target containing tissue
WO2009150647A3 (en)* | 2008-06-11 | 2010-03-18 | Dune Medical Devices Ltd. | Double registration
US10758155B2 | 2011-09-06 | 2020-09-01 | Ezono AG | Imaging probe and method of obtaining position and/or orientation information
US9597008B2 | 2011-09-06 | 2017-03-21 | Ezono AG | Imaging probe and method of obtaining position and/or orientation information
US10765343B2 | 2011-09-06 | 2020-09-08 | Ezono AG | Imaging probe and method of obtaining position and/or orientation information
WO2013140315A1 (en)* | 2012-03-23 | 2013-09-26 | Koninklijke Philips N.V. | Calibration of tracked interventional ultrasound
US9459087B2 | 2013-03-05 | 2016-10-04 | Ezono AG | Magnetic position detection system
US9257220B2 | 2013-03-05 | 2016-02-09 | Ezono AG | Magnetization device and method
US10434278B2 | 2013-03-05 | 2019-10-08 | Ezono AG | System for image guided procedure
JP2022059084A (en)* | 2013-03-15 | 2022-04-12 | The Cleveland Clinic Foundation | A system that facilitates position adjustment and guidance during surgery
JP2017217510A (en)* | 2013-03-15 | | The Cleveland Clinic Foundation | Method and system to facilitate intraoperative positioning and guidance
US12048523B2 | 2013-03-15 | 2024-07-30 | The Cleveland Clinic Foundation | Method and system to facilitate intraoperative positioning and guidance
US10799145B2 | 2013-03-15 | 2020-10-13 | The Cleveland Clinic Foundation | Method and system to facilitate intraoperative positioning and guidance
EP2996607B1 (en)* | 2013-03-15 | 2021-06-16 | The Cleveland Clinic Foundation | System to facilitate intraoperative positioning and guidance
JP7290763B2 | 2013-03-15 | 2023-06-13 | The Cleveland Clinic Foundation | A system that facilitates intraoperative positioning and guidance
KR101998396B1 | 2017-12-22 | 2019-07-09 | 한국기술교육대학교 산학협력단 | Method for workspace modeling based on virtual wall using 3d scanner and system thereof
KR20190076806A (en)* | 2017-12-22 | 2019-07-02 | 한국기술교육대학교 산학협력단 | Method for workspace modeling based on virtual wall using 3d scanner and system thereof
US11150776B2 | 2018-02-02 | 2021-10-19 | Centerline Biomedical, Inc. | Graphical user interface for marking anatomic structures
US11604556B2 | 2018-02-02 | 2023-03-14 | Centerline Biomedical, Inc. | Graphical user interface for marking anatomic structures
US11132801B2 | 2018-02-02 | 2021-09-28 | Centerline Biomedical, Inc. | Segmentation of three-dimensional images containing anatomic structures
US11393110B2 | 2019-04-04 | 2022-07-19 | Centerline Biomedical, Inc. | Spatial registration of tracking system with an image using two-dimensional image projections
US11538574B2 | 2019-04-04 | 2022-12-27 | Centerline Biomedical, Inc. | Registration of spatial tracking system with augmented reality display
US12119101B2 | 2019-04-04 | 2024-10-15 | Centerline Biomedical, Inc. | Registration of spatial tracking system with augmented reality display

Also Published As

Publication number | Publication date
WO2008035271A3 (en) | 2008-11-06

Similar Documents

Publication | Title
WO2008035271A2 (en) | Device for registering a 3d model
US9320569B2 (en) | Systems and methods for implant distance measurement
US7885441B2 (en) | Systems and methods for implant virtual review
US8131031B2 (en) | Systems and methods for inferred patient annotation
US7831096B2 (en) | Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use
US7664542B2 (en) | Registering intra-operative image data sets with pre-operative 3D image data sets on the basis of optical surface extraction
US8682413B2 (en) | Systems and methods for automated tracker-driven image selection
CN100591282C (en) | System for guiding a medical device inside a patient
US6996430B1 (en) | Method and system for displaying cross-sectional images of a body
US7467007B2 (en) | Respiratory gated image fusion of computed tomography 3D images and live fluoroscopy images
US8024026B2 (en) | Dynamic reference method and system for use with surgical procedures
JP6404713B2 (en) | System and method for guided injection in endoscopic surgery
US8694075B2 (en) | Intra-operative registration for navigated surgical procedures
EP2153794B1 (en) | System for and method of visualizing an interior of a body
US10357317B2 (en) | Handheld scanner for rapid registration in a medical navigation system
US20080119725A1 (en) | Systems and Methods for Visual Verification of CT Registration and Feedback
EP3285674B1 (en) | Direct visualization of a device location
US20180303550A1 (en) | Endoscopic View of Invasive Procedures in Narrow Passages
US20050004449A1 (en) | Method for marker-less navigation in preoperative 3D images using an intraoperatively acquired 3D C-arm image
US20080154120A1 (en) | Systems and methods for intraoperative measurements on navigated placements of implants
US20120296203A1 (en) | Automatic Identification Of Tracked Surgical Devices Using An Electromagnetic Localization System
US20080119712A1 (en) | Systems and Methods for Automated Image Registration
JP2020058779A (en) | Method for supporting user, computer program product, data storage medium, and imaging system
US20080119724A1 (en) | Systems and methods for intraoperative implant placement analysis
CN100473355C (en) | System for introducing a medical instrument into a patient

Legal Events

Date | Code | Title | Description

121 | EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 07826403

Country of ref document: EP

Kind code of ref document: A2

NENP | Non-entry into the national phase

Ref country code: DE

122 | EP: PCT application non-entry in European phase

Ref document number: 07826403

Country of ref document: EP

Kind code of ref document: A2

