US20170140527A1 - Locally Applied Transparency for a CT Image - Google Patents

Locally Applied Transparency for a CT Image

Info

Publication number
US20170140527A1
Authority
US
United States
Prior art keywords
instrument
plane
bounding
representation
external surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/942,455
Other versions
US9947091B2 (en)
Inventor
Assaf Govari
Vadim Gliner
Ram B. Mayer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Biosense Webster Israel Ltd
Original Assignee
Biosense Webster Israel Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Biosense Webster (Israel) Ltd.
Priority to US14/942,455 (granted as US9947091B2)
Assigned to BIOSENSE WEBSTER (ISRAEL) LTD. Assignment of assignors interest (see document for details). Assignors: GOVARI, ASSAF; GLINER, VADIM; MAYER, RAM B.
Priority to IL248535A (published as IL248535B)
Priority to AU2016253613A (published as AU2016253613A1)
Priority to CA2947873A (published as CA2947873A1)
Priority to KR1020160149364A (published as KR20170057141A)
Priority to EP16198933.0A (published as EP3173023B1)
Priority to JP2016222216A (published as JP7051286B2)
Priority to CN201611030502.4A (published as CN107016662B)
Publication of US20170140527A1
Publication of US9947091B2
Application granted
Legal status: Active (current)
Adjusted expiration


Abstract

A method, including receiving three-dimensional tomographic data with respect to a body of a living subject, using the data to generate a representation of an external surface of the body, and displaying the representation on a screen. The method further includes inserting an invasive instrument into a region of the body and identifying a position of the instrument in the body. The method also includes rendering an area of the external surface surrounding the identified position of the instrument locally transparent in the displayed representation, so as to make visible on the screen an internal structure of the body in a vicinity of the identified position.
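As a rough illustration of the locally applied transparency described above, the following sketch composites a pre-rendered external-surface (skin) layer over an internal-structure layer, fading the skin's opacity near the projected instrument tip. This is a minimal sketch under assumed inputs, not the patent's rendering pipeline; all names (`composite_with_local_transparency`, `skin`, `internal`, `tip_xy`) and the soft-edge falloff are hypothetical.

```python
# Minimal sketch (not the patent's implementation) of local transparency.
# Assumes two pre-rendered RGBA layers from the same CT volume and viewpoint:
# `skin` (external surface) and `internal` (internal structure).
import numpy as np

def composite_with_local_transparency(skin, internal, tip_xy, radius):
    """Fade out the skin layer within `radius` pixels of the projected
    instrument tip, so the internal-structure layer shows through.

    skin, internal: (H, W, 4) float RGBA arrays with values in [0, 1].
    tip_xy:         (x, y) pixel position of the tracked instrument tip.
    radius:         radius in pixels of the locally transparent area.
    """
    h, w = skin.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - tip_xy[0], ys - tip_xy[1])

    # Skin opacity: 0 at the tip, ramping back to full just outside `radius`
    # (the soft edge avoids a hard-looking cut-out).
    alpha = skin[..., 3] * np.clip((dist - radius) / (0.2 * radius), 0.0, 1.0)

    # Standard "over" compositing of the faded skin on the internal layer.
    out = np.empty_like(skin)
    a = alpha[..., None]
    out[..., :3] = a * skin[..., :3] + (1.0 - a) * internal[..., :3]
    out[..., 3] = alpha + (1.0 - alpha) * internal[..., 3]
    return out
```

In the patent's terms, `tip_xy` would come from the tracked instrument position after registration of the tracking and imaging frames of reference (see claims 3 and 5 below).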


Claims (30)

We claim:
1. A method, comprising:
receiving three-dimensional tomographic data with respect to a body of a living subject;
using the data to generate a representation of an external surface of the body and displaying the representation on a screen;
inserting an invasive instrument into a region of the body and identifying a position of the instrument in the body; and
rendering an area of the external surface surrounding the identified position of the instrument locally transparent in the displayed representation, so as to make visible on the screen an internal structure of the body in a vicinity of the identified position.
2. The method according to claim 1, wherein the tomographic data is derived from at least one of computerized tomography using X-rays, magnetic resonance imaging, positron emission tomography, single photon emission computed tomography, and ultrasound tomography.
3. The method according to claim 1, wherein the invasive instrument comprises a sensor configured to generate a signal in response to a magnetic field traversing the sensor, and wherein identifying the position of the instrument comprises using the signal to identify the position.
4. The method according to claim 1, and comprising incorporating an icon representing the invasive instrument into the displayed representation.
5. The method according to claim 1, and comprising registering an imaging frame of reference of the representation with a tracking frame of reference used in tracking the position of the instrument.
6. The method according to claim 1, and comprising defining a bounding plane with respect to the identified position of the instrument, wherein the area of the external surface is on a first side of the bounding plane, and wherein the internal structure made visible is on a second side, opposite the first side, of the bounding plane.
7. The method according to claim 6, and comprising defining a bounding region, surrounding the identified position, within the bounding plane, so that the area of the external surface and the internal structure made visible, when projected orthogonally to the bounding plane, lie within the bounding region. (A geometric sketch of the bounding-plane and bounding-region tests follows the claims.)
8. The method according to claim 6, wherein the representation of the external surface comprises a projection of the external surface onto an image plane, and wherein the bounding plane is parallel to the image plane.
9. The method according to claim 6, wherein the representation of the external surface comprises a projection of the external surface onto an image plane, and wherein the bounding plane is not parallel to the image plane.
10. The method according to claim 6, wherein the bounding plane contains the identified position of the instrument.
11. The method according to claim 6, wherein the bounding plane does not contain the identified position of the instrument.
12. The method according to claim 1, wherein the tomographic data comprises computerized tomographic (CT) data derived from X-rays of the body of the living subject, and wherein a region of the internal structure of the body having a low attenuation of the X-rays is rendered transparent in the displayed representation.
13. The method according to claim 1, wherein the internal structure in the displayed representation comprises a non-segmented image derived from the tomographic data.
14. The method according to claim 1, wherein the region of the body comprises a nasal sinus of the living subject.
15. The method according to claim 14, wherein the invasive instrument comprises a guidewire inserted into the nasal sinus.
16. Apparatus, comprising:
an invasive instrument configured to be inserted into a region of a body of a living subject;
a screen configured to display a representation of an external surface of the body; and
a processor configured to:
receive three-dimensional tomographic data with respect to the body,
use the data to generate the representation of the external surface,
identify a position of the instrument in the body, and
render an area of the external surface surrounding the identified position of the instrument locally transparent in the displayed representation, so as to make visible on the screen an internal structure of the body in a vicinity of the identified position.
17. The apparatus according to claim 16, wherein the tomographic data is derived from at least one of computerized tomography using X-rays, magnetic resonance imaging, positron emission tomography, single photon emission computed tomography, and ultrasound tomography.
18. The apparatus according to claim 16, wherein the invasive instrument comprises a sensor configured to generate a signal in response to a magnetic field traversing the sensor, and wherein identifying the position of the instrument comprises using the signal to identify the position.
19. The apparatus according to claim 16, wherein the processor is configured to incorporate an icon representing the invasive instrument into the displayed representation.
20. The apparatus according to claim 16, wherein the processor is configured to register an imaging frame of reference of the representation with a tracking frame of reference used in tracking the position of the instrument.
21. The apparatus according to claim 16, wherein the processor is configured to define a bounding plane with respect to the identified position of the instrument, wherein the area of the external surface is on a first side of the bounding plane, and wherein the internal structure made visible is on a second side, opposite the first side, of the bounding plane.
22. The apparatus according to claim 21, wherein the processor is configured to define a bounding region, surrounding the identified position, within the bounding plane, so that the area of the external surface and the internal structure made visible, when projected orthogonally to the bounding plane, lie within the bounding region.
23. The apparatus according to claim 21, wherein the representation of the external surface comprises a projection of the external surface onto an image plane, and wherein the bounding plane is parallel to the image plane.
24. The apparatus according to claim 21, wherein the representation of the external surface comprises a projection of the external surface onto an image plane, and wherein the bounding plane is not parallel to the image plane.
25. The apparatus according to claim 21, wherein the bounding plane contains the identified position of the instrument.
26. The apparatus according to claim 21, wherein the bounding plane does not contain the identified position of the instrument.
27. The apparatus according to claim 16, wherein the tomographic data comprises computerized tomographic (CT) data derived from X-rays of the body of the living subject, and wherein a region of the internal structure of the body having a low attenuation of the X-rays is rendered transparent in the displayed representation.
28. The apparatus according to claim 16, wherein the internal structure in the displayed representation comprises a non-segmented image derived from the tomographic data.
29. The apparatus according to claim 16, wherein the region of the body comprises a nasal sinus of the living subject.
30. The apparatus according to claim 29, wherein the invasive instrument comprises a guidewire inserted into the nasal sinus.
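Claims 6-7 and 21-22 define the cut-away geometry with a bounding plane and a bounding region, and claims 12 and 27 render low-attenuation CT regions transparent. The sketch below illustrates those tests in the same hedged spirit as the sketch after the abstract: the disc-shaped bounding region, the -300 HU cutoff, and every function name are assumptions introduced here, not details from the patent.

```python
# Geometric tests suggested by claims 6-7 / 21-22, plus the low-attenuation
# transparency of claims 12 / 27. Illustrative assumptions throughout.
import numpy as np

def signed_distance_to_plane(points, plane_point, plane_normal):
    """Signed distance of (N, 3) points from the bounding plane. Positive
    values lie on the first side (external surface rendered transparent),
    negative values on the second side (internal structure made visible)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return (points - plane_point) @ n

def in_bounding_region(points, plane_point, plane_normal, center, radius):
    """True where a point's orthogonal projection onto the bounding plane
    falls inside a disc of `radius` about `center`, the projection of the
    identified instrument position (a disc is one possible region shape)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    proj = points - np.outer((points - plane_point) @ n, n)
    return np.linalg.norm(proj - center, axis=1) <= radius

def transparent_by_attenuation(hu, threshold_hu=-300.0):
    """Claims 12 / 27: mark low-attenuation voxels (e.g. air in the sinuses)
    as transparent. The -300 HU cutoff is an illustrative guess."""
    return hu < threshold_hu
```

Under these assumptions, an external-surface point is cut away only when its signed distance is positive and it lies within the bounding region; choosing the plane parallel (claims 8/23) or oblique (claims 9/24) to the image plane, and through or away from the instrument position (claims 10-11 and 25-26), changes only `plane_point` and `plane_normal`.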
US14/942,455, priority and filing date 2015-11-16: Locally applied transparency for a CT image. Status: Active, anticipated expiration 2036-06-06. Granted as US9947091B2 (en).

Priority Applications (8)

Application Number | Publication | Priority Date | Filing Date | Title
US14/942,455 | US9947091B2 (en) | 2015-11-16 | 2015-11-16 | Locally applied transparency for a CT image
IL248535A | IL248535B (en) | 2015-11-16 | 2016-10-26 | Locally applied transparency for a CT image
AU2016253613A | AU2016253613A1 (en) | 2015-11-16 | 2016-11-03 | Locally applied transparency for a CT image
CA2947873A | CA2947873A1 (en) | 2015-11-16 | 2016-11-08 | Locally applied transparency for a CT image
KR1020160149364A | KR20170057141A (en) | 2015-11-16 | 2016-11-10 | Locally applied transparency for a CT image
EP16198933.0A | EP3173023B1 (en) | 2015-11-16 | 2016-11-15 | Locally applied transparency for a CT image
JP2016222216A | JP7051286B2 (en) | 2015-11-16 | 2016-11-15 | Transparency method applied locally to CT images
CN201611030502.4A | CN107016662B (en) | 2015-11-16 | 2016-11-16 | Locally applying transparency to CT images

Applications Claiming Priority (1)

Application Number | Publication | Priority Date | Filing Date | Title
US14/942,455 | US9947091B2 (en) | 2015-11-16 | 2015-11-16 | Locally applied transparency for a CT image

Publications (2)

Publication Number | Publication Date
US20170140527A1 (en) | 2017-05-18
US9947091B2 (en) | 2018-04-17

Family

ID=57345733

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US14/942,455 (Active, expires 2036-06-06; granted as US9947091B2) | Locally applied transparency for a CT image | 2015-11-16 | 2015-11-16

Country Status (8)

Country | Document
US | US9947091B2 (en)
EP | EP3173023B1 (en)
JP | JP7051286B2 (en)
KR | KR20170057141A (en)
CN | CN107016662B (en)
AU | AU2016253613A1 (en)
CA | CA2947873A1 (en)
IL | IL248535B (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
JP6619456B2 (en)* | 2016-01-14 | 2019-12-11 | Olympus Corporation | Medical manipulator system and method for operating medical manipulator system
US11154363B1 (en)* | 2016-05-24 | 2021-10-26 | Paul A. Lovoi | Terminal guidance for improving the accuracy of the position and orientation of an object
US20190159843A1 (en)* | 2017-11-28 | 2019-05-30 | Biosense Webster (Israel) Ltd. | Low profile dual pad magnetic field location system with self tracking
JP7235519B2 (en)* | 2019-01-29 | 2023-03-08 | Ziosoft, Inc. | Medical image processing apparatus, medical image processing method, and medical image processing program


Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
JP3232612B2 (en)* | 1992-01-10 | 2001-11-26 | Hitachi, Ltd. | 3D image information presentation method
US6167296A | 1996-06-28 | 2000-12-26 | The Board of Trustees of the Leland Stanford Junior University | Method for volumetric image navigation
EP0999785A4 (en) | 1997-06-27 | 2007-04-25 | Leland Stanford Junior University | Method and apparatus for generating three-dimensional images for "navigation" purposes
US7006085B1 (en)* | 2000-10-30 | 2006-02-28 | Magic Earth, Inc. | System and method for analyzing and imaging three-dimensional volume data sets
JP4171833B2 (en)* | 2002-03-19 | 2008-10-29 | Tokyo Institute of Technology | Endoscope guidance device and method
CN100445488C (en)* | 2005-08-01 | 2008-12-24 | Qiu Zeyou | A cavity component for cast-in-place concrete molding
EP2123232A4 (en)* | 2007-01-31 | 2011-02-16 | National University Corporation Hamamatsu University School of Medicine | Device, method, and program for displaying assistance information for surgical operation
WO2008149362A2 (en)* | 2007-06-05 | 2008-12-11 | Yoav Kimchy | Apparatus and method for imaging tissue
US8926511B2 (en)* | 2008-02-29 | 2015-01-06 | Biosense Webster, Inc. | Location system with virtual touch screen
WO2011063266A2 (en)* | 2009-11-19 | 2011-05-26 | The Johns Hopkins University | Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors
ES2900584T3 (en)* | 2010-12-23 | 2022-03-17 | Bard Access Systems, Inc. | System for guiding a rigid instrument
US9301733B2 (en)* | 2012-12-31 | 2016-04-05 | General Electric Company | Systems and methods for ultrasound image rendering
CN105050525B (en)* | 2013-03-15 | 2018-07-31 | Intuitive Surgical Operations, Inc. | Shape sensor system for tracking interventional instruments and method of use

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
US5558091A (en)* | 1993-10-06 | 1996-09-24 | Biosense, Inc. | Magnetic determination of position and orientation
US7225012B1 (en)* | 2000-09-18 | 2007-05-29 | The Johns Hopkins University | Methods and systems for image-guided surgical interventions
US20070197896A1 (en)* | 2005-12-09 | 2007-08-23 | Hansen Medical, Inc. | Robotic catheter system and methods
US8208991B2 (en)* | 2008-04-18 | 2012-06-26 | Medtronic, Inc. | Determining a material flow characteristic in a structure
US9398936B2 (en)* | 2009-02-17 | 2016-07-26 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery

Cited By (9)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
EP3395282B1 (en)* | 2017-04-25 | 2023-08-02 | Biosense Webster (Israel) Ltd. | Endoscopic view of invasive procedures in narrow passages
US10959677B2 (en)* | 2017-04-26 | 2021-03-30 | Acclarent, Inc. | Apparatus to secure field generating device to chair
US11183295B2 (en)* | 2017-08-31 | 2021-11-23 | Gmeditec Co., Ltd. | Medical image processing apparatus and medical image processing method which are for medical navigation device
US20220051786A1 (en)* | 2017-08-31 | 2022-02-17 | Gmeditec Co., Ltd. | Medical image processing apparatus and medical image processing method which are for medical navigation device
US11676706B2 (en)* | 2017-08-31 | 2023-06-13 | Gmeditec Co., Ltd. | Medical image processing apparatus and medical image processing method which are for medical navigation device
EP3666217A1 (en)* | 2018-12-13 | 2020-06-17 | Biosense Webster (Israel) Ltd. | Composite visualization of body part
JP2020093102A (en) | 2018-12-13 | 2020-06-18 | Biosense Webster (Israel) Ltd. | Composite visualization of body part
US11350847B2 | 2018-12-13 | 2022-06-07 | Biosense Webster (Israel) Ltd. | Composite visualization of body part
JP7423292B2 | 2018-12-13 | 2024-01-29 | Biosense Webster (Israel) Ltd. | Composite visualization of body parts

Also Published As

Publication Number | Publication Date
KR20170057141A (en) | 2017-05-24
CA2947873A1 (en) | 2017-05-16
AU2016253613A1 (en) | 2017-06-01
JP7051286B2 (en) | 2022-04-11
EP3173023A1 (en) | 2017-05-31
US9947091B2 (en) | 2018-04-17
CN107016662B (en) | 2022-05-24
JP2017086917A (en) | 2017-05-25
EP3173023B1 (en) | 2023-06-07
IL248535B (en) | 2019-02-28
IL248535A0 (en) | 2017-01-31
CN107016662A (en) | 2017-08-04
EP3173023C0 (en) | 2023-06-07

Similar Documents

Publication | Title
US9947091B2 | Locally applied transparency for a CT image
US11026747B2 | Endoscopic view of invasive procedures in narrow passages
US7010080B2 | Method for marker-free automatic fusion of 2-D fluoroscopic C-arm images with preoperative 3D images using an intraoperatively obtained 3D data record
US7664542B2 | Registering intra-operative image data sets with pre-operative 3D image data sets on the basis of optical surface extraction
US9406134B2 | Image system for supporting the navigation of interventional tools
US7761135B2 | Method and device for correction motion in imaging during a medical intervention
US9058679B2 | Visualization of anatomical data
EP1719078B1 | Device and process for multimodal registration of images
US11666203B2 | Using a camera with an ENT tool
CN101138010A | Image processing system and method for aligning two-dimensional and three-dimensional volume data during interventional procedures
EP3501398A1 | ENT bone distance color coded face maps
AU2015238800A1 | Real-time simulation of fluoroscopic images
CN108113693B | Computed tomography image correction
US11107213B2 | Correcting medical scans
CN109155068B | Motion compensation in combined X-ray/camera interventions
JP6703470B2 | Data processing device and data processing method
US20230196641A1 | Method and Device for Enhancing the Display of Features of Interest in a 3D Image of an Anatomical Region of a Patient
JP7301573B2 | How to place a static virtual camera
Hawkes et al. | 3D multimodal imaging in image guided interventions
Hawkes et al. | Three-dimensional multimodal imaging in image-guided interventions
Vandermeulen et al. | Prototype medical workstation for computer-assisted stereotactic neurosurgery
CN108324306A | Seeing through mucus in an otorhinolaryngologic operation
Manning et al. | Surgical navigation

Legal Events

AS (Assignment)
Owner name: BIOSENSE WEBSTER (ISRAEL) LTD., ISRAEL
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOVARI, ASSAF;GLINER, VADIM;MAYER, RAM B.;SIGNING DATES FROM 20151118 TO 20151130;REEL/FRAME:037216/0441

STCF (Information on status: patent grant)
Free format text: PATENTED CASE

MAFP (Maintenance fee payment)
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 4

MAFP (Maintenance fee payment)
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8

