
Endoscopic observation supporting system, method, device and program

Info

Publication number
US9179822B2
Authority
US
United States
Prior art keywords
endoscope
endoscopic image
image
interest
surgical tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/582,603
Other versions
US20120327186A1 (en)
Inventor
Yoshiro Kitamura
Keigo Nakamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corp
Assigned to FUJIFILM CORPORATION. Assignment of assignors interest (see document for details). Assignors: NAKAMURA, KEIGO; KITAMURA, YOSHIRO
Publication of US20120327186A1
Application granted
Publication of US9179822B2
Legal status: Active
Adjusted expiration

Abstract

A virtual endoscopic image generating unit generates, from a 3D medical image representing an interior of a body cavity of a subject formed by a 3D medical image forming unit and inputted thereto, a virtual endoscopic image in which a position of a structure of interest identified by a position of interest identifying unit is the view point, a real-time position of at least one of an endoscope and a surgical tool detected by an endoscope position detecting unit or a surgical tool position detecting unit is contained in the field of view, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner. A display control unit causes a WS display to display the generated virtual endoscopic image.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a technology for supporting endoscopic observation during surgery or examination using an endoscope inserted in a body cavity of a subject, and in particular to a technology for supporting endoscopic observation using a virtual endoscopic image representing the interior of a body cavity of a subject.
2. Description of the Related Art
In recent years, surgery using an endoscope, such as laparoscopic surgery and thoracoscopic surgery, has been drawing attention. Endoscopic surgery is advantageous in that it does not require laparotomy, thoracotomy, or the like, and only needs two or three holes of a few centimeters in diameter for insertion of an endoscope and a surgical tool, thereby significantly reducing the burden imposed on the patient. However, conducting surgery with the very limited field of view of the endoscope is highly difficult, and doctors require a lot of skill to conduct endoscopic surgery. If a blood vessel or an organ of the patient is damaged by mistake and bleeds during endoscopic surgery, it is impossible to continue the endoscopic surgery and the doctor has to conduct conventional surgery involving laparotomy, thoracotomy, or the like.
On the other hand, a virtual endoscopy technology for generating a virtual endoscopic image, which is similar to an endoscopic image, from a 3D volume image taken with a CT device, or the like, is known. This technology is widely used in North America as a method for finding a tumor, in particular, a colorectal tumor, only by CT imaging without conducting endoscopic examination.
Further, a technology for supporting endoscopic surgery using a virtual endoscopic image has been proposed.
For example, Japanese Unexamined Patent Publication No. 2002-263053 (hereinafter, Patent Document 1) has disclosed a device that detects a position of an endoscope with a sensor, generates a virtual endoscopic image having an angle of view wider than that of the endoscope, with the detected position of the endoscope set as the view point, and displays the virtual endoscopic image and a real endoscopic image taken with the endoscope superimposed one on the other.
Further, Japanese Unexamined Patent Publication No. 2005-021353 (hereinafter, Patent Document 2) has disclosed a device that detects a real-time position of an endoscope to generate a virtual endoscopic image having the same field of view as that of the endoscope, where location of blood vessels in the field of view is visualized. The device also detects a real-time position of a surgical tool used during endoscopic surgery to generate a composite image in which an image representing the surgical tool is combined at the position of the surgical tool in the virtual endoscopic image, and displays the composite image and a real endoscopic image.
The virtual endoscopic image according to the techniques disclosed in these documents, however, has the same view point as that of the real endoscopic image, i.e., is an image viewed from the same observation direction as that of the real endoscopic image. Therefore, depending on the positional relationship between a site of interest, such as a site of surgical interest, and the endoscope or the surgical tool, the site of interest may sometimes not be shown in the virtual endoscopic image or the real endoscopic image, and the doctor cannot recognize the approach of the endoscope or the surgical tool to the site of interest in such a case.
SUMMARY OF THE INVENTION
In view of the above-described circumstances, the present invention is directed to providing a system, a method, a device and a program for allowing the user to recognize the approach of an endoscope or a surgical tool to a site of interest, such as a site of surgical interest, more reliably during observation of a body cavity of a subject using the endoscope inserted in the body cavity.
An aspect of an endoscopic observation support system of the invention is an endoscopic observation support system comprising: 3D medical image forming means for forming a 3D medical image representing an interior of a body cavity of a subject; position of interest identifying means for identifying a position of a (first) structure of interest in the body cavity in the 3D medical image; position detecting means for detecting a real-time position of at least one of an endoscope and a surgical tool inserted in the body cavity; virtual endoscopic image generating means for generating, from the 3D medical image inputted thereto, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the (first) structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the view point is the position of the (first) structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and display means for displaying the virtual endoscopic image.
An aspect of an endoscopic observation support method of the invention is an endoscopic observation support method comprising the steps of: forming a 3D medical image representing an interior of a body cavity of a subject before or during observation of the interior of the body cavity with an endoscope inserted in the body cavity; identifying a position of a (first) structure of interest in the body cavity in the 3D medical image; detecting a real-time position of at least one of the endoscope and a surgical tool inserted in the body cavity; generating, from the 3D medical image inputted, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the (first) structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the view point is the position of the (first) structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and displaying the virtual endoscopic image.
An aspect of an endoscopic observation support device of the invention is an endoscopic observation support device comprising: 3D medical image obtaining means for obtaining a 3D medical image representing an interior of a body cavity of a subject; position of interest identifying means for identifying a position of a (first) structure of interest in the body cavity in the 3D medical image; position obtaining means for obtaining a real-time position of at least one of an endoscope and a surgical tool inserted in the body cavity detected by position detecting means; virtual endoscopic image generating means for generating, from the 3D medical image inputted thereto, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the (first) structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the viewpoint is the position of the (first) structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and display control means for causing display means to display the virtual endoscopic image.
An aspect of an endoscopic observation support program of the invention is an endoscopic observation support program for causing a computer to carry out the steps of: obtaining a 3D medical image representing an interior of a body cavity of a subject; identifying a position of a (first) structure of interest in the body cavity in the 3D medical image; obtaining a real-time position of at least one of an endoscope and a surgical tool inserted in the body cavity detected by position detecting means; generating, from the 3D medical image inputted, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the (first) structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the view point is the position of the (first) structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and causing display means to display the real endoscopic image and the virtual endoscopic image.
Now, details of the invention are described.
In the invention, a real endoscopic image representing the interior of the body cavity may be formed by real-time imaging with the endoscope, and the real endoscopic image which is formed almost at the same time when the position of at least one of the endoscope and the surgical tool used to generate the virtual endoscopic image is detected may further be displayed. In this manner, the real endoscopic image formed in real time by imaging with the endoscope and the virtual endoscopic image, which contains, in the field of view thereof, the real-time position of at least one of the endoscope and the surgical tool detected by the position detecting means almost at the same time when the real endoscopic image is formed, are displayed.
In a case where generation of the virtual endoscopic image is repeated in response to detection of the position of at least one of the endoscope and the surgical tool, real-time update of both the real endoscopic image and the virtual endoscopic image is achieved.
The real endoscopic image and the virtual endoscopic image may be displayed on a single display device or may be displayed separately on a plurality of display devices. The plurality of display devices may be located side by side at the physically same place so that both the images can be observed simultaneously, or may be located at places physically apart from each other so that the images are observed separately.
In the invention, in a case where the 3D medical image is obtained during observation using the endoscope, the 3D medical image may be obtained real-time. In this case, the position of at least one of the endoscope and the surgical tool may be detected by performing image recognition processing on the obtained 3D medical image.
Specific examples of the “(first) structure of interest” may include a site of surgical interest during endoscopic surgery and an anatomical structure that requires attention during endoscopic surgery, such as a blood vessel, an organ or a tumor. A specific method for identifying the position of the (first) structure of interest may be an automatic method using a known image recognition technique, a method involving manual operation by the user, or a method combining both the automatic and manual methods.
In the invention, a plurality of virtual endoscopic images may be generated by setting a plurality of positions of the (first) structure of interest as the view points.
The description “detecting . . . a position of at least one of an endoscope and a surgical tool” may refer to either of detecting the position of the endoscope when only the endoscope is inserted in the body cavity, or detecting the position of the endoscope, the position of the surgical tool, or both the positions of the endoscope and the surgical tool when the endoscope and the surgical tool are inserted in the body cavity.
The view point of the “virtual endoscopic image” is the position of the (first) structure of interest. However, the position of the view point is not strictly limited to a position on the surface of the (first) structure of interest or a position within the structure, and may be any position where an effect substantially equivalent to the effect of the invention is obtained, such as a position apart from the (first) structure of interest by a few pixels.
The “virtual endoscopic image” contains the position of at least one of the endoscope and the surgical tool within the field of view thereof. This means that image information along a line of sight from the view point (the position of the (first) structure of interest) toward the position of at least one of the endoscope and the surgical tool is reflected in the virtual endoscopic image. If, for example, a structure, such as an organ, a blood vessel or a fold, is present between the (first) structure of interest and the endoscope or the surgical tool, the endoscope or the surgical tool may not necessarily be shown in the virtual endoscopic image.
Further, in the “virtual endoscopic image”, the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner. In a case where the 3D medical image is obtained before the endoscopic observation, the endoscope or the surgical tool has not yet been inserted in the body cavity of the subject when the 3D medical image is taken and obtained. Therefore, when the virtual endoscopic image is generated, a marker, or the like, representing the endoscope or the surgical tool may be combined at a position in the virtual endoscopic image corresponding to the position detected by the position detecting means. On the other hand, in a case where the 3D medical image is obtained real-time during the endoscopic observation and the endoscope or the surgical tool is shown in the image, the virtual endoscopic image may be generated such that the endoscope or the surgical tool is also shown in the virtual endoscopic image.
When the “virtual endoscopic image” is generated, a distance from the structure of interest to a surface of a structure in the body cavity may be used as a determinant of pixel values of the virtual endoscopic image. Alternatively, a color template may be used, which is defined to provide the virtual endoscopic image showing sites in the body cavity in almost the same appearance as those shown in the real endoscopic image. It should be noted that the color template may include, for example, one that is defined such that each site in the body cavity has almost the same color as that shown in the real endoscopic image, and each site in the body cavity may be shown semitransparent, as necessary, so that a structure behind an obstacle, which cannot be observed in the real endoscopic image, is visually recognizable in the virtual endoscopic image.
In the invention, a second structure of interest in the body cavity in the 3D medical image may be detected, and the virtual endoscopic image showing the detected second structure of interest in a visually recognizable manner may be generated. Specific examples of the “second structure of interest” may include those mentioned above with respect to the first structure of interest. Therefore, for example, the first structure may be a site of surgical interest during endoscopic surgery and the second structure of interest may be an anatomical structure that requires attention during the surgery, or vice versa.
In the invention, a warning may be shown when an approach of at least one of the endoscope and the surgical tool to the structure of interest satisfies a predetermined criterion. The warning may be visually shown in the virtual endoscopic image, or may be shown in a manner perceived by any other sense organ.
According to the invention, a 3D medical image representing an interior of a body cavity of a subject is obtained, a position of a structure of interest in the body cavity in the 3D medical image is identified, and a real-time position of at least one of an endoscope and a surgical tool inserted in the body cavity is detected. Then, from the 3D medical image inputted, a virtual endoscopic image is generated, wherein the view point of the virtual endoscopic image is the position of the structure of interest, the position of at least one of the endoscope and the surgical tool is contained in the field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image, and the virtual endoscopic image is displayed. The displayed virtual endoscopic image looks like an image taken with a camera that monitors the approach of the endoscope and the surgical tool to the structure of interest, such as a site of surgical interest or a site that requires attention. This virtual endoscopic image unique to the present invention compensates for the narrow field of view of the real endoscopic image, thereby allowing the user to more reliably recognize the approach of the endoscope or the surgical tool to the structure of interest and helping to prevent misoperation, etc., during surgery or examination.
Further, at this time, the field of view of the virtual endoscope of the continuously displayed virtual endoscopic image is changed real-time by feedback of the detected real-time position of the endoscope or the surgical tool. This allows the user to dynamically and more appropriately recognize the approach of the endoscope or the surgical tool to the structure of interest.
Still further, in the case where the real endoscopic image representing the interior of the body cavity is formed by real-time imaging with the endoscope, and the real endoscopic image which is formed almost at the same time when the position of at least one of the endoscope and the surgical tool used to generate the virtual endoscopic image is detected is further displayed, the displayed real endoscopic image and virtual endoscopic image show the state of the interior of the body cavity at almost the same point of time, and the real endoscopic image and the virtual endoscopic image are continuously displayed in a temporally synchronized manner. Yet further, in the case where generation of the virtual endoscopic image is repeated in response to detection of the position of at least one of the endoscope and the surgical tool, real-time update of both the real endoscopic image and the virtual endoscopic image is achieved. That is, the field of view of the real endoscopic image changes along with movement or rotation of the endoscope, and the field of view of the virtual endoscopic image changes along with movement of the endoscope or the surgical tool. In this manner, the user can observe the interior of the body cavity using the real endoscopic image and the virtual endoscopic image complementarily.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a hardware configuration diagram of an endoscopic observation support system according to embodiments of the present invention,
FIG. 2 is a functional block diagram of the endoscopic observation support system according to first to third embodiments of the invention,
FIG. 3 is a flow chart illustrating the flow of an endoscopic observation support process according to the first to third embodiments of the invention,
FIG. 4 is a diagram schematically illustrating one example of a positional relationship between a real endoscope, a surgical tool and a structure of interest, and angles of view of the real endoscope and a virtual endoscope,
FIG. 5 is a diagram schematically illustrating one example of a real endoscopic image that is displayed in the first embodiment of the invention,
FIG. 6 is a diagram schematically illustrating one example of a virtual endoscopic image that is displayed in the first embodiment of the invention,
FIG. 7A is a diagram schematically illustrating one example of a positional relationship between a structure of interest and a surgical tool,
FIG. 7B is a diagram schematically illustrating one example of the virtual endoscopic image that is displayed in the second embodiment of the invention,
FIG. 8A is a diagram schematically illustrating one example of a color template for changing a display color in the virtual endoscopic image depending on a distance from a view point to the surface of an anatomical structure in the abdominal cavity according to the third embodiment of the invention,
FIG. 8B is a diagram schematically illustrating one example of the virtual endoscopic image, in which the display color is changed depending on the distance from the view point, according to the third embodiment of the invention,
FIG. 9 is a functional block diagram of the endoscopic observation support system according to a fourth embodiment of the invention,
FIG. 10 is a flow chart illustrating the flow of the endoscopic observation support process according to the fourth embodiment of the invention,
FIG. 11 is a diagram schematically illustrating one example of a warning display according to the fourth embodiment of the invention,
FIG. 12 is a functional block diagram of the endoscopic observation support system according to a fifth embodiment of the invention,
FIG. 13 is a flow chart illustrating the flow of the endoscopic observation support process according to the fifth embodiment of the invention,
FIG. 14A is a diagram schematically illustrating one example of a positional relationship between a structure of interest and a structure that requires attention,
FIG. 14B is a diagram schematically illustrating one example of the virtual endoscopic image that is displayed in the fifth embodiment of the invention,
FIG. 15 is a functional block diagram of the endoscopic observation support system according to a sixth embodiment of the invention,
FIG. 16 is a flow chart illustrating the flow of the endoscopic observation support process according to the sixth embodiment of the invention,
FIG. 17A is a diagram schematically illustrating angles of view of the real endoscopic image and the virtual endoscopic image when the images are combined,
FIG. 17B is a diagram schematically illustrating one example of a composite image generated by combining the real endoscopic image and the virtual endoscopic image,
FIG. 18A is a diagram schematically illustrating another example of the real endoscopic image,
FIG. 18B is a diagram schematically illustrating one example of the virtual endoscopic image, which shows only blood vessels, and
FIG. 18C is a diagram schematically illustrating one example of a superimposed image of the real endoscopic image and the virtual endoscopic image.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Hereinafter, an endoscopic observation support system according to embodiments of the present invention is described.
FIG. 1 is a hardware configuration diagram illustrating the outline of the endoscopic observation support system. As shown in the drawing, the system includes an endoscope 1, a digital processor 2, a light source device 3, a real endoscopic image display 4, a modality 5, a surgical tool 6, an endoscope marker 7a, a surgical tool marker 7b, a position sensor 8, an image processing workstation 9, and an image processing workstation display (which will hereinafter be referred to as “WS display”) 10.
In this embodiment, the endoscope 1 is a hard endoscope for the abdominal cavity, and is inserted into the abdominal cavity of a subject. Light from the light source device 3 is guided by an optical fiber and emitted from the tip portion of the endoscope 1, and an image of the interior of the abdominal cavity of the subject is taken by an imaging optical system of the endoscope 1. The digital processor 2 converts an image signal obtained by the endoscope 1 into a digital image signal, and performs image quality correction by digital signal processing, such as white balance control and shading correction. Then, the digital processor 2 adds accompanying information prescribed by the DICOM (Digital Imaging and Communications in Medicine) standard to the digital image signal to output real endoscopic image data (IRE). The outputted real endoscopic image data (IRE) is sent to the image processing workstation 9 via a LAN according to a communication protocol conforming to the DICOM standard. Further, the digital processor 2 converts the real endoscopic image data (IRE) into an analog signal and outputs the analog signal to the real endoscopic image display 4, so that the real endoscopic image (IRE) is displayed on the real endoscopic image display 4. The endoscope 1 obtains the image signal at a predetermined frame rate, and therefore the real endoscopic image (IRE) displayed on the real endoscopic image display 4 is a moving image showing the interior of the abdominal cavity. The endoscope 1 can also take a still image in response to an operation by the user.
The modality 5 is a device that images a site to be examined of the subject and generates image data (V) of a 3D medical image representing the site. In this embodiment, the modality 5 is a CT device. The 3D medical image data (V) also has the accompanying information prescribed by the DICOM standard added thereto. The 3D medical image data (V) is also sent to the image processing workstation 9 via the LAN according to the communication protocol conforming to the DICOM standard.
The endoscope marker 7a, the surgical tool marker 7b and the position sensor 8 form a known three-dimensional position measurement system. The endoscope marker 7a and the surgical tool marker 7b are provided in the vicinity of handles of the endoscope 1 and the surgical tool 6, respectively, and three-dimensional positions of the markers 7a, 7b are detected by the optical position sensor 8 at predetermined time intervals. Each of the endoscope marker 7a and the surgical tool marker 7b is formed by a plurality of marker chips, so that the position sensor 8 can also detect the orientation of each of the endoscope 1 and the surgical tool 6 based on a positional relationship among the marker chips, and three-dimensional positions (PSE, PST) of the tip portions of the endoscope 1 and the surgical tool 6 may be calculated by an offset calculation. The position sensor 8 sends the calculated three-dimensional position data (PSE, PST) of the endoscope 1 and the surgical tool 6 to the image processing workstation 9 via a USB interface.
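As an illustrative sketch (not part of the patent disclosure), the offset calculation amounts to a rigid transform from the detected marker pose to the instrument tip. The following Python fragment assumes the marker pose is reported as a position plus a 3x3 rotation matrix and that the handle-to-tip offset has been measured beforehand; the function and variable names are ours:

    import numpy as np

    def tip_position(marker_pos, marker_rot, tip_offset):
        # marker_pos: (3,) marker position in the sensor coordinate system
        # marker_rot: (3, 3) rotation giving the instrument's current orientation
        # tip_offset: (3,) fixed handle-to-tip vector, measured in advance
        # The tip is the marker position displaced by the offset, rotated
        # into the instrument's current orientation.
        return marker_pos + marker_rot @ tip_offset

Applied to the endoscope marker 7a this yields PSE, and applied to the surgical tool marker 7b it yields PST.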
The image processing workstation 9 is a computer having a known hardware configuration including a CPU, a main storage device, an auxiliary storage device, an input/output interface, a communication interface, a data bus, etc., to which an input device (such as a pointing device and a keyboard) and the WS display 10 are connected. The image processing workstation 9 is connected to the digital processor 2 and the modality 5 via the LAN, and to the position sensor 8 via the USB connection. The image processing workstation 9 has installed therein a known operating system, various application software programs, etc., and an application software program for executing the endoscopic observation support process of the invention. These software programs may be installed from a recording medium, such as a CD-ROM, or may be downloaded from a storage device of a server connected via a network, such as the Internet, before being installed.
FIG. 2 is a functional block diagram of the endoscopic observation support system according to a first embodiment of the invention. As shown in the drawing, the endoscopic observation support system according to the first embodiment of the invention includes the endoscope 1, a real endoscopic image forming unit 2, the real endoscopic image display 4, a 3D medical image forming unit 5, the surgical tool 6, the WS display 10, an endoscope position detecting unit 11, a surgical tool position detecting unit 12, a real endoscopic image obtaining unit 21, an endoscope position obtaining unit 22, a surgical tool position obtaining unit 23, a 3D medical image obtaining unit 24, a position of interest identifying unit 25, a virtual endoscopic image generating unit 26, and a display control unit 27. It should be noted that the same reference numeral as that assigned to the hardware device shown in FIG. 1 is used to denote a corresponding functional block shown in FIG. 2 when there is substantially one-to-one correspondence between them. That is, the function of the real endoscopic image forming unit 2 is implemented by the digital processor shown in FIG. 1, and the function of the 3D medical image forming unit 5 is implemented by the modality shown in FIG. 1. On the other hand, the function of the endoscope position detecting unit 11 is implemented by the endoscope marker 7a and the position sensor 8, and the function of the surgical tool position detecting unit 12 is implemented by the surgical tool marker 7b and the position sensor 8. The dashed line frame represents the image processing workstation 9, and the functions of the individual processing units in the dashed line frame are implemented by executing predetermined programs on the image processing workstation 9. Further, a real endoscopic image IRE, an endoscope position PE, a surgical tool position PT, a 3D medical image V, a position of interest PI and a virtual endoscopic image IVE in the dashed line frame are data written in and read from predetermined memory areas of the image processing workstation 9 by the individual processing units in the dashed line frame.
Next, using the flow chart shown in FIG. 3, a schematic flow of operations by the user performed on the endoscopic observation support system and operations performed by the above-mentioned individual processing units according to the first embodiment of the invention is described.
Prior to observation of the interior of the abdominal cavity of a subject using the endoscope 1, the 3D medical image forming unit 5 images the interior of the abdominal cavity of the subject to form the 3D medical image V. On the image processing workstation 9, the 3D medical image obtaining unit 24 obtains the 3D medical image V formed by the 3D medical image forming unit 5 (#1), and then the position of interest identifying unit 25 shows a user interface for receiving a user operation to specify a structure of interest (for example, a site of surgical interest) in the body cavity shown in the 3D medical image V obtained by the 3D medical image obtaining unit 24, and identifies the position PI of the specified structure of interest in the 3D medical image V based on the obtained 3D medical image V (#2).
Then, as written on the right side of the flow chart shown in FIG. 3, during endoscopic surgery of the structure of interest, i.e., during observation of the interior of the abdominal cavity of the subject using the endoscope 1, the real endoscopic image forming unit 2 repeatedly forms the real endoscopic image IRE taken with the endoscope 1 inserted in the body cavity at a predetermined frame rate, and the formed real endoscopic image IRE is displayed real-time as a live-view image on the real endoscopic image display 4 until the observation is finished (#7: YES). Further, the endoscope position detecting unit 11 and the surgical tool position detecting unit 12 repeatedly detect the real-time positions PSE, PST of the endoscope 1 and the surgical tool 6 inserted in the body cavity at predetermined time intervals.
On the image processing workstation 9, the real endoscopic image obtaining unit 21 obtains the real endoscopic image IRE formed by the real endoscopic image forming unit 2 (#3). Almost at the same time with this, the endoscope position obtaining unit 22 obtains the endoscope position PSE detected by the endoscope position detecting unit 11 and outputs the endoscope position PE, which is obtained by converting the obtained endoscope position PSE into a position in the coordinate system of the 3D medical image V, and the surgical tool position obtaining unit 23 obtains the surgical tool position PST detected by the surgical tool position detecting unit 12 and outputs the surgical tool position PT, which is obtained by converting the obtained surgical tool position PST into a position in the coordinate system of the 3D medical image V (#4).
The virtual endoscopic image generating unit 26 generates, from the 3D medical image V obtained by the 3D medical image obtaining unit 24 and inputted thereto, the virtual endoscopic image IVE based on the position PI of the structure of interest identified by the position of interest identifying unit 25, the endoscope position PE obtained by the endoscope position obtaining unit 22, and the surgical tool position PT obtained by the surgical tool position obtaining unit 23 (#5). The virtual endoscopic image IVE is an image representing the interior of the abdominal cavity of the subject, where the position PI of the structure of interest is the view point and the surgical tool position PT is the center of the field of view. If the endoscope position PE is present in the field of view of the virtual endoscopic image IVE, a shape image representing the surgical tool 6 and a shape image representing the endoscope 1 are combined with the virtual endoscopic image IVE.
The display control unit 27 causes the WS display 10 to display the real endoscopic image IRE obtained by the real endoscopic image obtaining unit 21 and the virtual endoscopic image IVE generated by the virtual endoscopic image generating unit 26 side by side on a single screen (#6).
On the image processing workstation 9, operations to obtain a new real endoscopic image IRE (#3), to obtain the endoscope position PE and the surgical tool position PT at that point of time (#4), to generate the virtual endoscopic image IVE (#5) and to update the displayed real endoscopic image IRE and virtual endoscopic image IVE (#6) are repeated, unless an operation to instruct to end the observation is made (#7: No). With this, the real endoscopic image IRE and the virtual endoscopic image IVE are continuously displayed on the WS display 10 in a temporally synchronized manner. When the operation to instruct to end the observation is made (#7: Yes), the image processing workstation 9 ends the repeated operations in steps #3 to #6 described above.
Next, details of the operations performed by the individual processing units in the image processing workstation 9 are described.
The real endoscopic image obtaining unit 21 is a communication interface that receives the real endoscopic image IRE via communication with the real endoscopic image forming unit (digital processor) 2 and stores the real endoscopic image IRE in a predetermined memory area of the image processing workstation 9. The real endoscopic image IRE is transferred from the real endoscopic image forming unit 2 based on a request from the real endoscopic image obtaining unit 21. FIG. 5 schematically illustrates one example of the real endoscopic image IRE.
The endoscope position obtaining unit 22 has a function of a communication interface to obtain the endoscope position PSE via communication with the endoscope position detecting unit 11, and a function of converting the obtained endoscope position PSE in the 3D coordinate system of the position sensor 8 into the endoscope position PE represented by coordinate values in the 3D coordinate system of the 3D medical image V and storing the endoscope position PE in a predetermined memory area of the image processing workstation 9. With respect to the former communication interface function, the endoscope position PSE is obtained from the endoscope position detecting unit 11 based on a request from the endoscope position obtaining unit 22. With respect to the latter coordinate transformation function, an amount of rotation of the coordinate axes is calculated in advance based on a correspondence relationship between the orientation of each coordinate axis in the 3D coordinate system of the position sensor and the orientation of each coordinate axis in the 3D coordinate system of the 3D medical image V, and coordinate values of a position on the subject corresponding to the origin of the 3D medical image V in the 3D coordinate system of the position sensor 8 are measured in advance to calculate an amount of translation between the coordinate axes based on the coordinate values of the origin. Then, the conversion of the endoscope position PSE represented by the 3D coordinate system of the position sensor 8 into the endoscope position PE represented by the coordinate values in the 3D coordinate system of the 3D medical image V can be achieved using a matrix that applies rotation by the calculated amount of rotation and translation by the calculated amount of translation.
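Expressed as code, this transformation is a single rigid (rotation plus translation) mapping. The following Python sketch builds the matrix from the precomputed rotation R and the measured sensor-frame coordinates of the image origin; the helper names are illustrative assumptions, not taken from the patent:

    import numpy as np

    def make_sensor_to_image_transform(R, image_origin_in_sensor):
        # R: (3, 3) rotation aligning the sensor axes with the image axes.
        # image_origin_in_sensor: sensor-frame coordinates of the point on
        # the subject that corresponds to the origin of the 3D medical image V.
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = -R @ image_origin_in_sensor  # the image origin maps to (0, 0, 0)
        return T

    def to_image_coords(T, p_sensor):
        # Convert a detected position such as PSE or PST into PE or PT.
        return (T @ np.append(p_sensor, 1.0))[:3]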
Similarly to the endoscope position obtaining unit 22, the surgical tool position obtaining unit 23 has a function of a communication interface to obtain the surgical tool position PST via communication with the surgical tool position detecting unit 12, and a function of converting the obtained surgical tool position PST in the 3D coordinate system of the position sensor 8 into the surgical tool position PT represented by the coordinate values in the 3D coordinate system of the 3D medical image V and storing the surgical tool position PT in a predetermined memory area of the image processing workstation 9.
The 3D medical image obtaining unit 24 has a function of a communication interface to receive the 3D medical image V from the 3D medical image forming unit 5 and store the 3D medical image V in a predetermined memory area of the image processing workstation 9.
The position of interest identifying unit 25 shows, on a cross-sectional image representing a predetermined cross-section generated from the 3D medical image V using the known MPR method, a user interface for receiving an operation to specify the structure of interest via the pointing device or keyboard of the image processing workstation 9. For example, when the pointing device is clicked on the structure of interest shown in the cross-sectional image, the position of interest identifying unit 25 identifies the position PI of the structure of interest, which has been specified by the click, in the 3D medical image V, and stores the position PI in a predetermined memory area of the image processing workstation 9. As the structure of interest, a site of surgical interest or a site that requires attention during surgery may be specified, as desired by the user.
The virtual endoscopic image generating unit 26 generates the virtual endoscopic image IVE from the 3D medical image V inputted thereto based on the position PI of the structure of interest, the endoscope position PE and the surgical tool position PT. FIG. 4 schematically illustrates one example of a positional relationship among the endoscope 1, the surgical tool 6 and the structure of interest PI, and angles of view of the endoscope 1 and the virtual endoscope. As shown in the drawing, the virtual endoscopic image generating unit 26 uses the position PI of the structure of interest as the view point and the surgical tool position PT as the center of the field of view to set a plurality of lines of sight radiating from the view point PI within the range of an angle of view AV, and generates a preliminary virtual endoscopic image by projecting pixel values along each line of sight by volume rendering using the known perspective projection. The angle of view AV of the preliminary virtual endoscopic image is set to be wider than the angle of view AR of the real endoscopic image IRE through startup parameters of the program. For the volume rendering, a color template is used, which defines color and transparency in advance such that an image having almost the same appearance as that of the sites in the abdominal cavity shown in the real endoscopic image IRE is obtained. Further, the virtual endoscopic image generating unit 26 generates a surgical tool shape image MT showing a state where the surgical tool 6 is present at the surgical tool position PT, and an endoscope shape image ME showing a state where the endoscope 1 is present at the endoscope position PE if the endoscope position PE is present in the field of view of the virtual endoscopic image. Specifically, the surgical tool shape image MT and the endoscope shape image ME are generated based on images representing the shapes of the endoscope 1 and the surgical tool 6 stored in a database, as well as the surgical tool position PT and the endoscope position PE, as taught in the above-mentioned Patent Document 2. Then, the virtual endoscopic image generating unit 26 generates the virtual endoscopic image IVE by combining the preliminary virtual endoscopic image with the surgical tool shape image MT and the endoscope shape image ME by a known technique, such as alpha blending. FIG. 6 schematically illustrates one example of the thus generated virtual endoscopic image IVE, wherein the shape image MT representing the surgical tool 6 is superimposed at the surgical tool position PT near the center of the field of view, and the shape image ME representing the endoscope 1 is superimposed at the endoscope position PE in the field of view, and the image as a whole virtually represents a state where the interior of the abdominal cavity is viewed with an endoscope from the position of the structure of interest PI shown in FIG. 4 during endoscopic surgery.
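The field-of-view geometry implied above can be made concrete with a short sketch: the central line of sight runs from the view point PI toward PT, and the endoscope shape image ME need only be generated when PE lies within the cone spanned by the angle of view AV. The following Python fragment is an illustration under these assumptions (the function name is ours, and positions are taken to be NumPy vectors in the coordinate system of the 3D medical image V):

    import numpy as np

    def in_virtual_fov(p_i, p_t, p_e, angle_of_view_deg):
        # The central line of sight runs from the view point PI toward the
        # surgical tool position PT; the endoscope position PE is inside the
        # field of view if its direction from PI deviates from that line by
        # no more than half the angle of view AV.
        center = (p_t - p_i) / np.linalg.norm(p_t - p_i)
        to_endoscope = (p_e - p_i) / np.linalg.norm(p_e - p_i)
        cos_dev = np.clip(center @ to_endoscope, -1.0, 1.0)
        return np.degrees(np.arccos(cos_dev)) <= angle_of_view_deg / 2.0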
The display control unit 27 generates a display screen where the real endoscopic image IRE and the virtual endoscopic image IVE are displayed side by side on a single screen and outputs the generated screen to the WS display 10. In this manner, the display screen where the real endoscopic image IRE schematically shown in FIG. 5 as an example and the virtual endoscopic image IVE schematically shown in FIG. 6 as an example are displayed side by side is displayed on the WS display 10.
As described above, in the first embodiment of the invention, prior to observation of the interior of the abdominal cavity using the endoscope 1, the 3D medical image obtaining unit 24 obtains the 3D medical image V formed by the 3D medical image forming unit 5, and the position of interest identifying unit 25 identifies the position PI of the structure of interest in the abdominal cavity in the 3D medical image V. During the observation, the real endoscopic image obtaining unit 21 obtains the real endoscopic image IRE formed by the real endoscopic image forming unit 2, and at the same time, the endoscope position obtaining unit 22 obtains the position PE of the endoscope 1 in the 3D medical image V detected by the endoscope position detecting unit 11 and the surgical tool position obtaining unit 23 obtains the position PT of the surgical tool 6 in the 3D medical image V detected by the surgical tool position detecting unit 12. Then, the virtual endoscopic image generating unit 26 generates, from the 3D medical image V, the virtual endoscopic image IVE, in which the position PI of the structure of interest is the view point and the surgical tool position PT is the center of the field of view, and the endoscope 1 and the surgical tool 6 are combined at the endoscope position PE and the surgical tool position PT, respectively. Then, the display control unit 27 causes the WS display 10 to display the real endoscopic image IRE and the virtual endoscopic image IVE. The thus displayed virtual endoscopic image IVE looks like an image taken with a camera that monitors the approach of the endoscope 1 and the surgical tool 6 to the position PI of the structure of interest. By using the virtual endoscopic image IVE to compensate for the narrow field of view of the real endoscopic image IRE, the approach of the endoscope 1 and the surgical tool 6 to the structure of interest can be recognized more reliably, thereby helping to prevent misoperation, or the like, during surgery or examination.
Further, at this time, the field of view of the virtual endoscope of the continuously displayed virtual endoscopic image IVE is changed real-time by feedback of the real-time positions of the endoscope 1 and the surgical tool 6 detected by the endoscope position detecting unit 11 and the surgical tool position detecting unit 12. This allows the user to dynamically and more appropriately recognize the approach of the endoscope 1 and the surgical tool 6 to the structure of interest.
Further, the real endoscopic image forming unit 2 forms the real endoscopic image IRE representing the interior of the body cavity taken real-time with the endoscope 1, and the real endoscopic image IRE which is formed almost at the same time when the position of the endoscope 1 or the surgical tool 6 used to generate the virtual endoscopic image IVE is detected is further displayed. The real endoscopic image IRE and the virtual endoscopic image IVE show the state of the interior of the body cavity at almost the same point of time, and the real endoscopic image IRE and the virtual endoscopic image IVE are continuously displayed in a temporally synchronized manner. Further, at this time, the field of view of the real endoscopic image IRE changes along with movement or rotation of the endoscope 1, and the field of view of the virtual endoscopic image IVE changes along with movement of the surgical tool 6. In this manner, in the first embodiment of the invention, the user can observe the interior of the body cavity using the real endoscopic image IRE and the virtual endoscopic image IVE complementarily.
Still further, the virtual endoscopic image generating unit 26 generates the virtual endoscopic image IVE using the color template, which defines color and transparency in advance such that an image having almost the same appearance as that of the sites in the abdominal cavity shown in the real endoscopic image IRE is obtained. Therefore, the user can observe both the real endoscopic image IRE and the virtual endoscopic image IVE displayed side by side on the WS display 10 by the display control unit 27 without any sense of inconsistency.
A second embodiment of the invention is a modification of the volume rendering process carried out by the virtual endoscopic image generating unit 26. The hardware configuration, the functional blocks and the overall flow of the process of the endoscopic observation support system of the second embodiment are the same as those of the first embodiment.
FIG. 7A schematically illustrates one example of a positional relationship between the structure of interest and the surgical tool 6. As shown in the drawing, in a case where there is an anatomical structure that obstructs the view between the position PI of the structure of interest, which is used as the view point of the virtual endoscopic image IVE, and the surgical tool position PT, if the color template is defined to provide the anatomical structure with high opacity, the surgical tool 6 behind the anatomical structure is not shown in the virtual endoscopic image IVE. Therefore, in the second embodiment of the invention, the virtual endoscopic image generating unit 26 generates the virtual endoscopic image IVE using a color template that defines opacity such that the sites in the body cavity are shown semitransparent. In the thus generated virtual endoscopic image IVE, as schematically shown in FIG. 7B, the anatomical structure present between the position PI of the structure of interest and the surgical tool position PT is shown semitransparent, and the surgical tool shape image MT is shown in a visually recognizable manner at a position corresponding to the surgical tool position PT behind the anatomical structure. Such an image where an anatomical structure in the abdominal cavity is shown semitransparent cannot be formed by the real endoscopic image forming unit 2, and therefore the practical value of using the virtual endoscopic image IVE showing such a semitransparent anatomical structure complementarily to the real endoscopic image IRE is very high.
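Semitransparency of this kind falls out of standard front-to-back alpha compositing: as long as the opacity assigned to the obstructing tissue stays below 1, contributions from points farther along the line of sight (such as the surgical tool shape image MT) remain visible. A minimal Python sketch, assuming a color_template callable that maps a sampled voxel value to a (color, opacity) pair; this interface is our assumption, as the patent only states that the template defines color and opacity:

    def composite_ray(samples, color_template):
        # Front-to-back alpha compositing along one line of sight.
        # samples: voxel values sampled from the view point PI toward PT;
        # color is treated as a scalar intensity here for brevity.
        color, alpha = 0.0, 0.0
        for v in samples:
            c, a = color_template(v)
            color += (1.0 - alpha) * a * c   # contribution weighted by remaining transparency
            alpha += (1.0 - alpha) * a       # accumulate opacity
            if alpha >= 0.99:                # early termination once nearly opaque
                break
        return color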
A third embodiment of the invention is also a modification of the volume rendering process carried out by the virtual endoscopic image generating unit 26. The hardware configuration, the functional blocks and the overall flow of the process of the endoscopic observation support system of the third embodiment are the same as those of the first embodiment.
FIG. 8A schematically illustrates one example of the color template used in the third embodiment of the invention. As shown in the drawing, this color template is defined such that the color of the virtual endoscopic image IVE changes depending on the distance from the view point PI on the structure of interest to the surface of a structure in the abdominal cavity. For example, the virtual endoscopic image generating unit 26 detects a position where a change of the pixel value along each line of sight of the perspective projection is larger than a predetermined threshold, or a position where the pixel value is equal to or larger than a predetermined threshold, as the surface of a structure in the abdominal cavity, and calculates the distance from the view point PI to the surface of the structure in the abdominal cavity. Then, the virtual endoscopic image generating unit 26 uses the color template to determine the pixel value of the detected surface of the structure shown in the virtual endoscopic image IVE. The thus generated virtual endoscopic image IVE has a thinner color at the surface of a structure nearer to the position PI of the structure of interest, and a denser color at the surface of a structure farther from the position PI of the structure of interest, as schematically shown in FIG. 8B as an example. In this manner, the depth perception of the virtual endoscopic image IVE, which is otherwise hard to perceive, can be compensated for, making it easier for the user to recognize the approach of the endoscope 1 and the surgical tool 6. It should be noted that the color and density of the shape image ME of the endoscope 1 and the shape image MT of the surgical tool displayed in the virtual endoscopic image IVE may also be changed depending on the distance from the position PI of the structure of interest in a manner similar to that described above.
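The surface detection and depth-dependent shading described above might be sketched as follows. The linear distance-to-density ramp and the d_max normalization constant are our assumptions for illustration; the actual mapping is defined by the color template of FIG. 8A:

    import numpy as np

    def shade_surface(p_i, ray_samples, ray_points, threshold, d_max, base_color):
        # Walk one line of sight; the first sample at or above the
        # threshold is taken as the surface of a structure.
        for value, point in zip(ray_samples, ray_points):
            if value >= threshold:
                d = np.linalg.norm(point - p_i)   # distance from the view point PI
                density = min(d / d_max, 1.0)     # thin color when near, dense when far
                return base_color * density
        return None  # no surface hit within the sampled range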
As shown in the functional block diagram of FIG. 9, a fourth embodiment of the invention includes a warning determination unit 28 in addition to the components of the first embodiment. The hardware configuration of the endoscopic observation support system of the fourth embodiment is the same as that of the first embodiment.
The warning determination unit 28 is a processing unit implemented on the image processing workstation 9. The warning determination unit 28 calculates a distance between the endoscope position PE and the position PI of the structure of interest and a distance between the surgical tool position PT and the position PI of the structure of interest. If either of the calculated distances is smaller than a predetermined threshold, i.e., if the endoscope 1 or the surgical tool 6 approaches too close to the structure of interest, the warning determination unit 28 outputs a warning message WM.
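The determination itself is a pair of Euclidean distance checks. A minimal sketch (the message strings and the single shared threshold are illustrative assumptions):

    import numpy as np

    def warning_message(p_i, p_e, p_t, threshold):
        # Returns the warning WM when the endoscope or the surgical tool
        # is closer to the structure of interest PI than the threshold.
        if np.linalg.norm(p_e - p_i) < threshold:
            return "CAUTION - APPROACHING: endoscope"
        if np.linalg.norm(p_t - p_i) < threshold:
            return "CAUTION - APPROACHING: surgical tool"
        return None  # no warning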
FIG. 10 is a flow chart illustrating the flow of the endoscopic observation support process according to the fourth embodiment of the invention. As shown in the drawing, after the real endoscopic image IRE and the virtual endoscopic image IVE are displayed in step #6 of the first embodiment, the warning determination unit 28 compares each of the above-described distances with the threshold (#11). If either of the above-described distances is smaller than the threshold (#11: Yes), the warning determination unit 28 outputs the warning message WM, and the display control unit 27 superimposes an arrow mark with a comment "CAUTION - APPROACHING" in the vicinity of the displayed endoscope 1 or surgical tool 6 (the endoscope 1 is shown in the drawing) that is too close to the structure of interest, and shows the shape image representing the endoscope 1 or the surgical tool in a denser display color, as shown in FIG. 11 as an example. This helps the user recognize the abnormal approach of the endoscope 1 or the surgical tool 6 to the structure of interest, thereby helping to prevent misoperation of the endoscope 1 and the surgical tool 6. Such a warning display is particularly effective when a blood vessel, or the like, which would cause massive bleeding if damaged during surgery, is specified as the structure of interest at the position of interest identifying unit 25.
Besides being superimposed on the displayed virtual endoscopic image IVE, as described above, the warning message may be outputted in the form of a warning sound or voice, or may be outputted both as the superimposed warning message and the warning sound. Further, a risk determination table that defines a risk depending on the distance in a stepwise manner may be prepared in advance, and the warning determination unit 28 may reference the risk determination table based on the calculated distance to determine the risk, and the determined value of the risk may be outputted as the warning message WM and the display control unit 27 may display an icon, or the like, corresponding to the risk on the WS display 10.
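A stepwise risk determination table of the kind mentioned above could be as simple as an ordered list of (distance bound, risk) steps. The cutoff values below are illustrative assumptions, not values from the patent:

    # (upper distance bound in mm, risk level); checked in order.
    RISK_TABLE = [(5.0, 3), (10.0, 2), (20.0, 1)]

    def risk_level(distance_mm):
        # Return the stepwise risk for the calculated distance; 0 means the
        # instrument is far enough from the structure of interest.
        for bound, risk in RISK_TABLE:
            if distance_mm < bound:
                return risk
        return 0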
As shown in the functional block diagram of FIG. 12, a fifth embodiment of the invention includes an attention-required structure detecting unit 29 in addition to the components of the first embodiment. The hardware configuration of the endoscopic observation support system is the same as that of the first embodiment.
The attention-required structure detecting unit 29 is a processing unit implemented on the image processing workstation 9. The attention-required structure detecting unit 29 detects a region of an attention-required structure RA from the 3D medical image V inputted thereto using a known image recognition technique. FIG. 14A schematically illustrates one example of a positional relationship between the structure of interest and the attention-required structure. In this example, the attention-required structure detecting unit 29 detects an attention-required blood vessel region RA that is located behind the abdominal wall by performing known blood vessel extraction processing.
FIG. 13 is a flow chart illustrating the flow of the endoscopic observation support process according to the fifth embodiment of the invention. As shown in the drawing, after the position of interest PI is identified in step #2 of the first embodiment, the attention-required structure detecting unit 29 detects the region of the attention-required structure RA (#13). In step #5, the virtual endoscopic image generating unit 26 generates the virtual endoscopic image IVE using a color template that is defined to show the region of the attention-required structure RA in a visually recognizable manner. FIG. 14B schematically illustrates one example of the generated virtual endoscopic image IVE. The virtual endoscopic image IVE shown in the drawing is generated using a color template that defines color and opacity such that pixels representing the abdominal wall are shown semitransparent to increase the visual recognizability of pixels representing the blood vessel. This increases the visual recognizability of the attention-required structure, thereby helping to prevent misoperation of the endoscope 1 and the surgical tool 6, similarly to the fourth embodiment.
It should be noted that the attention-required structure detecting unit 29 may detect the region of attention-required structure RA via manual operation by the user. Further, a marker, such as an arrow, and an annotation, such as a text comment, may be superimposed on the region of attention-required structure RA.
In a sixth embodiment of the invention, the 3D medical image V is formed and obtained in real time during the observation using the endoscope. In this case, the endoscope marker 7a, the surgical tool marker 7b and the position sensor 8 in the hardware configuration of the first embodiment (see FIG. 1) are not necessary.
FIG. 15 is a functional block diagram of the endoscopic observation support system according to the sixth embodiment of the invention. As shown in the drawing, the endoscopic observation support system of the sixth embodiment includes an endoscope/surgical tool position recognition unit 30 in place of the endoscope position detecting unit 11, the surgical tool position detecting unit 12, the endoscope position obtaining unit 22 and the surgical tool position obtaining unit 23 of the first embodiment. That is, the endoscope/surgical tool position recognition unit 30 corresponds to the position detecting means of the invention.
The endoscope/surgical tool position recognition unit 30 is a processing unit implemented on the image processing workstation 9. The endoscope/surgical tool position recognition unit 30 extracts an area showing the endoscope 1 or the surgical tool 6 from the 3D medical image V inputted thereto using known pattern recognition processing to recognize the endoscope position PE and the surgical tool position PT.
FIG. 16 is a flow chart illustrating the flow of the endoscopic observation support process according to the sixth embodiment of the invention. As shown in the drawing, after the real endoscopic image IRE is obtained in step #3 of the first embodiment, the 3D medical image obtaining unit 24 obtains the 3D medical image V (#14), and the endoscope/surgical tool position recognition unit 30 recognizes the endoscope position PE and the surgical tool position PT based on the 3D medical image V obtained by the 3D medical image obtaining unit 24 (#15). In step #5, the virtual endoscopic image generating unit 26 generates the virtual endoscopic image IVE using a color template that is defined such that the area showing the endoscope 1 or the surgical tool 6 extracted by the endoscope/surgical tool position recognition unit 30 is displayed in a predetermined color. Therefore, it is not necessary to generate the shape images of the endoscope 1 and the surgical tool 6 as in the first embodiment. By forming and obtaining the 3D medical image V in real time during the observation using the endoscope in this manner, the obtained 3D medical image V shows the state of the interior of the abdominal cavity at almost the same point of time as that shown in the real endoscopic image IRE. Therefore, the generated virtual endoscopic image IVE more accurately shows the real-time state of the interior of the abdominal cavity than in a case where a 3D medical image V obtained before the observation using the endoscope is used. It should be noted that, when the 3D medical image V is taken in steps #1 and #14 in this embodiment, it is necessary to pay attention to the posture of the subject during imaging so that the position of the subject corresponding to the origin of the coordinate axes and the orientation of the coordinate axes are not changed.
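Step #15 might be outlined as follows, assuming, purely for illustration, that the instrument appears as the largest connected high-intensity component of the volume; the intensity threshold and the use of a centroid are assumptions, not the embodiment's prescribed pattern recognition processing.

```python
import numpy as np
from scipy import ndimage

def recognize_instrument_position(volume, intensity_threshold=3000):
    """Sketch of step #15: locate an instrument-like region in the 3D
    medical image V and return its centroid as the recognized position."""
    mask = volume >= intensity_threshold
    labeled, num_components = ndimage.label(mask)
    if num_components == 0:
        return None
    sizes = ndimage.sum(mask, labeled, range(1, num_components + 1))
    instrument_label = int(np.argmax(sizes)) + 1
    return ndimage.center_of_mass(mask, labeled, instrument_label)

volume = np.zeros((64, 64, 64))
volume[30:34, 10:40, 32] = 4000  # synthetic rod-shaped "tool"
print(recognize_instrument_position(volume))  # ~ (31.5, 24.5, 32.0)
```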
In the sixth embodiment of the invention, it is preferable to use an ultrasound diagnostic device as the modality 5, in view of reducing radiation exposure of the subject.
The above-described embodiments are merely examples and should not be construed as limiting the technical scope of the invention.
Further, variations and modifications made to the system configuration, the hardware configuration, the process flow, the module configuration, the user interface and the specific contents of the process of the above-described embodiments without departing from the scope and spirit of the invention are also within the technical scope of the invention.
For example, with respect to the system configuration, although the modality 5 is directly connected to the image processing workstation 9 in the hardware configuration of FIG. 1 of the above-described embodiments, an image storage server may be connected to the LAN and the 3D medical image V formed by the modality 5 may once be stored in a database of the image storage server, so that the 3D medical image V is transferred from the image storage server to the image processing workstation 9 in response to a request from the image processing workstation 9.
The endoscope 1 need not be a rigid endoscope; a flexible endoscope or a capsule endoscope may be used instead.
As the modality 5, besides the above-mentioned CT device and ultrasound diagnostic device, an MRI device, etc., may be used.
The WS display 10 may be a display that supports known stereoscopic display so as to display the virtual endoscopic image IVE as a stereoscopic image. For example, in a case where the WS display 10 is a display device that achieves stereoscopic display using two parallax images for the right and left eyes, the virtual endoscopic image generating unit 26 may generate virtual endoscope parallax images for the right and left eyes by setting right and left eye positions, which are shifted from the position PI of the structure of interest by the amount of parallax between the right and left eyes, and performing perspective projection using the thus-set right and left eye positions as the view points. Then, the display control unit 27 may exert control such that the display pixels of the WS display 10 for the left eye display the virtual endoscope parallax image for the left eye and the display pixels of the WS display 10 for the right eye display the virtual endoscope parallax image for the right eye.
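The derivation of the two view points can be sketched as follows; the parallax offset and the vector conventions are illustrative assumptions, and the perspective projection itself is left to the renderer.

```python
import numpy as np

def stereo_view_points(p_interest, view_direction, up=(0.0, 0.0, 1.0),
                       parallax_mm=4.0):
    """Sketch: derive left/right eye view points shifted from the
    position P_I of the structure of interest by the parallax amount."""
    forward = np.asarray(view_direction, float)
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, np.asarray(up, float))
    right /= np.linalg.norm(right)
    p = np.asarray(p_interest, float)
    return p - 0.5 * parallax_mm * right, p + 0.5 * parallax_mm * right

# Each returned view point is used to render one perspective projection;
# the display control unit then routes each image to the matching pixels.
left_eye, right_eye = stereo_view_points((10.0, 20.0, 30.0), (0.0, 1.0, 0.0))
print(left_eye, right_eye)
```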
The endoscope position detecting unit 11 and the surgical tool position detecting unit 12 may use a magnetic system, or may use a gyro or a rotary encoder, as taught in Patent Document 2.
The body site to be observed is not limited to the interior of the abdominal cavity; it may be any site of the subject suitable for observation using an endoscope, such as the interior of the thoracic cavity.
In the above-described embodiments, the image processing workstation 9 receives the image based on a request from the real endoscopic image obtaining unit 21, taking the communication load into account, on the assumption that the cycle at which the real endoscopic image forming unit 2 forms the real endoscopic image IRE is shorter than the cycle at which the virtual endoscopic image generating unit 26 generates the virtual endoscopic image IVE. However, the real endoscopic image obtaining unit 21 may receive all the real endoscopic images IRE sequentially formed by the real endoscopic image forming unit 2. In this case, the display control unit 27 may update the real endoscopic image IRE displayed on the WS display 10 each time the real endoscopic image IRE is received, asynchronously with the timing of generation of the virtual endoscopic image IVE by the virtual endoscopic image generating unit 26.
The endoscope position obtaining unit 22 may receive all the endoscope positions PSE detected at predetermined time intervals by the endoscope position detecting unit 11, and may convert only the endoscope position PSE received at the time when the operation in step #4 of FIG. 3 is invoked into the endoscope position PE by the coordinate transformation function and output it. The same applies to the surgical tool position obtaining unit 23.
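The coordinate transformation function itself can be sketched as a single homogeneous matrix multiplication; the matrix below is an illustrative stand-in for the calibration actually used.

```python
import numpy as np

def to_image_coordinates(p_sensor, T_sensor_to_image):
    """Sketch: map a detected position P_SE from the position-sensor
    coordinate system into the 3D medical image coordinate system P_E."""
    p_h = np.append(np.asarray(p_sensor, float), 1.0)  # homogeneous coords
    return (T_sensor_to_image @ p_h)[:3]

# Illustrative rigid transform: 90-degree rotation about z plus a shift.
T = np.array([[0.0, -1.0, 0.0, 100.0],
              [1.0,  0.0, 0.0,  50.0],
              [0.0,  0.0, 1.0, -20.0],
              [0.0,  0.0, 0.0,   1.0]])
print(to_image_coordinates((10.0, 0.0, 5.0), T))  # -> [100.  60. -15.]
```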
The coordinate transformation carried out by the endoscope position obtaining unit 22 and the surgical tool position obtaining unit 23 in the above-described embodiments may instead be carried out by the virtual endoscopic image generating unit 26.
The position of interest identifying unit 25 may automatically identify the position of interest using a known image recognition technique (such as a technique for extracting blood vessels or an organ, or a technique for detecting an abnormal shadow).
The virtual endoscopic image generating unit 26 may set the endoscope position PE at the center of the field of view so that the endoscope 1 is always within the field of view, in place of setting the surgical tool position PT at the center of the field of view so that the surgical tool 6 is always within the field of view. Further alternatively, an internally dividing point of the segment PE-PT, such as the midpoint between the endoscope position PE and the surgical tool position PT, may be set at the center of the field of view, and the angle of view may be set such that both the endoscope 1 and the surgical tool 6 are within the field of view. Still alternatively, the angle of view and a magnification factor may be adjusted depending on the distances between the position PI of the structure of interest, the endoscope position PE and the surgical tool position PT. For example, if the distances between these positions are small, the angle of view may be set narrower and the magnification factor larger than when the distances are large, so that the area in the field of view is magnified to facilitate observation.
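These field-of-view rules can be sketched together as follows; the interpolation constants are illustrative assumptions.

```python
import numpy as np

def view_parameters(p_interest, p_endoscope, p_tool, t=0.5,
                    narrow_deg=60.0, wide_deg=120.0,
                    near_mm=20.0, far_mm=80.0):
    """Sketch: center the field of view on an internally dividing point
    of the segment P_E-P_T (t=0.5 is the midpoint) and narrow the angle
    of view (i.e. raise magnification) as the instruments approach P_I."""
    p_e = np.asarray(p_endoscope, float)
    p_t = np.asarray(p_tool, float)
    center = (1.0 - t) * p_e + t * p_t
    d = max(np.linalg.norm(p_e - np.asarray(p_interest, float)),
            np.linalg.norm(p_t - np.asarray(p_interest, float)))
    w = np.clip((d - near_mm) / (far_mm - near_mm), 0.0, 1.0)
    return center, narrow_deg + w * (wide_deg - narrow_deg)

print(view_parameters((0, 0, 0), (30, 0, 0), (0, 40, 0)))  # midpoint, ~80 deg
```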
In place of combining the shape image representing the endoscope 1 or the surgical tool 6 with the virtual endoscopic image IVE, a marker, such as an arrow, may be displayed at the endoscope position PE or the surgical tool position PT.
The virtual endoscopic image generating unit 26 may generate virtual endoscopic images IVE viewed from a plurality of view points by setting a plurality of positions of interest, such as a site of surgical interest, an attention-required blood vessel and an attention-required organ, as the view points.
The image processing workstation 9 may generate and display an image other than the above-described real endoscopic image IRE and virtual endoscopic image IVE. For example, a virtual endoscopic image with a wider angle of view AV than the angle of view AE of the endoscope 1, as schematically shown in FIG. 17A as an example, may further be generated by setting the endoscope position PE as the view point and using a magnification factor that renders an object of interest at almost the same size as in the real endoscopic image IRE. A new image, as schematically shown in FIG. 17B as an example, may then be generated and displayed in which the real endoscopic image is superimposed on the thus-generated virtual endoscopic image with the centers of the fields of view, i.e., the endoscope position PE, aligned.
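The composite of FIG. 17B can be sketched as a centered paste, assuming both images were produced at matching magnification so that a common center aligns them; the array sizes are illustrative.

```python
import numpy as np

def superimpose_centered(virtual_img, real_img):
    """Sketch of FIG. 17B: overlay the real endoscopic image on a
    wider-angle virtual endoscopic image so that both share the same
    field-of-view center (the endoscope position P_E)."""
    out = virtual_img.copy()
    vh, vw = virtual_img.shape[:2]
    rh, rw = real_img.shape[:2]
    top, left = (vh - rh) // 2, (vw - rw) // 2
    out[top:top + rh, left:left + rw] = real_img
    return out

virtual = np.zeros((480, 640, 3), dtype=np.uint8)   # wide-angle AV rendering
real = np.full((240, 320, 3), 255, dtype=np.uint8)  # narrow-angle AE image
print(superimpose_centered(virtual, real).shape)    # (480, 640, 3)
```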
Alternatively, a composite image of the real endoscopic image and the virtual endoscopic image may be generated. For example, a virtual endoscopic image showing only a visualized blood vessel structure, as schematically shown in FIG. 18B as an example, may be combined with a real endoscopic image, as schematically shown in FIG. 18A as an example, to generate a real endoscopic image with an emphasized blood vessel structure, as schematically shown in FIG. 18C as an example.
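The composite of FIG. 18C can be sketched as an alpha blend, assuming the two images are registered and the vessel-only rendering carries an alpha channel that is nonzero on vessel pixels.

```python
import numpy as np

def emphasize_vessels(real_img, vessel_rgba):
    """Sketch of FIG. 18C: alpha-blend a virtual endoscopic rendering
    that shows only the vessel structure onto the real endoscopic image."""
    rgb = vessel_rgba[..., :3].astype(float)
    alpha = vessel_rgba[..., 3:4].astype(float) / 255.0
    blended = (1.0 - alpha) * real_img.astype(float) + alpha * rgb
    return blended.astype(np.uint8)

real = np.full((240, 320, 3), 128, dtype=np.uint8)
vessels = np.zeros((240, 320, 4), dtype=np.uint8)
vessels[100:140, :] = (255, 0, 0, 180)               # synthetic vessel band
print(emphasize_vessels(real, vessels)[120, 160])    # emphasized red pixel
```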

Claims (14)

The invention claimed is:
1. An endoscopic observation support system comprising:
a 3D medical image forming unit configured to form a 3D medical image representing an interior of a body cavity of a subject;
a position of interest identifying unit configured to identify a position of a structure of interest in the body cavity in the 3D medical image, the structure of interest being a site of surgical interest or an anatomical structure that requires attention during endoscopic surgery;
a position detecting unit configured to detect a real-time position of at least one of an endoscope and a surgical tool inserted in the body cavity;
a virtual endoscopic image generating unit configured to generate, from the 3D medical image inputted thereto, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the view point is the position of the structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and
a display control unit configured to cause a display unit to display the virtual endoscopic image.
2. The endoscopic observation support system as claimed in claim 1, further comprising
a real endoscopic image forming unit configured to form a real endoscopic image representing the interior of the body cavity by real-time imaging with the endoscope,
wherein the display control unit further causes display of the real endoscopic image which is formed almost at the same time as the position of at least one of the endoscope and the surgical tool used to generate the virtual endoscopic image is detected.
3. The endoscopic observation support system as claimed in claim 1, wherein the virtual endoscopic image generating unit determines pixel values of the virtual endoscopic image depending on a distance from the structure of interest to a surface of a structure in the body cavity.
4. The endoscopic observation support system as claimed in claim 1, further comprising a warning unit configured to show a warning when an approach of at least one of the endoscope and the surgical tool to the structure of interest satisfies a predetermined criterion.
5. The endoscopic observation support system as claimed in claim 1, wherein the virtual endoscopic image generating unit determines pixel values of the virtual endoscopic image using a color template, wherein the color template is defined to provide the virtual endoscopic image showing sites in the body cavity in almost the same appearance as those shown in a real endoscopic image obtained by imaging with the endoscope.
6. The endoscopic observation support system as claimed in claim 1, further comprising
a second structure of interest detecting unit configured to detect a second structure of interest in the body cavity in the 3D medical image,
wherein the virtual endoscopic image generating unit generates the virtual endoscopic image in which the second structure of interest is shown in a visually recognizable manner.
7. The endoscopic observation support system as claimed in claim 6, wherein the structure of interest is a site of surgical interest during endoscopic surgery using the endoscope and the second structure of interest is an anatomical structure that requires attention during the endoscopic surgery.
8. The endoscopic observation support system as claimed in claim 1, wherein the position of the endoscope is contained in a field of view of the virtual endoscopic image, and the position of the endoscope is shown in an identifiable manner in the virtual endoscopic image.
9. An endoscopic observation support method comprising the steps of:
forming a 3D medical image representing an interior of a body cavity of a subject before or during observation of the interior of the body cavity with an endoscope inserted in the body cavity;
identifying a position of a structure of interest in the body cavity in the 3D medical image, the structure of interest being a site of surgical interest or an anatomical structure that requires attention during endoscopic surgery;
detecting a real-time position of at least one of the endoscope and a surgical tool inserted in the body cavity;
generating, from the 3D medical image inputted, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the view point is the position of the structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and
displaying the virtual endoscopic image.
10. The endoscopic observation support method as claimed in claim 9, wherein the position of the endoscope is contained in a field of view of the virtual endoscopic image, and the position of the endoscope is shown in an identifiable manner in the virtual endoscopic image.
11. An endoscopic observation support device comprising:
a 3D medical image obtaining unit configured to obtain a 3D medical image representing an interior of a body cavity of a subject;
a position of interest identifying unit configured to identify a position of a structure of interest in the body cavity in the 3D medical image, the structure of interest being a site of surgical interest or an anatomical structure that requires attention during endoscopic surgery;
a position obtaining unit configured to obtain a real-time position of at least one of an endoscope and a surgical tool inserted in the body cavity detected by a position detecting unit;
a virtual endoscopic image generating unit configured to generate, from the 3D medical image inputted thereto, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the view point is the position of the structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and
a display control unit configured to cause a display unit to display the virtual endoscopic image.
12. The endoscopic observation support device as claimed in claim 11, wherein the position of the endoscope is contained in a field of view of the virtual endoscopic image, and the position of the endoscope is shown in an identifiable manner in the virtual endoscopic image.
13. A non-transitory computer readable medium containing an endoscopic observation support program for causing a computer to carry out the steps of:
obtaining a 3D medical image representing an interior of a body cavity of a subject;
identifying a position of a structure of interest in the body cavity in the 3D medical image, the structure of interest being a site of surgical interest or an anatomical structure that requires attention during endoscopic surgery;
obtaining a real-time position of at least one of an endoscope and a surgical tool inserted in the body cavity detected by a position detecting unit;
generating, from the 3D medical image inputted, a virtual endoscopic image representing the interior of the body cavity viewed from a view point, based on the identified position of the structure of interest and the detected position of at least one of the endoscope and the surgical tool in the 3D medical image, wherein the view point is the position of the structure of interest, the position of at least one of the endoscope and the surgical tool is contained in a field of view of the virtual endoscopic image, and the position of at least one of the endoscope and the surgical tool is shown in an identifiable manner in the virtual endoscopic image; and
causing a display unit to display the virtual endoscopic image.
14. The non-transitory computer readable medium as claimed in claim 13, wherein the position of the endoscope is contained in a field of view of the virtual endoscopic image, and the position of the endoscope is shown in an identifiable manner in the virtual endoscopic image.
US13/582,603, priority 2010-03-17, filed 2011-03-16: Endoscopic observation supporting system, method, device and program. Granted as US9179822B2 (en); status Active, adjusted expiration 2032-06-05.

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
JP2010-060285 | 2010-03-17 | |
JP2010060285A (JP5421828B2) | 2010-03-17 | 2010-03-17 | Endoscope observation support system, endoscope observation support device, operation method thereof, and program
PCT/JP2011/001563 (WO2011114731A1) | 2010-03-17 | 2011-03-16 | System, method, device, and program for supporting endoscopic observation

Publications (2)

Publication Number | Publication Date
US20120327186A1 (en) | 2012-12-27
US9179822B2 (en) | 2015-11-10

Family

ID=44648843

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US13/582,603 (US9179822B2, Active, expires 2032-06-05) | Endoscopic observation supporting system, method, device and program | 2010-03-17 | 2011-03-16

Country Status (5)

Country | Link
US (1) | US9179822B2 (en)
EP (1) | EP2548495B1 (en)
JP (1) | JP5421828B2 (en)
CN (1) | CN102811655B (en)
WO (1) | WO2011114731A1 (en)



Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US6167296A | 1996-06-28 | 2000-12-26 | The Board of Trustees of the Leland Stanford Junior University | Method for volumetric image navigation
US6016439A | 1996-10-15 | 2000-01-18 | Biosense, Inc. | Method and apparatus for synthetic viewpoint imaging
US6346940B1 * | 1997-02-27 | 2002-02-12 | Kabushiki Kaisha Toshiba | Virtualized endoscope system
JPH11309A | 1997-06-12 | 1999-01-06 | Hitachi Ltd | Image processing device
US20020128547A1 | 2001-03-06 | 2002-09-12 | Olympus Optical Co., Ltd. | Medical image display apparatus and method
JP2002263053A | 2001-03-06 | 2002-09-17 | Olympus Optical Co Ltd | Medical image display device and method
US20030152897A1 * | 2001-12-20 | 2003-08-14 | Bernhard Geiger | Automatic navigation for virtual endoscopy
US20050033117A1 | 2003-06-02 | 2005-02-10 | Olympus Corporation | Object observation system and method of controlling object observation system
US20070173689A1 | 2003-06-02 | 2007-07-26 | Takashi Ozaki | Object observation system and method of controlling object observation system
JP2005021353A | 2003-07-01 | 2005-01-27 | Olympus Corp | Surgery supporting apparatus
JP2005211529A | 2004-01-30 | 2005-08-11 | Olympus Corp | Operative technique supporting system
US20050187432A1 * | 2004-02-20 | 2005-08-25 | Eric Lawrence Hale | Global endoscopic viewing indicator
US20060004286A1 * | 2004-04-21 | 2006-01-05 | Acclarent, Inc. | Methods and devices for performing procedures within the ear, nose, throat and paranasal sinuses
JP2006198032A | 2005-01-18 | 2006-08-03 | Olympus Corp | Surgery support system
US8611998B2 * | 2005-05-06 | 2013-12-17 | Cardiac Pacemakers, Inc. | Controlled delivery of intermittent stress augmentation pacing for cardioprotective effect
JP2007029232A | 2005-07-25 | 2007-02-08 | Hitachi Medical Corp | System for supporting endoscopic operation
US20070135803A1 * | 2005-09-14 | 2007-06-14 | Amir Belson | Methods and apparatus for performing transluminal and other procedures
US8731367B2 * | 2006-03-01 | 2014-05-20 | Kabushiki Kaisha Toshiba | Image processing apparatus
US20090259102A1 * | 2006-07-10 | 2009-10-15 | Philippe Koninckx | Endoscopic vision system
US20080207997A1 * | 2007-01-31 | 2008-08-28 | The Penn State Research Foundation | Method and apparatus for continuous guidance of endoscopy
US8672836B2 * | 2007-01-31 | 2014-03-18 | The Penn State Research Foundation | Method and apparatus for continuous guidance of endoscopy
US20110137156A1 * | 2009-02-17 | 2011-06-09 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US20120113111A1 * | 2009-06-30 | 2012-05-10 | Toshiba Medical Systems Corporation | Ultrasonic diagnosis system and image data display control program
US8744149B2 * | 2009-09-30 | 2014-06-03 | Fujifilm Corporation | Medical image processing apparatus and method and computer-readable recording medium for image data from multiple viewpoints
US20110245660A1 * | 2010-03-31 | 2011-10-06 | Fujifilm Corporation | Projection image generation apparatus and method, and computer readable recording medium on which is recorded program for the same
US20120136208A1 * | 2010-11-26 | 2012-05-31 | Fujifilm Corporation | Medical image processing apparatus method and program
US8670816B2 * | 2012-01-30 | 2014-03-11 | Inneroptic Technology, Inc. | Multiple medical device guidance

Non-Patent Citations (2)

International Search Report, PCT/JP2011/001563, Jun. 7, 2011.
Supplementary European Search Report dated Aug. 21, 2013 in corresponding European Application No. 11755922.


Also Published As

Publication number | Publication date
EP2548495B1 (en) | 2018-05-16
EP2548495A4 (en) | 2013-09-18
US20120327186A1 (en) | 2012-12-27
EP2548495A1 (en) | 2013-01-23
JP2011193885A (en) | 2011-10-06
JP5421828B2 (en) | 2014-02-19
WO2011114731A1 (en) | 2011-09-22
CN102811655B (en) | 2015-03-04
CN102811655A (en) | 2012-12-05


Legal Events

Date | Code | Title | Description

AS | Assignment

Owner name: FUJIFILM CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KITAMURA, YOSHIRO; NAKAMURA, KEIGO; SIGNING DATES FROM 20120809 TO 20120813; REEL/FRAME: 028893/0458

STCF | Information on status: patent grant

Free format text: PATENTED CASE

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

