Endoscope

RELATED APPLICATIONS
The present utility model is a continuation-in-part of U.S. patent application No. 17/473,587, filed on September 13, 2021, now U.S. patent No. 11,330,973. U.S. patent No. 11,330,973 is a continuation-in-part of the following: U.S. patent application Ser. No. 17/362,043, filed on June 29, 2021 and allowed in 2022; international patent application No. PCT/US19/36060, filed on June 7, 2019; U.S. patent application Ser. No. 16/363,209, filed on March 25, 2019, published as U.S. patent application publication No. US 2019/0216325; and international patent application No. PCT/US17/53171, filed on September 25, 2017.
The present application incorporates by reference the entirety of each of the above-referenced patent applications and claims the benefit of each of them, as well as of the applications they directly or indirectly incorporate by reference and claim benefit of, including the filing dates of the U.S. provisional applications, U.S. non-provisional applications, and international applications.
U.S. patent application No. 17/473,587 claims the benefit of the following provisional applications, each of which is incorporated by reference:
U.S. provisional application No. 63/218,362, filed on July 4, 2021;
U.S. provisional application No. 63/213,499, filed on June 22, 2021;
U.S. provisional application No. 63/210,034, filed on June 13, 2021;
U.S. provisional application No. 63/197,639, filed on June 7, 2021;
U.S. provisional application No. 63/197,611, filed on June 7, 2021;
U.S. provisional application No. 63/183,151, filed on May 3, 2021;
U.S. provisional application No. 63/153,252, filed on February 24, 2021;
U.S. provisional application No. 63/149,338, filed on February 14, 2021;
U.S. provisional application No. 63/138,751, filed on January 18, 2021;
U.S. provisional application No. 63/129,703, filed on December 23, 2020;
U.S. provisional application No. 63/124,803, filed on December 13, 2020;
U.S. provisional application No. 63/121,924, filed on December 6, 2020;
U.S. provisional application No. 63/121,246, filed on December 4, 2020;
U.S. provisional application No. 63/107,344, filed on October 29, 2020;
U.S. provisional application No. 63/087,935, filed on October 6, 2020;
U.S. provisional application No. 63/083,932, filed on September 27, 2020;
U.S. provisional application No. 63/077,675, filed on September 13, 2020; and
U.S. provisional application No. 63/077,635, filed on September 13, 2020.
This patent application is also related to the following international, non-provisional, and provisional applications, each of which is incorporated by reference:
international patent application No. PCT/US17/53171, filed on September 25, 2017;
U.S. patent No. 8,702,594, issued April 22, 2014;
U.S. patent application Ser. No. 16/363,209, filed on March 25, 2019;
international patent application No. PCT/US19/36060, filed on June 7, 2019;
U.S. patent application Ser. No. 16/972,989, filed on December 7, 2020;
U.S. provisional application No. 62/816,366, filed on March 11, 2019;
U.S. provisional application No. 62/671,445, filed on May 15, 2018;
U.S. provisional application No. 62/654,295, filed on April 6, 2018;
U.S. provisional application No. 62/647,817, filed on March 25, 2018;
U.S. provisional application No. 62/558,818, filed on September 14, 2017;
U.S. provisional application No. 62/550,581, filed on August 26, 2017;
U.S. provisional application No. 62/550,560, filed on August 25, 2017;
U.S. provisional application No. 62/550,188, filed on August 25, 2017;
U.S. provisional application No. 62/502,670, filed on May 6, 2017;
U.S. provisional application No. 62/485,641, filed on April 14, 2017;
U.S. provisional application No. 62/485,454, filed on April 14, 2017;
U.S. provisional application No. 62/429,368, filed in December 2016;
U.S. provisional application No. 62/428,018, filed on November 30, 2016;
U.S. provisional application No. 62/424,381, filed on November 18, 2016;
U.S. provisional application No. 62/423,213, filed on November 17, 2016;
U.S. provisional application No. 62/405,915, filed on October 8, 2016;
U.S. provisional application No. 62/399,712, filed on September 26, 2016;
U.S. provisional application No. 62/399,436, filed on September 25, 2016;
U.S. provisional application No. 62/399,429, filed on September 25, 2016;
U.S. provisional application No. 62/287,901, filed on January 28, 2016;
U.S. provisional application No. 62/279,784, filed on January 17, 2016;
U.S. provisional application No. 62/275,241, filed on January 6, 2016;
U.S. provisional application No. 62/275,222, filed on January 5, 2016;
U.S. provisional application No. 62/259,991, filed on November 25, 2015;
U.S. provisional application No. 62/254,718, filed on November 13, 2015;
U.S. provisional application No. 62/139,754, filed on March 29, 2015;
U.S. provisional application No. 62/120,316, filed on February 24, 2015; and
U.S. provisional application No. 62/119,521, filed on February 23, 2015.
All of the above non-provisional, provisional, and international patent applications are collectively referred to herein as the "commonly assigned incorporated applications."
Technical Field
The present utility model relates generally to endoscopes. More particularly, some embodiments relate to portable endoscopic devices that include a reusable handle portion and a disposable or single-use cannula portion.
Background
For conventional rigid and flexible endoscopes, the lens or fiber-optic system is relatively expensive and is reused many times; it must therefore be rigorously cleaned and sterilized after each use. Disposable endoscopes are an emerging class of endoscopic instruments. In some cases, the manufacturing cost of an endoscope may become low enough that it can be used for only a single patient. Disposable or single-use endoscopes reduce the risk of cross-contamination and hospital-acquired infections.
The subject matter described or claimed in this patent specification is not limited to embodiments that solve any particular disadvantage or that operate only in environments such as those described above. Rather, this background is provided only to illustrate one exemplary technology area in which some embodiments described herein may be practiced.
Disclosure of Invention
In some embodiments, a multi-camera, multi-spectral endoscope comprises: a cannula configured to be inserted into a patient; a forward-looking camera CamW at the front end portion of the cannula that views a target and responds primarily to a white light wavelength range; an electronically controlled color filter, also at the front end portion of the cannula, configured to selectively operate in a mode A, passing light predominantly in the white light wavelength range, or in a mode B, passing predominantly light in a selected narrow wavelength band or fluorescent light; a forward-looking camera CamFA/B, also at the front end portion of the cannula, that views the target from a different angle and through the electronically controlled color filter; a processing system configured to selectively switch the color filter between mode A and mode B and to receive image data from the cameras CamW and CamFA/B, forming a white light stereoscopic image of the target when the filter operates in mode A, but forming a selected narrowband light or fluorescence image from the camera CamFA/B when the filter operates in mode B; and an image display; wherein the processing system and image display are configured to form and display a composite image that is the white light stereoscopic image overlaid with the selected narrowband light or fluorescence image.
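The mode switching and overlay described in the paragraph above can be sketched in a few lines of pure Python. Everything here is a hypothetical illustration, not the claimed implementation: the class and function names, the representation of frames as nested lists of pixel values, and the highlight threshold are all assumptions.

```python
# Hypothetical sketch of the mode A / mode B control flow and the
# composite overlay. Frames are 2D lists of pixel intensities.

MODE_A, MODE_B = "white", "narrowband"

class FakeFilter:
    """Stand-in for the electronically controlled color filter."""
    def __init__(self):
        self.mode = MODE_A
    def set_mode(self, mode):
        self.mode = mode

def overlay(white_img, narrow_img, threshold=200):
    """Return white_img with narrow_img pixels above `threshold`
    marked as highlights (both frames have the same shape)."""
    out = [row[:] for row in white_img]
    for y, row in enumerate(narrow_img):
        for x, v in enumerate(row):
            if v > threshold:          # strong narrowband/fluorescence signal
                out[y][x] = ("HL", v)  # highlight possible abnormal tissue
    return out

# Toy 2x3 frames with one bright fluorescent pixel at (row 0, col 2).
white  = [[10, 20, 30], [40, 50, 60]]
narrow = [[0, 0, 255], [0, 0, 0]]

filt = FakeFilter()
filt.set_mode(MODE_B)              # capture the narrow-band frame in mode B
fused = overlay(white, narrow)
print(fused[0][2])                 # -> ('HL', 255)
```

The real processing system would of course operate on registered stereo image pairs; this sketch only shows the thresholded overlay step.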
In some embodiments, the multi-camera, multi-spectral endoscope may further include one or more of the following features: (a) a fluid hub from whose front end the cannula extends, and a handle to which the fluid hub is secured; (b) the fluid hub and cannula may comprise a single-use unit, and the handle may comprise a reusable unit selectively secured to the single-use unit; (c) the cannula may be configured to rotate relative to a rear end portion of the fluid hub; (d) the endoscope may include a manual bending control, and the front end portion of the cannula may be configured to bend in response to operation of the manual bending control; and (e) at least when the filter operates in mode B, the spatial resolution of the camera CamFA/B may be lower than that of the camera CamW.
In some embodiments, a multi-camera, multi-spectral endoscope comprises: a tubular cannula configured to be inserted into a patient; a first forward-looking camera system at the front end portion of the cannula, comprising two cameras, CamW1 and CamW2, that view the same target from different angles and respond to a CamW1 wavelength range and a CamW2 wavelength range, respectively; a second camera system at the cannula front end portion, comprising a camera CamF that also views the target but responds primarily to a CamF wavelength range different from at least one of the CamW1 and CamW2 wavelength ranges; a processing system configured to receive image data from the first and second camera systems and to process the received image data into a stereoscopic image of the target using image data from CamW1 and CamW2, a 2D image of the target using image data from CamF, and a composite image in which the 2D image is superimposed on the stereoscopic image; and a display configured to display the composite image.
In some embodiments, the endoscope described in the immediately preceding paragraph may further include one or more of the following features: (a) the CamW1 and CamW2 wavelength ranges overlap; (b) the CamW1 and CamW2 wavelength ranges are white light ranges, and the CamF wavelength range is a selected narrow band range or a fluorescence range; (c) the 2D image represents target regions that fluoresce beyond a threshold indicative of possible abnormal tissue, thereby highlighting possible abnormal tissue in the composite image; (d) the composite image includes an overlay layer in which regions of the 2D image are visible; (e) the spatial resolution of the camera CamF is lower than that of at least one of the cameras CamW1 and CamW2; (f) the endoscope further includes at least one internal channel in the cannula and a fluid hub, the cannula extending from a front end of the fluid hub and communicating with the internal channel, wherein the cannula is configured to rotate relative to a rear end portion of the fluid hub; (g) the endoscope further includes a handle to which the fluid hub is selectively attached and which houses at least a portion of the processing system; (h) the display is mounted on the handle; and (i) the endoscope further includes a manual bending control, wherein the front end portion of the cannula is configured to bend in response to operation of the manual bending control.
In some embodiments, a multi-camera, multi-spectral endoscope comprises: a cannula configured to be inserted into a patient; a first forward-looking camera system at the cannula front end portion, comprising cameras CamW1 and CamW2 that view a target from different angles and respond to a CamW1 wavelength range and a CamW2 wavelength range, respectively; a second forward-looking camera system, also at the front end portion of the cannula, comprising cameras CamF1 and CamF2 that view the target from different angles and respond to a CamF1 wavelength range and a CamF2 wavelength range, respectively, each different from at least one of the CamW1 and CamW2 wavelength ranges; a processing system configured to receive image data from the first and second camera systems and to process the received image data into a CamW image of the target from the image data of cameras CamW1 and CamW2, a CamF image of the target from the image data of cameras CamF1 and CamF2, and a composite image in which the CamF image is superimposed on the CamW image; and a display configured to display the composite image.
In some embodiments, the endoscope described in the immediately preceding paragraph may further include one or more of the following features: (a) the CamW1 and CamW2 wavelength ranges are white light ranges, and the CamF1 and CamF2 wavelength ranges are selected narrow wavelength bands or fluorescence ranges; and (b) each of the CamW and CamF images is a stereoscopic image of the target, and the composite image is a spatially registered superposition of the CamW and CamF images.
In some embodiments, an endoscope includes: an L-shaped handle portion comprising a downwardly extending handle and an axially extending housing; a hub removably secured to the rear end of the housing, and a cannula extending from the front end of the hub; wherein: one of the housing and the hub includes a downward-facing, axially extending slot, and the other includes an upward-facing, axially extending rail configured to slide into the slot in a rearward direction to removably secure the hub and cannula to the handle portion; the hub and the housing include respective electrical connectors that mate and make electrical contact when the housing and hub are secured to one another; the rear end portion of the handle portion includes an opening; the hub and cannula include a bending mechanism configured to bend the front end portion of the cannula and including a thumb lever that extends rearward through the opening and out of the rear end of the handle portion when the hub and handle portion are secured to one another, manual manipulation of the thumb lever controlling bending of the cannula front end portion; a camera module is at the front end portion of the cannula; and a display is coupled to the camera module to receive image data therefrom and display an image in accordance therewith.
In some embodiments, the endoscope described in the immediately preceding paragraph may further include one or more of the following features: (a) the bending mechanism includes a wheel mounted for rotation within the housing and coupled to the thumb lever so as to rotate in response to manipulation of the thumb lever, and a cable coupled to the wheel and the front end portion of the cannula to translate rotation of the wheel into bending of the front end portion of the cannula; (b) the hub and cannula are separated from the handle portion by manually sliding the hub in a forward direction relative to the handle portion; and (c) the endoscope includes a lock pin on one of the housing and the hub and a catch on the other, configured to engage and secure the hub to the housing when the endoscope is assembled, and a manually operable release configured to disengage the lock pin and the catch from one another, thereby allowing removal of the hub from the housing.
Drawings
To further clarify the above and other advantages and features of the present patent specification, more particular embodiments are illustrated in the accompanying drawings. These drawings depict only exemplary embodiments and therefore should not be taken as limiting the scope of the present patent specification or the appended claims. The subject matter of the utility model is described and explained with specificity and detail through the use of the accompanying drawings, in which:
FIGS. 1A, 1B, and 1C are side, top, and rear views of a portable ergonomic endoscope with a disposable cannula in some embodiments of the present utility model;
FIGS. 2A and 2B are perspective views of a portable ergonomic endoscope with a disposable cannula in some embodiments of the present utility model;
FIGS. 3A and 3B are perspective views illustrating the engagement and disengagement of the reusable and disposable portions of the portable ergonomic endoscope in some embodiments;
FIGS. 4A and 4B are perspective and schematic views of a front tip including a plurality of cameras and illumination modules for use with a portable ergonomic endoscope in some embodiments of the utility model;
FIG. 5 is a schematic diagram of a dual camera dual light source system for multispectral imaging and surgical applications in some embodiments;
FIG. 6 is a conceptual diagram illustrating aspects of a design of a dual camera dual light source system for multispectral imaging and surgical applications in some embodiments;
FIG. 7 is a diagram illustrating a possible color filter array configuration of a dual camera dual light source system for multispectral imaging and surgical applications in some embodiments;
FIG. 8 is a graph showing quantum efficiency versus wavelength for Nyxel and conventional pixels;
FIG. 9 is a schematic diagram illustrating further aspects of combining multispectral image data from a dual camera dual light source system in some embodiments;
FIG. 10 is a perspective view showing a combined, spatially registered image displayed to a user on an endoscopic system in some embodiments;
FIG. 11 is a perspective view of an endoscope system with one or more front cameras in some embodiments;
FIG. 12 is a schematic diagram showing the exposure of one camera to white light or fluorescence and the use of another camera with an electronically controlled color filter in some embodiments;
FIG. 13 is a plan view of the cannula front end using a pair of white light cameras and a selected narrow band or fluorescent light camera, and a light source and internal channels within the cannula in some embodiments;
FIG. 14 is otherwise similar to FIG. 13, but in some embodiments, shows a different arrangement of cameras and light sources, and a single internal channel;
FIG. 15 is a distal plan view of a cannula using a pair of white light cameras and a pair of selected narrow band or fluorescent light cameras, and a light source and internal channels within the cannula in some embodiments;
FIG. 16 is otherwise similar to FIG. 15, but in some embodiments, shows a different arrangement of cameras and light sources, and a single internal channel;
FIG. 17 is a perspective view of an endoscope in some embodiments;
FIG. 18 is an exploded perspective view of an endoscope in some embodiments;
FIG. 19 is a cross-sectional view of a portion of an endoscope in some embodiments;
FIG. 20 is a cross-sectional view of a portion of an endoscope in some embodiments;
FIG. 21 is a cross-sectional view of an endoscope assembly in some embodiments;
fig. 22 is a top view of an endoscope in some embodiments.
Detailed Description
A detailed description of the preferred embodiments is provided below. While several embodiments are described, it should be understood that the novel subject matter described in this patent specification is not limited to any one embodiment or combination of embodiments described herein, but includes many alternatives, modifications, and equivalents. Furthermore, although numerous specific details are set forth in the following description in order to provide a thorough understanding, some embodiments may be practiced without some or all of these details. Moreover, for the sake of clarity, certain technical material that is known in the prior art has not been described in detail to avoid unnecessarily obscuring the novel subject matter described herein. It should be clear that each feature of one or several of the specific embodiments described herein may be used in combination with features of other described embodiments or other features. Further, like reference numbers and designations in the various drawings indicate like elements.
Some embodiments describe a portable ergonomic endoscope system that includes an imaging system having at least two independent cameras and two independent light sources. The cameras and light sources are configured to simultaneously image a target object, such as tissue. By using different illumination, different filters, and controlled spectral responses, different features of the target object can be captured. In some embodiments, a system processor coordinates the cameras and light sources, combines the resulting images, and displays an enhanced composite image of the target object to the user. In some embodiments, the system may be configured to perform narrowband imaging (NBI). In some embodiments, the system may be further configured to perform fluorescence imaging.
As used herein, a color filter array (CFA) refers to a mosaic of filters placed over a sensor's pixels, each passing only a certain wavelength band. Conventional consumer cameras, such as cell phone cameras, use RGB CFAs. For other, specific applications, a specific CFA may be designed.
Narrowband imaging (NBI), as used herein, refers to a color imaging technique for endoscopic diagnostic medical testing in which specific blue and green wavelengths of light are used to enhance the detail of certain aspects of the mucosal surface. In some embodiments, a special filter may be electronically activated by a switch in the endoscope, so that the illumination light consists preferably of wavelengths of 415 nm (blue) and 540 nm (green). Because the peak light absorption of hemoglobin occurs at these wavelengths, blood vessels appear very dark, improving their visibility and making other surface structures easier to identify.
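The vessel-darkening effect behind NBI can be illustrated with a toy Beer-Lambert-style calculation. This is not from the specification: the absorption coefficients below are made-up illustrative values chosen only to reflect that hemoglobin absorbs far more strongly near 415 nm and 540 nm than in the red.

```python
# Toy model: round-trip attenuation of light passing through a small
# blood vessel. Absorption values (per mm) are illustrative only.
import math

ABSORPTION = {415: 30.0, 540: 10.0, 650: 0.5}  # hypothetical coefficients

def reflected_fraction(wavelength_nm, vessel_depth_mm):
    """Beer-Lambert-style attenuation for a round trip through the vessel."""
    mu = ABSORPTION[wavelength_nm]
    return math.exp(-2 * mu * vessel_depth_mm)

# A 0.1 mm vessel: nearly black at 415 nm, still bright in red light.
for wl in (415, 540, 650):
    print(wl, round(reflected_fraction(wl, 0.1), 4))
```

Under these assumed numbers the vessel returns about 0.25% of the incident 415 nm light but about 90% of red light, which is why restricting illumination to the blue/green absorption peaks makes vasculature stand out.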
Fluorescence imaging (FI), as used herein, refers to imaging of fluorescence, sometimes using fluorescent dyes, to label, highlight, or enhance certain biological mechanisms and/or structures. Fluorescence itself is a form of luminescence in which a substance that has absorbed electromagnetic radiation emits light of a certain wavelength. For example, in blue light endoscopy, a fluorescent dye (Hexvix) is introduced into the bladder. The tissue is then irradiated with blue light (about 405 nanometers), and the fluorescence emitted by Hexvix has a wavelength of about 610 nanometers. Note that in FI the camera sees fluorescence emitted from within the object, while in NBI the camera sees the object's reflection of light in various bands.
In some embodiments, a novel dual camera and dual light source (DCDL) system is described for multispectral or polychromatic imaging. Embodiments of surgical applications having simultaneous white light, fluorescent, and infrared images are disclosed.
The method is applicable to multispectral, multiband imaging in general. Some embodiments include an endoscope system in which two separate camera/LED systems are integrated into the same cannula or endoscope. A white light camera, called camera W, is paired with a white LED, called light source W. A fluorescence camera, called camera F, is paired with a blue LED, called light source C. In this configuration, when either or both of light source C and light source W are turned off, camera F can be used as an infrared video camera.
In some embodiments, camera W is optimized for white light endoscopy: the target is illuminated with strong, optimized white light LEDs so that high image resolution can be obtained. Camera F is optimized for sensitivity, because fluorescent light is typically weak. To maximize the sensitivity and signal-to-noise ratio of the CMOS sensor pixels and obtain high quality imaging, the following measures may be implemented.
In some embodiments, a special color filter array (CFA) is used on the pixel array (as shown in FIG. 7) so that the CMOS sensor array is sensitive to the red or infrared spectrum (near 600 nm or higher). In some embodiments, to further increase sensitivity, relatively large pixels (e.g., 2.2 um x 2.2 um) are preferably used for the CMOS sensor of camera F. In this case, camera F preferably has lower spatial resolution than camera W, whose pixels may be smaller (e.g., 1.75 um x 1.75 um or 1.0 um x 1.0 um), but much higher sensitivity.
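The sensitivity trade-off above can be checked with back-of-envelope arithmetic: photon collection per pixel scales roughly with pixel area (ignoring microlens, fill-factor, and quantum-efficiency differences, which this sketch does not model).

```python
# Rough light-gathering comparison between pixel pitches.
def area_ratio(pitch_a_um, pitch_b_um):
    """Ratio of pixel areas, a proxy for per-pixel light collection."""
    return (pitch_a_um / pitch_b_um) ** 2

print(round(area_ratio(2.2, 1.0), 2))   # camera F vs 1.0 um camera-W pixel
print(round(area_ratio(2.2, 1.75), 2))  # camera F vs 1.75 um camera-W pixel
```

A 2.2 um camera-F pixel thus gathers roughly 4.8x the light of a 1.0 um camera-W pixel, and about 1.6x that of a 1.75 um pixel, which is why the larger-pixel, lower-resolution sensor suits the weak fluorescence signal.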
FIGS. 1A, 1B, and 1C are side, top, and rear views of a portable ergonomic endoscope with a disposable cannula in some embodiments. The system 100 is designed for simple and quick use, minimal patient discomfort, and high placement accuracy. The system 100 consists of a disposable, or single-use, portion 102 and a reusable portion 104. The two portions 102 and 104 may be mated and separated by connectors, as described in further detail below. Cannula 120 has an imaging and illumination module at its front end 110. A cable (not shown) positioned within the cannula provides control signals and power to the camera and LED illumination modules at the front end 110 and transmits video image data from the imaging module to the handle 140 and display screen 150 for viewing by the user. In the illustrated embodiment, the handle 140 includes two control buttons 142 and 144, which may be configured for power on/off and image capture, respectively. In some embodiments, the handle 140 is shaped as a pistol grip as shown and includes a rechargeable battery 141 accessible through a battery door 148. In some embodiments, battery 141 is an 18650-type lithium-ion battery. Also included within the handle 140 is an electronics module 143 mounted on a printed circuit board (PCB) 145. The electronics module 143 and PCB 145 are configured to perform various functions such as video processing and capture, Wi-Fi transmission of data to external devices, illumination control, user interface processing, and diagnostics. The electronics module 143 further includes at least one non-volatile memory module for storing video and images captured from the imaging module. In some embodiments, the display 150 may be both tilted and rotated to provide an optimal viewing angle for the user. The swivel joint 152 is configured to provide rotation of the display 150 as indicated by the dashed arrow in FIG. 1C, while the hinge joint 154 is configured to provide tilting of the display 150 as indicated by the dashed arrow in FIG. 1B. In some embodiments, the hinge joint is configured to allow the display to tilt about 90 degrees, or nearly 90 degrees, toward the front end. Such tilting is useful, for example, to give the operator an unobstructed or less obstructed view. The handle 140 also includes a thumb lever 146 that can be moved up or down, as indicated by the dashed arrow. Moving thumb lever 146 upward and downward causes front end 110 to bend upward and downward, respectively, as indicated by dashed lines 180 and 182. Further details regarding the operation of thumb lever 146 to control the steering of front end 110 and cannula 120 are provided in U.S. patent application Ser. No. 17/362,043, filed on June 29, 2021, which is incorporated herein by reference and referred to as the '043 application.
The cannula 120 is connected at its rear end to a fluid hub 172, which in this embodiment includes two fluid ports 132 and 134. At the back end of the fluid hub is a collar 168. In some embodiments, collar 168 is configured to be rotatable so as to allow a "plug and twist lock" fit of disposable portion 102 and reusable portion 104, as described in further detail below. In some embodiments, at least a portion of fluid hub 172, along with cannula 120 and front end 110, may be manually rotated relative to handle 140 about the major longitudinal axis of cannula 120, as indicated by solid arrow 124. Thus, rotating the rotatable portion of fluid hub 172 causes rotation of cannula 120 and front end 110, as indicated by solid arrow 122. In some embodiments, by a combination of rotating cannula 120 and front end 110 and moving thumb lever 146, the user may "steer" the direction of front end 110 as desired. In some embodiments, the preferred working length of cannula 120 is about 12 inches, although shorter or longer lengths may be used depending on the medical application; the outer diameter is preferably about 5.5 to 6.5 millimeters, although larger or smaller diameters may be used depending on the medical application and developments in camera and illumination technology.
Fig. 2A and 2B are perspective views of a portable ergonomic endoscope with a disposable cannula in some embodiments. Fig. 2A shows a syringe 230 for supplying fluid, such as saline, through a fluid lumen (not shown) within cannula 120 via tubing 232, connector 234 and fluid port 134. In some embodiments, cannula 120 is semi-rigid. Cannula 120 is sufficiently stiff so as not to collapse under the longitudinal pushing and pulling forces expected during the medical procedure it is intended to perform. On the other hand, cannula 120 is sufficiently resilient to bend as it passes through the curved anatomy.
Fig. 3A and 3B are perspective views illustrating the engagement and disengagement of the reusable and disposable portions of the portable ergonomic endoscope in some embodiments. The disposable portion 102 and the reusable portion 104 are connectable and disconnectable by mechanical and electrical connectors. The electrical connection is made through a USB-C plug 302 (fig. 3A) on the disposable portion 102 and a USB-C receptacle 304 (fig. 3B) on the reusable portion 104. The mechanical connection includes both a structural connection that fixedly connects the disposable portion 102 and the reusable portion 104, and a steering connection by which steering inputs from the steering structure of the reusable portion 104 can be transmitted to the steering components of the disposable portion 102. In this embodiment, the structural connection includes a male rounded portion 312 on the disposable portion 102 that is shaped to mate with a female socket 314 on the reusable portion 104. The structural connection also includes a twist-lock mechanism in which the male portion 322 may be inserted into the female opening 324 and then locked by twisting the male portion 322 about one quarter turn (90 degrees). The twisting action may be performed manually using the textured or knurled collar 168. In this way, the connection may be configured as a "plug and twist lock" connection. The steering connection is achieved by meshing the drive gear 334 on the reusable portion 104 with the driven gear 332 on the disposable portion 102.
Fig. 4A and 4B are perspective and schematic views of a front end including a plurality of cameras and illumination modules for a portable ergonomic endoscope in some embodiments. In fig. 4A, front end 110 is shown connected to the front end of cannula 120. In some embodiments, front end 110 includes a housing 410 that is formed separately from the front end of cannula 120 and bonded to it. The housing 410 houses two camera modules: camera F module 420 and camera W module 430. Each of the camera F 420 and camera W 430 modules includes a lens and a sensor. The sensor of each of cameras F 420 and W 430 includes a color sensor, a color filter array, and electronics and circuitry, as described in further detail below. On either side of the camera F module 420 are two blue LED lamps 422 and 424 configured to emit light suitable for fluorescence endoscopy. In some embodiments, blue LED lamps 422 and 424 are configured to emit light at approximately 410 nanometers (violet-blue). On either side of the camera W module 430 are two white LEDs 432 and 434 configured to emit white light suitable for visual white light endoscopy. Also shown in fig. 4A is a port 412 configured to provide fluid (into or out of the patient) and/or to provide an opening through which a tool or other device (e.g., a needle) may pass. Note that although fig. 4A shows a total of four LEDs (two white and two blue), in general, other numbers of LEDs may be provided depending on factors such as the desired illumination quality, endoscope size, and LED characteristics such as size and brightness. In some embodiments, three or fewer LEDs may be provided, and in some embodiments, 10 or more LEDs may be provided. In addition, the number of white and blue LEDs is not necessarily equal, and depends on various factors; LEDs may be provided in groups of 3, 4, or more. Other light sources may be substituted, such as optical fibers that transmit light generated elsewhere.
In fig. 4B, the illustrated embodiment includes two separate device/fluid passages 414 and 416. In this case, both inner diameters are 2.2 mm. In some embodiments, passage 414 may be connected to fluid port 134 (fig. 1A), while passage 416 is connected to fluid port 132 (fig. 1A). In some embodiments, to increase sensitivity to fluorescence, the CMOS sensor of camera F 420 is configured with larger pixels than that of camera W 430. For example, the pixels of camera F may be 2.2 um x 2.2 um, arranged in a 400 x 400 matrix, while the pixels of camera W are 1.0 um x 1.0 um or 1.75 um x 1.75 um, arranged in a higher spatial resolution matrix. Since white LED illumination tends to be relatively strong, the camera W 430 module may include a CMOS sensor with small pixels, such as 1.75 um x 1.75 um or 1 um x 1 um, and thus may achieve higher spatial resolution, up to a 720 x 720 matrix.
In some embodiments, camera F 420 is used to perform blue (fluorescence) endoscopy with a partial CFA. An embodiment is shown in fig. 7, where only the R filter is used, so that blue and green light is filtered out and most of the light reaching the sensor is red. In some embodiments, an infrared camera is used as camera F.
FIG. 5 is a schematic diagram of a dual-camera, dual-light-source system for multispectral imaging and surgical applications in some embodiments. As shown, front end 110 includes camera and illumination modules, i.e., camera F, light source C, camera W, and light source W. Camera F 420 is configured to capture images of a particular color or bandwidth, such as fluorescence in a narrow band centered at 610 nanometers. The filters of camera F 420 are designed to block incident light of other wavelengths, for example by using a specially designed CFA. Camera F may be used for NBI or FI, depending on the particular application. The light source C (422 and 424) for camera F 420 may be a laser in the case of fluorescence imaging, or simply a blue or green LED or special light source in the case of NBI. In some embodiments, camera W 430 is a conventional white-light camera, such as a cell phone camera. A typical RGB CFA may be used, along with an infrared filter, for example one with a 50% cutoff for wavelengths above 650 nm. The light sources W (432 and 434) may be LED lamps having various hues close to daylight. Cannula 120 includes cables 450 and 452. "Image F" refers to an image captured by camera F, which may be fluorescence or, in the case of NBI, reflected green or blue light. "Image W" refers to an image captured by camera W, which may likewise contain fluorescence or, in the case of NBI, reflected green or blue light. "Image F" and "Image W" in this paragraph refer to the correspondingly labeled images in FIG. 5.
Because the endoscope has two cameras that can operate simultaneously under different illumination combinations, such as light source C and light source W (or other bands of light), the system takes advantage of having two "eyes" looking at the same target, but at different aspects of it at the same time, thereby extracting more information from the object or target. For example, when the blue light source is on, camera F sees mostly the fluorescent emission, while camera W sees both the object's reflection of light source C (which may be very intense) and a little fluorescent light. Since the two cameras are synchronized and also spatially registered relative to one another, different kinds of integrated information are delivered to the user, improving the clinical experience compared to seeing only one of the two kinds of information about the object or target.
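One simple use of two synchronized, registered images is a per-pixel blend that overlays the fluorescence channel on the white-light view. The following is a minimal sketch under stated assumptions: both images are small grayscale arrays already spatially registered, and the function name and alpha weighting are illustrative, not taken from the text.

```python
def blend(img_w, img_f, alpha=0.7):
    """Blend a registered fluorescence image (img_f) into a white-light
    image (img_w).  Both are HxW grayscale nested lists; alpha weights
    the white-light background."""
    return [[round(alpha * w + (1 - alpha) * f)
             for w, f in zip(rw, rf)]
            for rw, rf in zip(img_w, img_f)]

img_w = [[100, 100], [100, 100]]   # uniform white-light background
img_f = [[0, 200], [0, 0]]         # fluorescence in one pixel only
out = blend(img_w, img_f, alpha=0.5)
# out -> [[50, 150], [50, 50]]: the fluorescing pixel stands out
```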
In some embodiments, Nyxel technology, developed by OmniVision, may be used. Nyxel pixels may be used for camera F 420, with significantly improved pixel sensitivity, particularly in the red and near-infrared bands. This is particularly useful for detecting fluorescence around 610 nm.
Front-end processing and main system processing are performed in the electronic module 143. In some embodiments, the images are synthesized and displayed on display 150.
FIG. 6 is a conceptual diagram illustrating aspects of the design of a dual-camera, dual-light-source system for multispectral imaging and surgical applications in some embodiments. In general, it is desirable to capture polychromatic or multispectral images of a target object (e.g., human tissue): a visible-light image of the object plus images captured in other color bands better describe the target tissue and its shape. Two cameras (camera F, camera W) are associated with two light sources (light source C, light source W). Camera F is an optical camera that is sensitive to certain color bands, such as red and infrared. The output of camera F is "Image F". The light source C is a light source of a color band (C-band) other than white light. In dual-band imaging (DBI), the light source C may be green or blue. In fluorescence imaging, it may be a light source that excites the object to emit a fluorescent color. Camera W is an optical camera sensitive to a certain color band (B), such as white light. The output of camera W is "Image W". The light source W is a light source emitting a specific color band B, for example white light. "Image W" in this paragraph refers to the correspondingly labeled image in FIG. 6.
Fig. 7 illustrates a possible color filter array configuration of a dual-camera, dual-light-source system for multispectral imaging and surgical applications in some implementations. In some embodiments, camera F uses Nyxel pixels (from OmniVision) and a "red-only" filter array, i.e., the CamF RRRR filter. This configuration allows the red and/or infrared bands to pass while filtering out background blue and green light.
Camera F can achieve four times the red resolution of the Nyxel CFA or Old CFA arrangements shown in fig. 7, because only one of every four pixels in the Nyxel or Old CFA arrangement captures red, whereas every pixel in the camera F arrangement of fig. 7 captures red.
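The factor of four follows directly from counting red-filtered pixels in one repeating CFA tile. A minimal sketch (the tile layouts are schematic stand-ins for the arrangements of fig. 7, and the helper name is illustrative):

```python
def red_samples(cfa_tile):
    """Count red-filtered pixels in one repeating CFA tile."""
    return sum(row.count("R") for row in cfa_tile)

bayer_like = [["R", "G"],    # conventional tile: one red pixel in four
              ["G", "B"]]
all_red    = [["R", "R"],    # CamF RRRR tile: every pixel captures red
              ["R", "R"]]

ratio = red_samples(all_red) / red_samples(bayer_like)   # 4.0
```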
Fig. 8 shows quantum efficiency versus wavelength for Nyxel and conventional pixels, comparing the Nyxel sensor developed by OmniVision with conventional sensors. Curve 810 is a Nyxel blue pixel; curve 812 is a conventional blue pixel. Curve 820 is a Nyxel green pixel; curve 822 is a conventional green pixel. Curve 830 is a Nyxel red pixel; curve 832 is a conventional red pixel. As can be seen in particular from curves 830 and 832, Nyxel red pixels have significantly higher sensitivity in the red and infrared bands than conventional red pixels.
Fig. 9 illustrates further aspects of multi-band image data in conjunction with a dual-camera, dual-light-source system in some embodiments. With global-shutter-capable cameras, cameras F and W may capture image frames under different combinations of light sources C and W being "on" or "off". In "surgical embodiment 1", with light source C (blue light) "on" but light source W "off", the captured images are "Image F" from camera F and "Image WB" from camera W. "Image F" and "Image WB" are spatially registered or correlated. This can be done because there is only a short time lag between the images taken by the different cameras (or none at all when the two cameras capture simultaneously). "Image WB" provides a background image under illumination by light source C, which can be used to correct the background of "Image F". When only light source C is on, the "Image F" data is synthesized with "Image WB" to produce "eImgB". "Image F" and "Image WB" in this paragraph refer to the correspondingly labeled images in FIG. 9.
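The background correction step can be illustrated as a scaled subtraction. This is a sketch only: the text does not specify the synthesis formula, so the subtract-and-clamp operation and the scale factor k are assumptions for illustration.

```python
def correct_background(img_f, img_wb, k=1.0):
    """Produce an eImgB-style image: subtract the scaled background
    (Image WB, from camera W under light source C) from the fluorescence
    image (Image F), clamping at zero.  k is a hypothetical factor for
    the two sensors' different sensitivities."""
    return [[max(0, f - round(k * b)) for f, b in zip(rf, rb)]
            for rf, rb in zip(img_f, img_wb)]

img_f  = [[30, 120], [25, 20]]   # weak signal plus one bright fluorescent pixel
img_wb = [[20,  20], [20, 20]]   # roughly uniform background under light source C
e_img_b = correct_background(img_f, img_wb)
# e_img_b -> [[10, 100], [5, 0]]: background removed, fluorescence retained
```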
In the case of a blue endoscope, the signal-to-noise ratio of "Image F" is low (due to the weak fluorescence signal), so a CMOS sensor with high-signal-to-noise pixels is used. On the other hand, "Image W" has a higher signal-to-noise ratio (due to the strong white light), so a CMOS sensor with smaller pixels can be used to improve spatial resolution. "Image F" and "Image W" in this paragraph refer to the correspondingly labeled images in FIG. 9.
In "surgical embodiment 2", camera F is used to capture "Image IR" while light source C is "off". "Image W" captures a standard white-light image with light source W "on". In this case, "Image IR" provides a "heat map" of the target, which is useful when energy devices such as lasers or radio-frequency devices are used for tissue modification. "Image IR" may alert the user to hot spots or cold spots. The data of "Image IR" and "Image W" may be spatially registered or correlated, again because the time difference between images taken by the different cameras is very short (or zero). "Image IR" and "Image W" may also be combined or superimposed to provide the precise locations of hot and cold spots. That is, hot and cold spots can be viewed against the background of a normal standard white-light image, giving the viewer a localized context for the hot and cold spots. "Image W" and "Image IR" in this paragraph refer to the correspondingly labeled images in FIG. 9.
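Locating hot and cold spots for superposition can be done with simple per-pixel thresholding of the IR data. A minimal sketch; the threshold values and function name are illustrative assumptions, not values from the text.

```python
def spot_mask(img_ir, hot=200, cold=50):
    """Classify each IR pixel as 'hot', 'cold', or '' (normal), producing
    a mask that can be overlaid on the registered white-light image."""
    def label(v):
        if v >= hot:
            return "hot"
        if v <= cold:
            return "cold"
        return ""
    return [[label(v) for v in row] for row in img_ir]

mask = spot_mask([[210, 120],
                  [ 40, 130]])
# mask -> [['hot', ''], ['cold', '']]
```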
In "surgical embodiment 3", "Image W" is combined with eImgB. By combining embodiments 1 and 2, high-quality eImgB data is spatially registered with the white-light image "Image W". The observer can obtain a high-resolution "Image W", a synthetic image including the fluorescent eImgB, or both. In some embodiments, the surgeon may use the available images to better view the target, switching seamlessly among visualization modes based on the fluorescence image eImgB, the white-light image "Image W", and the infrared image "Image IR". "Image W" and "Image IR" in this paragraph refer to the correspondingly labeled images in FIG. 9.
In a fourth embodiment, "embodiment 4" (not shown in fig. 9), an artificial intelligence (machine learning) algorithm can be designed for automatic diagnosis as clinical cases accumulate.
FIG. 10 is a perspective view in which a combined, spatially registered image is displayed to a user on an endoscope system in some embodiments. In the displayed view, a generally white-light image ("Image W") 1020 occupies a majority of the display screen 150. The illustrated embodiment is "embodiment 3" of fig. 9, wherein the eImgB image is combined and spatially registered with a standard white-light image ("Image W"). In this case, regions 1010 and 1012 are derived from the eImgB data and clearly show cancerous tumors. The user can easily view the cancerous areas 1010 and 1012 and the surrounding tissue in spatial registration. This mixing or combining provides a greatly enhanced view of the target tissue. In some embodiments, the user can easily switch between different modes (e.g., embodiments 1, 2, or 3) by pressing a switch button, such as button 142 or button 144 (shown in figs. 1B and 2B), or by touching soft button 1040 on display 150.
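Cycling among the display modes on a button press is a small state machine. The sketch below is illustrative only: the mode names and cycling order are assumptions, since the text specifies only that buttons 142/144 or soft button 1040 switch modes.

```python
# Hypothetical mode names corresponding to embodiments 1, 2, and 3 of FIG. 9
MODES = ["fluorescence", "thermal", "combined"]

def next_mode(current):
    """Advance to the next display mode, as a press of button 142/144
    or soft button 1040 might do."""
    return MODES[(MODES.index(current) + 1) % len(MODES)]

m = next_mode("fluorescence")   # -> "thermal"
```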
Fig. 11 is a perspective view of an endoscope system having one or more forward-facing cameras in some embodiments. The illustrated embodiment has two forward-facing (distal) cameras 1140 and 1142. The forward-facing cameras allow the user to accurately see the position of the front end without having to look away from the screen. During surgery, particularly during or immediately after initial insertion of the front end 110, the user's line of sight may be focused primarily on the display screen 150. The exact position of the front end and its surroundings can be seen on the display 150 by means of the forward-facing cameras 1140 and 1142. Image enhancement, such as artificially providing depth of field, may be beneficial in certain procedures. Two cameras or other means (e.g., lidar imaging) may be used to simulate a front-end-centered depth of field to improve usability.
FIG. 12 is otherwise similar to FIG. 5, but shows a multi-camera, multispectral endoscope in which two cameras produce white-light stereoscopic images in one mode of operation and selected narrowband light images or fluorescence images in another mode. In fig. 12, a forward-looking camera 430 (CamW) is located at the forward end portion of cannula 120 for viewing the target, responsive primarily to the white-light wavelength range. An electronically controlled color filter 1202 is also at the forward portion of the cannula and is configured to selectively operate in mode A, passing predominantly light in the white-light wavelength range, or in mode B, passing predominantly light in a selected narrow band or a fluorescence band. Examples of such filters are discussed at https://en.wikipedia.org/wiki/Liquid_crystal_tunable_filter, and suitable wavelength and color filters for endoscopes are discussed in U.S. patent application Ser. No. 16/363,209, publication No. 2019/0216325 A1, both of which are incorporated herein by reference. A forward-looking camera 12420 (CamFA/B), also located at the forward portion of the cannula, views the target from a different angle and through the electronically controlled color filter. Cameras 430 and 12420 view the target like the two cameras shown in fig. 6. Processing system 143 is configured to selectively switch the color filter between mode A and mode B and to process the image data received from cameras 430 and 12420 so as to form a white-light stereoscopic image of the target when the filter operates in mode A, and a selected narrowband image or fluorescence image from camera CamFA/B when the filter operates in mode B.
The image display 150 displays an image, and the processing system 143 and the display 150 are configured to form and display a composite image as a superposition of the white-light stereoscopic image and the selected narrowband light image or fluorescence image, like the image shown in fig. 10, with areas of the selected parameters highlighted. Processing system 143 may be configured to rapidly switch filter 1202 between modes A and B, for example several times per second, or hundreds of times per second or more, so that the stereoscopic image and the selected narrowband or fluorescence image display the target substantially in real time. As mentioned above, the selected narrowband image or fluorescence image preferably has a lower spatial resolution than the white-light camera's image. The processing system 143 and the display 150 may be configured to selectively display the composite image, the stereoscopic image, or the selected narrowband or fluorescence image, or to display all three images simultaneously. The composite image may be as shown in fig. 10: a superposition of two spatially registered images of the same object, taken under different wavelengths of illumination.
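Rapidly toggling filter 1202 produces a time-multiplexed frame stream that must be split back into the two modes. A minimal sketch under stated assumptions: the filter is assumed to toggle every frame starting in mode A, and the function name is illustrative.

```python
def demultiplex(frames):
    """Split a time-multiplexed frame sequence into the mode A
    (white-light) and mode B (narrowband/fluorescence) streams,
    assuming the electronically controlled filter toggles every
    frame, starting in mode A."""
    mode_a = frames[0::2]   # even-indexed frames: filter in mode A
    mode_b = frames[1::2]   # odd-indexed frames: filter in mode B
    return mode_a, mode_b

a, b = demultiplex(["f0", "f1", "f2", "f3", "f4"])
# a -> ["f0", "f2", "f4"]; b -> ["f1", "f3"]
```

Each stream can then be reconstructed into its own substantially real-time image sequence for compositing.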
Fig. 13 illustrates a multi-camera, multispectral endoscope in which a first forward-looking camera system provides a white-light stereoscopic image of a target and a second camera system provides a selected narrowband image or a fluorescence image of the target, and a processing system combines the two images into a composite image for superimposed display. In fig. 13, a first forward-looking camera system is located at the front portion of cannula 120 and includes two cameras, camera 430 (CamW1) and camera 431 (CamW2), both looking at the same target but from different angles, like the two cameras in fig. 6. Camera 430 responds primarily to the camera W1 wavelength range, while camera 431 responds primarily to the camera W2 wavelength range; the two wavelength ranges may be the same white-light range. A second camera system is also located at the front portion of cannula 120 and includes a camera 420 (camera F) that also views the target but responds primarily to a camera F wavelength range different from at least one of the camera W1 and camera W2 wavelength ranges. The camera F wavelength range may be a selected narrow band or a fluorescence band. A processing system 143 (fig. 6) is coupled to the first and second camera systems and is configured to receive image data from them and process it into a white-light stereoscopic image, a selected narrowband image or fluorescence image, and a composite image superimposing the stereoscopic image and the selected narrowband or fluorescence image. The processing system 143 is also configured to turn the LED light sources 242, 244, 432, 434, and 435 on or off as needed for the respective images. In this embodiment, all three cameras in fig. 13 may view the target simultaneously.
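Stereoscopic imaging with CamW1 and CamW2 rests on the standard pinhole stereo relation, depth = f × B / d. The sketch below illustrates that relation only; the focal length, baseline (spacing between the two cameras), and disparity values are hypothetical, not taken from the text.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic pinhole stereo relation: depth = f * B / d,
    with focal length in pixels, baseline in mm, disparity in pixels."""
    return focal_px * baseline_mm / disparity_px

# Illustrative values: 500 px focal length, 2 mm camera spacing,
# a feature shifted 25 px between the CamW1 and CamW2 views
z_mm = depth_from_disparity(focal_px=500, baseline_mm=2.0, disparity_px=25)
# z_mm -> 40.0 (the feature lies ~40 mm in front of the cameras)
```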
The processing system 143 and the display 150 may be configured to selectively display the composite image, the stereoscopic image, or the selected narrowband or fluorescence image, or all three images simultaneously. The composite image may be like that of fig. 10: a superposition of two spatially registered images of the same object, taken in different wavelength ranges of light. Fig. 13 shows two channels, 414 and 416, in cannula 120, but one channel or more than two channels may be used.
Fig. 14 is otherwise similar to fig. 13, but shows a multi-camera, multispectral endoscope in which the three cameras and their light sources are arranged differently, with cannula 120 having a single channel 1402.
Fig. 15 is otherwise identical to fig. 13, but shows a multi-camera, multispectral endoscope in which the second forward-looking camera system includes a camera F1 and a camera F2, both of which image in a selected narrowband or fluorescence wavelength range, so that the system can produce stereoscopic images both in white light and in selected narrowband or fluorescent light. In fig. 15, the first forward-looking camera system at the cannula front end portion includes camera 430 (camera W1) and camera 431 (camera W2), which view the target from different angles like the two cameras in fig. 6. Cameras W1 and W2 respond to the camera W1 and camera W2 wavelength ranges, respectively, which are white-light ranges and may be the same or overlapping. A second forward-looking camera system, also located at the forward end portion of the cannula, comprises a camera F1 and a camera F2, views the target from different angles, and responds to the camera F1 and camera F2 wavelength ranges, respectively, which may be selected narrowband light ranges or fluorescence ranges and may be the same or overlapping. Processing system 143 receives the image data from the first and second camera systems and processes it into a composite image of the target superimposing the white-light stereoscopic image and the selected narrowband or fluorescence image of the target, which is displayed by display 150. The display 150 may display any one or more of the white-light stereoscopic image, the selected narrowband or fluorescence image, and the composite image.
The positions of the cameras may be interchanged as long as the two cameras of the first camera system observe the target from different angles, and the two cameras of the second camera system likewise observe the target from different angles. Fig. 15 also shows two channels, 414 and 416, in cannula 120, although a different number of channels may be used. Fig. 15 also shows light sources 242, 244, 432, 434, 433, 435, 437, and 439 for the four cameras, although a different number or arrangement of light sources may be used.
Fig. 16 is otherwise identical to fig. 15, but shows a multi-camera, multispectral endoscope in which the four cameras and their light sources are arranged differently around a channel 1502 in cannula 120.
Figures 17-22 illustrate an endoscope in some embodiments. Fig. 17 is a perspective view of an assembled endoscope 17100; fig. 18 shows the reusable portion 17104 and the disposable portion 17102 as separate units, before they are removably assembled by sliding portion 17102 rearward from the front end into portion 17104. Portion 17104 includes a display 150 and an L-shaped handle portion 17140 composed of a downwardly extending handle 17141, graspable by a user's hand, and an axially extending housing 17142. The display 150 is mounted on portion 17104. Portion 17102 includes a hub 17172 removably secured to housing 17142 and a cannula 17120 extending forward from the hub. The housing 17142 has an axially extending, downward-facing slot 1920 (figs. 18 and 19), and the hub 17172 includes an axially extending, upward-facing track 1802 configured to slide into the slot 1920 in a rearward direction to removably secure the portions 17102 and 17104 to one another. As shown in figs. 17-22, the disposable unit 17102 extends along a longitudinal axis, and the reusable unit 17104 has an upper portion with an open slot 1920 extending along the longitudinal axis and a handle portion 17141 extending along a handle axis transverse to the longitudinal axis, wherein the fluid hub 17172 is configured to releasably snap into the open slot 1920. The rear end of the hub 17172 has a rear-facing electrical connector 1804 (fig. 18), and the housing 17142 has a mating front-facing electrical connector 1904. When portions 17102 and 17104 are secured to one another, producing the assembled endoscope 17100 shown in fig. 17, the two electrical connectors mate and make electrical contact. The rear end of the handle portion 17140 has an oblong opening 1906 through which the rear end of the thumb lever passes and protrudes rearward when the endoscope is assembled as in fig. 17.
The oblong opening 1906 is connected to a vertical opening 1908 in which the shaft of the thumb lever 1910 moves up and down. The thumb lever is part of a bending mechanism, described below, which bends the forward end of the cannula to a bent position as shown in fig. 17, as well as to any intermediate position. The bend may be upward or downward.
Fig. 19 is a perspective view of portions 17102 and 17104. Seen from the front end, portion 17104 has an opening 1912 into which portion 17102 slides. Visible in this opening are an axial, downward-facing, C-shaped slot 1914 and an electrical connector 1916. As shown in fig. 18, the hub 17172 has an upward, axially extending rail 1802 that is T-shaped and configured to slide into the slot 1914 during endoscope assembly. Also seen in fig. 18 is an electrical connector 1804 configured to mate with and make electrical contact with electrical connector 1916 of fig. 19 when the endoscope is assembled. Fig. 19 also shows a locking pin 1918 and a lock release 1921 for securely locking portions 17102 and 17104 when the endoscope is assembled, as described in more detail below in connection with fig. 20. The cannula 17120 has a camera and light module at its front end, which may be any of the modules discussed above with respect to other embodiments of the endoscope, and is connected to the display 150 by internal cables and electrical connectors (1916 and 1804 in this case), as in the other examples discussed above. Portion 17104 may have buttons or other manually operated inputs, as discussed above with respect to other embodiments of the endoscope, to control camera functions and/or other functions. The handle portion 17140 may house electronics for processing image data, as discussed above with respect to other embodiments. In some embodiments of the endoscope 17100, the display 150 may be eliminated and the image data may be displayed on an external display connected to the camera module at the front end of the cannula 17120 wirelessly or by a cable.
Fig. 20 is a cross-sectional view of a portion of section 17102, showing locking pin 1918 urged upward by spring 2002, and lock release 1921, which when pushed rearward pushes locking pin 1918 downward and out of engagement with catch 1922 (fig. 18), a recess in the bottom surface of opening 1914. When the endoscope is assembled, locking pin 1918 engages catch 1922, securing portions 17102 and 17104 together. After the medical procedure is completed, the user presses the lock release 1921, releasing the engagement of pin 1918 and catch 1922, and pulls portion 17102 out of the front end of portion 17104. Fig. 20 further illustrates a bending mechanism for bending the front end of the cannula 17120, which includes a half wheel 2004 mounted for rotation about its center and secured to the thumb lever 1910 so as to translate up-and-down movement of the thumb lever 1910 into rotation of the half wheel 2004. A cable 2006 is fixed to the half wheel 2004 and to the distal end of the cannula 17120, so that when the half wheel 2004 rotates in one direction, the distal end of the cannula bends in one direction, and when the half wheel 2004 rotates in the opposite direction, the distal end of the cannula bends in the opposite direction.
Fig. 21 is an exploded perspective view of the handle portion 17140, the display 150, and the components of portion 17102. The hub 17172 is composed of left and right covers 17172a and 17172b and left and right covers 17172c and 17172d, and extends toward the front end. A cap 2102 is threaded onto the front end of the hub 17172 to secure the cannula 17120 to the hub 17172. The liquid ports 2104 and 2106 merge into a single Luer junction leading into the cannula 17120, as does the cable 2006. An electrical connector 1916 (which may be a DP20 connector) is also part of the 17102 unit. The mechanical connector 2008 helps assemble portion 17102 of the endoscope.
Fig. 22 is a top view of the assembled endoscope 17100, showing the relative positions of the components, including the fluid ports 2104 and 2106.
As described above, features and components associated with one of the embodiments may be used with another of the embodiments. As a non-limiting example, different configurations of imaging and illumination modules may be used with any of the described endoscopes, the cannula bending mechanism described in connection with fig. 20 may be used with any of the described endoscopes, and so forth.
Although the foregoing has been described in some detail for purposes of clarity of illustration, it will be apparent that certain changes and modifications may be practiced without departing from the principles of the utility model. It should be noted that there are many alternative ways of implementing the processes and apparatuses described herein. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the body of work described herein is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.