Disclosure of Invention
An object of the embodiments of the present application is to provide an endoscope system, a hybrid light source, a video capture device, and an image processor, so as to improve the accuracy with which the endoscope system distinguishes normal tissue from diseased tissue. To achieve the above object, in one aspect, an embodiment of the present application provides an endoscope system including:
a hybrid light source for synchronously emitting visible light and near-infrared light to a target detection region containing a contrast agent;
a video image acquisition device for acquiring mixed image data of the target detection region, the mixed image data including visible light image data and near-infrared light image data of the target detection region;
an image processor for generating a contrast image of the target detection region from the mixed image data;
a video output unit for converting the contrast image into a video format and outputting it;
and a display unit for displaying the contrast image in the video format in real time.
In an embodiment of the present application, the hybrid light source includes a white light laser source and a near-infrared laser source that are disposed close to each other and independently of each other.
In an embodiment of the present application, the video image acquisition device includes:
an endoscope for acquiring a mixed image light signal from the target detection region;
a focusing lens for focusing the mixed image light signal;
an optical filter for filtering out the portion of the focused mixed image light signal whose wavelength lies outside the visible light and near-infrared light bands;
and a photoelectric conversion module for converting the filtered mixed image light signal into a corresponding electrical signal to serve as the mixed image data.
In an embodiment of the present application, the photoelectric conversion module includes any one of:
a charge-coupled device;
a CMOS image sensor.
In an embodiment of the present application, the hybrid light source, the video output unit, and the image processor are integrated into a single structure.
In another aspect, an embodiment of the present application further provides a hybrid light source applied to an endoscope system, the hybrid light source being used for synchronously emitting visible light and near-infrared light to a target detection region containing a contrast agent.
In an embodiment of the present application, the hybrid light source includes a white light laser source and a near-infrared laser source that are disposed close to each other and independently of each other.
In another aspect, an embodiment of the present application further provides a video acquisition device applied to an endoscope system, the video image acquisition device being used for acquiring mixed image data of a target detection region; the mixed image data includes visible light image data and near-infrared light image data of the target detection region.
In an embodiment of the present application, the video image acquisition device includes:
an endoscope for acquiring a mixed image light signal from the target detection region;
a focusing lens for focusing the mixed image light signal;
an optical filter for filtering out the portion of the focused mixed image light signal whose wavelength lies outside the visible light and near-infrared light bands;
and a photoelectric conversion module for converting the filtered mixed image light signal into a corresponding electrical signal to serve as the mixed image data.
In an embodiment of the present application, the photoelectric conversion module includes any one of:
a charge-coupled device;
a CMOS image sensor.
In another aspect, an embodiment of the present application further provides an image processor, including:
an image acquisition module for acquiring mixed image data of a target detection region, the mixed image data including visible light image data and near-infrared light image data of the target detection region;
an identification and marking module for identifying the near-infrared light image data and the visible light image data from the mixed image data and marking the near-infrared light image data;
and an image synthesis module for synthesizing the marked near-infrared light image data and the identified visible light image data into a contrast image of the target detection region.
In an embodiment of the present application, the image processor comprises a field-programmable gate array.
As can be seen from the technical solutions provided above, in the endoscope system based on molecular imaging technology according to the embodiments of the present application, the system can acquire a near-infrared light image of the target detection region, and near-infrared light can penetrate deeper tissue in the target detection region. Therefore, the contrast image output and displayed by the endoscope system can distinguish normal tissue from diseased tissue in the target detection region more accurately, improving the accuracy of the endoscope system in distinguishing normal tissue from diseased tissue.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present application, rather than all of them. All other embodiments that can be derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application. For example, in the following description, forming a second component over a first component may include embodiments in which the first and second components are formed in direct contact, embodiments in which the first and second components are formed in non-direct contact (i.e., additional components may be included between the first and second components), and so on.
Also, for ease of description, some embodiments of the present application may use spatially relative terms such as "above," "below," "top," and "bottom" to describe the relationship of one element or component to another element or component as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements or components described as "below" or "beneath" other elements or components would then be oriented "above" or "over" the other elements or components.
Research shows that the ability of light to penetrate tissue is related to the intensity of light absorbed by the tissue, the characteristics of the light waves, and the structure and physicochemical characteristics of the biological tissue. Compared with visible light, near-infrared (NIR) light with a wavelength range of 650-900 nm has the following advantages: the absorption and scattering of near-infrared light in this band by biological tissue are minimal, so near-infrared light can penetrate deeper tissue than visible light; and because the autofluorescence of biological tissue under near-infrared light in this band is small, the signal-to-background ratio (SBR) is relatively high.
Referring to fig. 1, based on the above principle, in order to improve the ability of the endoscope system to distinguish normal tissue from lesion tissue, the endoscope system of some embodiments of the present application may include a hybrid light source, a video image acquisition device, an image processor, a video output unit, a display unit, and the like. The hybrid light source may be used to synchronously emit visible light and near-infrared light to a target detection region containing a contrast agent. The video image acquisition device may be used to acquire mixed image data of the target detection region; the mixed image data includes visible light image data and near-infrared light image data of the target detection region. The image processor may be used to generate a contrast image of the target detection region from the mixed image data. The video output unit may be used to convert the contrast image into a video format and output it. The display unit may be used to display the contrast image in the video format in real time.
In the endoscope system of the above embodiment of the present application, the system can acquire a near-infrared light image of the target detection region, and near-infrared light can penetrate deeper tissue in the target detection region. Therefore, the contrast image output and displayed by the endoscope system can distinguish normal tissue from lesion tissue in the target detection region more accurately.
In an embodiment of the present application, the hybrid light source may consist of a white light laser source and a near-infrared laser source that are disposed close to each other and independently of each other, the two sources synchronously emitting visible light and near-infrared light, respectively, to the target detection region. In another embodiment of the present application, the hybrid light source may also be a single laser source that emits laser light whose wavelength range at least covers the visible light band and the near-infrared light band. Therefore, in some embodiments of the present application, the structure of the hybrid light source is not limited and may be selected as needed.
In an embodiment of the present application, the video image acquisition device may include an endoscope, a focusing lens, an optical filter, and a photoelectric conversion module. The endoscope may be used to acquire a mixed image light signal from the target detection region. The focusing lens may be used to focus the mixed image light signal so as to obtain a clear image. The optical filter may be used to filter out the portion of the focused mixed image light signal whose wavelength lies outside the visible light and near-infrared light bands; that is, after filtering, only the visible light and near-infrared light portions remain in the mixed image light signal. The photoelectric conversion module may be used to convert the filtered mixed image light signal into a corresponding electrical signal to serve as the mixed image data, so as to facilitate subsequent processing. In some embodiments of the present application, a single photoelectric conversion module may be used, which reduces power consumption and volume.
In some exemplary embodiments, the photoelectric conversion module may include, for example but not limited to, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) image sensor.
In some embodiments of the present application, the image processor may be implemented by any suitable hardware or combination of hardware and software, as desired. For example, in some exemplary embodiments, the image processor may be implemented by a hardware structure such as a field-programmable gate array (FPGA) or a complex programmable logic device (CPLD). An FPGA is a hardware-based processing system whose advantage is parallel data processing: multiple data items can be processed simultaneously within one clock cycle, which shortens processing time and reduces image latency. In such a hardware implementation, as shown in fig. 2, the image processor may include an image acquisition module 21, an identification and marking module 22, and an image synthesis module 23. Wherein:
The image acquisition module 21 may be configured to acquire mixed image data of the target detection region. The mixed image data includes visible light image data and near-infrared light image data of the target detection region.
The identification and marking module 22 may be configured to identify the near-infrared light image data and the visible light image data from the mixed image data and to mark the near-infrared light image data. In an embodiment of the present application, under irradiation of visible light, the target detection region presents a visible light image; since the target detection region contains the contrast agent, the portion of the region containing the contrast agent presents a near-infrared image under irradiation of the near-infrared light. In the mixed image data acquired by the image acquisition module 21, the color values of the visible light image corresponding to the visible light image data differ significantly from the color values of the near-infrared light image corresponding to the near-infrared light image data. Based on this difference in color values, the identification and marking module 22 identifies the near-infrared light image data and the visible light image data from the mixed image data. On this basis, in order to enhance or highlight the effect for convenience of operation, the identified near-infrared light image data may be marked (e.g., with a fluorescent mark).
The image synthesizing module 23 may be configured to synthesize the marked near-infrared light image data and the identified visible light image data into a contrast image of the target detection region.
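As a rough illustration only (the present application does not fix a specific implementation), the identify-mark-synthesize pipeline of modules 22 and 23 might be sketched as follows; the intensity threshold, the channel layout, and the green mark color are all assumptions made for the example:

```python
import numpy as np

def split_mixed_frame(frame, nir_threshold=200):
    """Return (visible, nir_mask) from a mixed H x W x 3 frame.

    Hypothetical separation cue: pixels excited by the NIR contrast agent
    are assumed to appear near-saturated in all channels; a real system
    would use a sensor-specific color-value difference instead.
    """
    intensity = frame.mean(axis=2)
    nir_mask = intensity >= nir_threshold  # pixels attributed to NIR light
    visible = frame.copy()
    visible[nir_mask] = 0                  # remove NIR pixels from visible image
    return visible, nir_mask

def synthesize_contrast_image(visible, nir_mask, mark_color=(0, 255, 0)):
    """Overlay the marked NIR pixels (e.g. a green fluorescent mark)
    on the identified visible-light image to form the contrast image."""
    out = visible.copy()
    out[nir_mask] = mark_color
    return out

frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[1, 1] = (230, 230, 230)   # simulated NIR-bright pixel (contrast agent)
frame[2, 2] = (90, 60, 40)      # ordinary visible-light tissue pixel

visible, mask = split_mixed_frame(frame)
contrast = synthesize_contrast_image(visible, mask)
```

In a real endoscope the separation cue would come from the sensor and filter design rather than a bare intensity threshold; the sketch only shows the data flow of the three modules.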
In some embodiments of the present application, the image processor may further include an image defogging module and an image denoising module. Wherein:
The image defogging module may be used to perform dark channel defogging on the mixed image data so as to reduce or eliminate fogging of the picture. During surgery and other applications, images may be blurred by fogging due to steam or other factors. With an image defogging module containing a dark channel defogging algorithm, the fogging blur of the image can be reduced or even eliminated to a certain extent.
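A minimal sketch of the kind of dark channel defogging the module could apply is shown below (following the well-known dark channel prior; the patch size, omega, and t0 values are illustrative and not taken from the present application):

```python
import numpy as np

def dark_channel(img, patch=3):
    """Per-pixel minimum over the color channels and a local patch."""
    h, w, _ = img.shape
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode='edge')
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def defog(img, omega=0.95, t0=0.1, patch=3):
    """Recover a haze-reduced image from a hazy float image in [0, 1]."""
    dark = dark_channel(img, patch)
    # Atmospheric light: color of the pixel with the brightest dark channel.
    idx = np.unravel_index(dark.argmax(), dark.shape)
    A = img[idx].clip(min=1e-6)
    # Transmission estimated from the dark channel of the normalized image.
    t = 1.0 - omega * dark_channel(img / A, patch)
    t = np.maximum(t, t0)[..., None]
    return np.clip((img - A) / t + A, 0.0, 1.0)

hazy = np.full((5, 5, 3), 0.7)      # uniform grey haze
hazy[2, 2] = (0.9, 0.5, 0.4)        # one structure visible under the haze
clear = defog(hazy)
```

The recovered structure pixel gains contrast against the haze while the uniform background is left at its original level; production implementations refine the transmission map (e.g. with guided filtering) before recovery.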
The image denoising module may be used to perform multi-frame noise reduction on the image processed by the image defogging module so as to improve the signal-to-noise ratio of the image. During image acquisition, image noise may increase due to factors such as lighting conditions and exposure time, making the image unclear. With an image denoising module containing a multi-frame noise reduction algorithm, pixels that behave as noise can be identified by comparing multiple captured frames of the same target region, and after repeated weighting and replacement these frames are finally synthesized into a clean image.
In some embodiments of the present application, the image synthesis module 23 may include:
an RGB conversion sub-module, which may be configured to RGB-convert the marked near-infrared light image data of the target region into an image with a specified fluorescence effect;
an image overlay sub-module, which may be configured to overlay the image with the specified fluorescence effect onto the visible-light image of the target region.
In an exemplary embodiment of the present application, the image with the specified fluorescence effect may be, for example, an image with a green fluorescence effect. The green fluorescence effect is intuitive and conspicuous, so that doctors can more clearly distinguish normal tissue from diseased tissue during surgery (or in other application scenarios).
In another exemplary embodiment of the present application, the image with the specified fluorescence effect may be displayed in multiple levels; that is, the brightness of the displayed fluorescence may change with the intensity of the original data, so as to help the physician distinguish and judge the degree of the lesion in the diseased tissue.
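As an illustrative sketch of such a multi-level green fluorescence display (the number of levels, the green-channel mapping, and the blending rule are all assumptions for the example, not the design of the present application):

```python
import numpy as np

def fluorescence_overlay(visible_rgb, nir_intensity, levels=4):
    """Quantize raw NIR intensity (0-255) into `levels` brightness steps
    and render them as a green overlay on the visible-light RGB image,
    so stronger NIR signal shows as brighter green fluorescence."""
    step = 256 // levels
    quantized = (nir_intensity // step) * step        # multi-level brightness
    out = visible_rgb.astype(np.float64).copy()
    out[..., 1] = np.maximum(out[..., 1], quantized)  # boost green channel only
    return out.astype(np.uint8)

visible = np.full((2, 2, 3), 50, dtype=np.uint8)      # dim visible-light image
nir = np.array([[0, 100], [180, 255]], dtype=np.uint8)
overlay = fluorescence_overlay(visible, nir)
```

Quantizing into discrete steps is one simple way to make the displayed brightness track the original NIR signal strength; a continuous mapping would serve equally well.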
For convenience of description, the above devices are described as being divided into various units by function, each described separately. Of course, when implementing the present application, the functions of the units may be implemented in one or more pieces of software and/or hardware.
In other exemplary embodiments, as shown in fig. 3, the image processor may also include a processor and a computer program stored in a memory, the computer program, when executed by the processor, performing the following steps:
acquiring mixed image data of a target detection region, the mixed image data including visible light image data and near-infrared light image data of the target detection region;
identifying near-infrared light image data and visible light image data from the mixed image data, and marking the near-infrared light image data;
and synthesizing the marked near-infrared light image data and the identified visible light image data into a contrast image of the target detection region.
While the process flows described above include operations occurring in a particular order, it should be appreciated that the processes may include more or fewer operations, which may be executed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment).
In some embodiments of the present application, the video output unit may be, for example, a video capture card, such as an HDMI capture card, a VGA video capture card, a PCI video capture card, or a PCI-E video capture card.
In some embodiments of the present application, the display unit may be a display, such as a cathode ray tube (CRT) display, a plasma display, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, or the like.
In some embodiments of the present application, the hybrid light source, the video output unit, and the image processor are integrated into a single structure, so that the endoscope system is compact and convenient to use.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
these computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) …" does not exclude the presence of other identical elements in the process, method, or apparatus that comprises the element.
as will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
the embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.