Disclosure of Invention
The technical problem to be solved by the invention is how to distinguish, conveniently and accurately, whether the surface of tableware is clean.
The invention discloses a tableware contamination state analyzer, which comprises a shooting area for holding tableware, an ultraviolet lamp for irradiating the tableware in the shooting area, and a control module connected with the ultraviolet lamp. The control module is connected with a terminal interface for accessing an external mobile terminal equipped with a camera. The control module controls the mobile terminal connected through the terminal interface to shoot towards the shooting area to obtain a fluorescence image, receives the fluorescence image from the mobile terminal, and analyzes the tableware contamination state according to the fluorescence image.
Preferably, the analyzer further comprises a tableware placing device that places the tableware into the shooting area.
Preferably, the analyzer comprises a photosensitive module connected with the control module for sensing whether the shooting area is in a darkroom environment; if a non-darkroom environment is sensed, the control module does not initiate shooting.
Preferably, the analyzer comprises an ultraviolet environment providing module, which includes the ultraviolet lamp.
In a second aspect, the invention discloses a tableware image fluorescence analysis method, wherein fluorescence features are extracted from a fluorescence image captured by the tableware contamination state analyzer described above, and the contamination degree of the image is marked according to the proportion of pixels occupied by the extracted fluorescence features in the corresponding image.
Preferably, the contamination degree is graded, in order of increasing pixel proportion, as clean, light contamination, moderate contamination, and severe contamination.
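As a minimal sketch of this grading rule in Python — the proportion cut-offs used here (5%, 20%, 50%) are illustrative assumptions, since the text does not fix numeric thresholds:

```python
def grade_contamination(fluorescent_pixels: int, total_pixels: int) -> str:
    """Map the fluorescent-pixel proportion of an image to a contamination grade.

    The cut-offs 0.05 / 0.20 / 0.50 are assumed example values; the patent
    only specifies that the grade increases with the pixel proportion.
    """
    ratio = fluorescent_pixels / total_pixels
    if ratio < 0.05:
        return "clean"
    elif ratio < 0.20:
        return "light contamination"
    elif ratio < 0.50:
        return "moderate contamination"
    else:
        return "severe contamination"
```

In practice the cut-offs would be calibrated per tableware type and camera setup.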
Preferably, the extraction of the fluorescence features is realized by an image segmentation algorithm.
Preferably, the image segmentation algorithm uses a region growing method: taking the luminance of a certain fluorescent region in the image as an initial value, all pixels in the image whose luminance falls within a predetermined range of the initial value are marked as the fluorescence feature.
Preferably, the image segmentation algorithm employs a threshold segmentation method: the fluorescence features of the image are separated from the background by a preset gray value.
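A minimal threshold-segmentation sketch, assuming an 8-bit grayscale image represented as nested lists and an example gray value of 200 (the preset value is left open by the text):

```python
def threshold_segment(gray_image, threshold=200):
    """Mark pixels at or above a preset gray value as fluorescence.

    gray_image: 2-D list of 0-255 gray levels.
    threshold: assumed example cut-off; the patent leaves it implementation-defined.
    Returns a binary mask and the fluorescent-pixel proportion.
    """
    mask = [[1 if px >= threshold else 0 for px in row] for row in gray_image]
    total = sum(len(row) for row in gray_image)
    fluorescent = sum(sum(row) for row in mask)
    return mask, fluorescent / total
```

The returned proportion is exactly the quantity used to grade the contamination degree.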
A third aspect of the invention discloses a computer-readable storage medium storing a computer program which, when executed by a processor of the tableware contamination state analyzer, implements the tableware image fluorescence analysis method described above.
Compared with the prior art, embodiments of the invention have the following beneficial effects. Some stains on tableware, such as oil residue, detergent residue, and fingerprints, are not obvious to visual inspection but fluoresce under ultraviolet light, so images showing a fluorescent effect are marked as images of stained tableware. The tableware is placed in the shooting area of the contamination state analyzer, and a mobile phone or other mobile terminal is connected to the terminal interface. Under the control of the control module, the camera of the mobile terminal is aimed at the shooting area and shoots a fluorescence image of the tableware. After receiving the fluorescence image from the mobile terminal, the control module extracts fluorescence features and marks the contamination degree of the image according to the proportion of pixels occupied by the extracted features in the image. Analysis of the tableware contamination state is thereby realized: whether the tableware surface is clean can be distinguished quickly and accurately, and when it is not clean, the degree of contamination can be analyzed.
Detailed Description
Example one
A tableware contamination state analyzer, capable of analyzing the contamination state of tableware, is arranged in the tableware storage area of an intelligent kitchen or in another area where tableware is used. As shown in fig. 1, the analyzer includes a shooting area for receiving tableware and a tableware placing device for placing the tableware into the shooting area. The control module of the analyzer is connected to the image capturing module, the ultraviolet environment providing module, and the photosensitive module, respectively. The analyzer is provided with a sealable area, or is otherwise shielded, to form a darkroom environment with weak light intensity; the control module uses a light sensor, serving as the photosensitive module, to sense the ambient light intensity and confirm that the shooting area is in a darkroom environment. The specific light intensity range defining the darkroom environment takes different values in different scenes. The ultraviolet environment providing module comprises an ultraviolet lamp that provides an ultraviolet light source for the shooting area. The shooting area is provided with a placing position equipped with a terminal interface, connected to the control module, for externally connecting a mobile terminal. The image capturing module is the camera of a mobile phone or other mobile terminal, and this camera is aligned with the shooting area. The mobile phone or other mobile terminal is connected to the control module through the terminal interface.
The process of analyzing tableware contamination using the tableware contamination state analyzer is shown in FIG. 2 and described in detail below.
(1) The control module controls the light sensor to sense the light intensity of the shooting area and, when it senses that the shooting area is in a darkroom environment, turns on the ultraviolet lamp to provide an ultraviolet light source for the shooting area, thereby providing an ultraviolet image-capturing environment for the camera of a mobile terminal such as a mobile phone.
(2) Step P: the control module controls a mobile phone connected to the terminal interface to start its camera and shoot the tableware placed in the shooting area to obtain a tableware image, in which contaminated tableware appears as a fluorescence image. The control module receives the tableware image from the mobile phone and, using the tableware contamination state analysis method, inputs the image into a trained deep neural network, which judges from the image whether the tableware is clean or dirty.
(3) Using the tableware image fluorescence analysis method, fluorescence features are extracted from the fluorescence image obtained by the analyzer, and the image, and hence the photographed tableware, is marked with a degree of contamination according to the proportion of pixels occupied by the extracted fluorescence features in the image.
Preferably, when the photosensitive module senses that the shooting area is not in a darkroom environment, the control module does not initiate shooting.
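The darkroom gating in step (1) and the preference above amount to a simple check before shooting. The sketch below assumes a lux-based light sensor and an illustrative cut-off; the patent states only that the darkroom intensity range varies per scene:

```python
def darkroom_ok(light_intensity_lux: float, darkroom_max_lux: float = 1.0) -> bool:
    """Gate the shoot on the photosensitive reading: proceed only when the
    shooting area is dark enough. darkroom_max_lux = 1.0 is an assumed
    cut-off, not a value given in the patent."""
    return light_intensity_lux <= darkroom_max_lux
```

The control module would call this before turning on the ultraviolet lamp and triggering the mobile terminal's camera.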
In step P, the trained deep neural network is specifically a convolutional neural network capable of image classification and recognition. For the network to be able to judge whether tableware is clean or dirty, that is, for the accuracy of judging the tableware state from its image to reach a predetermined standard (for example, above 98%), the network must be trained with multiple groups of learning samples of clean tableware and multiple groups of learning samples of dirty tableware.
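Such a classifier could, for instance, be a small convolutional network. The following PyTorch sketch assumes 64x64 grayscale fluorescence images and illustrative layer sizes; the text only requires a CNN capable of image classification:

```python
import torch
import torch.nn as nn

class DishSoilClassifier(nn.Module):
    """Minimal CNN sketch for binary clean/dirty classification of 64x64
    grayscale fluorescence images. All layer sizes are illustrative
    assumptions, not values given in the patent."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.classifier = nn.Linear(16 * 16 * 16, 2)  # logits: clean vs dirty

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))
```

The two output logits correspond to the clean and dirty classes; an argmax over them gives the judgment used in step P.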
The deep neural network training method is shown in fig. 3 and includes the following steps:
A. Perform the following sample acquisition steps for clean tableware and for dirty tableware, respectively, to obtain the respective learning samples; each sample acquisition comprises the following steps S1 and S2:
S1. Shoot a tableware image under ultraviolet irradiation.
S2. Form a group of learning samples for training the deep neural network in tableware contamination state analysis, taking the tableware image shot in step S1 as the input signal and the clean or dirty state of the tableware as the output signal.
B. Train the deep neural network in tableware contamination state analysis with the groups of learning samples until the network is able to judge from a tableware image whether the tableware is clean or dirty.
Preferably, the ultraviolet lamp is turned on before step S1 is performed, and step S1 is ensured to be carried out in a darkroom environment.
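The shape of the training loop in steps A-B can be sketched as follows. This is a toy stand-in: it fits a single logistic unit on the mean brightness of each sample, whereas the patent trains a CNN; it only illustrates the image-in, clean/dirty-label-out loop over the learning samples:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def brightness(image):
    """Mean gray level of a 2-D list image, scaled to [0, 1]."""
    return sum(map(sum, image)) / (len(image) * len(image[0]) * 255.0)

def train(samples, epochs=50, lr=0.5):
    """Toy stand-in for step B: gradient descent on a single logistic unit.

    samples: list of (image, label) pairs from step S2, with the S1 image
    as input signal and label 0 = clean, 1 = dirty as output signal.
    """
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for image, label in samples:
            p = sigmoid(w * brightness(image) + b)  # predicted P(dirty)
            grad = p - label                        # cross-entropy gradient
            w -= lr * grad * brightness(image)
            b -= lr * grad
    return w, b

def predict_dirty(image, w, b):
    return sigmoid(w * brightness(image) + b)
```

A real implementation would substitute a convolutional network and a framework optimizer, but the sample-driven loop structure is the same.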
Images of clean tableware and images of dirty tableware are given to the deep neural network together for training, yielding a network capable of judging whether tableware is in a clean or a dirty state. In this method of contamination state analysis with a trained deep neural network, the ultraviolet lamp of the analyzer, or another ultraviolet environment providing module capable of supplying ultraviolet irradiation, provides an ultraviolet image-capturing environment for the camera. The camera shoots the tableware image under ultraviolet irradiation, and the trained network only needs to judge whether the resulting image is a dirty-tableware image to determine whether the tableware is in a dirty state. Whether the tableware surface is clean can thus be distinguished quickly and clearly.
The process of analyzing tableware contamination further includes extracting fluorescence features, by the tableware image fluorescence analysis method, from images the neural network has judged to be dirty-tableware images. Some stains on tableware, such as oil residue, detergent residue, and fingerprints, are not obvious to visual inspection but show distinct fluorescent effects under ultraviolet irradiation; an image showing a fluorescent effect can therefore be regarded as an image of stained tableware, and the stain types can be distinguished from the differing fluorescent effects represented by the extracted features. In addition, the extent of the fluorescence features reflects the degree of contamination. As shown in fig. 4, after the control module judges through the deep neural network that the tableware is dirty, the degree of contamination is defined, in order of increasing proportion of pixels occupied by the extracted fluorescence features in the image, as light, moderate, or severe contamination. Analysis of the tableware contamination state is thereby realized, and the degree of contamination of the tableware surface can be distinguished quickly and clearly.
The extraction of fluorescence features in the tableware image fluorescence analysis method is realized by an image segmentation algorithm, using a region growing method, a threshold segmentation method, or a combination of the two. Specifically, the region growing method takes the luminance of a certain fluorescent region in the image as an initial value and marks all pixels whose luminance falls within a predetermined range of that value as the fluorescence feature. The predetermined range around the initial luminance value is determined by the specific conditions of the image analysis and may be, for example, -1 nit to 1 nit or -2 nit to 2 nit. The threshold segmentation method separates the fluorescence features of the image from the background by a preset gray value. These image segmentation algorithms can be implemented with existing algorithms and are not described further here.
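The region growing variant can be sketched as a breadth-first flood fill from a seed pixel inside a fluorescent region. The tolerance of 20 gray levels below is an assumed stand-in for the nit-based luminance range described above:

```python
from collections import deque

def region_grow(gray, seed, tol=20):
    """Region-growing segmentation sketch.

    Starting from a seed pixel inside a fluorescent region, mark every
    4-connected pixel whose gray level lies within +/- tol of the seed's
    gray level. tol=20 gray levels is an assumption standing in for the
    patent's luminance range (e.g. -1 nit to 1 nit).
    Returns a binary mask of the grown fluorescence feature.
    """
    rows, cols = len(gray), len(gray[0])
    init = gray[seed[0]][seed[1]]          # initial luminance value
    mask = [[0] * cols for _ in range(rows)]
    mask[seed[0]][seed[1]] = 1
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols and not mask[nr][nc]
                    and abs(gray[nr][nc] - init) <= tol):
                mask[nr][nc] = 1
                queue.append((nr, nc))
    return mask
```

The proportion of set pixels in the returned mask is the quantity used to grade the contamination degree.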
In this embodiment, the control module comprises a computer-readable storage medium and a processor connected to each other. The storage medium stores a computer program which, when executed by the processor, implements the tableware contamination state analysis method, the training method of the deep neural network for tableware contamination state analysis, and the tableware image fluorescence analysis method.
Example two
In this embodiment, the shooting area of the tableware contamination state analyzer is changed from the semi-enclosed type of the first embodiment to an open type, and the darkroom environment for the camera is provided by the tableware storage area of the intelligent kitchen or another area where tableware is used. The process of analyzing tableware contamination with the analyzer is the same as in the first embodiment and is not repeated here.