CN114764772A - Tableware contamination state analyzer, image fluorescence analysis method and storage medium - Google Patents

Tableware contamination state analyzer, image fluorescence analysis method and storage medium

Info

Publication number
CN114764772A
CN114764772A
Authority
CN
China
Prior art keywords
tableware
image
fluorescence
analysis method
control module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110033176.7A
Other languages
Chinese (zh)
Inventor
傅峰峰
刘嘉荣
王培彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fugang Life Intelligent Technology Co Ltd
Original Assignee
Guangzhou Fugang Life Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fugang Life Intelligent Technology Co Ltd
Priority to CN202110033176.7A
Publication of CN114764772A
Legal status: Pending


Abstract

The invention relates to the technical field of tableware contamination identification, and in particular to a tableware contamination state analyzer, a tableware image fluorescence analysis method, and a computer-readable storage medium. The tableware is placed in the shooting area of the tableware contamination state analyzer, and a mobile phone or other mobile terminal is connected to the analyzer's terminal interface. Under the control of the control module, the phone's camera is aimed at the shooting area and captures a fluorescence image of the tableware. After receiving the fluorescence image from the mobile terminal, the control module extracts fluorescence features and grades the contamination of the image according to the proportion of pixels occupied by the extracted fluorescence features in the image. The contamination state of the tableware is thereby analyzed: whether the tableware surface is clean can be determined quickly and clearly, and when it is not clean, its degree of contamination is analyzed.

Description

Tableware contamination state analyzer, image fluorescence analysis method and storage medium
Technical Field
The invention relates to the technical field of tableware contamination identification, and in particular to a tableware contamination state analyzer, a tableware image fluorescence analysis method, and a computer-readable storage medium.
Background
In everyday life, both ordinary households and catering businesses use tableware that looks clean to the naked eye but may in fact carry dust, residual grease, residual detergent, fingerprints, and the like. The prior art mostly relies on detection test strips: the procedure is complicated, must be carried out by trained personnel, is costly and inefficient, and in practice only samples of a large batch of tableware are tested, so not every piece can be inspected.
Disclosure of Invention
The technical problem addressed by the invention is how to determine conveniently and accurately whether the surface of tableware is clean.
In a first aspect, the invention discloses a tableware contamination state analyzer comprising a shooting area for holding tableware, an ultraviolet lamp for irradiating the tableware in the shooting area, and a control module connected to the ultraviolet lamp; the control module is also connected to a terminal interface for attaching an external mobile terminal equipped with a camera. The control module directs the mobile terminal attached to the terminal interface to shoot toward the shooting area to obtain a fluorescence image, receives the fluorescence image from the mobile terminal, and analyzes the contamination state of the tableware from the fluorescence image.
Preferably, the analyzer includes a tableware placing device that moves the tableware into the shooting area.
Preferably, the analyzer includes a light-sensing module connected to the control module for sensing whether the shooting area is in a darkroom environment; if a non-darkroom environment is sensed, the control module does not allow shooting to proceed.
Preferably, the analyzer includes an ultraviolet environment providing module comprising the ultraviolet lamp.
In a second aspect, the invention discloses a tableware image fluorescence analysis method: fluorescence features are extracted from a fluorescence image captured by the tableware contamination state analyzer described above, and the contamination of the image is graded according to the proportion of pixels occupied by the extracted fluorescence features in the image.
Preferably, the contamination grades, ordered from small to large pixel proportion, are clean, light contamination, moderate contamination, and severe contamination.
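The grading rule above can be sketched in Python; the patent fixes only the ordering of the grades by pixel proportion, so the numeric cut-off values below are illustrative assumptions.

```python
# Minimal sketch of the pixel-proportion grading rule. The cut-off values
# (1%, 5%, 20%) are assumptions for illustration; the patent only requires
# that the grades increase with the fluorescent pixel proportion.

def contamination_grade(fluorescent_pixels: int, total_pixels: int) -> str:
    """Map the fluorescent-pixel proportion of an image to a contamination grade."""
    ratio = fluorescent_pixels / total_pixels
    if ratio < 0.01:
        return "clean"
    if ratio < 0.05:
        return "light contamination"
    if ratio < 0.20:
        return "moderate contamination"
    return "severe contamination"
```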
Preferably, the fluorescence features are extracted using an image segmentation algorithm.
Preferably, the image segmentation algorithm uses a region-growing method: taking the brightness of a known fluorescent region in the image as an initial value, all pixels in the image whose brightness lies within a predetermined range of that initial value are marked as fluorescence features.
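A minimal sketch of this simplified region-growing rule, assuming a NumPy luminance array; the seed coordinates and the width of the ±tolerance band are illustrative parameters, not values from the patent.

```python
import numpy as np

# Sketch of the region-growing rule described above: seed the brightness from
# one known fluorescent region, then mark every pixel in the image whose
# brightness falls within a preset band around that seed value.

def grow_fluorescence_mask(luminance: np.ndarray,
                           seed_xy: tuple,
                           tolerance: float) -> np.ndarray:
    """Return a boolean mask of pixels marked as fluorescence features."""
    seed_value = luminance[seed_xy]  # initial brightness from a fluorescent region
    return np.abs(luminance - seed_value) <= tolerance
```

For example, seeding at a pixel of brightness 50 with a tolerance of 2 marks every pixel in the 48–52 band.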
Preferably, the image segmentation algorithm uses a threshold segmentation method: the fluorescence features of the image are separated from the background by a preset gray value.
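The threshold variant can be sketched in a single comparison; the gray-value threshold of 180 is an assumption, the patent only requires some preset value.

```python
import numpy as np

# Sketch of the threshold-segmentation rule: pixels brighter than a preset
# gray value count as fluorescence features, the rest as background.
# The default threshold of 180 is an illustrative assumption.

def threshold_fluorescence_mask(gray: np.ndarray, threshold: int = 180) -> np.ndarray:
    """Split fluorescence (bright) from background (dark) at a fixed gray value."""
    return gray > threshold
```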
In a third aspect, the invention discloses a computer-readable storage medium storing a computer program which, when executed by a processor of the tableware contamination state analyzer, implements the tableware image fluorescence analysis method described above.
Compared with the prior art, embodiments of the invention have the following beneficial effects. Some stains on tableware, such as grease residue, detergent residue, and fingerprints, are inconspicuous to the naked eye but fluoresce under ultraviolet irradiation, so images showing a fluorescence effect can be marked as images of stained tableware. The tableware is placed in the shooting area of the tableware contamination state analyzer, a mobile phone or other mobile terminal is connected to the analyzer's terminal interface, and under the control of the control module the phone's camera is aimed at the shooting area to capture a fluorescence image of the tableware. After receiving the fluorescence image from the mobile terminal, the control module extracts fluorescence features and grades the contamination of the image according to the proportion of pixels occupied by the extracted fluorescence features in the image. The contamination state of the tableware is thereby analyzed: whether the tableware surface is clean can be determined quickly and accurately, and when it is not clean, its degree of contamination is analyzed.
Drawings
FIG. 1 is a block diagram showing the structure of the tableware contamination state analyzer;
FIG. 2 is a general flow chart of a method for analyzing tableware contamination;
FIG. 3 is a general flow chart of the training method of the deep neural network for tableware contamination state analysis;
FIG. 4 is a general flow chart of a method for fluorescence analysis of a dish image.
Detailed Description
Example one
A tableware contamination state analyzer capable of analyzing the contamination state of tableware is installed in the tableware storage area of an intelligent kitchen or in another area where tableware is used. As shown in FIG. 1, the analyzer includes a shooting area for receiving tableware and a tableware placing device for moving tableware into the shooting area. The control module of the analyzer is connected to the image capture module, the ultraviolet environment providing module, and the light-sensing module. The analyzer has a sealable or shielded area that forms a darkroom environment of low light intensity; the control module uses a light sensor, serving as the light-sensing module, to measure the ambient light intensity and confirm that the shooting area is in a darkroom environment, the specific light-intensity range that qualifies as a darkroom being chosen according to the scene. The ultraviolet environment providing module includes an ultraviolet lamp that supplies an ultraviolet light source to the shooting area. The shooting area has a placement area provided with a terminal interface, connected to the control module, for attaching an external mobile terminal. The image capture module is the camera of a mobile phone or other mobile terminal; this camera is aimed at the shooting area, and the phone or other mobile terminal communicates with the control module through the terminal interface.
The process of analyzing the contamination of dishes using the tableware contamination status analyzer is shown in FIG. 2 and described in detail below.
(1) The control module reads the light sensor to sense the light intensity of the shooting area and, once the shooting area is confirmed to be in a darkroom environment, turns on the ultraviolet lamp to supply an ultraviolet light source, thereby providing an ultraviolet imaging environment for the camera of the mobile phone or other mobile terminal.
(2) Step P: the control module directs the mobile phone attached to the terminal interface to start its camera and photograph the tableware placed in the shooting area, producing a tableware image in which contamination appears as fluorescence. The control module receives the tableware image from the phone and, following the tableware contamination state analysis method, feeds the image to a trained deep neural network, which judges from the image whether the tableware is clean or dirty.
(3) Using the tableware image fluorescence analysis method, fluorescence features are extracted from the fluorescence image obtained by the analyzer, and the image, and thus the photographed tableware, is graded for contamination according to the proportion of pixels occupied by the extracted fluorescence features in the image.
Preferably, when the light-sensing module senses that the shooting area is not in a darkroom environment, the control module does not allow shooting.
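Step (1) together with this preferred check can be sketched as a simple control routine; the lux threshold and the function names are assumptions, since the patent leaves the exact darkroom light-intensity range scene-dependent.

```python
# Hypothetical sketch of the control module's pre-shot darkroom check.
# DARKROOM_MAX_LUX is an assumed value; the patent states only that the
# qualifying light-intensity range varies with the scene.

DARKROOM_MAX_LUX = 5.0  # assumed upper bound for a darkroom reading

def control_step(ambient_lux: float) -> str:
    """Decide what the control module does for a given light-sensor reading."""
    if ambient_lux > DARKROOM_MAX_LUX:
        return "abort"            # non-darkroom environment: shooting is not triggered
    return "uv_on_and_shoot"      # darkroom confirmed: turn on UV lamp, then shoot
```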
In step P, the trained deep neural network is specifically a convolutional neural network capable of image classification and recognition. For the network to judge reliably whether tableware is clean or dirty, that is, for its accuracy in judging clean versus dirty from a tableware image to reach a predetermined standard, for example above 98%, the deep neural network for tableware contamination state analysis must be trained on multiple groups of learning samples of clean tableware and multiple groups of learning samples of dirty tableware.
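The predetermined-standard check described here can be sketched as follows; the function name is an assumption, and only the 98% example figure comes from the text.

```python
# Sketch of the training acceptance criterion: training is considered
# sufficient once validation accuracy reaches the predetermined standard
# (98% in the text's example). Function name is assumed, not the patent's.

def meets_accuracy_standard(num_correct: int, num_total: int,
                            standard: float = 0.98) -> bool:
    """True once the clean/dirty classifier reaches the predetermined accuracy."""
    return num_total > 0 and num_correct / num_total >= standard
```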
The training method of the deep neural network is shown in FIG. 3 and includes the following steps:
A. The following sample acquisition steps are performed separately for clean tableware and for soiled tableware to obtain the respective learning samples; each sample acquisition comprises steps S1 and S2:
S1. Photograph a tableware image under ultraviolet irradiation.
S2. Form a group of learning samples for training the deep neural network in tableware contamination state analysis, taking the tableware image photographed in step S1 as the input signal and the clean or dirty state of the tableware as the output signal.
B. Train the deep neural network for tableware contamination state analysis on the groups of learning samples obtained above, until it can judge from a tableware image whether the tableware is clean or dirty.
Preferably, the ultraviolet lamp is turned on before step S1 is performed, and step S1 is carried out in a darkroom environment.
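Assembling the (input signal, output signal) pairs of steps S1/S2 can be sketched as follows; the label encoding (0 = clean, 1 = dirty) is an assumption for illustration.

```python
# Sketch of building the learning samples per steps S1/S2. Each UV-lit
# tableware image is paired with its clean/dirty supervision signal;
# the 0/1 label encoding is an assumed convention.

def build_learning_samples(clean_images, dirty_images):
    """Pair each tableware image with its clean/dirty output signal."""
    samples = [(img, 0) for img in clean_images]   # clean tableware group
    samples += [(img, 1) for img in dirty_images]  # soiled tableware group
    return samples
```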
Training the deep neural network on clean and dirty tableware images together yields a network that can judge whether tableware is in a clean or a dirty state. With this method of analyzing the tableware contamination state using a trained deep neural network, the ultraviolet lamp of the analyzer, or any other ultraviolet environment providing module capable of supplying ultraviolet irradiation, provides an ultraviolet imaging environment for the camera. The camera photographs the tableware under ultraviolet irradiation, and the trained deep neural network need only judge whether the resulting image is a dirty-tableware image to determine whether the tableware is dirty, so whether the tableware surface is clean can be determined quickly and clearly.
The contamination analysis also extracts fluorescence features, by the tableware image fluorescence analysis method, from images that the neural network has judged to be dirty-tableware images. Some stains on tableware, such as grease residue, detergent residue, and fingerprints, are inconspicuous to the naked eye but show distinct fluorescence effects under ultraviolet irradiation, so an image with a fluorescence effect can be treated as an image of stained tableware, and the type of stain can be distinguished from the differing fluorescence effects represented by the extracted features. In addition, the extent of the fluorescence features reflects the degree of contamination: as shown in FIG. 4, after the control module has judged via the deep neural network that the tableware is dirty, the degree of contamination is graded, from small to large pixel proportion of the extracted fluorescence features in the image, as light, moderate, or severe contamination. The contamination state of the tableware is thereby analyzed, and the degree of contamination of the tableware surface can be determined quickly and clearly.
The fluorescence features in the tableware image fluorescence analysis method are extracted with an image segmentation algorithm, using a region-growing method, a threshold segmentation method, or a combination of the two. Specifically, the region-growing method takes the brightness of a known fluorescent region in the image as an initial value and marks all pixels in the image whose brightness lies within a predetermined range of that value as fluorescence features; the predetermined range is chosen for the specific image analysis, for example ±1 nit or ±2 nit around the initial value. The threshold segmentation method separates the fluorescence features of the image from the background at a preset gray value. Both segmentation algorithms can be implemented with existing algorithms from the prior art and are not described further here.
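Tying these pieces together, a dirty-tableware image can be graded end to end by threshold segmentation followed by the pixel-proportion rule; the gray threshold and grade cut-offs below are illustrative assumptions, not values fixed by the patent.

```python
import numpy as np

# End-to-end sketch: threshold-segment a grayscale UV image, then grade by
# the fluorescent pixel proportion. The gray threshold (180) and the grade
# cut-offs (5% and 20%) are assumptions for illustration.

def analyze_dirty_image(gray: np.ndarray, gray_threshold: int = 180) -> str:
    """Grade an image already judged dirty by the deep neural network."""
    mask = gray > gray_threshold          # threshold segmentation
    ratio = mask.sum() / mask.size        # fluorescent pixel proportion
    if ratio < 0.05:
        return "light contamination"
    if ratio < 0.20:
        return "moderate contamination"
    return "severe contamination"
```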
In this embodiment, the control module includes a computer-readable storage medium and a processor connected to each other; the storage medium stores a computer program which, when executed by the processor, implements the tableware contamination state analysis method, the training method of the deep neural network for tableware contamination state analysis, and the tableware image fluorescence analysis method described above.
Example two
In this embodiment, the shooting area of the tableware contamination state analyzer is changed from the semi-enclosed type of the first embodiment to an open type, and the darkroom environment for the camera is provided by the tableware storage area of the intelligent kitchen or another area where tableware is used. The contamination analysis process is the same as in the first embodiment and is not repeated here.

Claims (10)

CN202110033176.7A (priority date 2021-01-11, filing date 2021-01-11): Tableware contamination state analyzer, image fluorescence analysis method and storage medium, pending, published as CN114764772A (en)

Priority Applications (1)

CN202110033176.7A (priority date 2021-01-11, filing date 2021-01-11): Tableware contamination state analyzer, image fluorescence analysis method and storage medium

Publications (1)

CN114764772A (en), published 2022-07-19

Family

ID=82363983

Family Applications (1)

CN202110033176.7A (priority date 2021-01-11, filing date 2021-01-11), pending: CN114764772A (en)

Country Status (1)

CN: CN114764772A (en)

Citations (3)

* Cited by examiner, † Cited by third party

JP2007017376A* (priority 2005-07-11, published 2007-01-25), Ishikawajima Harima Heavy Ind Co Ltd: Fluorescence flaw detector and fluorescent flaw detection method
US20190239716A1* (priority 2018-02-02, published 2019-08-08), Dishcraft Robotics, Inc.: Intelligent Dishwashing Systems And Methods
CN112188190A* (priority 2020-09-30, published 2021-01-05), Guangdong Midea Kitchen Appliances Manufacturing Co Ltd: Stain detection method, cooking appliance, server, and storage medium


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
