Disclosure of Invention
In order to solve the technical problem of poor detection accuracy of the sintering condition of the shuttle kiln in the prior art, the invention aims to provide an image identification method of the sintering condition of the shuttle kiln.
The invention provides a shuttle kiln sintering condition image identification method, which comprises the following steps:
acquiring a flame image of the shuttle kiln to be detected during sintering in real time, and carrying out pretreatment operation on the flame image to obtain a to-be-treated area of a flame gray image;
carrying out edge detection processing on the to-be-processed area of the flame gray level image to obtain to-be-processed edge areas, and further obtaining communication areas corresponding to all edges in the to-be-processed edge areas;
determining furnace wall edge straight line significant coefficients, furnace wall edge chromaticity significant coefficients and furnace wall edge roughness significant coefficients corresponding to each edge according to the position of each pixel point in a connected domain corresponding to each edge in the edge region to be processed, and further determining the furnace wall edge significant coefficients corresponding to each edge;
performing corner point detection processing on each edge in the edge region to be processed to obtain a plurality of corner point target clusters corresponding to each edge, and further obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge;
determining a flame outer flame edge significant coefficient corresponding to each edge according to a plurality of corner point target clusters, a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge;
determining a flame segmentation edge significant coefficient corresponding to each edge according to the furnace wall edge significant coefficient and the flame outer flame edge significant coefficient corresponding to each edge, and further determining a target flame region image;
and determining the current sintering condition of the shuttle kiln to be detected according to the target flame area image and the pre-constructed and trained working condition detection neural network.
Further, the step of determining the furnace wall edge straight line significant coefficient, the furnace wall edge chroma significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge comprises the following steps:
determining a fitting straight line corresponding to each edge, the fitting goodness of the fitting straight line, a window area corresponding to each pixel point in the connected domain corresponding to each edge and a connected domain expansion area corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge area to be processed;
determining a fitting expansion area corresponding to each pixel point of the fitting straight line according to the fitting straight line corresponding to each edge, and further determining furnace wall edge straight line significant coefficients corresponding to each edge according to the goodness of fit of the fitting straight line corresponding to each edge, the fitting expansion area corresponding to each pixel point of the fitting straight line and each pixel point in a connected domain corresponding to each edge;
according to each pixel point in the window region corresponding to each pixel point in the connected domain corresponding to each edgeRGBAnd determining furnace wall edge chroma significant coefficients corresponding to each edge, and further determining the furnace wall edge roughness significant coefficients corresponding to each edge according to the gray values of all pixel points in the connected domain expansion region corresponding to each edge.
Further, the step of determining the linear saliency of the furnace wall edge for each edge includes:
counting the number of outlier pixels corresponding to each edge according to each pixel point in the fitting expansion region corresponding to each pixel point of the fitting straight line corresponding to each edge and each pixel point in the connected domain corresponding to each edge;
determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the number of pixel points in the connected domain corresponding to each edge, the number of outlier pixel points and the fitting goodness of the fitting straight line, wherein the calculation formula is as follows:
wherein,
for each edge a linear saliency coefficient of the furnace wall edge is assigned,
the goodness of fit for the corresponding fitted line for each edge,
afor the number of outlier pixels corresponding to each edge,
band the number of pixel points in the connected domain corresponding to each edge.
Further, the step of determining the furnace wall edge chroma significant coefficient corresponding to each edge comprises the following steps:
according to each pixel point in the window area corresponding to each pixel point in the connected domain corresponding to each edgeRGBDetermining a first target pixel point and a second target pixel point corresponding to each pixel point in a connected domain;
according to the first target pixel point and the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edgeRGBDetermining the primary color difference index of each pixel point in the connected domain corresponding to each edge;
and determining the median value of the primary color difference index of the connected domain corresponding to each edge according to the primary color difference index of each pixel point in the connected domain corresponding to each edge, and taking the median value of the primary color difference index as the furnace wall edge chroma significant coefficient corresponding to the corresponding edge.
Further, the calculation formula for determining the primary color difference index of each pixel point in the connected domain corresponding to each edge is as follows:
wherein,
ppfor the primary color difference index of each pixel point in the connected domain corresponding to each edge,
for the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
RValue of,
for the first target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
RValue of,
for the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
GValue of a step of,
for the first target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
GValue of,
for the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
BValue of,
for the first target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
BValue of,
in order to be a non-zero hyper-parameter,
max(-) is a function that takes the maximum value.
Further, the step of determining the furnace wall edge roughness significant coefficient corresponding to each edge comprises:
determining the energy value of each pixel point in the connected domain expansion area according to the gray value of each pixel point in the connected domain expansion area corresponding to each edge;
and calculating the energy value mean value of the connected domain expansion region corresponding to each edge according to the energy value of each pixel point in the connected domain expansion region corresponding to each edge, and taking the energy value mean value as the furnace wall edge roughness significant coefficient corresponding to the corresponding edge.
Further, the step of further obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge comprises:
determining a flame tip angle vertex angle area of each corner point target cluster corresponding to each edge according to the position of each corner point in a plurality of corner point target clusters corresponding to each edge, and determining a plurality of corner point distance indexes corresponding to each edge;
determining flame sharp corner valley bottom areas corresponding to any two adjacent flame sharp corner areas according to the position of each pixel point between any two adjacent flame sharp corner areas corresponding to each edge;
and determining the gliding gradient of the flame sharp angles corresponding to each edge according to the positions of the mass centers in any two adjacent flame sharp angle vertex areas corresponding to each edge and the positions of the mass centers in the flame sharp angle valley bottom areas corresponding to the edge.
Further, the calculation formula for determining the downward sliding gradient of the sharp angle of the plurality of flames corresponding to each edge is as follows:
wherein,
Sa plurality of flame sharp angles corresponding to each edge may be downsloped in a gradient,
is the ordinate of the mass center in the vertex angle area of any two adjacent flame sharp angles,
the horizontal coordinate of the mass center in the vertex angle area of any two adjacent flame sharp angles,
is the ordinate of the mass center in the vertex angle area of any two adjacent other flame sharp angles,
is the abscissa of the mass center in the vertex angle area of any two adjacent other flame sharp angles,
is the longitudinal coordinate of the mass center in the flame sharp angle valley bottom area corresponding to any two adjacent flame sharp angle apex angle areas,
is the abscissa of the mass center in the flame sharp corner valley bottom area corresponding to the vertex angle areas of any two adjacent flame sharp corners,
rounding up is performed for pairs.
Further, the calculation formula for determining the edge significant coefficient of the flame outer flame corresponding to each edge is as follows:
wherein,
OFEfor each edge corresponding to the out-flame edge saliency coefficient,
Nthe total number of corners within the target cluster of corners for each edge,
Sa plurality of flame sharp angles corresponding to each edge slide down in a gradient manner to form a series,
for each edge corresponds to
iThe distance index of each corner point is used,
the number of corner target clusters corresponding to each edge,
max(-) is a function that takes the maximum value.
Further, the step of determining the image of the target flame region further comprises:
calculating a flame segmentation edge significant coefficient mean value according to the flame segmentation edge significant coefficient corresponding to each edge, and taking the flame segmentation edge significant coefficient mean value as a flame segmentation edge threshold value;
if the flame segmentation edge significant coefficient corresponding to a certain edge is larger than the flame segmentation edge threshold value, judging that the edge is the segmentation edge of the flame outer flame and the furnace wall background, otherwise, judging that the edge is not the segmentation edge of the flame outer flame and the furnace wall background, and further obtaining a plurality of segmentation edges of the flame outer flame and the furnace wall background;
and determining a target flame area image according to a plurality of segmentation edges of the flame outer flame and the background of the furnace wall and the to-be-processed area of the flame gray level image.
The invention has the following beneficial effects:
the invention provides a shuttle kiln sintering condition image identification method, which utilizes a data identification technology to accurately identify a target flame area image in a flame image, takes the target flame area image as a reference image for detecting the sintering condition of a shuttle kiln, effectively improves the detection accuracy of the sintering condition of the shuttle kiln, and has lower detection cost; acquiring a flame image during sintering of the shuttle kiln to be detected in real time, carrying out pretreatment operation on the flame image to obtain a to-be-treated area of a flame gray image, and further obtaining a communication area corresponding to each edge in the to-be-treated area. In order to eliminate the influence caused by noise and external interference, preprocessing operation is carried out on the collected flame image, in addition, in order to facilitate the analysis of the subsequent steps and reduce the range of image identification, graying processing is carried out on the flame image, the flame area in the flame gray image is extracted as the area to be processed, the target flame area image can be accurately extracted subsequently, edge detection and connected domain analysis are carried out on the area to be processed of the flame gray image, and the connected domain corresponding to each edge in the edge area to be processed is obtained; and determining furnace wall edge straight line significant coefficients, furnace wall edge chromaticity significant coefficients and furnace wall edge roughness significant coefficients corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge region to be processed, and further determining the furnace wall edge significant coefficients corresponding to each edge. The furnace wall edge significant coefficient corresponding to each edge is constructed by mathematical modeling technology through the edge characteristics that the edge in the shuttle kiln wall background area is close to a straight line, the texture distribution of the edge is rough, the colors and the brightness of two sides of the edge are equal. The larger the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to the edge are, the larger the furnace wall edge significant coefficient corresponding to the edge is; and determining the outer flame edge significant coefficient corresponding to each edge according to the plurality of corner point target clusters, the plurality of flame corner angle gliding gradients and the plurality of corner point distance indexes corresponding to each edge. And performing corner detection on each edge based on the shape characteristics of the flame, wherein the larger the number of corner points of a corner point target cluster corresponding to the edge is, the higher the possibility that the edge is the flame outer flame edge is. The flame sharp angle gliding gradient refers to the slope from the vertex of two adjacent flame vertex angles to the valley bottom of the flame vertex angle, the larger the slope is, the more violent the flame combustion is represented, so that the gliding gradient of the flame sharp angles corresponding to each edge is determined, and whether the edge is the outer flame edge of the flame can be effectively judged; and determining the flame segmentation edge significant coefficient corresponding to each edge according to the furnace wall edge significant coefficient and the flame outer flame edge significant coefficient corresponding to each edge, and further determining the target flame area image. Each edge has a furnace wall edge significant coefficient and a flame outer flame edge significant coefficient corresponding to the edge, and the flame segmentation edge significant coefficient can be obtained by calculating the ratio of the furnace wall edge significant coefficient to the flame outer flame edge significant coefficient, wherein the smaller the furnace wall edge significant coefficient of the edge is, the larger the flame outer flame edge significant coefficient is, the larger the flame segmentation edge significant coefficient of the edge is, and the more the edge is represented as a segmentation edge between the flame outer flame and a furnace wall background. The segmentation edges between the flame outer flames and the furnace wall background can be screened from each edge in the edge area to be processed through the flame segmentation edge significant coefficient corresponding to each edge, and then the segmentation edges between the flame outer flames and the furnace wall background are utilized to obtain a target flame area image, the target flame area image can accurately represent the flame characteristic information during sintering of the shuttle kiln, the influence of other interference factors in the collected flame image is eliminated, and the more accurate target flame area image is obtained; and determining the current sintering condition of the shuttle kiln to be detected according to the target flame area image and the pre-constructed and trained working condition detection neural network. The target flame area image is used as an input image of the working condition detection neural network, so that the detection precision of the working condition detection neural network can be effectively improved, and the detection accuracy of the sintering working condition of the shuttle kiln is further improved.
Detailed Description
To further explain the technical means and effects of the present invention adopted to achieve the predetermined objects, the following detailed description of the embodiments, structures, features and effects of the technical solutions according to the present invention will be given with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The embodiment provides a shuttle kiln sintering condition image identification method, as shown in fig. 1, the method comprises the following steps:
(1) Acquiring a flame image of a to-be-detected shuttle kiln during sintering in real time, and carrying out pretreatment operation on the flame image to obtain a to-be-treated area of a flame gray image, wherein the method comprises the following steps of:
and (1-1) acquiring a flame image of the shuttle kiln to be detected during sintering in real time.
In the embodiment, the industrial camera is arranged right opposite to the shuttle kiln to be detected for producing ceramics to collect the flame image when the shuttle kiln to be detected is sintered, the flame area in the shuttle kiln to be detected is enabled to be in a more central position in the flame image by adjusting the shooting angle and the focusing multiple of the industrial camera, the industrial camera is utilized to obtain the flame image when the shuttle kiln to be detected is sintered in real time, and the flame image is visible lightRGBAnd (4) an image.
And (1-2) preprocessing the flame image during sintering of the shuttle kiln to be detected to obtain a to-be-processed area of the flame gray image.
In order to enhance the accuracy of the acquired flame image, the flame image during the sintering of the shuttle kiln to be detected is subjected to preprocessing operation, and the preprocessing operation can eliminate the influence caused by image noise and part of external interference factors. Specifically, gaussian filtering is adopted to respectively reduce noise of each channel of the flame image, that is, a gaussian function and the flame image are utilized to carry out convolution processing to eliminate random noise, and then graying operation is carried out on the flame image after gaussian filtering processing to obtain a flame grayscale image. Then, according to the priori knowledge, the position of the flame source of the shuttle kiln to be detected is fixed, the size of the flame area can be in a fixed range, the area where the flame in the flame gray level image is located is marked by a rectangular frame artificially, and the area is called as the area to be processed of the flame gray level image.
It should be noted that when the area to be treated is to be protected from the flame, the flame may be completely contained within the area to be treated, and there may be some background of the furnace wall in the area to be treated. Since the position of the industrial camera generally does not change, the coordinate position range of the to-be-processed region of the flame gray image is fixed, the schematic diagram of the to-be-processed region of the flame gray image is shown in fig. 2, the rectangular frame flame regions marked by 201 and 202 in the schematic diagram are both to-be-processed regions, and the embodiment can explain the two to-be-processed regions in the subsequent steps.
(2) Carrying out edge detection processing on a to-be-processed area of the flame gray level image to obtain a to-be-processed edge area, and further obtaining a connected domain corresponding to each edge in the to-be-processed edge area, wherein the method comprises the following steps of:
and (2-1) carrying out edge detection processing on the to-be-processed area of the flame gray level image to obtain the to-be-processed edge area.
It should be noted that, a main purpose of this embodiment is to accurately partition a flame region in a region to be processed of a flame grayscale image, where the flame region is a region that only includes an outer flame and an inner flame, that is, the outer flame with low flame brightness is partitioned from an oven wall background, and an edge corresponding to a connection between the outer flame and the oven wall background is found, so that edge detection processing needs to be performed on the region to be processed of the flame grayscale image.
Because the inner flame part of the flame has the highest brightness and is bright yellow, the outer flame part of the flame has lower brightness and is orange red, the difference between the outer flame and the inner flame of the flame is increased, and a clear boundary is formed, the inner flame area of the flame in the area to be processed can be accurately divided by the OTSU large body threshold segmentation method, and the inner flame area of the flame in the area to be processed is obtained. And performing edge detection on the to-be-processed area by utilizing a canny edge detection algorithm according to the to-be-processed area of the flame gray level image to obtain the to-be-processed edge area of the flame gray level image, wherein the to-be-processed edge area is a binary image. Both the OTSU tsu threshold segmentation method and the canny edge detection algorithm are prior art, and are not in the protection scope of the present invention, and are not elaborated herein.
And (2-2) obtaining a connected domain corresponding to each edge in the edge region to be processed according to the edge region to be processed.
Due to the irregularity of the shape and color presented by the flame, the rough inner wall of the shuttle kiln and the existence of many speckles, a plurality of edges can be detected in the area to be processed, the edges can be called as edges to be distinguished, and the edges to be distinguished can be divided into three categories, namely: the edges of the flame inner flame and the flame inner flame, the edges of the flame outer flame and the furnace wall background and the edges generated by the furnace wall background.
In order to prevent the edge at the joint of the inner flame and the outer flame from interfering with the extraction of the accurate flame image, the edge formed by connecting the inner flame and the outer flame is removed according to the position information of each edge pixel point in the inner flame area of the area to be processed, so that the number of the subsequent edges to be analyzed is reduced, and the efficiency of flame image identification is improved. In order to make each edge in the edge area to be processed clearer and more accurate, the edge in the edge area to be processed is firstly subjected to opening operation, so that the independence of each edge is improved, then the closing operation is continued, and the same type of edge which is not clear and causes fracture is connected to obtain each independent edge in the edge area to be processed. Then, performing connected domain analysis on each edge in the edge region to be processed by using a connected domain algorithm to obtain a connected domain corresponding to each edge in the edge region to be processed, wherein each edge has a corresponding connected domain, and the connected domain corresponding to each edge can be analyzed in the subsequent steps. The implementation processes of the open operation, the close operation and the connected domain algorithm are all the prior art, are out of the protection scope of the invention, and are not elaborated herein.
(3) And determining furnace wall edge straight line significant coefficients, furnace wall edge chromaticity significant coefficients and furnace wall edge roughness significant coefficients corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge region to be processed, and further determining the furnace wall edge significant coefficients corresponding to each edge.
Firstly, determining furnace wall edge straight line significant coefficients, furnace wall edge chromaticity significant coefficients and furnace wall edge roughness significant coefficients corresponding to each edge according to the position of each pixel point in a connected domain corresponding to each edge in an edge region to be processed, wherein the furnace wall edge straight line significant coefficients, the furnace wall edge chromaticity significant coefficients and the furnace wall edge roughness significant coefficients comprise the following steps:
and (3-1) determining a fitting straight line corresponding to each edge, the fitting goodness of the fitting straight line, a window area corresponding to each pixel point in the connected domain corresponding to each edge and a connected domain expansion area corresponding to each edge according to the position of each pixel point in the connected domain corresponding to each edge in the edge area to be processed.
In the embodiment, the edge areas to be processed are based on each connected domain corresponding to each edgeFitting each pixel point in the connected domain corresponding to each edge according to the position of each pixel point to obtain a fitting straight line corresponding to each edge, and obtaining the fitting goodness of the fitting straight line corresponding to each edge according to the fitting straight line corresponding to each edge and recording the fitting goodness as

. And constructing a sliding window with the size of 5*5 by taking each pixel point in the connected domain corresponding to each edge in the edge region to be processed as a central point, and sliding the sliding window on the connected domain corresponding to each edge to obtain the sliding window corresponding to each pixel point in the connected domain. And taking the sliding window corresponding to each pixel point as a window area, wherein each pixel point in the connected domain has the window area corresponding to the sliding window, and the area formed by all the pixel points in each sliding window becomes a connected domain extended area, and each edge has the connected domain extended area corresponding to the connected domain extended area. The processes of constructing the sliding window, fitting the straight line and determining the goodness of fit are all prior art and are not within the scope of the invention, and are not described in detail herein.
And (3-2) determining a fitting expansion area corresponding to each pixel point of the fitting straight line according to the fitting straight line corresponding to each edge, and further determining furnace wall edge straight line significant coefficients corresponding to each edge according to the goodness of fit of the fitting straight line corresponding to each edge, the fitting expansion area corresponding to each pixel point of the fitting straight line and each pixel point in a connected domain corresponding to each edge.
In this embodiment, a window with a size of 3*3 is created with each pixel point on the fitted straight line corresponding to each edge as a center through the fitted straight line corresponding to each edge obtained in step (3-1), and a window area corresponding to each pixel point on the fitted straight line is referred to as a fitted extended area.
Determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the goodness of fit of the fitted straight line corresponding to each edge and the fitted extended area corresponding to each pixel point of the fitted straight line, wherein the furnace wall edge straight line significant coefficient corresponding to each edge comprises the following steps:
(3-2-1) counting the number of outlier pixels corresponding to each edge according to each pixel point in the fitting expansion region corresponding to each pixel point of the fitting straight line corresponding to each edge and each pixel point in the connected domain corresponding to each edge.
In this embodiment, based on each pixel point in the fitting extension region and each pixel point in the connected domain corresponding to each pixel point of the fitting straight line, the number of the pixel points in the connected domain is recorded asbCounting the number of pixel points corresponding to each edge in the connected domain but not in the fitting expansion region, calling the pixel points as outlier pixel points, and recording the number of the outlier pixel points asa。
And (3-2-2) determining the furnace wall edge straight line significant coefficient corresponding to each edge according to the number of pixel points in the connected domain corresponding to each edge, the number of outlier pixel points and the goodness of fit of the fit straight line.
Firstly, it should be noted that the shuttle kiln wall has a compact structure, and is mostly constructed by regular vertical or horizontal objects, so that the edge of the shuttle kiln wall background is relatively close to a straight line, and in order to facilitate subsequent determination of the edge of the furnace wall background in a plurality of edges to be resolved, in this embodiment, based on the above analysis, according to the number of pixels in a connected domain corresponding to each edge, the number of outlier pixels, and the goodness of fit of a fitted straight line, the furnace wall edge straight line significant coefficient corresponding to each edge is determined through mathematical modeling, and the calculation formula is as follows:
wherein,
for each edge a linear saliency coefficient of the furnace wall edge is assigned,
the goodness of fit for the corresponding fitted line for each edge,
afor the number of outlier pixels corresponding to each edge,
band the number of pixel points in the connected domain corresponding to each edge.
When the goodness of fit of the fitting straight line corresponding to any edge is larger and the number of the outlier pixel points is smaller, the furnace wall edge straight line significant coefficient corresponding to the edge is larger, and when the goodness of fit of the fitting straight line corresponding to any edge is smaller and the number of the outlier pixel points is larger, the furnace wall edge straight line significant coefficient corresponding to the edge is smaller.
(3-3) according to each pixel point in the window area corresponding to each pixel point in the connected domain corresponding to each edgeRGBAnd determining furnace wall edge chroma significant coefficients corresponding to each edge, and further determining the furnace wall edge roughness significant coefficients corresponding to each edge according to the gray value of each pixel point in the connected domain expansion area corresponding to each edge, wherein the steps comprise:
(3-3-1) according to each pixel point in the window area corresponding to each pixel point in the connected domain corresponding to each edgeRGBAnd (4) determining the furnace wall edge chroma significant coefficient corresponding to each edge.
It should be noted that the inner wall and the outer wall of the shuttle kiln are artificially and specially constructed to show obvious color difference, the inner wall of the shuttle kiln is greatly influenced by flame illumination, and the image shows bright orange yellow, while the outer wall of the shuttle kiln is slightly influenced by flame illumination, and the image shows obviously dull orange. Based on the analysis, the furnace wall background edges can be distinguished by calculating the primary color difference of the pixel points, and then the furnace wall edge chromaticity significant coefficient corresponding to each edge is determined, and the method comprises the following steps:
(3-3-1-1) according to each pixel point in the window region corresponding to each pixel point in the connected domain corresponding to each edgeRGBAnd determining a first target pixel point and a second target pixel point corresponding to each pixel point in the connected domain.
In this embodiment, based on the window region corresponding to each pixel point in the connected domain corresponding to each edge obtained in step (3-1), it may be determined that each pixel point in the window region corresponding to each pixel point in the connected domain corresponds toThe position of (a). Because the flame image of the shuttle kiln to be detected during sintering is visible lightRGBThe image is obtained by detecting the position of each pixel point in the window area in the flame image during sintering of the shuttle kilnRGBThe value is obtained. Selecting each pixel point in the window area corresponding to each pixel point from each pixel point in the window area corresponding to each pixel point in the connected domainRValue (c),GValue andBpixel points with larger values andRvalue (c),GValue andBpixel points with smaller values are arranged in the window areaRValue (c),GValue andBthe pixel points with larger values are used as second target pixel points, and the window area is internally provided withRValue (c),GValue andBand the pixel points with smaller values are used as first target pixel points, and each pixel point in the connected domain has a corresponding first target pixel point and a corresponding second target pixel point.
It should be noted that, in this embodiment, the first target pixel point and the second target pixel point corresponding to each pixel point are pixel points having a characteristic property in the window region, and the accuracy of subsequently determining the primary color difference index can be improved by determining the first target pixel point and the second target pixel point corresponding to each pixel point in the connected domain.
(3-3-1-2) according to the first target pixel point and the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edgeRGBAnd determining the primary color difference index of each pixel point in the connected domain corresponding to each edge.
In this embodiment, the first target pixel point corresponding to each pixel point in the connected domain is calculatedRGBOf value and second target pixel pointRGBThe difference of value, each pixel in the connected domain all has 3 primary color difference values, selects the maximum value in 3 primary color difference values as the primary color difference index of corresponding pixel, and the computational formula of primary color difference index is:
wherein,
ppfor each edge pairThe primary color difference index of each pixel point in the corresponding connected domain,
for the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
RValue of,
for the first target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
RValue of,
for the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
GValue of a step of,
for the first target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
GValue of,
for the second target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
BValue of,
for the first target pixel point corresponding to each pixel point in the connected domain corresponding to each edge
RGBOf value
BValue of a step of,
in order to be a non-zero hyper-parameter,
max(-) is a function that takes the maximum value.
It should be noted that, in the following description,
is a non-zero hyper-parameter, the function of which is to adjust the value range of the function, the empirical value is 255, and each edge corresponds to a connected domainEach pixel point has a corresponding primary color difference index, and the larger the primary color difference index is, the higher the significance of the subsequently determined edge chromaticity is.
(3-3-1-3) determining the median value of the primary color difference index of the connected domain corresponding to each edge according to the primary color difference index of each pixel point in the connected domain corresponding to each edge, and taking the median value of the primary color difference index as the furnace wall edge chroma significant coefficient corresponding to the corresponding edge.
In this embodiment, based on the primary color difference index of each pixel point in the connected domain corresponding to each edge obtained in step (3-3-1-2), a median value of the primary color difference index of each pixel point in the connected domain is calculated, where the median value is also called a median, specifically, the primary color difference indexes of each pixel point are arranged in order of magnitude to form a sequence, the primary color difference index at the middle position of the sequence is the furnace wall edge chroma significant coefficient, and the calculation formula of the furnace wall edge chroma significant coefficient is as follows:
wherein,
for each edge the corresponding furnace wall edge chroma significant coefficient,
the array formed by the primary color difference indexes of all the pixel points in the connected domain corresponding to each edge,
for the number of primary color difference indexes of each pixel point in the connected domain corresponding to each edge,
is a median function.
It should be noted that the difference between the two sides of the edge corresponding to the stripes at the rough inner wall of the shuttle kiln is obvious, and the difference between the edge formed by the flame outer flame and the background of the furnace wall is not obvious. When the primary color difference index in the connected domain corresponding to a certain edge is larger, the furnace wall edge chroma significant coefficient Δ c corresponding to the edge is larger, and the more the furnace wall edge chroma significant coefficient Δ c is larger, the more the furnace wall edge chroma significant coefficient is impossible to be the segmentation edge of the flame outer flame and the furnace wall background.
And (3-3-2) determining the furnace wall edge roughness significant coefficient corresponding to each edge according to the gray value of each pixel point in the connected domain expansion region corresponding to each edge.
Firstly, it should be noted that, the shuttle kiln is provided with a smoke abatement device, the flame outer flame can generate dust but is less affected by smoke dust, each edge in the flame is layered uniformly, and the flame outer flame and the vicinity of the division edge formed by the furnace wall background are smooth. The texture distribution of the inner wall of the shuttle kiln is rough, the vicinity of each edge of the furnace wall background is also rough, and the significance degree of the roughness of the furnace wall edge corresponding to each edge is determined by analyzing the texture characteristics in the communication domain expansion area corresponding to each edge, wherein the method comprises the following steps:
(3-3-2-1) determining the energy value of each pixel point in the connected domain expansion region according to the gray value of each pixel point in the connected domain expansion region corresponding to each edge.
In this embodiment, based on the connected component expanded region corresponding to each edge obtained in step (3-1), the gray value of each pixel point in the connected component expanded region corresponding to each edge in the flame gray image is obtained. In the connected domain expansion area corresponding to each edge, using a Laws texture measurement method to obtain the energy value of each pixel point in the connected domain expansion area, and recording the energy value of each pixel point in the connected domain expansion area corresponding to each edge as the energy value
IAre respectively a
,
,…,
Wherein
And the number of pixel points in the connected domain expansion region corresponding to each edge is determined. The implementation of the Laws texture measurement method is prior art and is not within the scope of the present invention, and will not be described in detail herein.
(3-3-2-2) calculating the mean value of the energy values of the connected domain expansion regions corresponding to the edges according to the energy values of the pixels in the connected domain expansion regions corresponding to the edges, and taking the mean value of the energy values as the furnace wall edge roughness significant coefficients corresponding to the corresponding edges.
In this embodiment, the energy value of each pixel point in the connected domain expansion region corresponding to each edge is obtained based on the step (3-3-2-1)
,
,…,
Calculating the energy mean value of the connected domain expansion region corresponding to each edge, taking the energy mean value as the furnace wall edge roughness significant coefficient, and calculating the furnace wall edge roughness significant coefficient corresponding to each edge according to the following calculation formula:
wherein,
a furnace wall edge roughness significant factor for each edge,
extension of connected component corresponding to each edgeThe number of pixels in the region is,
is a non-zero value of the number,
the first in the connected domain expansion area corresponding to each edge
nThe energy value of each pixel point.
It should be noted that, when the texture distribution of the inner wall of the shuttle kiln to be detected is rougher, the energy average value of the connected domain expansion area corresponding to the edge is larger, that is, the furnace wall edge roughness significant coefficient corresponding to the edge is larger
The larger the furnace wall edge roughness coefficient
The larger the signature the less likely the edge is to be a separate edge of the flame envelope from the background of the furnace wall.
Thus, the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge are obtained, and the furnace wall edge significant coefficient corresponding to each edge is calculated according to the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge.
In this embodiment, the significance of each edge as the furnace wall background edge is analyzed from three angles, and a probability indicator of each edge as the furnace wall background edge is determined, which is the furnace wall edge significance coefficient in this embodiment. Specifically, the furnace wall edge straight line significant coefficient, the furnace wall edge chromaticity significant coefficient and the furnace wall edge roughness significant coefficient corresponding to each edge are multiplied, and the multiplied numerical value is used as the furnace wall edge significant coefficient of the corresponding edge, and the calculation formula is as follows:
wherein,
FWEfor each edge a corresponding furnace wall edge saliency coefficient,
for each edge a linear saliency coefficient of the furnace wall edge is assigned,
for each edge the corresponding furnace wall edge chroma significant coefficient,
the furnace wall edge roughness significant factor corresponding to each edge.
It should be noted that the linear significance coefficient of the furnace wall edge at a certain edge
Greater, significant coefficient of furnace wall edge chroma
The larger the furnace wall edge chroma is, the significant coefficient
The larger the edge, the significant coefficient of the furnace wall edge of the edge
FWEThe larger, i.e. the more likely it is that the edge is the edge of the furnace wall background.
(4) And carrying out corner point detection processing on each edge in the edge region to be processed to obtain a plurality of corner point target clusters corresponding to each edge, and further obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge.
In this embodiment, each edge in the edge region to be processed is subjected to corner detection processing to obtain a plurality of corners corresponding to each edge, the plurality of corners corresponding to each edge are clustered by using a DBSCAN clustering algorithm with a radius of 3, each edge forms a plurality of corner clusters, and the implementation processes of corner detection and DBSCAN clustering algorithms are both the prior artHowever, it is not within the scope of the present invention and will not be described in detail herein. Selecting corner clusters with the number of corners larger than 2 from a plurality of corner clusters corresponding to each edge, and calling the corner clusters with the number of corners larger than 2 corresponding to each edge as corner target clusters, thereby obtaining the corner clusters corresponding to each edge

The method comprises the following steps of obtaining a plurality of flame sharp corner gliding gradients and a plurality of corner point distance indexes corresponding to each edge according to a plurality of corner point target clusters corresponding to each edge, wherein the steps comprise:
and (4-1) determining a flame tip angle vertex angle area of each corner point target cluster corresponding to each edge according to the position of each corner point in the plurality of corner point target clusters corresponding to each edge, and determining a plurality of corner point distance indexes corresponding to each edge.
(4-1-1) determining a flame sharp corner vertex angle area of each corner point target cluster corresponding to each edge, wherein the flame in a flame image in the sintering process of the shuttle kiln is not in a regular shape, a plurality of flame sharp corners can appear at the top of the flame, the corner can have a plurality of corner points, and the more the number of the corner points in the corner point target clusters is, the higher the combustion degree of the flame sharp corners corresponding to the corner point target clusters is. Based on the analysis, the approximate position of the corresponding flame tip angle can be obtained from the position of each corner point in the corner point target cluster, the convex hull of each corner point target cluster is determined according to the position of each corner point in each corner point target cluster corresponding to each edge, and the convex hull region of each corner point target cluster corresponding to each edge is used as the flame tip angle vertex angle region. The process of determining the convex hull is prior art and is not within the scope of the present invention and will not be described in detail herein.
(4-1-2) determining a plurality of corner distance indexes corresponding to each edge, calculating the distance between every two adjacent corners in each corner target cluster corresponding to each edge according to the position of each corner in the plurality of corner target clusters corresponding to each edge, obtaining each corner distance of each corner target cluster corresponding to each edge, further calculating the corner distance mean value of each corner target cluster corresponding to each edge based on each corner distance of each corner target cluster and the number of corner distances of each corner target cluster, taking the corner distance mean value as a corner distance index, wherein each corner target cluster has a corresponding corner distance index, and therefore each edge can correspond to a plurality of corner distance indexes.
And (4-2) determining flame sharp corner valley bottom areas corresponding to any two adjacent flame sharp corner vertex angle areas according to the positions of all pixel points between any two adjacent flame sharp corner vertex angle areas corresponding to each edge.
In this embodiment, at least one region similar to the valley bottom is determined to exist between two adjacent flame tip angle vertex angle regions corresponding to the edges, based on the position of each pixel point between any two adjacent flame tip angle vertex angle regions corresponding to each edge, a pixel point with the largest vertical coordinate of each pixel point in the flame edge image between any two adjacent flame tip angle vertex angle regions is selected, and the vertical coordinate of the pixel point is recorded asvAnd if a plurality of pixels with the maximum vertical coordinates exist, selecting all the pixels. Traversing the longitudinal coordinate interval of the vertex angle areas of two adjacent flame vertex angles by taking the selected pixel point with the maximum longitudinal coordinate as a starting pointv-w,v]The edge pixel point connected to the starting point,wis 10, a convex hull is constructed by the location of these edge pixel points, and this convex hull region is called the flame tip angle valley floor region.
And (4-3) determining the downward sliding gradient of the flame sharp angles corresponding to each edge according to the positions of the mass centers in any two adjacent flame sharp angle vertex areas corresponding to each edge and the positions of the mass centers in the flame sharp angle valley bottom areas corresponding to each edge.
In this embodiment, based on any two adjacent flame sharp corner vertex angle areas corresponding to each edge and the flame sharp corner valley bottom area corresponding to the flame sharp corner vertex angle areas, distance disturbance and angle disturbance are set, and the positions of the centroids in any two adjacent flame sharp corner vertex angle areas corresponding to each edge and the positions of the centroids in the flame sharp corner valley bottom areas corresponding to the flame sharp corner areas are obtained by using an OpenCV (Open Source Computer Vision Library) recognition technology. According to the position of the barycenter in two arbitrary adjacent flame closed angle apex angle regions that every edge corresponds and the position of the barycenter in the flame closed angle valley bottom region that corresponds, confirm the slope between two arbitrary adjacent flame closed angle apex angle regions and the flame closed angle valley bottom region that corresponds respectively, also calculate a plurality of flame closed angle glide gradients that every edge corresponds, its computational formula is:
wherein,
Sa plurality of flame sharp angle downslide gradients for each edge,
is the ordinate of the mass center in the vertex angle area of any two adjacent flame sharp angles,
the abscissa of the mass center in the vertex angle area of any two adjacent flame sharp angles,
is the ordinate of the mass center in the vertex angle area of any two adjacent other flame sharp angles,
is the abscissa of the mass center in the vertex angle area of any two adjacent other flame sharp angles,
is the longitudinal coordinate of the mass center in the flame sharp angle valley bottom area corresponding to any two adjacent flame sharp angle apex angle areas,
is the abscissa of the mass center in the flame sharp corner valley bottom area corresponding to any two adjacent flame sharp corner apex angle areas,
rounding up is performed for pairs.
It should be noted that, two adjacent flame sharp corner vertex angle regions corresponding to the edges and the corresponding flame sharp corner valley bottom region can obtain a flame sharp corner downward-sliding gradient, if the edges correspond to each otherlA flame tip corner region, the edge corresponding tol1 flame tip glide gradient, each edge corresponding to a plurality of flame tip glide gradients. When the difference between the longitudinal coordinate of the mass center in the flame sharp angle vertex angle area and the longitudinal coordinate of the mass center in the flame sharp angle valley bottom area is larger, the downward sliding gradient of the flame sharp angle is larger, and when the difference between the longitudinal coordinate of the mass center in the flame sharp angle vertex angle area and the longitudinal coordinate of the mass center in the flame sharp angle valley bottom area is smaller, the downward sliding gradient of the flame sharp angle is smaller.
(5) And determining the flame outer flame edge significant coefficient corresponding to each edge according to the plurality of corner point target clusters corresponding to each edge, the plurality of flame sharp angle gliding gradients and the plurality of corner point distance indexes.
In this embodiment, the total number of angular points in a plurality of angular point target clusters corresponding to each edge, a number series composed of a plurality of flame sharp-angle gliding gradients, and a plurality of angular point distance indexes are obtained, and a data modeling method is used to construct a flame outer flame edge significant coefficient corresponding to each edge, where there are two cases when constructing the flame outer flame edge significant coefficient, one is that there is an angular point target cluster corresponding to an edge, and the other is that there is no corresponding angular point target cluster at an edge, and the calculation formula is:
wherein,
OFEfor each edge corresponding to the out-flame edge saliency coefficient,
Nthe total number of corners within the target cluster of corners for each edge,
Sa plurality of flame sharp angles corresponding to each edge slide down in a gradient manner to form a series,
for each edge corresponds to
iThe distance index of each corner point is used,
the number of corner target clusters corresponding to each edge,
max(-) is a function that takes the maximum value.
It should be noted that, when a certain edge corresponds to the total number of corner points in the target cluster of corner pointsNWhen the distance index of the corner points is more and the downward sliding gradient of the sharp corner of the flame is more, the obvious coefficient of the outer flame edge corresponding to the edge is largerOFEThe larger the edge, the more likely the edge is to be a dividing edge between the flame envelope and the furnace wall background.
(6) And determining a flame segmentation edge significant coefficient corresponding to each edge according to the furnace wall edge significant coefficient and the flame outer flame edge significant coefficient corresponding to each edge, and further determining a target flame region image.
In this embodiment, the flame segmentation edge significant coefficient corresponding to each edge is determined by calculating the ratio of the furnace wall edge significant coefficient corresponding to each edge to the flame outer flame edge significant coefficient, and the calculation formula is as follows:
wherein,FSEdividing the edge saliency coefficient for each edge's corresponding flame,OFEfor each edge corresponding to a flame out edge saliency coefficient,FEWthe furnace wall edge saliency factor for each edge.
It should be noted that each edge of the edge region to be processed has its corresponding flame segmentation edge significant coefficient, and when the flame outer flame edge significant coefficient of a certain edgeOFEGreater, significant coefficient of furnace wall edgeOFEThe smaller the flame split edge saliency of the edgeFSEThe larger the edge is, the more likely it is to be a dividing edge between the flame envelope and the background of the furnace wall.
After obtaining the flame segmentation edge significant coefficient corresponding to each edge, determining a target flame area image according to the flame segmentation edge significant coefficient corresponding to each edge, wherein the method comprises the following steps:
and (6-1) calculating a flame segmentation edge significant coefficient mean value according to the flame segmentation edge significant coefficient corresponding to each edge, and taking the flame segmentation edge significant coefficient mean value as a flame segmentation edge threshold value.
In order to facilitate the subsequent screening of the segmentation edges between the flame outer flame and the furnace wall background from the edges to be distinguished, the flame segmentation edge threshold value needs to be determined through the flame segmentation edge significant coefficient corresponding to each edge, is determined through the mean value self-adaption of the flame segmentation edge significant coefficient corresponding to each edge, is higher in referential performance, and is more beneficial to obtaining the accurate segmentation edges of the flame outer flame and the furnace wall background subsequently.
(6-2) if the flame segmentation edge significant coefficient corresponding to a certain edge is greater than the flame segmentation edge threshold value, judging that the edge is the segmentation edge of the flame outer flame and the furnace wall background, otherwise, judging that the edge is not the segmentation edge of the flame outer flame and the furnace wall background, and further obtaining a plurality of segmentation edges of the flame outer flame and the furnace wall background.
When a certain edge corresponds to the flame segmentation edge significance coefficient
FSEGreater than flame cut edge threshold
When the edge is taken as the dividing edge of the flame outer flame and the furnace wall background, when the flame dividing edge corresponding to a certain edge has a significant coefficient
FSELess than flame cut edge threshold
This edge is then designated as the disturbance edge, i.e. the edge is not the dividing edge between the flame envelope and the background of the furnace wall. In this way, a plurality of divided edges of the flame envelope and the furnace wall background in the edge region to be treated can be obtained.
And (6-3) determining a target flame area image according to a plurality of segmentation edges of the flame outer flame and the background of the furnace wall and the to-be-processed area of the flame gray level image.
In the embodiment, based on the plurality of segmentation edges of the flame outer flame and the furnace wall background in the edge region to be processed obtained in the step (4-2) and the region to be processed of the flame gray scale image in the step (1-2), the flame outer flame region in the region to be processed of the flame gray scale image is extracted by using the plurality of segmentation edges of the flame outer flame and the furnace wall background in the edge region to be processed, the flame outer flame region includes the flame inner flame region, and the flame outer flame region in the region to be processed is taken as the target flame region image.
(7) And determining the current sintering condition of the shuttle kiln to be detected according to the target flame area image and the pre-constructed and trained working condition detection neural network.
Inputting the target flame area image into a pre-constructed and trained working condition detection neural network, and outputting the current sintering working condition of the shuttle kiln to be detected, wherein the sintering working condition of the embodiment has four stages: a low-temperature preheating stage at normal temperature of 300-300 ℃, an oxidative decomposition stage at 300-950 ℃, a high-temperature sintering stage at 950-1300 ℃ and a heat preservation stage at about 1300 ℃, wherein the normal temperature can be set by an implementer according to the local specific actual conditions.
A convolutional neural Network ResNet (deep Residual error Network) is adopted in the framework of the working condition detection neural Network, an AdaGrad algorithm is adopted in the optimization algorithm, the loss function is a cross entropy loss function, and the training image set is a plurality of target flame area images. The construction and training process of the working condition detection neural network is the prior art, is out of the protection scope of the invention, and is not elaborated herein.
The acquisition mode of the training image set may be: shooting a flame image of a sintering area of the ceramic shuttle kiln by using an industrial camera to obtain a flame image sample by adoptingKThe thermocouple and the infrared thermometer are combined to measure the corresponding internal temperature of the kiln. Shooting flame images from 50 ℃, shooting flame images in a group of kilns when the temperature rises to 5 ℃, wherein the number of the flame images in the group of kilns is about 20, and shooting is finished until 1300 ℃, and acquiring 5000 more shuttle kilns corresponding to corresponding temperaturesAnd (3) obtaining 250 groups of image data samples with different temperature values by flame images. To prevent overfitting of the network and to improve the accuracy of the training effect, the data size of the training image set used for network training should be as large as possible. However, the process of manually collecting the flame image is complicated and is easily interfered by the smoke of the kiln. Therefore, to obtain a sufficient training image set, a data enhancement method is used to enlarge the amount of data. According to the practical situation of image acquisition, 5 data enhancement methods of noise increase, image brightness increase, rotation, turnover and scaling are selected, 250 groups of image data samples with different temperature values are subjected to data processing, the number of the image data samples in each group is expanded to 100 pictures, and the total number of a training image set is 25000 multiple flame images. Based on the collected 25000 multiple flame images, by utilizing an image recognition technology, referring to the process of determining the target flame region images in the steps (1) to (6), obtaining the target flame region images corresponding to the 25000 multiple flame images, and taking the target flame region images as training samples of the working condition detection neural network.
Therefore, the method and the device realize the detection of the current sintering condition of the shuttle kiln to be detected by using a data identification technology, namely determine the stage of the four temperature stages of the current sintering condition of the shuttle kiln to be detected, and effectively improve the accuracy of the detection of the sintering condition of the shuttle kiln by determining the target flame area image.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.