Disclosure of Invention
The embodiments of the application provide an edge-computing-enabled real-time face quality filtering method and system that solve the prior-art problem of uneven face image quality in complex environments such as varying illumination and occlusion, caused by poor adaptability. The embodiments achieve the technical effects of dynamically adjusting the image screening level according to the real-time running state, optimizing the filtering strategy, and improving both the precision of face image quality filtering and the adaptability of the system.
An embodiment of the application provides an edge-computing-enabled real-time face quality filtering method, which comprises the following steps: acquiring face images in real time through an edge device, and obtaining a baseline filtering strategy preset on the edge device;
performing quality evaluation and screening on the face images according to the baseline filtering strategy;
monitoring the running state of the current image screening level on the edge device, and adjusting the current image screening level according to the running state;
determining whether the running state of the current image screening level meets a preset event trigger condition;
when the event trigger condition is met, generating an event report containing context metadata and sending the event report to a central management platform;
adjusting the baseline filtering strategy according to the received event report containing the context metadata, and generating an adjusted baseline filtering strategy;
and issuing the adjusted baseline filtering strategy to the edge device, where the edge device receives the adjusted baseline filtering strategy and uses it to replace the original baseline filtering strategy.
Further, the step of performing quality evaluation and screening on the face image includes:
preprocessing the acquired face image;
obtaining feature information of the face image based on the preprocessed face image;
obtaining a standard value of the face image based on the feature information of the face image;
comparing the standard value of the face image with the standard value corresponding to the current image screening level in the baseline filtering strategy;
if the standard value of the face image is not smaller than the standard value corresponding to the current image screening level, retaining the face image and marking its image screening level as the current level;
and if the standard value of the face image is smaller than the standard value corresponding to the current image screening level, placing the face image under review.
Further, the step of obtaining a standard value of the face image based on the feature information of the face image includes:
obtaining the standard value of the face image through a standard-value calculation formula;
the standard-value calculation formula of the face image is as follows:
Q = ω1 · (1 − B) + ω2 · (1 − |I − Imid| / Imax) + ω3 · (1 − O) + ω4 · C
where Q is the standard value of the face image, B is the blurriness of the image, I is the illumination intensity of the image, Imid is the desired intermediate illumination intensity, Imax is the maximum illumination intensity, O is the occlusion degree of the image, C is the face integrity of the image, and ω1, ω2, ω3, ω4 are the weight coefficients of blurriness, illumination intensity, occlusion degree, and face integrity, respectively, in the standard-value calculation.
Further, the step of reviewing the face image includes:
temporarily buffering the face images whose standard values are smaller than the standard value corresponding to the current image screening level;
reviewing the face images in the temporary buffer according to a preset review period;
during the review, if the current image screening level is found to have been adjusted, re-evaluating the temporarily buffered face images;
if the standard value of a re-evaluated face image is not smaller than the standard value corresponding to the adjusted image screening level, moving the face image out of the temporary buffer for normal use;
and if the standard value of a re-evaluated face image is smaller than the standard value corresponding to the adjusted image screening level, discarding the face image.
Further, the step of monitoring the running state of the current image screening level on the edge device and adjusting the current image screening level according to the running state includes:
setting a preset screening time window, and counting the number of face images that meet the filtering standard corresponding to the current screening level within the preset screening time window;
if no image meeting the filtering standard corresponding to the current screening level has been screened out by the end of the preset screening time window, downgrading the current image screening level;
and if images meeting the filtering standard corresponding to the current screening level have been screened out by the end of the preset screening time window, upgrading the current image screening level.
Further, the step of determining whether the running state of the current image screening level meets a preset event trigger condition includes:
setting an event monitoring period, and obtaining a stability index of the image screening level within the event monitoring period;
if the stability index of the image screening level is not smaller than a preset stability threshold, determining that the preset event trigger condition is not met;
and if the stability index of the image screening level is smaller than the preset stability threshold, determining that the preset event trigger condition is met.
Further, the step of obtaining the stability index of the image screening level within the event monitoring period includes:
obtaining the number of changes of the image screening level within the event monitoring period and the duration for which the current image screening level has continuously remained in a low-level state;
obtaining the stability index of the current image screening level through a stability-index calculation formula;
the stability-index calculation formula is as follows:
S = 1 − α · (Nchange / Tcycle) − (1 − α) · (Tlow / Tcycle)
where S is the stability index, Nchange is the number of changes of the current image screening level, Tcycle is the duration of the event monitoring period, Tlow is the duration continuously spent in the low-level state, and α is a weight coefficient.
Further, the step of generating an event report containing the context metadata and sending the event report to the central management platform includes:
when the image screening level is determined to meet the event trigger condition, collecting context metadata at the time the event is triggered;
structuring the collected context metadata according to a preset data format to generate structured data;
integrating the structured data into a data structure to generate a structured event report;
first storing the generated structured event report on the edge device, and then sending the stored event report from the edge device to the central management platform;
if the sending fails, marking the event report as undelivered and resending it according to a preset retransmission strategy;
if the sending succeeds, generating an instruction to clear the event report from the edge device, and generating log information at the same time;
and storing the generated log information on the edge device.
Further, the step of adjusting the baseline filtering strategy and generating an adjusted baseline filtering strategy includes:
the central management platform receiving the structured event report and obtaining the context metadata of the event report;
determining, according to the obtained context metadata, the feature information of the face image that needs to be adjusted;
integrating the adjusted feature information of the face image into the baseline filtering strategy to generate an adjusted filtering standard;
integrating the adjusted filtering standard into the baseline filtering strategy to generate an optimized baseline filtering strategy;
and sending the adjusted baseline filtering strategy to the edge device to replace the original baseline filtering strategy.
An embodiment of the application provides an edge-computing-enabled real-time face quality filtering system for implementing the above edge-computing-enabled real-time face quality filtering method, comprising an image acquisition module, an evaluation and screening module, a screening level adjustment module, an event trigger module, a report generation and sending module, and a strategy adjustment and issuing module;
the image acquisition module is configured to acquire face images in real time through the edge device and obtain the baseline filtering strategy preset on the edge device;
the evaluation and screening module is configured to perform quality evaluation and screening on the face images according to the baseline filtering strategy;
the screening level adjustment module is configured to monitor the running state of the current image screening level on the edge device and adjust the current image screening level according to the running state;
the event trigger module is configured to determine whether the running state of the current image screening level meets the preset event trigger condition;
the report generation and sending module is configured to generate an event report containing context metadata and send the event report to the central management platform when the event trigger condition is met;
the strategy adjustment and issuing module is configured to adjust the baseline filtering strategy according to the received event report containing the context metadata and generate an adjusted baseline filtering strategy;
and to issue the adjusted baseline filtering strategy to the edge device, where the edge device receives the adjusted baseline filtering strategy and uses it to replace the original baseline filtering strategy.
The one or more technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages:
1. The running state of the current image screening level on the edge device is monitored, and the screening level is dynamically adjusted according to the running state, so that the system adapts flexibly to complex environmental conditions such as varying illumination and occlusion, achieves accurate and efficient face image quality filtering, and effectively solves the prior-art problem of uneven face image quality in complex environments caused by poor adaptability.
2. By generating an event report containing context metadata and sending it to the central management platform, comprehensive monitoring and accurate analysis of the running state of the edge device are achieved, enabling timely and effective strategy optimization and effectively improving the overall stability and reliability of the system.
3. The filtering standard in the baseline filtering strategy is adjusted and an optimized baseline filtering strategy is generated, so that the face image quality filtering effect is continuously improved, continuous improvement of system performance is achieved, and complex and changing practical application requirements are effectively met.
Detailed Description
The embodiments of the application solve the prior-art problem of uneven face image quality in complex environments such as varying illumination and occlusion, caused by poor adaptability, by providing an edge-computing-enabled real-time face quality filtering method and system. By monitoring the running state of the current image screening level on the edge device, dynamically adjusting the screening level according to the running state, and generating an event report containing context metadata that is sent to the central management platform for analysis and optimization, accurate and efficient face image quality filtering is achieved.
In order to better understand the above technical solutions, a detailed description is given below with reference to the accompanying drawings and specific embodiments.
As shown in Fig. 1, a flowchart of the edge-computing-enabled real-time face quality filtering method provided by the embodiment of the application includes the following steps: acquiring face images in real time through the edge device, and obtaining a baseline filtering strategy preset on the edge device, where the baseline filtering strategy defines a plurality of image screening levels, each image screening level corresponds to a set of face image quality filtering standards, and the filtering standards of the image screening levels differ in strictness;
initializing the current image screening level on the edge device to the level with the strictest standard, and evaluating and screening the face images acquired in real time according to the filtering standard corresponding to that level;
monitoring the running state of the current image screening level on the edge device, and adjusting the current image screening level according to the running state;
determining whether the running state of the current image screening level meets a preset event trigger condition;
when the event trigger condition is met, generating an event report containing context metadata and sending it to the central management platform, where the context metadata records the environment and device state information at the time the event is triggered;
the central management platform adjusting the baseline filtering strategy according to the received event report and the context metadata in the report, and generating an adjusted baseline filtering strategy;
and the central management platform issuing the adjusted baseline filtering strategy to the edge device, where the edge device receives the adjusted baseline filtering strategy and uses it to replace the original baseline filtering strategy for subsequent face quality filtering.
Further, the step of performing quality evaluation and screening on the face image according to the baseline filtering strategy includes:
preprocessing the acquired face image, where the preprocessing includes grayscale conversion, normalization, and noise removal;
obtaining feature information of the face image based on the preprocessed face image, where the feature information includes the blurriness, illumination intensity, occlusion degree, and face integrity of the image;
obtaining a standard value of the face image based on the feature information of the face image;
comparing the standard value of the face image with the standard value corresponding to the current image screening level in the baseline filtering strategy;
determining, according to the comparison result, whether the face image meets the quality requirement of the current image screening level;
if the standard value of the face image is not smaller than the standard value corresponding to the current image screening level, retaining the face image and marking its image screening level as the current level;
and if the standard value of the face image is smaller than the standard value corresponding to the current image screening level, placing the face image under review.
Further, the step of obtaining a standard value of the face image based on the feature information of the face image includes:
obtaining the standard value of the face image through a standard-value calculation formula;
the standard-value calculation formula of the face image is as follows:
Q = ω1 · (1 − B) + ω2 · (1 − |I − Imid| / Imax) + ω3 · (1 − O) + ω4 · C
where Q is the standard value of the face image, B is the blurriness of the image, I is the illumination intensity of the image, Imid is the desired intermediate illumination intensity, Imax is the maximum illumination intensity, O is the occlusion degree of the image, C is the face integrity of the image, and ω1, ω2, ω3, ω4 are the weight coefficients of blurriness, illumination intensity, occlusion degree, and face integrity, respectively, in the standard-value calculation.
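A minimal sketch of the standard-value computation described above, assuming all inputs are normalized to [0, 1]; the equal default weights and the exact combination of terms are illustrative assumptions, not the patent's disclosed coefficients.

```python
def quality_score(blur, illum, occlusion, completeness,
                  i_mid=0.5, i_max=1.0,
                  weights=(0.25, 0.25, 0.25, 0.25)):
    """Standard value Q of a face image (illustrative sketch).

    blur         -- B, blurriness in [0, 1] (higher = blurrier)
    illum        -- I, illumination intensity in [0, 1]
    occlusion    -- O, occlusion degree in [0, 1]
    completeness -- C, face integrity in [0, 1]
    """
    w1, w2, w3, w4 = weights
    return (w1 * (1.0 - blur)                          # sharper image -> higher Q
            + w2 * (1.0 - abs(illum - i_mid) / i_max)  # illumination near Imid -> higher Q
            + w3 * (1.0 - occlusion)                   # less occlusion -> higher Q
            + w4 * completeness)                       # more complete face -> higher Q

# A sharp, mid-lit, unoccluded, complete face reaches the maximum score of 1.0.
assert abs(quality_score(0.0, 0.5, 0.0, 1.0) - 1.0) < 1e-9
```

The resulting Q is then compared against the threshold value that the baseline filtering strategy associates with the current image screening level.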
Further, the step of reviewing the face image includes:
temporarily buffering the face images whose standard values are smaller than the standard value corresponding to the current image screening level;
reviewing the face images in the temporary buffer according to a preset review period;
during the review, if the current image screening level is found to have been adjusted, re-evaluating the temporarily buffered face images;
if the standard value of a re-evaluated face image is not smaller than the standard value corresponding to the adjusted image screening level, moving the face image out of the temporary buffer for normal use;
and if the standard value of a re-evaluated face image is smaller than the standard value corresponding to the adjusted image screening level, discarding the face image.
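The buffer-and-review step above can be sketched as follows; the class and method names are illustrative assumptions, and the re-evaluation is simplified to re-checking the stored standard value against the (possibly adjusted) level threshold.

```python
from collections import deque

class ReviewBuffer:
    """Minimal sketch of the temporary buffer used for deferred review."""

    def __init__(self):
        self.pending = deque()  # (image_id, standard_value) awaiting review

    def stash(self, image_id, standard_value):
        """Temporarily buffer an image that fell below the current level."""
        self.pending.append((image_id, standard_value))

    def review(self, level_threshold):
        """Re-evaluate buffered images against the current level threshold.

        Returns (released, discarded): images at or above the threshold are
        moved out of the buffer for normal use, the rest are discarded.
        """
        released, discarded = [], []
        while self.pending:
            image_id, score = self.pending.popleft()
            if score >= level_threshold:
                released.append(image_id)
            else:
                discarded.append(image_id)
        return released, discarded
```

For example, after the level is downgraded to a threshold of 0.5, a buffered image scored 0.6 is released for normal use while one scored 0.4 is discarded.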
Further, the step of monitoring the running state of the current image screening level on the edge device and adjusting the current image screening level according to the running state includes:
setting a preset screening time window for evaluating the effectiveness of the current image screening level, and counting the number of face images meeting the filtering standard corresponding to the current screening level within the preset screening time window;
if no image meeting the filtering standard corresponding to the current screening level has been screened out by the end of the preset screening time window, downgrading the current image screening level, i.e., adjusting it to a screening level lower than the current one;
if images meeting the filtering standard corresponding to the current screening level have been screened out by the end of the preset screening time window, upgrading the current image screening level, i.e., adjusting it to a screening level higher than the current one;
applying the adjusted image screening level to the subsequent face image quality evaluation and screening process;
when the image screening level is adjusted, correspondingly updating the review period and the temporary buffering strategy associated with the screening level, so as to ensure the continuity and consistency of system operation;
and periodically reviewing the adjustment history of the image screening level to optimize the preset screening time window and the level adjustment logic, so as to adapt to changing operating environments and requirements.
By setting a preset screening time window, the number of face images meeting the current screening level's filtering standard within the window is counted. According to the statistical result, if no image meets the standard, the current image screening level is downgraded; otherwise, it is upgraded. In this way, the screening level can be dynamically adjusted in time in response to changes in image quality, adapting to different environmental conditions and image acquisition situations and ensuring the dynamism and adaptability of the image screening process.
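The window-end adjustment rule above can be sketched as a simple function; the ordered level list and its names are illustrative assumptions.

```python
def adjust_level(current_level, passed_count, levels):
    """Window-end screening-level adjustment sketch.

    levels       -- ordered from strictest (index 0) to loosest
    passed_count -- face images that met the current level's standard
                    within the screening time window
    """
    idx = levels.index(current_level)
    if passed_count == 0 and idx < len(levels) - 1:
        return levels[idx + 1]  # nothing passed in the window -> downgrade (loosen)
    if passed_count > 0 and idx > 0:
        return levels[idx - 1]  # images passed -> upgrade (tighten)
    return current_level        # already at the boundary level

levels = ["strict", "normal", "loose"]
assert adjust_level("normal", 0, levels) == "loose"   # downgrade
assert adjust_level("normal", 5, levels) == "strict"  # upgrade
```

The boundary behavior (no downgrade below the loosest level, no upgrade above the strictest) is an assumption implied by the finite set of levels.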
Further, the step of determining whether the running state of the current image screening level meets a preset event trigger condition includes:
setting an event monitoring period for continuously monitoring the running state of the current image screening level, and obtaining the stability index of the image screening level within the event monitoring period;
if the stability index of the image screening level is not smaller than a preset stability threshold, determining that the preset event trigger condition is not met;
if the stability index of the image screening level is smaller than the preset stability threshold, determining that the preset event trigger condition is met;
and when the preset event trigger condition is determined to be met, starting an event response mechanism and preparing to generate an event report containing context metadata.
Context metadata at the time of the trigger event is collected, including but not limited to the current image screening level, the time of the last level adjustment, the number of face images currently collected, and the current illumination environment;
the collected context metadata is integrated into an event report, ensuring that the report accurately reflects the system state at the time the event was triggered;
after the event report is generated, it is sent to the central management platform for subsequent analysis and processing;
after receiving the event report, the central management platform analyzes the context metadata in the report to determine the cause of the abnormal running state of the current image screening level;
and according to the analysis result, the central management platform decides whether the baseline filtering strategy needs to be adjusted to optimize the running state of the system.
By setting the event monitoring period, the stability index of the image screening level within the period is obtained. The stability index is compared with the preset stability threshold to determine whether the current image screening level meets the event trigger condition, thereby achieving real-time monitoring and evaluation of the system's running state and ensuring that the system can detect abnormal conditions in time and make corresponding adjustments and optimizations.
The dynamic adjustment of the screening level focuses on short-term, real-time changes in face image quality, while the event trigger condition judgment examines the running state of the screening level from a longer-term perspective. For example, when the screening level continuously stays below a certain preset level for a period of time, this may indicate a general problem with face image quality in the area, such as persistently poor lighting conditions or widespread wearing of face coverings, which would be difficult to discover through short-term dynamic adjustment alone.
Meanwhile, if the screening level switches frequently between different levels, this may indicate that the current filtering strategy is not stable enough, or that other factors are affecting the normal operation of the system, such as fluctuations in device performance or drastic changes in environmental factors. Such frequent switching is an abnormal pattern that needs to be identified through the event trigger condition judgment.
Through the event trigger condition judgment, possible performance degradation or other problems during long-term operation can be discovered in time, so that corresponding measures can be taken for optimization and improvement, ensuring the long-term stable and high-performance operation of the whole system in different areas and environments.
Further, the step of obtaining the stability index of the image screening level within the event monitoring period includes:
obtaining the number of changes of the image screening level within the event monitoring period and the duration for which the current image screening level has continuously remained in a low-level state;
obtaining the stability index of the current image screening level through a stability-index calculation formula;
the stability-index calculation formula is as follows:
S = 1 − α · (Nchange / Tcycle) − (1 − α) · (Tlow / Tcycle)
where S is the stability index, Nchange is the number of changes of the current image screening level, Tcycle is the duration of the event monitoring period, Tlow is the duration continuously spent in the low-level state, and α is a weight coefficient adjusting the relative influence of the number of changes and the low-level duration on stability.
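A minimal sketch of the stability check, assuming the index combines the level-change rate and the fraction of the monitoring period spent at a low level as S = 1 − α·(Nchange/Tcycle) − (1 − α)·(Tlow/Tcycle); the exact formula and the threshold value are assumptions.

```python
def stability_index(n_change, t_low, t_cycle, alpha=0.5):
    """Stability index S of the current image screening level.

    n_change -- number of level changes within the monitoring period
    t_low    -- time continuously spent in the low-level state
    t_cycle  -- duration of the event monitoring period (same time unit)
    """
    return 1.0 - alpha * (n_change / t_cycle) - (1.0 - alpha) * (t_low / t_cycle)

def triggers_event(s, stability_threshold=0.6):
    """The event fires only when the index falls below the threshold."""
    return s < stability_threshold

# A perfectly stable period (no changes, no time at a low level) does not trigger.
assert triggers_event(stability_index(0, 0, 60)) is False
```

Frequent level changes or long low-level dwell both drive S down, matching the two abnormal patterns the monitoring is meant to catch.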
Further, the step of generating an event report containing the context metadata and sending the event report to the central management platform includes:
when the image screening level is determined to meet the event trigger condition, collecting context metadata at the time the event is triggered;
the metadata includes the current image screening level, the time and reason of the last image screening level adjustment, the number of face images awaiting review in the current temporary buffer, the number of face images currently collected, the current illumination environment information, and the running state of the current device (the CPU utilization, memory utilization, and remaining storage space of the edge device);
structuring the collected context metadata according to a preset data format to generate structured data;
integrating the structured data into a data structure, adding header information and a check code for the event report, and generating a structured event report;
first storing the generated structured event report on the edge device, and then sending the stored event report from the edge device to the central management platform;
if the sending fails, marking the event report as undelivered and resending it according to a preset retransmission strategy;
if the sending succeeds, generating an instruction to clear the event report from the edge device;
recording the result of the sending operation (success or failure), the unique identifier of the event report, and the address of the recipient, and generating log information at the same time;
and storing the generated log information on the edge device.
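The report construction and store-and-forward steps above can be sketched as follows; the field names, the toy check code, and the retry policy are illustrative assumptions.

```python
import json
import time
import uuid

def build_report(metadata):
    """Wrap structured context metadata with header fields and a check code."""
    body = json.dumps(metadata, sort_keys=True)
    return {
        "report_id": str(uuid.uuid4()),                         # unique report identifier
        "timestamp": time.time(),                               # header information
        "checksum": format(sum(body.encode()) % 65536, "04x"),  # toy check code
        "payload": metadata,
    }

def send_with_retry(report, transmit, max_retries=3):
    """Store-and-forward sending with a simple retry policy.

    `transmit` stands in for the upload to the central management
    platform and is assumed to return True on success.
    """
    for _ in range(max_retries):
        if transmit(report):
            return True              # success: caller clears the local copy and logs
    report["state"] = "undelivered"  # mark for a later retransmission cycle
    return False
```

In practice the report would be persisted on the edge device before the first send attempt, so an undelivered report survives restarts until a retransmission succeeds.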
Further, the step of adjusting the baseline filtering strategy and generating an adjusted baseline filtering strategy includes:
the central management platform receiving the structured event report, obtaining its context metadata, and extracting information such as the number of changes of the image screening level, the duration continuously spent in the low-level state, and the running state of the current edge device;
determining, according to the obtained context metadata, the feature information of the face image that needs to be adjusted, such as raising or lowering the tolerance for blurriness, adjusting the acceptable range of illumination intensity, modifying the judgment rules for occlusion degree and face integrity, or resetting the weight coefficients of blurriness, illumination intensity, occlusion degree, and face integrity in the standard-value calculation;
integrating the adjusted feature information of the face image into the baseline filtering strategy to generate an adjusted filtering standard, ensuring its integrity and consistency;
integrating the adjusted filtering standard into the baseline filtering strategy to generate an optimized baseline filtering strategy, ensuring the integrity and consistency of the baseline filtering strategy;
and sending the optimized baseline filtering strategy to the edge device through a secure communication protocol to replace the original baseline filtering strategy, so as to adapt to new running states and environmental conditions.
After receiving the event report, the central management platform analyzes the context metadata to determine the face image feature information that needs to be adjusted, adjusts the corresponding filtering standard items in the baseline filtering strategy, generates an optimized baseline filtering strategy, and issues it to the edge device to replace the original strategy, thereby realizing dynamic optimization of the filtering standard and improving the system's adaptability and processing capability under different environmental conditions and image quality changes.
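A minimal sketch of the platform-side strategy adjustment; the policy keys and the two concrete adjustment rules are illustrative assumptions, not the patent's disclosed logic.

```python
def adjust_policy(policy, metadata):
    """Adjust baseline-strategy tolerances from event-report metadata.

    policy   -- dict of filtering-standard parameters (assumed keys)
    metadata -- dict with n_change, t_low, t_cycle from the event report
    """
    adjusted = dict(policy)
    # A level that stayed low for most of the cycle suggests a persistent
    # environmental problem (e.g. poor lighting): widen the accepted range.
    if metadata["t_low"] / metadata["t_cycle"] > 0.5:
        adjusted["illum_tolerance"] = policy["illum_tolerance"] * 1.2
    # Frequent level changes suggest the blurriness threshold is too tight.
    if metadata["n_change"] > metadata.get("change_limit", 5):
        adjusted["blur_tolerance"] = policy["blur_tolerance"] * 1.1
    return adjusted
```

Returning a fresh dict keeps the original baseline strategy intact until the optimized version has been issued to, and accepted by, the edge device.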
As shown in Fig. 2, a schematic structural diagram of the edge-computing-enabled real-time face quality filtering system provided by the embodiment of the present application, the system is configured to implement the above edge-computing-enabled real-time face quality filtering method and includes an image acquisition module, an evaluation and screening module, a screening level adjustment module, an event trigger module, a report generation and sending module, and a strategy adjustment and issuing module;
the image acquisition module is configured to acquire face images in real time through the edge device and obtain the baseline filtering strategy preset on the edge device;
the evaluation and screening module is configured to perform quality evaluation and screening on the face images according to the baseline filtering strategy;
the screening level adjustment module is configured to monitor the running state of the current image screening level on the edge device and adjust the current image screening level according to the running state;
the event trigger module is configured to determine whether the running state of the current image screening level meets the preset event trigger condition;
the report generation and sending module is configured to generate an event report containing context metadata and send the event report to the central management platform when the event trigger condition is met;
the strategy adjustment and issuing module is configured to adjust the baseline filtering strategy according to the received event report containing the context metadata and generate an adjusted baseline filtering strategy;
and to issue the adjusted baseline filtering strategy to the edge device, where the edge device receives the adjusted baseline filtering strategy and uses it to replace the original baseline filtering strategy.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.