Disclosure of Invention
The invention aims to provide a safety monitoring system based on person gesture recognition, which solves the following technical problem: how to monitor the physiological state of personnel more accurately in the time dimension.
The aim of the invention can be achieved by the following technical scheme:
a safety monitoring system based on person gesture recognition, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early warning module is used for respectively carrying out state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information, carrying out comparison analysis on the analysis results of the state analysis and the physiological parameter analysis, and carrying out safety monitoring on the state of the person according to the state analysis and the comparison analysis results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel.
In an embodiment, the process of analyzing the real-time data, the mean value data and the peak value data of each physiological parameter by the preset physiological analysis model is as follows:
by the formula

$$phy(t)=\sum_{z=1}^{S}\lambda_z\left[W\big(x_z(t)\big)+\alpha_1\frac{|x_z(t)-x_z^0|}{x_z^0}+\alpha_2\frac{|\bar{x}_z-x_z^0|}{x_z^0}+\alpha_3\frac{|x_z^{\max}-x_z^0|}{x_z^0}\right]$$

the person physiological state value phy(t) at the current time point is calculated;
wherein S is the number of physiological parameter monitoring items, z ∈ [1, S];
$D_z$ is the threshold interval corresponding to the z-th physiological parameter monitoring item;
$x_z(t)$ is the measured value of the z-th physiological parameter monitoring item;
$x_z^0$ is the standard reference value corresponding to the z-th physiological parameter monitoring item;
W is a judgment function: when $x_z(t)\in D_z$, $W\big(x_z(t)\big)=0$; when $x_z(t)\notin D_z$, $W\big(x_z(t)\big)=E_z$, where $E_z$ is the early-warning value corresponding to the z-th physiological parameter monitoring item;
$T_1$ is the first preset time period; $x_z^{\max}$ is the maximum value of $x_z$ within the period $[t-T_1,\,t]$; $\bar{x}_z$ is the mean value of $x_z$ over the historical data;
$\alpha_1$, $\alpha_2$, $\alpha_3$ are preset coefficients, and $\alpha_1+\alpha_2+\alpha_3=1$;
$\lambda_z$ is the de-dimensioned weighting coefficient corresponding to the z-th physiological parameter monitoring item;
the person physiological state value phy(t) is compared with a preset warning threshold $phy_0$:
if $phy(t)>phy_0$, an early warning signal is generated;
otherwise, judgment is made in combination with the state analysis result.
In one embodiment, the process of identifying the image information by the identification module includes:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
In one embodiment, the identification points include a center identification point and a plurality of edge identification points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
In one embodiment, the process of comparing the personnel vector sequence with the reference models of different poses includes size comparison:
the size comparison process comprises the following steps:
extracting the vector modulus sequence $\{m_1,m_2,\dots,m_n\}$ from the personnel vector sequence according to a preset fixed order, where $m_i=|\vec{v}_i|$ is the modulus of the vector of the i-th edge identification point;
by the formula

$$P_j=\sum_{i=1}^{n}\mu_i\,u(m_i)$$

the size deviation coefficient $P_j$ of the person's vector modulus sequence relative to the j-th gesture reference model is calculated;
wherein n is the number of edge recognition points, i ∈ [1, n];
$R_{i,j}$ is the reference range interval corresponding to $m_i$ in the j-th gesture reference model;
u is a first judgment function: when $m_i\in R_{i,j}$, $u(m_i)=0$; otherwise, $u(m_i)=\dfrac{|m_i-c_{i,j}|}{r_{i,j}}$, where $c_{i,j}$ is the intermediate value of the interval $R_{i,j}$ and $r_{i,j}$ is the range value of the interval $R_{i,j}$;
$\mu_i$ is the size characteristic coefficient corresponding to the i-th identification point;
$P_j$ is compared with a preset threshold $P_0$:
if $P_j<P_0$, matching with the j-th gesture reference model is judged successful;
otherwise, matching is judged to have failed.
In an embodiment, the process of comparing the personnel vector sequences with the reference models of different poses respectively further includes vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
by the formula

$$Q_j=\sum_{i=1}^{n}\nu_i\,G(\vec{v}_i)$$

the vector deviation coefficient $Q_j$ of the personnel vector sequence relative to the j-th gesture reference model is calculated;
wherein G is a second judgment function; $\vec{b}^{\,1}_{i,j}$ and $\vec{b}^{\,2}_{i,j}$ are the reference range boundary vectors corresponding to $\vec{v}_i$ in the j-th gesture reference model;
if $\vec{v}_i$ is located within the acute-angle range between $\vec{b}^{\,1}_{i,j}$ and $\vec{b}^{\,2}_{i,j}$, then $G(\vec{v}_i)=0$; otherwise, $G(\vec{v}_i)=F\big(\min(\theta_1,\theta_2)\big)$, where $\theta_1$ is the angle between $\vec{v}_i$ and $\vec{b}^{\,1}_{i,j}$, $\theta_2$ is the angle between $\vec{v}_i$ and $\vec{b}^{\,2}_{i,j}$, and F is an angle conversion function;
$\nu_i$ is the direction characteristic coefficient corresponding to the i-th identification point;
the action gesture corresponding to the reference model with the minimum $Q_j$ is selected as the judgment result.
In an embodiment, the action gesture library includes an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
by the formula

$$M_q=\sum_{k=1}^{c_q}p_{q,k}$$

the matching value $M_q$ of the person's action gestures relative to the q-th action state category is calculated;
wherein $c_q$ is the number of the person's action gestures that appear in the gesture type set corresponding to the q-th action state category, k ∈ [1, $c_q$];
$p_{q,k}$ is the occurrence probability of the k-th such action gesture in the gesture type set corresponding to the q-th action state category;
the action state category corresponding to the maximum $M_q$ is taken as the judgment result.
In one embodiment, phy(t) is compared with a preset reference threshold $phy_1$, wherein $phy_1<phy_0$:
if $phy(t)<phy_1$, the physiological state of the person is judged to be normal;
otherwise, the action state categories of the person within a second preset time period $T_2$ are acquired;
by the formula

$$Y=\sum_{x=1}^{B}f_x(t_x)$$

the person motion quantity coefficient Y is calculated;
wherein $T_2$ is the second preset time period; B is the number of action state categories within $T_2$, x ∈ [1, B];
$t_x$ is the duration of the x-th action state category; $f_x$ is the influence function of the x-th action state category;
Y is compared with a preset threshold $Y_0$:
if $Y<Y_0$, early warning is carried out;
otherwise, the physiological state of the person is judged to be normal.
The invention has the beneficial effects that:
(1) According to the invention, through the analysis process of the physiological parameter information, on one hand, when the individual physiological parameter data of the personnel exceeds the threshold value interval, the early warning process can be realized through the early warning values corresponding to the parameters; on the other hand, the real-time data, the maximum value data and the average value data are combined with process data of a person in a period of time before the current time point to form a judging model through preset coefficients, when any parameter is abnormal, the abnormal parameter can be reflected to a result, and further the fluctuation, the variability and the overall state of the physiological parameter of the user in the time dimension are judged, so that a continuous and accurate monitoring process of the physiological state of the person is realized, and the monitoring result has better robustness.
(2) Compared with the prior art, the method for judging the human action gesture directly compares the human action model skeleton with the preset skeleton, and firstly, the two-layer screening mode of size comparison and vector comparison can greatly reduce the reference comparison action gesture, further reduce the data processing amount and improve the judging efficiency; and secondly, the judging method sets different weight values according to the characteristics of different identification points, so that the influence degree of the key identification points on the result can be enlarged in the comparison and analysis process, and the suitability of the judging process is higher in the matching process, namely the accuracy of judging the action gesture of the user is improved.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring now to FIG. 1, in one embodiment, a person gesture recognition based safety monitoring system is provided, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early warning module is used for respectively carrying out state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information, carrying out comparison analysis on the analysis results of the state analysis and the physiological parameter analysis, and carrying out safety monitoring on the state of the person according to the state analysis and the comparison analysis results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel.
According to the technical scheme, the motion state information of the personnel is acquired through the image acquisition module and the identification module, the physiological parameter information of each personnel is monitored through the intelligent wearable equipment, the analysis results of the two are compared and analyzed, and the personnel state is safely monitored according to the state analysis and the comparison and analysis results; furthermore, when obvious problems occur in the personnel state or obvious abnormalities occur in the personnel physical state, an early warning signal can be generated to remind a manager of timely processing the problems; meanwhile, further judgment is carried out according to comparison of analysis results of the two, and when potential safety hazards exist in personnel states, corresponding early warning is timely generated, so that accuracy of judgment results is improved.
Through the analysis process of the physiological parameter information of the personnel in the embodiment, on one hand, when the single physiological parameter data of the personnel exceeds a threshold value interval, the early warning process can be realized through the early warning values corresponding to the parameters; on the other hand, the embodiment combines the process data of the personnel in a period of time before the current time point, forms a judging model by the real-time data, the maximum value data and the mean value data through preset coefficients, and can reflect the abnormal condition of any parameter into the result, so as to judge the fluctuation, the variability and the overall state of the physiological parameter of the user in the time dimension, realize the continuous accurate monitoring process of the physiological state of the personnel, and have better robustness.
It should be noted that, the image acquisition module in the system can be realized by a high-definition camera device, and the identification module is realized based on an AI character identification model; physiological parameters monitored by the intelligent wearable device include common physiological parameters such as heart rate, blood pressure, blood oxygen, body temperature and the like, and are not further described in detail in the embodiment.
The process of analyzing the real-time data, the mean value data and the peak value data of each physiological parameter by the preset physiological analysis model is as follows: by the formula

$$phy(t)=\sum_{z=1}^{S}\lambda_z\left[W\big(x_z(t)\big)+\alpha_1\frac{|x_z(t)-x_z^0|}{x_z^0}+\alpha_2\frac{|\bar{x}_z-x_z^0|}{x_z^0}+\alpha_3\frac{|x_z^{\max}-x_z^0|}{x_z^0}\right]$$

the person physiological state value phy(t) at the current time point is calculated;
wherein S is the number of physiological parameter monitoring items, z ∈ [1, S];
$D_z$ is the threshold interval corresponding to the z-th physiological parameter monitoring item;
$x_z(t)$ is the measured value of the z-th physiological parameter monitoring item;
$x_z^0$ is the standard reference value corresponding to the z-th physiological parameter monitoring item;
W is a judgment function: when $x_z(t)\in D_z$, $W\big(x_z(t)\big)=0$; when $x_z(t)\notin D_z$, $W\big(x_z(t)\big)=E_z$, where $E_z$ is the early-warning value corresponding to the z-th physiological parameter monitoring item;
$T_1$ is the first preset time period; $x_z^{\max}$ is the maximum value of $x_z$ within the period $[t-T_1,\,t]$; $\bar{x}_z$ is the mean value of $x_z$ over the historical data;
$\alpha_1$, $\alpha_2$, $\alpha_3$ are preset coefficients, and $\alpha_1+\alpha_2+\alpha_3=1$;
$\lambda_z$ is the de-dimensioned weighting coefficient corresponding to the z-th physiological parameter monitoring item;
the person physiological state value phy(t) is compared with a preset warning threshold $phy_0$:
if $phy(t)>phy_0$, an early warning signal is generated;
otherwise, judgment is made in combination with the state analysis result.
Through the above technical scheme, this embodiment gives a method for assessing the physiological state of a person: the person physiological state value phy(t) at the current time point is calculated by the above formula for evaluation. The evaluation is mainly based on the deviation state of each physiological parameter monitoring item at the current time point relative to its corresponding interval, the historical maximum state before the current time point, and the historical average state before the current time point; evidently, $x_z(t)$ corresponds to the real-time data of each physiological parameter, $\bar{x}_z$ to the mean data, and $x_z^{\max}$ to the peak data. The preset coefficients $\alpha_1$, $\alpha_2$ and $\alpha_3$ are set by fitting empirical data and normalized so that $\alpha_1+\alpha_2+\alpha_3=1$; thus the bracketed term comprehensively evaluates the z-th physiological parameter monitoring item. In addition, the de-dimensioned weighting coefficient $\lambda_z$ is set after data fitting according to the value range and influence weight of each physiological parameter, so that the calculation of the person physiological state value phy(t) synthesizes a plurality of physiological parameter monitoring items for a more accurate evaluation. phy(t) is then compared with the preset warning threshold $phy_0$, and when $phy(t)>phy_0$ an early warning is carried out, so that abnormal physical conditions of personnel are found in time.
It should be noted that, in the above technical solution, the first preset time period $T_1$ is selected and set according to the application scene of the system; the threshold interval $D_z$ and the standard reference value $x_z^0$ are selected based on empirical data corresponding to each physiological parameter item, which is not described in detail herein.
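As a non-limiting illustration, the evaluation above can be sketched in Python; the per-term deviation form, the function and field names, and all numeric values below are assumptions for illustration rather than part of the claimed scheme.

```python
# Sketch of the physiological state value phy(t). Each monitoring item
# contributes its out-of-range warning value W plus weighted, de-dimensioned
# deviations of the real-time, mean, and peak data from the reference value.

def phy(items, a1=0.5, a2=0.3, a3=0.2):
    """items: one dict per monitoring item with keys:
    x (real-time value), mean (historical mean), peak (historical max),
    ref (standard reference), lo/hi (threshold interval),
    warn (early-warning value), weight (de-dimensioned weight)."""
    assert abs(a1 + a2 + a3 - 1.0) < 1e-9  # preset coefficients sum to 1
    total = 0.0
    for it in items:
        # judgment function W: 0 inside the threshold interval,
        # the item's early-warning value outside it
        w = 0.0 if it["lo"] <= it["x"] <= it["hi"] else it["warn"]
        dev = lambda v: abs(v - it["ref"]) / it["ref"]  # relative deviation
        total += it["weight"] * (w + a1 * dev(it["x"])
                                 + a2 * dev(it["mean"]) + a3 * dev(it["peak"]))
    return total

hr = dict(x=72, mean=70, peak=95, ref=70, lo=50, hi=100, warn=5.0, weight=1.0)
print(phy([hr]))  # small value for an in-range, near-reference heart rate
```

An item that leaves its threshold interval adds its early-warning value directly, so a single abnormal parameter is reflected in the result, as the scheme intends.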
As one embodiment of the present invention, the process of identifying the image information by the identification module includes:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
The identification points comprise a center identification point and a plurality of edge identification points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
Through the above technical scheme, this embodiment provides the process by which the identification module identifies image information and performs state analysis. First, the body contour of a person is identified based on AI, and identification points are added on the body contour; in this embodiment, the identification points include a center identification point located at the center of the body and five edge identification points located at the head and limbs. The distribution information of the identification points, namely the vectors of the edge identification points relative to the center identification point, is obtained through key frames in the image information. In the state analysis, the distribution information of the identification points is compared with the action gesture library to determine the person's current action gesture, and the person's action state category is then determined from a plurality of action gestures within a period of time.
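As a non-limiting illustration of how the personnel vector sequence is formed from the identification points, the following Python sketch assumes one torso-center point and five edge points (head and limbs) with made-up coordinates.

```python
# Sketch of building the personnel vector sequence from one key frame:
# vectors from the center identification point to each edge point,
# plus the vector modulus sequence used by the size comparison step.

def vector_sequence(center, edges):
    """center: (x, y); edges: list of (x, y) in a preset fixed order."""
    return [(ex - center[0], ey - center[1]) for ex, ey in edges]

def modulus_sequence(vectors):
    """Moduli |v_i| of the vectors, in the same fixed order."""
    return [(vx**2 + vy**2) ** 0.5 for vx, vy in vectors]

center = (0.0, 0.0)                       # torso center point
edges = [(0.0, 1.8), (-0.9, 0.9), (0.9, 0.9),
         (-0.5, -1.6), (0.5, -1.6)]       # head, hands, feet (fixed order)
vecs = vector_sequence(center, edges)
print(modulus_sequence(vecs))
```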
As one embodiment of the present invention, the process of comparing the personnel vector sequence with the reference models of different poses includes size comparison:
the size comparison process comprises the following steps:
extracting the vector modulus sequence $\{m_1,m_2,\dots,m_n\}$ from the personnel vector sequence according to a preset fixed order, where $m_i=|\vec{v}_i|$ is the modulus of the vector of the i-th edge identification point;
by the formula

$$P_j=\sum_{i=1}^{n}\mu_i\,u(m_i)$$

the size deviation coefficient $P_j$ of the person's vector modulus sequence relative to the j-th gesture reference model is calculated;
wherein n is the number of edge recognition points, i ∈ [1, n];
$R_{i,j}$ is the reference range interval corresponding to $m_i$ in the j-th gesture reference model;
u is a first judgment function: when $m_i\in R_{i,j}$, $u(m_i)=0$; otherwise, $u(m_i)=\dfrac{|m_i-c_{i,j}|}{r_{i,j}}$, where $c_{i,j}$ is the intermediate value of the interval $R_{i,j}$ and $r_{i,j}$ is the range value of the interval $R_{i,j}$;
$\mu_i$ is the size characteristic coefficient corresponding to the i-th identification point;
$P_j$ is compared with a preset threshold $P_0$:
if $P_j<P_0$, matching with the j-th gesture reference model is judged successful;
otherwise, matching is judged to have failed.
Through the above technical solution, this embodiment provides the first screening step of the comparison process, namely preliminary matching by size comparison. The vector modulus sequence $\{m_1,m_2,\dots,m_n\}$, with $m_i=|\vec{v}_i|$, is first extracted from the personnel vector sequence according to a preset fixed order, and the size deviation coefficient $P_j$ of the person's vector modulus sequence relative to the j-th gesture reference model is then calculated by the above formula, where the first judgment function u is defined as: when $m_i\in R_{i,j}$, $u(m_i)=0$; otherwise, $u(m_i)=|m_i-c_{i,j}|/r_{i,j}$. The corresponding reference range interval $R_{i,j}$ is set according to the error interval of the standard value of the j-th gesture reference model, and the size characteristic coefficient $\mu_i$ is set after data fitting according to the influence weights determined for the different edge identification points. Thus $P_j$ can judge the deviation state of the person's gesture data relative to the corresponding gesture reference model; by comparing $P_j$ with the preset threshold $P_0$, when $P_j<P_0$ the current person gesture is judged to match the j-th gesture reference model successfully, realizing the preliminary screening of the gesture.
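The size comparison above can be sketched as follows; the interval values, weights and threshold are illustrative assumptions, not values from the invention.

```python
# Sketch of the size comparison: size deviation coefficient P_j of a
# vector modulus sequence against one gesture reference model.

def size_deviation(moduli, ref_intervals, weights):
    """ref_intervals: (lo, hi) per identification point; weights: mu_i."""
    total = 0.0
    for m, (lo, hi), mu in zip(moduli, ref_intervals, weights):
        if lo <= m <= hi:
            u = 0.0                      # first judgment function: in range
        else:
            mid, rng = (lo + hi) / 2.0, hi - lo
            u = abs(m - mid) / rng       # deviation from interval midpoint
        total += mu * u
    return total

moduli = [1.8, 1.3, 1.3, 1.7, 1.7]
ref = [(1.6, 2.0), (1.0, 1.4), (1.0, 1.4), (1.4, 1.8), (1.4, 1.8)]
mu = [0.3, 0.2, 0.2, 0.15, 0.15]
P = size_deviation(moduli, ref, mu)
print(P, P < 0.5)   # matches when below the preset threshold P_0
```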
As one implementation mode of the invention, the process of comparing the personnel vector sequence with the reference models with different postures respectively further comprises vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
by the formula
Calculating to obtain vector deviation coefficient of personnel vector sequence relative to j-th attitude reference model>
;
Wherein,,
for the second judgment function, ++>
、/>
Respectively in the jth gesture reference model/>
Corresponding to the reference range boundary vector;
if it is
Is positioned at->
And->
In the acute angle range, then->
;
Otherwise the first set of parameters is selected,
;/>
for vector->
Vector->
Angle of (1)>
For vector->
Vector->
Is included in the plane of the first part; />
Is an angle conversion function; />
The direction characteristic coefficient corresponding to the ith gesture is obtained;
selecting
And taking the action gesture corresponding to the minimum value reference model as a judging result.
Through the above technical scheme, on the basis of the preliminary screening completed by the size comparison, this embodiment performs a further screening and judgment through the vector comparison process: the vector deviation coefficient $Q_j$ of the personnel vector sequence relative to the j-th gesture reference model is calculated by the above formula, wherein the boundary vectors $\vec{b}^{\,1}_{i,j}$ and $\vec{b}^{\,2}_{i,j}$ are set according to the error interval of the standard vector of the j-th gesture reference model; the second judgment function G returns 0 when $\vec{v}_i$ lies within the acute-angle range between the two boundary vectors, and otherwise quantifies the angular deviation through the preset angle conversion function F. The direction characteristic coefficient $\nu_i$ is set after data fitting according to the influence weights measured for the different edge identification points. The gesture reference model with the minimum $Q_j$ is then used to judge the gesture type of the person, so that the current gesture type of the person can be determined.
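A minimal sketch of the vector comparison follows; the boundary vectors, the angle conversion function F and the weights are illustrative assumptions, and "within the acute-angle range" is interpreted here as lying between the two boundary directions.

```python
# Sketch of the vector comparison: vector deviation coefficient Q_j.
# A vector inside the cone spanned by the two boundary vectors scores 0;
# otherwise the smaller boundary angle is converted by F and weighted.

import math

def angle(u, v):
    """Angle in radians between two 2-D vectors."""
    dot = u[0] * v[0] + u[1] * v[1]
    cosine = dot / (math.hypot(*u) * math.hypot(*v))
    return math.acos(max(-1.0, min(1.0, cosine)))

def vector_deviation(vectors, boundaries, weights, F=lambda t: t / math.pi):
    """boundaries: (b1, b2) boundary-vector pair per point; weights: nu_i."""
    total = 0.0
    for v, (b1, b2), nu in zip(vectors, boundaries, weights):
        t1, t2 = angle(v, b1), angle(v, b2)
        span = angle(b1, b2)
        if max(t1, t2) <= span:          # v lies between b1 and b2
            g = 0.0
        else:
            g = F(min(t1, t2))           # converted angular deviation
        total += nu * g
    return total

b = [((-0.5, 1.0), (0.5, 1.0))]          # boundary cone around vertical
print(vector_deviation([(0.0, 1.0)], b, [1.0]))   # inside the cone
```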
Compared with the prior art which directly compares the human motion model skeleton with the preset skeleton, the method for judging the human motion gesture in the embodiment has the advantages that firstly, the two-layer screening mode of size comparison and vector comparison can greatly reduce the reference comparison motion gesture, further reduce the data processing amount and improve the judging efficiency; secondly, in the embodiment, different weight values are set for the characteristics of different identification points, so that the influence degree of key identification points on results can be enlarged in the comparison and analysis process, and the suitability of the judgment process is higher in the matching process, namely the accuracy of judging the action gestures of the user is improved.
As one embodiment of the present invention, the motion gesture library includes an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
by the formula

$$M_q=\sum_{k=1}^{c_q}p_{q,k}$$

the matching value $M_q$ of the person's action gestures relative to the q-th action state category is calculated;
wherein $c_q$ is the number of the person's action gestures that appear in the gesture type set corresponding to the q-th action state category, k ∈ [1, $c_q$];
$p_{q,k}$ is the occurrence probability of the k-th such action gesture in the gesture type set corresponding to the q-th action state category;
the action state category corresponding to the maximum $M_q$ is taken as the judgment result.
Through the above technical scheme, this embodiment first makes a preliminary judgment according to the action gestures of the person, carrying out an early warning when an abnormal gesture type exists; otherwise, the person's action gestures in a plurality of key frames are compared with the preset state category library to judge the state of the person. Specifically, the gesture type set corresponding to each common state category is set in advance, the average duration occupied by each gesture type within the state category is counted, and the occurrence probability is thereby determined; the matching value $M_q$ is then calculated by the above formula. The greater the occurrence probabilities $p_{q,k}$ of the matched gestures, the larger the matching value $M_q$; by selecting the state category corresponding to the maximum $M_q$ as the judgment result, the current state of the person is accurately judged.
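The state-category matching above can be sketched as follows; the category names and occurrence probabilities are made up, and counting each observed gesture once (set semantics) is an assumption of this sketch.

```python
# Sketch of matching observed action gestures against action state
# categories: the match value M_q sums the occurrence probabilities of
# the observed gestures that appear in each category's gesture set.

def best_state_category(observed, categories):
    """observed: gesture labels from key frames;
    categories: {name: {gesture: occurrence probability}}."""
    scores = {}
    for name, gesture_probs in categories.items():
        scores[name] = sum(gesture_probs[g] for g in set(observed)
                           if g in gesture_probs)
    return max(scores, key=scores.get), scores

categories = {
    "walking": {"step_left": 0.4, "step_right": 0.4, "arm_swing": 0.2},
    "resting": {"sitting": 0.7, "leaning": 0.3},
}
best, scores = best_state_category(["step_left", "arm_swing"], categories)
print(best)   # "walking"
```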
As one embodiment of the invention, phy(t) is compared with a preset reference threshold $phy_1$, wherein $phy_1<phy_0$:
if $phy(t)<phy_1$, the physiological state of the person is judged to be normal;
otherwise, the action state categories of the person within a second preset time period $T_2$ are acquired;
by the formula

$$Y=\sum_{x=1}^{B}f_x(t_x)$$

the person motion quantity coefficient Y is calculated;
wherein $T_2$ is the second preset time period; B is the number of action state categories within $T_2$, x ∈ [1, B];
$t_x$ is the duration of the x-th action state category; $f_x$ is the influence function of the x-th action state category;
Y is compared with a preset threshold $Y_0$:
if $Y<Y_0$, early warning is carried out;
otherwise, the physiological state of the person is judged to be normal.
Through the above technical scheme, when $phy(t)\le phy_0$, this embodiment further compares phy(t) with the preset reference threshold $phy_1$, where $phy_0$ is the preset early warning threshold and $phy_1$ is the preset reference threshold determined by fitting the normal range data of human physiological parameters, and $phy_1<phy_0$. Thus, when $phy(t)<phy_1$, the physiological state is normal; when $phy_1\le phy(t)\le phy_0$, the state of the person within the period $T_2$ is further judged. Specifically, the person motion quantity coefficient Y is calculated by the above formula, where the influence function $f_x$ is obtained by fitting measurement data on the degree to which the different state categories influence the person's state; Y therefore reflects the motion state of the person within $T_2$. When $Y<Y_0$, the elevated physiological parameters are not accounted for by the person's motion state, indicating that the physiological parameter state of the person is abnormal; the person is then timely reminded by an early warning, which improves the accuracy of judging the physical state of personnel.
The second preset time period $T_2$ is selected and set according to the application field of the system and is not further limited in this embodiment.
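A minimal sketch of this secondary check combining phy(t) with the motion quantity coefficient Y; the influence functions, all thresholds, and the direction of the Y comparison (warning when activity is too low to explain elevated readings) are illustrative assumptions.

```python
# Sketch of the secondary physiological check: when phy(t) falls between
# the reference threshold phy_1 and the warning threshold phy_0, the
# motion quantity coefficient Y over the second period decides the result.

def motion_quantity(states, influence):
    """states: (category, duration) pairs within T_2;
    influence: {category: influence function of duration}."""
    return sum(influence[c](t) for c, t in states)

def secondary_check(phy_t, phy_1, phy_0, states, influence, y_0):
    if phy_t < phy_1:
        return "normal"
    # phy_1 <= phy_t <= phy_0: elevated readings may be explained by activity
    y = motion_quantity(states, influence)
    return "warning" if y < y_0 else "normal"

influence = {"running": lambda t: 2.0 * t, "sitting": lambda t: 0.1 * t}
states = [("running", 10.0), ("sitting", 20.0)]   # minutes within T_2
print(secondary_check(0.8, 0.5, 1.0, states, influence, y_0=5.0))
```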
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention. All equivalent changes and modifications within the scope of the present invention are intended to be covered by the present invention.