CN116269355A - Safety monitoring system based on figure gesture recognition - Google Patents

Safety monitoring system based on figure gesture recognition

Info

Publication number
CN116269355A
CN116269355A
Authority
CN
China
Prior art keywords
personnel
state
gesture
action
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310525777.9A
Other languages
Chinese (zh)
Other versions
CN116269355B (en)
Inventor
李淑琴
肖勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangxi Minxuan Intelligent Science & Technology Co ltd
Original Assignee
Jiangxi Minxuan Intelligent Science & Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangxi Minxuan Intelligent Science & Technology Co ltd
Priority to CN202310525777.9A
Publication of CN116269355A
Application granted
Publication of CN116269355B
Status: Active
Anticipated expiration

Abstract

The invention relates to the technical field of personnel safety monitoring, and in particular discloses a safety monitoring system based on person gesture recognition, comprising: an image acquisition module for acquiring image information of personnel; an identification module for identifying the personnel image information and obtaining the action state information of each person; intelligent wearable devices for monitoring the physiological parameter information of each person; and a safety early-warning module for performing state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information respectively, comparing the two analysis results, and safety-monitoring the person's state according to the state analysis and comparison results. On the basis of acquiring personnel images to identify and analyze personnel states, the system combines each person's intelligent wearable device to comprehensively analyze and judge the person's state, improving the accuracy of the judgment result.

Description

Safety monitoring system based on figure gesture recognition
Technical Field
The invention relates to the technical field of personnel safety monitoring, and in particular to a safety monitoring system based on person gesture recognition.
Background
In many situations, the behavior and state of personnel need to be monitored and judged. For example, when monitoring construction workers, it is necessary to judge whether a person's working state is abnormal, whether his or her physical state is abnormal, and similar questions, so as to assist managers in making intelligent judgments about the construction site.
Existing person-state monitoring systems mainly use AI to identify a person's actions and judge whether the person's actions and postures are abnormal, both to ensure that the person's physical state is normal and to judge the person's working state for management purposes. However, the accuracy of existing AI identification models is limited, so misjudgments occur. Meanwhile, the prior art monitors personnel physiological parameters to assist in judging the personnel state and thereby improve the accuracy of the judgment result.
Personnel state monitoring methods in the prior art mainly obtain real-time physiological parameter data of the human body, such as heart rate, blood pressure and blood oxygen, and compare each item one by one against the standard range of normal human physiological parameters. This can raise an early warning when a physiological parameter is obviously abnormal; however, the judgment reflects only the state at a single time point. On one hand the judgment result fluctuates, i.e. carries a large error; on the other hand the change of the person's physiological parameters in the time dimension is not further analyzed, so the robustness of the judgment result is poor.
Disclosure of Invention
The invention aims to provide a safety monitoring system based on person gesture recognition, which solves the following technical problem:
how to monitor the physiological state of personnel in the time dimension more accurately.
The aim of the invention can be achieved by the following technical scheme:
A safety monitoring system based on person gesture recognition, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early-warning module is used for performing state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information respectively, comparing the two analysis results, and safety-monitoring the person's state according to the state analysis and comparison results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel.
In an embodiment, the process of analyzing the real-time data, the mean value data and the peak value data of each physiological parameter by the preset physiological analysis model is as follows:
The person's physiological state value phy(t) at the current time point is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
S is the number of physiological parameter monitoring items, z ∈ [1, S]; [L_z, U_z] is the threshold interval corresponding to the z-th physiological parameter monitoring item; v_z is the measured value of the z-th physiological parameter monitoring item; v_z^0 is the standard reference value corresponding to the z-th monitoring item; W is a judgment function that takes one value when v_z lies inside [L_z, U_z] and another when it does not, so that the early-warning value P_z corresponding to the z-th monitoring item contributes only for out-of-interval measurements; T_1 is the first preset time period; v_z^max is the maximum of v_z over T_1; α, β and γ are preset coefficients satisfying α + β + γ = 1; ω_z is the dimensionless weighting coefficient corresponding to the z-th monitoring item.
The person's physiological state value phy(t) is compared with a preset warning threshold phy_0:
if phy(t) exceeds phy_0, an early-warning signal is generated;
otherwise, the judgment is made in combination with the state analysis result.
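The ingredients above are enough for a rough sketch of the computation, though the exact formula survives only as an equation image in the source. The sketch below assumes the judgment function contributes an item's early-warning value only when its measurement leaves the threshold interval, and that the real-time, mean and peak deviations from the standard reference value are blended by normalized coefficients before the per-item weighting; both readings are assumptions, not the patent's confirmed formula.

```python
# Hedged sketch of the physiological-state value phy(t). The exact formula
# in the patent is an unreproduced equation image; this reading is assumed.

def phy(items, alpha=0.5, beta=0.3, gamma=0.2):
    """items: list of dicts with keys
       v      - real-time measured value
       mean   - historical mean over the first preset period
       peak   - historical maximum over the first preset period
       ref    - standard reference value
       lo, hi - threshold interval
       warn   - early-warning value for this item
       w      - dimensionless weighting coefficient
    """
    assert abs(alpha + beta + gamma - 1.0) < 1e-9  # coefficients normalized
    total = 0.0
    for it in items:
        # Judgment function W: only out-of-interval measurements trigger the
        # early-warning contribution (assumption).
        W = 0.0 if it["lo"] <= it["v"] <= it["hi"] else it["warn"]
        # Blend real-time, mean and peak deviations from the reference value.
        dev = (alpha * abs(it["v"] - it["ref"])
               + beta * abs(it["mean"] - it["ref"])
               + gamma * abs(it["peak"] - it["ref"])) / it["ref"]
        total += it["w"] * (W + dev)
    return total

# Illustrative monitoring items (all numbers made up).
heart_rate = {"v": 95, "mean": 80, "peak": 110, "ref": 75,
              "lo": 60, "hi": 100, "warn": 1.0, "w": 0.6}
body_temp = {"v": 38.6, "mean": 36.8, "peak": 37.2, "ref": 36.5,
             "lo": 36.0, "hi": 37.5, "warn": 1.0, "w": 0.4}
score = phy([heart_rate, body_temp])
print(score > 0)
```

In this sketch a measurement outside its interval (the body temperature above) adds its full early-warning value, so the resulting phy(t) jumps well past the purely deviation-driven contribution of in-range items.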
In one embodiment, the process of identifying the image information by the identification module includes:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
In one embodiment, the identification points include a center identification point and a plurality of edge identification points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
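The steps above — one center identification point, several edge identification points, and a vector from the center to each edge point in a fixed order — can be illustrated concretely. The point names and pixel coordinates below are made-up assumptions; the layout (one center point plus five edge points at the head and limbs) follows the embodiment in the detailed description.

```python
# Build a person vector sequence: one vector from the center identification
# point to each edge identification point, in a preset fixed order.

CENTER = "torso"
EDGE_ORDER = ["head", "left_hand", "right_hand", "left_foot", "right_foot"]

def vector_sequence(points):
    """points: dict mapping identification-point name -> (x, y) pixel coords."""
    cx, cy = points[CENTER]
    return [(points[p][0] - cx, points[p][1] - cy) for p in EDGE_ORDER]

# Illustrative keypoints for a single key frame (coordinates made up).
pose = {"torso": (100, 200), "head": (100, 120),
        "left_hand": (60, 180), "right_hand": (140, 180),
        "left_foot": (80, 300), "right_foot": (120, 300)}
print(vector_sequence(pose))
```

Keeping `EDGE_ORDER` fixed is what makes sequences from different frames, and from the reference models, directly comparable element by element.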
In one embodiment, the process of comparing the human vector sequence with the reference models of different poses includes size comparison:
the size comparison process comprises the following steps:
The vector-modulus sequence |v_1|, |v_2|, …, |v_n| is extracted from the person vector sequence in a preset fixed order.
The size deviation coefficient P_j of the person's vector-modulus sequence relative to the j-th pose reference model is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
n is the number of edge identification points, i ∈ [1, n]; [a_i, b_i] is the reference range interval corresponding to |v_i| in the j-th pose reference model; F is a first judgment function that takes one value when |v_i| lies inside [a_i, b_i] and another otherwise; m_i is the middle value of the interval [a_i, b_i] and r_i is its range value; λ_i is the dimension characteristic coefficient corresponding to the i-th edge identification point.
P_j is compared with a preset threshold P_0:
if P_j does not exceed P_0, matching with the j-th pose reference model is judged successful;
otherwise, the matching is judged to have failed.
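A sketch of the size comparison, under the assumption (not confirmed by the source, whose formula is an unreproduced image) that the first judgment function is zero for a modulus inside its reference range interval and otherwise the distance from the interval's middle value normalized by the interval's range:

```python
# Hedged sketch of the size deviation coefficient P_j for one pose model.

def size_deviation(moduli, ref_intervals, coeffs):
    """moduli:        vector-modulus sequence |v_1| .. |v_n| in fixed order
       ref_intervals: per-modulus reference interval (lo, hi) of pose model j
       coeffs:        dimension characteristic coefficient per edge point
    """
    total = 0.0
    for m, (lo, hi), c in zip(moduli, ref_intervals, coeffs):
        if lo <= m <= hi:
            f = 0.0                 # first judgment function: inside interval
        else:
            mid, rng = (lo + hi) / 2, hi - lo
            f = abs(m - mid) / rng  # normalized deviation from the midpoint
        total += c * f
    return total

# Illustrative numbers: the two foot vectors fall outside their intervals.
moduli = [80.0, 45.0, 45.0, 102.0, 102.0]
intervals = [(70, 90), (40, 60), (40, 60), (90, 100), (90, 100)]
coeffs = [0.3, 0.2, 0.2, 0.15, 0.15]
p = size_deviation(moduli, intervals, coeffs)
matched = p <= 0.2  # illustrative preset threshold P_0
print(round(p, 3), matched)
```

Because in-range moduli contribute nothing, P_j grows only with out-of-range edge points, weighted by how much each point is trusted to matter for this pose.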
In an embodiment, the process of comparing the personnel vector sequences with the reference models of different poses respectively further includes vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
The vector deviation coefficient Q_j of the person vector sequence relative to the j-th pose reference model is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
G is a second judgment function; c_i and d_i are the reference range boundary vectors corresponding to v_i in the j-th pose reference model; if v_i lies within the acute-angle range between c_i and d_i, G takes the in-range value; otherwise G is determined, through an angle conversion function, from θ(v_i, c_i) and θ(v_i, d_i), the angles between v_i and each boundary vector; μ_i is the direction characteristic coefficient corresponding to the i-th edge identification point.
The reference model yielding the minimum Q_j is selected, and its corresponding action gesture is taken as the judgment result.
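The vector comparison can be sketched as follows, under stated assumptions none of which the source confirms: the second judgment function is zero when the person's vector lies inside the acute-angle range spanned by the two boundary vectors, otherwise it is the smaller angle to a boundary vector, and the angle conversion function is taken as the identity.

```python
import math

def angle(u, v):
    """Unsigned angle between 2-D vectors u and v, in radians."""
    dot = u[0] * v[0] + u[1] * v[1]
    cos = dot / (math.hypot(*u) * math.hypot(*v))
    return math.acos(max(-1.0, min(1.0, cos)))  # clamp against rounding

def vector_deviation(vecs, boundaries, coeffs):
    """vecs:       person vector sequence
       boundaries: per-vector pair of reference-range boundary vectors (b1, b2)
       coeffs:     direction characteristic coefficient per edge point
    """
    total = 0.0
    for v, (b1, b2), c in zip(vecs, boundaries, coeffs):
        # v lies between b1 and b2 iff its angles to them sum to their
        # mutual angle (small slack for floating-point error).
        inside = angle(v, b1) + angle(v, b2) <= angle(b1, b2) + 1e-9
        # Second judgment function G: zero inside the acute range, else the
        # smaller angle to a boundary (angle conversion taken as identity).
        g = 0.0 if inside else min(angle(v, b1), angle(v, b2))
        total += c * g
    return total

bounds = [((-0.2, -1.0), (0.2, -1.0))]          # illustrative boundary pair
inside = vector_deviation([(0.0, -1.0)], bounds, [1.0])
outside = vector_deviation([(1.0, 0.0)], bounds, [1.0])
print(inside, outside > 0)
```

Running `vector_deviation` once per surviving candidate pose and keeping the minimum implements the selection step at the end of the claim.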
In an embodiment, the action gesture library includes an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
The matching value M_q of the person's action state relative to the q-th action state category is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
m_q is the number of the person's action gestures that appear in the gesture type set corresponding to the q-th action state category, k ∈ [1, m_q]; p_k is the occurrence probability of the k-th such action gesture within the gesture type set corresponding to the q-th action state category.
The action state category corresponding to the maximum M_q is taken as the judgment result.
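A sketch of the matching step: per the description, the matching value of a category sums the occurrence probabilities of the person's observed gestures that appear in that category's gesture type set, and the category with the maximum value wins. The category names and probabilities below are illustrative assumptions.

```python
def matching_value(observed, category_probs):
    """observed:       action gestures seen across the key frames
       category_probs: gesture -> occurrence probability within one action
                       state category's gesture type set
    """
    # Gestures absent from the category's set contribute nothing.
    return sum(category_probs[g] for g in observed if g in category_probs)

# Illustrative state-category library (made-up gestures and probabilities).
library = {
    "walking": {"step": 0.5, "arm_swing": 0.4, "stand": 0.1},
    "resting": {"stand": 0.6, "sit": 0.4},
}
observed = ["step", "stand", "step", "arm_swing"]
scores = {q: matching_value(observed, probs) for q, probs in library.items()}
best = max(scores, key=scores.get)
print(best)
```

Summing per key frame (rather than per distinct gesture) lets a frequently repeated gesture pull the decision toward the category where it is most typical.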
In one embodiment, phy(t) is further compared with a preset reference threshold phy_1, where phy_1 is below the warning threshold phy_0:
if phy(t) does not exceed phy_1, the physiological state of the person is judged normal;
otherwise, the person's action state categories within a second preset time period T_2 are obtained, and the person's motion quantity coefficient Y is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
T_2 is the second preset time period; B is the number of action state categories within T_2, x ∈ [1, B]; t_x is the duration of the x-th action state category; f_x is the influence function of the x-th action state category.
Y is compared with a preset threshold Y_0:
if Y does not reach Y_0, i.e. the person's recent activity does not account for the elevated physiological state value, early warning is performed;
otherwise, the physiological state of the person is judged normal.
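A sketch of the motion quantity coefficient: Y accumulates each action state category's duration over the second preset period, weighted by that category's influence function, here assumed to be a constant intensity factor per category (the actual influence function is not reproduced in the source, and the intensity values below are made up).

```python
def motion_quantity(segments, influence):
    """segments:  (category, duration_seconds) pairs over the second preset period
       influence: category -> intensity factor, standing in for the patent's
                  influence function (assumption)
    """
    return sum(influence[cat] * dur for cat, dur in segments)

# Illustrative intensity factors and a 10-minute activity breakdown.
influence = {"resting": 0.1, "walking": 0.5, "carrying": 0.9}
segments = [("walking", 300), ("carrying", 120), ("resting", 180)]
Y = motion_quantity(segments, influence)
print(Y)
```

Comparing Y with a threshold then decides whether the elevated phy(t) is plausibly explained by exertion or should trigger an early warning.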
The invention has the beneficial effects that:
(1) Through the analysis of the physiological parameter information, on one hand, when a single physiological parameter of a person exceeds its threshold interval, early warning can be realized through the early-warning value corresponding to that parameter; on the other hand, the real-time data, peak data and mean data are combined, through preset coefficients, into a judgment model that incorporates the person's process data over a period before the current time point, so that an abnormality in any parameter is reflected in the result. The fluctuation, variability and overall state of the user's physiological parameters in the time dimension are thereby judged, realizing continuous and accurate monitoring of the person's physiological state with better robustness.
(2) Compared with the prior art, which directly compares a person's motion skeleton with a preset skeleton, the method for judging a person's action gesture has two advantages: first, the two-layer screening of size comparison and vector comparison greatly reduces the number of reference action gestures to compare against, reducing the data processing load and improving judgment efficiency; second, different weight values are set according to the characteristics of different identification points, so the comparison magnifies the influence of key identification points on the result, making the matching better suited to the judgment process, i.e. improving the accuracy of judging the user's action gesture.
Drawings
The invention is further described below with reference to the accompanying drawings.
FIG. 1 is a logic block diagram of a person gesture recognition based safety monitoring system of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring now to FIG. 1, in one embodiment, a person gesture recognition based safety monitoring system is provided, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early-warning module is used for performing state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information respectively, comparing the two analysis results, and safety-monitoring the person's state according to the state analysis and comparison results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel.
Through the above technical scheme, the action state information of each person is acquired through the image acquisition module and the identification module, the physiological parameter information of each person is monitored through the intelligent wearable device, the analysis results of the two are compared, and the person's state is safety-monitored according to the state analysis and comparison results. When an obvious problem appears in a person's working state, or an obvious abnormality appears in a person's physical state, an early-warning signal can be generated to remind the manager to handle it in time; meanwhile, a further judgment is made by comparing the two analysis results, so that when a potential safety hazard exists in the person's state a corresponding early warning is generated in time, improving the accuracy of the judgment result.
Through the analysis of the person's physiological parameter information in this embodiment, on one hand, when a single physiological parameter exceeds its threshold interval, early warning can be realized through the corresponding early-warning value; on the other hand, this embodiment combines the person's process data over a period before the current time point, forming a judgment model from the real-time data, peak data and mean data through preset coefficients, so that an abnormality in any parameter is reflected in the result; the fluctuation, variability and overall state of the physiological parameters in the time dimension are thereby judged, realizing continuous and accurate monitoring with better robustness.
It should be noted that the image acquisition module of the system can be implemented with a high-definition camera, and the identification module is implemented with an AI person recognition model; the physiological parameters monitored by the intelligent wearable device include common parameters such as heart rate, blood pressure, blood oxygen and body temperature, which are not described in further detail in this embodiment.
The preset physiological analysis model analyzes the real-time data, the mean value data and the peak value data of each physiological parameter as follows:
The person's physiological state value phy(t) at the current time point is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
S is the number of physiological parameter monitoring items, z ∈ [1, S]; [L_z, U_z] is the threshold interval corresponding to the z-th physiological parameter monitoring item; v_z is the measured value of the z-th physiological parameter monitoring item; v_z^0 is the standard reference value corresponding to the z-th monitoring item; W is a judgment function that takes one value when v_z lies inside [L_z, U_z] and another when it does not, so that the early-warning value P_z corresponding to the z-th monitoring item contributes only for out-of-interval measurements; T_1 is the first preset time period; v_z^max is the maximum of v_z over T_1; α, β and γ are preset coefficients satisfying α + β + γ = 1; ω_z is the dimensionless weighting coefficient corresponding to the z-th monitoring item.
The person's physiological state value phy(t) is compared with a preset warning threshold phy_0:
if phy(t) exceeds phy_0, an early-warning signal is generated;
otherwise, the judgment is made in combination with the state analysis result.
Through the above technical scheme, this embodiment gives a method for assessing the physiological state of a person: the physiological state value phy(t) at the current time point is calculated and used for the evaluation. The evaluation rests on three components for each monitoring item: the deviation of the item's measured value at the current time point relative to its corresponding interval, the historical maximum state before the current time point, and the historical average state before the current time point, corresponding respectively to the real-time data, the peak data and the mean data of each physiological parameter. The preset coefficients α, β and γ are fitted from empirical data and normalized so that α + β + γ = 1, so each monitoring item can be evaluated comprehensively. In addition, the dimensionless weighting coefficient ω_z is set, after data fitting, according to the value range and influence weight of each physiological parameter, so that the calculation of phy(t) synthesizes the several monitoring items into a more accurate evaluation. phy(t) is then compared with the preset warning threshold phy_0, and when the threshold is exceeded an early warning is issued, so that abnormal physical conditions of personnel are found in time.
It should be noted that in the above technical solution the first preset period T_1 is selected according to the scenario in which the system is applied, and the threshold intervals and standard reference values are selected according to empirical data for each physiological parameter item; these are not described in detail here.
As one embodiment of the present invention, the process of identifying the image information by the identification module includes:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
The identification points comprise a center identification point and a plurality of edge identification points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
Through the above technical scheme, this embodiment provides the process by which the identification module identifies the image information and the state is analyzed. First, the body contour of the person is identified based on AI, and identification points are added on the body contour accordingly; in this embodiment, the identification points include one center identification point located at the center of the body and five edge identification points located at the head and the four limbs. The distribution information of the identification points, namely the vectors of the edge identification points relative to the center identification point, is obtained through key frames in the image information. In the state analysis, the distribution information of the identification points is compared with the action gesture library, the person's action gesture is determined from the comparison result, and the person's action state category is then determined from the person's several action gestures within a period of time.
As one embodiment of the present invention, the process of comparing the human vector sequence with the reference models of different poses, respectively, includes size comparison:
the size comparison process comprises the following steps:
The vector-modulus sequence |v_1|, |v_2|, …, |v_n| is extracted from the person vector sequence in a preset fixed order.
The size deviation coefficient P_j of the person's vector-modulus sequence relative to the j-th pose reference model is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
n is the number of edge identification points, i ∈ [1, n]; [a_i, b_i] is the reference range interval corresponding to |v_i| in the j-th pose reference model; F is a first judgment function that takes one value when |v_i| lies inside [a_i, b_i] and another otherwise; m_i is the middle value of the interval [a_i, b_i] and r_i is its range value; λ_i is the dimension characteristic coefficient corresponding to the i-th edge identification point.
P_j is compared with a preset threshold P_0:
if P_j does not exceed P_0, matching with the j-th pose reference model is judged successful;
otherwise, the matching is judged to have failed.
Through the above technical solution, this embodiment provides the first screening step of the comparison process, namely preliminary matching by size comparison. First, the vector-modulus sequence is extracted from the person vector sequence in a preset fixed order; then the size deviation coefficient P_j of the person's vector-modulus sequence relative to the j-th pose reference model is calculated. The first judgment function F distinguishes moduli that lie inside their reference range interval from those that do not; each reference range interval is set according to the error interval of the standard values of the j-th pose reference model, and the dimension characteristic coefficients, which reflect the influence weights of the different edge identification points, are set after data fitting. P_j therefore measures the deviation of the person's pose data from the corresponding pose reference model. By comparing P_j with the preset threshold P_0, a successful match between the current person pose and the j-th pose reference model can be judged, completing the preliminary screening of poses.
As one implementation mode of the invention, the process of comparing the personnel vector sequence with the reference models with different postures respectively further comprises vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
The vector deviation coefficient Q_j of the person vector sequence relative to the j-th pose reference model is calculated by a preset formula (in the source the formula survives only as an equation image; the symbols below are assigned for readability), wherein:
G is a second judgment function; c_i and d_i are the reference range boundary vectors corresponding to v_i in the j-th pose reference model; if v_i lies within the acute-angle range between c_i and d_i, G takes the in-range value; otherwise G is determined, through an angle conversion function, from θ(v_i, c_i) and θ(v_i, d_i), the angles between v_i and each boundary vector; μ_i is the direction characteristic coefficient corresponding to the i-th edge identification point.
The reference model yielding the minimum Q_j is selected, and its corresponding action gesture is taken as the judgment result.
Through the above technical scheme, in this embodiment, after the preliminary screening by size comparison is completed, a further screening judgment is performed through the vector comparison process. The vector deviation coefficient of the personnel vector sequence relative to the jth posture reference model is calculated, in which the boundary vectors are set according to the error interval of the standard vectors of the jth posture reference model; the second judgment function returns zero when the personnel vector lies within the acute-angle range spanned by the two boundary vectors, and otherwise a nonzero value based on the boundary angles; the angle conversion function is a preset function for quantifying an angle; and the direction characteristic coefficients are influence weights measured for the different edge identification points, set after data fitting. The posture reference model that yields the minimum vector deviation coefficient is used to judge the person's posture type, so the person's current posture type can be determined.
Compared with the prior art, which directly compares a human motion model skeleton with a preset skeleton, the method for judging the human action posture in this embodiment has two advantages. First, the two-layer screening of size comparison followed by vector comparison greatly reduces the number of reference action postures to be compared, which reduces the data processing amount and improves judgment efficiency. Second, different weight values are set for the features of different identification points, so the influence of key identification points on the result is amplified during comparison and analysis; the matching process is thus better fitted to each posture, improving the accuracy of judging the user's action posture.
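As with the size comparison, the vector-comparison formula appears only as an image in the source, so the second screening layer can only be sketched under assumptions: the in-sector test, the use of radians as the "angle conversion function", and the direction weights below stand in for the patented definitions.

```python
import math

def angle(u, v):
    # planar angle between two 2-D vectors, in radians
    dot = u[0] * v[0] + u[1] * v[1]
    cos_t = dot / (math.hypot(*u) * math.hypot(*v))
    return math.acos(max(-1.0, min(1.0, cos_t)))

def vector_deviation(vectors, boundaries, dir_weights):
    """Vector deviation coefficient of a person vector sequence against one
    posture reference model.  boundaries[i] = (a_i, b_i), the reference-range
    boundary vectors for identification point i.  Assumed second judgment
    function: 0 when the person vector lies inside the acute sector spanned
    by a_i and b_i, otherwise the smaller boundary angle weighted by the
    direction characteristic coefficient."""
    total = 0.0
    for v, (a, b), w in zip(vectors, boundaries, dir_weights):
        t_a, t_b = angle(v, a), angle(v, b)
        sector = angle(a, b)
        inside = t_a <= sector and t_b <= sector
        total += 0.0 if inside else w * min(t_a, t_b)
    return total
```

Among the models that survived the size screening, the one with the minimum vector deviation coefficient gives the judged action posture.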
As one embodiment of the present invention, the motion gesture library includes an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
the matching value of the person's action postures against the qth action state category is calculated by a preset formula;

wherein the number of person action postures present in the gesture type set corresponding to the qth action state category and the occurrence probability of the kth person action posture in that set enter the formula;

the action state category with the maximum matching value is taken as the judgment result.
Through the above technical scheme, this embodiment first makes a preliminary judgment according to the person's action postures and gives an early warning when an abnormal posture type is present; otherwise, the person's action postures in a plurality of key frames are compared with the preset state class library to judge the person's state. Specifically, the posture types corresponding to each common state class are set first, the average duration occupied by each posture type within the state class is counted, and the occurrence probability of each posture type is determined accordingly. The matching value for each action state category is then calculated: the greater the occurrence probabilities of the observed postures within a category's gesture type set, the larger the matching value. Selecting the action state category with the maximum matching value as the judgment result accurately determines the person's current state.
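With the occurrence probabilities counted as described, the category matching can be sketched as follows. The exact matching-value formula is an image in the source, so the mean-probability form used here is an assumption that preserves the stated property: higher occurrence probabilities of the observed postures give a larger matching value.

```python
def match_value(observed_gestures, category_probs):
    """Matching value of the gestures observed in the key frames against one
    action state category.  category_probs maps gesture type -> occurrence
    probability inside that category's gesture type set; gestures absent
    from the set contribute 0 (assumed form)."""
    if not observed_gestures:
        return 0.0
    total = sum(category_probs.get(g, 0.0) for g in observed_gestures)
    return total / len(observed_gestures)

def judge_state(observed_gestures, state_library):
    # the category with the maximum matching value is the judgment result
    return max(state_library, key=lambda q: match_value(observed_gestures, state_library[q]))
```

For example, with a hypothetical library {"walking": {"walk": 0.7, "stand": 0.2}, "sitting": {"sit": 0.8, "stand": 0.1}}, observed key-frame gestures ["walk", "walk", "stand"] select "walking".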
As one embodiment of the invention, phy(t) is compared with a preset reference threshold, the reference threshold being smaller than the preset warning threshold:

if phy(t) does not exceed the reference threshold, the physiological state of the person is judged to be normal;

otherwise, the person's action state categories within a second preset time period are obtained;

the person motion quantity coefficient Y is calculated by a preset formula;

wherein B is the number of action state categories within the second preset time period, x∈[1,B]; the duration of the xth action state category is weighted by the influence function of the xth action state category;

Y is compared with a preset threshold:

if Y does not reach the threshold, an early warning is given;

otherwise, the physiological state of the person is judged to be normal.
Through the technical scheme, the embodiment is as follows
Figure SMS_204
In the state, phy (t) is further combined with a preset reference threshold +.>
Figure SMS_208
Performing comparison, and adding->
Figure SMS_211
Is a preset early warning threshold value->
Figure SMS_205
Is a preset reference threshold value, is determined according to the normal range data fitting of the physiological parameters of the human body, and is +.>
Figure SMS_209
Thus->
Figure SMS_212
In the case of normal physiological conditions, if +.>
Figure SMS_214
When it is, by judging personnel->
Figure SMS_202
The state within the period is further determined, in particular by the formula +.>
Figure SMS_207
Calculating to obtain a personnel motion quantity coefficient Y; influence function->
Figure SMS_210
According to the influence range of different state categories on personnel statesThe measurement data of the degree are obtained by fitting, and therefore by
Figure SMS_213
Can judge->
Figure SMS_203
The state of motion of the person in the period of time, and thus +.>
Figure SMS_206
The physiological parameter states of the personnel are described to be abnormal, and then the personnel are timely reminded in an early warning mode, so that the accuracy of judging the physical states of the personnel is improved.
The second preset time period is set selectively according to the application field of the system and is not further limited in this embodiment.
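The two-stage physiological check (phy(t) against the reference threshold, then the motion quantity Y) can be sketched as below. The patented formulas for phy(t) and Y are images in the source; the normalized-deviation form, the 0.3 peak coefficient, and the duration-weighted sum are therefore labeled assumptions, as is the rule that a low Y alongside an elevated phy(t) triggers the warning.

```python
def phy(measurements, items):
    """Assumed form of the physiological state value at the current time.
    items: per-monitoring-item dicts with lo/hi (threshold interval),
    ref (standard reference value), peak (max reading over the first
    preset period), weight (de-dimensioned weighting coefficient)."""
    total = 0.0
    for m, it in zip(measurements, items):
        gate = 0.0 if it["lo"] <= m <= it["hi"] else 1.0    # judgment function W (assumed 0/1)
        dev = abs(m - it["ref"]) / it["ref"]                # relative deviation of the reading
        peak_dev = abs(it["peak"] - it["ref"]) / it["ref"]  # historical peak term
        total += it["weight"] * (gate * dev + 0.3 * peak_dev)  # 0.3: assumed preset coefficient
    return total

def motion_coefficient(state_categories):
    """Assumed motion quantity Y over the second preset period:
    state_categories is a list of (duration, influence) pairs, influence
    playing the role of the fitted influence function."""
    return sum(d * f for d, f in state_categories)

def check(phy_value, ref_thr, warn_thr, y_value, y_thr):
    # warn when phy(t) is past the warning threshold outright, or when it
    # is elevated but recent motion (Y) cannot account for it
    if phy_value >= warn_thr:
        return "warn"
    if phy_value <= ref_thr:
        return "normal"
    return "warn" if y_value < y_thr else "normal"
```

This reproduces the decision flow described above: a normal reading passes immediately, a clearly abnormal one warns immediately, and the intermediate band is resolved by the motion quantity.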
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention. All equivalent changes and modifications within the scope of the present invention are intended to be covered by the present invention.

Claims (8)

1. A safety monitoring system based on character gesture recognition, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early warning module is used for respectively carrying out state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information, carrying out comparison analysis on the analysis results of the state analysis and the physiological parameter analysis, and carrying out safety monitoring on the state of the person according to the state analysis and the comparison analysis results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel.
2. The safety monitoring system based on character gesture recognition according to claim 1, wherein the process of analyzing the real-time data, the mean value data and the peak value data of each physiological parameter by the preset physiological analysis model is as follows:
calculating, by a preset formula, a person physiological state value phy(t) at the current time point;

wherein S is the number of physiological parameter monitoring items, z∈[1,S]; the zth monitoring item has a corresponding threshold interval, measured value, standard reference value and early-warning value; W is a judgment function that takes one preset value when the measured value of the zth item lies within its threshold interval and another otherwise; the formula further uses the maximum measured value within a first preset time period, three preset coefficients satisfying a preset constraint, and a de-dimensioned weighting coefficient corresponding to the zth monitoring item;

the person physiological state value phy(t) is compared with a preset warning threshold:

if phy(t) exceeds the warning threshold, an early warning signal is generated;

otherwise, the judgment is made in combination with the state analysis result.
3. The safety monitoring system according to claim 2, wherein the process of identifying the image information by the identification module comprises:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
4. A person gesture recognition based safety monitoring system according to claim 3, wherein the recognition points comprise a center recognition point and a plurality of edge recognition points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
5. The person gesture recognition based safety monitoring system of claim 4, wherein the process of comparing the person vector sequences with reference models of different gestures, respectively, comprises size comparison:
the size comparison process comprises the following steps:
according toExtracting vector module sequences in personnel vector sequences in preset fixed order
Figure QLYQS_21
And satisfy->
Figure QLYQS_22
By the formula
Figure QLYQS_23
Calculating the size deviation coefficient of the vector mode sequence of the obtained personnel relative to the j-th gesture reference model +.>
Figure QLYQS_24
Wherein n is the number of edge recognition points, i E [1, n];
Figure QLYQS_27
For j-th pose reference model +.>
Figure QLYQS_28
Corresponding to the interval of the reference range,
Figure QLYQS_31
as a first judgment function, when->
Figure QLYQS_25
When (I)>
Figure QLYQS_29
The method comprises the steps of carrying out a first treatment on the surface of the Otherwise, go (L)>
Figure QLYQS_32
Figure QLYQS_34
For interval->
Figure QLYQS_26
Intermediate value of>
Figure QLYQS_30
For interval->
Figure QLYQS_33
Is a range value of (2); />
Figure QLYQS_35
The dimension characteristic coefficient corresponding to the ith gesture is obtained;
will be
Figure QLYQS_36
And a preset threshold->
Figure QLYQS_37
And (3) performing comparison:
if it is
Figure QLYQS_38
Judging that the matching with the j-th gesture reference model is successful;
otherwise, judging that the matching is failed.
6. The person gesture recognition based safety monitoring system of claim 5, wherein the process of comparing the person vector sequence with reference models of different gestures, respectively, further comprises vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
calculating, by a preset formula, the vector deviation coefficient of the person vector sequence relative to the jth posture reference model;

wherein the second judgment function operates, for each person vector, on the pair of reference-range boundary vectors defined for that vector in the jth posture reference model;

if the person vector lies within the acute-angle range spanned by the two boundary vectors, the second judgment function returns zero; otherwise it returns a value based on the angles between the person vector and each boundary vector, quantified by an angle conversion function and weighted by the direction characteristic coefficient of the corresponding identification point;

the action posture whose reference model yields the minimum vector deviation coefficient is selected as the judgment result.
7. The safety monitoring system based on character gesture recognition according to claim 6, wherein the action gesture library comprises an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
calculating, by a preset formula, the matching value of the person's action postures against the qth action state category;

wherein the number of person action postures present in the gesture type set corresponding to the qth action state category and the occurrence probability of the kth person action posture in that set enter the formula;

the action state category with the maximum matching value is taken as the judgment result.
8. The person gesture recognition based safety monitoring system of claim 7, wherein phy(t) is compared with a preset reference threshold, the reference threshold being smaller than the preset warning threshold:

if phy(t) does not exceed the reference threshold, the physiological state of the person is judged to be normal;

otherwise, the person's action state categories within a second preset time period are obtained;

the person motion quantity coefficient Y is calculated by a preset formula;

wherein B is the number of action state categories within the second preset time period, x∈[1,B]; the duration of the xth action state category is weighted by the influence function of the xth action state category;

Y is compared with a preset threshold:

if Y does not reach the threshold, an early warning is given;

otherwise, the physiological state of the person is judged to be normal.
CN202310525777.9A · filed 2023-05-11 · Safety monitoring system based on figure gesture recognition · Active · granted as CN116269355B (en)

Priority Applications (1)

Application Number · Priority Date · Filing Date · Title
CN202310525777.9A · 2023-05-11 · 2023-05-11 · Safety monitoring system based on figure gesture recognition


Publications (2)

Publication Number · Publication Date
CN116269355A · 2023-06-23
CN116269355B · 2023-08-01

Family

ID=86796182

Family Applications (1)

CN202310525777.9A · Active · granted as CN116269355B (en)

Country Status (1)

CN · CN116269355B (en)


Citations (12)

* Cited by examiner, † Cited by third party

- JP2005000265A * · 2003-06-10 / 2005-01-06 · Hitachi Ltd · Onset risk knowledge construction method and health management device by health condition
- JP2012045373A * · 2010-07-26 / 2012-03-08 · Sharp Corp · Biometric apparatus, biometric method, control program for biometric apparatus, and recording medium recording the control program
- CN105142515A * · 2013-04-05 / 2015-12-09 · 赫尔比公司 · Method for determining a person's sleeping phase which is favourable for waking up
- CN111904400A * · 2020-07-17 / 2020-11-10 · 三峡大学 · Electronic wrist strap
- CN113850535A * · 2021-11-30 / 2021-12-28 · 中通服建设有限公司 · Intelligent construction site personnel management method based on wearable equipment
- CN114067358A * · 2021-11-02 / 2022-02-18 · 南京熊猫电子股份有限公司 · Human body posture recognition method and system based on key point detection technology
- CN114283494A * · 2021-12-14 / 2022-04-05 · 联仁健康医疗大数据科技股份有限公司 · Early warning method, device, equipment and storage medium for user falling
- CN114668388A * · 2022-02-16 / 2022-06-28 · 深圳技术大学 · Intelligent elderly health monitoring method, device, terminal and storage medium
- CN114916935A * · 2022-07-20 / 2022-08-19 · 南京慧智灵杰信息技术有限公司 · Posture analysis auxiliary correction system based on correction process of correction personnel
- CN115299887A * · 2022-10-10 / 2022-11-08 · 安徽星辰智跃科技有限责任公司 · Detection and quantification method and system for dynamic metabolic function
- CN115736902A * · 2022-12-01 / 2023-03-07 · 广州市汇源通信建设监理有限公司 · Constructor management system based on intelligent wearable equipment
- CN115969383A * · 2023-02-16 / 2023-04-18 · 北京科技大学 · Human body physiological fatigue detection method based on electrocardiosignals and respiratory signals


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张力行; 叶宁; 黄海平; 王汝传: "Bimodal emotion recognition system based on galvanic skin response signals and text information" (基于皮肤电信号与文本信息的双模态情感识别系统), 计算机系统应用 (Computer Systems & Applications), no. 11 *

Cited By (6)

* Cited by examiner, † Cited by third party

- CN116778670A * · 2023-07-06 / 2023-09-19 · 紫光汇智信息技术有限公司 · Early warning method for monitoring personnel lodging based on state analysis
- CN116703429A * · 2023-08-07 / 2023-09-05 · 深圳市磐锋精密技术有限公司 · Intelligent tray access system based on the Internet of Things
- CN116703429B * · 2023-08-07 / 2023-12-15 · 深圳市磐锋精密技术有限公司 · Intelligent charging tray access system based on Internet of things
- CN117351405A * · 2023-12-06 / 2024-01-05 · 江西珉轩智能科技有限公司 · Crowd behavior analysis system and method
- CN117351405B * · 2023-12-06 / 2024-02-13 · 江西珉轩智能科技有限公司 · Crowd behavior analysis system and method
- CN117936073A * · 2024-01-19 / 2024-04-26 · 南京铁道职业技术学院 · Remote real-time on-line monitoring management system for human health information

Also Published As

Publication Number · Publication Date
CN116269355B (en) · 2023-08-01

Similar Documents

Publication · Title
CN116269355B (en) · Safety monitoring system based on figure gesture recognition
CN109009017B (en) · An intelligent health monitoring system and data processing method thereof
CN109117730B (en) · Real-time electrocardiogram atrial fibrillation judgment method, device and system and storage medium
CN114742090B (en) · Cabin man-machine interaction system based on mental fatigue monitoring
CN109700450B (en) · Heart rate detection method and electronic equipment
CN107440680A (en) · Sleep state estimating device
US10368792B2 (en) · Method for detecting deception and predicting interviewer accuracy in investigative interviewing using interviewer, interviewee and dyadic physiological and behavioral measurements
WO2006113697A1 (en) · Trainable diagnostic system and method of use
CN115736902A (en) · Constructor management system based on intelligent wearable equipment
US20210290139A1 (en) · Apparatus and method for cardiac signal processing, monitoring system comprising the same
CN108596087A (en) · A kind of driving fatigue degree detecting regression model based on dual network result
CN109567832A (en) · A kind of method and system of the angry driving condition of detection based on Intelligent bracelet
Jin et al. · A novel incremental and interactive method for actual heartbeat classification with limited additional labeled samples
CN113907770B (en) · Spike-slow complex detection and recognition method and system based on feature fusion
CN112597949A (en) · Psychological stress measuring method and system based on video
TW202247816A (en) · Non-contact heart rhythm category monitoring system and method
CN116115239A (en) · Embarrassing working gesture recognition method for construction workers based on multi-mode data fusion
Ziaratnia et al. · Multimodal deep learning for remote stress estimation using CCT-LSTM
CN119523432A (en) · An intelligent cervical vertebra status data monitoring system
CN118279964B (en) · Passenger cabin comfort level recognition system and method based on face video non-contact measurement
CN117807551B (en) · Heart rate abnormality capturing method and system based on intelligent ring
CN118873148A (en) · Electrocardiogram long-tail classification method, device and medium based on anomaly detection pre-training
CN116712099A (en) · Fetal heart rate status detection method, electronic equipment, and storage media based on multi-modal data
CN115496105B (en) · Sleep prediction model training method, sleep condition prediction method and related devices
CN117174314A (en) · Edge data processing system of personal information acquisition equipment

Legal Events

Code · Title
PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
