CN111052049A - Motion recognition method, device and terminal - Google Patents

Motion recognition method, device and terminal

Info

Publication number
CN111052049A
CN111052049A (Application CN201880059049.0A)
Authority
CN
China
Prior art keywords
user
amplitude
sign information
ecg
ecg sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880059049.0A
Other languages
Chinese (zh)
Inventor
孙士友
王志勇
李彦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN111052049A
Legal status: Pending

Abstract

Translated from Chinese

The embodiments of the present application disclose a motion recognition method, device, and terminal. The method includes acquiring sign information of a user through an ECG detection device and determining the user's gesture according to that sign information. Because gestures can be recognized from the sign information, human-computer interaction can be carried out through gestures, which makes human-computer interaction more convenient.

Description

Action recognition method and device, and terminal

Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method and an apparatus for motion recognition, and a terminal.
Background
With the rapid development of terminal technology, human-computer interaction now permeates many aspects of daily life and is an important development direction for terminals and related new technologies.
Generally, a terminal realizes human-computer interaction through a touch screen or physical keys, but this approach can be inconvenient for users. For example, when the terminal is playing music and the user wants to play the next track but cannot conveniently touch the terminal with a hand, the user has no way to select the next track through the touch screen or a physical button.
Disclosure of Invention
The technical problem to be solved by the embodiments of the present application is to provide a method, a device and a terminal for motion recognition, which can realize human-computer interaction based on gesture information, so that human-computer interaction is more convenient.
In a first aspect, an embodiment of the present application provides a motion recognition method applied to a terminal that includes an ECG detection device. In the method, sign information of a user is obtained through the ECG detection device, and a gesture of the user is determined according to the sign information of the user.
In this technical solution, the terminal acquires sign information through the ECG detection device and recognizes the user's gesture from that information, so human-computer interaction can be carried out according to the gesture. Because the gesture is made by the hand wearing the terminal, the interaction can be completed with one hand, which makes it more convenient. Moreover, the solution can be implemented by any terminal already provided with an ECG detection device for detecting electrocardiogram information, so no additional hardware is required and the cost of the solution is low.
As an alternative embodiment, the ECG detection device includes a first ECG sensor and a second ECG sensor. When the sign information of the user is detected, the first ECG sensor and the second ECG sensor are in contact with the skin of the user; the first ECG sensor is configured as a signal positive pole, the second ECG sensor is configured as a signal negative pole or a signal ground, and the sign information of the user wearing the terminal is acquired through the first ECG sensor and the second ECG sensor.
In this technical solution, because the first ECG sensor is configured as the signal positive pole and the second ECG sensor as the signal negative pole or signal ground, the information acquired by the second ECG sensor can be used as reference information, which improves the accuracy and stability of the acquired sign information.
As an optional implementation, the ECG detection apparatus includes a third ECG sensor configured as a positive signal pole or a negative signal pole, and performs sign information detection on the user through the third ECG sensor, and performs filtering processing on the detected sign information.
In the technical scheme, the terminal performs filtering processing on the sign information acquired by the third ECG sensor so as to improve the accuracy of the acquired sign information and reduce noise interference.
As an alternative embodiment, if the operation of starting the detection of the vital sign information is detected, the step of obtaining the vital sign information by the ECG detection device is executed.
In the technical scheme, the ECG detection device acquires the physical sign information only when the operation of starting the detection of the physical sign information is detected, so that the power consumption can be reduced, and the consumption of CPU resources is reduced.
As an optional implementation, if the sign information includes a valid amplitude pair, the gesture of the user is determined to be fist making. A valid amplitude pair consists of a valid positive amplitude and a valid negative amplitude, where the time interval between their occurrences is smaller than a preset time interval threshold, the amplitude value of the valid positive amplitude is larger than a first preset amplitude threshold, and the absolute value of the amplitude value of the valid negative amplitude is larger than a second preset amplitude threshold. If the sign information includes an amplitude whose amplitude value is larger than a third preset amplitude threshold, the gesture of the user is determined to be wrist flipping.
In this technical solution, the terminal can identify the user's gesture from the amplitudes of the sign information, so human-computer interaction can be realized through gesture recognition, which makes the interaction more convenient.
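The classification rule above can be summarized in a small sketch. The following is a minimal illustration, assuming the sign information is available as a list of (timestamp in ms, amplitude in μV) samples; the amplitude thresholds reuse the values of the worked examples later in the description (600 μV, 800 μV, and 2000 μV), while the time window is an assumed placeholder.

```python
# Minimal sketch of the amplitude-based gesture test described above.
# The time window is an illustrative placeholder, not a value from the application.

POS_THRESHOLD_UV = 600     # first preset amplitude threshold
NEG_THRESHOLD_UV = 800     # second preset amplitude threshold (absolute value)
PAIR_GAP_MS = 200          # preset time interval threshold (assumed)
WRIST_THRESHOLD_UV = 2000  # third preset amplitude threshold

def contains_valid_amplitude_pair(samples):
    """True if a valid positive and a valid negative amplitude occur close together."""
    positives = [t for t, a in samples if a > POS_THRESHOLD_UV]
    negatives = [t for t, a in samples if a < 0 and abs(a) > NEG_THRESHOLD_UV]
    return any(abs(tp - tn) < PAIR_GAP_MS for tp in positives for tn in negatives)

def classify_gesture(samples):
    """Map a waveform segment to a gesture label, or None if no gesture is found."""
    if contains_valid_amplitude_pair(samples):
        return "fist"
    if any(abs(a) > WRIST_THRESHOLD_UV for _, a in samples):
        return "wrist_flip"
    return None
```

For instance, under these values, classify_gesture([(221369, 820), (221500, -1000)]) returns "fist", matching the 820 μV / -1000 μV worked example given later in the description.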
In a second aspect, a motion recognition apparatus is provided, which is applied to a terminal and has a function of implementing behaviors in the first aspect or a possible implementation manner of the first aspect. The function can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules corresponding to the functions described above. The module may be software and/or hardware.
In a third aspect, a terminal is provided, which includes: a memory for storing one or more programs; a processor for calling the program stored in the memory to implement the scheme in the method design of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, on which a computer program is stored, which, when being executed by at least one processor, may carry out the possible embodiments and advantages of the first aspect and the first aspect described above.
In a fifth aspect, an embodiment of the present invention provides a computer program product, where the computer program product includes a non-volatile computer-readable storage medium storing a computer program. When executed, the computer program causes a computer to implement the steps of the method of the first aspect. For the embodiments and advantages of the computer program product, reference may be made to the first aspect and its possible implementations; repeated details are not described again.
Drawings
Fig. 1-2 are schematic structural diagrams of a terminal disclosed in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a dial disclosed in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an ECG sensor of a smart watch according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an ECG sensor of another smart watch disclosed in an embodiment of the present application;
FIG. 6 is a flow chart illustrating a method for motion recognition disclosed in an embodiment of the present application;
FIG. 7 is a schematic diagram of a fist clenching action disclosed in an embodiment of the present application;
FIG. 8 is a schematic diagram of a fist releasing action disclosed in an embodiment of the present application;
FIG. 9 is a schematic diagram illustrating a wrist flipping gesture according to an embodiment of the disclosure;
fig. 10 is a schematic structural diagram of a motion recognition device according to an embodiment of the present invention.
Detailed Description
The following description will be made with reference to the drawings in the embodiments of the present application.
Generally, a terminal realizes human-computer interaction through a touch screen or physical keys, but this approach can be inconvenient for users. The present application provides a motion recognition method, device, and terminal that realize human-computer interaction based on gesture information, making human-computer interaction more convenient.
The terminal in the embodiment of the present invention may refer to a mobile phone, a bracelet, a smart watch, and the like, and the terminal may be in the form of a bracelet, an armband, or other mobile devices.
In one embodiment, the terminal is exemplified by the smart watch shown in fig. 1 and fig. 2, where fig. 1 is a schematic side view of the smart watch (the side that contacts the user's skin when the smart watch is worn) and fig. 2 is a schematic front view (the display side of the smart watch). The smart watch 10 may include a watch face 11 and a watch strap 12 for attaching the watch face 11 to the user. The watch strap 12 may be integrated with the housing of the watch face 11 or may be a separate component; if integrated, the watch strap 12 may be a continuation of the outer cover. In some cases (for example, when the strap and the watch face can exchange information), an element may be disposed on the watch strap 12; for instance, an Electrocardiogram (ECG) sensor for detecting sign information may be disposed on the inner surface of the watch strap 12 (the surface that contacts the user's skin when the smart watch is worn), and the ECG sensor on the watch strap 12 may transmit the detected sign information to the watch face 11.
The watch face 11 may include a housing and various elements such as a processor, a memory, and a display. These elements may be disposed on an exterior surface of the housing, partially within the housing, through the housing, or fully within the housing. The housing may include a dial bottom 13, a dial top 14, and curved side portions extending from the dial bottom 13 to the dial top 14.
The dial bottom 13 is the surface that contacts the user's hand when the smart watch is worn, and the dial top 14 is the display surface of the smart watch's display.
A display 15 is disposed on the outer surface of the dial top 14, and the user can interact with the information shown on the display 15 through gestures. For example, when an application is displayed on the display 15, the user may make a designated gesture (such as a fist-making gesture) to cause the display 15 to show the next page of the application.
Referring to fig. 3, which is a schematic structural diagram of the elements of the dial 11, the dial 11 includes a clock module 112, a power management module 113, a baseband chip 114, a memory 115 (one or more computer-readable storage media), a communication module 116, and a peripheral system 117. These elements are interconnected with a processor (CPU) 111 via one or more communication buses. The functions of these elements may be implemented by the processor 111, by the individual elements independently, or by a combination of several elements.
The clock module 112 is mainly used to generate the clocks required for data transmission and timing control by the processor 111. The power management module 113 is mainly used to provide a stable, high-precision voltage for the processor 111, the communication module 116, and the peripheral system. The baseband chip 114 is mainly used to synthesize a baseband signal to be transmitted or to decode a received baseband signal.
The memory 115 is coupled to the processor 111 and stores various software programs and/or sets of instructions. In particular implementations, the memory 115 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 115 may store an operating system (hereinafter referred to simply as a system), such as an embedded operating system like ANDROID, IOS, WINDOWS, or LINUX. The memory 115 may also store network communication programs that can be used to communicate with one or more additional devices, terminal devices, or network devices. The memory 115 may further store a user interface program, which can vividly display the content of an application program through a graphical operation interface and receive control operations from the user through input controls such as menus, dialog boxes, and buttons.
Memory 115 may also store one or more application programs. As shown in fig. 3, these applications may include: social applications (e.g., Facebook), image management applications (e.g., photo album), map-like applications (e.g., Google map), browsers (e.g., Safari, Google Chrome), sign information detection applications, and so forth.
The communication module 116 may be configured to establish a communication connection with an external device. The communication module 116 may include a Radio Frequency (RF) module 1160 configured to receive and transmit radio frequency signals, and may further include a SIM card 1161, a WiFi module 1162, a Bluetooth module 1163, and a near field communication (NFC) module 1164.
The peripheral system 117 is mainly used to implement interaction between the smart watch 10 and the user or external environment, and mainly includes the input/output devices of the smart watch 10. In a specific implementation, the peripheral system 117 may include a touch screen controller 118, a camera controller 119, an audio controller 120, and a sensor management module 121. Each controller may be coupled with or integrated into its corresponding peripheral device (such as the touch screen 123, the camera 124, the audio circuit 125, and the sensor 126), or may be integrated into the central processing unit (CPU) in the dial. In some embodiments, the touch screen 123 may be a touch screen configured with a self-capacitive floating touch panel, or a touch screen configured with an infrared floating touch panel. In some embodiments, the camera 124 may be a 3D camera. It should be noted that the peripheral system 117 may also include other I/O peripherals.
The touch screen 123 may include or correspond to the display 15 shown in fig. 2.
The sensors 126 may include an infrared sensor 1261, a motion sensor 1262, a fingerprint sensor 1263, and an ECG detection device 1260. Conventionally, the ECG detection device 1260 is used only for detecting the user's electrocardiogram information, such as the user's heart rate, blood pressure, or blood oxygen values, from which the processor 111 can determine the user's health status. In the present invention, the ECG detection device 1260 is additionally used for detecting the sign information of the user, and the processor 111 determines the user's gesture according to that sign information, so that human-computer interaction can be conveniently realized through gestures. Because the solution can be implemented by any terminal already provided with an ECG detection device for detecting electrocardiogram information, no extra hardware is required and the cost of the solution is low.
The ECG detection device 1260 may include at least one ECG sensor disposed on the housing of the dial 11 or on the watch strap 12, and the at least one ECG sensor may be used for sign information detection. The deployment of the ECG sensors of the smart watch 10 and the implementation of sign information detection may refer to the embodiments shown in fig. 4 to fig. 5.
Referring to fig. 4, which is a schematic structural diagram of an ECG sensor arrangement of the smart watch 10, the smart watch 10 includes a first ECG sensor 16 and a second ECG sensor 17 disposed on the outer surface of the dial bottom 13, and the first ECG sensor 16 and the second ECG sensor 17 can be used for detecting sign information.
The first ECG sensor 16 and the second ECG sensor 17 may acquire the sign information of the user wearing the terminal in real time, so that the user's gesture can be recognized in real time to improve recognition accuracy; alternatively, they may acquire the sign information periodically or only in certain specific scenes, which reduces power consumption and the consumption of CPU resources.
One such specific scene is a scene in which it is inconvenient for the user to trigger the terminal through the touch screen or a physical key. For example, when it is detected that the user is in a motion state (such as running), the first ECG sensor 16 and the second ECG sensor 17 may acquire the sign information of the user, so that the user's gesture can be recognized from the sign information and the corresponding operation can then be performed, realizing human-computer interaction more conveniently.
Another specific scene is one in which an operation of starting sign information detection is detected. Detecting such an operation may refer to receiving an operation instruction for the sign information detection application program, or detecting an operation of enabling a function option for detecting sign information.
A further specific scene is one in which a target application program is running, where the target application program is an application that can be controlled by user gestures. For example, the user may set applications whose usage frequency is greater than a preset frequency value, such as a music application, a social application, a browser application, or an e-reader application, to be controllable by gestures.
For example, if the processor 111 detects a click operation on an icon of the target application program, receives a voice instruction to run the target application program, or determines by looking up the application process list that the target application program is included in the list, the smart watch 10 determines that a running instruction for the target application program has been detected. The processor 111 may then configure the first ECG sensor 16 as the signal positive pole and the second ECG sensor 17 as the signal negative pole or signal ground, perform information detection with both sensors, and synthesize the information detected by the first ECG sensor 16 with the information detected by the second ECG sensor 17 to obtain the sign information of the user wearing the terminal.
It should be noted that, to prevent noise from interfering with the sign information and to improve its accuracy, the processor 111 may configure the second ECG sensor as a reference sensor, that is, treat the information acquired by the second ECG sensor as reference information. This is why the first ECG sensor 16 is configured as the signal positive pole and the second ECG sensor 17 as the signal negative pole or signal ground.
The synthesis of the information detected by the first ECG sensor 16 and the information detected by the second ECG sensor 17 may be performed by calculating the difference between the value detected by the first ECG sensor 16 and the value detected by the second ECG sensor 17, and taking that difference as the sign information of the user. The sign information includes at least one of electrocardiographic information, electrodermal information, and electromyographic information.
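As a rough illustration of this differential synthesis, the sketch below subtracts the reference (negative pole or ground) reading from the positive pole reading sample by sample. The sample values and the assumption of lockstep sampling are illustrative; the application itself only specifies that the difference of the two detected values is used as the sign information.

```python
# Sketch of synthesizing two electrode streams into sign information by taking
# the per-sample difference between the positive-pole sensor and the reference
# sensor. Values are hypothetical voltages in microvolts.

def synthesize_sign_information(first_sensor_uv, second_sensor_uv):
    """Positive-pole reading minus reference reading, sample by sample."""
    if len(first_sensor_uv) != len(second_sensor_uv):
        raise ValueError("sensor streams must be sampled in lockstep")
    return [a - b for a, b in zip(first_sensor_uv, second_sensor_uv)]

# Common-mode interference present on both electrodes largely cancels out:
first = [105.0, 930.0, -880.0, 110.0]
second = [100.0, 110.0, 120.0, 105.0]
sign_info = synthesize_sign_information(first, second)  # [5.0, 820.0, -1000.0, 5.0]
```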
The first ECG sensor 16 and the second ECG sensor 17 are not limited to being disposed on the outer surface of the dial bottom 13; they may be disposed in any area that is in contact with the user's skin when the smart watch is worn, such as on the inner surface of the watch strap 12 (the surface that contacts the user's skin when the smart watch is worn). The two sensors may be disposed on the same face, for example both on the inner surface of the watch strap, or on different faces, for example the first ECG sensor on the inner surface of the watch strap and the second ECG sensor on the outer surface of the dial bottom 13. The first ECG sensor 16 and the second ECG sensor 17 may be deployed on the smart watch in a landscape arrangement, a portrait arrangement, or in another manner.
Referring to fig. 5, which is a schematic structural diagram of an ECG sensor arrangement of another smart watch 10, the smart watch 10 includes a third ECG sensor 18 disposed on the outer surface of the dial bottom 13, and the third ECG sensor 18 can be used for sign information detection.
The third ECG sensor 18 may acquire the sign information of the user wearing the terminal in real time, so that the user's gesture can be recognized in real time to improve recognition accuracy; alternatively, it may acquire the sign information periodically or only in certain specific scenes, which reduces power consumption and the consumption of CPU resources.
For example, if the smart watch 10 detects a click operation on the icon of the sign information detection application program, receives a voice instruction to run that application program, or determines by looking up the application process list that the application program is included in the list, the smart watch 10 determines that a running instruction for the sign information detection application program has been detected, performs sign information detection through the third ECG sensor 18, and filters the detected sign information.
Note that, when the ECG detection device 1260 includes only the third ECG sensor 18, the acquired sign information does not include reference information and may therefore contain considerable noise. To improve the accuracy of the acquired sign information, the processor 111 may perform filtering processing on the sign information acquired by the third ECG sensor 18.
The filtering process may include filtering manners such as low-pass filtering, normalized filtering, or Gaussian filtering, but is not limited to these.
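As a concrete illustration of the low-pass option mentioned above, the sketch below applies a first-order IIR (exponential moving average) low-pass filter to a single-electrode sample stream. The filter type and the smoothing factor are assumptions for illustration; the application does not specify filter parameters.

```python
# Minimal low-pass filtering sketch for the single-electrode case.
# alpha is an illustrative smoothing factor in (0, 1]; smaller means stronger smoothing.

def low_pass_filter(samples_uv, alpha=0.2):
    """First-order IIR low-pass filter attenuating high-frequency noise."""
    if not samples_uv:
        return []
    filtered = [samples_uv[0]]
    for x in samples_uv[1:]:
        filtered.append(alpha * x + (1.0 - alpha) * filtered[-1])
    return filtered
```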
The third ECG sensor 18 is likewise not limited to being disposed on the outer surface of the dial bottom 13; it may be disposed in any area that is in contact with the user's skin when the smart watch is worn, such as on the inner surface of the watch strap 12.
The shape of the ECG sensor may be rectangular, circular, triangular, or the like, which is not limited in the embodiments of the present invention.
Based on the terminal shown in fig. 1-3 and some embodiments shown in fig. 4-5, a method for recognizing an action according to an embodiment of the present invention is described below.
Referring to fig. 6, which is a schematic flow chart of a motion recognition method, the method shown in fig. 6 includes:
s101, theECG detection device 1260 obtains the physical sign information of the user.
TheECG detection device 1260 can obtain the sign information of the user wearing the terminal in real time, so that the gesture of the user can be recognized in real time to improve the recognition accuracy, and theECG detection device 1260 can obtain the sign information of the user wearing the terminal periodically or under a certain specific scene, so that the power consumption can be reduced, and the consumption of CPU resources can be reduced.
Under a certain specific scene, for a scene in which it is inconvenient for the user to trigger the terminal to perform a corresponding operation in a manner of touching a screen or a physical key, for example, when it is detected that the user is in a motion state (such as running), theECG detection device 1260 may obtain sign information of the user, so that the gesture of the user may be identified according to the sign information, and then a corresponding operation is performed according to the gesture, thereby more conveniently implementing human-computer interaction.
The scene of detecting the operation of starting the detection of the physical sign information may be a scene of detecting the operation of starting the detection of the physical sign information, and the scene of detecting the operation of starting the detection of the physical sign information may refer to: receiving an operation instruction aiming at the sign information detection application program, or detecting the operation of starting the function option of detecting the sign information.
As an alternative embodiment, the ECG detection device 1260 may include two ECG sensors for detecting the sign information; the deployment of the two sensors and the specific implementation of sign information detection may refer to the embodiment shown in fig. 4.
As another alternative embodiment, the ECG detection device 1260 may include a single ECG sensor for detecting the sign information; the deployment of this sensor and the specific implementation of sign information detection may refer to the embodiment shown in fig. 5.
S102, the processor 111 determines the gesture of the user according to the sign information of the user.
Because different gestures made by the hand wearing the terminal produce different waveform characteristics in the sign information detected by the target ECG sensor, the processor 111 may determine the gesture of the hand wearing the smart watch 10 according to the waveform characteristics of the sign information.
The specific implementation of determining the user's gesture from the sign information may be as follows. If the sign information includes a valid amplitude pair, the gesture of the user is determined to be fist making, where the valid amplitude pair consists of a valid positive amplitude and a valid negative amplitude, the time interval between their occurrences is smaller than a preset time interval threshold, the amplitude value of the valid positive amplitude is larger than a first preset amplitude threshold, and the absolute value of the amplitude value of the valid negative amplitude is larger than a second preset amplitude threshold.
If the sign information includes a positive amplitude whose amplitude value is greater than a third preset amplitude threshold and no negative amplitude whose absolute amplitude value is greater than the third preset amplitude threshold is detected within a preset time period, the gesture of the user is determined to be wrist flipping; likewise, if the sign information includes a negative amplitude whose absolute amplitude value is greater than the third preset amplitude threshold and no positive amplitude whose amplitude value is greater than the third preset amplitude threshold is detected within the preset time period, the gesture of the user is determined to be wrist flipping.
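The wrist-flip rule above amounts to requiring a single large excursion of one polarity inside the observation window, with no comparable excursion of the opposite polarity. A minimal sketch follows; the 2000 μV value mirrors the third preset amplitude threshold of the worked example later in the description, and the assumption that the samples passed in already cover the preset time period is an illustration.

```python
# Sketch of the wrist-flip rule: one polarity exceeds the third preset amplitude
# threshold within the preset time period, and the opposite polarity does not.

THIRD_THRESHOLD_UV = 2000  # third preset amplitude threshold (worked example value)

def is_wrist_flip(samples):
    """samples: (timestamp_ms, amplitude_uV) pairs within the preset time period."""
    has_large_positive = any(a > THIRD_THRESHOLD_UV for _, a in samples)
    has_large_negative = any(a < -THIRD_THRESHOLD_UV for _, a in samples)
    # Exactly one polarity exceeding the threshold indicates a wrist flip.
    return has_large_positive != has_large_negative
```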
The processor 111 may determine the gesture according to the sign information, and may determine the number of times of the gesture according to the sign information.
Specifically, the processor 111 may obtain a waveform characteristic of the sign information within a preset time period, and determine a gesture of the hand wearing the terminal and a number of times of the gesture according to the waveform characteristic.
In one embodiment, if the waveform characteristic includes a valid amplitude pair, the processor 111 may determine that the gesture is fist making, count the number of occurrences of valid amplitude pairs, and determine that count as the number of fist-making gestures.
Wherein the effective amplitude pair comprises an effective positive amplitude and an effective negative amplitude, and a time interval between the effective positive amplitude and the effective negative amplitude is smaller than a preset time interval threshold, an amplitude value of the effective positive amplitude is larger than a first preset amplitude threshold, and an absolute value of the amplitude value of the effective negative amplitude is larger than a second preset amplitude threshold.
The first preset amplitude threshold and the second preset amplitude threshold may be set by the user or by the device manufacturer of the terminal. In addition, the first preset amplitude threshold and the second preset amplitude threshold may be the same or different.
A fist-making gesture consists of a fist-clenching action and a fist-releasing action: fig. 7 shows the hand wearing the smart watch 10 clenching a fist, and fig. 8 shows the hand wearing the smart watch 10 releasing the fist. When the processor 111 detects one valid amplitude pair, it determines that the hand wearing the terminal has performed one fist-clenching action and one fist-releasing action, that is, one fist-making gesture.
If the fist-making gesture is determined to occur at least twice and the time interval between adjacent fist-making gestures is smaller than a first time interval threshold, the processor 111 may treat the adjacent fist-making gestures as belonging to the same gesture instance; conversely, if the time interval between adjacent fist-making gestures is greater than or equal to the first time interval threshold, the processor 111 treats them as gestures occurring at different times.
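A short sketch of this counting-and-grouping step is given below: each detected valid amplitude pair contributes one fist, and fists closer together than the first time interval threshold are merged into one gesture instance. The timestamps and the threshold value are illustrative assumptions.

```python
# Sketch of grouping fist-making gestures by the first time interval threshold.

FIRST_TIME_INTERVAL_THRESHOLD_MS = 1500  # illustrative value

def group_fist_gestures(fist_timestamps_ms):
    """Return one entry per gesture instance, giving the number of fists in it."""
    if not fist_timestamps_ms:
        return []
    instances = [1]
    for prev, cur in zip(fist_timestamps_ms, fist_timestamps_ms[1:]):
        if cur - prev < FIRST_TIME_INTERVAL_THRESHOLD_MS:
            instances[-1] += 1   # close together: same gesture (e.g. a double fist)
        else:
            instances.append(1)  # far apart: a new gesture instance
    return instances

# group_fist_gestures([1000, 1900, 6000]) -> [2, 1]: a two-fist gesture, then a single fist.
```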
For example, the processor 111 may obtain a waveform of the sign information whose characteristics are shown in Table 1, where the columns of Table 1 represent time (in ms) and the rows represent amplitude values (in μV) of the sign information. The processor 111 may obtain the waveform characteristic within a preset time period, for example 221369 ms to 221600 ms, and may determine that the waveform characteristic within this period includes a positive amplitude with an amplitude value of 820 μV and a negative amplitude with an amplitude value of -1000 μV.
If the first preset amplitude threshold is set to 600 μV and the second preset amplitude threshold is set to 800 μV, the processor 111 may determine the positive amplitude of 820 μV as a valid positive amplitude, because 820 μV is greater than the first preset amplitude threshold; and because the absolute value of -1000 μV is greater than the second preset amplitude threshold, the processor 111 may determine the negative amplitude of -1000 μV as a valid negative amplitude, that is, determine that the waveform characteristic includes a valid amplitude pair. The processor 111 may therefore determine that the gesture is fist making, count the number of occurrences of valid amplitude pairs, which is 1, and determine that the number of fist-making gestures is 1.
As another example, the processor 111 may obtain a waveform of the sign information whose characteristics are shown in Table 2, where the columns of Table 2 represent time (in ms) and the rows represent amplitude values (in μV). The processor 111 may obtain the waveform characteristic within a preset time period, for example 505100 ms to 505330 ms, and may determine that this waveform characteristic includes positive amplitudes with amplitude values of 1900 μV and 1500 μV and negative amplitudes with amplitude values of -200 μV and -300 μV. Suppose the first preset amplitude threshold is set to 1000 μV and the second preset amplitude threshold is set to 100 μV.
Table 1: sampled sign information waveform (columns: time in ms; rows: amplitude in μV), reproduced as an image in the original publication.
Since the amplitude values 1900 μV and 1500 μV are greater than the first preset amplitude threshold, the processor 111 may determine the positive amplitudes of 1900 μV and 1500 μV as valid positive amplitudes; since the absolute values of the amplitude values -200 μV and -300 μV are greater than the second preset amplitude threshold, the processor 111 may determine the negative amplitudes of -200 μV and -300 μV as valid negative amplitudes. Because the valid positive amplitudes alternate with the valid negative amplitudes, the processor 111 may determine that the waveform characteristic includes valid amplitude pairs, determine that the gesture is fist making, count the number of occurrences of valid amplitude pairs, which is 2, and determine that the number of fist-making gestures is 2.
Table 2: sampled sign information waveform (columns: time in ms; rows: amplitude in μV), reproduced as an image in the original publication.
In an embodiment, the gesture of the hand wearing the smart watch 10 is a wrist-flipping gesture as shown in fig. 9. If the waveform characteristic includes a positive amplitude whose amplitude value is greater than the third preset amplitude threshold, the processor 111 may determine that the gesture is wrist flipping, count the number of occurrences of positive amplitudes greater than the third preset amplitude threshold, and determine that count as the number of wrist flips.
If the wrist flip is determined to occur at least twice and the time interval between adjacent wrist-flipping gestures is smaller than a second time interval threshold, the processor 111 may treat the adjacent wrist flips as belonging to the same gesture instance; conversely, if the time interval between adjacent wrist-flipping gestures is greater than or equal to the second time interval threshold, the terminal treats them as gestures occurring at different times.
For example, the processor 111 may obtain a waveform of the sign information whose characteristics are shown in Table 3, where the columns of Table 3 represent time (in ms) and the rows represent amplitude values (in μV). The processor 111 may obtain the waveform characteristic within a preset time period, for example 291200 ms to 291500 ms, and may determine that this waveform characteristic includes a positive amplitude with an amplitude value of 3500 μV. If the third preset amplitude threshold is set to 2000 μV, then because 3500 μV is greater than the third preset amplitude threshold, the processor 111 may determine that the waveform characteristic includes a positive amplitude greater than the third preset amplitude threshold, determine that the gesture is wrist flipping, count the number of occurrences of such positive amplitudes, which is 1, and determine that the number of wrist flips is 1.
Table 3: sampled sign information waveform (columns: time in ms; rows: amplitude in μV), reproduced as an image in the original publication.
The processor 111 may execute a control instruction according to the gesture information. For example, the processor 111 may determine the control instruction mapped to the gesture information according to a preset mapping relationship between gesture information and control instructions, and execute that control instruction, so that human-computer interaction is realized based on gesture information and becomes more convenient.
Wherein the control instruction comprises one or more of a determining instruction, a canceling instruction, a returning instruction, a page turning instruction or a next instruction.
The preset mapping relationship between gesture information and control instructions may be set by the user or by the device manufacturer of the terminal. For example, the user may map one fist-making gesture to a confirm instruction, an enter instruction, or a select-next instruction; map two fist-making gestures to a cancel instruction or a return instruction; map one wrist flip to a page-turning instruction; and so on.
The processor 111 may also set the mapping relationship between gesture information and control instructions per application. For example, for an e-reader application, the processor 111 may map a single fist-making gesture to an enter instruction, so that when the processor 111 determines that the gesture is one fist making, it executes the enter instruction, that is, starts the e-reader application; it may map a double fist-making gesture to a return instruction, so that when the gesture is two fist makings, the processor 111 executes the return instruction, for example returning to the main interface of the e-reader application; or it may map one wrist flip to a page-turning instruction, so that when the processor 111 determines that the gesture is one wrist flip, it executes the page-turning instruction and the e-reader application loads the next page.
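The per-application dispatch described above can be sketched as a small lookup table keyed by gesture type and count. The mapping entries follow the e-reader example; the instruction names and the execute_instruction callback are hypothetical illustrations rather than an API defined by the application.

```python
# Sketch of mapping (gesture, count) to a control instruction and executing it.

GESTURE_TO_INSTRUCTION = {
    ("fist", 1): "enter",           # start the e-reader application
    ("fist", 2): "return",          # return to the main interface
    ("wrist_flip", 1): "page_turn", # load the next page
}

def dispatch(gesture, count, execute_instruction):
    """Look up the mapped instruction, execute it if one exists, and return it."""
    instruction = GESTURE_TO_INSTRUCTION.get((gesture, count))
    if instruction is not None:
        execute_instruction(instruction)
    return instruction

# Example: dispatch("wrist_flip", 1, print) prints and returns "page_turn".
```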
In one embodiment, the communication module 116 may send the control instruction to an external device connected to the smart watch 10 and instruct the external device to execute the control instruction; alternatively, it may send the gesture information to the external device connected to the terminal and instruct the external device to parse the gesture information, obtain the control instruction, and execute it.
For example, the communication module 116 may send a control instruction (such as a return-to-home-screen instruction) to a smartphone connected to the terminal and instruct the smartphone to execute it.
For another example, assuming that the gesture included in the gesture information is a wrist flip and the number of wrist flips is 1, the communication module 116 may send the gesture information to a set-top box connected to the terminal, instruct the set-top box to parse the gesture information, obtain the control instruction, and execute it.
The smart watch 10 may establish a connection with the external device through communication methods such as Bluetooth, Wireless Fidelity (WiFi), infrared, or Near Field Communication (NFC).
The external device may include a set-top box, a game console, a television, a smart phone, a tablet computer, but is not limited to the above devices.
In the present application, the terminal acquires sign information through the ECG detection device and recognizes the user's gesture from that information, so human-computer interaction can be carried out according to the gesture. Because the gesture is made by the hand wearing the terminal, the interaction can be completed with one hand, which makes it more convenient. The solution can be implemented by any terminal already provided with an ECG detection device for detecting electrocardiogram information, so no additional hardware is required and the cost of the solution is low.
Based on the description of the motion recognition method embodiments above, an embodiment of the present application provides a schematic structural diagram of a motion recognition device. Referring to fig. 10, the motion recognition device shown in fig. 10 can be used in a terminal and includes an ECG detection module 901 and a determination module 902, where:
the ECG detection module 901 is configured to obtain sign information of a user;
the determination module 902 is configured to determine the gesture of the user according to the sign information of the user.
The ECG detection module 901 may include a first ECG sensor and a second ECG sensor; when the sign information of the user is detected, the first ECG sensor and the second ECG sensor are in contact with the skin of the user, the first ECG sensor is configured as a signal positive pole, and the second ECG sensor is configured as a signal negative pole or a signal ground; the first ECG sensor and the second ECG sensor are used for acquiring the sign information of the user wearing the terminal.
Alternatively, the ECG detection module 901 may include a third ECG sensor configured as a signal positive pole or a signal negative pole; the third ECG sensor is used for acquiring the sign information of the user, and the detected sign information is filtered.
Optionally, the ECG detection module 901 is specifically configured to acquire the sign information if an operation of starting sign information detection is detected.
Optionally, the determination module 902 is specifically configured to determine that the gesture of the user is fist making if the sign information includes a valid amplitude pair, where the valid amplitude pair includes a valid positive amplitude and a valid negative amplitude, the time interval between their occurrences is smaller than a preset time interval threshold, the amplitude value of the valid positive amplitude is greater than a first preset amplitude threshold, and the absolute value of the amplitude value of the valid negative amplitude is greater than a second preset amplitude threshold; and to determine that the gesture of the user is wrist flipping if the sign information includes an amplitude whose amplitude value is greater than a third preset amplitude threshold.
In the embodiment of the present invention, the motion recognition apparatus has a function of implementing corresponding steps executed by the terminal in the motion recognition method in the embodiment corresponding to fig. 6. The function can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules corresponding to the functions described above. The modules may be software and/or hardware.
Based on the same inventive concept, the principle and beneficial effects of the motion recognition device in solving the problem are the same as those of the motion recognition method described with reference to fig. 6; therefore, for the implementation of the motion recognition device, reference may be made to the implementation of the method in fig. 6, and repeated details are not described again.
The present invention also provides a computer-readable storage medium, on which a computer program is stored, and the implementation and beneficial effects of the program for solving the problems can refer to the implementation and beneficial effects of the motion recognition method in fig. 6, and repeated details are not repeated.
The embodiment of the present invention further provides a computer program product, where the computer program product includes a non-volatile computer-readable storage medium storing a computer program, and when the computer program is executed, the computer executes the steps of the motion recognition method in the embodiment corresponding to fig. 6, and the implementation and beneficial effects of the computer program product for solving the problem may refer to the implementation and beneficial effects of the motion recognition method in fig. 6, and repeated details are not repeated.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above.

Claims (13)

  1. A motion recognition method applied to a terminal, wherein the terminal comprises an Electrocardiogram (ECG) detection device, and the method is characterized by comprising the following steps:
    acquiring sign information of a user through the ECG detection device;
    and determining the gesture of the user according to the sign information of the user.
  2. The method of claim 1,
    the ECG detecting device comprises a first ECG sensor and a second ECG sensor, and when the sign information of the user is detected, the first ECG sensor and the second ECG sensor are in contact with the skin of the user;
    the first ECG sensor is configured as a signal positive pole, the second ECG sensor is configured as a signal negative pole or a signal ground;
    the acquiring, by the ECG detection apparatus, sign information of a user includes:
    acquiring sign information of a user through the first ECG sensor and the second ECG sensor.
  3. The method of claim 1,
    the ECG detection device comprises a third ECG sensor;
    the third ECG sensor is configured as a signal positive or a signal negative;
    the acquiring, by the ECG detection apparatus, sign information of a user includes:
    detecting, by the third ECG sensor, vital sign information of a user;
    and filtering the detected sign information.
  4. The method according to any one of claims 1-3, further comprising:
    and if the operation of starting the detection of the physical sign information is detected, executing the step of acquiring the physical sign information of the user through the ECG detection device.
  5. The method according to claim 4, wherein the recognizing the gesture of the user according to the sign information of the user comprises:
    if the sign information of the user comprises an effective amplitude pair, determining that the gesture of the user is a fist making, wherein the effective amplitude pair comprises an effective positive amplitude and an effective negative amplitude, the time interval between the effective positive amplitude and the effective negative amplitude is smaller than a preset time interval threshold, the amplitude value of the effective positive amplitude is larger than a first preset amplitude threshold, and the absolute value of the amplitude value of the effective negative amplitude is larger than a second preset amplitude threshold;
    and if the sign information of the user comprises the amplitude of which the amplitude value is larger than a third preset amplitude value, determining that the gesture of the user is wrist turning.
  6. An action recognition device, comprising:
    the ECG detection module is used for acquiring sign information of a user;
    and the determining module is used for determining the gesture of the user according to the sign information of the user.
  7. The apparatus of claim 6,
    the ECG detection module comprises a first ECG sensor and a second ECG sensor, the first ECG sensor and the second ECG sensor are in contact with the skin of the user when detecting the characteristic information of the user;
    the first ECG sensor is configured as a signal positive pole, the second ECG sensor is configured as a signal negative pole or a signal ground;
    the first ECG sensor and the second ECG sensor are used for detecting sign information of a user.
  8. The apparatus of claim 6,
    the ECG detection module includes a third ECG sensor;
    the third ECG sensor is configured as a signal positive or a signal negative;
    and the third ECG sensor is used for detecting the physical sign information of the user and filtering the detected physical sign information.
  9. The apparatus according to any one of claims 6 to 8,
    the ECG detection module is specifically configured to execute the step of acquiring the sign information of the user through the ECG detection device if it is detected that the operation of detecting the sign information is started.
  10. The apparatus of claim 9,
    the determining module is specifically configured to determine that the gesture of the user is a fist if the sign information of the user includes an effective amplitude pair, where the effective amplitude pair includes an effective positive amplitude and an effective negative amplitude, a time interval between occurrence of the effective positive amplitude and the effective negative amplitude is smaller than a preset time interval threshold, an amplitude value of the effective positive amplitude is greater than a first preset amplitude threshold, and an absolute value of an amplitude value of the effective negative amplitude is greater than a second preset amplitude threshold; and if the sign information of the user comprises the amplitude of which the amplitude value is larger than a third preset amplitude value, determining that the gesture of the user is wrist turning.
  11. A terminal, characterized in that the terminal comprises at least one processor, a memory, and instructions stored on the memory and executable by the at least one processor, the at least one processor executing the instructions to implement the steps of the action recognition method according to any one of claims 1 to 5.
  12. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the steps of the action recognition method according to any one of claims 1 to 5.
  13. A computer program product, characterized in that the computer program product comprises a non-transitory computer-readable storage medium storing a computer program which, when executed, causes a computer to implement the steps of the action recognition method of any one of claims 1 to 5.
CN201880059049.0A | Priority date: 2017-10-09 | Filing date: 2018-03-15 | Motion recognition method, device and terminal | Pending | CN111052049A (en)

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
CN201710938054 | 2017-10-09 | |
CN2017109380546 | 2017-10-09 | |
PCT/CN2018/079140 (WO2019071913A1) | 2017-10-09 | 2018-03-15 | Motion recognition method and device, and terminal

Publications (1)

Publication Number | Publication Date
CN111052049A (en) | 2020-04-21

Family

ID=66100364

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status
CN201880059049.0A | Motion recognition method, device and terminal | 2017-10-09 | 2018-03-15 | Pending (CN111052049A (en))

Country Status (2)

Country | Link
CN (1) | CN111052049A (en)
WO (1) | WO2019071913A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101449971A (en)* | 2008-12-30 | 2009-06-10 | 南京大学 | Portable cardiac diagnosis monitoring device based on rhythm mode
CN105373719A (en)* | 2014-09-01 | 2016-03-02 | 三星电子株式会社 | User authentication method and apparatus based on electrocardiogram (ECG) signal
CN105935289A (en)* | 2015-03-06 | 2016-09-14 | 三星电子株式会社 | Wearable electronic device and method for controlling the same
CN106104408A (en)* | 2013-11-29 | 2016-11-09 | 行动股份有限公司 | Wearable computing device
CN106108892A (en)* | 2016-08-04 | 2016-11-16 | 戴琨 | A kind of Electrocardiographic processing means and processing method
CN106527716A (en)* | 2016-11-09 | 2017-03-22 | 努比亚技术有限公司 | Wearable equipment based on electromyographic signals and interactive method between wearable equipment and terminal
CN106774818A (en)* | 2015-11-20 | 2017-05-31 | 三星电子株式会社 | Posture identification method, gesture recognition device and wearable device
CN106774842A (en)* | 2016-11-24 | 2017-05-31 | 中国科学技术大学 | Driving-situation assistant's gesture intersection control routine

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103391594B (en)* | 2012-05-09 | 2016-08-03 | 电信科学技术研究院 | A kind of method and device of wireless body area network routing optimality
US9282893B2 (en)* | 2012-09-11 | 2016-03-15 | L.I.F.E. Corporation S.A. | Wearable communication platform
CN103645804A (en)* | 2013-12-18 | 2014-03-19 | 三星电子(中国)研发中心 | Method and device for identifying human body gestures as well as watch using device
CN103676604B (en)* | 2013-12-24 | 2017-02-15 | 华勤通讯技术有限公司 | Watch and running method thereof
CN106249851B (en)* | 2015-09-15 | 2020-03-17 | 北京智谷睿拓技术服务有限公司 | Input information determination method and device
CN106249853B (en)* | 2015-09-15 | 2019-03-19 | 北京智谷睿拓技术服务有限公司 | Exchange method and equipment


Also Published As

Publication number | Publication date
WO2019071913A1 (en) | 2019-04-18

Similar Documents

Publication | Title
KR102348486B1 (en) | Electronic device
EP3242186B1 (en) | Electronic device comprising rotating body
US10949012B2 (en) | Electronic device comprising force sensor
CN110764673B (en) | Scrolling screenshot method and electronic device
CN107257949B (en) | Touch processing method and electronic device supporting the same
US10256658B2 (en) | Operating method of an electronic device and electronic device supporting the same
US11112889B2 (en) | Electronic device and method for mapping function of electronic device to operation of stylus pen
US11328469B2 (en) | Electronic device and method for providing drawing environment
CN107918760A (en) | Electronic device and its control method with multiple fingerprint sensing patterns
CN110020622A (en) | Fingerprint identification method and related product
US11650674B2 (en) | Electronic device and method for mapping function to button input
US10474274B2 (en) | Electronic device and controlling method thereof
EP3396508A1 (en) | Method of applying graphic effect and electronic device performing same
CN112703534A (en) | Image processing method and related product
US11175821B2 (en) | Pressure touch method and terminal
US11341219B2 (en) | Apparatus for unlocking electronic device by using stylus pen and method thereof
EP3392749B1 (en) | Content display method and electronic device for performing same
CN111052049A (en) | Motion recognition method, device and terminal
CN110059203B (en) | Data storage method, wearable device and computer readable storage medium
CN207780653U (en) | A kind of mobile terminal shell and mobile terminal
CN115617192B (en) | A touch positioning method and electronic device
US20230224401A1 (en) | Electronic device including expandable display and operation method thereof
CN108415567B (en) | Brain training methods and related products
CN108399326A (en) | Electronic device, display control method and related product

Legal Events

Code | Title | Description
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 2020-04-21
