Disclosure of Invention
In view of this, it is necessary to provide a non-contact gesture control method and an intelligent terminal, aiming to improve the recognition rate of non-contact gesture control and to reduce misoperation of the intelligent terminal caused by non-contact gestures.
The embodiment of the invention discloses a non-contact gesture control method, which comprises the following steps:
the intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal and obtains a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal;
extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
and executing the operation event matched with the track characteristic value according to the search result.
Preferably, the method further comprises the step of:
and setting a mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the step of setting the mapping relationship between the track characteristic value and the corresponding operation event includes:
acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track;
extracting a track characteristic value corresponding to the gesture track;
configuring an operation event to be executed correspondingly for the extracted track characteristic value;
and storing the mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the step of detecting the gesture signal by the intelligent terminal through a sensor arranged on the intelligent terminal includes:
the intelligent terminal is provided with an infrared sensor, and acquires a corresponding gesture signal according to an infrared signal which reflects gesture track change and is detected by the infrared sensor;
or,
the intelligent terminal is provided with a light sensor, and the corresponding gesture signal is obtained according to a light intensity signal which reflects light intensity change and is detected by the light sensor;
or,
an ultrasonic sensor is arranged on the intelligent terminal, and the corresponding gesture signal is obtained according to an ultrasonic signal which reflects the change of the amplitude of the sound wave and is detected by the ultrasonic sensor;
or,
and a capacitive sensor is arranged on the intelligent terminal, and the corresponding gesture signal is acquired according to an electric signal which reflects the gesture track change and is detected by the capacitive sensor.
Preferably, the intelligent terminal receives an operation request triggered by a user, and updates the set mapping relationship between the track characteristic value and the corresponding operation event.
The embodiment of the invention also discloses an intelligent terminal, which comprises:
the detection module is used for detecting a gesture signal through a sensor arranged on the intelligent terminal and acquiring a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal;
the acquisition module is used for extracting a track characteristic value corresponding to the gesture track, and searching an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
and the execution module is used for executing the operation event matched with the track characteristic value according to the search result.
Preferably, the intelligent terminal further comprises:
and the setting module is used for setting the mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the setting module is further configured to:
acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track;
extracting a track characteristic value corresponding to the gesture track;
configuring an operation event to be executed correspondingly for the extracted track characteristic value;
and storing the mapping relation between the track characteristic value and the corresponding operation event.
Preferably, the detection module is further configured to:
detecting an infrared signal reflecting the change of the gesture track by using an infrared sensor arranged on the intelligent terminal, and acquiring a corresponding gesture signal;
or,
detecting a light intensity signal reflecting light intensity change by using a light sensor arranged on the intelligent terminal to obtain a corresponding gesture signal;
or,
detecting an ultrasonic signal reflecting the amplitude change of the sound wave by using an ultrasonic sensor arranged on the intelligent terminal, and acquiring a corresponding gesture signal;
or,
and detecting an electric signal reflecting capacitance change by using a capacitance sensor arranged on the intelligent terminal to obtain the corresponding gesture signal.
Preferably, the setting module is further configured to:
and receiving an operation request triggered by a user, and updating the set mapping relation between the track characteristic value and the corresponding operation event.
The intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal; extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event; executing an operation event matched with the track characteristic value according to the search result; compared with the operation mode of carrying out non-contact gesture control based on image recognition in the prior art, the embodiment of the invention improves the recognition rate of the non-contact gesture control and reduces the misoperation of the intelligent terminal based on the non-contact gesture.
Detailed Description
The technical solution of the invention is further described below with reference to the drawings and the specific embodiments of the specification. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
At least one intelligent terminal is described in the invention, and the intelligent terminal can be any intelligent terminal such as a smart phone or a tablet computer. Those skilled in the art can understand that the embodiment of the present invention does not limit the specific form of the intelligent terminal, as long as the idea of the non-contact gesture control method of the present invention can be implemented. This description of the intelligent terminal applies throughout and is not repeated in the specific embodiments of the corresponding technical solutions described below.
Referring to fig. 1, fig. 1 is a schematic flow chart of a non-contact gesture control method according to a first embodiment of the present invention; as shown in FIG. 1, the non-contact gesture control method of the present invention comprises the following steps:
step S11, the intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal;
in the embodiment of the invention, the intelligent terminal is provided with a corresponding sensor; the specific number of the sensors is set according to needs, and the specific configuration number of the sensors is not limited in the embodiment of the invention. The sensor arranged on the intelligent terminal is used for detecting a gesture signal generated by the gesture change of the user; the gesture signal is generated correspondingly to a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal.
After detecting a gesture signal generated by a non-contact gesture action of a user, the intelligent terminal acquires a gesture track corresponding to the gesture signal.
In a preferred embodiment of the present invention, the manners in which the intelligent terminal detects, through the sensor arranged on the intelligent terminal, the gesture signal corresponding to the gesture action triggered by the user include, but are not limited to, the following:
the intelligent terminal is provided with an infrared sensor, and the corresponding gesture signal is obtained according to the infrared signal which reflects the gesture track change and is detected by the infrared sensor. Or, a light sensor is arranged on the intelligent terminal, and the corresponding gesture signal is acquired according to a light intensity signal which reflects light intensity change and is detected by the light sensor. Or, an ultrasonic sensor is arranged on the intelligent terminal, and the corresponding gesture signal is obtained according to the ultrasonic signal which is detected by the ultrasonic sensor and reflects the amplitude change of the sound wave. Or, a capacitance sensor is arranged on the intelligent terminal, and the corresponding gesture signal is obtained according to the electric signal which reflects capacitance change and is detected by the capacitance sensor.
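Purely by way of illustration, and not as part of the claimed method, the following sketch shows one simple way in which a stream of light-intensity samples could be turned into a gesture signal: the samples taken while the intensity dips below an ambient threshold are treated as the signal produced by a hand passing over the light sensor. The sample format, the threshold value and the sensor behaviour are assumptions made for this example only.

```python
# Illustrative sketch only: thresholding light-intensity samples into a gesture
# signal. The threshold and the (timestamp, intensity) sample format are
# assumptions made for this example, not part of the specification.
from typing import List, Tuple

AMBIENT_THRESHOLD = 0.6  # assumed fraction of the baseline intensity

def segment_gesture_signal(samples: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Keep the (timestamp, intensity) samples taken while a hand covers the sensor."""
    return [(t, v) for t, v in samples if v < AMBIENT_THRESHOLD]

# A hand sweeping over the sensor produces a short dip in intensity.
raw = [(0.00, 1.0), (0.05, 0.9), (0.10, 0.4), (0.15, 0.3), (0.20, 0.8), (0.25, 1.0)]
print(segment_gesture_signal(raw))  # [(0.1, 0.4), (0.15, 0.3)]
```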
Those skilled in the art can understand that the embodiment of the present invention does not limit which kind of sensor arranged on the intelligent terminal is specifically adopted to detect the gesture signal corresponding to the non-contact gesture action triggered by the user.
After the intelligent terminal detects a gesture signal corresponding to the gesture action of the user, a gesture track corresponding to the gesture signal is obtained according to the gesture signal. The gesture track acquired by the intelligent terminal can be understood as the operation track to which the gesture action triggered by the user is mapped.
Step S12, extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event;
After the intelligent terminal acquires a gesture track corresponding to a gesture signal, a track characteristic value corresponding to the gesture track is extracted according to key points on the gesture track; the track characteristic value corresponding to the gesture track can uniquely determine the specific form of the gesture track. One manner in which the intelligent terminal extracts the track characteristic value corresponding to the gesture track may be: obtaining the track key points that uniquely determine the gesture track, extracting the track key points and obtaining their coordinates; and taking the coordinates of the key points as input parameters, and computing the track characteristic value corresponding to the gesture track according to a preset algorithm.
After the track characteristic value corresponding to the gesture track is obtained, the intelligent terminal searches for the operation event matched with the track characteristic value according to the preset mapping relation between the track characteristic value and the operation event.
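The specification does not fix a particular "preset algorithm". Purely as an illustration, the sketch below computes a track characteristic value by quantizing the direction between consecutive key points into eight sectors (a chain code) and then looks the value up in a mapping table. The chain-code scheme, the table contents and the event names are assumptions for this example, not the patented algorithm.

```python
# Illustrative sketch only: one possible "preset algorithm" that turns key-point
# coordinates into a track characteristic value, plus the table lookup.
import math
from typing import List, Optional, Tuple

def track_characteristic_value(key_points: List[Tuple[float, float]]) -> str:
    """Encode a gesture track as a chain of quantized directions (8 sectors)."""
    code = []
    for (x0, y0), (x1, y1) in zip(key_points, key_points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0)
        sector = int(round(angle / (math.pi / 4))) % 8  # 0..7
        code.append(str(sector))
    return "".join(code)

# Preset mapping relation between characteristic values and operation events
# (the entries and event names are hypothetical examples).
MAPPING_TABLE = {
    "00": "light_screen",   # left-to-right sweep
    "44": "mute_ringtone",  # right-to-left sweep
}

def find_operation_event(value: str) -> Optional[str]:
    return MAPPING_TABLE.get(value)  # None if no match is found

swipe_right = [(0, 0), (1, 0), (2, 0)]
print(find_operation_event(track_characteristic_value(swipe_right)))  # light_screen
```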
And step S13, executing the operation event matched with the track characteristic value according to the search result.
The intelligent terminal determines an operation event matched with the track characteristic value according to the search result, namely the operation event which needs to be executed by the intelligent terminal; and after the operation event is determined, the intelligent terminal executes the operation event matched with the track characteristic value.
For example, if the operation event matched with the trajectory feature value is to light a screen, the intelligent terminal executes an operation of lighting the screen.
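As a further illustration of this execution step (the handler names below are hypothetical examples, not taken from the specification), the matched operation event can be dispatched to a concrete handler, and a gesture whose characteristic value matches nothing is simply ignored:

```python
# Illustrative sketch only: executing the operation event found for a track
# characteristic value. Handler names are hypothetical examples.
def light_screen() -> None:
    print("screen lit")

EVENT_HANDLERS = {"light_screen": light_screen}

def execute_operation_event(event) -> None:
    handler = EVENT_HANDLERS.get(event)
    if handler is not None:
        handler()  # execute the matched operation event
    # otherwise no operation event matches and the gesture is ignored

execute_operation_event("light_screen")  # prints "screen lit"
```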
The intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal; extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event; executing an operation event matched with the track characteristic value according to the search result; compared with the operation mode of carrying out non-contact gesture control based on image recognition in the prior art, the embodiment of the invention improves the recognition rate of the non-contact gesture control and reduces the misoperation of the intelligent terminal based on the non-contact gesture.
Based on the description of the embodiment shown in fig. 1, please refer to fig. 2; fig. 2 is a schematic flow chart of a non-contact gesture control method according to a second embodiment of the present invention. The difference between the embodiment of the present invention and the embodiment shown in fig. 1 is that, before the intelligent terminal executes the non-contact gesture control operation, the mapping relationship between the track characteristic value of the gesture track corresponding to a gesture action triggered by the user and the corresponding operation event needs to be preset.
The setting action only needs to be executed when the intelligent terminal is used for the first time and, other factors aside, only needs to be executed once; after the setting is completed, the set mapping relationship between the track characteristic value and the operation event can be updated according to specific requirements.
As shown in fig. 2, before step S11 of the embodiment of fig. 1, in which the intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal and obtains a gesture track corresponding to the gesture signal, the non-contact gesture control method further includes the step of:
and step S10, setting the mapping relation between the track characteristic value and the corresponding operation event.
In the embodiment of the invention, the intelligent terminal configures and stores the mapping relation between the track characteristic value corresponding to the gesture track and the operation event based on the setting operation triggered by the user. In a preferred embodiment of the present invention, different track characteristic values and their corresponding operation events may be set in a table form, that is, the mapping relationship between the track characteristic values and the operation events is stored in a mapping relationship table form. When the updating condition of the mapping relation table is triggered, the intelligent terminal responds to a corresponding updating instruction to update the mapping relation table; for example, the intelligent terminal receives an operation request triggered by a user, and updates the set mapping relationship between the track characteristic value and the corresponding operation event.
The manner in which the intelligent terminal updates the mapping relationship between the track characteristic value of the gesture track and the corresponding operation event includes, but is not limited to, the following: adding a mapping relation between the track characteristic value of a gesture track and the corresponding operation event; or deleting a stored mapping relation between the track characteristic value of a gesture track and the corresponding operation event; or changing a stored mapping relation between the track characteristic value of a gesture track and the corresponding operation event; and the like. The embodiment of the present invention does not specifically limit the content of the update performed by the intelligent terminal on the mapping relationship between the track characteristic value and the corresponding operation event.
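For illustration only, the three kinds of update named above can be sketched as operations on an in-memory mapping table; the table contents are hypothetical, and the persistence step a real terminal would perform is omitted here.

```python
# Illustrative sketch only: adding, deleting and changing entries of the
# mapping relation table. Characteristic values and event names are hypothetical.
mapping_table = {"00": "light_screen"}

def add_mapping(value: str, event: str) -> None:
    mapping_table[value] = event

def delete_mapping(value: str) -> None:
    mapping_table.pop(value, None)

def change_mapping(value: str, new_event: str) -> None:
    if value in mapping_table:
        mapping_table[value] = new_event

add_mapping("44", "mute_ringtone")
change_mapping("00", "unlock_screen")
delete_mapping("44")
print(mapping_table)  # {'00': 'unlock_screen'}
```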
According to the embodiment of the invention, by setting the mapping relation between the track characteristic value of the gesture track and the corresponding operation event, the intelligent terminal improves the efficiency of recognizing the operation event corresponding to the non-contact gesture, thereby improving the efficiency with which the intelligent terminal executes non-contact gesture control.
Referring to fig. 3 based on the description of the embodiments of fig. 1 and fig. 2, fig. 3 is a schematic flow chart of an embodiment in which the intelligent terminal sets the mapping relationship between a track characteristic value and a corresponding operation event in the non-contact gesture control method of the present invention. The embodiment of the present invention describes only one specific manner of setting the mapping relationship between the track characteristic value of the gesture track and the corresponding operation event; for other descriptions related to the non-contact gesture control method of the present invention, please refer to the descriptions of the corresponding embodiments, which are not repeated here.
As shown in fig. 3, in the non-contact gesture control method according to the embodiment of the present invention, the mapping relationship between the trajectory feature value and the corresponding operation event may be set in the following manner:
Step S21, acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track;
In the embodiment of the invention, the intelligent terminal acquires the gesture signal generated by the user-defined gesture action by using the arranged sensor, for example a light sensor, an ultrasonic sensor, an infrared sensor, a capacitive sensor or another sensor, and acquires the corresponding gesture track according to the acquired gesture signal.
Step S22, extracting a track characteristic value corresponding to the gesture track;
After the intelligent terminal acquires a gesture track corresponding to a gesture signal, a track characteristic value corresponding to the gesture track is extracted according to key points on the gesture track; the track characteristic value corresponding to the gesture track can uniquely determine the specific form of the gesture track. One manner in which the intelligent terminal extracts the track characteristic value corresponding to the gesture track may be: obtaining the track key points that uniquely determine the gesture track, extracting the track key points and obtaining their coordinates; and taking the coordinates of the key points as input parameters, and computing the track characteristic value corresponding to the gesture track according to a preset algorithm.
Step S23, configuring the operation event to be executed correspondingly for the extracted track characteristic value;
and step S24, storing the mapping relation between the track characteristic value and the corresponding operation event.
Based on an operation instruction triggered by the user, the intelligent terminal configures the operation event to be executed correspondingly for the extracted track characteristic value, generates the mapping relation between the track characteristic value and the corresponding operation event, and stores the mapping relation.
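The flow of steps S21 to S24 can be sketched, again purely as an illustration, as one registration routine. The characteristic-value function here is a trivial placeholder (rounded key-point coordinates), and the in-memory table stands in for whatever preset algorithm and storage the terminal actually uses; both are assumptions for this example.

```python
# Illustrative sketch only: the setting flow of steps S21-S24. The feature
# function is a placeholder; any preset algorithm could be substituted,
# and persistence of the table is omitted.
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]

def characteristic_value(key_points: List[Point]) -> Tuple[Point, ...]:
    return tuple((round(x, 1), round(y, 1)) for x, y in key_points)

def register_custom_gesture(capture_track: Callable[[], List[Point]],
                            operation_event: str,
                            table: Dict) -> None:
    key_points = capture_track()              # S21: acquire the user-defined gesture track
    value = characteristic_value(key_points)  # S22: extract the track characteristic value
    table[value] = operation_event            # S23: configure the corresponding operation event
    # S24: a real terminal would persist the mapping table here (assumption)

table: Dict = {}
register_custom_gesture(lambda: [(0, 0), (1, 0), (2, 0)], "light_screen", table)
print(table)  # {((0, 0), (1, 0), (2, 0)): 'light_screen'}
```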
According to the embodiment of the invention, the intelligent terminal realizes the configuration of the mapping relation between the track characteristic value and the corresponding operation event.
Referring to fig. 4, fig. 4 is a schematic diagram of the functional modules of an intelligent terminal according to a first embodiment of the present invention; as shown in fig. 4, in the embodiment of the present invention, the intelligent terminal includes a detection module 01, an acquisition module 02 and an execution module 03.
Wherein the detection module 01 is configured to detect a gesture signal through a sensor arranged on the intelligent terminal and to acquire a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal.
In the embodiment of the invention, the intelligent terminal is provided with a corresponding sensor; the specific number of the sensors is set as needed and is not limited in the embodiment of the invention. Based on the sensor arranged on the intelligent terminal, the detection module 01 detects a gesture signal generated by the gesture change of the user; the gesture signal is generated correspondingly by a gesture action triggered by the user, and the gesture action triggered by the user is not in contact with the intelligent terminal.
After the detection module 01 detects a gesture signal generated by a non-contact gesture action of a user, a gesture track corresponding to the gesture signal is obtained.
In a preferred embodiment of the present invention, the manners in which the detection module 01 detects, through the sensor arranged on the intelligent terminal, the gesture signal corresponding to the gesture action triggered by the user include, but are not limited to, the following:
the intelligent terminal is provided with an infrared sensor, and the detection module 01 acquires the corresponding gesture signal according to the infrared signal which reflects the gesture track change and is detected by the infrared sensor. Or, a light sensor is arranged on the intelligent terminal, and the detection module 01 acquires the corresponding gesture signal according to a light intensity signal which is detected by the light sensor and reflects the change of the light intensity. Or, an ultrasonic sensor is arranged on the intelligent terminal, and the detection module 01 acquires the corresponding gesture signal according to the ultrasonic signal which is detected by the ultrasonic sensor and reflects the amplitude change of the sound wave. Or, a capacitance sensor is arranged on the intelligent terminal, and the detection module 01 acquires the corresponding gesture signal according to an electric signal which reflects capacitance change and is detected by the capacitance sensor.
Those skilled in the art can understand that the detection module 01 only needs to detect, through a sensor arranged on the intelligent terminal, the gesture signal corresponding to the non-contact gesture action triggered by the user; the embodiment of the present invention does not limit which kind of sensor is specifically adopted on the intelligent terminal for the detection.
After the detection module 01 detects a gesture signal corresponding to the gesture action of the user, a gesture track corresponding to the gesture signal is obtained according to the gesture signal. The gesture track acquired by the detection module 01 can likewise be understood as the operation track to which the gesture action triggered by the user is mapped.
The acquisition module 02 is configured to extract a track characteristic value corresponding to the gesture track, and to search for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event.
After the detection module 01 acquires a gesture track corresponding to a gesture signal, the acquisition module 02 extracts a track characteristic value corresponding to the gesture track according to key points on the gesture track; the track characteristic value corresponding to the gesture track can uniquely determine the specific form of the gesture track. One manner in which the acquisition module 02 extracts the track characteristic value corresponding to the gesture track may be: obtaining the track key points that uniquely determine the gesture track, extracting the track key points and obtaining their coordinates; and taking the coordinates of the key points as input parameters, and computing the track characteristic value corresponding to the gesture track according to a preset algorithm.
After obtaining the track characteristic value corresponding to the gesture track, the acquisition module 02 searches for the operation event matched with the track characteristic value according to the preset mapping relationship between the track characteristic value and the operation event.
The execution module 03 is configured to execute the operation event matched with the track characteristic value according to the search result.
The execution module 03 determines, according to the search result of the acquisition module 02, the operation event matched with the track characteristic value, namely the operation event that the execution module 03 needs to execute; after the operation event is determined, the execution module 03 executes the operation event matched with the track characteristic value.
For example, if the operation event matched with the track characteristic value is to light the screen, the execution module 03 executes the operation of lighting the screen.
The intelligent terminal detects a gesture signal through a sensor arranged on the intelligent terminal, and obtains a gesture track corresponding to the gesture signal; the gesture signal is generated correspondingly by a gesture action triggered by a user, and the gesture action triggered by the user is not in contact with the intelligent terminal; extracting a track characteristic value corresponding to the gesture track, and searching for an operation event matched with the track characteristic value by using a preset mapping relation between the track characteristic value and the operation event; executing an operation event matched with the track characteristic value according to the search result; compared with the operation mode of carrying out non-contact gesture control based on image recognition in the prior art, the embodiment of the invention improves the recognition rate of the non-contact gesture control and reduces the misoperation of the intelligent terminal based on the non-contact gesture.
Referring to fig. 5, based on the description of the embodiment shown in fig. 4, fig. 5 is a schematic diagram of a functional module of an intelligent terminal according to a second embodiment of the present invention; the difference between the embodiment of the present invention and the embodiment described in fig. 4 is that before the intelligent terminal executes the function of non-contact gesture control operation, a trajectory characteristic value of a gesture trajectory corresponding to a gesture action triggered by a user needs to be preset, and a mapping relationship between the trajectory characteristic value and a corresponding operation event.
The setting action only needs to be executed when the intelligent terminal is used for the first time and, other factors aside, only needs to be executed once; after the setting is completed, the set mapping relationship between the track characteristic value and the operation event can be updated according to specific requirements. This setting operation is performed by the setting module 04 on the intelligent terminal.
As shown in fig. 5, the intelligent terminal further includes:
and the setting module 04 is configured to set a mapping relationship between the track characteristic value and the corresponding operation event.
In the embodiment of the present invention, the setting module 04 configures and stores a mapping relationship between the trajectory characteristic value corresponding to the gesture trajectory and the operation event based on a setting operation triggered by a user. The setting module 04 may set the mapping relationship between the track characteristic value and the corresponding operation event by:
acquiring a gesture signal generated by a user-defined gesture action, and acquiring a corresponding gesture track; extracting a track characteristic value corresponding to the gesture track; configuring an operation event to be executed correspondingly for the extracted track characteristic value; and storing the mapping relation between the track characteristic value and the corresponding operation event.
In a preferred embodiment of the present invention, the setting module 04 may set different track characteristic values and their corresponding operation events in a table form, that is, a mapping relationship between the track characteristic values and the operation events is stored in a form of a mapping relationship table.
When the update condition of the mapping table is triggered, the setting module 04 responds to a corresponding update instruction to update the mapping table; for example, the setting module 04 receives an operation request triggered by a user, and updates the set mapping relationship between the track characteristic value and the corresponding operation event.
The manner in which the setting module 04 updates the mapping relationship between the track characteristic value of the gesture track and the corresponding operation event includes, but is not limited to, the following: adding a mapping relation between the track characteristic value of a gesture track and the corresponding operation event; or deleting a stored mapping relation between the track characteristic value of a gesture track and the corresponding operation event; or changing a stored mapping relation between the track characteristic value of a gesture track and the corresponding operation event; and the like. The embodiment of the present invention does not specifically limit the content of the update performed by the setting module 04 on the mapping relationship between the track characteristic value and the corresponding operation event.
According to the embodiment of the invention, by setting the mapping relation between the track characteristic value of the gesture track and the corresponding operation event, the intelligent terminal improves the efficiency of recognizing the operation event corresponding to the non-contact gesture, thereby improving the efficiency with which the intelligent terminal executes non-contact gesture control.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only for the preferred embodiment of the present invention and is not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes that can be directly or indirectly applied to other related technical fields using the contents of the present specification and the accompanying drawings are included in the scope of the present invention.