Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides an application program operation method, which comprises the following steps: recognizing a touch gesture input on the intelligent terminal; and when the touch gesture input on the intelligent terminal is recognized to be up-down sliding or down-up sliding in a preset application program touch recognition area, executing a corresponding operation on the application program corresponding to the application program touch recognition area, wherein each application program touch recognition area corresponds to one application program. The embodiment of the invention also provides a corresponding application program operating device; each is described in detail below.
Example one
An embodiment of the present invention provides an application program operating method, as shown in fig. 1-a, the application program operating method in the embodiment of the present invention includes:
Step 101, identifying whether a touch gesture input on the intelligent terminal slides up and down or down and up in a preset application program touch recognition area;
wherein each application touch recognition area corresponds to an application.
In the embodiment of the invention, a corresponding application program touch recognition area is preset for each installed application program on the intelligent terminal, and whether a touch gesture input on the intelligent terminal slides up and down or down and up in a preset application program touch recognition area is identified by detecting the position information of each touch point triggered (namely touched) during the touch process. Each touch process lasts from the moment a touch point on the touch screen of the intelligent terminal is first triggered to the moment all touch points on the touch screen are released.
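By way of a non-limiting illustration only, the following Kotlin sketch shows one way such a touch process could be recorded; the recorder type, the event-callback names and the Point type are illustrative assumptions and are not prescribed by this embodiment:

```kotlin
// Illustrative recorder for one touch process: points are accumulated from
// the first touch-down until all touch points are released, after which the
// completed trace is handed to a recognizer callback.
data class Point(val x: Int, val y: Int)

class TouchTraceRecorder(private val onTraceComplete: (List<Point>) -> Unit) {
    private val trace = mutableListOf<Point>()

    fun onTouchDown(x: Int, y: Int) { trace.clear(); trace += Point(x, y) }  // touch process begins
    fun onTouchMove(x: Int, y: Int) { trace += Point(x, y) }                 // intermediate touch points
    fun onTouchUp() { onTraceComplete(trace.toList()); trace.clear() }       // all touch points released
}

fun main() {
    val recorder = TouchTraceRecorder { trace -> println("touch process: $trace") }
    recorder.onTouchDown(50, 80)
    recorder.onTouchMove(50, 30)
    recorder.onTouchUp() // touch process: [Point(x=50, y=80), Point(x=50, y=30)]
}
```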
Because sliding up and down or down and up within a small area has a certain difficulty, in order to facilitate the operation of a user, the upper and lower boundary ranges of the application program click area of an application program may be expanded on the basis of that click area, and the expanded area is used as the application program touch recognition area of the application program. Optionally, the left and right boundaries of the application touch recognition area are the same as those of the application click area of the corresponding application, while the upper and lower boundary ranges of the touch recognition area are larger than those of the click area. For example, when the application touch recognition area is preset for the application 1, the left and right boundaries of the touch recognition area of the application 1 are set to the left and right boundaries of the click area of the application 1, and the upper and lower boundaries of the touch recognition area of the application 1 are obtained by expanding the upper and lower boundaries of the click area of the application 1. The application program click area is the range within which the corresponding application program receives a click trigger (that is, a user executes a click operation in a certain application click area to trigger the application corresponding to that click area). Specifically, the upper and lower boundary ranges of the touch recognition area may exceed those of the corresponding click area by, for example, 40%, although other values may also be set. It should be noted that this expansion ratio cannot be too large; otherwise the touch recognition area may conflict with other operations (for example, an upward screen-swipe operation).
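As a minimal sketch of this boundary expansion, assuming Android-style screen coordinates in which y grows downward, an even split of the expansion between the top and bottom edges, and an illustrative Rect type (none of these details are prescribed by the text):

```kotlin
// Hypothetical Rect type; the patent does not prescribe a data layout.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    val height: Int get() = bottom - top
}

/**
 * Derives an application touch recognition area from an application click
 * area: the left/right boundaries are kept, and the top and bottom boundaries
 * are pushed outward by `expandRatio` of the click area's height (40% total
 * is the example value given in the text, split here evenly between the top
 * and bottom edges as one possible reading).
 */
fun expandClickArea(clickArea: Rect, expandRatio: Double = 0.4): Rect {
    val extra = (clickArea.height * expandRatio / 2).toInt()
    return Rect(
        left = clickArea.left,
        top = clickArea.top - extra,     // y grows downward: raising the top means subtracting
        right = clickArea.right,
        bottom = clickArea.bottom + extra
    )
}

fun main() {
    val clickArea = Rect(left = 100, top = 200, right = 260, bottom = 360)
    println(expandClickArea(clickArea)) // Rect(left=100, top=168, right=260, bottom=392)
}
```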
Optionally, when it is recognized that the touch gesture input on the smart terminal meets the first condition, it is determined that the touch gesture slides up and down in a preset application program touch recognition area; otherwise, it is determined that the touch gesture does not slide up and down in the preset application program touch recognition area. The first condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the lower half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application program touch recognition area.
Optionally, when it is recognized that the touch gesture input on the smart terminal meets the second condition, it is determined that the touch gesture slides down and up in a preset application program touch recognition area; otherwise, it is determined that the touch gesture does not slide down and up in the preset application program touch recognition area. The second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area.
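The two conditions can be read as simple predicates over the recorded trace. The following non-limiting Kotlin sketch encodes both; the Point and Region types and the inclusive/exclusive boundary choices follow the inequalities given later in Example two, and are otherwise illustrative:

```kotlin
data class Point(val x: Int, val y: Int)

data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    val midY: Int get() = (top + bottom) / 2
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
    // Screen coordinates: y grows downward, so the "lower half" has y >= midY.
    fun inLowerHalf(p: Point) = contains(p) && p.y >= midY
    fun inUpperHalf(p: Point) = contains(p) && p.y < midY
}

/** First condition: every point inside the region, first and last points in
 *  the lower half, at least one point in the upper half (an up-down slide). */
fun matchesFirstCondition(trace: List<Point>, region: Region): Boolean =
    trace.isNotEmpty() &&
    trace.all(region::contains) &&
    region.inLowerHalf(trace.first()) &&
    region.inLowerHalf(trace.last()) &&
    trace.any(region::inUpperHalf)

/** Second condition, the mirror image: first and last points in the upper
 *  half, at least one point in the lower half (a down-up slide). */
fun matchesSecondCondition(trace: List<Point>, region: Region): Boolean =
    trace.isNotEmpty() &&
    trace.all(region::contains) &&
    region.inUpperHalf(trace.first()) &&
    region.inUpperHalf(trace.last()) &&
    trace.any(region::inLowerHalf)

fun main() {
    val a1 = Region(left = 0, top = 0, right = 100, bottom = 100) // midY = 50
    val upDown = listOf(Point(50, 80), Point(50, 20), Point(50, 90))
    println(matchesFirstCondition(upDown, a1))  // true
    println(matchesSecondCondition(upDown, a1)) // false
}
```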
Step 102, when the touch gesture input on the intelligent terminal is recognized to be up-down sliding or down-up sliding in an application program touch recognition area, executing a corresponding operation on the application program corresponding to the application program touch recognition area;
in the embodiment of the present invention, a corresponding operation is set in advance for a touch gesture sliding up and down or sliding down and up in an application touch recognition area, so that when the touch gesture input on the smart terminal is recognized instep 101 as sliding up and down or sliding down and up in a preset application touch recognition area, a corresponding operation is performed on an application corresponding to the corresponding application touch recognition area.
Specifically, the area where the icon of an application is located may be set as the application touch recognition area. For example, fig. 1-b is a schematic diagram of an interface and the application touch recognition areas of an intelligent terminal in an application scenario, where icon 1, icon 2, icon 3, icon 4, icon 5, and icon 6 are the icons of 6 applications installed on the intelligent terminal (for convenience of description, these 6 applications are referred to as application 1, application 2, application 3, application 4, application 5, and application 6, respectively). As shown in fig. 1-b, the area a1 where the icon 1 is located may be set in advance as the application touch recognition area corresponding to the application 1, the area a2 where the icon 2 is located as the application touch recognition area corresponding to the application 2, the area a3 where the icon 3 is located as the application touch recognition area corresponding to the application 3, the area a4 where the icon 4 is located as the application touch recognition area corresponding to the application 4, the area a5 where the icon 5 is located as the application touch recognition area corresponding to the application 5, and the area a6 where the icon 6 is located as the application touch recognition area corresponding to the application 6. It should be noted that in fig. 1-b blank areas are left between the touch recognition areas of the applications (i.e., they are not connected to each other); in other embodiments, the touch recognition areas corresponding to adjacent icons may be connected, as shown in fig. 1-c, where the area a1 is connected to the area a2 and the area a4, the area a2 is connected to the area a1, the area a3 and the area a5, and the area a3 is connected to the area a2 and the area a6. Of course, in other embodiments, other areas may be set as the application touch recognition areas corresponding to the applications, which is not limited herein. Further, the area where the icon of an application is located may also be enlarged, and the enlarged area used as the application touch recognition area of the corresponding application; for example, the upper boundary of the area where the icon is located may be raised by a preset height, so as to enlarge the touch range of the application touch recognition area in the vertical direction.
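A minimal, non-limiting sketch of such a configuration follows: a registry mapping each application to its recognition area, and a helper that finds the single area containing every point of a trace. The application identifiers, coordinates and the Map-based registry are illustrative assumptions:

```kotlin
data class Point(val x: Int, val y: Int)

data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

// Hypothetical registry: one recognition area per installed application
// (icon 1 -> application 1, and so on); coordinates are made up.
val recognitionAreas: Map<String, Region> = mapOf(
    "application1" to Region(0, 0, 99, 149),     // area a1 around icon 1
    "application2" to Region(100, 0, 199, 149),  // area a2, connected to a1 as in fig. 1-c
    "application3" to Region(200, 0, 299, 149)   // area a3, connected to a2
)

/** Returns the application whose recognition area contains the whole trace,
 *  or null when the trace leaves every area or spans several areas. */
fun resolveApplication(trace: List<Point>): String? =
    recognitionAreas.entries
        .singleOrNull { (_, region) -> trace.isNotEmpty() && trace.all(region::contains) }
        ?.key

fun main() {
    println(resolveApplication(listOf(Point(10, 120), Point(12, 30), Point(14, 130)))) // application1
    println(resolveApplication(listOf(Point(10, 120), Point(150, 30))))                // null: spans a1 and a2
}
```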
In an application scenario, the operation corresponding to a touch gesture of sliding up and down or sliding down and up in a preset application touch recognition area is set as starting an application; in step 102, when it is recognized that the touch gesture input on the smart terminal slides up and down or down and up in the preset application touch recognition area, the application corresponding to the application touch recognition area is started. Taking fig. 1-b as an example, when step 101 recognizes that the touch gesture slides up and down or down and up in the area a1, the application corresponding to the area a1 (i.e., the application 1) is started.
In another application scenario, the operation corresponding to a touch gesture of sliding up and down or sliding down and up in a preset application touch recognition area is set as popping up a widget interface (also referred to as a pop-up frame interface) related to an application; in step 102, when it is recognized that the touch gesture input on the smart terminal slides up and down or down and up in the preset application touch recognition area, the widget interface related to the application corresponding to the application touch recognition area is popped up. Taking fig. 1-b as an example, when step 101 recognizes that the touch gesture slides up and down or down and up in the area a1, the widget interface related to the application corresponding to the area a1 (i.e., the aforementioned application 1) is popped up.
In another application scenario, the up-down sliding and the down-up sliding in a preset application touch recognition area may be set to correspond to different operations respectively. For example, the operation corresponding to the touch gesture sliding up and down in the preset application touch recognition area may be set as starting the application and the operation corresponding to the touch gesture sliding down and up as popping up the widget interface related to the application; or, conversely, the operation corresponding to sliding down and up may be set as starting the application and the operation corresponding to sliding up and down as popping up the widget interface related to the application.
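One possible, non-limiting encoding of this direction-dependent binding is sketched below; the enum name, the stub operations and the particular binding chosen (up-down starts the app, down-up pops up its widget) are illustrative only:

```kotlin
// Direction-dependent dispatch: one reading of this scenario, with the
// gesture names and operations as placeholders for whatever the terminal
// actually binds them to.
enum class Gesture { UP_DOWN, DOWN_UP }

fun onGestureRecognized(appId: String, gesture: Gesture) =
    when (gesture) {
        Gesture.UP_DOWN -> startApplication(appId)      // e.g. up-down slide launches the app
        Gesture.DOWN_UP -> popUpWidgetInterface(appId)  // e.g. down-up slide opens its widget
    }

// Stubs standing in for the terminal's real launch / pop-up facilities.
fun startApplication(appId: String) = println("starting $appId")
fun popUpWidgetInterface(appId: String) = println("popping up widget for $appId")

fun main() {
    onGestureRecognized("application1", Gesture.UP_DOWN)  // starting application1
    onGestureRecognized("application1", Gesture.DOWN_UP)  // popping up widget for application1
}
```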
Further, when step 101 recognizes that the touch gesture input on the smart terminal neither slides up and down nor slides down and up in an application touch recognition area, the method returns to step 101 immediately, or after waiting for a preset time or a preset event.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized to be up-down sliding or down-up sliding in an application program touch recognition area, a corresponding operation is executed on the application program corresponding to that application program touch recognition area.
Example two
The difference between the embodiment of the present invention and the first embodiment is that the embodiment of the present invention further defines an identification condition of up-down sliding to further reduce the probability of misoperation of the application program, as shown in fig. 2-a, the application program operation method in the embodiment of the present invention includes:
Step 201, identifying whether a touch gesture input on the intelligent terminal meets a first condition;
wherein the first condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the lower half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application program touch recognition area.
If the touch gesture input on the intelligent terminal is recognized to satisfy the first condition, step 202 is executed; if the touch gesture input on the intelligent terminal is recognized not to satisfy the first condition, step 203 is executed.
An x-y coordinate system used by the touch screen of the intelligent terminal is shown in fig. 2-b (the coordinate system shown in fig. 2-b is also the default coordinate system of current intelligent terminals). In the coordinate system shown in fig. 2-b, the x-axis coordinate values increase sequentially from left to right and the y-axis coordinate values increase sequentially from top to bottom; correspondingly, in the touch screen of an intelligent terminal applying the coordinate system shown in fig. 2-b, the x coordinate values of touch points increase sequentially from left to right, and the y coordinate values of touch points increase sequentially from top to bottom. Taking a rectangular preset application touch recognition area as an example, in step 201, if it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, the touch point is determined to be located in the lower half area of the application touch recognition area; if it is recognized that the abscissa value x_i of a touch point satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies ymin ≤ y_i < (ymax + ymin)/2, the touch point is determined to be located in the upper half area of the application touch recognition area; wherein xleft and xrt are the abscissa values of the left and right boundaries of the application touch recognition area, respectively, and ymin and ymax are the ordinate values of the upper and lower boundaries of the application touch recognition area, respectively.
In an application scenario, whether the touch gesture input on the intelligent terminal meets the first condition can be identified by detecting the position of each touch point of the input touch gesture. Taking the area a1 in fig. 1-b as an example, let the x-y coordinate system used by the touch screen of the intelligent terminal be as shown in fig. 2-b, let the x coordinate values of the left and right boundaries of the area a1 be xleft and xrt, respectively, and let the y coordinate values of the upper and lower boundaries of the area a1 be ymin and ymax, respectively; then the y coordinate value ymid of the boundary between the upper half area and the lower half area of the area a1 is (ymax + ymin)/2. When it is recognized that each touch point of the input touch gesture is in the area a1, that the y coordinate values of the first touch point and the last touch point of the input touch gesture are both greater than ymid (i.e., both are located in the lower half area of the application touch recognition area), and that at least one touch point of the input touch gesture has a y coordinate value less than ymid (i.e., is located in the upper half area of the application touch recognition area), it is determined that the touch gesture input on the intelligent terminal meets the first condition, and step 202 is executed; otherwise, it is determined that the touch gesture input on the intelligent terminal does not meet the first condition, and step 203 is executed.
In another application scenario, whether the touch gesture input on the smart terminal satisfies the first condition can also be identified by detecting the positions of the first touch point, the inflection point and the last touch point of the input touch gesture. Specifically, if it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are all in the same preset application touch recognition area, that the first touch point and the last touch point are located in the lower half area of the application touch recognition area, and that the inflection point is located in the upper half area of the application touch recognition area, it is determined that the touch gesture satisfies the first condition. In this application scenario, an inflection point needs to be determined: the inflection point is a touch point satisfying the condition y_j < y_{j-1} and y_j < y_{j+1}, or the condition y_j > y_{j-1} and y_j > y_{j+1} (i.e., a turning point at which the y coordinate value changes from increasing to decreasing or from decreasing to increasing), where y_j denotes the ordinate value of the inflection point, y_{j-1} the ordinate value of the touch point immediately preceding the inflection point, and y_{j+1} the ordinate value of the touch point immediately following the inflection point. Taking the area a1 in fig. 1-b as an example, assume that the x-y coordinate system used by the touch screen of the smart terminal is as shown in fig. 2-b, that the x coordinate values of the left and right boundaries of the area a1 are xleft and xrt, respectively, and that the y coordinate values of the upper and lower boundaries of the area a1 are ymin and ymax, respectively; then the y coordinate value ymid of the boundary between the upper half area and the lower half area of the area a1 is (ymax + ymin)/2. In the enlarged schematic diagram of the area a1 shown in fig. 2-c, P is an input touch gesture, where the points P1 and P3 are the first touch point and the last touch point of the input touch gesture, respectively, and the point P2 is the inflection point of the input touch gesture. When it is recognized that the point P1 satisfies ymid < y_1 < ymax and xleft < x_1 < xrt, the point P2 satisfies ymin < y_2 < ymid and xleft < x_2 < xrt, and the point P3 satisfies ymid < y_3 < ymax and xleft < x_3 < xrt, it is determined that the input touch gesture satisfies the first condition, and step 202 is executed; otherwise, it is determined that the input touch gesture does not satisfy the first condition, and step 203 is executed.
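A non-limiting Kotlin sketch of this inflection-point variant follows; the scan for the first strict local extremum of the ordinate sequence and the three-point check mirror the conditions above, while the type names are illustrative:

```kotlin
data class Point(val x: Int, val y: Int)

/** First strict local extremum of the ordinate sequence: y_j < y_{j-1} and
 *  y_j < y_{j+1}, or y_j > y_{j-1} and y_j > y_{j+1}. */
fun firstInflection(trace: List<Point>): Point? =
    (1 until trace.size - 1).firstOrNull { j ->
        val prev = trace[j - 1].y; val cur = trace[j].y; val next = trace[j + 1].y
        (cur < prev && cur < next) || (cur > prev && cur > next)
    }?.let { trace[it] }

class Area(val xleft: Int, val xrt: Int, val ymin: Int, val ymax: Int) {
    private val ymid = (ymax + ymin) / 2
    fun lowerHalf(p: Point) = p.x in xleft..xrt && p.y in ymid..ymax
    fun upperHalf(p: Point) = p.x in xleft..xrt && p.y >= ymin && p.y < ymid
}

/** First condition checked from P1 (first point), P2 (inflection point) and
 *  P3 (last point) only, as in fig. 2-c. */
fun firstConditionByInflection(trace: List<Point>, area: Area): Boolean {
    if (trace.size < 3) return false
    val p2 = firstInflection(trace) ?: return false
    return area.lowerHalf(trace.first()) && area.upperHalf(p2) && area.lowerHalf(trace.last())
}

fun main() {
    val a1 = Area(xleft = 0, xrt = 100, ymin = 0, ymax = 100) // ymid = 50
    val p = listOf(Point(50, 80), Point(50, 20), Point(50, 90)) // P1 low, P2 high, P3 low
    println(firstConditionByInflection(p, a1)) // true
}
```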
Optionally, the triggered touch points are detected during the input of the touch gesture; when it is detected that a triggered touch point cannot meet the requirements of the first condition, it is determined that the touch gesture currently being input does not satisfy the first condition, step 203 is executed, and input of a new touch gesture is then awaited.
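One way such early rejection could be implemented is sketched below, again as a non-limiting illustration: points are fed one at a time, and the checker rejects as soon as a point leaves the recognition area or the very first point is not in the lower half; the class and method names are assumptions:

```kotlin
data class Point(val x: Int, val y: Int)

class IncrementalFirstConditionChecker(
    private val xleft: Int, private val xrt: Int,
    private val ymin: Int, private val ymax: Int
) {
    private val ymid = (ymax + ymin) / 2
    private var rejected = false
    private var sawFirst = false
    private var sawUpperHalf = false
    private var lastInLowerHalf = false

    /** Feed one triggered touch point; returns false once the gesture is ruled out. */
    fun offer(p: Point): Boolean {
        if (rejected) return false
        val inArea = p.x in xleft..xrt && p.y in ymin..ymax
        val inLower = inArea && p.y >= ymid
        // Reject early: point outside the area, or first point not in the lower half.
        if (!inArea || (!sawFirst && !inLower)) { rejected = true; return false }
        sawFirst = true
        sawUpperHalf = sawUpperHalf || p.y < ymid
        lastInLowerHalf = inLower
        return true
    }

    /** Call when all touch points are released. */
    fun finish(): Boolean = !rejected && sawUpperHalf && lastInLowerHalf
}

fun main() {
    val checker = IncrementalFirstConditionChecker(0, 100, 0, 100)
    listOf(Point(50, 80), Point(50, 20), Point(50, 90)).forEach(checker::offer)
    println(checker.finish()) // true
}
```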
Step 202, judging that the touch gesture slides up and down in a preset application program touch recognition area, and executing a corresponding operation on the application program corresponding to the application program touch recognition area;
In the embodiment of the present invention, when it is determined that the touch gesture input on the smart terminal slides up and down in a preset application touch recognition area, a preset operation corresponding to the up-down sliding is performed on the application corresponding to the application touch recognition area; specifically, the preset operation corresponding to the up-down sliding may be, for example, starting the application, popping up a widget interface related to the application, or another application operation, which is not limited herein.
Step 203, judging that the touch gesture does not slide up and down in a preset application program touch recognition area, and not executing the preset operation corresponding to the up-down sliding on the application program corresponding to the application program touch recognition area.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized to slide up and down in an application program touch recognition area, a corresponding operation is executed on the application program corresponding to that application program touch recognition area.
Example three
The difference between the embodiment of the present invention and the first embodiment is that the embodiment of the present invention further defines an identification condition of down-up sliding to further reduce the probability of misoperation of the application program; as shown in fig. 3-a, the application program operation method in the embodiment of the present invention includes:
Step 301, identifying whether a touch gesture input on the intelligent terminal meets a second condition;
wherein the second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area.
Step 302 is executed when the touch gesture input on the intelligent terminal is recognized to meet the second condition, and step 303 is executed when the touch gesture input on the intelligent terminal is recognized not to meet the second condition.
An x-y coordinate system used by the touch screen of the intelligent terminal is shown in fig. 2-b (the coordinate system shown in fig. 2-b is also the default coordinate system of current intelligent terminals). In the coordinate system shown in fig. 2-b, the x-axis coordinate values increase sequentially from left to right and the y-axis coordinate values increase sequentially from top to bottom; correspondingly, in the touch screen of an intelligent terminal applying the coordinate system shown in fig. 2-b, the x coordinate values of touch points increase sequentially from left to right, and the y coordinate values of touch points increase sequentially from top to bottom. If the application touch recognition area preset in the embodiment of the present invention is a rectangular area, then in step 301, specifically, if it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, the touch point is determined to be located in the lower half area of the application touch recognition area; if it is recognized that the abscissa value x_i of a touch point satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies ymin ≤ y_i < (ymax + ymin)/2, the touch point is determined to be located in the upper half area of the application touch recognition area; wherein xleft and xrt are the abscissa values of the left and right boundaries of the application touch recognition area, respectively, and ymin and ymax are the ordinate values of the upper and lower boundaries of the application touch recognition area, respectively.
In an application scenario, whether the touch gesture input on the intelligent terminal meets the second condition may be identified by detecting the position of each touch point of the input touch gesture. Taking the area a1 in fig. 1-b as an example, let the x-y coordinate system used by the touch screen of the smart terminal be as shown in fig. 2-b; correspondingly, in the touch screen of an intelligent terminal applying the coordinate system shown in fig. 2-b, the x coordinate values of touch points increase sequentially from left to right, and the y coordinate values of touch points increase sequentially from top to bottom. If the x coordinate values of the left and right boundaries of the area a1 are xleft and xrt, respectively, and the y coordinate values of the upper and lower boundaries of the area a1 are ymin and ymax, respectively, then the y coordinate value ymid of the boundary between the upper half area and the lower half area of the area a1 is (ymax + ymin)/2. When it is recognized that each touch point of the input touch gesture is in the area a1, that the y coordinate values of the first touch point and the last touch point of the input touch gesture are both smaller than ymid (i.e., both are located in the upper half area of the application touch recognition area), and that at least one touch point of the input touch gesture has a y coordinate value larger than ymid (i.e., is located in the lower half area of the application touch recognition area), it is determined that the touch gesture input on the smart terminal satisfies the second condition, and step 302 is performed; otherwise, it is determined that the touch gesture input on the smart terminal does not satisfy the second condition, and step 303 is performed.
In another application scenario, whether the touch gesture input on the intelligent terminal meets the second condition may also be identified by detecting the positions of the first touch point, the inflection point and the last touch point of the input touch gesture. Specifically, if it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are all in the same preset application touch recognition area, that the first touch point and the last touch point are located in the upper half area of the application touch recognition area, and that the inflection point is located in the lower half area of the application touch recognition area, it is determined that the touch gesture satisfies the second condition. In this application scenario, an inflection point needs to be determined: the inflection point is a touch point satisfying the condition y_j < y_{j-1} and y_j < y_{j+1}, or the condition y_j > y_{j-1} and y_j > y_{j+1} (i.e., a turning point at which the y coordinate value changes from increasing to decreasing or from decreasing to increasing), where y_j denotes the ordinate value of the inflection point, y_{j-1} the ordinate value of the touch point immediately preceding the inflection point, and y_{j+1} the ordinate value of the touch point immediately following the inflection point. Taking the area a1 in fig. 1-b as an example, assume that the x-y coordinate system used by the touch screen of the smart terminal is as shown in fig. 2-b, that the x coordinate values of the left and right boundaries of the area a1 are xleft and xrt, respectively, and that the y coordinate values of the upper and lower boundaries of the area a1 are ymin and ymax, respectively; then the y coordinate value ymid of the boundary between the upper half area and the lower half area of the area a1 is (ymax + ymin)/2. In the enlarged schematic diagram of the area a1 shown in fig. 3-b, S is an input touch gesture, where the points S1 and S3 are the first touch point and the last touch point of the input touch gesture, respectively, and the point S2 is the inflection point of the input touch gesture. When it is recognized that the point S1 satisfies ymin < y_4 < ymid and xleft < x_4 < xrt, the point S2 satisfies ymid < y_5 < ymax and xleft < x_5 < xrt, and the point S3 satisfies ymin < y_6 < ymid and xleft < x_6 < xrt, it is determined that the input touch gesture satisfies the second condition, and step 302 is executed; otherwise, it is determined that the touch gesture input on the intelligent terminal does not satisfy the second condition, and step 303 is executed.
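Embodiments two and three can be folded into a single classifier that reports the slide direction from the first point, the last point and the inflection point; the following sketch is one such non-limiting combination (the Slide enum and the strict-extremum scan are illustrative assumptions):

```kotlin
data class Point(val x: Int, val y: Int)
enum class Slide { UP_DOWN, DOWN_UP, NONE }

fun classify(trace: List<Point>, xleft: Int, xrt: Int, ymin: Int, ymax: Int): Slide {
    if (trace.size < 3) return Slide.NONE
    // All touch points must stay inside the recognition area.
    if (trace.any { it.x !in xleft..xrt || it.y !in ymin..ymax }) return Slide.NONE
    val ymid = (ymax + ymin) / 2
    // First strict local extremum of the ordinate sequence (the inflection point).
    val inflection = (1 until trace.size - 1).firstOrNull { j ->
        val p = trace[j - 1].y; val c = trace[j].y; val n = trace[j + 1].y
        (c < p && c < n) || (c > p && c > n)
    }?.let { trace[it] } ?: return Slide.NONE
    val firstLower = trace.first().y >= ymid
    val lastLower = trace.last().y >= ymid
    val inflectionLower = inflection.y >= ymid
    return when {
        firstLower && lastLower && !inflectionLower -> Slide.UP_DOWN   // first condition
        !firstLower && !lastLower && inflectionLower -> Slide.DOWN_UP  // second condition
        else -> Slide.NONE
    }
}

fun main() {
    println(classify(listOf(Point(50, 80), Point(50, 20), Point(50, 90)), 0, 100, 0, 100)) // UP_DOWN
    println(classify(listOf(Point(50, 20), Point(50, 80), Point(50, 10)), 0, 100, 0, 100)) // DOWN_UP
}
```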
Optionally, the triggered touch points are detected during the input of the touch gesture; when it is detected that a triggered touch point cannot meet the requirements of the second condition, it is determined that the touch gesture currently being input does not satisfy the second condition, and step 303 is executed.
Step 302, judging that the touch gesture slides down and up in a preset application program touch recognition area, and executing a corresponding operation on the application program corresponding to the application program touch recognition area;
In the embodiment of the present invention, when it is determined that the touch gesture input on the smart terminal slides down and up in a preset application touch recognition area, a preset operation corresponding to the down-up sliding is performed on the application corresponding to the application touch recognition area; specifically, the preset operation corresponding to the down-up sliding may be, for example, starting the application, popping up a widget interface related to the application, or another application operation, which is not limited herein.
Step 303, determining that the touch gesture does not slide down and up in a preset application program touch recognition area, and not executing the preset operation corresponding to the down-up sliding on the application program corresponding to the application program touch recognition area.
It should be noted that the embodiment of the present invention (i.e., the third embodiment) may be combined with the second embodiment for recognition; or, on the basis of the first embodiment and the third embodiment, the touch gesture input on the smart terminal may be recognized as sliding up and down or down and up in a preset application touch recognition area in other ways, which is not limited herein.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized to slide down and up in an application program touch recognition area, a corresponding operation is executed on the application program corresponding to that application program touch recognition area.
Example four
Referring to fig. 4, an application operating device 400 according to an embodiment of the present invention is described, including: a touch gesture recognition unit 401 and a control unit 402.
The touch gesture recognition unit 401 is configured to recognize whether a touch gesture input on the smart terminal slides up and down or down and up in a preset application touch recognition area, where each application touch recognition area corresponds to one application. The control unit 402 is configured to execute a corresponding operation on the application corresponding to the application touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the smart terminal slides up and down or down and up in a preset application touch recognition area.
Optionally, the touch gesture recognition unit 401 includes: a first sub-recognition unit, configured to recognize whether a touch gesture input on the intelligent terminal meets a first condition; and a first determination unit, configured to determine that the touch gesture slides up and down in a preset application touch recognition area when the first sub-recognition unit recognizes that the input touch gesture meets the first condition, and to determine that the touch gesture does not slide up and down in a preset application touch recognition area when the first sub-recognition unit recognizes that the input touch gesture does not meet the first condition. The first condition is: each touch point of the input touch gesture is in the same preset application touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the lower half area of the application touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application touch recognition area. Further, the first sub-recognition unit is specifically configured to: determine that the touch gesture meets the first condition when it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are all in the same preset application touch recognition area, the first touch point and the last touch point are located in the lower half area of the application touch recognition area, and the inflection point is located in the upper half area of the application touch recognition area. The inflection point is a touch point satisfying the condition y_j < y_{j-1} and y_j < y_{j+1}, or the condition y_j > y_{j-1} and y_j > y_{j+1}, where y_j denotes the ordinate value of the inflection point, y_{j-1} the ordinate value of the touch point immediately preceding the inflection point, and y_{j+1} the ordinate value of the touch point immediately following the inflection point.
Optionally, the touch gesture recognition unit 401 includes: a second sub-recognition unit, configured to recognize whether a touch gesture input on the intelligent terminal meets a second condition; and a second determination unit, configured to determine that the touch gesture slides down and up in a preset application touch recognition area when the second sub-recognition unit recognizes that the input touch gesture meets the second condition, and to determine that the touch gesture does not slide down and up in a preset application touch recognition area when the second sub-recognition unit recognizes that the input touch gesture does not meet the second condition. The second condition is: each touch point of the input touch gesture is in the same preset application touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the upper half area of the application touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application touch recognition area. Further, the second sub-recognition unit is specifically configured to: determine that the touch gesture meets the second condition when it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are all in the same preset application touch recognition area, the first touch point and the last touch point are located in the upper half area of the application touch recognition area, and the inflection point is located in the lower half area of the application touch recognition area. The inflection point is a touch point satisfying the condition y_j < y_{j-1} and y_j < y_{j+1}, or the condition y_j > y_{j-1} and y_j > y_{j+1}, where y_j denotes the ordinate value of the inflection point, y_{j-1} the ordinate value of the touch point immediately preceding the inflection point, and y_{j+1} the ordinate value of the touch point immediately following the inflection point.
Optionally, in the embodiment of the present invention, the preset application touch recognition area is a rectangular area. The first sub-recognition unit may be specifically configured to: determine that a touch point is located in the lower half area of the application touch recognition area when it is recognized that the abscissa value x_i of the touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt and the ordinate value y_i of the touch point satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax; and determine that a touch point is located in the upper half area of the application touch recognition area when it is recognized that the abscissa value x_i of the touch point satisfies xleft ≤ x_i ≤ xrt and the ordinate value y_i of the touch point satisfies ymin ≤ y_i < (ymax + ymin)/2. The second sub-recognition unit may be specifically configured in the same way: a touch point satisfying xleft ≤ x_i ≤ xrt and (ymax + ymin)/2 ≤ y_i ≤ ymax is determined to be located in the lower half area of the application touch recognition area, and a touch point satisfying xleft ≤ x_i ≤ xrt and ymin ≤ y_i < (ymax + ymin)/2 is determined to be located in the upper half area of the application touch recognition area. Wherein xleft and xrt are the abscissa values of the left and right boundaries of the application touch recognition area, respectively, and ymin and ymax are the ordinate values of the upper and lower boundaries of the application touch recognition area, respectively.
Optionally, the operation corresponding to the touch gesture sliding up and down or sliding down and up in the preset application touch recognition area is to start an application; the control unit 402 is specifically configured to start the application corresponding to the application touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the smart terminal slides up and down or down and up in a preset application touch recognition area.
Optionally, the operation corresponding to the touch gesture sliding up and down or sliding down and up in the preset application touch recognition area is to pop up a widget interface related to the application; the control unit 402 is specifically configured to pop up the widget interface related to the application corresponding to the application touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the smart terminal slides up and down or down and up in the preset application touch recognition area.
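As a purely structural, non-limiting sketch of how units 401 and 402 could cooperate (the Kotlin interfaces, the lambda-based wiring and the particular gesture-to-operation binding are illustrative assumptions):

```kotlin
// Structural sketch of the device of fig. 4: a touch gesture recognition
// unit 401 feeding a control unit 402.
data class Point(val x: Int, val y: Int)
enum class Slide { UP_DOWN, DOWN_UP, NONE }

// Unit 401: decides whether a trace slides up-down or down-up in some
// registered recognition area, returning the matched application id.
class TouchGestureRecognitionUnit(private val recognize: (List<Point>) -> Pair<String, Slide>?) {
    fun onTrace(trace: List<Point>): Pair<String, Slide>? = recognize(trace)
}

// Unit 402: performs the operation bound to the recognized gesture.
class ControlUnit(
    private val start: (String) -> Unit,
    private val popUpWidget: (String) -> Unit
) {
    fun handle(appId: String, slide: Slide) = when (slide) {
        Slide.UP_DOWN -> start(appId)        // one possible binding from the text
        Slide.DOWN_UP -> popUpWidget(appId)  // the other binding from the text
        Slide.NONE -> Unit                   // no operation is performed
    }
}

fun main() {
    val unit401 = TouchGestureRecognitionUnit { trace ->
        if (trace.size >= 3) "application1" to Slide.UP_DOWN else null // stub recognizer
    }
    val unit402 = ControlUnit(
        start = { println("starting $it") },
        popUpWidget = { println("widget for $it") }
    )
    val trace = listOf(Point(50, 80), Point(50, 20), Point(50, 90))
    unit401.onTrace(trace)?.let { (app, slide) -> unit402.handle(app, slide) } // starting application1
}
```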
Because sliding up and down or down and up within a small area has a certain difficulty, in order to facilitate the operation of a user, the upper and lower boundary ranges of the application program click area of an application program may be expanded on the basis of that click area, and the expanded area is used as the application program touch recognition area of the application program. Optionally, the left and right boundaries of the application touch recognition area are the same as those of the application click area of the corresponding application, while the upper and lower boundary ranges of the touch recognition area are larger than those of the click area. The application program click area is the range within which the corresponding application program receives a click trigger (that is, a user executes a click operation in a certain application click area to trigger the application corresponding to that click area). Specifically, the upper and lower boundary ranges of the touch recognition area may exceed those of the corresponding click area by, for example, 40%, although other values may also be set. It should be noted that this expansion ratio cannot be too large; otherwise the touch recognition area may conflict with other operations (for example, an upward screen-swipe operation).
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
It should be understood that, in the embodiment of the present invention, functions of each functional module of the application program operating apparatus may be specifically implemented according to the method in the foregoing method embodiment, and a specific implementation process thereof may refer to relevant descriptions in the foregoing method embodiment, which is not described herein again.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized to be up-down sliding or down-up sliding in an application program touch recognition area, a corresponding operation is executed on the application program corresponding to that application program touch recognition area.
In the several embodiments provided in the present application, it should be understood that the disclosed system and method may be implemented in other ways.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of combinations of actions, but those skilled in the art should understand that the present invention is not limited by the described order of actions, as some steps may be performed in other orders or simultaneously according to the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are preferred embodiments, and that the actions and modules involved are not necessarily required by the present invention.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In view of the above, it is to be understood that the present invention is not limited to the specific embodiments and applications described above.