
Application program operation method and device

Info

Publication number
CN106095303B
CN106095303B, CN201610377935.0A, CN201610377935A
Authority
CN
China
Prior art keywords
touch
application program
area
point
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610377935.0A
Other languages
Chinese (zh)
Other versions
CN106095303A (en)
Inventor
周奇
钱伟强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yibin Bond China Smart Technology Co., Ltd.
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201610377935.0A
Publication of CN106095303A
Application granted
Publication of CN106095303B
Status: Active
Anticipated expiration

Abstract

The invention discloses an application program operation method and device. The method comprises the following steps: identifying whether a touch gesture input on an intelligent terminal is an up-then-down slide or a down-then-up slide within a preset application program touch recognition area, wherein each application program touch recognition area corresponds to one application program; and when the touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within an application program touch recognition area, executing a corresponding operation on the application program corresponding to that touch recognition area. The technical scheme provided by the invention can effectively reduce the probability of misoperation on application programs.

Description

Application program operation method and device
Technical Field
The invention relates to the technical field of intelligent terminals, in particular to an application program operation method and device.
Background
With the development of science and technology, intelligent terminals (such as smartphones and tablet computers) have become more and more powerful, and thousands of application programs have been developed for users. Application programs are an indispensable part of the intelligent terminal: after an application program is installed on the intelligent terminal, the user can use it to realize the corresponding functions (such as office work, chat, video, and games).
Because current intelligent terminals generally support touch control, application programs installed on an intelligent terminal are mostly operated by touch (for example, started by touch). However, the existing way of operating such application programs is too limited: the user can only click or long-press an application program icon to perform the corresponding operation. Moreover, because the touched point stays fixed during a click or long press, carelessly clicking or long-pressing an application program icon easily causes misoperation of the corresponding application program.
Disclosure of Invention
The invention provides an application program operation method and device, which are used for reducing the probability of misoperation on an application program.
The invention provides an application program operation method in a first aspect, which comprises the following steps:
identifying whether a touch gesture input on the intelligent terminal is an up-then-down slide or a down-then-up slide within a preset application program touch recognition area, wherein each application program touch recognition area corresponds to one application program;
and when the touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within an application program touch recognition area, executing a corresponding operation on the application program corresponding to that touch recognition area.
A second aspect of the present invention provides an application program operating device, comprising:
a touch gesture recognition unit, configured to recognize whether a touch gesture input on the intelligent terminal is an up-then-down slide or a down-then-up slide within a preset application program touch recognition area, wherein each application program touch recognition area corresponds to one application program;
and a control unit, configured to execute a corresponding operation on the application program corresponding to an application program touch recognition area when the touch gesture recognition unit recognizes that the touch gesture input on the intelligent terminal is an up-then-down or down-then-up slide within that preset touch recognition area.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when a touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within an application program touch recognition area, the corresponding operation is executed on the application program corresponding to that area. Compared with a click or long press, such a slide is hard to trigger accidentally, so the probability of misoperation on the application program can be effectively reduced.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1-a is a flowchart illustrating an embodiment of a method for operating an application according to the present invention;
FIG. 1-b is a schematic diagram illustrating an interface of an intelligent terminal and a touch recognition area of an application program in an application scenario;
FIG. 1-c is a schematic diagram illustrating an interface of an intelligent terminal and an application program touch recognition area in another application scenario;
FIG. 2-a is a schematic flow chart diagram illustrating a method for operating an application according to another embodiment of the present invention;
FIG. 2-b is a schematic view of an x-y coordinate system provided by the present invention;
FIG. 2-c is an enlarged view of region A1 in an application scenario;
FIG. 3-a is a flowchart illustrating a method for operating an application according to yet another embodiment of the present invention;
FIG. 3-b is an enlarged view of region A1 in another application scenario;
fig. 4 is a schematic structural diagram of an application operating device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides an application program operation method, which comprises the following steps: recognizing a touch gesture input on the intelligent terminal; and when the touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within a preset application program touch recognition area, executing a corresponding operation on the application program corresponding to that touch recognition area, wherein each application program touch recognition area corresponds to one application program. The embodiment of the invention also provides a corresponding application program operating device. Both are described in detail below.
Example one
An embodiment of the present invention provides an application program operation method; as shown in fig. 1-a, the method includes:
step 101, identifying whether a touch gesture input on the intelligent terminal is an up-then-down slide or a down-then-up slide within a preset application program touch recognition area;
wherein each application program touch recognition area corresponds to one application program.
In the embodiment of the invention, a corresponding application program touch recognition area is preset for each application program installed on the intelligent terminal, and whether a touch gesture input on the intelligent terminal is an up-then-down or down-then-up slide within a preset application program touch recognition area is identified by detecting the position information of each touch point triggered (i.e., touched) during the touch process. Each touch process lasts from the moment a touch point on the touch screen of the intelligent terminal is first triggered to the moment all touch points on the touch screen are released.
Because sliding up then down (or down then up) within a small area is somewhat difficult, to make the operation easier for the user the upper and lower boundaries of an application program's click area can be extended, and the extended area used as that application program's touch recognition area. Optionally, the left and right boundaries of the touch recognition area coincide with those of the corresponding application program's click area, while its vertical extent is larger than that of the click area. For example, when the touch recognition area is preset for application 1, its left and right boundaries are set to the left and right boundaries of application 1's click area, and its upper and lower boundaries are extended beyond the upper and lower boundaries of application 1's click area. The application program click area is the range within which the corresponding application program receives a click trigger (that is, a click operation performed by the user within a given application program click area triggers the application program corresponding to that area). Specifically, the vertical extent of the touch recognition area may exceed that of the click area by 40%, although other values can of course be set. It should be noted that this ratio cannot be too large, otherwise the touch recognition area may conflict with other operations (for example, an upward screen swipe).
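As a non-authoritative illustration of this area-expansion rule (the patent gives no code), the following Python sketch derives a touch recognition area from a click area, assuming screen coordinates in which y grows downward (as in fig. 2-b of Example II). The Rect type, the even top/bottom split of the extra height, and the 40% default are assumptions made here for illustration:
```python
from typing import NamedTuple

class Rect(NamedTuple):
    """Rectangle in screen coordinates where y grows downward,
    so ymin is the upper boundary and ymax is the lower boundary."""
    xleft: float
    ymin: float
    xrt: float
    ymax: float

def touch_recognition_area(click_area: Rect, expand_ratio: float = 0.40) -> Rect:
    """Keep the click area's left/right boundaries and stretch its vertical
    extent by expand_ratio (assumed here to be split evenly top and bottom)."""
    extra = (click_area.ymax - click_area.ymin) * expand_ratio / 2
    return Rect(click_area.xleft, click_area.ymin - extra,
                click_area.xrt, click_area.ymax + extra)

# Example: a 60x60 click area becomes a 60x84 touch recognition area (40% taller).
area = touch_recognition_area(Rect(xleft=0, ymin=0, xrt=60, ymax=60))
```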
Optionally, when the touch gesture input on the intelligent terminal is recognized as satisfying a first condition, it is determined that the gesture is an up-then-down slide within a preset application program touch recognition area; otherwise, it is determined that the gesture is not an up-then-down slide within a preset application program touch recognition area. The first condition is: every touch point of the input touch gesture lies within the same preset application program touch recognition area, the first and last touch points of the input gesture lie in the lower half of that area, and at least one touch point of the input gesture lies in the upper half of that area.
Optionally, when the touch gesture input on the intelligent terminal is recognized as satisfying a second condition, it is determined that the gesture is a down-then-up slide within a preset application program touch recognition area; otherwise, it is determined that the gesture is not a down-then-up slide within a preset application program touch recognition area. The second condition is: every touch point of the input touch gesture lies within the same preset application program touch recognition area, the first and last touch points of the input gesture lie in the upper half of that area, and at least one touch point of the input gesture lies in the lower half of that area.
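Since the two conditions are mirror images, one check can serve both. The Python sketch below is one possible reading of these conditions, not the patent's own implementation; it reuses the hypothetical Rect type from the earlier sketch and assumes each touch point is an (x, y) pair in the same coordinate system:
```python
def in_rect(p, r):
    """True if touch point p = (x, y) lies inside recognition area r."""
    return r.xleft <= p[0] <= r.xrt and r.ymin <= p[1] <= r.ymax

def in_lower_half(p, r):
    """Lower half of r (y grows downward): (ymax + ymin)/2 <= y <= ymax."""
    return (r.ymax + r.ymin) / 2 <= p[1] <= r.ymax

def matches_condition(points, area, start_in_lower=True):
    """start_in_lower=True checks the first condition (up-then-down slide);
    start_in_lower=False checks the second condition (down-then-up slide)."""
    if len(points) < 3 or not all(in_rect(p, area) for p in points):
        return False
    # The first and last touch points must lie in the starting half...
    ends_ok = (in_lower_half(points[0], area) == start_in_lower and
               in_lower_half(points[-1], area) == start_in_lower)
    # ...and at least one touch point must reach the opposite half.
    crossed = any(in_lower_half(p, area) != start_in_lower
                  for p in points[1:-1])
    return ends_ok and crossed
```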
step 102, when the touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within an application program touch recognition area, executing a corresponding operation on the application program corresponding to that touch recognition area;
in the embodiment of the present invention, a corresponding operation is set in advance for a touch gesture sliding up and down or sliding down and up in an application touch recognition area, so that when the touch gesture input on the smart terminal is recognized instep 101 as sliding up and down or sliding down and up in a preset application touch recognition area, a corresponding operation is performed on an application corresponding to the corresponding application touch recognition area.
Specifically, the area where an application program's icon is located may be set as its application program touch recognition area. For example, fig. 1-b is a schematic diagram of an interface and application program touch recognition areas of an intelligent terminal in one application scenario, where icon 1 to icon 6 are the icons of six application programs installed on the intelligent terminal (for convenience of description, these six application programs are referred to as application 1 to application 6, respectively). As shown in fig. 1-b, the area A1 where icon 1 is located may be preset as the touch recognition area corresponding to application 1, the area A2 where icon 2 is located as the touch recognition area corresponding to application 2, the area A3 where icon 3 is located as the touch recognition area corresponding to application 3, the area A4 where icon 4 is located as the touch recognition area corresponding to application 4, the area A5 where icon 5 is located as the touch recognition area corresponding to application 5, and the area A6 where icon 6 is located as the touch recognition area corresponding to application 6. It should be noted that in fig. 1-b blank areas are left between the touch recognition areas (i.e., they are not connected to each other), whereas in other embodiments the touch recognition areas of adjacent icons may be connected: as shown in fig. 1-c, area A1 adjoins areas A2 and A4, area A2 adjoins areas A1, A3, and A5, and area A3 adjoins areas A2 and A6. Of course, in other embodiments other areas may be set as the touch recognition areas of the application programs, which is not limited here. Further, the area where an application program's icon is located may also be enlarged and the enlarged area used as the touch recognition area of the corresponding application program; for example, the upper boundary of the icon's area may be raised by a preset height, enlarging the touch range of the application program's touch recognition area in the vertical direction.
In one application scenario, the operation corresponding to an up-then-down or down-then-up slide within a preset application program touch recognition area is set to starting the application program; in step 102, when the touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within a preset application program touch recognition area, the application program corresponding to that area is started. Taking fig. 1-b as an example, when step 101 recognizes an up-then-down or down-then-up slide within area A1, the application program corresponding to area A1 (i.e., application 1) is started.
In another application scenario, the operation corresponding to an up-then-down or down-then-up slide within a preset application program touch recognition area is set to popping up a widget interface (also called a pop-up frame interface) related to the application program; in step 102, when the touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within a preset application program touch recognition area, the widget interface related to the application program corresponding to that area is popped up. Taking fig. 1-b as an example, when step 101 recognizes an up-then-down or down-then-up slide within area A1, the widget interface related to the application program corresponding to area A1 (i.e., the aforementioned application 1) is popped up.
In yet another application scenario, the up-then-down slide and the down-then-up slide within a preset application program touch recognition area may be set to correspond to different operations. For example, the up-then-down slide may be set to start the application program while the down-then-up slide pops up the widget interface related to the application program, or, conversely, the up-then-down slide may pop up the widget interface while the down-then-up slide starts the application program, as sketched below.
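One way to realize such a per-direction mapping is a small dispatch table. In the Python sketch below, launch_app and show_widget are hypothetical placeholders for the terminal's real launch and pop-up operations, and the particular direction-to-action pairing is just one of the assignments the paragraph above allows:
```python
def launch_app(app_name):
    print(f"starting {app_name}")      # placeholder for the real launch call

def show_widget(app_name):
    print(f"popping up the widget interface of {app_name}")  # placeholder

GESTURE_ACTIONS = {
    "up_then_down": launch_app,    # first condition (see Example II)
    "down_then_up": show_widget,   # second condition (see Example III)
}

def handle_gesture(direction, app_name):
    GESTURE_ACTIONS[direction](app_name)

handle_gesture("up_then_down", "application 1")   # -> starting application 1
```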
Further, when step 101 recognizes that the touch gesture input on the intelligent terminal is neither an up-then-down slide nor a down-then-up slide within an application program touch recognition area, the flow returns to step 101, either immediately or after waiting for a preset time or a preset event.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when a touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within an application program touch recognition area, the corresponding operation is executed on the application program corresponding to that area, so the probability of misoperation on the application program can be effectively reduced.
Example two
The embodiment of the present invention differs from the first embodiment in that it further defines the recognition condition for the up-then-down slide so as to further reduce the probability of misoperation on the application program. As shown in fig. 2-a, the application program operation method in this embodiment includes:
step 201, identifying whether a touch gesture input on the intelligent terminal meets a first condition;
wherein the first condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the lower half portion of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half portion of the application program touch recognition area.
If the touch gesture input on the intelligent terminal is recognized as satisfying the first condition, step 202 is executed; if it is recognized as not satisfying the first condition, step 203 is executed.
The x-y coordinate system used by the touch screen of the intelligent terminal is shown in fig. 2-b (this is also the default coordinate system of current intelligent terminals): coordinate values on the x axis increase from left to right, and coordinate values on the y axis increase from top to bottom. Correspondingly, on a touch screen using the coordinate system of fig. 2-b, the x coordinate of a touch point increases from left to right and its y coordinate increases from top to bottom. Taking a rectangular preset application program touch recognition area as an example, in step 201: if the abscissa x_i of a touch point of the input touch gesture satisfies xleft ≤ x_i ≤ xrt and its ordinate y_i satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, the touch point is determined to lie in the lower half of the application program touch recognition area; if x_i satisfies xleft ≤ x_i ≤ xrt and y_i satisfies ymin ≤ y_i < (ymax + ymin)/2, the touch point is determined to lie in the upper half of the application program touch recognition area. Here xleft and xrt are the abscissas of the left and right boundaries of the application program touch recognition area, and ymin and ymax are the ordinates of its upper and lower boundaries, respectively.
In one application scenario, whether the touch gesture input on the intelligent terminal satisfies the first condition can be identified by detecting the position of every touch point of the input gesture. Taking area A1 in fig. 1-b as an example, assume the x-y coordinate system of fig. 2-b, let the x coordinates of the left and right boundaries of area A1 be xleft and xrt, and let the y coordinates of its upper and lower boundaries be ymin and ymax, respectively; the y coordinate of the boundary between the upper and lower halves of area A1 is then ymid = (ymax + ymin)/2. When it is recognized that every touch point of the input gesture is within area A1, that the y coordinates of the first and last touch points are both greater than ymid (i.e., both lie in the lower half of the touch recognition area), and that at least one touch point has a y coordinate less than ymid (i.e., lies in the upper half), it is determined that the touch gesture input on the intelligent terminal satisfies the first condition and step 202 is executed; otherwise it is determined that the gesture does not satisfy the first condition and step 203 is executed.
In another application scenario, whether the touch gesture input on the intelligent terminal satisfies the first condition can also be identified by detecting the positions of the first touch point, the inflection point, and the last touch point of the input gesture. Specifically, if the first touch point, the inflection point, and the last touch point of the input gesture are all recognized to be within the same preset application program touch recognition area, with the first and last touch points in the lower half of that area and the inflection point in its upper half, it is determined that the touch gesture satisfies the first condition. In this application scenario an inflection point must be determined: the inflection point is a touch point satisfying either the condition y_j < y_(j-1) and y_j < y_(j+1), or the condition y_j > y_(j-1) and y_j > y_(j+1) (that is, a turning point at which the y coordinate switches from increasing to decreasing or from decreasing to increasing), where y_j is the ordinate of the inflection point, y_(j-1) is the ordinate of the touch point immediately before it, and y_(j+1) is the ordinate of the touch point immediately after it. Taking area A1 in fig. 1-b as an example, assume the x-y coordinate system of fig. 2-b, let the x coordinates of the left and right boundaries of area A1 be xleft and xrt, let the y coordinates of its upper and lower boundaries be ymin and ymax, and let ymid = (ymax + ymin)/2 be the boundary between its upper and lower halves. In the enlarged view of area A1 in fig. 2-c, P is the input touch gesture, points P1 and P3 are its first and last touch points, and point P2 is its inflection point. When it is recognized that point P1 satisfies ymid < y1 < ymax and xleft < x1 < xrt, point P2 satisfies ymin < y2 < ymid and xleft < x2 < xrt, and point P3 satisfies ymid < y3 < ymax and xleft < x3 < xrt, it is determined that the input touch gesture satisfies the first condition and step 202 is executed; otherwise it is determined that the gesture does not satisfy the first condition and step 203 is executed.
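A Python sketch of this inflection-point variant, under the same assumptions and reusing in_rect and in_lower_half from the sketch in Example I; requiring exactly one inflection point is an interpretation chosen here for illustration, not a limit stated by the patent:
```python
def inflection_points(points):
    """Touch points where the y coordinate turns around, i.e. points with
    y_j < y_(j-1) and y_j < y_(j+1), or y_j > y_(j-1) and y_j > y_(j+1)."""
    return [points[j] for j in range(1, len(points) - 1)
            if (points[j][1] < points[j - 1][1] and points[j][1] < points[j + 1][1])
            or (points[j][1] > points[j - 1][1] and points[j][1] > points[j + 1][1])]

def matches_condition_via_inflection(points, area, start_in_lower=True):
    """Check only the first touch point, the inflection point, and the last
    touch point, as with P1, P2, P3 in fig. 2-c (start_in_lower=True)."""
    turns = inflection_points(points)
    if len(turns) != 1:        # assumed: a clean slide has one turning point
        return False
    first, turn, last = points[0], turns[0], points[-1]
    return (all(in_rect(p, area) for p in (first, turn, last))
            and in_lower_half(first, area) == start_in_lower
            and in_lower_half(last, area) == start_in_lower
            and in_lower_half(turn, area) != start_in_lower)
```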
Optionally, the triggered touch points are detected while the touch gesture is being input; as soon as a triggered touch point is detected that violates the first condition, it is determined that the gesture currently being input does not satisfy the first condition, step 203 is executed, and the input of a new touch gesture is awaited.
Step 202, determining that the touch gesture is an up-then-down slide within a preset application program touch recognition area, and executing a corresponding operation on the application program corresponding to that touch recognition area;
In the embodiment of the present invention, when the touch gesture input on the intelligent terminal is determined to be an up-then-down slide within a preset application program touch recognition area, the preset operation corresponding to the up-then-down slide is performed on the application program corresponding to that area; specifically, the preset operation may be, for example, starting the application program, popping up a widget interface related to the application program, or another application operation, which is not limited here.
Step 203, determining that the touch gesture is not an up-then-down slide within a preset application program touch recognition area, and not executing the preset operation corresponding to the up-then-down slide on the application program corresponding to the touch recognition area.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when a touch gesture input on the intelligent terminal is recognized as an up-then-down slide within an application program touch recognition area, the corresponding operation is executed on the application program corresponding to that area, so the probability of misoperation on the application program can be effectively reduced.
EXAMPLE III
The embodiment of the present invention differs from the first embodiment in that it further defines the recognition condition for the down-then-up slide so as to further reduce the probability of misoperation on the application program. As shown in fig. 3-a, the application program operation method in this embodiment includes:
step 301, identifying whether a touch gesture input on the intelligent terminal meets a second condition;
wherein the second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area.
And executingstep 302 when the touch gesture input on the intelligent terminal is recognized to meet the second condition, and executingstep 303 when the touch gesture input on the intelligent terminal is recognized to not meet the second condition.
The x-y coordinate system used by the touch screen of the intelligent terminal is shown in fig. 2-b (this is also the default coordinate system of current intelligent terminals): coordinate values on the x axis increase from left to right, and coordinate values on the y axis increase from top to bottom. Correspondingly, on a touch screen using the coordinate system of fig. 2-b, the x coordinate of a touch point increases from left to right and its y coordinate increases from top to bottom. If the preset application program touch recognition area in this embodiment is a rectangular area, then in step 301: if the abscissa x_i of a touch point of the input touch gesture satisfies xleft ≤ x_i ≤ xrt and its ordinate y_i satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, the touch point is determined to lie in the lower half of the application program touch recognition area; if x_i satisfies xleft ≤ x_i ≤ xrt and y_i satisfies ymin ≤ y_i < (ymax + ymin)/2, the touch point is determined to lie in the upper half of the application program touch recognition area. Here xleft and xrt are the abscissas of the left and right boundaries of the application program touch recognition area, and ymin and ymax are the ordinates of its upper and lower boundaries, respectively.
In one application scenario, whether the touch gesture input on the intelligent terminal satisfies the second condition can be identified by detecting the position of every touch point of the input gesture. Taking area A1 in fig. 1-b as an example, assume the x-y coordinate system of fig. 2-b, so that the x coordinate of a touch point increases from left to right and its y coordinate increases from top to bottom. Let the x coordinates of the left and right boundaries of area A1 be xleft and xrt, and let the y coordinates of its upper and lower boundaries be ymin and ymax, respectively; the y coordinate of the boundary between the upper and lower halves of area A1 is then ymid = (ymax + ymin)/2. When it is recognized that every touch point of the input gesture is within area A1, that the y coordinates of the first and last touch points are both less than ymid (i.e., both lie in the upper half of the touch recognition area), and that at least one touch point has a y coordinate greater than ymid (i.e., lies in the lower half), it is determined that the touch gesture input on the intelligent terminal satisfies the second condition and step 302 is executed; otherwise it is determined that the gesture does not satisfy the second condition and step 303 is executed.
In another application scenario, whether the touch gesture input on the intelligent terminal satisfies the second condition can also be identified by detecting the positions of the first touch point, the inflection point, and the last touch point of the input gesture. Specifically, if the first touch point, the inflection point, and the last touch point of the input gesture are all recognized to be within the same preset application program touch recognition area, with the first and last touch points in the upper half of that area and the inflection point in its lower half, it is determined that the touch gesture satisfies the second condition. In this application scenario an inflection point must be determined: the inflection point is a touch point satisfying either the condition y_j < y_(j-1) and y_j < y_(j+1), or the condition y_j > y_(j-1) and y_j > y_(j+1) (that is, a turning point at which the y coordinate switches from increasing to decreasing or from decreasing to increasing), where y_j is the ordinate of the inflection point, y_(j-1) is the ordinate of the touch point immediately before it, and y_(j+1) is the ordinate of the touch point immediately after it. Taking area A1 in fig. 1-b as an example, assume the x-y coordinate system of fig. 2-b, let the x coordinates of the left and right boundaries of area A1 be xleft and xrt, let the y coordinates of its upper and lower boundaries be ymin and ymax, and let ymid = (ymax + ymin)/2 be the boundary between its upper and lower halves. In the enlarged view of area A1 in fig. 3-b, S is the input touch gesture, points S1 and S3 are its first and last touch points, and point S2 is its inflection point. When it is recognized that point S1 satisfies ymin < y4 < ymid and xleft < x4 < xrt, point S2 satisfies ymid < y5 < ymax and xleft < x5 < xrt, and point S3 satisfies ymin < y6 < ymid and xleft < x6 < xrt, it is determined that the input touch gesture satisfies the second condition and step 302 is executed; otherwise it is determined that the gesture does not satisfy the second condition and step 303 is executed.
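Under the same assumptions, the second condition needs no new code: it is the earlier sketch with the halves swapped. A usage example for the S1–S3 gesture of fig. 3-b, with hypothetical coordinates for area A1 (the figure gives none):
```python
# Hypothetical area A1: xleft=0, ymin=0, xrt=60, ymax=60, so ymid = 30.
area_a1 = Rect(xleft=0, ymin=0, xrt=60, ymax=60)

# S1 and S3 lie in the upper half (y < 30); S2 dips into the lower half.
gesture_s = [(30, 10), (35, 55), (40, 12)]    # S1, S2, S3

assert matches_condition(gesture_s, area_a1, start_in_lower=False)
assert matches_condition_via_inflection(gesture_s, area_a1, start_in_lower=False)
```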
Optionally, the triggered touch points are detected while the touch gesture is being input; as soon as a triggered touch point is detected that violates the second condition, it is determined that the gesture currently being input does not satisfy the second condition, and step 303 is executed.
Step 302, determining that the touch gesture is a down-then-up slide within a preset application program touch recognition area, and executing a corresponding operation on the application program corresponding to that touch recognition area;
In the embodiment of the present invention, when the touch gesture input on the intelligent terminal is determined to be a down-then-up slide within a preset application program touch recognition area, the preset operation corresponding to the down-then-up slide is performed on the application program corresponding to that area; specifically, the preset operation may be, for example, starting the application program, popping up a widget interface related to the application program, or another application operation, which is not limited here.
Step 303, determining that the touch gesture is not a down-then-up slide within a preset application program touch recognition area, and not executing the preset operation corresponding to the down-then-up slide on the application program corresponding to the touch recognition area.
It should be noted that this embodiment (the third embodiment) may be combined with the second embodiment for recognition; alternatively, in addition to the manners of the second and third embodiments, other ways may be used to recognize that the touch gesture input on the intelligent terminal is an up-then-down or down-then-up slide within a preset application program touch recognition area, which is not limited here.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when a touch gesture input on the intelligent terminal is recognized as a down-then-up slide within an application program touch recognition area, the corresponding operation is executed on the application program corresponding to that area, so the probability of misoperation on the application program can be effectively reduced.
Example four
Referring to fig. 4, an application program operating device 400 according to an embodiment of the present invention is described, comprising: a touch gesture recognition unit 401 and a control unit 402.
The touch gesture recognition unit 401 is configured to recognize whether a touch gesture input on the intelligent terminal is an up-then-down slide or a down-then-up slide within a preset application program touch recognition area, where each application program touch recognition area corresponds to one application program. The control unit 402 is configured to execute a corresponding operation on the application program corresponding to an application program touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the intelligent terminal is an up-then-down or down-then-up slide within that preset touch recognition area.
Optionally, the touch gesture recognition unit 401 includes: a first sub-recognition unit, configured to recognize whether the touch gesture input on the intelligent terminal satisfies a first condition; and a first determination unit, configured to determine that the touch gesture is an up-then-down slide within a preset application program touch recognition area when the first sub-recognition unit recognizes that the input touch gesture satisfies the first condition, and to determine that the touch gesture is not an up-then-down slide within a preset application program touch recognition area when the first sub-recognition unit recognizes that the input touch gesture does not satisfy the first condition. The first condition is: every touch point of the input touch gesture lies within the same preset application program touch recognition area, the first and last touch points of the input gesture lie in the lower half of that area, and at least one touch point of the input gesture lies in the upper half of that area. Further, the first sub-recognition unit is specifically configured to: determine that the touch gesture satisfies the first condition when the first touch point, the inflection point, and the last touch point of the touch gesture input on the intelligent terminal are recognized to be within the same preset application program touch recognition area, with the first and last touch points in the lower half of that area and the inflection point in its upper half. The inflection point is a touch point satisfying either the condition y_j < y_(j-1) and y_j < y_(j+1), or the condition y_j > y_(j-1) and y_j > y_(j+1), where y_j is the ordinate of the inflection point, y_(j-1) is the ordinate of the touch point immediately before it, and y_(j+1) is the ordinate of the touch point immediately after it.
Optionally, the touch gesture recognition unit 401 includes: a second sub-recognition unit, configured to recognize whether the touch gesture input on the intelligent terminal satisfies a second condition; and a second determination unit, configured to determine that the touch gesture is a down-then-up slide within a preset application program touch recognition area when the second sub-recognition unit recognizes that the input touch gesture satisfies the second condition, and to determine that the touch gesture is not a down-then-up slide within a preset application program touch recognition area when the second sub-recognition unit recognizes that the input touch gesture does not satisfy the second condition. The second condition is: every touch point of the input touch gesture lies within the same preset application program touch recognition area, the first and last touch points of the input gesture lie in the upper half of that area, and at least one touch point of the input gesture lies in the lower half of that area. Further, the second sub-recognition unit is specifically configured to: determine that the touch gesture satisfies the second condition when the first touch point, the inflection point, and the last touch point of the touch gesture input on the intelligent terminal are recognized to be within the same preset application program touch recognition area, with the first and last touch points in the upper half of that area and the inflection point in its lower half. The inflection point is defined as above.
Optionally, in the embodiment of the present invention, the preset application program touch recognition area is a rectangular area. The first sub-recognition unit may be specifically configured to: determine that a touch point of the touch gesture input on the intelligent terminal lies in the lower half of the application program touch recognition area when its abscissa x_i satisfies xleft ≤ x_i ≤ xrt and its ordinate y_i satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax; and determine that the touch point lies in the upper half of the area when x_i satisfies xleft ≤ x_i ≤ xrt and y_i satisfies ymin ≤ y_i < (ymax + ymin)/2. The second sub-recognition unit is specifically configured to make the same determinations. Here xleft and xrt are the abscissas of the left and right boundaries of the application program touch recognition area, and ymin and ymax are the ordinates of its upper and lower boundaries, respectively.
Optionally, the operation corresponding to an up-then-down or down-then-up slide within a preset application program touch recognition area is starting the application program; the control unit 402 is specifically configured to start the application program corresponding to the application program touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the intelligent terminal is an up-then-down or down-then-up slide within that preset area.
Optionally, the operation corresponding to an up-then-down or down-then-up slide within a preset application program touch recognition area is popping up a widget interface related to the application program; the control unit 402 is specifically configured to pop up the widget interface related to the application program corresponding to the application program touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the intelligent terminal is an up-then-down or down-then-up slide within that preset area.
Because sliding up then down (or down then up) within a small area is somewhat difficult, to make the operation easier for the user the upper and lower boundaries of an application program's click area can be extended, and the extended area used as that application program's touch recognition area. Optionally, the left and right boundaries of the touch recognition area coincide with those of the corresponding application program's click area, while its vertical extent is larger than that of the click area. The application program click area is the range within which the corresponding application program receives a click trigger (that is, a click operation performed by the user within a given application program click area triggers the application program corresponding to that area). Specifically, the vertical extent of the touch recognition area may exceed that of the click area by 40%, although other values can of course be set. It should be noted that this ratio cannot be too large, otherwise the touch recognition area may conflict with other operations (for example, an upward screen swipe).
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
It should be understood that, in the embodiment of the present invention, functions of each functional module of the application program operating apparatus may be specifically implemented according to the method in the foregoing method embodiment, and a specific implementation process thereof may refer to relevant descriptions in the foregoing method embodiment, which is not described herein again.
Therefore, according to the scheme of the invention, an application program touch recognition area corresponding to each application program is preset, and when a touch gesture input on the intelligent terminal is recognized as an up-then-down or down-then-up slide within an application program touch recognition area, the corresponding operation is executed on the application program corresponding to that area, so the probability of misoperation on the application program can be effectively reduced.
In the several embodiments provided in the present application, it should be understood that the disclosed system and method may be implemented in other ways.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present invention is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no acts or modules are necessarily required of the invention.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In view of the above, the present invention is not intended to be limited to the specific embodiments and applications described herein.

Claims (2)

CN201610377935.0A | 2016-05-31 | 2016-05-31 | Application program operation method and device | Active | CN106095303B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201610377935.0A (CN106095303B (en)) | 2016-05-31 | 2016-05-31 | Application program operation method and device

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201610377935.0A (CN106095303B (en)) | 2016-05-31 | 2016-05-31 | Application program operation method and device

Publications (2)

Publication Number | Publication Date
CN106095303A (en) | 2016-11-09
CN106095303B (en) | 2021-03-23

Family

ID=57230705

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201610377935.0A (Active; granted as CN106095303B (en)) | Application program operation method and device | 2016-05-31 | 2016-05-31

Country Status (1)

Country | Link
CN | CN106095303B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107045421B (en)* | 2017-04-27 | 2021-06-18 | 宇龙计算机通信科技(深圳)有限公司 | Screen switching method and mobile terminal
CN107479746A (en)* | 2017-07-31 | 2017-12-15 | 广州源创网络科技有限公司 | A touch control method and device
CN112363613B (en)* | 2020-09-25 | 2025-03-25 | 惠州市德赛西威汽车电子股份有限公司 | Infrared sliding gesture sensing and recognition method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101568896A (en)* | 2007-06-08 | 2009-10-28 | 索尼株式会社 | Information processing apparatus, input device, information processing system, information processing method, and program
CN102799339A (en)* | 2011-05-24 | 2012-11-28 | 汉王科技股份有限公司 | Touch implementation method and device of application function button
CN103617002A (en)* | 2013-12-16 | 2014-03-05 | 深圳市理邦精密仪器股份有限公司 | Method and device for achieving touch interface
CN105224215A (en)* | 2015-08-28 | 2016-01-06 | 小米科技有限责任公司 | Terminal control method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US7509588B2 (en)* | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode
CN103309482A (en)* | 2012-03-12 | 2013-09-18 | 富泰华工业(深圳)有限公司 | Electronic equipment and touch control method and touch control device thereof
CN102841682B (en)* | 2012-07-12 | 2016-03-09 | 宇龙计算机通信科技(深圳)有限公司 | Terminal and gesture control method
CN104460999B (en)* | 2014-11-28 | 2017-07-28 | 广东欧珀移动通信有限公司 | A kind of gesture identification method and device with flex point
CN104360805B (en)* | 2014-11-28 | 2018-01-16 | 广东欧珀移动通信有限公司 | Application icon management method and device
CN104636065A (en)* | 2014-12-31 | 2015-05-20 | 小米科技有限责任公司 | Method and device for awakening terminal
CN105138241B (en)* | 2015-09-02 | 2018-09-18 | Tcl移动通信科技(宁波)有限公司 | A kind of application program launching method, system and mobile terminal based on mobile terminal


Also Published As

Publication number | Publication date
CN106095303A (en) | 2016-11-09

Similar Documents

Publication | Publication Date | Title
US20210342041A1 (en) | Method and apparatus for adding icon to interface of android system, and mobile terminal
RU2582854C2 (en) | Method and device for fast access to device functions
US10551987B2 (en) | Multiple screen mode in mobile terminal
EP3056982B1 (en) | Terminal apparatus, display control method and recording medium
CN102819352B (en) | The method and apparatus of control terminal
CN102819416B (en) | A method and device for displaying component content
US9007314B2 (en) | Method for touch processing and mobile terminal
US20100107067A1 (en) | Input on touch based user interfaces
CN106484228B (en) | Double screen switches display methods and mobile terminal
CN105224276A (en) | A kind of multi-screen display method and electronic equipment
CN106126049A (en) | Menu operation method and system
CN107506109A (en) | A kind of method and mobile terminal for starting application program
CN102981711A (en) | Method and system for moving application icons on touch screen
KR20140024721A (en) | Method for changing display range and an electronic device thereof
CN106570372A (en) | Starting method of application program and mobile terminal
KR20130097331A (en) | Apparatus and method for selecting object in device with touch screen
CN106095303B (en) | Application program operation method and device
CN105867830A (en) | Fingerprint identification-based processing method and mobile terminal
CN106227453A (en) | Method for controlling mobile terminal and mobile terminal
US20110012843A1 (en) | Touch-controlled electronic apparatus and related control method
CN106201295B (en) | Message copying method and device and intelligent terminal
CN106502545A (en) | A kind of operational approach of slip control and mobile terminal
CN106814935B (en) | Application icon display method of terminal and terminal
CN106293051B (en) | Gesture-based interaction method and device and user equipment
JP7163755B2 (en) | display input device

Legal Events

Date | Code | Title | Description
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
GR01 | Patent grant
TR01 | Transfer of patent right
TR01 | Transfer of patent right

Effective date of registration: 2021-04-19

Address after: No. 27, west section of Xinggang Road, Yibin Lingang Economic and Technological Development Zone, Yibin City, Sichuan Province, 644000

Patentee after: Yibin Bond China Smart Technology Co., Ltd.

Address before: 9th floor, Zhongxin Science and Technology Building, No. 31, Bagua Road, Yuanling Sub-district, Futian District, Shenzhen, Guangdong, 518029 (10th floor and above numbered from 1001)

Patentee before: Zhou Qi

