CN107179849B - Terminal, input control method thereof, and computer-readable storage medium - Google Patents

Terminal, input control method thereof, and computer-readable storage medium

Info

Publication number
CN107179849B
CN107179849B (application CN201710362473.XA)
Authority
CN
China
Prior art keywords
touch screen
screen unit
function key
event
preset time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710362473.XA
Other languages
Chinese (zh)
Other versions
CN107179849A (en)
Inventor
任忠杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd
Priority to CN201710362473.XA
Publication of CN107179849A
Application granted
Publication of CN107179849B
Legal status: Active
Anticipated expiration

Abstract

The invention discloses an input control method for a terminal, comprising the following steps: in response to a triggering operation of a function key, starting timing; judging whether a sliding gesture from a first side edge toward a second side edge opposite the first side edge is received on the touch screen unit within a preset time; if so, reporting a sliding gesture event; if not, reporting a function key event. The input control method alleviates the problem of the user mistakenly triggering the function key when performing an upward sliding touch gesture at the bottom of the touch screen unit, and improves the user experience. The invention also provides a terminal and a computer-readable storage medium implementing the input control method.

Description

Terminal, input control method thereof, and computer-readable storage medium
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a terminal, an input control method of the terminal, and a computer-readable storage medium.
Background
With the wide application of touch control technology to various terminals, more and more touch gestures can be recognized and executed on terminals. Such terminals include, but are not limited to, computers, portable terminals, smart phones, tablet computers, personal digital assistants (PDAs), and the like. However, the realization of various touch gesture control functions on a terminal also brings a series of problems, such as false triggering.
A pull-up gesture sliding up from the bottom of the touch screen of the terminal is generally used to open a list of background running programs, open a shortcut button bar, and the like. However, a typical terminal is further provided with one or more function keys based on touch sensing technology, such as a Home key, a menu key, a return key, or other physical keys, at the bottom of the touch screen. Because the function keys sit at the bottom of the touch screen, a user triggering the pull-up gesture there often touches a function key by mistake; for example, when the user brushes the menu key during the pull-up gesture, the terminal opens the menu bar instead of executing the function the user expected from the pull-up gesture.
The function keys of common terminals on the market are generally either physical silk-screen keys based on infrared, capacitive, or resistive touch technology, or virtual keys arranged at the bottom of the touch screen. Either way, the function keys rely on touch sensing, and the operable distance between the function keys and the touch screen is small, which increases the probability of false triggering when the user slides up from the bottom of the touch screen and greatly degrades the user experience.
Disclosure of Invention
The invention mainly aims to provide a terminal, an input control method of the terminal and a computer readable storage medium, aiming at improving the problem of false triggering when a user performs a touch control gesture of sliding up the bottom of a touch screen.
In order to achieve the above object, an input control method provided by the present invention is used for input control of a terminal, where the terminal includes a touch screen unit and a function key disposed on a first side of the touch screen unit, and the method includes the following steps:
responding to the triggering operation of the function key, and starting timing;
judging whether a sliding gesture from the first side edge to a second side edge opposite to the first side edge is received on the touch screen unit within preset time;
when the touch screen unit receives the sliding gesture within the preset time, reporting a sliding gesture event;
and reporting a function key event when the sliding gesture is not received on the touch screen unit when the timing reaches the preset time.
Further, the preset time is 8-100 milliseconds.
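The four steps above can be sketched as a small event controller. This is a minimal, illustrative Python model, not the patented implementation: the 50 ms window is an assumed value inside the 8-100 ms range the text gives, and the event names are invented for the sketch.

```python
PRESET_WINDOW_S = 0.05  # assumed; the text specifies 8-100 milliseconds

class InputController:
    """Sketch of the claimed flow: on a function-key trigger, start timing
    instead of reporting immediately; a qualifying slide gesture arriving
    within the window wins, otherwise the key event is reported."""

    def __init__(self, window_s=PRESET_WINDOW_S):
        self.window_s = window_s
        self.pending_since = None   # timestamp of the not-yet-reported key trigger
        self.reported = []          # events handed to the "framework layer"

    def on_function_key(self, now):
        # Step S10: start timing; do NOT report the key event yet.
        self.pending_since = now

    def on_touch(self, now, is_bottom_to_top_slide):
        # Step S20/S30: a slide from the first (bottom) edge toward the
        # opposite edge inside the window supersedes the key press.
        if (self.pending_since is not None
                and now - self.pending_since <= self.window_s
                and is_bottom_to_top_slide):
            self.reported.append("slide_gesture")
            self.pending_since = None  # clear the pending key event

    def on_tick(self, now):
        # Step S40: window elapsed with no qualifying slide -> report the key.
        if self.pending_since is not None and now - self.pending_since > self.window_s:
            self.reported.append("function_key")
            self.pending_since = None
```

A mis-touch followed 20 ms later by a pull-up yields only a slide event, while an isolated key press yields the key event once the window lapses.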
Further, when the touch screen unit receives the sliding gesture within the preset time, the step of reporting the sliding gesture event includes: and reporting a sliding gesture event when the touch screen unit receives the sliding gesture within the preset time, and clearing a function key event corresponding to the triggering operation of the function key.
Further, the step of starting timing in response to the triggering operation of the function key includes:
responding to the triggering operation of the function key, and judging whether the time interval between the pressing event and the lifting event when the function key is triggered is smaller than a preset threshold value or not;
when the time interval between the pressing event and the lifting event is smaller than the preset threshold when the function key is triggered, starting timing;
and reporting the function key event when the time interval between the pressing event and the lifting event is greater than or equal to the preset threshold when the function key is triggered.
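The press/lift interval check above can be expressed as a small classifier. The 0.5 s threshold below is an assumed value; the text only says "preset threshold".

```python
LONG_PRESS_THRESHOLD_S = 0.5  # assumed tuning value, not given in the text

def classify_key_trigger(press_ts, lift_ts, threshold_s=LONG_PRESS_THRESHOLD_S):
    """If the down-to-up interval is below the threshold, the trigger may be a
    grazing mis-touch, so the deferred-timing path is taken; otherwise the
    press is deliberate enough to report the key event at once."""
    if lift_ts - press_ts < threshold_s:
        return "start_timing"        # continue to the slide-gesture window check
    return "report_function_key"     # long, deliberate press: report immediately
```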
Further, the step of judging whether the touch screen unit receives the sliding gesture from the first side edge towards the second side edge opposite to the first side edge within the preset time includes:
judging whether the touch screen unit receives a trigger operation within a preset time;
when the touch screen unit does not receive the triggering operation within the preset time, determining that the sliding gesture is not received on the touch screen unit when the timing reaches the preset time;
when the touch screen unit receives a trigger operation within a preset time, judging whether a trigger area corresponding to the trigger operation received by the touch screen unit within the preset time is a first side edge of the touch screen;
when the triggering area corresponding to the triggering operation received on the touch screen unit within the preset time is the first side edge of the touch screen, the sliding gesture received on the touch screen unit within the preset time is determined.
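The two-stage determination above (any trigger at all, then whether its trigger area is the first side edge) might look like the following sketch. The 40-pixel edge band is an assumed tuning value, and touch coordinates are taken to grow downward, as is conventional for touch panels.

```python
def touch_down_on_first_side(y, screen_height, edge_band_px=40):
    """True when the initial report point lies in a band along the first
    (bottom) side of the touch screen. Band width is an assumed value."""
    return y >= screen_height - edge_band_px

def slide_received_within_window(touches):
    """touches: list of (y, screen_height) initial report points seen inside
    the preset window. Per the text: no touch at all means no slide gesture,
    and a touch qualifies only if its trigger area is the first side edge."""
    if not touches:
        return False
    return any(touch_down_on_first_side(y, h) for y, h in touches)
```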
Further, the method also comprises the following steps:
determining whether a sliding track corresponding to the sliding gesture received on the touch screen unit in the preset time meets a preset condition;
and when the sliding track corresponding to the sliding gesture does not meet the preset condition, withdrawing the reported sliding gesture event.
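The track check and the withdrawal of an already-reported event could be sketched as below. The "preset condition" is not spelled out in the text; a plausible one, assumed here, is a minimum upward travel with mostly-monotonic motion.

```python
def track_satisfies_condition(track, min_travel_px=100):
    """track: list of (x, y) points, y decreasing as the finger moves up.
    Assumed condition: at least min_travel_px of upward travel and at least
    80% of the segments moving upward."""
    if len(track) < 2:
        return False
    upward = track[0][1] - track[-1][1]
    up_segments = sum(1 for a, b in zip(track, track[1:]) if b[1] <= a[1])
    return upward >= min_travel_px and up_segments >= 0.8 * (len(track) - 1)

def finalize(reported_events, track):
    """Withdraw an already-reported slide event when its track fails the check."""
    if "slide_gesture" in reported_events and not track_satisfies_condition(track):
        reported_events.remove("slide_gesture")
    return reported_events
```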
A terminal comprises a touch screen unit, a function key arranged on a first side edge of the touch screen unit, a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein the steps of the input control method are realized when the processor executes the computer program.
Furthermore, the distance between the lift-event sensing boundary of the function key and the sensing report-point boundary adjacent to the first side edge on the touch screen unit is 0-5 mm.
Further, the function keys are arranged at the bottom of the terminal, and the sliding gesture is a pull-up gesture from a first side edge of the touch screen unit to a second side edge opposite to the first side edge.
A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program realizes the steps of the above-mentioned input control method when being executed by a processor.
When the user triggers the function key, the input control method and a terminal using it can determine whether the triggering operation is an action of the user deliberately triggering the function key to execute the corresponding operation, or a false trigger produced while the user performs an upward sliding touch gesture at the bottom of the touch screen unit. This alleviates the false-triggering problem during upward sliding gestures at the bottom of the touch screen and improves the user experience. Moreover, in the input control method of the invention, responding to the triggering operation of the function key starts timing first rather than reporting the function key event immediately, so no additional program for intercepting and delaying function key events is required; this optimizes the processing speed, shortens the processing response time, and further enhances the user experience.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal according to various embodiments of the present invention;
fig. 2 is a schematic block diagram of a terminal according to an embodiment of the present invention;
fig. 3 is a schematic front view of the terminal of fig. 2;
fig. 4 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a first embodiment of the present invention;
fig. 5 is a schematic view of a user interface display corresponding to a pull-up gesture executed on a terminal according to an embodiment of the present invention.
Fig. 6 is a schematic view of a user interface display corresponding to a function execution key on a terminal according to an embodiment of the present invention.
Fig. 7 is a schematic view of a user interface display corresponding to a case where a function key is triggered by mistake when a pull-up gesture is performed on a terminal according to an embodiment of the present invention.
Fig. 8A is a schematic view of a user interface display corresponding to the terminal when the pull-up gesture is performed at intervals after the function key is executed on the terminal in an embodiment of the present invention.
Fig. 8B is a schematic view of a user interface display corresponding to the terminal when the pull-up gesture is performed at intervals after the function key is executed on the terminal in another embodiment of the present invention.
Fig. 9 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a second embodiment of the present invention;
fig. 10 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a third embodiment of the present invention;
fig. 11 is a flowchart of a method of an input control method for the terminal of fig. 2 according to a fourth embodiment of the present invention;
fig. 12 is a flowchart illustrating an input control method for the terminal shown in fig. 2 according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a mobile or stationary terminal such as a Digital TV, a desktop computer, and the like.
The following description will be given by way of example of a mobile terminal, and it will be understood by those skilled in the art that the construction according to the embodiment of the present invention can be applied to a fixed type terminal, in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal 100 for implementing various embodiments of the present invention, the mobile terminal 100 may include: a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of the mobile terminal 100, and that the mobile terminal 100 may include more or fewer components than shown, some components may be combined, or the components may be arranged differently; for example, in the example shown in fig. 1, the mobile terminal 100 further includes an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, and the like.
The various components of the mobile terminal 100 are described in detail below with reference to fig. 1:
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, screen keys, home keys, etc.), a trackball, a mouse, a joystick, and the like.
Further, the touch panel 1071 may cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the touch panel 1071 transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal; this is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100, or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, application programs required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the terminal (such as audio data, a phonebook, etc.), and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is the control center of the mobile terminal: it connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to the various components; preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a Bluetooth module or the like, which is not described in detail here.
Based on the above terminal hardware structure, various embodiments of the method of the present invention are provided.
Referring to fig. 2 and fig. 3, fig. 2 is a schematic block diagram of a terminal 10 according to an embodiment of the present invention, and fig. 3 is a schematic front view of the terminal 10 in fig. 2.
The terminal 10 comprises a display unit 11, a touch screen unit 12, a storage unit 13, a processing unit 14, function keys 15, a timing unit 16, a power supply 17, and the like. Those skilled in the art will appreciate that the terminal 10 shown in fig. 2 may also include more or fewer components than shown, or combine certain components, or arrange the components differently. The touch screen unit 12 is overlaid on the display unit 11 and is configured to detect a touch operation of a user on or near the touch screen unit 12 and transmit it to the processing unit 14 to determine the type of the touch event; the processing unit 14 then provides a corresponding visual output on the display unit 11 according to the type of the touch event.
The touch screen unit 12 includes a first side 121 and a second side 122 disposed opposite to each other. The function keys 15 are disposed on the first side 121 of the touch screen unit 12.
Specifically, in the present embodiment, the function keys 15 are disposed at the bottom of the touch screen unit 12. For example, when the terminal 10 is an intelligent terminal such as a mobile phone or a tablet computer, the function keys 15 include one or more of a Home key, a menu key, a return key, and other physical keys, and the function keys 15 are disposed at the bottom of the touch screen unit 12. As shown in fig. 3, the function key 15 may be a physical silk-screen key based on infrared, capacitive, or resistive touch technology; the user only needs to touch the function key 15, and the function key 15 responds to the touch action to generate a corresponding touch event, whereupon the processing unit 14 provides a corresponding visual output on the display unit 11 according to the type of the touch event.
It is understood that in other embodiments the function keys 15 are not limited to the physical keys shown in fig. 3; the function keys 15 may also be virtual keys disposed at the bottom of the touch screen unit 12, that is, the touch screen unit 12 and the function keys 15 are different functional areas divided on the same touch panel.
Referring to fig. 4, a flowchart of an input control method 200 for the terminal 10 in fig. 2 according to a first embodiment of the present invention is shown.
The input control method 200 comprises the following steps:
Step S10: in response to the triggering operation of the function key 15, start timing.
When the user triggers the function key 15 on the terminal 10, the timing unit 16 is started in response to the triggering operation of the function key 15. Specifically, in this embodiment, when the user triggers the function key 15 on the terminal 10, the processing unit 14 starts the timing unit 16 in response to the triggering operation of the function key 15, but does not immediately report the function key event.
The situations in which the user triggers the function key 15 on the terminal 10 generally fall into two types: in one, the user purposefully triggers the function key 15 to start the function corresponding to the function key 15; in the other, the user touches the function key 15 by mistake, for example when the user intends to perform a pull-up gesture from the bottom of the touch screen unit 12 but, because of the limited operating space between the touch screen unit 12 and the function key 15, brushes the function key 15 instead.
Step S20: determine whether a sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121 is received on the touch screen unit 12 within a preset time.
Within the preset time after timing starts, the processing unit 14 determines, from the signal generated by the touch screen unit 12 in response to the user's touch operation on it, whether a sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121 is received on the touch screen unit 12. Specifically, after the timing unit 16 starts timing, if the touch screen unit 12 detects a touch operation within the preset time, it responds to that operation, converts it into an electrical signal, and transmits it to the processing unit 14; the processing unit 14 then determines whether the touch operation is a sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121, that is, whether it is a pull-up gesture. If so, the process proceeds to step S30; if not, the process proceeds to step S40.
Step S30: report a sliding gesture event when the sliding gesture is received on the touch screen unit within the preset time.
When the processing unit 14 determines that the touch operation is a sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121, that is, determines that the touch operation is a pull-up gesture, the sliding gesture event is reported.
In an operating system of a current terminal, for example, in an intelligent terminal adopting an android system, when an input device detects an event, the detected event is reported to a distribution filter layer (inputflag.so) for processing, then the distribution filter layer distributes the reported event to a Framework service layer (Framework service.jar), and finally the Framework service layer distributes the reported event to an application program (APP) corresponding to a current interface for processing by the APP. It is understood that in other operating systems, when an input device detects an event, a corresponding input event is generated, the input device recognizes the detected input event or the processor recognizes the detected input event, and the corresponding input device or processor reports the recognized input event to a corresponding functional layer of the operating system.
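The layered reporting path described above (input device to distribution/filter layer to framework service layer to the foreground application) can be modelled as a simple chain of dispatchers. The layer names below are illustrative, not the real Android InputFlinger API.

```python
class Layer:
    """One stage of the event path the text describes: each layer records
    the events it sees and forwards them to the next layer, the way a
    reported gesture event travels up to the current application."""

    def __init__(self, name, next_layer=None):
        self.name = name
        self.next_layer = next_layer
        self.seen = []

    def dispatch(self, event):
        self.seen.append(event)
        if self.next_layer is not None:
            self.next_layer.dispatch(event)

# Build the chain bottom-up, then report one event into it.
app = Layer("application")
framework = Layer("framework_service", app)
filter_layer = Layer("distribution_filter", framework)

filter_layer.dispatch("slide_gesture")
```

After the dispatch, every layer in the chain has seen the same event, mirroring how a reported event reaches the application that handles the current interface.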
It can be understood that, in this embodiment, the step of reporting the sliding gesture event may be that the touch screen unit 12 or the processing unit 14 reports the pull-up gesture event (sliding gesture event) to the corresponding functional layer of the operating system. Specifically, taking an intelligent terminal running the Android system as an example, when the processing unit 14 determines that the touch operation is a sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121, that is, determines that the touch operation is a pull-up gesture, the touch screen unit 12 or the processing unit 14 reports the sliding gesture event to the corresponding functional layer of the operating system, for example the framework service layer. The function corresponding to the pull-up gesture is then executed by the corresponding application, and the processing unit 14 controls the display unit 11 to display the corresponding visual output.
Step S40: report a function key event when no sliding gesture has been received on the touch screen unit by the time the timing reaches the preset time.
When no sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121 has been received on the touch screen unit 12 by the preset time after timing starts, a function key event is reported. The processing unit 14 makes this determination from the signals generated by the touch screen unit 12 in response to the user's touch operations, and it mainly covers two cases. In the first, no touch operation at all is received on the touch screen unit 12 within the preset time, so it can be determined that no sliding gesture from the first side 121 toward the second side 122 was received within the preset time. In the second, a trigger is received on the touch screen unit 12 within the preset time, but the corresponding touch operation is not a sliding gesture from the first side 121 toward the second side 122; the processing unit 14 then likewise determines that no such sliding gesture was received within the preset time.
Similarly, in this embodiment, the step of reporting the function key event may be that the function key 15 or the processing unit 14 reports the function key event triggered by the function key 15 to the corresponding functional layer of the operating system. Specifically, taking an intelligent terminal running the Android system as an example, when no sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121 has been received on the touch screen unit 12 by the preset time after timing starts, the function key 15 or the processing unit 14 reports the function key event to the corresponding functional layer of the operating system. The function corresponding to the function key event is then executed by the corresponding application program, and the processing unit 14 controls the display unit 11 to display the corresponding visual output, for example returning to the previous user interface, returning to the home interface, or opening the menu user interface.
The input control method 200 and the terminal 10 using it in the first embodiment of the present invention implement a mis-touch self-check when the user triggers the function key 15: they can determine whether the triggering operation of the function key 15 is an action of the user deliberately triggering the function key 15 to execute a corresponding operation, or a false trigger produced while the user performs an upward sliding touch gesture at the bottom of the touch screen unit 12. This alleviates the false-triggering problem during upward sliding gestures at the bottom of the touch screen and improves the user experience. Moreover, in the input control method 200 of the first embodiment, responding to the triggering operation of the function key 15 starts timing first rather than reporting the function key event immediately, so no additional program for intercepting and delaying function key events is needed; this optimizes the processing speed, shortens the processing response time, and further enhances the user experience.
The control process and principle of the input control method 200 of the terminal 10 according to the first embodiment of the present invention are further described below with reference to fig. 5 to fig. 8B.
In this example, the terminal 10 is a portable intelligent terminal, the function keys 15 are physical silk-screen keys, the user performs a pull-up gesture at the bottom of the touch screen unit 12 to activate the shortcut button bar, and the function key 15 triggered by the user is the menu key 15A.
Referring to fig. 5, the operation corresponding to the pull-up gesture is to start a shortcut button bar, and a user interface after the shortcut button bar is started is as shown in fig. 5, in which auser interface 11A having a plurality of shortcut start icons (icons) is pulled out from the bottom of adisplay unit 11 for a user to select a corresponding shortcut start Icon to start a corresponding function; in this embodiment, the shortcut starticon user interface 11A may be laid out on the lower half display interface of thedisplay unit 11.
Referring to fig. 6, the function key 15 is a menu key 15A. When the menu key 15A is triggered, the corresponding user interface is as shown in fig. 6: a user interface 11B with a plurality of longitudinally arranged menu options is displayed on the display unit 11, so that the user can click different menu options to enter the corresponding sub-menu or trigger the corresponding function. In this embodiment, the user interface 11B of longitudinally arranged menu options may be laid out over the whole display interface of the display unit 11.
It should be understood that fig. 5 and fig. 6 only illustrate examples of the user interface responses corresponding to the two different triggering operations (the pull-up gesture and the function key 15) and do not limit the present invention.
Referring to fig. 7, suppose the user intends to perform the pull-up gesture but, through carelessness or because of the small operable distance between the function key and the touch screen, triggers the menu key 15A first (see the touch position and touch trajectory indicated by the dotted arrow in fig. 7). The processing unit 14 responds to the triggering of the menu key 15A by starting the timing unit 16, but does not immediately report the function key event. Because the touch trajectory of the pull-up gesture enters the sensing area of the touch screen unit 12 immediately after the menu key 15A is triggered, the touch screen unit 12 detects the user's touch operation within the preset time and transmits it to the processing unit 14. The processing unit 14 determines that the touch operation is a slide gesture from the first side 121 toward the second side 122 opposite to the first side 121, that is, a pull-up gesture, and reports the slide gesture event to the corresponding functional layer of the operating system. The shortcut startup bar application then executes the function corresponding to the pull-up gesture, and the processing unit 14 controls the display unit 11 to pull out and display the user interface 11A with a plurality of shortcut start icons on the lower half of its display interface, so that the user can select a shortcut start icon to launch the corresponding function.
Referring to fig. 8A, suppose the user triggers the menu key 15A deliberately (see the touch position and touch trajectory indicated by the dashed circle in fig. 8A) in order to open the user interface of longitudinally arranged menu options corresponding to the menu key 15A. The processing unit 14 responds to the triggering of the menu key 15A by starting the timing unit 16, but does not immediately report the function key event. Since the user performs no other triggering operation after triggering the menu key 15A, the touch screen unit 12 detects no touch operation within the preset time; when the timing reaches the preset time, it is determined that no slide gesture from the first side 121 toward the second side 122 opposite to the first side 121 has been received on the touch screen unit, and the function key event is reported to the corresponding functional layer of the operating system. The menu bar application then executes the function corresponding to the menu key 15A, and the processing unit 14 controls the display unit 11 to display the user interface 11B of longitudinally arranged menu options on its display interface, so that the user can click different menu options to enter the corresponding sub-menu or trigger the corresponding function.
Referring to fig. 8B, suppose the user triggers the menu key 15A on the terminal 10 and then performs the pull-up gesture (see the touch position and touch trajectory indicated by the dashed circle and dashed arrow in fig. 8B), but with a significant time interval between the two triggering operations, for example an interval greater than the preset time. Fig. 8B illustrates how the terminal 10 responds with the corresponding user interfaces.
The processing unit 14 responds to the triggering operation of the function key 15 by starting the timing unit 16, but does not immediately report the function key event. Since the user does not perform the pull-up gesture within the preset time after triggering the menu key 15A, the touch screen unit 12 detects no touch operation within the preset time; when the timing reaches the preset time, it is determined that no slide gesture from the first side 121 toward the second side 122 opposite to the first side 121 has been received on the touch screen unit, and the function key event is reported to the corresponding functional layer of the operating system. The menu bar application then executes the function corresponding to the menu key 15A, and the processing unit 14 controls the display unit 11 to display the user interface of longitudinally arranged menu options on its display interface, so that the user can click different menu options to enter the corresponding sub-menu or trigger the corresponding function.
It can be understood that when the user performs the pull-up gesture after this obvious time interval, the touch screen unit 12 responds normally to the touch action, generates the corresponding pull-up gesture event, and reports the slide gesture event to the corresponding functional layer of the operating system. The shortcut startup bar application then executes the function corresponding to the pull-up gesture, and the processing unit 14 controls the display unit 11 to display the user interface with a plurality of shortcut start icons on the lower half of its display interface, so that the user can select a shortcut start icon to launch the corresponding function.
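As an illustrative aid (not part of the claimed embodiment), the hold-back-and-time logic of steps S10 to S40 can be sketched as a small state machine. The class name, callback names, millisecond clock, and tick-driven expiry below are all assumptions made for the sketch:

```python
PRESET_TIME_MS = 10  # the description suggests 8 to 100 ms, preferably about 10 ms

class InputController:
    """Sketch of the first embodiment: hold back the function key event,
    and let a pull-up gesture arriving inside the window override it."""

    def __init__(self, report_key_event, report_slide_event,
                 preset_ms=PRESET_TIME_MS):
        self.report_key_event = report_key_event
        self.report_slide_event = report_slide_event
        self.preset_ms = preset_ms
        self._pending_key_since = None  # timestamp of the held-back key event

    def on_function_key(self, now_ms):
        # S10: start timing; do NOT report the function key event yet.
        self._pending_key_since = now_ms

    def on_touch(self, now_ms, is_pull_up_gesture):
        # S20/S30: a pull-up gesture inside the window wins; the pending
        # key press is treated as a false touch and discarded.
        if (self._pending_key_since is not None
                and now_ms - self._pending_key_since < self.preset_ms
                and is_pull_up_gesture):
            self._pending_key_since = None
            self.report_slide_event()

    def on_tick(self, now_ms):
        # S40: the window expired with no gesture, so the key press was
        # deliberate; report the function key event now.
        if (self._pending_key_since is not None
                and now_ms - self._pending_key_since >= self.preset_ms):
            self._pending_key_since = None
            self.report_key_event()
```

With a 10 ms window, a menu-key press followed at once by a pull-up gesture yields only the slide event, while an isolated press yields the key event once the window expires.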
Referring to fig. 9, a flowchart of an input control method 202 for the terminal 10 in fig. 2 according to a second embodiment of the present invention is shown. The input control method 202 comprises the following steps:
S10, responding to the triggering operation of the function key, and starting timing;
S20, judging whether a sliding gesture from the first side edge toward a second side edge opposite to the first side edge is received on the touch screen unit within a preset time;
S32, when the touch screen unit receives the sliding gesture within the preset time, reporting a sliding gesture event and clearing the function key event corresponding to the triggering operation of the function key;
S40, reporting a function key event when the sliding gesture is not received on the touch screen unit when the timing reaches the preset time.
The steps S10, S20 and S40 are the same as those in the first embodiment, and are not repeated herein.
Specifically, in step S32, when the processing unit 14 determines that the touch operation is a slide gesture from the first side 121 toward the second side 122 opposite to the first side 121, that is, a pull-up gesture, the slide gesture event is reported and the function key event corresponding to the triggering operation of the function key 15 is cleared. It can be understood that when the touch screen unit receives the pull-up gesture within the preset time, the triggering of the function key 15 by the user can be determined to be a false touch; the function key event generated by the function key 15 in response to that trigger therefore need not be reported, and the cached function key event corresponding to the triggering operation of the function key 15 is cleared.
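The cache-clearing of step S32 can be pictured as purging the held function key event from an event queue before the slide gesture event is enqueued. The event dictionaries and queue structure below are assumptions for illustration only:

```python
from collections import deque

def handle_gesture_within_window(event_cache, slide_event):
    """S32 sketch: drop any cached function-key event, then enqueue the
    slide gesture event in its place."""
    purged = deque(e for e in event_cache if e.get("type") != "function_key")
    purged.append(slide_event)
    return purged

# A menu-key event was cached when the key fired; the pull-up gesture
# arriving inside the window replaces it.
cache = deque([{"type": "function_key", "key": "menu"}])
cache = handle_gesture_within_window(cache, {"type": "slide", "direction": "up"})
```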
The input control method 202, and the terminal 10 using it, according to the second embodiment of the present invention can determine whether the triggering operation of the function key 15 is a deliberate action intended to execute the corresponding operation or a false trigger produced while the user performs an upward slide touch gesture at the bottom of the touch screen unit 12, and, after the function key 15 is determined to have been triggered by mistake, can clear the function key event corresponding to the triggering operation of the function key 15 so as to clear the system cache.
It can be understood that, in other embodiments, when the function key 15 responds to the trigger operation of the user, the corresponding function key event is neither generated nor reported immediately; in step S40, when it is determined that the sliding gesture has not been received on the touch screen unit when the timing reaches the preset time, the function key event is generated and then reported. In this variant, the function key event is generated only when the processing unit 14 determines that the sliding gesture has not been received on the touch screen unit when the timing reaches the preset time, which further shortens the response time of the system and enhances the user experience.
Referring to fig. 10, a flowchart of an input control method 203 for the terminal 10 in fig. 2 according to a third embodiment of the present invention is shown. Steps S20, S30 and S40 of the input control method 203 in the third embodiment are the same as those in the first embodiment and are not repeated here; the difference is that step S10 comprises:
S131, responding to the triggering operation of the function key, and judging whether the time interval between the pressing event and the lifting event when the function key is triggered is smaller than a preset threshold;
S132, starting timing when the time interval between the pressing event and the lifting event when the function key 15 is triggered is smaller than the preset threshold;
S133, reporting the function key event when the time interval between the pressing event and the lifting event when the function key 15 is triggered is greater than or equal to the preset threshold.
When a user triggers a function key 15 on the terminal 10, in response to the triggering operation the duration of the trigger is judged first. If the duration is greater than or equal to a preset threshold, the user's intention is taken to be triggering the operation corresponding to the function key 15; if the duration is less than the preset threshold, the user is likely to have touched the function key 15 unintentionally or in the course of another operation. The preset threshold may be 0.1 to 1 second; further, the preset threshold may be 0.1 to 0.3 seconds.
Specifically, in response to the triggering operation of the function key 15, it is determined whether the time interval between the press (Keydown) event and the lift (Keyup) event when the function key 15 is triggered is smaller than a preset threshold. The Keydown event is the event generated when the function key 15 first senses the trigger; the Keyup event is the event generated when the function key 15 senses that the user's finger has left its sensing area, for example when the finger is lifted or slides past the sensing boundary of the function key 15.
When it is determined that the time interval between the Keydown event and the Keyup event when the function key 15 is triggered is smaller than the preset threshold, timing is started and the method proceeds to step S20; when the time interval between the Keydown event and the Keyup event is greater than or equal to the preset threshold, the function key event is reported.
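Steps S131 to S133 amount to classifying the press by its Keydown-to-Keyup interval. A minimal sketch follows; the 200 ms default is an assumption chosen from the 0.1 to 0.3 second range mentioned above, and the return labels are illustrative:

```python
PRESET_THRESHOLD_MS = 200  # within the 0.1-0.3 s range given in the text

def classify_key_trigger(keydown_ms, keyup_ms,
                         threshold_ms=PRESET_THRESHOLD_MS):
    """Return 'start_timing' for a short press (possible false touch, S132)
    or 'report_key_event' for a longer, deliberate press (S133)."""
    interval = keyup_ms - keydown_ms
    if interval < threshold_ms:
        return "start_timing"       # hold the event, wait for a gesture
    return "report_key_event"       # treat as intentional, report now
```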
The input control method 203, and the terminal 10 using it, according to the third embodiment of the present invention can automatically recognize a false trigger of the function key 15 during a bottom-of-screen slide-up touch gesture, and can directly determine from the duration of the trigger whether the user touched the function key 15 by mistake, which further increases the determination accuracy of the input control method 203 and improves the user experience.
Referring to fig. 11, a flowchart of an input control method 204 for the terminal 10 in fig. 2 according to a fourth embodiment of the present invention is shown. Steps S10, S30 and S40 of the input control method 204 in the fourth embodiment are the same as those in the first embodiment and are not repeated here; the difference is that step S20 comprises:
S241, judging whether a trigger is received on the touch screen unit within the preset time;
S242, when no trigger is received on the touch screen unit within the preset time, determining that the sliding gesture is not received on the touch screen unit when the timing reaches the preset time;
S243, when a trigger is received on the touch screen unit within the preset time, judging whether the trigger area corresponding to the trigger is the first side edge of the touch screen;
S244, when the trigger area corresponding to the trigger received on the touch screen unit within the preset time is the first side edge of the touch screen, determining that the sliding gesture is received on the touch screen unit within the preset time.
After the timing unit 16 is started, when it is determined that no trigger is received on the touch screen unit 12 within the preset time, it is determined that the sliding gesture is not received on the touch screen unit 12 when the timing reaches the preset time.
When a trigger is received on the touch screen unit 12 within the preset time, it is judged whether the trigger area corresponding to that trigger is the first side 121 of the touch screen unit 12; if so, it is determined that the sliding gesture is received on the touch screen unit 12 within the preset time.
In this embodiment, the input control method 204 only needs the trigger area corresponding to the trigger received on the touch screen unit 12 within the preset time to be the first side 121 of the touch screen unit 12 in order to treat that trigger as a pull-up gesture and proceed to step S30. This further increases the response speed of the input control method 204: even after the user has touched the function key 15 by mistake, the function corresponding to the pull-up gesture is launched directly as long as the first side 121 of the touch screen unit 12 is triggered within the preset time.
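Steps S241 to S244 reduce to a cheap test on where the first report point lands. In the sketch below the screen's y axis grows downward, so the first side (bottom edge) is the band of largest y values; the band width is an assumption for illustration:

```python
def touch_started_on_first_side(touch_y_px, screen_height_px, edge_band_px=40):
    """S243/S244 sketch: a touch whose first report point falls in a narrow
    band along the bottom (first side) of the screen counts as the start
    of a pull-up gesture."""
    return touch_y_px >= screen_height_px - edge_band_px
```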
Further, referring to fig. 12 together, in an embodiment, the input control method for the terminal 10 in fig. 2 may further include the steps of:
S51, judging whether the sliding track corresponding to the sliding gesture received on the touch screen unit within the preset time meets a preset condition;
S52, withdrawing the reported sliding gesture event when the sliding track corresponding to the sliding gesture does not meet the preset condition.
The preset condition may be that the length of the sliding track is greater than a preset value, or that the direction of the sliding track runs from the first side 121 of the touch screen unit 12 toward the second side 122.
In this embodiment, if during the pull-up gesture the sliding track does not meet the preset condition, for example its length is insufficient, the terminal 10 may withdraw the reported sliding gesture event midway, cancel the function corresponding to the pull-up gesture before it executes, or immediately stop the function corresponding to the pull-up gesture that is executing. For example, when the operation corresponding to the pull-up gesture is to pull out the user interface with a plurality of shortcut start icons at the bottom of the display unit 11, if the sliding track of the pull-up gesture does not meet the preset condition, that user interface may be retracted to the bottom of the display unit 11 and the operation of calling up the shortcut button bar cancelled or deemed failed.
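The preset condition of step S51 can be checked from the recorded track points. The minimum length and the downward-growing y axis below are assumptions for the sketch:

```python
def slide_track_valid(track, min_length_px=100):
    """S51 sketch: the track must be long enough and must head from the
    first (bottom) side toward the opposite side, i.e. end higher on the
    screen than it started (y decreases)."""
    if len(track) < 2:
        return False
    (x0, y0), (x1, y1) = track[0], track[-1]
    length = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return length >= min_length_px and y1 < y0
```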
In the above embodiments, the preset time may be 8 to 100 milliseconds; further, the preset time may be 10 to 15 milliseconds. In one embodiment, the preset time is 10 milliseconds.
Referring again to fig. 2, the memory unit 13 of the terminal 10 is further configured to store a computer program operable on the processing unit 14, and the processing unit 14 is configured to execute the computer program; the processing unit 14 implements the steps of the input control method in any of the above embodiments when executing the computer program.
Specifically, the processing unit 14 is configured to implement the following steps when executing the computer program:
Step S10, responding to the trigger operation of the function key 15, and starting timing;
Step S20, judging whether a sliding gesture from the first side 121 toward the second side 122 opposite to the first side 121 is received on the touch screen unit 12 within a preset time;
Step S30, reporting a sliding gesture event when the sliding gesture is received on the touch screen unit within the preset time;
Step S40, reporting a function key event when the sliding gesture is not received on the touch screen unit when the timing reaches the preset time.
The terminal 10 of the present invention implements a false-touch self-check when the user triggers the function key 15: it determines whether the triggering of the function key 15 is a deliberate action intended to execute the corresponding operation, or a false trigger produced while the user performs an upward slide touch gesture at the bottom of the touch screen unit 12, thereby mitigating the false-trigger problem during bottom-of-screen slide-up gestures and improving the user experience. Meanwhile, when responding to the triggering operation of the function key 15, the terminal 10 first starts timing and does not report the function key event, so no additional program for intercepting and delaying the function key event is needed; this optimizes the processing speed, shortens the processing response time, and enhances the user experience.
Further, the processing unit 14 is further configured to implement step S32 when executing the computer program: reporting a sliding gesture event when the touch screen unit receives the sliding gesture within the preset time, and clearing the function key event corresponding to the triggering operation of the function key.
Further, the processing unit 14 is further configured to implement the following steps when executing the computer program:
S131, responding to the triggering operation of the function key, and judging whether the time interval between the pressing event and the lifting event when the function key is triggered is smaller than a preset threshold;
S132, starting timing when the time interval between the pressing event and the lifting event when the function key is triggered is smaller than the preset threshold;
S133, reporting the function key event when the time interval between the pressing event and the lifting event when the function key is triggered is greater than or equal to the preset threshold.
Further, the processing unit 14 is further configured to implement the following steps when executing the computer program:
S241, judging whether a trigger is received on the touch screen unit within the preset time;
S242, when no trigger is received on the touch screen unit within the preset time, determining that the sliding gesture is not received on the touch screen unit when the timing reaches the preset time;
S243, when a trigger is received on the touch screen unit within the preset time, judging whether the trigger area corresponding to the trigger is the first side edge of the touch screen;
S244, when the trigger area corresponding to the trigger received on the touch screen unit within the preset time is the first side edge of the touch screen, determining that the sliding gesture is received on the touch screen unit within the preset time.
Further, the processing unit 14 is further configured to implement the following steps when executing the computer program:
S51, judging whether the sliding track corresponding to the sliding gesture received on the touch screen unit within the preset time meets a preset condition;
S52, withdrawing the reported sliding gesture event when the sliding track corresponding to the sliding gesture does not meet the preset condition.
The preset time may be determined according to the distance between the lifting-event sensing boundary of the function key 15 and the touch-sensing boundary on the touch screen unit 12 near the first side 121. The smaller that distance, the shorter the preset time may be, the shorter the delay the terminal 10 imposes on the sliding gesture (pull-up gesture), and the better the user experience.
Specifically, in an embodiment, the distance between the lifting-event sensing boundary of the function key 15 and the touch-sensing boundary on the touch screen unit 12 near the first side 121 is 0 to 5 mm; further, this distance may be 0.5 to 2 mm.
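One way to read the relation just described (smaller gap, shorter window) is to size the window from the time a swiping finger needs to cross the gap. The assumed swipe speed and safety margin below are illustrative values, not figures from the description:

```python
def suggested_preset_time_ms(gap_mm, finger_speed_mm_per_ms=0.1, margin=2.0):
    """Window = time to cross the gap at an assumed swipe speed, times a
    safety margin. A 0.5 mm gap at 0.1 mm/ms with a 2x margin gives 10 ms,
    matching the preferred value in the text."""
    return gap_mm / finger_speed_mm_per_ms * margin
```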
In this embodiment, the preset time may be 8 to 100 milliseconds; further, the preset time may be 10 to 15 milliseconds.
In an embodiment the preset time is 10 milliseconds: the processing unit 14 can complete the determination of whether the triggering of the function key 15 is a false trigger within 10 milliseconds, and a 10-millisecond delay is imperceptible to the user, giving a good user experience.
The present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the input control method described above.
In the description herein, references to the description of the term "one embodiment," "another embodiment," or "first through xth embodiments," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, method steps, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (9)

CN201710362473.XA2017-05-192017-05-19Terminal, input control method thereof, and computer-readable storage mediumActiveCN107179849B (en)


Publications (2)

Publication Number | Publication Date
CN107179849A (en) | 2017-09-19
CN107179849B (en) | 2021-08-17

