Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
In the description that follows, specific embodiments of the present application will be described with reference to steps and symbols executed by one or more computers, unless indicated otherwise. Accordingly, such steps and operations will at times be referred to as being computer-executed, meaning that a processing unit of the computer manipulates electronic signals representing data in a structured form. This manipulation transforms the data or maintains it at locations in the computer's memory system, which reconfigures or otherwise alters the operation of the computer in a manner well understood by those skilled in the art. The data structures in which the data is maintained are physical locations of the memory that have particular properties defined by the format of the data. However, while the principles of the application are described in the foregoing context, this is not intended to be limiting, and those of ordinary skill in the art will recognize that various of the steps and operations described below may also be implemented in hardware.
The principles of the present application may be employed in numerous other general-purpose or special-purpose computing or communication environments or configurations. Examples of well-known computing systems, environments, and configurations that may be suitable for use with the application include, but are not limited to, hand-held telephones, personal computers, servers, multiprocessor systems, microprocessor-based systems, mainframe computers, and distributed computing environments that include any of the above systems or devices.
The details will be described below separately.
This embodiment will be described from the perspective of a touch operation device, which may be integrated in an electronic device such as a mobile Internet device (e.g., a smart phone or a tablet computer), and the like.
Referring first to FIG. 1, an electronic device includes a curved display screen 10, and the curved display screen 10 may include a main display area 11 and a curved display area 12 connected to the main display area 11. The main display area 11 has a touch function and a display function, that is, the electronic device can display information such as characters and pictures in the main display area 11, and a user can perform touch operations such as click operations and sliding operations in the main display area 11 to control the electronic device. The curved display area 12 also has a touch function and a display function, and a user can perform a touch operation such as a click operation or a slide operation in the curved display area 12. In practical applications, the electronic device may include two curved display areas 12, namely a left curved display area and a right curved display area.
In some embodiments, the curved display area 12 may not have a display function, that is, the curved display area 12 only has a touch function as an auxiliary touch area of the main display area 11.
In some embodiments, the curved display area 12 may also have a display function, that is, the curved display area 12 may be a side touch display area. That is, the electronic device may display information such as text, pictures, etc. through the curved display area 12. At this time, the curved display area 12 may be an auxiliary display area of the main display area 11.
Based on the above electronic device, in some embodiments, please refer to FIG. 2, which is a schematic flow chart of a touch operation method provided in the present embodiment. The method includes the following steps:
Step S101, detecting a touch operation triggered by a user in a curved surface display area.
The touch operation may be touch information generated by a user touching the touch display screen with a finger. The touch operation may be a click operation, such as a single click, a double click, or a long press, or a slide operation, such as sliding up, sliding down, sliding left, or sliding right.
In an embodiment, the display screen of the electronic device may be a capacitive touch screen, which may be divided into a main touch screen and a curved touch screen, and the user generates an electrical signal by touching a touch area. The touch area is composed of a plurality of transverse and longitudinal electrode arrays. When a human body touches the touch screen, a capacitance is formed between the human body and the touch screen (the human body is grounded); for high-frequency current this capacitance acts as a direct conductor, so the capacitance of the human body is superposed on the electrode capacitance of the touch screen. A processor of the touch screen sequentially detects the capacitance values of the transverse and longitudinal electrode arrays, thereby determining the transverse and longitudinal coordinates. Specifically, when a user's finger (or another part of the human body) touches the touch screen, the electric field of the human body draws a small current from the touch point through the finger (or that other part of the body). The current flows out of the electrodes at the four corners of the touch screen, and the current flowing through each of the four electrodes is in direct proportion to the distance from the finger to the corresponding corner, so information such as the position of the touch point can be obtained by accurately calculating the ratio of the four currents.
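As an illustration of the calculation described above, the following is a minimal sketch (in Python, purely for illustration) of estimating a touch point from the four corner currents, assuming a simplified surface-capacitive model in which the share of current drawn through the right-hand and bottom electrodes grows with how far right and how far down the touch point lies; the function name and the linear model are assumptions, not part of the embodiment.

```python
# Hypothetical sketch: estimating a touch point from the currents drawn
# through the four corner electrodes of a surface-capacitive panel.
# Assumes a simplified linear model; real controllers use calibrated,
# non-linear corrections.

def estimate_touch_point(i_tl, i_tr, i_bl, i_br, width, height):
    """Return an (x, y) estimate from the four corner currents.

    i_tl, i_tr, i_bl, i_br: currents at the top-left, top-right,
    bottom-left and bottom-right electrodes (equal, arbitrary units).
    width, height: panel dimensions in the screen coordinate system.
    """
    total = i_tl + i_tr + i_bl + i_br
    if total == 0:
        return None  # no touch detected
    # The share of current flowing through the right-hand (bottom) electrodes
    # approximates how far right (down) the touch point lies.
    x = width * (i_tr + i_br) / total
    y = height * (i_bl + i_br) / total
    return (x, y)

# Example: a touch near the bottom-right corner draws most current there.
print(estimate_touch_point(0.1, 0.2, 0.2, 0.5, 1080, 2400))
```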
Step S102, acquiring the touch position and the touch manner of the touch operation.
The touch position may be in the left curved surface display area or the right curved surface display area. The touch manner may be a click operation or a slide operation.
The touch position information may be represented by coordinate values. For example, a coordinate system of the touch display screen may be established, and when a user performs a click operation on the touch display screen, the coordinate information of the click operation (such as a horizontal axis coordinate x and a vertical axis coordinate y) may be detected and recorded. The click operation may be a single click or multiple clicks, for example a double click.
In an embodiment, if the touch manner of the user is a sliding operation, a touch trajectory of the touch operation may also be acquired.
For example, when a user performs a click touch or a continuous touch (for example, a sliding gesture drawn on the touch screen), the capacitance values of the partitions of the touch screen can be compared to obtain the maximum of adjacent capacitance change values, and the partition with the maximum change lies within the range of maximum sensitivity. In this way a series of touch point coordinates can be found, and the touch trajectory can then be identified according to a geometric algorithm. It should be noted that this embodiment is described by taking a capacitive touch screen as an example, but the embodiment is not limited thereto.
That is, the step of acquiring the touch trajectory may specifically include: determining the initial touch point of the touch operation as a valid touch point, and identifying all coordinate points in the area covered by the track from the initial touch point to the termination touch point.
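A minimal sketch of how such a trajectory might be collected from sampled touch points is given below; the sampling format and function name are assumptions made for illustration.

```python
# Hypothetical sketch: collecting a touch trajectory from sampled touch
# points, treating the initial touch point as the valid starting point and
# keeping every coordinate reported until the finger is lifted.

def collect_trajectory(samples):
    """samples: iterable of (x, y) points reported while the finger is down."""
    trajectory = []
    for point in samples:
        if not trajectory:
            # the initial touch point is taken as the valid touch point
            trajectory.append(point)
        elif point != trajectory[-1]:
            trajectory.append(point)  # ignore repeated identical samples
    return trajectory  # from the initial touch point to the termination point

print(collect_trajectory([(5, 100), (5, 100), (6, 140), (7, 180), (7, 220)]))
```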
Step S103, acquiring a touch instruction according to the touch position and the touch manner.
In the embodiment of the present application, different combinations of touch position and touch manner are in a one-to-one correspondence with touch instructions, that is, one touch position together with one touch manner corresponds to one touch instruction, and this correspondence may be preset. For example, when the touch position is in the lower portion of the curved surface display area and the touch manner is a single click, the corresponding touch instruction may be returning to the main interface; when the touch position is in the upper portion of the curved surface display area and the touch manner is a double click, the corresponding touch instruction may be opening a multi-task interface, and the like. The touch instruction is used for instructing the electronic device to perform a corresponding operation.
In an embodiment, the touch position may include a position value, such as a coordinate value, of the click operation in the curved surface display area, and it may be determined whether the coordinate value falls within the position value range of a preset area (for example, the curved surface display area may be divided into a first curved surface display area and a second curved surface display area).
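The following sketch illustrates one way such a preset one-to-one correspondence could be represented, assuming illustrative region boundaries and instruction names that are not taken from the embodiment.

```python
# Hypothetical sketch of the one-to-one mapping from (touch position, touch
# manner) to a touch instruction. The region boundaries and instruction names
# are illustrative assumptions, not values taken from the embodiment.

UPPER_REGION = range(0, 1200)     # y values of the upper curved area (assumed)
LOWER_REGION = range(1200, 2400)  # y values of the lower curved area (assumed)

INSTRUCTION_TABLE = {
    ("lower", "single_click"): "return_to_main_interface",
    ("upper", "double_click"): "open_multitask_interface",
}

def region_of(y):
    if y in UPPER_REGION:
        return "upper"
    if y in LOWER_REGION:
        return "lower"
    return None

def touch_instruction(y, manner):
    return INSTRUCTION_TABLE.get((region_of(y), manner))

print(touch_instruction(1500, "single_click"))  # return_to_main_interface
```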
Step S104, performing a touch operation on the electronic device according to the touch instruction.
After acquiring the touch instruction, the electronic device can execute the touch instruction, perform the touch operation on the electronic device according to the touch instruction, and further display an interface corresponding to the touch instruction on the display screen of the electronic device. For example, if the touch instruction acquired by the electronic device is a Bluetooth opening instruction, the electronic device is controlled to open Bluetooth; if the acquired touch instruction is a do-not-disturb mode opening instruction, the electronic device is controlled to open the do-not-disturb mode; and if the acquired touch instruction is a task manager instruction, the electronic device is controlled to enter a task manager interface.
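A minimal sketch of dispatching an acquired touch instruction to a corresponding device operation is shown below; the handler names stand in for whatever system interfaces the electronic device actually exposes and are assumptions.

```python
# Hypothetical sketch: dispatching an acquired touch instruction to the
# corresponding device operation. The handler names are placeholders, not
# real system APIs.

def open_bluetooth():        print("Bluetooth enabled")
def open_do_not_disturb():   print("Do-not-disturb mode enabled")
def open_task_manager():     print("Task manager interface shown")

HANDLERS = {
    "open_bluetooth": open_bluetooth,
    "open_do_not_disturb": open_do_not_disturb,
    "open_task_manager": open_task_manager,
}

def execute(instruction):
    handler = HANDLERS.get(instruction)
    if handler is not None:
        handler()  # perform the touch operation indicated by the instruction

execute("open_do_not_disturb")
```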
In the embodiment of the present application, the electronic device may be any intelligent electronic device, for example, a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
Therefore, a touch operation triggered by a user in the curved surface display area can be detected, the touch position and touch manner of the touch operation can be acquired, a touch instruction can be acquired according to the touch position and the touch manner, and the touch operation can be performed on the electronic device according to the touch instruction. With the method and the device of the present application, the interaction functions of the electronic device can be realized merely through user gestures on the curved surface display area, so that no virtual button needs to be arranged in the screen of the device; in addition, because interaction only requires gestures, physical keys are prevented from being damaged by repeated use, and the efficiency of interactive control of the electronic device is improved.
According to the above description of the embodiment, the touch operation method of the present application will be further described below.
Referring to FIG. 3, FIG. 3 is a schematic flow chart of another touch operation method according to an embodiment of the present disclosure, in which the curved display area includes a first sub-area, a second sub-area, and a third sub-area that are adjacent to each other in sequence. The method includes the following steps:
Step S201, detecting a touch operation triggered by a user in a curved surface display area.
The touch operation may be touch information generated by a user touching the touch display screen with a finger. The touch operation may be a click operation or a slide operation.
In an embodiment, the curved display area may include a first sub-area, a second sub-area, and a third sub-area that are adjacent to each other in sequence. For example, referring to FIG. 4, the curved display area 12 may include three sub-areas sequentially adjacent to each other, namely a first sub-area 120 (which may be referred to as a side top touch area), a second sub-area 121 (which may be referred to as a side middle touch area), and a third sub-area 122 (which may be referred to as a side bottom touch area).
In step S202, a touch position and a touch manner of the touch operation are acquired.
The touch position may be in the left curved surface display area or the right curved surface display area. The touch manner may be a click operation or a slide operation.
The touch position information may be represented by coordinate values. For example, a coordinate system of the touch display screen may be established, and when a user performs a click operation on the touch display screen, the coordinate information of the click operation (such as a horizontal axis coordinate x and a vertical axis coordinate y) may be detected and recorded. The click operation may be a single click or multiple clicks, for example a double click.
In step S203, the number of fingers used by the user to perform the touch operation is acquired.
In this embodiment, the touch operation of the user may be further divided into single-finger touch and two-finger touch, and in other embodiments the touch operation may further include three-finger touch, four-finger touch, and the like. In this way, more different touch instructions can be realized, which enriches the operations that a user can complete by touch. Therefore, when the touch operation triggered by the user in the curved surface display area is acquired, the number of fingers used in the touch operation can be further acquired.
Step S204, determining a target sub-area of the touch operation in the curved surface display area according to the touch position.
When the touch position is located in the curved display area 12, a target sub-area of the touch position in the curved display area needs to be further determined. A corresponding coordinate set may be determined according to the coordinates of the touch position, and the target sub-area corresponding to that coordinate set is then determined. For example, when the coordinates of the touch position are located in the coordinate set of the first sub-area 120, the target sub-area of the touch operation in the curved display area is determined to be the first sub-area 120; when the coordinates of the touch position are located in the coordinate set of the second sub-area 121, the target sub-area is determined to be the second sub-area 121; and when the coordinates of the touch position are located in the coordinate set of the third sub-area 122, the target sub-area is determined to be the third sub-area 122.
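The following sketch illustrates this coordinate-set lookup, assuming, purely for illustration, that the three sub-areas are distinguished by ranges of the vertical coordinate; the boundary values are assumptions.

```python
# Hypothetical sketch: mapping the y coordinate of a touch position to one of
# the three sub-areas of the curved display area. The boundary values are
# assumed; an implementation would use the coordinate sets configured for
# sub-areas 120, 121 and 122.

SUB_AREAS = [
    ("first_sub_area_120", range(0, 800)),      # side top touch area
    ("second_sub_area_121", range(800, 1600)),  # side middle touch area
    ("third_sub_area_122", range(1600, 2400)),  # side bottom touch area
]

def target_sub_area(y):
    for name, coordinate_set in SUB_AREAS:
        if y in coordinate_set:
            return name
    return None

print(target_sub_area(1000))  # second_sub_area_121
```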
In step S205, candidate touch instructions corresponding to the target sub-area are acquired.
When the touch position is located in the curved display area 12, specifically in the second sub-area 121, candidate touch instructions corresponding to the second sub-area 121, such as a screen capture instruction, a screen lock instruction, a return instruction, and the like, may be acquired.
In an embodiment, the three sub-areas may respectively correspond to different types of touch instructions; for example, they may respectively correspond to screen processing instructions (such as a screen locking instruction, an unlocking instruction, a brightness adjustment instruction, and a screen capture instruction), function returning instructions (such as an instruction for returning to the previous interface and an instruction for returning to the main interface), task management instructions (such as a task closing instruction and a task opening instruction), and the like.
Step S206, selecting a touch instruction corresponding to the number of fingers and the touch manner from the candidate touch instructions.
In some embodiments, the touch instruction may be obtained based on the number of fingers of the touch operation together with the touch position and the touch manner. For example, when the operation area is located in the first sub-area of the curved surface display area, the number of fingers used by the user to perform the touch operation is obtained, and the touch instruction is then acquired according to the number of fingers and the touch manner.
For example, a touch instruction set corresponding to the number of fingers may be obtained, and then a touch instruction corresponding to the touch manner may be selected from the touch instruction set.
Different numbers of fingers may correspond to different types of touch instructions, for example, one finger may correspond to return-function instructions, two fingers to screen-processing instructions, three fingers to image-processing instructions, and the like.
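A minimal sketch of this two-step selection, first by the number of fingers and then by the touch manner, is given below; the instruction names and categories are illustrative assumptions.

```python
# Hypothetical sketch: look up the instruction set for the number of fingers,
# then pick the instruction matching the touch manner. The categories and
# instruction names are illustrative assumptions.

CANDIDATES_BY_FINGERS = {
    1: {"single_click": "return_to_previous_interface",
        "double_click": "return_to_main_interface"},        # return-type
    2: {"single_click": "lock_screen",
        "long_press": "capture_screen"},                     # screen-processing
    3: {"single_click": "crop_image"},                       # image-processing
}

def select_instruction(finger_count, manner):
    instruction_set = CANDIDATES_BY_FINGERS.get(finger_count, {})
    return instruction_set.get(manner)

print(select_instruction(2, "long_press"))  # capture_screen
```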
In an embodiment, the touch manner may include a single click, a double click, a long press, and a slide, and when the touch manner is a slide, the sliding direction may be further acquired, such as sliding left, sliding up, sliding down, and the like. That is, selecting the touch instruction corresponding to the number of fingers and the touch manner from the candidate touch instructions may include:
when the touch manner is a slide, acquiring the sliding direction of the touch manner; and
selecting, from the candidate touch instructions, the touch instruction corresponding to the number of fingers and the sliding direction.
In an embodiment, when the touch manner is a slide, the sliding distance may further be acquired, and the corresponding touch instruction is then acquired according to the number of fingers and the sliding distance. That is, selecting the touch instruction corresponding to the number of fingers and the touch manner from the candidate touch instructions includes:
when the touch manner is a slide, acquiring the sliding distance of the touch manner; and
selecting, from the candidate touch instructions, the touch instruction corresponding to the number of fingers and the sliding distance.
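For example, the sliding direction and sliding distance may be derived from the touch trajectory and then matched, together with the number of fingers, against the candidate touch instructions. A minimal sketch follows; the distance threshold and instruction names are assumptions made for illustration.

```python
# Hypothetical sketch: when the touch manner is a slide, derive the sliding
# direction and distance from the trajectory and use them, together with the
# number of fingers, to pick a candidate instruction. Thresholds and
# instruction names are assumptions.

import math

def slide_features(trajectory):
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    # Screen coordinates: y grows downward, so dy < 0 means the finger moved up.
    direction = (("right" if dx > 0 else "left") if abs(dx) >= abs(dy)
                 else ("down" if dy > 0 else "up"))
    return direction, math.hypot(dx, dy)

SLIDE_TABLE = {
    (1, "up"): "open_multitask_interface",
    (2, "down"): "open_notification_panel",
}

def slide_instruction(finger_count, trajectory, min_distance=100):
    direction, distance = slide_features(trajectory)
    if distance < min_distance:
        return None  # too short to count as a slide
    return SLIDE_TABLE.get((finger_count, direction))

print(slide_instruction(1, [(10, 900), (12, 500)]))  # open_multitask_interface
```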
Step S207, performing a touch operation on the electronic device according to the touch instruction.
After acquiring the touch instruction, the electronic device can execute the touch instruction, perform the touch operation on the electronic device according to the touch instruction, and further display an interface corresponding to the touch instruction on the display screen of the electronic device. For example, if the touch instruction acquired by the electronic device is a Bluetooth opening instruction, the electronic device is controlled to open Bluetooth; if the acquired touch instruction is a do-not-disturb mode opening instruction, the electronic device is controlled to open the do-not-disturb mode; and if the acquired touch instruction is a task manager instruction, the electronic device is controlled to enter a task manager interface.
As can be seen from the above, the embodiment of the application can detect a touch operation triggered by a user in the curved surface display area, acquire the touch position and touch manner of the touch operation, acquire the number of fingers used by the user in the touch operation, determine a target sub-area of the touch operation in the curved surface display area according to the touch position, acquire candidate touch instructions corresponding to the target sub-area, select the touch instruction corresponding to the number of fingers and the touch manner from the candidate touch instructions, and perform the touch operation on the electronic device according to the touch instruction. The interaction functions of the electronic device can thus be realized merely through user gestures on the curved surface display area, so that no virtual button needs to be arranged in the screen of the device; in addition, because interaction only requires gestures, physical keys are prevented from being damaged by repeated use, and the efficiency of interactive control of the electronic device is improved.
To better implement the touch operation method provided in the embodiment of the present application, an embodiment of the present application further provides a device based on the touch operation method. The terms used have the same meanings as in the touch operation method above, and specific implementation details may refer to the description in the method embodiments.
Referring to FIG. 5, FIG. 5 is a schematic structural diagram of a touch operation device according to an embodiment of the present disclosure, where the touch operation device 30 includes: a detection module 301, a first obtaining module 302, a second obtaining module 303, and an execution module 304;
the detection module 301 is configured to detect a touch operation triggered by a user in the curved surface display area;
the first obtaining module 302 is configured to obtain a touch position and a touch manner of the touch operation;
the second obtaining module 303 is configured to obtain a touch instruction according to the touch position and the touch manner;
the execution module 304 is configured to perform a touch operation on the electronic device according to the touch instruction.
In an embodiment, the curved display area may include a first sub-area, a second sub-area, and a third sub-area that are adjacent to each other in sequence. As shown in FIG. 6, the second obtaining module 303 may include: a determining submodule 3031 and an obtaining submodule 3032;
the determining submodule 3031 is configured to determine, according to the touch position, a target sub-area of the touch operation in the curved surface display area;
the obtaining submodule 3032 is configured to obtain a touch instruction according to the target sub-area and the touch manner.
In an embodiment, the obtaining submodule 3032 may be specifically configured to obtain candidate touch instructions corresponding to the target sub-area, and select a touch instruction corresponding to the touch manner from the candidate touch instructions.
In an embodiment, as shown in FIG. 7, the touch operation device 30 may further include: a third obtaining module 305;
the third obtaining module 305 is configured to obtain the number of fingers used by the user during the touch operation before the second obtaining module 303 obtains the touch instruction according to the touch position and the touch manner;
the second obtaining module 303 is specifically configured to obtain a touch instruction according to the number of fingers, the touch position, and the touch manner.
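A minimal sketch mirroring this module structure is given below, purely as an illustration of how the detection, obtaining, and execution modules cooperate; the class, the method names, and the single mapping shown are assumptions, not the device's actual implementation.

```python
# Hypothetical sketch mirroring the module structure of the touch operation
# device 30: detection, first/second/third obtaining and execution are shown
# as plain methods; a real device would bind them to the touch controller and
# system services.

class TouchOperationDevice:
    def detect_touch(self, event):                 # detection module 301
        return event

    def acquire_position_and_manner(self, event):  # first obtaining module 302
        return event["position"], event["manner"]

    def acquire_instruction(self, position, manner, fingers=1):  # modules 303/305
        # The determining submodule 3031 and obtaining submodule 3032 would map
        # the position to a target sub-area and then select a candidate
        # instruction; a single illustrative mapping is shown here.
        return ("return_to_main_interface"
                if manner == "single_click" and fingers == 1 else None)

    def execute(self, instruction):                # execution module 304
        if instruction:
            print("executing", instruction)

device = TouchOperationDevice()
event = device.detect_touch({"position": (10, 1500), "manner": "single_click"})
position, manner = device.acquire_position_and_manner(event)
device.execute(device.acquire_instruction(position, manner))
```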
As can be seen from the above, the touch operation device 30 provided in the embodiment of the present application can detect a touch operation triggered by a user in the curved surface display area, acquire the touch position and touch manner of the touch operation, acquire a touch instruction according to the touch position and the touch manner, and perform a touch operation on the electronic device according to the touch instruction. With the device, the interaction functions of the electronic device can be realized merely through user gestures on the curved surface display area, so that no virtual button needs to be arranged in the screen of the device; in addition, because interaction only requires gestures, physical keys are prevented from being damaged by repeated use, and the efficiency of interactive control of the electronic device is improved.
The present application also provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the touch operation method provided in the foregoing method embodiments.
In another embodiment of the present application, an electronic device is also provided, and the electronic device may be a smart phone, a tablet computer, or the like. As shown in FIG. 8, the electronic device 400 includes a processor 401 and a memory 402. The processor 401 is electrically connected to the memory 402.
The processor 401 is a control center of the electronic device 400, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or loading an application program stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the electronic device as a whole.
In this embodiment, the processor 401 in the electronic device 400 loads instructions corresponding to the processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
detecting a touch operation triggered by a user in the curved surface display area;
acquiring a touch position and a touch manner of the touch operation;
acquiring a touch instruction according to the touch position and the touch manner;
and performing a touch operation on the electronic device according to the touch instruction.
Referring to FIG. 9, FIG. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 500 may include a Radio Frequency (RF) circuit 501, a memory 502 including one or more computer-readable storage media, an input unit 503, a display unit 504, a sensor 505, an audio circuit 506, a Wireless Fidelity (WiFi) module 507, a processor 508 including one or more processing cores, and a power supply 509. Those skilled in the art will appreciate that the electronic device configuration shown in FIG. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or have a different arrangement of components.
The RF circuit 501 may be used for receiving and transmitting information, or for receiving and transmitting signals during a call; in particular, it receives downlink information of a base station and then delivers the received downlink information to one or more processors 508 for processing, and in addition transmits uplink data to the base station. In general, the radio frequency circuit 501 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the radio frequency circuit 501 may also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 502 may be used to store applications and data. The memory 502 stores applications containing executable code, and the application programs may constitute various functional modules. The processor 508 executes various functional applications and data processing by running the application programs stored in the memory 502. The memory 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device, and the like. Further, the memory 502 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 502 may also include a memory controller to provide the processor 508 and the input unit 503 with access to the memory 502.
The input unit 503 may be used to receive input numbers, character information, or user characteristic information (such as a fingerprint), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In particular, in one embodiment, the input unit 503 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near it (for example, operations performed by the user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a preset program. Optionally, the touch-sensitive surface may comprise two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 508, and it can also receive and execute commands sent by the processor 508.
The display unit 504 may be used to display information input by or provided to the user and various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. The display unit 504 may include a display panel. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel; when the touch-sensitive surface detects a touch operation on or near it, the touch operation is transmitted to the processor 508 to determine the type of the touch event, and the processor 508 then provides a corresponding visual output on the display panel according to the type of the touch event. Although in FIG. 9 the touch-sensitive surface and the display panel are two separate components implementing input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement the input and output functions.
The electronic device may also include at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. In particular, the light sensor may include an ambient light sensor, which may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor, which may turn off the display panel and/or the backlight when the electronic device is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally along three axes), can detect the magnitude and direction of gravity when the device is stationary, and can be used in applications that recognize the posture of the mobile phone (such as switching between landscape and portrait screens, related games, and magnetometer posture calibration), in vibration recognition related functions (such as a pedometer and tapping), and the like. Other sensors, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, may also be configured on the electronic device, and are not described in detail here.
The audio circuit 506 may provide an audio interface between the user and the electronic device through a speaker and a microphone. The audio circuit 506 can convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 506 and converted into audio data, and the audio data is then processed by the audio data output processor 508 and sent to another electronic device via the RF circuit 501, or output to the memory 502 for further processing. The audio circuit 506 may also include an earbud jack to provide communication between a peripheral headset and the electronic device.
Wireless fidelity (WiFi) is a short-distance wireless transmission technology. Through the wireless fidelity module 507, the electronic device can help users send and receive e-mails, browse web pages, access streaming media, and the like, providing users with wireless broadband Internet access. Although FIG. 9 shows the wireless fidelity module 507, it can be understood that it is not an essential part of the electronic device and may be omitted as needed within a scope that does not change the essence of the invention.
The processor 508 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing the application programs stored in the memory 502 and calling the data stored in the memory 502, thereby monitoring the electronic device as a whole. Optionally, the processor 508 may include one or more processing cores; preferably, the processor 508 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 508.
The electronic device also includes a power supply 509 (such as a battery) for powering the various components. Preferably, the power supply may be logically connected to the processor 508 through a power management system, so that charging, discharging, and power consumption are managed through the power management system. The power supply 509 may also include one or more direct-current or alternating-current power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, or any other such component.
Although not shown in FIG. 9, the electronic device may further include a camera, a Bluetooth module, and the like, which are not described in detail here.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily and implemented as one or several entities; for the specific implementation of the above modules, reference may be made to the foregoing method embodiments, which are not described here again.
It should be noted that all or some of the steps in the various methods of the above embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium, such as a memory of a terminal, and executed by at least one processor in the terminal; during execution, the flow of an embodiment such as the touch operation method may be included. The storage medium may include a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or the like.
The touch operation method, the touch operation device, the storage medium, and the electronic device provided by the embodiments of the present application have been described in detail above. The functional modules may each be integrated in one processing chip, may each exist alone physically, or two or more modules may be integrated in one module. The integrated module may be implemented in the form of hardware, or may be implemented in the form of a software functional module. The principles and implementations of the present application are explained herein by applying specific examples, and the above description of the embodiments is only used to help understand the method and the core idea of the present application; meanwhile, those skilled in the art may, according to the idea of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.