Disclosure of Invention
In view of the above, the present invention is directed to a method and an apparatus for processing an object in an application, and a touch screen terminal, which provide a simple touch gesture by which a user can complete the operation processing of an object provided with an operation function in the application, such as a delete operation, a move operation, a copy operation, or a cut operation, thereby improving the user experience.
In order to achieve the above purpose, the technical solution of the present invention is implemented as follows:
The present invention provides a method for processing an object in an application, which comprises the following steps:
receiving a first gesture operation and a second gesture operation, and parsing the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively; the first gesture operation and the second gesture operation are gesture operations for an object in an application;
selecting one or more objects of the application according to the first operation instruction, and executing an operation on the selected one or more objects according to the second operation instruction.
In the above solution, executing the operation on the selected one or more objects according to the second operation instruction comprises: executing a delete operation, a move operation, a copy operation, or a cut operation on the selected one or more objects according to the second operation instruction.
In the foregoing solution, before receiving the first gesture operation and the second gesture operation, the method further includes:
setting a correspondence between a first gesture and the first operation instruction, the correspondence being used for parsing the first gesture operation into the first operation instruction;
and/or setting a correspondence between a second gesture and the second operation instruction, the correspondence being used for parsing the second gesture operation into the second operation instruction.
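The correspondence-setting step above can be sketched as a simple lookup table that later parses gesture operations into operation instructions. All gesture names and instruction identifiers below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch: a correspondence between gestures and operation
# instructions, set in advance and consulted when a gesture is received.
GESTURE_TO_INSTRUCTION = {}

def set_correspondence(gesture_name, instruction):
    """Record the correspondence between a gesture and an instruction."""
    GESTURE_TO_INSTRUCTION[gesture_name] = instruction

def parse_gesture(gesture_name):
    """Parse a received gesture operation into its operation instruction."""
    return GESTURE_TO_INSTRUCTION.get(gesture_name)

# Example correspondences: a continuous swipe selects objects (first
# operation instruction), a fork gesture deletes the selection (second).
set_correspondence("continuous_swipe", "SELECT_OBJECTS")
set_correspondence("fork", "DELETE_SELECTED")
```

An unregistered gesture parses to `None`, which a terminal could treat as "no matching instruction".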
In the foregoing solution, before receiving the first gesture operation and the second gesture operation, the method further includes: setting an operation attribute in an object of the application;
the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
In the above solution, setting the operation attribute comprises adding, to the object of the application, an interface for calling the object, the interface being used for operating the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application.
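The operation-attribute mechanism can be sketched as follows: each object carries a set of operation attributes, and an operation is executed only through the interface associated with a set attribute. The class and method names are illustrative assumptions:

```python
# Hypothetical sketch of an application object with operation attributes.
class AppObject:
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = set(attributes)   # e.g. {"delete", "move"}

    def has_attribute(self, op):
        """Query whether the object is provided with an operation attribute."""
        return op in self.attributes

    def invoke(self, op):
        # Call the interface only when the matching attribute was set;
        # otherwise return directly, mirroring the described flow.
        if not self.has_attribute(op):
            return f"{op} ended"
        return getattr(self, f"_{op}")()

    def _delete(self):
        return f"{self.name} deleted"

    def _move(self):
        return f"{self.name} moved"

photo = AppObject("photo", ["delete"])   # delete attribute only
```

Here `photo.invoke("delete")` calls the deletion interface, while `photo.invoke("move")` returns directly because no move attribute was set.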
In the above solution, after executing the operation on the selected one or more objects according to the second operation instruction, the method further comprises: outputting operation end information.
In the above solution, the first gesture operation is a gesture operation for selecting one or more objects of the application;
the second gesture operation is a gesture operation that performs an operation of the selected one or more objects.
In the above scheme, the second gesture operation is a received graphical gesture operation.
In the above solution, the first gesture operation includes at least one of: a continuous swipe gesture, a single tap gesture;
the second gesture operation includes at least one of: a fork (×) gesture, an arrow-shaped gesture, a circular gesture.
The present invention further provides an apparatus for processing an object in an application, comprising: an input module and an operation management module; wherein
the input module is configured to receive a first gesture operation and a second gesture operation, parse the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively, and send the first operation instruction and the second operation instruction to the operation management module; the first gesture operation and the second gesture operation are gesture operations for an object in an application;
the operation management module is used for selecting one or more objects of the application according to the first operation instruction and executing the operation of the selected one or more objects according to the second operation instruction.
In the foregoing solution, the operation management module is specifically configured to execute a delete operation, a move operation, a copy operation, or a cut operation of the selected one or more objects according to the second operation instruction.
In the above solution, the apparatus further comprises: a setting module, configured to set a correspondence between a first gesture and the first operation instruction, the correspondence being used for parsing the first gesture operation into the first operation instruction; and/or a correspondence between a second gesture and the second operation instruction, the correspondence being used for parsing the second gesture operation into the second operation instruction.
In the above scheme, the setting module is further configured to set an operation attribute in the object of the application; the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
In the above solution, the setting module sets the operation attribute by adding, to the object of the application, an interface for calling the object, the interface being used for operating the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application.
In the foregoing solution, the apparatus further includes an output module, configured to output operation end information after the operation of the selected one or more objects is performed according to the second operation instruction.
The present invention further provides a touch screen terminal, which comprises the above apparatus for processing an object in an application.
The method for processing an object in an application of the present invention receives a first gesture operation and a second gesture operation, and parses the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively; the first gesture operation and the second gesture operation are gesture operations for an object in an application; one or more objects of the application are selected according to the first operation instruction, and the operation on the selected one or more objects is executed according to the second operation instruction. In this way, a simple touch gesture can be provided for the user to complete the operation processing of an object provided with an operation function in the application, such as a delete operation, a move operation, a copy operation, or a cut operation, so that the user does not need to interrupt the experience to search for the menu item of the operation function, which greatly improves the user experience.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a schematic flowchart of the method for processing an object in an application according to the present invention; as shown in Fig. 1, the method comprises the following steps:
step 101: receiving a first gesture operation and a second gesture operation;
here, the first gesture operation and the second gesture operation are gesture operations for an object in an application;
further, the first gesture operation is a gesture operation of selecting one or more objects of the application;
the second gesture operation is a gesture operation for performing an operation of the selected one or more objects;
wherein the second gesture operation is a graphical gesture operation, so as to distinguish it from the first gesture operation for selecting one or more objects of the application;
specifically, the first gesture operation includes at least one of: a continuous swipe gesture, a single tap gesture;
the second gesture operation includes at least one of: a fork (×) gesture, an arrow-shaped gesture, a circular gesture.
Further, before receiving the first gesture operation and the second gesture operation, the method further includes:
setting a correspondence between the first gesture and the first operation instruction, the correspondence being used for parsing the first gesture operation into the first operation instruction;
setting a correspondence between the second gesture and the second operation instruction, the correspondence being used for parsing the second gesture operation into the second operation instruction.
Here, the first gesture operation may be a single-tap gesture operation that selects one or more objects of the application, or a gesture operation that delineates a region and selects the one or more objects of the application within that region. For example: the objects of the application touched by a continuously sliding contact are selected as the objects of the first operation instruction; or the objects within the range enclosed by the continuously sliding contact are selected as the objects of the first operation instruction; or the objects within a rectangular range whose diagonal is the straight line connecting the first and last contacts of the continuous slide are selected as the objects of the first operation instruction; or the objects within a circular range whose diameter is the straight line connecting the first and last contacts of the continuous slide are selected as the objects of the first operation instruction.
The second gesture operation is an operation gesture set by the user, and may be a fork (×) gesture operation, an arrow-shaped gesture operation, a circular gesture operation, or the like.
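The diagonal-rectangle selection rule described above can be sketched in a few lines: the first and last contacts of the continuous slide define opposite corners of a rectangle, and every object whose position falls inside it is selected. The point and object representations are illustrative assumptions:

```python
# Hypothetical sketch of selecting objects by the rectangle whose diagonal
# is the line from the first to the last contact of a continuous swipe.
def select_by_diagonal(first_contact, last_contact, objects):
    (x1, y1), (x2, y2) = first_contact, last_contact
    left, right = min(x1, x2), max(x1, x2)
    bottom, top = min(y1, y2), max(y1, y2)
    # Keep every object whose position lies inside the rectangle.
    return [obj for obj in objects
            if left <= obj["x"] <= right and bottom <= obj["y"] <= top]

icons = [{"name": "a", "x": 1, "y": 1},
         {"name": "b", "x": 5, "y": 5},
         {"name": "c", "x": 2, "y": 3}]
# Swipe from (0, 0) to (3, 4): "a" and "c" fall inside, "b" does not.
selected = select_by_diagonal((0, 0), (3, 4), icons)
```

The circular-range variant would differ only in the containment test (distance from the midpoint of the contact line no greater than half its length).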
Further, before receiving the first gesture operation and the second gesture operation, the method further includes: setting an operation attribute in an object of the application;
the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
Further, setting the operation attribute comprises adding, to the object of the application, an interface for calling the object, the interface being used for operating the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application;
specifically, the processing procedure of the object can be completed by calling the operation processing function corresponding to the object to be operated.
Step 102: parsing the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively;
Here, since the correspondence between the first gesture and selecting the objects of the application, and the correspondence between the second gesture and operating the selected objects, have been set in advance, the first gesture operation and the second gesture operation can be parsed into the first operation instruction and the second operation instruction respectively according to these correspondences.
Step 103: one or more objects of the application are selected according to the first operation instruction, and the operation of the selected one or more objects is executed according to the second operation instruction;
further, the executing the one or more selected objects according to the second operation instruction includes: and executing the deleting operation, the moving operation, the copying operation or the cutting operation of the selected one or more objects according to the second operation instruction.
Further, after the operation on the selected one or more objects is executed according to the second operation instruction, the method further comprises: outputting operation end information.
The following further illustrates the technical solution of the present invention by taking the deletion operation as an example.
Fig. 2 is a schematic flowchart of a processing method of an object in an application according to an embodiment of the present invention, as shown in fig. 2, including the following steps:
step 201: a user runs a certain application and enters a certain object browsing interface in the application;
step 202: the user wants to delete an object in the object browsing interface, and selects the object through a first gesture;
specifically, a region may be defined by the first gesture, and a certain object or a plurality of objects in the region may be selected.
Step 203: the user draws a second gesture in any area of the touch screen;
specifically, the second gesture may be set before the deletion task; any gesture may be used as the second gesture, for example the gesture shown in Fig. 3 may be used as the second gesture.
Step 204: the touch screen terminal acquires the first gesture operation and the second gesture operation, and parses the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively;
here, the first operation instruction is to select one or more objects of the application; the second operation instruction is to delete the selected one or more objects.
Step 205: the touch screen terminal acquires the selected object according to the first operation instruction and completes the deletion processing of the selected object according to the second operation instruction;
here, the touch screen terminal may complete the deletion processing of the current object by calling a deletion application interface of the object of the application;
Further, before executing the deletion of the current object, the method further comprises: the touch screen terminal queries whether the object to be deleted is provided with the delete attribute, that is, whether the object is associated with a deletion application interface; if the object is provided with the delete attribute, the deletion of the current object is completed by calling the deletion application interface of the object; otherwise, the process directly returns and the deletion ends.
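The delete-attribute query described above can be sketched as follows: the terminal checks whether the object exposes a callable deletion interface and only then invokes it, otherwise it returns directly. The class and function names are illustrative assumptions:

```python
# Hypothetical sketch of step 205's attribute check before deletion.
def delete_object(obj):
    delete_interface = getattr(obj, "delete", None)
    if callable(delete_interface):   # object has the delete attribute
        delete_interface()
        return "deleted"
    return "delete ended"            # no attribute: return directly

class Photo:
    """An object whose delete attribute is associated with an interface."""
    def __init__(self):
        self.removed = False
    def delete(self):
        self.removed = True

class ReadOnlyIcon:
    """An object without a delete attribute."""

photo = Photo()
```

Calling `delete_object(photo)` invokes the deletion interface, while `delete_object(ReadOnlyIcon())` returns without deleting anything.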
Step 206: outputting deletion end information.
After the end information is output, the touch screen terminal closes the current deletion-acquisition task and returns to the normal state, so as to enter the next gesture-acquisition working state.
Further, other operations, such as the move operation, the copy operation, or the cut operation, are handled in the same way as the delete operation described above: one or more objects are selected through the first gesture operation, and the move operation, the copy operation, or the cut operation on the selected one or more objects is executed through the second gesture operation.
Fig. 4 is a schematic diagram of a structure of a processing apparatus for processing an object in application of the present invention, as shown in fig. 4, the apparatus includes: an input module 41 and an operation management module 42; wherein,
the input module 41 is configured to, after receiving a first gesture operation and a second gesture operation, parse the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively, and send the first operation instruction and the second operation instruction to the operation management module 42; the first gesture operation and the second gesture operation are gesture operations for an object in an application;
the operation management module 42 is configured to select one or more objects of the application according to the first operation instruction, and execute an operation of the selected one or more objects according to the second operation instruction.
Here, the input module 41 is configured to receive a gesture operation input through the touch screen; the input module may receive the input gesture operation through a standard Application Programming Interface (API), that is, the input gesture operation is received in a generalized, normalized interface mode based on the platform;
after receiving the second operation instruction sent by the input module 41, the operation management module 42 calls the application interface for operating the object of the application in the selected objects, and executes the operation of the corresponding one or more objects.
Further, the operation management module 42 is specifically configured to execute a delete operation, a move operation, a copy operation, or a cut operation on the selected one or more objects according to the second operation instruction.
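The cooperation of the two modules can be sketched as follows: the input module parses gestures into instructions and forwards them, and the operation management module selects objects and executes the operation. All class, gesture, and instruction names are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical sketch of the input module / operation management module pair.
class OperationManagementModule:
    def __init__(self, objects):
        self.objects = objects
        self.selected = []

    def handle(self, instruction, payload=None):
        if instruction == "SELECT":
            # First operation instruction: select one or more objects.
            self.selected = [o for o in self.objects if o in payload]
        elif instruction == "DELETE":
            # Second operation instruction: operate on the selection.
            for o in self.selected:
                self.objects.remove(o)
            self.selected = []

class InputModule:
    def __init__(self, manager):
        self.manager = manager

    def receive(self, gesture, payload=None):
        # Parse the gesture operation into an operation instruction and
        # send it to the operation management module.
        instruction = {"continuous_swipe": "SELECT", "fork": "DELETE"}[gesture]
        self.manager.handle(instruction, payload)

mgr = OperationManagementModule(["a", "b", "c"])
inp = InputModule(mgr)
inp.receive("continuous_swipe", ["a", "c"])  # first gesture: select a, c
inp.receive("fork")                          # second gesture: delete them
```

After both gestures, only `"b"` remains in the managed objects.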
Further, the apparatus further comprises: a setting module 43, configured to set a correspondence between the first gesture and the first operation instruction, the correspondence being used for parsing the first gesture operation into the first operation instruction; and/or a correspondence between the second gesture and the second operation instruction, the correspondence being used for parsing the second gesture operation into the second operation instruction.
Further, the setting module 43 is further configured to set an operation attribute in the object of the application; the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
Further, the setting module 43 sets the operation attribute by adding, to the object of the application, an application interface for calling the object, the interface being used for operating the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application.
Here, the setting module 43 sets, at the APP development stage when an object is created, the application interface that the touch screen terminal provides for executing an operation on the object: when an object of the application is created, a process name capable of implementing the operation function is added to the attribute of each object having an operation requirement, where the operation includes at least one of: a delete operation, a move operation, a copy operation, a cut operation.
Further, the apparatus further includes an output module 44, configured to output operation end information after performing the operation on the selected one or more objects according to the second operation instruction.
Those skilled in the art will understand that the functions implemented by the modules of the apparatus for processing an object in an application shown in Fig. 4 can be understood with reference to the foregoing description of the method for processing an object in an application, and that the functions of each module may be implemented by a program running on a processor or by a specific logic circuit.
The present invention further discloses a touch screen terminal, which comprises the apparatus for processing an object in an application shown in Fig. 4.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of multiple computing devices. Alternatively, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device, and in some cases the steps shown or described may be performed in an order different from that described herein. They may also be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present invention are included in the protection scope of the present invention.