CN104156146A - Processing method and device for objects in application and touch screen terminal - Google Patents

Processing method and device for objects in application and touch screen terminal

Info

Publication number
CN104156146A
CN104156146A (application CN201310176096.2A)
Authority
CN
China
Prior art keywords
gesture
application
instruction
objects
operation instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310176096.2A
Other languages
Chinese (zh)
Inventor
陈斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp
Priority to CN201310176096.2A (CN104156146A)
Priority to PCT/CN2013/079607 (WO2013167045A2)
Publication of CN104156146A
Legal status: Pending (Current)


Abstract

The invention discloses a method for processing objects in an application. The method comprises: receiving a first gesture operation and a second gesture operation, and analyzing the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively, where the first gesture operation and the second gesture operation are gesture operations on objects in the application; selecting one or more objects of the application according to the first operation instruction, and executing the operation of the selected one or more objects according to the second operation instruction. The invention further discloses a device for processing objects in an application and a touch screen terminal. With this technical scheme, a simple touch gesture enables a user to complete operations, such as a deletion, move, copy, or cut operation, on objects provided with an operation function, which greatly improves the quality of the user experience.

Description

Processing method and device for object in application and touch screen terminal
Technical Field
The invention relates to network communication technology, and in particular to a method and a device for processing an object in an application, and to a touch screen terminal.
Background
With the growing variety of intelligent mobile terminal services and application types, users rely on more and more third-party applications (Apps). This diversification also leads to diverse interaction styles: for operations that users perform extremely frequently, such as deleting objects in an App, the location of the relevant menu item is often unpredictable, or the menu items are nested too deeply, among other inconveniences. This directly reduces the convenience of interacting with the App and is one of the factors that persistently troubles the user experience.
Disclosure of Invention
In view of the above, the present invention is directed to a method and an apparatus for processing an object in an application, and to a touch screen terminal, which provide a simple touch gesture with which a user can complete operations, such as a delete operation, a move operation, a copy operation, or a cut operation, on objects provided with an operation function in the application, thereby better improving the user experience.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
the invention provides a method for processing an object in an application, which comprises the following steps:
receiving a first gesture operation and a second gesture operation, and respectively analyzing the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction; the first gesture operation and the second gesture operation are gesture operations for an object in an application;
one or more objects of the application are selected according to the first operation instruction, and the operation of the one or more selected objects is executed according to the second operation instruction.
In the above solution, the executing the operation of the selected one or more objects according to the second operation instruction includes: and executing the deleting operation, the moving operation, the copying operation or the cutting operation of the selected one or more objects according to the second operation instruction.
In the foregoing solution, before receiving the first gesture operation and the second gesture operation, the method further includes:
setting a corresponding relation between a first gesture and a first operation instruction; the corresponding relation is used for analyzing the first gesture operation into a first operation instruction;
and/or setting a corresponding relation between the second gesture and the second operation instruction; the corresponding relation is used for analyzing the second gesture operation into a second operation instruction.
In the foregoing solution, before receiving the first gesture operation and the second gesture operation, the method further includes: setting an operation attribute in an object of the application;
the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
In the above scheme, setting the operation attribute means adding, to the object of the application, an interface for invoking that object, where the interface is used to operate the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application.
In the above solution, after the performing the operation of the selected one or more objects according to the second operation instruction, the method further includes: and outputting operation ending information.
In the above solution, the first gesture operation is a gesture operation for selecting one or more objects of the application;
the second gesture operation is a gesture operation that performs an operation of the selected one or more objects.
In the above scheme, the second gesture operation is a received graphical gesture operation.
In the above solution, the first gesture operation includes at least one of: a continuous swipe gesture, a single tap gesture;
the second gesture operation includes at least one of: fork gestures, arrow-type gestures, circular gestures.
The invention also provides a device for processing an object in an application, which comprises: an input module and an operation management module; wherein,
the input module is used for receiving a first gesture operation and a second gesture operation, analyzing the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively, and sending the first operation instruction and the second operation instruction to the operation management module; the first gesture operation and the second gesture operation are gesture operations for an object in an application;
the operation management module is used for selecting one or more objects of the application according to the first operation instruction and executing the operation of the selected one or more objects according to the second operation instruction.
In the foregoing solution, the operation management module is specifically configured to execute a delete operation, a move operation, a copy operation, or a cut operation of the selected one or more objects according to the second operation instruction.
In the above scheme, the apparatus further comprises: the setting module is used for setting the corresponding relation between the first gesture and the first operation instruction; the corresponding relation is used for analyzing the first gesture operation into a first operation instruction; and/or the corresponding relation between the second gesture and the second operation instruction; the corresponding relation is used for analyzing the second gesture operation into a second operation instruction.
In the above scheme, the setting module is further configured to set an operation attribute in the object of the application; the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
In the above scheme, the setting module sets the operation attribute by adding, to the object of the application, an interface for invoking that object, where the interface is used to operate the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application.
In the foregoing solution, the apparatus further includes an output module, configured to output operation end information after the operation of the selected one or more objects is performed according to the second operation instruction.
The invention also provides a touch screen terminal which comprises the processing device of the object in the application.
The method for processing an object in an application according to the invention receives a first gesture operation and a second gesture operation, and analyzes the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction, respectively; the first gesture operation and the second gesture operation are gesture operations for an object in an application; one or more objects of the application are selected according to the first operation instruction, and the operation (such as deletion) of the selected one or more objects is executed according to the second operation instruction. In this way, simple touch gestures can be provided for a user to complete the operation processing of an object provided with an operation function in an application, such as a deletion, move, copy, or cut operation, so that the user does not need to interrupt their experience to search for the menu item of the operation function, and the user experience is greatly improved.
Drawings
Fig. 1 is a schematic flow chart of a method for processing an object in an application according to the present invention;
Fig. 2 is a schematic flow chart of a method for processing an object in an application according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a delete gesture provided according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an apparatus for processing an object in an application according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a schematic flow chart of the method for processing an object in an application according to the present invention; as shown in Fig. 1, the method includes the following steps:
step 101: receiving a first gesture operation and a second gesture operation;
here, the first gesture operation and the second gesture operation are gesture operations for an object in an application;
further, the first gesture operation is a gesture operation of selecting one or more objects of the application;
the second gesture operation is a gesture operation for performing an operation of the selected one or more objects;
wherein the second gesture operation is a received graphical gesture operation, so as to distinguish it from the first gesture operation used for selecting one or more objects of the application;
specifically, the first gesture operation includes at least one of: a continuous swipe gesture, a single tap gesture;
the second gesture operation includes at least one of: fork gestures, arrow-type gestures, circular gestures.
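
As a purely illustrative note, a graphical second gesture of this kind could be distinguished from other shapes with a rough geometric heuristic. The sketch below is an assumption about one possible recognizer, not part of the claimed method; the class name, the thresholds, and the two simple rules are hypothetical.

```java
import java.util.List;

// Heuristic sketch for classifying the second (graphical) gesture.
// The shape names mirror the gestures listed above; thresholds are arbitrary assumptions.
final class SecondGestureClassifier {
    enum Shape { FORK, CIRCLE, UNKNOWN }

    static final class Point {
        final float x, y;
        Point(float x, float y) { this.x = x; this.y = y; }
    }

    /** Rough classification: two strokes -> fork (x); a closed single stroke -> circle; otherwise unknown. */
    static Shape classify(List<List<Point>> strokes) {
        if (strokes.size() == 2) {
            return Shape.FORK;                        // an "x" is usually drawn with two crossing strokes
        }
        if (strokes.size() == 1) {
            List<Point> s = strokes.get(0);
            Point first = s.get(0), last = s.get(s.size() - 1);
            if (Math.hypot(first.x - last.x, first.y - last.y) < 50.0) {
                return Shape.CIRCLE;                  // the stroke ends close to where it started
            }
        }
        return Shape.UNKNOWN;                         // arrow shapes would need a fuller recognizer
    }
}
```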
Further, before receiving the first gesture operation and the second gesture operation, the method further includes:
setting a corresponding relation between the first gesture and the first operation instruction; the corresponding relation is used for analyzing the first gesture operation into a first operation instruction;
setting a corresponding relation between the second gesture and the second operation instruction; the corresponding relation is used for analyzing the second gesture operation into a second operation instruction.
Here, the first gesture operation may be a single-tap gesture operation that selects one or more objects of the application, or a gesture operation that delineates a region with a certain gesture and selects the one or more objects of the application within that region, or the like. For example, the objects of the application touched by a continuously sliding contact are selected as the objects of the first operation instruction; or the objects within the range enclosed by the continuously sliding contact are selected as the objects of the first operation instruction; or the objects within the rectangular range whose diagonal is the straight line between the starting and ending contact points of the continuous slide are selected as the objects of the first operation instruction; or the objects within the circular range whose diameter is the straight line between the starting and ending contact points of the continuous slide are selected as the objects of the first operation instruction.
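As an illustration of the rectangle-based variant just described, the following sketch hit-tests object bounds against the rectangle whose diagonal is the straight line between the first and last contact points of the continuous slide. The Point and AppObject types and all names are hypothetical assumptions for illustration and are not taken from the patent.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of rectangle-based selection: the rectangle's diagonal is the line between the
// first and last contact points of a continuous slide. Types and names are hypothetical.
final class RegionSelector {
    static final class Point {
        final float x, y;
        Point(float x, float y) { this.x = x; this.y = y; }
    }

    static final class AppObject {
        final String id;
        final float left, top, right, bottom;   // object bounds on screen
        AppObject(String id, float left, float top, float right, float bottom) {
            this.id = id; this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }
    }

    /** Returns the objects whose bounds lie inside the rectangle spanned by the two contact points. */
    static List<AppObject> selectByDiagonal(Point first, Point last, List<AppObject> candidates) {
        float left = Math.min(first.x, last.x), right = Math.max(first.x, last.x);
        float top = Math.min(first.y, last.y), bottom = Math.max(first.y, last.y);
        List<AppObject> selected = new ArrayList<>();
        for (AppObject o : candidates) {
            if (o.left >= left && o.right <= right && o.top >= top && o.bottom <= bottom) {
                selected.add(o);
            }
        }
        return selected;
    }
}
```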
The second gesture operation is an operation gesture set by the user, and may be a fork (x) gesture operation, an arrow-type gesture operation, a circular gesture operation, or the like.
Further, before receiving the first gesture operation and the second gesture operation, the method further includes: setting an operation attribute in an object of the application;
the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
Further, setting the operation attribute means adding, to the object of the application, an interface for invoking that object, where the interface is used to operate the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application;
specifically, the processing procedure of the object can be completed by calling the operation processing function corresponding to the object to be operated.
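One way to read the "interface added to the object" described above is that each operable object exposes callable operation handlers. The sketch below is only an assumption about how such an operation attribute could look; the interface and method names are illustrative and not prescribed by the patent.

```java
// Hypothetical sketch of the "operation attribute": the object exposes an interface
// through which the terminal can invoke delete, move, copy, or cut. Names are illustrative.
interface OperableObject {
    void onDelete();                              // delete operation attribute
    void onMove(float targetX, float targetY);    // move operation attribute
    void onCopy();                                // copy operation attribute
    void onCut();                                 // cut operation attribute
}
```

An object that only requires deletion would set only the delete attribute, i.e. provide only the first handler; whether the operations are gathered into one interface or registered as separate handlers does not affect the flow described in steps 101 to 103.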
Step 102: analyzing the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively;
here, since the correspondence between the first gesture and the selection of the objects of the application to be deleted, and the correspondence between the second gesture and the deletion of the selected objects, have been set before the deletion process, the first gesture operation and the second gesture operation can be analyzed into the first operation instruction and the second operation instruction, respectively, according to these corresponding relations.
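The analysis in step 102 therefore amounts to looking up a previously configured correspondence from the recognized gesture to an operation instruction. A minimal table-driven sketch follows; the enum values and the example correspondences are assumptions for illustration only.

```java
import java.util.EnumMap;
import java.util.Map;

// Table-driven analysis of recognized gestures into operation instructions.
// Enum values and the example correspondences are assumptions, not taken from the patent.
enum Gesture { CONTINUOUS_SWIPE, SINGLE_TAP, FORK, ARROW, CIRCLE }
enum Instruction { SELECT_OBJECTS, DELETE, MOVE, COPY, CUT }

final class GestureParser {
    private final Map<Gesture, Instruction> correspondence = new EnumMap<>(Gesture.class);

    GestureParser() {
        // Correspondences configured in advance, before any gesture is received.
        correspondence.put(Gesture.CONTINUOUS_SWIPE, Instruction.SELECT_OBJECTS); // first gesture
        correspondence.put(Gesture.SINGLE_TAP, Instruction.SELECT_OBJECTS);       // first gesture
        correspondence.put(Gesture.FORK, Instruction.DELETE);                     // second gesture
    }

    /** Looks up the operation instruction configured for a recognized gesture (null if none). */
    Instruction parse(Gesture gesture) {
        return correspondence.get(gesture);
    }
}
```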
Step 103: one or more objects of the application are selected according to the first operation instruction, and the operation of the selected one or more objects is executed according to the second operation instruction;
further, executing the operation of the selected one or more objects according to the second operation instruction includes: executing the deleting operation, the moving operation, the copying operation, or the cutting operation of the selected one or more objects according to the second operation instruction.
Further, after the operation of the selected one or more objects is performed according to the second operation instruction, the method further includes: and outputting operation ending information.
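Putting steps 101 to 103 together, the overall flow can be pictured as follows. The glue code is hypothetical and reuses the illustrative Gesture, Instruction, GestureParser, and OperableObject types from the sketches above; the selected objects are assumed to have been resolved from the first gesture, for example with the RegionSelector sketch.

```java
import java.util.List;

// End-to-end sketch of steps 101-103 (hypothetical glue code, reusing earlier illustrative types).
final class ObjectOperationFlow {
    private final GestureParser parser = new GestureParser();

    /** selectedObjects are the objects already resolved from the first gesture. */
    void handle(Gesture firstGesture, Gesture secondGesture, List<OperableObject> selectedObjects) {
        // Step 102: analyze both gestures into operation instructions.
        Instruction first = parser.parse(firstGesture);
        Instruction second = parser.parse(secondGesture);
        if (first != Instruction.SELECT_OBJECTS || second == null) {
            return;                                   // not a valid selection/operation gesture pair
        }
        // Step 103: execute the second instruction on each selected object.
        for (OperableObject o : selectedObjects) {
            switch (second) {
                case DELETE: o.onDelete(); break;
                case COPY:   o.onCopy();   break;
                case CUT:    o.onCut();    break;
                default:     break;                   // a move would also need target coordinates
            }
        }
        // Afterwards: output the operation end information (e.g. a toast or a log entry).
    }
}
```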
The following further illustrates the technical solution of the present invention by taking the deletion operation as an example.
Fig. 2 is a schematic flow chart of a method for processing an object in an application according to an embodiment of the present invention; as shown in Fig. 2, the method includes the following steps:
step 201: a user runs a certain application and enters a certain object browsing interface in the application;
step 202: the user wants to delete an object in the object browsing interface, and selects the object through a first gesture;
specifically, a region may be defined by the first gesture, and a certain object or a plurality of objects in the region may be selected.
Step 203: the user marks out a second gesture in any area of the touch screen;
specifically, the second gesture may be set before the deletion task; any gesture may be used as the second gesture, and in particular the gesture shown in Fig. 3 may be used as the second gesture.
Step 204: the touch screen terminal acquires the first gesture operation and the second gesture operation, and the first gesture operation and the second gesture operation are respectively analyzed into a first operation instruction and a second operation instruction;
here, the first operation instruction is to select one or more objects of the application; the second operation instruction is to delete the selected one or more objects.
Step 205: the touch screen terminal acquires the selected object according to the first operation instruction and completes the deletion processing of the selected object according to the second operation instruction;
here, the touch screen terminal may complete the deletion processing of the current object by calling a deletion application interface of the object of the application;
further, before executing the deletion of the current object, the method further includes: the touch screen terminal queries whether the object to be deleted is provided with a deletion attribute, that is, whether the object is associated with a deletion application interface; if the object is provided with the deletion attribute, the deletion of the current object is completed by calling the deletion application interface of the object; otherwise, the terminal returns directly and the deletion ends.
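The attribute check described in this step can be pictured as a guard placed before the call to the object's deletion interface. The sketch below assumes the illustrative OperableObject interface from the earlier sketch; a real implementation might instead test a capability flag or a registered handler.

```java
// Guarded deletion: the operation is performed only if the object exposes a deletion interface.
final class DeleteExecutor {
    /** Returns true if the deletion was performed, false if the object has no delete attribute. */
    static boolean deleteIfSupported(Object target) {
        if (target instanceof OperableObject) {       // "is the delete attribute set?"
            ((OperableObject) target).onDelete();     // call the object's deletion application interface
            return true;
        }
        return false;                                 // otherwise return directly and end the deletion
    }
}
```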
Step 206: and outputting the deletion end information.
After the end information is output, the touch screen terminal closes the current deletion-acquisition task and returns to the normal state, ready to enter the next gesture-acquisition working state.
Further, other operations, such as the moving operation, the copying operation, or the cutting operation, are handled in the same way as the deleting operation described above: one or more objects are selected through the first gesture operation, and the moving operation, the copying operation, or the cutting operation of the selected one or more objects is performed through the second gesture operation.
Fig. 4 is a schematic structural diagram of an apparatus for processing an object in an application according to the present invention; as shown in Fig. 4, the apparatus includes: an input module 41 and an operation management module 42; wherein,
the input module 41 is configured to, after receiving a first gesture operation and a second gesture operation, analyze the first gesture operation and the second gesture operation into a first operation instruction and a second operation instruction respectively, and send the first operation instruction and the second operation instruction to the operation management module 42; the first gesture operation and the second gesture operation are gesture operations for an object in an application;
the operation management module 42 is configured to select one or more objects of the application according to the first operation instruction, and execute an operation of the selected one or more objects according to the second operation instruction.
Here, the input module 41 is configured to receive a gesture operation input through the touch screen; the input module may receive the input gesture operation through a standard Application Programming Interface (API), that is, the input gesture operation is received through a generalized, standardized platform-based interface;
after receiving the second operation instruction sent by the input module 41, the operation management module 42 calls the application interface for operating the object of the application in each selected object, and executes the operation of the corresponding one or more objects.
Further, the operation management module 42 is specifically configured to execute a delete operation, a move operation, a copy operation, or a cut operation of the selected one or more objects according to the second operation instruction.
Further, the apparatus further comprises: a setting module 43, configured to set a corresponding relationship between the first gesture and the first operation instruction; the corresponding relation is used for analyzing the first gesture operation into a first operation instruction; and/or the corresponding relation between the second gesture and the second operation instruction; the corresponding relation is used for analyzing the second gesture operation into a second operation instruction.
Further, the setting module 43 is further configured to set an operation attribute in the object of the application; the operational attributes include at least one of the following attributes: delete operation attribute, move operation attribute, copy operation attribute, cut operation attribute.
Further, the setting module 43 sets the operation attribute by adding, to the object of the application, an application interface for invoking that object, where the interface is used to operate the object of the application;
operating the object of the application comprises one of: deleting the object of the application, moving the object of the application, copying the object of the application, and cutting the object of the application.
Here, the setting module 43 configures, at the App development stage when an object is created, the application interface through which the touch screen terminal performs operations on that object: when the object of the application is created, the name of a procedure that implements the operation function is added to the attribute of each object that has an operation requirement, where the operation includes at least one of: a delete operation, a move operation, a copy operation, and a cut operation.
Further, the apparatus further includes an output module 44, configured to output operation end information after performing the operation on the selected one or more objects according to the second operation instruction.
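The division of labour among the four modules can be summarised with a small structural sketch; all names are hypothetical, the Gesture and Instruction types come from the earlier illustrative sketch, and the comments merely mirror the responsibilities listed above.

```java
// Structural sketch of the apparatus: four cooperating modules (hypothetical names).
interface InputModule {
    // Receives the two gesture operations, analyzes them, and forwards the instructions.
    void onGestures(Gesture first, Gesture second);
}

interface OperationManagementModule {
    // Selects the application objects and executes the operation on them.
    void execute(Instruction first, Instruction second);
}

interface SettingModule {
    // Configures gesture-to-instruction correspondences and the objects' operation attributes.
    void setCorrespondence(Gesture gesture, Instruction instruction);
}

interface OutputModule {
    // Reports that the requested operation has finished.
    void reportOperationEnd();
}
```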
Those skilled in the art will understand that the functions implemented by the processing modules of the apparatus for processing an object in an application shown in Fig. 4 can be understood with reference to the related description of the above method for processing an object in an application. Those skilled in the art will also understand that the functions of each module unit in the apparatus shown in Fig. 4 can be realized by a program running on a processor, or by a specific logic circuit.
The invention also discloses a touch screen terminal which comprises a processing device of the object in the application shown in the figure 4.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of multiple computing devices. Optionally, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device, and in some cases the steps shown or described may be performed in an order different from that described herein. Alternatively, they may be separately fabricated as individual integrated circuit modules, or multiple of them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present invention are included in the protection scope of the present invention.

Claims (16)

CN201310176096.2A (CN104156146A, Pending) | Priority date 2013-05-13 | Filing date 2013-05-13 | Processing method and device for objects in application and touch screen terminal

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN201310176096.2A (CN104156146A) | 2013-05-13 | 2013-05-13 | Processing method and device for objects in application and touch screen terminal
PCT/CN2013/079607 (WO2013167045A2) | 2013-05-13 | 2013-07-18 | Method and device for processing object in application, and touchscreen terminal

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201310176096.2A (CN104156146A) | 2013-05-13 | 2013-05-13 | Processing method and device for objects in application and touch screen terminal

Publications (1)

Publication Number | Publication Date
CN104156146A (en) | 2014-11-19

Family

ID=49551360

Family Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201310176096.2A (CN104156146A, Pending) | 2013-05-13 | 2013-05-13 | Processing method and device for objects in application and touch screen terminal

Country Status (2)

Country | Link
CN (1) | CN104156146A (en)
WO (1) | WO2013167045A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN106055232A (en)* | 2016-05-25 | 2016-10-26 | 维沃移动通信有限公司 | Message processing method and mobile terminal
WO2021003730A1 (en)* | 2019-07-11 | 2021-01-14 | 深圳市柔宇科技有限公司 | Touch device and input control method therefor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
KR20100034218A (en)* | 2008-09-23 | 2010-04-01 | 엘지전자 주식회사 | Apparatus and method for controlling icon display of touch screen
CN102750105A (en)* | 2012-06-29 | 2012-10-24 | 宇龙计算机通信科技(深圳)有限公司 | Terminal and managing method of touch-control track
CN102760006A (en)* | 2012-03-26 | 2012-10-31 | 联想(北京)有限公司 | Method and device for determining operands
CN102955668A (en)* | 2011-08-29 | 2013-03-06 | 联想(北京)有限公司 | Method for selecting objects and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101593060B (en)* | 2009-07-06 | 2012-10-03 | 友达光电股份有限公司 | Touch operation method and operation method of electronic device
CN102262471A (en)* | 2010-05-31 | 2011-11-30 | 广东国笔科技股份有限公司 | Touch intelligent induction system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
KR20100034218A (en)* | 2008-09-23 | 2010-04-01 | 엘지전자 주식회사 | Apparatus and method for controlling icon display of touch screen
CN102955668A (en)* | 2011-08-29 | 2013-03-06 | 联想(北京)有限公司 | Method for selecting objects and electronic equipment
CN102760006A (en)* | 2012-03-26 | 2012-10-31 | 联想(北京)有限公司 | Method and device for determining operands
CN102750105A (en)* | 2012-06-29 | 2012-10-24 | 宇龙计算机通信科技(深圳)有限公司 | Terminal and managing method of touch-control track

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN106055232A (en)* | 2016-05-25 | 2016-10-26 | 维沃移动通信有限公司 | Message processing method and mobile terminal
CN106055232B (en)* | 2016-05-25 | 2019-06-07 | 维沃移动通信有限公司 | A kind of processing method and mobile terminal of message
WO2021003730A1 (en)* | 2019-07-11 | 2021-01-14 | 深圳市柔宇科技有限公司 | Touch device and input control method therefor
CN113366423A (en)* | 2019-07-11 | 2021-09-07 | 深圳市柔宇科技股份有限公司 | Touch control equipment and input control method thereof

Also Published As

Publication number | Publication date
WO2013167045A3 (en) | 2014-04-10
WO2013167045A2 (en) | 2013-11-14

Similar Documents

Publication | Title
US11301126B2 (en)Icon control method and terminal
CN102654814B (en)Method and device for calling functions in application as well as electronic equipment
CN103777947B (en)The management method at the main interface of a kind of mobile terminal and device
US20150106766A1 (en)Method for quickly operating file of smart phone and smart phone
CN107066188B (en)A kind of method and terminal sending screenshot picture
CN102968277A (en)Method and device for deleting or cutting file based on touch screen
US9423927B2 (en)Managing user interface elements using gestures
CN105320544A (en) Application program uninstall method and device
CN104158972A (en)Method for calling third-party application in conversation process and user terminal
KR20140097810A (en)Method for controlling layout and an electronic device thereof
WO2016188261A1 (en)Method and apparatus for switching multiple folders, and computer storage medium
CN106528156B (en)A kind of page data processing method and device
WO2016127577A1 (en)Event processing method and apparatus
CN104731439A (en)Gesture packaging and task executing method and device
CN102855064B (en)A kind of method of functional control help document of quick display application program
CN103279276B (en)The method of locating information and device
CN113485853A (en)Information interaction method and device and electronic equipment
CN104580704B (en)Method and device for viewing details of short messages
CN104156146A (en)Processing method and device for objects in application and touch screen terminal
CN103218161B (en)Mobile terminal operation method and system based on multi-point touch
CN106066874B (en) Object handling methods and terminals
CN109213980B (en)Method and device for editing presentation file and computer readable storage medium
CN104808895A (en)Icon arranging method
CN103473030A (en)Method and device for processing instructions and terminal equipment
CN106155473A (en)A kind of terminal applies processing method and device thereof

Legal Events

Code | Title
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
WD01 | Invention patent application deemed withdrawn after publication

Application publication date: 2014-11-19

