CN103324329B - Touch control method and device - Google Patents

Touch control method and device

Info

Publication number
CN103324329B
CN103324329B
Authority
CN
China
Prior art keywords
area
control
control command
object displayed
gesture input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210080723.8A
Other languages
Chinese (zh)
Other versions
CN103324329A (en)
Inventor
田艳军
高峰
程雪涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201210080723.8A
Publication of CN103324329A
Application granted
Publication of CN103324329B
Status: Active
Anticipated expiration

Abstract

The present invention relates to the field of touch control, and in particular to a touch control method and device. The method is applied to an electronic device having a touch screen and includes: dividing the touch screen into a first area and a second area, where the first area has at least a display function and the second area has at least a touch function; detecting a gesture input by the user in the second area; and executing a control operation on the object displayed in the first area according to the gesture input. In the method provided by the embodiments of the present invention, because the touch display screen is divided into a first area and a second area, with the first area used to display images and the user performing gesture operations in the second area, the user's line of sight is not blocked and viewing of the image in the first area is not hindered. Moreover, the corresponding touch control is realized by detecting the user's gesture input, which is convenient and fast and improves the user experience.

Description

Touch control method and device
Technical Field
The present invention relates to the field of touch control, and in particular, to a touch control method and apparatus.
Background
When using electronic devices such as mobile phones, cameras, and PADs, it is often necessary to perform certain controls with keys, such as zooming by pressing keys or enlarging and reducing files and pictures with direction keys. Taking zoom as an example: when a camera, or an electronic device with a camera module, is used, a zoom operation on the subject is often required. Currently, on an electronic device with a camera module, the object to be photographed is usually previewed in a touch finder frame while zooming is implemented with a key; when the user needs to zoom, pressing the corresponding function key zooms the focal length in or out. However, key-based zooming generally covers a limited zoom range, and because one press of a function key zooms by a fixed step, the operation is coarse; when the user wants to jump between zoom levels, the function key must be pressed many times to reach the desired setting, which is neither fast nor convenient. The prior art also includes displaying an indicator bar in the touch finder frame for zooming, but with this approach the finger occupies a large area during operation and often blocks the user's preview, so the user experience is poor.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present invention provide a touch control method that implements touch control quickly and conveniently, does not block the user's line of sight, and improves the user experience. The technical scheme is as follows:
in one aspect, an embodiment of the present invention provides a touch control method applied to an electronic device, where the electronic device has a touch screen, and the method includes:
dividing the touch screen into a first area and a second area; the first area at least has a display function, and the second area at least has a touch function;
detecting gesture input of a user in the second area;
and executing control operation on the object displayed in the first area according to the gesture input.
Preferably, the performing of the control operation on the object displayed in the first area according to the gesture input includes:
acquiring a control instruction corresponding to the gesture input according to a preset corresponding relation between the gesture input and the control instruction;
and executing control operation on the object displayed in the first area according to the control instruction.
Preferably, the method further comprises:
detecting the relative positions of the first area and the second area to obtain a detection result;
the performing of the control operation on the object displayed in the first area according to the gesture input is:
and executing control operation on the object displayed in the first area according to the detection result and the gesture input.
Preferably, the performing of the control operation on the object displayed in the first area according to the detection result and the gesture input is to:
when the detection result shows that the second area is located on the right side or below the first area, acquiring a first control command according to the acquired gesture input corresponding to the first direction, and executing control operation on an object displayed in the first area according to the first control command;
when the detection result shows that the second area is positioned on the left side or the upper side of the first area, acquiring a second control command according to the acquired gesture input corresponding to the second direction, and executing control operation on the object displayed in the first area according to the second control command;
wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
Preferably, the performing of the control operation on the object displayed in the first area according to the detection result and the gesture input is to:
when the detection result shows that the second area is positioned on the right side or below the first area, acquiring a positive control command corresponding to the gesture input according to a preset corresponding relation between the input gesture and the control command; executing control operation on the object displayed in the first area according to the positive control command;
when the detection result shows that the second area is positioned on the left side or the upper side of the first area, acquiring a control command opposite to the control command corresponding to the gesture input according to a preset corresponding relation between the input gesture and the control command; and executing control operation on the object displayed in the first area according to the control command opposite to the control command corresponding to the gesture input.
Preferably, the performing of the control operation on the object displayed in the first area according to the gesture input is:
acquiring a sliding distance corresponding to the gesture input;
and acquiring a change coefficient according to the sliding distance, and executing control operation on the object displayed in the first area according to the acquired change coefficient.
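The claim above leaves the mapping from sliding distance to change coefficient unspecified. A minimal sketch of one possible mapping follows; the function name, the linear scheme, and the default screen height and maximum factor are all assumptions for illustration, not part of the patent:

```python
def change_coefficient(slide_distance_px, screen_height_px=480, max_factor=4.0):
    """Map a gesture's sliding distance to a scaling coefficient.

    Hypothetical linear scheme: no movement yields 1.0 (no change) and a
    full-screen-height swipe yields max_factor; intermediate distances
    interpolate linearly. The patent text does not fix this formula.
    """
    fraction = min(abs(slide_distance_px) / screen_height_px, 1.0)
    return 1.0 + fraction * (max_factor - 1.0)
```

A half-screen swipe under these defaults would give a coefficient of 2.5, which could then drive the zoom, scale, or rotation applied to the object in the first area.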
Preferably, the performing of the control operation on the object displayed in the first area includes any one of:
performing a zoom operation on an object displayed in the first area;
performing an enlarging or reducing operation on the object displayed in the first area;
performing page turning operation on the object displayed in the first area;
and executing a rotation operation on the object displayed in the first area.
On the other hand, the embodiment of the invention also discloses a touch control device, which is applied to electronic equipment, wherein the electronic equipment is provided with a touch screen, and the device comprises:
the touch screen comprises a dividing unit, a processing unit and a display unit, wherein the dividing unit is used for dividing the touch screen into a first area and a second area; the first area at least has a display function, and the second area at least has a touch function;
the first detection unit is used for detecting gesture input of a user in the second area;
and the control unit is used for executing control operation on the object displayed in the first area according to the gesture input.
Preferably, the control unit is:
the first control unit is used for acquiring a control instruction corresponding to the gesture input according to the corresponding relation between the preset gesture input and the control instruction; and executing control operation on the object displayed in the first area according to the control instruction.
Preferably, the apparatus further comprises:
the second detection unit is used for detecting the relative positions of the first area and the second area and acquiring a detection result;
the control unit is as follows:
and the second control unit is used for executing control operation on the object displayed in the first area according to the detection result and the gesture input.
Preferably, the second control unit includes:
the first control subunit is used for acquiring a first control command according to the acquired gesture input corresponding to the first direction when the detection result shows that the second area is positioned on the right side or below the first area, and executing control operation on an object displayed in the first area according to the first control command;
the second control subunit is used for acquiring a second control command according to the acquired gesture input corresponding to the second direction when the detection result shows that the second area is positioned on the left side or the upper side of the first area, and executing control operation on the object displayed in the first area according to the second control command;
wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
Preferably, the second control unit includes:
the third control subunit is used for acquiring a positive control command corresponding to the gesture input according to a preset corresponding relation between the input gesture and the control command when the detection result shows that the second area is positioned on the right side or below the first area; executing control operation on the object displayed in the first area according to the positive control command;
the fourth control subunit is configured to, when the detection result indicates that the second area is located on the left side or above the first area, obtain, according to a preset correspondence between an input gesture and a control command, a control command opposite to the control command corresponding to the gesture input; and executing control operation on the object displayed in the first area according to the control command opposite to the control command corresponding to the gesture input.
Preferably, the control unit is:
the third control unit is used for acquiring a sliding distance corresponding to the gesture input; and acquiring a change coefficient according to the sliding distance, and executing control operation on the object displayed in the first area according to the acquired change coefficient.
Preferably, the control unit comprises any one of the following:
a fourth control unit configured to perform a zoom operation on the object displayed in the first area;
a fifth control unit configured to perform an enlargement or reduction operation on the object displayed in the first area;
a sixth control unit, configured to perform a page turning operation on an object displayed in the first region;
a seventh control unit configured to perform a rotation operation on the object displayed in the first area.
The beneficial effects of the embodiments of the invention are as follows. The embodiments provide a touch control method in which a touch display screen is divided into a first area and a second area, the user's gesture input is detected in the second area, and a control operation is executed on the object displayed in the first area according to that gesture input. Because the screen is divided in this way, with the first area used for displaying images and gesture operations performed in the second area, the user's line of sight is not blocked and viewing of the images in the first area is not hindered. Moreover, the corresponding touch control is realized by detecting the user's gesture input, which is convenient and fast and improves the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments described in the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flowchart illustrating a touch control method according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating a touch control method according to a second embodiment of the present invention;
FIG. 3 is a flowchart illustrating a touch control method according to a third embodiment of the present invention;
fig. 4 is a schematic diagram of a touch control device according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a touch control method, which can quickly and conveniently realize touch control, does not shield the sight of a user and improves the experience of the user.
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiment of the present invention will be clearly and completely described below with reference to the drawings in the embodiment of the present invention, and it is obvious that the described embodiment is only a part of the embodiment of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart of a touch control method according to a first embodiment of the present invention is shown.
The method provided by the embodiment of the invention is applied to the electronic equipment with the touch screen, and the electronic equipment can be a mobile phone, a camera, a PAD and the like.
S101, dividing the touch screen into a first area and a second area; the first area at least has a display function, and the second area at least has a touch function.
In the embodiment of the invention, the electronic device has a touch screen, and that touch screen is divided into a first area and a second area. The first area has at least a display function; that is, the first area is an image display area for displaying and previewing information such as graphics, images, and text. For example, when taking a picture, the first area displays and previews the image in the finder frame; as another example, the first area may display a photograph, an electronic book, a video file, and the like. The second area has at least a touch function and can accept the user's touch gesture input; generally, the second area is a non-image display area. The touch screen is thus divided into an image display area and a non-image display area: the user views and previews images in the image display area and performs touch operations in the non-image display area, so the user's line of sight is not blocked during operation, which is convenient and quick.
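The division in step S101 can be sketched as follows. This is only an illustration: the `Region` structure, the landscape split, and the 20% width given to the second (touch) area are assumptions, since the patent does not prescribe any particular geometry:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned rectangle on the touch screen, in pixels."""
    x: int
    y: int
    width: int
    height: int

    def contains(self, px, py):
        return (self.x <= px < self.x + self.width and
                self.y <= py < self.y + self.height)

def divide_screen(width, height, second_area_ratio=0.2):
    """Split a landscape screen into a display (first) area and a
    touch-input (second) area on the right; the ratio is assumed."""
    split = int(width * (1 - second_area_ratio))
    first = Region(0, 0, split, height)          # image display area
    second = Region(split, 0, width - split, height)  # touch area
    return first, second
```

With an 800x480 screen, this yields a 640-pixel-wide display area and a 160-pixel-wide touch strip on the right; touch events can then be routed by testing which region contains their coordinates.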
It will be appreciated by those skilled in the art that the touch screen may also be arranged such that the first region is a main image display region and the second region is a sub image region. The sub image display region may have a display function in addition to the touch function, but the displayed object is different from the object displayed in the first region. Specifically, the user can preview, view, etc. a through image, a photograph, an electronic book, a video, etc. in the main image display area; in the sub-image display area, operation icons, prompt information, and the like for controlling objects in the main image display area may be displayed.
And S102, detecting the gesture input of the user in the second area.
As mentioned above, the second area has at least a touch function and can accept gesture input of the user. And the user performs control operation on the object displayed in the first area through gesture input in the second area.
S103, executing control operation on the object displayed in the first area according to the gesture input.
And after the gesture input of the user is acquired, executing control operation on the object displayed in the first area according to the gesture input. Specifically, step S103 may be implemented by the following steps:
S103A, acquiring a control instruction corresponding to the gesture input according to the corresponding relation between the preset gesture input and the control instruction.
The corresponding relation between the gesture input and the control instruction can be preset, and different gesture inputs correspond to different control instructions. For example, when the gesture input of the user is detected to be sliding from top to bottom, a zoom-out operation on the object displayed in the first area is correspondingly performed; for another example, when the gesture input of the user is detected to be sliding from bottom to top, the corresponding operation is to perform a zoom-in operation on the object displayed in the first area. The corresponding relation between the gesture input and the control instruction can be preset by a system or can be customized by a user.
Further, in different application scenarios, the obtained corresponding relationship between the gesture input and the control instruction may be different. That is to say, the system may determine the application scene where the electronic device is located, and then further obtain the corresponding relationship between the gesture input and the control instruction, so as to perform the control operation on the object displayed in the first area. For example, when it is detected that the electronic device is in a photographing mode and then it is detected that the gesture input of the user is sliding from top to bottom, the obtained control instruction may be to perform a focus reduction operation; when it is detected that the electronic device is in an electronic book watching mode and the gesture input of the user is sliding from top to bottom, the obtained control instruction may be to perform a page turning operation. That is, when the electronic device is in different application scenarios, the control commands obtained by executing the same gesture input may be different.
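The scene-dependent correspondence described above can be sketched as a lookup table keyed by application mode. The mode names, gesture names, and command names below are hypothetical; the patent only specifies that the same gesture may yield different commands in different scenes:

```python
# Hypothetical preset correspondence between gesture inputs and control
# instructions, keyed by the application scene of the electronic device.
GESTURE_COMMANDS = {
    "camera": {"slide_down": "zoom_out",     "slide_up": "zoom_in"},
    "ebook":  {"slide_down": "page_forward", "slide_up": "page_back"},
}

def command_for(mode, gesture):
    """Look up the control instruction for a gesture in the current scene;
    returns None when no correspondence is preset."""
    return GESTURE_COMMANDS.get(mode, {}).get(gesture)
```

The same top-to-bottom slide thus maps to a zoom-out in photographing mode but a page turn in e-book mode, matching the example in the text; a user-customized correspondence would simply replace entries in the table.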
And S103B, executing control operation on the object displayed in the first area according to the control instruction.
Specifically, step S103B may include any one of the following cases:
performing a zoom operation on an object displayed in the first area;
performing an enlarging or reducing operation on the object displayed in the first area;
performing page turning operation on the object displayed in the first area;
and executing a rotation operation on the object displayed in the first area.
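The four cases of step S103B amount to dispatching a control instruction to an operation on the displayed object. A minimal sketch, assuming the displayed object exposes handler methods with the names used here (the patent does not define such an interface):

```python
def apply_command(displayed_object, command):
    """Dispatch one of the example control operations of step S103B to the
    object shown in the first area. Handler names are assumptions."""
    handlers = {
        "zoom": displayed_object.zoom,           # zoom operation
        "scale": displayed_object.scale,         # enlarge/reduce operation
        "page_turn": displayed_object.page_turn, # page turning operation
        "rotate": displayed_object.rotate,       # rotation operation
    }
    handler = handlers.get(command)
    if handler is None:
        raise ValueError(f"unknown control command: {command}")
    handler()
```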
The above is only a preferred embodiment of the present invention, and the present invention is not limited to the manner of executing the control operation, and other implementations obtained by those skilled in the art without inventive labor are within the scope of the present invention.
In the method provided by the first embodiment of the present invention, the touch screen of the electronic device is divided into the first area and the second area, the first area is used for displaying an image, and the second area is used for receiving a touch gesture input. And moreover, the control operation is realized by using touch input, and the method is convenient and quick.
Referring to fig. 2, a flowchart of a touch control method according to a second embodiment of the present invention is shown.
S201, dividing the touch screen into a first area and a second area.
In the embodiment of the invention, the electronic equipment is provided with the touch screen, and the touch screen of the electronic equipment is divided into a first area and a second area. The first area at least has a display function, that is, the first area is an image display area for displaying and previewing information such as graphics, images and characters. The second area at least has a touch function and can accept touch gesture input of a user.
S202, detecting gesture input of the user in the second area.
As mentioned above, the second area has at least a touch function and can accept gesture input of the user. And the user performs control operation on the object displayed in the first area through gesture input in the second area.
Specifically, detecting the user's gesture input in the second area includes: obtaining the touch points corresponding to the user's gesture input, and obtaining from those touch points one or more of the direction, the sliding distance, the position coordinates of the starting point (the first touch point of the gesture input), the position coordinates of the ending point (the last touch point of the gesture input), and the like.
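Deriving these quantities from a gesture's ordered touch points can be sketched as follows. The function name, the dominant-axis rule for choosing a direction, and the convention that screen y grows downward are assumptions for illustration:

```python
import math

def analyse_gesture(points):
    """From the ordered touch points of one gesture, derive the start and
    end coordinates, the sliding distance, and a coarse direction."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)
    if abs(dx) >= abs(dy):                       # horizontal axis dominates
        direction = "right" if dx >= 0 else "left"
    else:                                        # screen y grows downward
        direction = "down" if dy >= 0 else "up"
    return {"start": (x0, y0), "end": (x1, y1),
            "distance": distance, "direction": direction}
```

The returned direction and distance are exactly the inputs the later steps need: direction for selecting the control instruction, distance for deriving a change coefficient.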
S203, detecting the relative position of the first area and the second area, and acquiring a detection result.
The relative position of the first area and the second area is, specifically, whether the second area is located to the right of, to the left of, above, or below the first area. In the second embodiment of the invention, the user's habits and the screen orientation can be judged by detecting this relative position relationship, so that the corresponding control instruction can be adjusted adaptively. For example, when using the electronic device in photographing mode, a right-handed user is typically accustomed to holding the device horizontally with the second area on the right side of the first area, so that touch operations in the second area can conveniently be performed with the right hand; a left-handed user is typically accustomed to holding the device with the second area on the left side of the first area, for convenient operation with the left hand. So that the same touch gesture (for example, sliding from left to right from the user's perspective) achieves the same effect regardless of these habits and of the screen orientation, the relative position relationship between the first area and the second area is detected, and the control operation is generally executed according to the detection result.
It will be understood by those skilled in the art that the order of step S202 and step S203 may be reversed or performed simultaneously, and the present invention is not limited thereto.
And S204, executing control operation on the object displayed in the first area according to the detection result and the gesture input.
In order to enable the user to achieve the same operation effect in different screen directions (the relative positions of the first area and the second area are different) when performing the same gesture input (the same from the perspective of the user, for example, sliding from left to right, sliding from top to bottom, etc.), the second embodiment of the present invention can adaptively adjust the control operation by detecting the relative position relationship of the first area and the second area.
Specifically, when different relative position relationships between the first area and the second area are detected, different correspondence rules between gesture inputs and control instructions may be set in order to adjust the control instruction. A specific example follows; those skilled in the art will understand that other correspondence rules between gesture inputs and control commands can be set according to actual needs. When the second area is detected to be on the right side of the first area (which may be called "right-hand mode"), a left-to-right sliding operation by the user (from the user's perspective), that is, a gesture sliding from the first area toward the second area, may be set to correspond to a focal-length increase operation. When the second area is detected to be on the left side of the first area (which may be called "left-hand mode"), the user's left-to-right sliding operation, which now slides from the second area toward the first area, likewise corresponds to a focal-length increase operation. Of course, the embodiment of the present invention is not limited to this example: when the electronic device is placed horizontally, the control command corresponding to a top-to-bottom slide may also be set to a focal-length decrease operation and a bottom-to-top slide to a focal-length increase operation; likewise, the opposite arrangement is possible. Other implementations obtained by persons skilled in the art without inventive effort are within the scope of the present invention.
Optionally, another possible adaptive adjustment method is as follows: and when the relative position relation between the first area and the second area is detected to accord with a set condition, acquiring a control instruction opposite to the control instruction according to the corresponding relation between the preset gesture input and the control instruction. That is, the control instructions executed may be opposite when the screen directions are different.
The two implementations described above are explained in detail below.
Specifically, step S204 can be specifically implemented by the following steps:
when the detection result shows that the second area is located on the right side or below the first area, acquiring a first control command according to the acquired gesture input corresponding to the first direction, and executing control operation on an object displayed in the first area according to the first control command; when the detection result shows that the second area is positioned on the left side or the upper side of the first area, acquiring a second control command according to the acquired gesture input corresponding to the second direction, and executing control operation on the object displayed in the first area according to the second control command; wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
The examples are given by way of illustration and are not to be construed as limiting the invention. We can preset the following correspondence: (1) when the second area is positioned on the right side or below the first area, corresponding to the focal length amplifying operation when the direction of detecting the gesture input slides from the direction of the first area to the direction of the second area; and when the direction of detecting the gesture input slides from the second area direction to the first area direction, zooming out operation is performed correspondingly. (2) When the second area is positioned on the left side or the upper side of the first area, corresponding to the focal length reduction operation when the direction of detecting the gesture input slides from the direction of the first area to the direction of the second area; and when the direction of detecting the gesture input slides from the direction of the second area to the direction of the first area, zooming in operation is performed correspondingly.
Taking photographing as an example: when a right-handed user uses the electronic device in photographing mode, the common practice is to hold the device horizontally with the second area on the right side of the first area; the user's left-to-right sliding operation (from the user's perspective) is referred to as the first gesture input. The system detects that the first direction corresponding to the first gesture input runs from the first area toward the second area; according to the preset correspondence between gesture inputs and control instructions, the obtained control instruction is a focal-length increase, and a zoom operation, specifically a focal-length increase, is executed on the image displayed in the first area. A left-handed user typically holds the device with the second area on the left side of the first area; when this user also slides from left to right (from the user's perspective), this is referred to as the second gesture input. The system detects that the second direction corresponding to the second gesture input runs from the second area toward the first area; according to the preset correspondence, the obtained control instruction is again a focal-length increase, which is executed on the image displayed in the first area.
As can be seen from the above example, the user performs the gesture input under different operation habits (corresponding to different relative positions of the first area and the second area), although the first direction and the second direction are opposite, the first control command and the second control command are the same, and the same control effect is achieved. Therefore, the method provided by the embodiment of the invention can self-adaptively adjust the corresponding control instruction by detecting the relative position relationship between the first area and the second area, so that the same gesture input in the angle of the user can achieve the same control effect.
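This first implementation of step S204, in which opposite on-screen directions yield the same command depending on where the second area sits, can be sketched as follows. The layout and command names are hypothetical labels for the cases described in the text:

```python
def zoom_command(layout, slide):
    """First implementation of S204: choose a zoom command from the
    detected layout and the gesture's on-screen direction.

    layout: "second_right", "second_below", "second_left", "second_above"
    slide:  "toward_second" (first area -> second area) or "toward_first"
    """
    if layout in ("second_right", "second_below"):
        # Rule (1): sliding toward the second area increases focal length.
        return "zoom_in" if slide == "toward_second" else "zoom_out"
    if layout in ("second_left", "second_above"):
        # Rule (2): the on-screen direction is reversed, command preserved.
        return "zoom_out" if slide == "toward_second" else "zoom_in"
    raise ValueError(f"unknown layout: {layout}")
```

A user's left-to-right swipe is "toward_second" in right-hand mode but "toward_first" in left-hand mode, and both branches return the same command, which is exactly the adaptive behaviour the embodiment describes.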
Another possible implementation manner of step S204 is:
when the detection result indicates that the second area is located to the right of or below the first area, a positive control command corresponding to the gesture input is acquired according to the preset correspondence between gesture inputs and control commands, and a control operation is performed on the object displayed in the first area according to that positive control command; when the detection result indicates that the second area is located to the left of or above the first area, the control command opposite to the one corresponding to the gesture input is acquired according to the preset correspondence between gesture inputs and control commands, and a control operation is performed on the object displayed in the first area according to that opposite control command.
The correspondence between gesture inputs and control commands is preset. When the second area is detected to be to the right of or below the first area, the command given by the preset correspondence is executed as-is, i.e., a positive control command; when the second area is detected to be to the left of or above the first area, the opposite command is executed, i.e., an inverse control command. In this way, users with different usage habits can perform the same gesture input from their own perspective and achieve the same control effect.
For example, the following correspondence may be preset: (1) when the detected gesture input slides from the first area toward the second area, it corresponds to a focal-length zoom-in operation; (2) when the detected gesture input slides from the second area toward the first area, it corresponds to a focal-length zoom-out operation.
Taking photographing as an example: a right-handed user typically holds the electronic device horizontally with the second area located to the right of the first area. When this user performs a left-to-right sliding operation (from the user's point of view), it is referred to as a first gesture input. The system detects that the first direction corresponding to the first gesture input runs from the first area toward the second area; according to the preset correspondence between gesture inputs and control commands, the acquired command is a positive control command, namely a focal-length zoom-in operation, which is performed on the object displayed in the first area. A left-handed user typically holds the device horizontally with the second area located to the left of the first area. When this user likewise slides from left to right (from the user's point of view), it is referred to as a second gesture input. The system detects that the second direction corresponding to the second gesture input runs from the second area toward the first area; according to the preset correspondence, the command corresponding to this gesture input would be a focal-length zoom-out operation, so the opposite command, namely a focal-length zoom-in operation, is acquired and performed on the object displayed in the first area.
As can be seen from the above example, the method provided by the embodiment of the present invention can adaptively adjust the corresponding control instruction by detecting the relative position relationship between the first region and the second region, so that the same gesture input in the angle of the user can achieve the same control effect.
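The inverse-command variant can be sketched in the same style; the names and labels below are again illustrative assumptions rather than the patented implementation:

```python
# Sketch of the second implementation: a single fixed correspondence is
# preset, and when the second area is detected to the left of or above the
# first area, the opposite (inverse) control command is executed instead.

def adjusted_command(second_area_side, relative_direction):
    preset = {"first_to_second": "zoom_in", "second_to_first": "zoom_out"}
    command = preset[relative_direction]
    if second_area_side in ("left", "above"):       # "left-hand mode"
        command = {"zoom_in": "zoom_out", "zoom_out": "zoom_in"}[command]
    return command

# Both usage habits end up with the same effect for a left-to-right swipe:
assert adjusted_command("right", "first_to_second") == "zoom_in"  # positive command
assert adjusted_command("left", "second_to_first") == "zoom_in"   # inverse of zoom_out
```

Compared with the first implementation, the preset table is fixed and the adaptation happens as a post-hoc inversion of the command.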
In the method provided by the second embodiment of the present invention, users with different usage habits can perform the same gesture input, and the control command is adaptively adjusted so that the same control effect is obtained. Specifically, the relative position relationship between the first area and the second area is detected, and the control operation on the object displayed in the first area is performed either through different correspondences between gesture inputs and control commands, or by executing the opposite control command. The second embodiment of the invention can thus detect whether the electronic device should operate in left-hand mode or right-hand mode according to the user's usage habits and adaptively adjust the control operation, giving the user a better experience.
In the embodiments of the invention, the control operation performed on the object displayed in the first area according to the acquired gesture input may differ depending on the application scene of the electronic device. For example, when the electronic device is in a photographing mode, a zoom operation may be performed on the object displayed in the first area; when it is in an electronic-book browsing mode, a page-turning operation may be performed on the object displayed in the first area; when it is in a picture-browsing mode, operations such as zooming in, zooming out, and rotating may be performed on the object displayed in the first area; and when it is in a video-watching mode, fast-forward and rewind operations may be performed on the object displayed in the first area.
The method provided by the present invention is described in detail below, taking as an example the case where the electronic device is in the photographing mode and a zoom operation is performed on the object displayed in the first area.
Referring to fig. 3, a flowchart of a touch control method according to a third embodiment of the present invention is shown.
S301, the touch screen is divided into a first area and a second area.
In the embodiment of the invention, the electronic equipment is provided with the touch screen, and the touch screen of the electronic equipment is divided into a first area and a second area. The first area at least has a display function, that is, the first area is an image display area for displaying and previewing information such as graphics, images and characters. The second area at least has a touch function and can accept touch gesture input of a user.
S302, detecting the relative position of the first area and the second area, and acquiring a detection result.
As with the second embodiment, when the second region is located to the right or below the first region, we may refer to it as a "right-hand mode"; when the second region is located to the left or above the first region, we may refer to it as a "left-hand mode".
S303, detecting the gesture input of the user in the second area, and acquiring a plurality of touch points corresponding to the gesture input.
Position information of each touch point is acquired, and the position information can be coordinates of the touch point.
S304, acquiring a first direction corresponding to the gesture input according to the position information of the touch points.
S305, acquiring a first control instruction according to the first direction and the detection result.
Specifically, the system may preset a correspondence between gesture inputs and control commands. For example: (1) when the second area is located to the right of or below the first area ("right-hand mode"), a gesture sliding from the first area toward the second area corresponds to a focal-length zoom-in operation, and a gesture sliding from the second area toward the first area corresponds to a focal-length zoom-out operation; (2) when the second area is located to the left of or above the first area, a gesture sliding from the first area toward the second area corresponds to a focal-length zoom-out operation, and a gesture sliding from the second area toward the first area corresponds to a focal-length zoom-in operation.
A control command is then acquired that corresponds both to the detected relative position relationship between the first area and the second area and to the direction corresponding to the gesture input.
S306, obtaining the sliding distance of the gesture input according to the positions of the touch points, and obtaining a change coefficient according to the sliding distance.
The sliding distance of the gesture input is acquired according to the position information of the first touch point and the last touch point. In this embodiment of the invention, the change coefficient is the increment of the sliding distance, i.e., the distance increment ΔL of a slide from position L1 to position LX.
S307, performing a zoom operation on the object in the first area according to the change coefficient and the first control command.
Assuming that the length of the first area is L, the maximum corresponding focal-length range is Fa, and the current focal length of the electronic device is F1, then sliding from an arbitrary position L1 to LX yields the changed focal length FX:

FX = F1 + (LX - L1) * Fa / L    (1)
The transformed focal length may be obtained according to equation (1).
Alternatively, the zoom operation may be performed by calculating ΔF, where:

ΔF = ΔL * Fa / L    (2)
In this way, the electronic device can be controlled to perform the zoom operation by increasing or decreasing the focal length in steps of ΔF according to the first control command.
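Equations (1) and (2) can be checked with a short numerical sketch. The variable names follow the text (F1 the current focal length, Fa the maximum focal-length range, L the length of the first area); the concrete numbers are arbitrary assumptions chosen for illustration:

```python
def zoomed_focal_length(f1, l1, lx, fa, l):
    # Equation (1): FX = F1 + (LX - L1) * Fa / L
    return f1 + (lx - l1) * fa / l

def focal_step(delta_l, fa, l):
    # Equation (2): dF = dL * Fa / L
    return delta_l * fa / l

# Sliding half the length of the first area sweeps half of the focal range,
# and the two formulations agree: F1 + dF == FX.
f1, l1, lx, fa, l = 1.0, 0.0, 50.0, 4.0, 100.0
assert zoomed_focal_length(f1, l1, lx, fa, l) == f1 + focal_step(lx - l1, fa, l) == 3.0
```

Because FX depends linearly on LX, the zoom is stepless: any sliding distance, however small, maps to a proportional focal-length change.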
In the method provided by the third embodiment of the present invention, the stepless zooming can be quickly realized according to the gesture input of the user, which is convenient and fast, and the user performs the operation in the second area without blocking the display image of the first area, so that better experience can be obtained.
In the fourth embodiment of the present invention, when the user is in the state of browsing the electronic book, the change coefficient may be obtained according to the direction and distance corresponding to the gesture input of the user, and the page turning operation is executed. For example, sliding from top to bottom performs a page backward operation, and sliding from bottom to top performs a page forward operation.
In the fifth embodiment of the present invention, when the user is in a picture-browsing state, zoom-in and zoom-out operations may likewise be performed according to the direction and distance corresponding to the user's gesture input and the obtained change coefficient. For example, sliding from top to bottom performs a zoom-out operation, and sliding from bottom to top performs a zoom-in operation. Specifically, the distance corresponding to the gesture input may be used to obtain the change coefficient and thereby determine the zoom scale.
In another embodiment of the present invention, a rotation operation, a video fast forward operation, a reverse operation, and the like may be performed on the display object in the first region through a gesture input in the second region.
It will be understood by those skilled in the art that the foregoing are only preferred embodiments of the present invention, and that other embodiments obtained by those skilled in the art without inventive effort also fall within the scope of the present invention.
Fig. 4 is a schematic view of a touch control device according to an embodiment of the present invention.
A touch control device is applied to an electronic device, the electronic device is provided with a touch screen, and the device comprises:
a dividing unit 401, configured to divide the touch screen into a first area and a second area; the first area at least has a display function, and the second area at least has a touch function.
A first detecting unit 402, configured to detect a gesture input of the user in the second area.
A control unit 403, configured to perform a control operation on the object displayed in the first area according to the gesture input.
Preferably, the control unit is a first control unit, and is configured to obtain a control instruction corresponding to the gesture input according to a preset correspondence between the gesture input and the control instruction; and executing control operation on the object displayed in the first area according to the control instruction.
Preferably, the apparatus further comprises:
the second detection unit is used for detecting the relative positions of the first area and the second area and acquiring a detection result;
at this time, the control unit is a second control unit, and is configured to perform a control operation on the object displayed in the first area according to the detection result and the gesture input.
Preferably, the second control unit includes:
and the first control subunit is used for acquiring a first control command according to the acquired gesture input corresponding to the first direction when the detection result shows that the second area is positioned on the right side or below the first area, and executing control operation on the object displayed in the first area according to the first control command.
And the second control subunit is used for acquiring a second control command according to the acquired gesture input corresponding to the second direction when the detection result shows that the second area is positioned on the left side or the upper side of the first area, and executing control operation on the object displayed in the first area according to the second control command.
Wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
Preferably, the second control unit includes:
the third control subunit is used for acquiring a positive control command corresponding to the gesture input according to a preset corresponding relation between the input gesture and the control command when the detection result shows that the second area is positioned on the right side or below the first area; and executing control operation on the object displayed in the first area according to the positive control command.
The fourth control subunit is configured to, when the detection result indicates that the second area is located on the left side or above the first area, obtain, according to a preset correspondence between an input gesture and a control command, a control command opposite to the control command corresponding to the gesture input; and executing control operation on the object displayed in the first area according to the control command opposite to the control command corresponding to the gesture input.
Preferably, the control unit is:
the third control unit is used for acquiring a sliding distance corresponding to the gesture input; and acquiring a change coefficient according to the sliding distance, and executing control operation on the object displayed in the first area according to the acquired change coefficient.
Preferably, the control unit comprises any one of the following:
a fourth control unit configured to perform a zoom operation on the object displayed in the first area.
A fifth control unit configured to perform an enlargement or reduction operation on the object displayed in the first area.
And the sixth control unit is used for executing page turning operation on the object displayed in the first area.
A seventh control unit configured to perform a rotation operation on the object displayed in the first area.
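The unit structure above can be illustrated with a hypothetical object sketch; the class name, method names, and the one-dimensional region model are assumptions made for illustration only, not the patented device:

```python
# Hypothetical sketch of units 401-403: divide the screen, detect gestures
# only in the second (touch) region, and control the object displayed in
# the first (display) region.

class TouchControlDevice:
    def __init__(self, width, split_at):
        # Dividing unit 401: first region displays, second region accepts touch.
        self.first_region = (0, split_at)
        self.second_region = (split_at, width)
        self.operations = []                 # operations applied to the display

    def detect(self, touch_x, gesture):
        # First detecting unit 402: only touches in the second region count.
        lo, hi = self.second_region
        return gesture if lo <= touch_x < hi else None

    def control(self, gesture):
        # Control unit 403: act on the object displayed in the first region.
        if gesture is not None:
            self.operations.append(gesture)

dev = TouchControlDevice(width=100, split_at=70)
dev.control(dev.detect(85, "zoom_in"))   # touch lands in the second region
dev.control(dev.detect(30, "zoom_in"))   # touch in the first region is ignored
assert dev.operations == ["zoom_in"]
```

The sketch reflects the claimed division of labor: gestures are accepted only in the touch region, so the user's hand never occludes the display region.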
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element defined by the phrase "comprising a ..." does not, without further limitation, preclude the presence of additional identical elements in the process, method, article, or apparatus that comprises that element.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The foregoing is directed to embodiments of the present invention, and it is understood that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the invention.

Claims (6)

CN201210080723.8A2012-03-232012-03-23A kind of method of toch control and deviceActiveCN103324329B (en)

Priority Applications (1)

Application NumberPriority DateFiling DateTitle
CN201210080723.8ACN103324329B (en)2012-03-232012-03-23A kind of method of toch control and device


Publications (2)

Publication NumberPublication Date
CN103324329A CN103324329A (en)2013-09-25
CN103324329Btrue CN103324329B (en)2016-07-06

Family

ID=49193126

Family Applications (1)

Application NumberTitlePriority DateFiling Date
CN201210080723.8AActiveCN103324329B (en)2012-03-232012-03-23A kind of method of toch control and device

Country Status (1)

CountryLink
CN (1)CN103324329B (en)

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN104866081A (en)*2014-02-252015-08-26中兴通讯股份有限公司Terminal operation method and device as well as terminal
CN104915132A (en)*2014-03-122015-09-16联想(北京)有限公司Information processing method and equipment
CN104407753A (en)*2014-10-312015-03-11东莞宇龙通信科技有限公司 A touch screen interface operation method and terminal equipment
CN105677193A (en)*2014-11-182016-06-15夏普株式会社Object operation method and electronic equipment
EP3286915B1 (en)2015-04-232021-12-08Apple Inc.Digital viewfinder user interface for multiple cameras
CN104793749B (en)*2015-04-302018-11-30小米科技有限责任公司Intelligent glasses and its control method, device
CN104932817B (en)*2015-05-272018-10-02努比亚技术有限公司The method and apparatus of terminal side frame induction interaction
KR102079054B1 (en)*2015-05-292020-02-19후아웨이 테크놀러지 컴퍼니 리미티드 Method and method for adjusting the shooting focal length of the mobile terminal using the touch pad
CN106325721A (en)*2015-06-182017-01-11联想移动通信软件(武汉)有限公司Operation method and device for sensing areas on terminal touch screen and terminal
KR20170014356A (en)*2015-07-292017-02-08엘지전자 주식회사Mobile terminal and method of controlling the same
CN105515952B (en)*2015-12-172019-03-08小米科技有限责任公司Method of sending message in multimedia and device
US10009536B2 (en)2016-06-122018-06-26Apple Inc.Applying a simulated optical effect based on data received from multiple camera sensors
US10967879B2 (en)*2016-10-032021-04-06Mitsubishi Electric CorporationAutonomous driving control parameter changing device and autonomous driving control parameter changing method
CN106502561A (en)*2016-10-122017-03-15乐视控股(北京)有限公司Terminal operation method and device
DK180859B1 (en)2017-06-042022-05-23Apple Inc USER INTERFACE CAMERA EFFECTS
CN107728923B (en)*2017-10-202020-11-03维沃移动通信有限公司 An operation processing method and mobile terminal
US11112964B2 (en)2018-02-092021-09-07Apple Inc.Media capture lock affordance for graphical user interface
CN108494958B (en)*2018-03-152021-01-08维沃移动通信有限公司Image processing method and flexible screen terminal
US11722764B2 (en)2018-05-072023-08-08Apple Inc.Creative camera
US10375313B1 (en)2018-05-072019-08-06Apple Inc.Creative camera
DK201870623A1 (en)2018-09-112020-04-15Apple Inc.User interfaces for simulated depth effects
US11128792B2 (en)2018-09-282021-09-21Apple Inc.Capturing and displaying images with multiple focal planes
US11321857B2 (en)2018-09-282022-05-03Apple Inc.Displaying and editing images with depth information
US10645294B1 (en)2019-05-062020-05-05Apple Inc.User interfaces for capturing and managing visual media
US11770601B2 (en)2019-05-062023-09-26Apple Inc.User interfaces for capturing and managing visual media
US11706521B2 (en)2019-05-062023-07-18Apple Inc.User interfaces for capturing and managing visual media
US11054973B1 (en)2020-06-012021-07-06Apple Inc.User interfaces for managing media
US11212449B1 (en)2020-09-252021-12-28Apple Inc.User interfaces for media capture and management
CN112445148A (en)*2020-12-082021-03-05厦门立林科技有限公司User-defined key control method with display screen and device thereof
CN113259615B (en)*2021-04-012024-05-28海南视联通信技术有限公司Focal length adjusting method and device, terminal equipment and storage medium
US11778339B2 (en)2021-04-302023-10-03Apple Inc.User interfaces for altering visual media
US11539876B2 (en)2021-04-302022-12-27Apple Inc.User interfaces for altering visual media
US12112024B2 (en)2021-06-012024-10-08Apple Inc.User interfaces for managing media styles
CN113672329A (en)*2021-08-192021-11-19联想(北京)有限公司 Control method and control device
US20240373121A1 (en)2023-05-052024-11-07Apple Inc.User interfaces for controlling media capture settings

Citations (3)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN101432677A (en)*2005-03-042009-05-13苹果公司Electronic device with display and surrounding touch sensitive bezel for user interface and control
CN101911006A (en)*2007-10-242010-12-08造型逻辑有限公司Electronic document reader
CN102279699A (en)*2010-06-082011-12-14索尼公司Information processing apparatus, information processing method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
JP2010262557A (en)*2009-05-112010-11-18Sony CorpInformation processing apparatus and method



Similar Documents

PublicationPublication DateTitle
CN103324329B (en)A kind of method of toch control and device
JP7385052B2 (en) Photography methods, equipment, electronic equipment and storage media
US10191636B2 (en)Gesture mapping for image filter input parameters
US8812996B1 (en)Methods and apparatus for processing application windows
US8675113B2 (en)User interface for a digital camera
US9438789B2 (en)Display control apparatus and display control method
US9170728B2 (en)Electronic device and page zooming method thereof
US20130152024A1 (en)Electronic device and page zooming method thereof
US20140165013A1 (en)Electronic device and page zooming method thereof
EP2722748A2 (en)Method and apparatus for displaying data in terminal
US20130239050A1 (en)Display control device, display control method, and computer-readable recording medium
US20130208163A1 (en)Camera shutter key display apparatus and method
US9509733B2 (en)Program, communication apparatus and control method
EP2530577A2 (en)Display apparatus and method
US20140043255A1 (en)Electronic device and image zooming method thereof
WO2020000971A1 (en)Method and apparatus for switching global special effects, terminal device, and storage medium
CN113923350A (en)Video shooting method and device, electronic equipment and readable storage medium
CN112333395B (en)Focusing control method and device and electronic equipment
US9277117B2 (en)Electronic apparatus including a touch panel and control method thereof for reducing erroneous operation
GB2547541A (en)Display control apparatus and control method thereof
JP6198459B2 (en) Display control device, display control device control method, program, and storage medium
CA2807866C (en)User interface for a digital camera
KR101083158B1 (en)Contents conversion method for mobile terminal with touch screen
JP2021060790A (en)Electronic apparatus and control method thereof
CN104866163A (en)Image display method and device and electronic equipment

Legal Events

DateCodeTitleDescription
C06Publication
PB01Publication
C10Entry into substantive examination
SE01Entry into force of request for substantive examination
C14Grant of patent or utility model
GR01Patent grant
