Disclosure of Invention
In order to solve the above technical problem, embodiments of the present invention provide a touch control method that implements touch control quickly and conveniently, does not block the user's line of sight, and improves the user experience. The technical solution is as follows:
in one aspect, an embodiment of the present invention provides a touch control method applied to an electronic device having a touch screen, the method comprising:
dividing the touch screen into a first area and a second area; the first area at least has a display function, and the second area at least has a touch function;
detecting gesture input of a user in the second area;
and executing control operation on the object displayed in the first area according to the gesture input.
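As an illustrative sketch only (not part of the disclosure), the three steps above could be combined as follows; the split position, gesture names, and command names are all hypothetical:

```python
def handle_touch(screen_width, touch_x, gesture):
    """Sketch of the claimed flow: divide, detect in area two, control area one."""
    # Step 1: divide the touch screen -- here a simple vertical split,
    # left half = first (display) area, right half = second (touch) area.
    boundary = screen_width // 2

    # Step 2: only gesture input detected in the second area is acted upon.
    if touch_x < boundary:
        return None  # touch fell in the display area; no control operation

    # Step 3: execute a control operation on the object shown in the first
    # area according to the gesture (this mapping is illustrative only).
    commands = {"swipe_up": "zoom_in", "swipe_down": "zoom_out"}
    return commands.get(gesture)
```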
Preferably, the performing of the control operation on the object displayed in the first area according to the gesture input includes:
acquiring a control instruction corresponding to the gesture input according to a preset corresponding relation between the gesture input and the control instruction;
and executing control operation on the object displayed in the first area according to the control instruction.
Preferably, the method further comprises:
detecting the relative positions of the first area and the second area to obtain a detection result;
the performing of the control operation on the object displayed in the first area according to the gesture input is:
and executing control operation on the object displayed in the first area according to the detection result and the gesture input.
Preferably, the performing of the control operation on the object displayed in the first area according to the detection result and the gesture input is to:
when the detection result indicates that the second area is located on the right side of or below the first area, acquiring a first control command according to the acquired gesture input corresponding to a first direction, and executing a control operation on the object displayed in the first area according to the first control command;
when the detection result indicates that the second area is located on the left side of or above the first area, acquiring a second control command according to the acquired gesture input corresponding to a second direction, and executing a control operation on the object displayed in the first area according to the second control command;
wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
Preferably, the performing of the control operation on the object displayed in the first area according to the detection result and the gesture input is to:
when the detection result indicates that the second area is located on the right side of or below the first area, acquiring a positive control command corresponding to the gesture input according to a preset correspondence between gesture inputs and control commands, and executing a control operation on the object displayed in the first area according to the positive control command;
when the detection result indicates that the second area is located on the left side of or above the first area, acquiring a control command opposite to the control command corresponding to the gesture input according to the preset correspondence between gesture inputs and control commands, and executing a control operation on the object displayed in the first area according to the opposite control command.
Preferably, the performing of the control operation on the object displayed in the first area according to the gesture input is:
acquiring a sliding distance corresponding to the gesture input;
and acquiring a change coefficient according to the sliding distance, and executing control operation on the object displayed in the first area according to the acquired change coefficient.
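A minimal sketch of the distance-to-coefficient idea, assuming a simple linear relation (the constants and function names are illustrative, not from the disclosure):

```python
def change_coefficient(slide_distance, base=1.0, gain=0.01):
    # The longer the slide, the larger the coefficient; linearity is assumed.
    return base + gain * slide_distance

def apply_control(current_value, slide_distance):
    # e.g. scale a focal length or zoom level by the acquired coefficient
    return current_value * change_coefficient(slide_distance)
```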
Preferably, the performing of the control operation on the object displayed in the first area includes any one of:
performing a zoom operation on an object displayed in the first area;
performing an enlarging or reducing operation on the object displayed in the first area;
performing page turning operation on the object displayed in the first area;
and executing a rotation operation on the object displayed in the first area.
In another aspect, an embodiment of the present invention further provides a touch control apparatus applied to an electronic device having a touch screen, the apparatus comprising:
a dividing unit, configured to divide the touch screen into a first area and a second area, the first area having at least a display function and the second area having at least a touch function;
a first detection unit, configured to detect a gesture input of a user in the second area; and
a control unit, configured to execute a control operation on the object displayed in the first area according to the gesture input.
Preferably, the control unit is specifically:
a first control unit, configured to acquire a control instruction corresponding to the gesture input according to a preset correspondence between gesture inputs and control instructions, and execute a control operation on the object displayed in the first area according to the control instruction.
Preferably, the apparatus further comprises:
a second detection unit, configured to detect the relative positions of the first area and the second area and acquire a detection result;
the control unit is specifically:
a second control unit, configured to execute a control operation on the object displayed in the first area according to the detection result and the gesture input.
Preferably, the second control unit includes:
a first control subunit, configured to, when the detection result indicates that the second area is located on the right side of or below the first area, acquire a first control command according to the acquired gesture input corresponding to a first direction, and execute a control operation on the object displayed in the first area according to the first control command;
a second control subunit, configured to, when the detection result indicates that the second area is located on the left side of or above the first area, acquire a second control command according to the acquired gesture input corresponding to a second direction, and execute a control operation on the object displayed in the first area according to the second control command;
wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
Preferably, the second control unit includes:
a third control subunit, configured to, when the detection result indicates that the second area is located on the right side of or below the first area, acquire a positive control command corresponding to the gesture input according to a preset correspondence between gesture inputs and control commands, and execute a control operation on the object displayed in the first area according to the positive control command;
a fourth control subunit, configured to, when the detection result indicates that the second area is located on the left side of or above the first area, acquire a control command opposite to the control command corresponding to the gesture input according to the preset correspondence between gesture inputs and control commands, and execute a control operation on the object displayed in the first area according to the opposite control command.
Preferably, the control unit is specifically:
a third control unit, configured to acquire a sliding distance corresponding to the gesture input, acquire a change coefficient according to the sliding distance, and execute a control operation on the object displayed in the first area according to the acquired change coefficient.
Preferably, the control unit comprises any one of the following:
a fourth control unit configured to perform a zoom operation on the object displayed in the first area;
a fifth control unit configured to perform an enlargement or reduction operation on the object displayed in the first area;
a sixth control unit, configured to perform a page turning operation on an object displayed in the first region;
a seventh control unit configured to perform a rotation operation on the object displayed in the first area.
The embodiments of the present invention have the following beneficial effects. The touch control method divides the touch display screen into a first area and a second area, detects a gesture input of a user in the second area, and executes a control operation on the object displayed in the first area according to the gesture input. Because the first area is used for displaying images and the user performs gesture operations in the second area, the user's line of sight is not blocked, and the user is not prevented from viewing the images in the first area. In addition, corresponding touch control is implemented by detecting the user's gesture input, which is convenient and fast and improves the user experience.
Detailed Description
An embodiment of the present invention provides a touch control method that implements touch control quickly and conveniently, does not block the user's line of sight, and improves the user experience.
In order to make those skilled in the art better understand the technical solutions of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. It is obvious that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart of a touch control method according to a first embodiment of the present invention is shown.
The method provided by the embodiment of the present invention is applied to an electronic device having a touch screen; the electronic device may be a mobile phone, a camera, a tablet (PAD), or the like.
S101, dividing the touch screen into a first area and a second area; the first area at least has a display function, and the second area at least has a touch function.
In the embodiment of the present invention, the electronic device has a touch screen, and the touch screen is divided into a first area and a second area. The first area has at least a display function; that is, the first area is an image display area for displaying and previewing information such as graphics, images, and text. For example, when taking a picture, the first area displays and previews the image in the viewfinder frame; as another example, the first area may display a photograph, an electronic book, a video file, and the like. The second area has at least a touch function and can accept a touch gesture input of the user; generally, the second area is a non-image display area. Because the touch screen is divided into an image display area and a non-image display area, the user can view and preview images in the image display area while performing touch operations in the non-image display area. The user's line of sight is therefore not blocked during operation, which is convenient and fast.
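For illustration only, such a partition might be represented as two rectangles with a hit test deciding which area a touch falls in; the vertical split and the 3:1 ratio below are assumptions, not from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def divide_screen(width, height, touch_ratio=0.25):
    """Left part: first (image display) area; right strip: second (touch) area."""
    split = int(width * (1 - touch_ratio))
    first_area = Rect(0, 0, split, height)
    second_area = Rect(split, 0, width - split, height)
    return first_area, second_area
```

A touch event would first be tested with `second_area.contains(...)`; only touches landing there are interpreted as control gestures, leaving the display area unobstructed.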
It will be appreciated by those skilled in the art that the touch screen may also be arranged such that the first region is a main image display region and the second region is a sub-image display region. The sub-image display region may have a display function in addition to the touch function, but the displayed object differs from the object displayed in the first region. Specifically, the user can preview and view a through image, a photograph, an electronic book, a video, and the like in the main image display area, while the sub-image display area may display operation icons, prompt information, and the like for controlling objects in the main image display area.
And S102, detecting the gesture input of the user in the second area.
As mentioned above, the second area has at least a touch function and can accept the user's gesture input. The user controls the object displayed in the first area through gesture input in the second area.
S103, executing control operation on the object displayed in the first area according to the gesture input.
And after the gesture input of the user is acquired, executing control operation on the object displayed in the first area according to the gesture input. Specifically, step S103 may be implemented by the following steps:
S103A, acquiring a control instruction corresponding to the gesture input according to the corresponding relation between the preset gesture input and the control instruction.
The correspondence between gesture inputs and control instructions may be preset, with different gesture inputs corresponding to different control instructions. For example, when the user's gesture input is detected to be a top-to-bottom slide, the corresponding operation is a zoom-out operation on the object displayed in the first area; when the gesture input is detected to be a bottom-to-top slide, the corresponding operation is a zoom-in operation on the object displayed in the first area. The correspondence between gesture inputs and control instructions may be preset by the system or customized by the user.
Further, the correspondence between gesture inputs and control instructions may differ across application scenarios. That is, the system may determine the application scenario in which the electronic device is located and then obtain the correspondence between gesture inputs and control instructions for that scenario, so as to execute the control operation on the object displayed in the first area. For example, when the electronic device is detected to be in a photographing mode and the user's gesture input is a top-to-bottom slide, the obtained control instruction may be a focal length reduction operation; when the electronic device is in an electronic book reading mode and the gesture input is the same top-to-bottom slide, the obtained control instruction may be a page turning operation. In other words, when the electronic device is in different application scenarios, the same gesture input may yield different control instructions.
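The scenario-dependent lookup described above can be sketched as a table keyed by (scenario, gesture); the scene and gesture names are hypothetical, following the photographing and e-book examples in the text:

```python
# (application scenario, gesture) -> control instruction
SCENE_GESTURE_MAP = {
    ("photographing", "slide_top_to_bottom"): "reduce_focal_length",
    ("photographing", "slide_bottom_to_top"): "enlarge_focal_length",
    ("ebook", "slide_top_to_bottom"): "turn_page",
}

def instruction_for(scene, gesture):
    # The same gesture yields different instructions in different scenes.
    return SCENE_GESTURE_MAP.get((scene, gesture))
```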
And S103B, executing control operation on the object displayed in the first area according to the control instruction.
Specifically, step S103B may include any one of the following cases:
performing a zoom operation on an object displayed in the first area;
performing an enlarging or reducing operation on the object displayed in the first area;
performing page turning operation on the object displayed in the first area;
and executing a rotation operation on the object displayed in the first area.
The above are only preferred embodiments of the present invention; the manner of executing the control operation is not limited thereto, and other implementations obtained by those skilled in the art without creative effort are within the scope of the present invention.
In the method provided by the first embodiment of the present invention, the touch screen of the electronic device is divided into a first area and a second area, where the first area is used for displaying an image and the second area is used for receiving a touch gesture input, so the user's line of sight is not blocked. Moreover, the control operation is implemented through touch input, which is convenient and fast.
Referring to fig. 2, a flowchart of a touch control method according to a second embodiment of the present invention is shown.
S201, dividing the touch screen into a first area and a second area.
In the embodiment of the invention, the electronic equipment is provided with the touch screen, and the touch screen of the electronic equipment is divided into a first area and a second area. The first area at least has a display function, that is, the first area is an image display area for displaying and previewing information such as graphics, images and characters. The second area at least has a touch function and can accept touch gesture input of a user.
S202, detecting gesture input of the user in the second area.
As mentioned above, the second area has at least a touch function and can accept the user's gesture input. The user controls the object displayed in the first area through gesture input in the second area.
Specifically, detecting the user's gesture input in the second area includes: acquiring a plurality of touch points corresponding to the gesture input, and acquiring, from the touch points, one or more of the direction, the sliding distance, the position coordinates of the start point (the first touch point of the gesture input), and the position coordinates of the end point (the last touch point of the gesture input).
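As a sketch of that feature extraction (the coordinate convention, with y growing downward as on most touch screens, is an assumption):

```python
import math

def gesture_features(touch_points):
    """touch_points: list of (x, y) tuples sampled along the gesture."""
    start, end = touch_points[0], touch_points[-1]
    dx, dy = end[0] - start[0], end[1] - start[1]
    distance = math.hypot(dx, dy)  # sliding distance between start and end
    # Coarse four-way direction taken from the dominant axis.
    if abs(dx) >= abs(dy):
        direction = "right" if dx >= 0 else "left"
    else:
        direction = "down" if dy >= 0 else "up"
    return {"start": start, "end": end,
            "distance": distance, "direction": direction}
```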
S203, detecting the relative position of the first area and the second area, and acquiring a detection result.
The relative positions of the first region and the second region specifically indicate whether the second region is located to the right of, to the left of, above, or below the first region. In the second embodiment of the present invention, the usage habit of the user and the screen orientation can be determined by detecting this relative position relationship, so that the corresponding control instruction can be adaptively adjusted. For example, when using the electronic device in the photographing mode, a right-handed user typically holds the device horizontally with the second area on the right side of the first area, which makes it convenient to perform touch operations in the second area with the right hand. A left-handed user typically holds the device horizontally with the second area on the left side of the first area, which makes it convenient to perform touch operations with the left hand. So that the same touch gesture from the user's perspective, for example a left-to-right slide, achieves the same operation effect under different usage habits and screen orientations, the relative position relationship between the first area and the second area is detected, and the control operation is then executed according to the detection result.
It will be understood by those skilled in the art that the order of step S202 and step S203 may be reversed or performed simultaneously, and the present invention is not limited thereto.
And S204, executing control operation on the object displayed in the first area according to the detection result and the gesture input.
So that the user achieves the same operation effect when performing the same gesture input (the same from the user's perspective, for example a left-to-right slide or a top-to-bottom slide) in different screen orientations (that is, with different relative positions of the first area and the second area), the second embodiment of the present invention adaptively adjusts the control operation by detecting the relative position relationship between the first area and the second area.
Specifically, when different relative position relationships between the first area and the second area are detected, different correspondence rules between gesture inputs and control instructions may be set to adjust the control instruction. A specific example is described below; those skilled in the art will understand that different correspondence rules between gesture inputs and control commands can be set according to actual needs. For example, when the second area is detected to be on the right side of the first area (which may be called the "right-hand mode"), a left-to-right slide by the user (from the user's perspective), that is, a gesture sliding from the first area toward the second area, may be set to correspond to a focal length increase operation. When the second area is detected to be on the left side of the first area (the "left-hand mode"), the same left-to-right slide by the user, that is, a gesture sliding from the second area toward the first area, also corresponds to a focal length increase operation. Of course, the embodiment of the present invention is not limited to this example: when the electronic device is placed horizontally, a top-to-bottom slide may be set to correspond to a focal length reduction operation and a bottom-to-top slide to a focal length increase operation, or the opposite arrangement may be used. Other implementations obtained by those skilled in the art without creative effort are within the scope of the present invention.
Optionally, another possible adaptive adjustment is as follows: when the relative position relationship between the first area and the second area is detected to meet a set condition, a control instruction opposite to the one given by the preset correspondence between gesture inputs and control instructions is acquired. That is, when the screen orientations differ, opposite control instructions may be executed.
The two implementations described above are explained in detail below.
Specifically, step S204 can be specifically implemented by the following steps:
when the detection result indicates that the second area is located on the right side of or below the first area, acquiring a first control command according to the acquired gesture input corresponding to a first direction, and executing a control operation on the object displayed in the first area according to the first control command; when the detection result indicates that the second area is located on the left side of or above the first area, acquiring a second control command according to the acquired gesture input corresponding to a second direction, and executing a control operation on the object displayed in the first area according to the second control command; wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
The following example is illustrative and should not be construed as limiting the invention. The following correspondence may be preset: (1) when the second area is located on the right side of or below the first area, a gesture input sliding from the first area toward the second area corresponds to a focal length enlargement operation, and a gesture input sliding from the second area toward the first area corresponds to a focal length reduction operation; (2) when the second area is located on the left side of or above the first area, a gesture input sliding from the first area toward the second area corresponds to a focal length reduction operation, and a gesture input sliding from the second area toward the first area corresponds to a focal length enlargement operation.
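The preset correspondences (1) and (2) above can be sketched as a single function; the side and direction labels are hypothetical encodings of the relations just described:

```python
def resolve_command(second_area_side, slide_toward):
    """
    second_area_side: "right", "below", "left", or "above" (detection result).
    slide_toward: "second_area" or "first_area" -- the slide direction
                  expressed relative to the two areas.
    """
    if second_area_side in ("right", "below"):
        # Correspondence (1)
        return ("enlarge_focal_length" if slide_toward == "second_area"
                else "reduce_focal_length")
    # Correspondence (2): second area on the left of or above the first area
    return ("enlarge_focal_length" if slide_toward == "first_area"
            else "reduce_focal_length")
```

A right-handed user's left-to-right slide (second area on the right, sliding toward it) and a left-handed user's left-to-right slide (second area on the left, sliding away from it) resolve to the same command.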
Taking photographing as an example: when a right-handed user uses the electronic device in the photographing mode, the common practice is to hold the device horizontally with the second region on the right side of the first region. The user then performs a left-to-right slide (from the user's perspective), referred to as a first gesture input. The system detects that the first direction corresponding to the first gesture input is from the first area toward the second area; according to the preset correspondence between gesture inputs and control instructions, the acquired control instruction is a focal length enlargement operation, and a zoom operation, specifically a focal length enlargement, is executed on the image displayed in the first area. A left-handed user is typically used to holding the device horizontally with the second area on the left side of the first area. When this user also performs a left-to-right slide (from the user's perspective), referred to as a second gesture input, the system detects that the second direction corresponding to the second gesture input is from the second area toward the first area; according to the preset correspondence, the acquired control instruction is again a focal length enlargement operation, which is executed on the image displayed in the first area.
As can be seen from the above example, although the first direction and the second direction are opposite when the user performs the gesture input under different operation habits (corresponding to different relative positions of the first area and the second area), the first control command and the second control command are the same, so the same control effect is achieved. Therefore, the method provided by the embodiment of the present invention adaptively adjusts the corresponding control instruction by detecting the relative position relationship between the first area and the second area, so that gesture inputs that are the same from the user's perspective achieve the same control effect.
Another possible implementation manner of step S204 is:
when the detection result indicates that the second area is located on the right side of or below the first area, acquiring a positive control command corresponding to the gesture input according to a preset correspondence between gesture inputs and control commands, and executing a control operation on the object displayed in the first area according to the positive control command; when the detection result indicates that the second area is located on the left side of or above the first area, acquiring a control command opposite to the control command corresponding to the gesture input according to the preset correspondence, and executing a control operation on the object displayed in the first area according to the opposite control command.
In this implementation, a single correspondence between gesture inputs and control instructions is preset. When the second area is detected to be on the right side of or below the first area, the control instruction given by the correspondence, that is, the positive control instruction, is executed; when the second area is detected to be on the left side of or above the first area, the opposite of that control instruction, that is, the inverse control instruction, is executed. In this way, users with different usage habits can perform the same gesture input from their own perspective and achieve the same control effect.
For example, the following correspondence may be preset: (1) a gesture input sliding from the first area toward the second area corresponds to a focal length enlargement operation; (2) a gesture input sliding from the second area toward the first area corresponds to a focal length reduction operation.
Taking photographing as an example: when a right-handed user uses the electronic device in the photographing mode, the common practice is to hold the device horizontally with the second region on the right side of the first region. The user then performs a left-to-right slide (from the user's perspective), referred to as a first gesture input. The system detects that the first direction corresponding to the first gesture input is from the first area toward the second area; according to the preset correspondence, the acquired positive control instruction is a focal length enlargement operation, which is executed on the object displayed in the first area. A left-handed user is typically used to holding the device horizontally with the second area on the left side of the first area. When this user also performs a left-to-right slide (from the user's perspective), referred to as a second gesture input, the system detects that the second direction corresponding to the second gesture input is from the second area toward the first area; according to the preset correspondence, the control instruction corresponding to the gesture input is a focal length reduction operation, so the opposite control instruction, namely a focal length enlargement operation, is acquired and executed on the object displayed in the first area.
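The inverse-command variant keeps one preset mapping and inverts its output when the second area is on the left of or above the first area; a sketch with hypothetical names:

```python
# Single preset correspondence, per (1) and (2) above.
PRESET = {
    "toward_second_area": "enlarge_focal_length",
    "toward_first_area": "reduce_focal_length",
}
OPPOSITE = {
    "enlarge_focal_length": "reduce_focal_length",
    "reduce_focal_length": "enlarge_focal_length",
}

def resolve_with_inversion(second_area_side, slide_direction):
    command = PRESET[slide_direction]  # the positive control command
    if second_area_side in ("left", "above"):
        command = OPPOSITE[command]  # execute the opposite command instead
    return command
```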
As can be seen from the above example, the method provided by the embodiment of the present invention adaptively adjusts the corresponding control instruction by detecting the relative position relationship between the first region and the second region, so that gesture inputs that are the same from the user's perspective achieve the same control effect.
In the method provided by the second embodiment of the present invention, users with different usage habits can perform the same gesture input, and the control command is adaptively adjusted so that the same control effect is obtained. Specifically, the relative position relationship between the first area and the second area is detected, and the control operation is executed on the object displayed in the first area either through different correspondences between gesture inputs and control instructions or by executing the opposite control instruction. The second embodiment of the present invention can thus detect whether the electronic device is in the left-hand mode or the right-hand mode according to the user's usage habits and adaptively adjust the control operation, giving the user a better experience.
In the embodiment of the invention, when the electronic device is in different application scenes, the control operation performed on the object displayed in the first area according to the acquired gesture input may differ. For example, when the electronic device is in a photographing mode, a zoom operation may be performed on the object displayed in the first region; when the electronic device is in an electronic book browsing mode, a page-turning operation may be performed on the object displayed in the first area; when the electronic device is in a picture browsing mode, operations such as zooming in, zooming out, and rotating may be performed on the object displayed in the first area; and when the electronic device is in a video watching mode, fast-forward and rewind operations may be performed on the object displayed in the first area.
The method provided by the present invention will be described in detail below, taking as an example a zoom operation performed on the object displayed in the first area when the electronic device is in the photographing mode.
Referring to fig. 3, a flowchart of a touch control method according to a third embodiment of the present invention is shown.
S301, the touch screen is divided into a first area and a second area.
In the embodiment of the invention, the electronic equipment is provided with the touch screen, and the touch screen of the electronic equipment is divided into a first area and a second area. The first area at least has a display function, that is, the first area is an image display area for displaying and previewing information such as graphics, images and characters. The second area at least has a touch function and can accept touch gesture input of a user.
S302, detecting the relative position of the first area and the second area, and acquiring a detection result.
As with the second embodiment, when the second region is located to the right or below the first region, we may refer to it as a "right-hand mode"; when the second region is located to the left or above the first region, we may refer to it as a "left-hand mode".
S303, detecting the gesture input of the user in the second area, and acquiring a plurality of touch points corresponding to the gesture input.
Position information of each touch point is acquired, and the position information can be coordinates of the touch point.
S304, acquiring a first direction corresponding to the gesture input according to the position information of the touch points.
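One minimal way to derive the direction in S304 from the recorded touch-point coordinates is to compare the first and last points, as sketched below. The function name and the assumption that the second area is a horizontal strip (so only the x component matters) are illustrative choices, not part of the embodiment.

```python
def first_direction(points):
    """Derive the slide direction from a sequence of (x, y) touch points.

    Returns "positive_x" or "negative_x" according to the net horizontal
    displacement between the first and the last recorded touch point.
    Assumes the second area is a horizontal strip, so the x component
    dominates the gesture.
    """
    (x0, _y0), (xn, _yn) = points[0], points[-1]
    return "positive_x" if xn >= x0 else "negative_x"

# A slide whose touch points move rightward yields the positive direction:
print(first_direction([(10, 5), (14, 5), (25, 6)]))  # positive_x
```

Whether "positive_x" means "from the first area toward the second area" then depends on the detection result of S302, which is exactly why S305 combines the direction with the relative-position result.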
S305, acquiring a first control instruction according to the first direction and the detection result.
Specifically, the system may preset a correspondence between gesture inputs and control commands. For example: (1) when the second area is located on the right side or below the first area ("right-hand mode"), a gesture input sliding from the first area toward the second area corresponds to the focal-length zoom-in operation, and a gesture input sliding from the second area toward the first area corresponds to the focal-length zoom-out operation; (2) when the second area is located on the left side or above the first area, a gesture input sliding from the first area toward the second area corresponds to the focal-length zoom-out operation, and a gesture input sliding from the second area toward the first area corresponds to the focal-length zoom-in operation.
The control instruction is then acquired according to the detected relative position relationship between the first area and the second area together with the direction corresponding to the gesture input.
S306, obtaining the sliding distance of the gesture input according to the positions of the touch points, and obtaining a change coefficient according to the sliding distance.
The sliding distance of the gesture input is acquired from the position information of the first touch point and of the last touch point. In this embodiment of the invention, the change coefficient is the increment of the sliding distance: sliding from position L1 to LX yields a distance increment ΔL = LX - L1.
And S307, according to the change coefficient, carrying out zooming operation on the object in the first area according to a first control instruction.
Assuming that the length of the first area is L, the corresponding maximum focal-length range is Fa, and the current focal length of the electronic device is F1, then sliding from an arbitrary position L1 to LX yields the changed focal length FX:
FX = F1 + (LX - L1) * Fa / L    (1)
the transformed focal length may be obtained according to equation 1.
Alternatively, the zoom operation may be performed by calculating ΔF, where

ΔF = ΔL * Fa / L    (2)

In this way, the electronic device can be controlled to perform the zoom operation by increasing or decreasing the focal length in steps of ΔF according to the first control instruction.
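Equation (1) can be exercised directly as below. The clamp of the result to the valid focal-length range is an added assumption (a real implementation would need some bound, but the embodiment does not state one), as are the parameter names.

```python
def zoom_focal_length(f1, l1, lx, fa, area_len, f_min=0.0):
    """Equation (1): FX = F1 + (LX - L1) * Fa / L.

    f1: current focal length F1; l1, lx: start and end slide positions;
    fa: maximum focal-length range Fa; area_len: length L of the area.
    Clamping the result to [f_min, fa] is an assumption for the sketch,
    not stated in the text.
    """
    fx = f1 + (lx - l1) * fa / area_len
    return max(f_min, min(fx, fa))

# A slide across half the area adds half of Fa to the focal length:
print(zoom_focal_length(f1=2.0, l1=0.0, lx=50.0, fa=8.0, area_len=100.0))  # 6.0
```

A slide in the opposite direction makes (LX - L1) negative, so the same formula reduces the focal length, which matches the two directions of the preset correspondence.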
In the method provided by the third embodiment of the present invention, stepless zooming can be realized quickly and conveniently from the user's gesture input; since the user operates in the second area, the image displayed in the first area is not blocked, and a better experience is obtained.
In the fourth embodiment of the present invention, when the user is browsing an electronic book, the change coefficient may be obtained according to the direction and distance corresponding to the user's gesture input, and a page-turning operation is executed. For example, sliding from top to bottom performs a backward page turn, and sliding from bottom to top performs a forward page turn.
In the fifth embodiment of the present invention, when the user is in a picture browsing state, zoom-in and zoom-out operations may further be performed according to the direction and distance corresponding to the user's gesture input and the change coefficient obtained therefrom. For example, sliding from top to bottom is a zoom-out operation, and sliding from bottom to top is a zoom-in operation. Specifically, the distance corresponding to the gesture input may be used to obtain the change coefficient, which determines the scaling ratio of the zoom.
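The picture-scaling variant follows the same pattern as the zoom of the third embodiment: the direction selects the operation and the distance yields the change coefficient. The linear scale factor below is an illustrative assumption; the embodiment only says the scaling ratio follows the change coefficient.

```python
def picture_scale(direction: str, distance: float, area_len: float) -> float:
    """Map a vertical slide to a scaling factor for the displayed picture.

    Sliding bottom-to-top ("up") zooms in; sliding top-to-bottom ("down")
    zooms out.  The factor 1 +/- distance / area_len is an assumption
    made for this sketch.
    """
    coefficient = distance / area_len  # change coefficient from the slide
    return 1.0 + coefficient if direction == "up" else 1.0 - coefficient

print(picture_scale("up", 30.0, 100.0))    # factor > 1: zoom in
print(picture_scale("down", 30.0, 100.0))  # factor < 1: zoom out
```

A longer slide produces a larger coefficient and therefore a stronger zoom, which is the stepless behaviour the embodiments describe.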
In other embodiments of the present invention, a rotation operation, a video fast-forward operation, a rewind operation, and the like may be performed on the display object in the first region through a gesture input in the second region.
It is understood by those skilled in the art that the foregoing are only preferred embodiments of the present invention, and other embodiments obtained by those skilled in the art without creative effort also fall within the scope of the present invention.
Fig. 4 is a schematic view of a touch control device according to an embodiment of the present invention.
A touch control device is applied to an electronic device, the electronic device is provided with a touch screen, and the device comprises:
a dividing unit 401, configured to divide the touch screen into a first area and a second area; the first area at least has a display function, and the second area at least has a touch function.
A first detecting unit 402, configured to detect a gesture input of the user in the second area.
A control unit 403, configured to perform a control operation on the object displayed in the first area according to the gesture input.
Preferably, the control unit is a first control unit, and is configured to obtain a control instruction corresponding to the gesture input according to a preset correspondence between the gesture input and the control instruction; and executing control operation on the object displayed in the first area according to the control instruction.
Preferably, the apparatus further comprises:
the second detection unit is used for detecting the relative positions of the first area and the second area and acquiring a detection result;
at this time, the control unit is a second control unit, and is configured to perform a control operation on the object displayed in the first area according to the detection result and the gesture input.
Preferably, the second control unit includes:
and the first control subunit is used for acquiring a first control command according to the acquired gesture input corresponding to the first direction when the detection result shows that the second area is positioned on the right side or below the first area, and executing control operation on the object displayed in the first area according to the first control command.
And the second control subunit is used for acquiring a second control command according to the acquired gesture input corresponding to the second direction when the detection result shows that the second area is positioned on the left side or the upper side of the first area, and executing control operation on the object displayed in the first area according to the second control command.
Wherein the first direction and the second direction are opposite directions, and the first control command is the same as the second control command.
Preferably, the second control unit includes:
the third control subunit is used for acquiring a positive control command corresponding to the gesture input according to a preset corresponding relation between the input gesture and the control command when the detection result shows that the second area is positioned on the right side or below the first area; and executing control operation on the object displayed in the first area according to the positive control command.
The fourth control subunit is configured to, when the detection result indicates that the second area is located on the left side or above the first area, obtain, according to a preset correspondence between an input gesture and a control command, a control command opposite to the control command corresponding to the gesture input; and executing control operation on the object displayed in the first area according to the control command opposite to the control command corresponding to the gesture input.
Preferably, the control unit is:
the third control unit is used for acquiring a sliding distance corresponding to the gesture input; and acquiring a change coefficient according to the sliding distance, and executing control operation on the object displayed in the first area according to the acquired change coefficient.
Preferably, the control unit comprises any one of the following:
a fourth control unit configured to perform a zoom operation on the object displayed in the first area.
A fifth control unit configured to perform an enlargement or reduction operation on the object displayed in the first area.
And the sixth control unit is used for executing page turning operation on the object displayed in the first area.
A seventh control unit configured to perform a rotation operation on the object displayed in the first area.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. The terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The foregoing is directed to embodiments of the present invention, and it is understood that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the invention.