WO2009155981A1 - Gesture on touch sensitive arrangement

Info

Publication number: WO2009155981A1
Authority: WO (WIPO PCT)
Prior art keywords: gesture, touch, parameter, predicted, contiguous
Application number: PCT/EP2008/058164
Other languages: French (fr)
Inventor: Erik Sparre
Original Assignee: Uiq Technology Ab
Priority date: 2008-06-26
Filing date: 2008-06-26
Publication date: 2009-12-30
Application filed by Uiq Technology Ab
Priority to PCT/EP2008/058164
Publication of WO2009155981A1

Abstract

The present invention is directed to a method for recognizing at least one drag gesture detected by a touch sensitive arrangement 20, 22, 24 of a portable device 10. The method comprises the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.

Description

GESTURE ON TOUCH SENSITIVE ARRANGEMENT
TECHNICAL FIELD
The present invention relates generally to a method and an arrangement for detection of a gesture on a touch sensitive arrangement, e.g. a touchscreen. Embodiments of the present invention relate to a method and an arrangement in a portable device.
DESCRIPTION OF RELATED ART
As is well known to those skilled in the art, a button, a track ball, a thumbwheel and/or a computer mouse or similar are commonly provided as an interface between a device and a user of the device, i.e. provided as the user interface or the so-called Man-Machine Interface (MMI). It is also well known that touch sensitive arrangements such as touchscreens or similar are frequently preferred as a user interface in small devices, e.g. in portable communication arrangements such as cell phones or similar and in other portable arrangements such as personal digital assistants (PDA) or similar. This is, inter alia, because touch sensitive arrangements usually do not require the intricate assembly and/or operational space that is needed to implement electromechanical user interfaces such as those mentioned above.
Common electromechanical user interfaces - e.g. a computer mouse or similar - can be used for moving a cursor on a screen or similar, e.g. such as a Liquid Crystal Display (LCD) or similar. In addition, such user interfaces can be operated to indicate an object presented on the screen, e.g. by a click or a double click on the user interface.
Electromechanical user interfaces can also be operated to drag an object presented on the screen, e.g. by a click on the user interface to indicate the object (e.g. a click on a computer mouse) and then dragging the object by moving the user interface (e.g. moving the computer mouse). Clicking, double clicking and dragging an object on a screen as briefly described above are well known facts to those skilled in the art and they need no further description.
It is also well known that a click, double click and/or a drag gesture can be performed on touch sensitive arrangements. A single click or a double click may e.g. be performed by a single tap or a double tap, respectively, on the touch sensitive arrangement, e.g. by means of a finger or a stylus or similar. Similarly, a drag may be performed by a tap of a finger or a stylus or similar on the touch sensitive arrangement to indicate the object, whereupon the indicated object may be dragged by sliding the finger or stylus or similar over the surface of the touch sensitive arrangement. Clicking, double clicking and dragging an object on a touch sensitive arrangement as briefly described above are well known to those skilled in the art and they need no further description.
Nevertheless, some characteristics of a single tap, a double tap and a drag gesture on a touch sensitive arrangement will be schematically elaborated below with reference to Figs. 1a, 1b, 1c and 1d.
Figure 1a illustrates the timing of a typical single tap gesture. As the finger or stylus or similar is detected by the touch sensitive arrangement the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap. As the finger or stylus is subsequently lifted from the touch sensitive arrangement the pressure decreases and remains at or below a certain minimum value for a certain time that is more than a reference amount of time Δrel. This may be detected by a touch sensitive arrangement as a single tap corresponding to a click on a computer mouse or similar.
Figure 1b illustrates the timing of a typical double tap gesture. As the finger or stylus or similar is detected by the touch sensitive arrangement the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap. As the finger or stylus is subsequently lifted from the touch sensitive arrangement the pressure decreases and remains at or below a certain minimum value for a certain time that is less than a reference amount of time Δrel. As the finger or stylus or similar is once again detected by the touch sensitive arrangement the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap. As the finger or stylus is subsequently lifted from the touch sensitive arrangement the pressure decreases and remains at or below a certain minimum value for a certain time that is more than a reference amount of time Δrel. This may be detected by a touch sensitive arrangement as a double tap corresponding to a double click on a computer mouse or similar.

Figure 1c illustrates an exemplifying timing of a typical drag gesture. As the finger or stylus or similar is detected by the touch sensitive arrangement the pressure increases to or above a certain maximum value, where it may remain for a certain time that is less than a first reference amount of time Δd1 as schematically indicated in Fig. 1c. As the drag gesture continues it is natural to lift the stylus or finger from the touch sensitive arrangement to reduce friction etc. The pressure will then decrease to or below a certain minimum value for a certain time Δd2. A typical drag gesture as described briefly above is too often recognized by the touch sensitive arrangement as a single tap, described above with reference to Fig. 1a. This is particularly so if Δtap ≥ Δd1 and Δd2 ≥ Δrel.
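To make the timing discussion above concrete, the following is a minimal, purely illustrative sketch (not part of the disclosure) of how a naive detector that only compares press and release durations against Δtap and Δrel could classify events; the numeric thresholds and the event representation are assumptions. It also shows why a drag with a brief unintended lift, as in Fig. 1c, is easily misread as a single tap.

```python
# Illustrative only: naive timing-based classification using Δtap and Δrel.
# The concrete threshold values below are assumptions, not values from the disclosure.
DELTA_TAP = 0.20   # reference press duration Δtap in seconds
DELTA_REL = 0.30   # reference release duration Δrel in seconds

def classify(press_durations, release_durations):
    """press_durations[i] is the i-th contact time, release_durations[i] the release after it."""
    if len(press_durations) == 1:
        if press_durations[0] < DELTA_TAP and release_durations[0] > DELTA_REL:
            return "single tap"
    elif len(press_durations) == 2:
        if (all(p < DELTA_TAP for p in press_durations)
                and release_durations[0] < DELTA_REL
                and release_durations[1] > DELTA_REL):
            return "double tap"
    return "unknown"

# A drag where the finger briefly lifts (Δd1 < Δtap, Δd2 > Δrel) looks exactly like a tap:
print(classify([0.15], [0.40]))   # -> "single tap", even though the user was dragging
```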
Hence, the problem is to recognize a drag gesture made on the touch sensitive arrangement even if the pressure of the touch varies during the gesture. In other words, a gesture causing two or more detected movements separated by missing detections - as illustrated by the three lines in Fig. 1d separated by two sections of missing detections - should be recognized as a continuous gesture and constructed as a single gesture as illustrated by the continuous line in Fig. 1d.
In view of the above it would be advantageous to have a simple and efficient method and arrangement for detecting a tap and drag gesture such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
SUMMARY
The present invention is directed to solving the problem of providing a simple and efficient method and device for detecting a drag gesture such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
At least one of the problems identified above is solved according to a first aspect of the invention providing a method for recognizing at least one drag gesture detected by a touch sensitive arrangement of a portable device, which method in the portable device comprises the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
A second embodiment of the invention is directed to a method comprising the features of the first aspect wherein the first parameter comprises at least one first gesture position, the predicted parameter comprises at least one predicted gesture position, and the second parameter comprises at least one second gesture position, and; the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said predicted gesture position and said second gesture position is less than a predetermined value.
A third embodiment of the invention is directed to a method comprising the features of the first aspect or the features of the second embodiment wherein the first parameter comprises a first gesture velocity, the predicted parameter comprises a predicted gesture velocity, and the second parameter comprises a second gesture velocity; and the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said predicted gesture velocity and said second gesture velocity is less than a predetermined value.
A fourth embodiment of the invention is directed to a method comprising the features of the third embodiment wherein the first touch gesture and the second touch gesture are converted to a contiguous gesture when the predicted gesture velocity and said second gesture velocity are substantially equal.
A fifth embodiment of the invention is directed to a method comprising the features of any one of the first aspect or the second, third or fourth embodiment wherein the first parameter comprises a first gesture time stamp and the second parameter comprises a second gesture time stamp; and the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said first gesture time stamp and said second gesture time stamp is less than a predetermined value.
A sixth embodiment of the invention is directed to a method comprising the features of any one of the second, third, fourth or fifth embodiment, which method comprises the steps of: converting the first touch gesture and the second touch gesture to a contiguous gesture by using a filtered version of at least one second parameter indicative of the second touch gesture.
A seventh embodiment of the invention is directed to a method comprising the features of the sixth embodiment, which method comprises the steps of: converting the first touch gesture and the second touch gesture to a contiguous gesture by using an Alpha-Beta filter for filtering said at least one second parameter indicative of the second touch gesture.
An eighth embodiment of the invention is directed to a portable device that comprises a touch sensing arrangement and that is configured to perform the method according to any one of the first aspect or the second, third, fourth, fifth, sixth or seventh embodiment.
A ninth embodiment of the invention is directed to a computer program product stored on a computer usable medium, comprising readable program means for causing a portable device to execute, when said program means is loaded in the portable device comprising a touch sensing arrangement configured to recognize at least one drag gesture detected by a touch sensitive arrangement, the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
Further advantages of the present invention and embodiments thereof will appear from the following detailed description of the invention.
It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components, but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described in more detail in relation to the enclosed drawings, in which:
Fig. 1 a is a schematic illustration of the finger/stylus pressure and the timing at a single tap gesture,
Fig. 1 b is a schematic illustration of the finger/stylus pressure and the timing of a double tap gesture,
Fig. 1c is a schematic illustration of the finger/stylus pressure and the timing at a drag gesture,
Fig. 1d is a schematic illustration of a drag gesture detected as a series of separated detections that should be constructed as a single gesture,
Fig. 2 is a schematic illustration of a portable device in the form of a cell phone 10,
Fig. 3 is a schematic illustration of the relevant parts of the cell phone 10 in Fig. 2,
Fig. 4 is a flowchart of an exemplifying operation of an embodiment of the invention,
Fig. 5 is a schematic illustration of a CD ROM 56 on which program code for executing the method according to the invention is provided.
DETAILED DESCRIPTION OF EMBODIMENTS
Features of Embodiments
The present invention relates to portable devices comprising a touch sensitive arrangement. In particular, the invention relates to portable communication devices comprising a touchscreen or similar touch sensitive arrangement. However, the invention is by no means limited to communication devices or touchscreens. Rather, it can be applied to any suitable portable device comprising a suitable touch sensitive arrangement.
Figure 2 shows an exemplifying portable communication device according to a preferred embodiment of the invention. Preferably, the device is a mobile cell phone 10. However, as indicated above, a cell phone is just one example of a portable device in which the invention can be implemented. The invention can for instance be implemented in a PDA (Personal Digital Assistant), a palmtop computer, a laptop computer or a smartphone or any other suitable portable device. The cell phone 10 in Fig. 2 comprises a keypad 12, a loudspeaker 13 and a microphone 14. The keypad 12 is used for entering information such as selection of functions and responding to prompts. The keypad 12 may be of any suitable kind, including but not limited to keypads with suitable push-buttons or similar and/or a combination of different suitable button arrangements. The keypad 12 may even be an integral part of a touch sensitive arrangement comprised by the phone 10 described below. The loudspeaker 13 is used for presenting sounds to a user and the microphone 14 is used for sensing the voice of the user or similar. In addition, the cell phone 10 includes an antenna, which is used for communication with other users via a network. The antenna is built into the cell phone 10 and hence not shown in Fig. 2.
Moreover, the cell phone 10 in Fig. 2 comprises a touch sensitive arrangement comprising an exemplifying touchscreen 20. The touchscreen 20 comprises a touch function unit arranged to operatively receive and/or sense touches made by a user on the surface of the touchscreen 20. It is also preferred that the touchscreen 20 comprises a display function unit arranged to operatively present such items as functions, prompts, still and/or moving images etc to a user. A touch function unit and a display function unit are almost mandatory features of typical touchscreens and they are also well known to those skilled in the art. Exemplifying touchscreens in this category can e.g. be found in modern cell phones such as the M600i, W950i, P990i and others from Sony Ericsson. Hence, the touch function unit and display function unit of a touchscreen are well known and they need no detailed description.
Figure 3 shows parts of the interior of the cell phone 10 being relevant but not necessarily mandatory for the present invention. As previously explained, it is preferred that the cell phone 10 comprises a keypad 12, a speaker 13, a microphone 14 and a touchscreen 20.
In particular, it is preferred that the touchscreen 20 comprises a touch function unit 22 for receiving and detecting touches from a user of the cell phone 10, and a display function unit 24 (e.g. comprising a display such as an LCD or similar) for presenting functions, prompts, still images and/or moving images etc as mentioned above.
In addition, the cell phone 10 is preferably provided with a memory arrangement 16 for storing such items as e.g. system files and data files etc. The memory arrangement 16 may be any suitable memory or combination of memories that are commonly used in known portable devices such as e.g. cell phones or similar. In addition, the cell phone 10 comprises an antenna 17 connected to a radio circuit 18 for enabling wireless communication with a cellular network.
Furthermore, the cell phone 10 is provided with a control unit 40 for controlling and supervising the operation of the cell phone 10. The control unit 40 may be implemented by means of hardware and/or software, and it may comprise one or several hardware units and/or software modules, e.g. one or several separate processor arrangements provided with or having access to the appropriate software and hardware required for the functions to be performed by the cell phone 10, as is well known by those skilled in the art. As can be seen in Fig. 3, it is preferred that the control unit 40 is connected to or at least arranged to operatively communicate with the keypad 12, the speaker 13, the microphone 14, the touchscreen 20, the radio unit 18 and the memory 16. This provides the control unit 40 with the ability to control and communicate with these units to e.g. exchange information and instructions with the units.
In particular, the control unit 40 is provided with a drag gesture control 42, which is of special interest in connection with the present invention. Being a part of the control unit 40 implies that the drag gesture control 42 can be implemented by means of hardware and/or software and it can comprise one or several hardware units and/or software modules, e.g. one or several separate processor units provided with or having access to the software and hardware appropriate for the functions required. The drag gesture control 42 is arranged to operatively control the touchscreen arrangement 20 so as to comprise and/or communicate with the touch function unit of the touchscreen arrangement 20 for sensing touches received and detected by the touch function unit 22 of the touchscreen 20. In particular, the drag gesture control 42 is arranged so as to operatively detect a drag gesture (e.g. a tap and drag gesture) received and detected by the touch function unit 22 such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
Before we proceed it should be added that the touch function unit 22 of the touchscreen 20 may use any of: a resistive, a capacitive, a surface acoustic wave (SAW) or an infrared (IR) technique, or some other suitable touch sensing technique, as is well known to those skilled in the art. In the exemplifying resistive, capacitive, SAW or IR techniques or similar, the pressure exerted by a finger or a stylus on the touch sensing surface of the touch function unit 22 can be represented in a graph as discussed above with reference to Figs. 1a-1d. For example, a finger or a stylus or similar applied with an increasing pressure on a resistive touch sensing arrangement will typically cause the detected signal to increase gradually, whereas a decreased pressure will typically cause the detected signal to decrease. The same is valid mutatis mutandis in case of a finger applied on a capacitive, SAW or IR touch sensing arrangement: a higher pressure causes a larger area of the finger to be applied on the touch sensing arrangement, and the other way around for a lower pressure, which can be detected by the touch sensing arrangement. However, a stylus applied on a capacitive, SAW or IR touch sensing arrangement with a varying pressure may not cause the detected pressure to vary, since the area of the stylus applied on the touch sensing arrangement remains essentially the same. Rather, a constant pressure may be detected as long as the stylus remains applied on a capacitive, SAW or IR touch sensing arrangement, even if the stylus is applied with a varying pressure.
Function of Embodiments
The attention is now directed to an exemplifying method to be performed by the cell phone 10 described above and particularly by the drag gesture control 42 in the cell phone 10. The exemplifying method detects a tap and drag gesture such that the drag gesture is distinguished from other gestures, e.g. a single tap and/or a double tap. Below a finger gesture or similar is assumed. However, the same applies mutatis mutandis for a stylus gesture or similar.
First, an algorithm is needed to make a prediction of the most likely finger path, based on the most recent press-drag-release event stream. Second, a criterion for accepting or rejecting a new press-drag-release event stream must be established. Third, a method for "connecting" the first event stream with the second event stream is needed.
It is preferred that the method is performed by the drag gesture control 42 being arranged so as to logically implement the method between the touch screen driver software etc controlling the touch function unit 22 and the operating system level human interface input device handler etc controlling the display function unit 24 and the features displayed thereon. Here it is assumed that the touch function unit 22 provides X and Y coordinates to indicate the position of a finger or similar during a tap and drag on the touchscreen arrangement 20 plus a time stamp with each event, e.g. each new sampled position. It is also preferred that the touch function unit 22 provides Z values in a similar manner to indicate the level of pressure against the touch screen. Time, X, Y, and Z positions / values can all be used by the method. The touch function unit 22 may e.g. output more than 20, 40, 60, 80, 100, 120, 140, 160, 180, 200, 250, 300, 350 or more than 400 coordinates per second. Hence, we may obtain at least one of X and Y coordinate and possibly a Z value plus a sample time stamp with each such sampling event.
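As a concrete illustration of the event stream assumed above, each sampling event can be carried as a small record holding the X/Y coordinates, an optional Z pressure value and a time stamp. The following sketch is hypothetical; the type and field names are not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchSample:
    """One sampling event reported by the touch function unit 22 (names are illustrative)."""
    t: float                   # time stamp of the sample, in seconds
    x: float                   # X coordinate on the touch surface
    y: float                   # Y coordinate on the touch surface
    z: Optional[float] = None  # optional pressure value, if the hardware reports one

# A press-drag-release event stream is then an ordered list of such samples,
# delivered at e.g. 100 or more samples per second by the touchscreen driver:
stream = [TouchSample(t=0.00, x=10.0, y=20.0, z=0.8),
          TouchSample(t=0.01, x=10.4, y=20.1, z=0.7)]
```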
It is preferred that the method uses an Alpha-Beta filter or an Alpha-Beta-Gamma filter. However, a person skilled in the art having the benefit of this disclosure may contemplate other filters, possibly with inferior performance and/or being more complicated and thus more costly to develop.
Using an Alpha-Beta-Gamma filter and assuming that we know the speed and acceleration of the finger or similar since last x/y coordinate sample, then we can predict the next coordinate (if we know the time difference ΔT between two events).
We use the well-known equations for motion:
Position (prediction):

$x_{predicted} = x + (\Delta T)\,v_x + \tfrac{1}{2}(\Delta T)^2 a_x$   (1)
$y_{predicted} = y + (\Delta T)\,v_y + \tfrac{1}{2}(\Delta T)^2 a_y$   (2)
$z_{predicted} = z + (\Delta T)\,v_z + \tfrac{1}{2}(\Delta T)^2 a_z$   (3)

Velocity (prediction):

$v_{x,predicted} = v_x + (\Delta T)\,a_x$   (4)
$v_{y,predicted} = v_y + (\Delta T)\,a_y$   (5)
$v_{z,predicted} = v_z + (\Delta T)\,a_z$   (6)

Acceleration (prediction):

$a_{x,predicted} = a_x$   (7)
$a_{y,predicted} = a_y$   (8)
$a_{z,predicted} = a_z$   (9)
As can be seen from expressions 7-9 it is assumed that the acceleration is constant between two samples, which is particularly true when the sampling rate is high.
Now, it is preferred to use the latest raw x, y coordinates and possibly also the z coordinate measured by the touch function unit 22, even though these measures may be heavily influenced by noise. So we mix these measures in with the prediction to a fixed degree, which is determined by the Alpha (α), Beta (β) and Gamma (γ) factors, i.e.:
Change = Measured - Predicted (10) (i.e. a change is the offset of the measure from the prediction)
Expression (10) written for position, speed and acceleration:
"new position" = "position prediction" + α "position change" (1 1 )
"new speed" = "speed prediction" + β "speed change" (12) "new acceleration" = prediction + Y "acceleration change" (13)
So the necessary calculations are:
Change (offset from prediction):

$\Delta x = x_{measured} - \left(x + (\Delta T)\,v_x + \tfrac{1}{2}(\Delta T)^2 a_x\right)$   (14)
(i.e. $\Delta x = x_{measured} - x_{predicted}$)
$\Delta y = y_{measured} - \left(y + (\Delta T)\,v_y + \tfrac{1}{2}(\Delta T)^2 a_y\right)$   (15)
$\Delta z = z_{measured} - \left(z + (\Delta T)\,v_z + \tfrac{1}{2}(\Delta T)^2 a_z\right)$   (16)

$\Delta v_x = \Delta x / \Delta T$   (17)
$\Delta v_y = \Delta y / \Delta T$   (18)
$\Delta v_z = \Delta z / \Delta T$   (19)

$\Delta a_x = \Delta v_x / \Delta T$   (20)
$\Delta a_y = \Delta v_y / \Delta T$   (21)
$\Delta a_z = \Delta v_z / \Delta T$   (22)

x/y/z positions updated (filtered), using expressions 1-3, 11 and 14-16:

$x_{new} = x + (\Delta T)\,v_x + \tfrac{1}{2}(\Delta T)^2 a_x + \alpha\,\Delta x = x_{measured} - \Delta x + \alpha\,\Delta x \Rightarrow$

$x_{new} = x_{measured} - \Delta x\,(1 - \alpha)$   (23)
$y_{new} = y_{measured} - \Delta y\,(1 - \alpha)$   (24)
$z_{new} = z_{measured} - \Delta z\,(1 - \alpha)$   (25)

x/y/z speeds updated (filtered), using expressions 4-6, 12 and 17-19:

$v_{x,new} = v_x + (\Delta T)\,a_x + \beta\,\Delta v_x$   (26)
$v_{y,new} = v_y + (\Delta T)\,a_y + \beta\,\Delta v_y$   (27)
$v_{z,new} = v_z + (\Delta T)\,a_z + \beta\,\Delta v_z$   (28)

x/y/z accelerations updated (filtered), using expressions 7-9, 13 and 20-22:

$a_{x,new} = a_x + \gamma\,\Delta a_x$   (29)
$a_{y,new} = a_y + \gamma\,\Delta a_y$   (30)
$a_{z,new} = a_z + \gamma\,\Delta a_z$   (31)
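For readers who prefer code, here is a minimal single-axis sketch of the prediction and update scheme given by expressions (1)-(31). The class name, the particular α, β, γ values and the zero initial speed and acceleration are assumptions for illustration only; in the device one such filter would be run per coordinate (x, y and, where available, z).

```python
class AxisAlphaBetaGammaFilter:
    """Alpha-Beta-Gamma filter for a single axis, following expressions (1)-(31)."""

    def __init__(self, x0, alpha=0.85, beta=0.5, gamma=0.1):
        # Speed and acceleration are assumed to be zero for the first sample (cf. step S2).
        self.x, self.v, self.a = float(x0), 0.0, 0.0
        self.alpha, self.beta, self.gamma = alpha, beta, gamma

    def predict(self, dt):
        """Expressions (1)-(9): predicted position, speed and acceleration after dt."""
        x_p = self.x + dt * self.v + 0.5 * dt * dt * self.a
        v_p = self.v + dt * self.a
        a_p = self.a  # acceleration assumed constant between samples
        return x_p, v_p, a_p

    def update(self, x_measured, dt):
        """Expressions (14)-(31): mix the raw measurement into the prediction."""
        x_p, v_p, a_p = self.predict(dt)
        dx = x_measured - x_p   # change, expressions (14)-(16)
        dv = dx / dt            # speed change, expressions (17)-(19)
        da = dv / dt            # acceleration change, expressions (20)-(22)
        self.x = x_p + self.alpha * dx   # filtered position, expressions (23)-(25)
        self.v = v_p + self.beta * dv    # filtered speed, expressions (26)-(28)
        self.a = a_p + self.gamma * da   # filtered acceleration, expressions (29)-(31)
        return self.x, self.v, self.a

# Usage: feed each new raw sample together with the sampling interval ΔT.
fx = AxisAlphaBetaGammaFilter(x0=10.0)
fx.update(10.4, dt=0.01)
print(fx.predict(dt=0.01))   # predicted (x, vx, ax) at the next sampling instant
```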
The attention is now directed to exemplifying steps of a method according to an embodiment of the invention for detecting a drag gesture based on the predictions as described above. In this example it is assumed that the method is performed by the cell phone 10 as described above and particularly by the drag gesture control 42 in the cell phone 10. The steps of the exemplifying method will be described with reference to the exemplifying flowchart in Fig. 4. The steps of the method are preferably implemented by means of the drag gesture control 42 as schematically illustrated in Fig. 3.
In a first step S1 of an exemplifying method according to an embodiment of the present invention an initialization of the cell phone 10 and particularly the drag gesture control 42 is performed. The initialisation may e.g. include such actions as activating the touch function unit 22, e.g. activating a resistive, capacitive, surface-wave-acoustic (SAW) or infrared touch sensing arrangement or some other suitable touch sensing technique comprised by the touch function unit 22. The initialisation may also include such actions as initialising the display function unit 24 and preferably the other necessary functions and/or units of the cell phone 10.
In a second step S2 of the exemplifying method it is preferred that a first touch gesture G1 is recorded at a first press event by the touch function unit 22 and made available to the drag gesture control 42. It is assumed that the recording of the first touch gesture G1 comprises a number of consecutive samples of X, Y, and possibly Z positions / values and a number of time differences ΔT each marking the time difference between two consecutive and adjacent samples.
Here, it may be clarified that in a tap and drag gesture such as that illustrated in Fig. 1 c there will normally be movements that can be detected and sampled during the whole sequence including but not limited to Δd1 and Δd2 as schematically indicated in Fig. 1 c.
However, there may be an issue with the first sample in the consecutive number of samples constituting the first gesture G1. For the first sample we prefer to use the first raw X, Y, and possibly Z positions / values as they are, or alternatively only as starting values, to predict and filter the next X, Y, and possibly Z positions / values (i.e. throw away the first sample). Speed and acceleration may have to be assumed to be zero for the first sample.
In a third step S3 of the exemplifying method it is preferred that a predicted gesture Gp is obtained based on the samples in the first gesture G1 and expressions 1-9 given above. The prediction is based on the last X, Y, and possibly Z positions / values and the time difference ΔT between two events (e.g. between two samples). Here, ΔT may be given by the sampling rate / sampling interval.
In a fourth step S4 of the exemplifying method it is preferred that a second touch gesture G2 is recorded at a second press event by the touch function unit 22 and made available to the drag gesture control 42. It is assumed that the recording of the second touch gesture G2 comprises a number of consecutive samples of X, Y, and possibly Z positions / values and a number of time differences ΔT each marking the time difference between two consecutive and adjacent samples.

In a fifth step S5 of the exemplifying method it is preferably determined whether the first gesture G1 should be considered as a part of the same gesture as the second gesture G2, i.e. it is determined whether the two gestures G1, G2 form a single gesture.
Here, it is preferred that the prediction Gp - obtained in the third step S3 by expressions 1-9 operating on samples forming the first gesture G1 as described above - is compared to the first sample of X, Y, and possibly Z positions / values of the second gesture G2. If the first sample of X, Y and possibly Z of the second gesture G2 is measured very close to the prediction, and preferably at the same time as the time ΔTG between the first gesture G1 and the first sample of the second gesture G2 is short, then we can assume that G1 and G2 are parts of the same gesture. For example, G1 and G2 may be considered parts of the same gesture when the difference (Δx, Δy) between said predicted gesture position xp, yp and said second gesture position x2, y2 is less than 0.1 millimetres, 0.5 millimetres or less than 1 millimetre. Similarly, the short time ΔTG between the first gesture G1 and the first sample of the second gesture G2 may e.g. be less than 0.1 seconds, 0.5 seconds or less than 1 second.
It is even more preferred that G1 and G2 are assumed to be parts of the same gesture if Δvx and Δvy as defined by expressions 17 and 18 respectively are small, preferably at the same time as the time ΔTG elapsed between the events is short (i.e. ΔTG is small as indicated above). For example, G1 and G2 may be considered parts of the same gesture when the difference (Δvx, Δvy) between said predicted gesture velocity vp and said second gesture velocity v2 is less than 0.1 millimetres per second, 0.5 millimetres per second or less than 1 millimetre per second.
We use Δvx and Δvy defined in expressions 17, 18 rather than Δx and Δy defined in expressions 14, 15, since the acceptable difference should preferably depend on the duration of a "dropout", i.e. be determined by the duration between the last sample of G1 and the first sample of G2 or a similar or corresponding time difference such as e.g. ΔTG. Hence, it is more relevant to look at discrepancies in speed between the prediction and the actual measure.
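A small hypothetical sketch of the step-S5 test described above: the thresholds are taken from the illustrative ranges in the text (fractions of a millimetre, fractions of a second), but the exact numbers, names and function signature are assumptions.

```python
# Step S5 (illustrative): does the second press event G2 continue the first gesture G1?
MAX_POS_DIFF = 0.5   # mm between predicted position (xp, yp) and G2's first sample (x2, y2)
MAX_VEL_DIFF = 0.5   # mm/s between predicted and measured speed (Δvx, Δvy)
MAX_GAP_TIME = 0.5   # s allowed for the dropout ΔTG between G1 and G2

def probably_same_gesture(dx, dy, dvx, dvy, gap_time):
    """dx/dy and dvx/dvy are offsets of G2's first sample from the prediction Gp; gap_time is ΔTG."""
    close_in_position = abs(dx) < MAX_POS_DIFF and abs(dy) < MAX_POS_DIFF
    close_in_speed = abs(dvx) < MAX_VEL_DIFF and abs(dvy) < MAX_VEL_DIFF
    # Speed is the preferred criterion, since it scales naturally with the dropout duration.
    return gap_time < MAX_GAP_TIME and (close_in_speed or close_in_position)
```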
If it can be assumed that the gestures G1 , G2 are not a part of the same gesture it is preferred that the method returns to the second step S2 described above. However, if it can be assumed that the gestures G1 , G2 form parts of the same gesture it is preferred that the method proceeds to the next sixth step S6.
In a sixth step S6 of the exemplifying method it is preferably determined whether the gestures G1, G2 that are assumed to form parts of the same gesture should actually be converted into a single contiguous gesture. In other words, under the assumption that the two gestures G1, G2 are parts of a single gesture it is determined whether the gestures G1, G2 should actually form a contiguous gesture.
It is preferred that the first gesture G1 and the second gesture G2 are converted into a single contiguous gesture if:
$\Delta v_x < k_{vx}$   (32)
$\Delta v_y < k_{vy}$   (33)
$\Delta T_G < k_t$   (34)
wherein kvx , kvy and kt are constants.
Note that while it will be reasonable to hope for a good vx/vy prediction during a dropout, the same is not true for the optional Z-values. If there indeed was a dropout, Z should have decreased up until the dropout and then increased again after the dropout. We must look for low Z values (low pressure) before and after the dropout. In addition, Z-speed and Z-acceleration may be considered. Acceleration should be moderate and speed should be negative before the dropout (provided that more pressure means a higher Z value).
If Z-values are available, we also need these conditions to be true just before the dropout:
$z < k_z$   (35)
$v_z < 0$   (36)
$|v_z| < k_{vz}$   (37)
$|a_z| < k_{az}$   (38)
wherein kz, kvz and kaz are constants.

If it can be assumed that the gestures G1, G2 should not form a contiguous gesture it is preferred that the method returns to the second step S2 described above. However, if it can be assumed that the gestures G1, G2 should form a contiguous gesture it is preferred that the method proceeds to the next seventh step S7.
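The step-S6 conditions (32)-(38) can likewise be gathered into one check. The constants kvx, kvy, kt, kz, kvz and kaz come from the text above, but the numeric values below, the use of absolute values for the speed differences, and the function shape are assumptions for illustration.

```python
# Step S6 (illustrative): should G1 and G2 actually be joined into one contiguous gesture?
K_VX, K_VY, K_T = 0.5, 0.5, 0.5    # speed limits (mm/s) and dropout time limit (s), assumed values
K_Z, K_VZ, K_AZ = 0.2, 1.0, 5.0    # pressure, pressure-speed and pressure-acceleration limits, assumed

def should_join(dvx, dvy, gap_time, z_before=None, vz_before=None, az_before=None):
    ok = abs(dvx) < K_VX and abs(dvy) < K_VY and gap_time < K_T      # expressions (32)-(34)
    if None not in (z_before, vz_before, az_before):
        # Optional Z conditions just before the dropout, expressions (35)-(38):
        ok = ok and z_before < K_Z                                   # low pressure (35)
        ok = ok and vz_before < 0 and abs(vz_before) < K_VZ          # pressure decreasing, moderately (36)-(37)
        ok = ok and abs(az_before) < K_AZ                            # moderate pressure acceleration (38)
    return ok
```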
In a seventh step S7 of the exemplifying method it is preferred that the first gesture G1 and the second gesture G2 are converted to a single contiguous gesture. This is preferably accomplished by means of the expressions 23-31 above. In other words, the first sample of X, Y, and possibly Z positions / values of the second gesture G2 is preferably replaced by new filtered positions / values, preferably calculated as indicated by the expressions 23-31 above, i.e. calculated depending on the first sample of X, Y, and possibly Z positions / values in the second gesture G2, adjusted with a weighting of the difference between the first sample of X, Y, and possibly Z positions / values in the second gesture G2 and the prediction of these positions / values.
In general, as previously explained, it is preferred that the drag gesture control 42 is adapted to perform the exemplifying method as described above by being provided with one or more processors having corresponding memory containing the appropriate software in the form of program code or similar. However, program code or similar can also be provided on a data carrier such as a CD ROM disc 56 as depicted in Fig. 5 or an insertable memory stick, which code or similar will perform the invention when loaded into a computer or into a phone having suitable processing capabilities. The program code can also be downloaded remotely from a server either outside or inside the cellular network or be downloaded via a computer such as a PC to which the phone is temporarily connected.
The present invention has now been described with reference to exemplifying embodiments. However, the invention is not limited to the embodiments described herein. On the contrary, the full extent of the invention is only determined by the scope of the appended claims.

Claims

1. A method for recognizing at least one drag gesture detected by a touch sensitive arrangement (20, 22, 24) of a portable device (10), which method in the portable device (10) comprises the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture (G1 ) made on the touch sensitive arrangement (20), obtaining at least one predicted parameter indicative of a predicted next gesture (Gp) based on said first parameter, - recording and obtaining at least one second parameter indicative of a second touch gesture (G2) made on the touch sensitive arrangement (20), converting the first touch gesture (G1 ) and the second touch gesture (G2) to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
2. The method according to claim 1 , wherein the first parameter comprises at least one first gesture position (x1 , y1 ), the predicted parameter comprises at least one predicted gesture position (xp, yp), and the second parameter comprises at least one second gesture position (x2, y2), and the first touch gesture (G1 ) and the second touch gesture (G2) are converted to a contiguous gesture when the difference (Δx, Δy) between said predicted gesture position (xp, yp) and said second gesture position (x2, y2) is less than a predetermined value.
3. The method according to any one of claim 1 or 2, wherein the first parameter comprises a first gesture velocity (v1x, v1y), the predicted parameter comprises a predicted gesture velocity (vpx, vpy), and the second parameter comprises a second gesture velocity (v2x, v2y), and - the first touch gesture (G1 ) and the second touch gesture (G2) are converted to a contiguous gesture when the difference (Δvx, Δvy) between said predicted gesture velocity (vpx, vpy) and said second gesture velocity (v2x, v2y) is less than a predetermined value.
4. The method according to claim 3, wherein the first touch gesture (G1) and the second touch gesture (G2) are converted to a contiguous gesture when the predicted gesture velocity (vpx, vpy) and said second gesture velocity (v2x, v2y) are substantially equal.
5. The method according to any one of claim 1 , 2, 3 or 4, wherein the first parameter comprises a first gesture time stamp (t1 ) and the second parameter comprises a second gesture time stamp (t2), and the first touch gesture (G1 ) and the second touch gesture (G2) are converted to a contiguous gesture when the difference ( ΔTG ) between said first gesture time stamp (t1 ) and said second gesture time stamp (t2) is less than a predetermined value.
6. The method according to any one of claim 2, 3, 4 or 5, which method comprises the steps of: converting the first touch gesture (G1 ) and the second touch gesture (G2) to a contiguous gesture by using a filtered version of at least one second parameter indicative of the second touch gesture (G2).
7. The method according to claim 6, which method comprises the steps of: converting the first touch gesture (G1 ) and the second touch gesture (G2) to a contiguous gesture by using an Alpha-Beta filter for filtering said at least one second parameter indicative of the second touch gesture (G2).
8. A portable device (10) comprising a touch sensing arrangement (20, 22, 24) and being configured to perform the method according to any one of claims 1-7.
9. A computer program product stored on a computer usable medium (56), comprising readable program means for causing a portable device (10) to execute, when said program means is loaded in the portable device (10) comprising a touch sensing arrangement (20, 22, 24) configured to recognize at least one drag gesture detected by a touch sensitive arrangement (20, 22, 24); the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture (G1 ) made on the touch sensitive arrangement (20), obtaining at least one predicted parameter indicative of a predicted next gesture (Gp) based on said first parameter, recording and obtaining at least one second parameter indicative of a second touch gesture (G2) made on the touch sensitive arrangement (20), converting the first touch gesture (G1 ) and the second touch gesture (G2) to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
US11550471B2 (en)2015-03-192023-01-10Apple Inc.Touch input cursor manipulation
US9785305B2 (en)2015-03-192017-10-10Apple Inc.Touch input cursor manipulation
US10599331B2 (en)2015-03-192020-03-24Apple Inc.Touch input cursor manipulation
US9639184B2 (en)2015-03-192017-05-02Apple Inc.Touch input cursor manipulation
US10222980B2 (en)2015-03-192019-03-05Apple Inc.Touch input cursor manipulation
US11054990B2 (en)2015-03-192021-07-06Apple Inc.Touch input cursor manipulation
US10067653B2 (en)2015-04-012018-09-04Apple Inc.Devices and methods for processing touch inputs based on their intensities
US10152208B2 (en)2015-04-012018-12-11Apple Inc.Devices and methods for processing touch inputs based on their intensities
US9674426B2 (en)2015-06-072017-06-06Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US10346030B2 (en)2015-06-072019-07-09Apple Inc.Devices and methods for navigating between user interfaces
US10200598B2 (en)2015-06-072019-02-05Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US12346550B2 (en)2015-06-072025-07-01Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US9602729B2 (en)2015-06-072017-03-21Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en)2015-06-072018-01-02Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US10455146B2 (en)2015-06-072019-10-22Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US9830048B2 (en)2015-06-072017-11-28Apple Inc.Devices and methods for processing touch inputs with instructions in a web page
US9916080B2 (en)2015-06-072018-03-13Apple Inc.Devices and methods for navigating between user interfaces
US10303354B2 (en)2015-06-072019-05-28Apple Inc.Devices and methods for navigating between user interfaces
US10841484B2 (en)2015-06-072020-11-17Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US11835985B2 (en)2015-06-072023-12-05Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US11681429B2 (en)2015-06-072023-06-20Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US9706127B2 (en)2015-06-072017-07-11Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US11231831B2 (en)2015-06-072022-01-25Apple Inc.Devices and methods for content preview based on touch input intensity
US11240424B2 (en)2015-06-072022-02-01Apple Inc.Devices and methods for capturing and interacting with enhanced digital images
US10705718B2 (en)2015-06-072020-07-07Apple Inc.Devices and methods for navigating between user interfaces
US9891811B2 (en)2015-06-072018-02-13Apple Inc.Devices and methods for navigating between user interfaces
US11327648B2 (en)2015-08-102022-05-10Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10698598B2 (en)2015-08-102020-06-30Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11182017B2 (en)2015-08-102021-11-23Apple Inc.Devices and methods for processing touch inputs based on their intensities
US11740785B2 (en)2015-08-102023-08-29Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10754542B2 (en)2015-08-102020-08-25Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10884608B2 (en)2015-08-102021-01-05Apple Inc.Devices, methods, and graphical user interfaces for content navigation and manipulation
US10235035B2 (en)2015-08-102019-03-19Apple Inc.Devices, methods, and graphical user interfaces for content navigation and manipulation
US10162452B2 (en)2015-08-102018-12-25Apple Inc.Devices and methods for processing touch inputs based on their intensities
US10209884B2 (en)2015-08-102019-02-19Apple Inc.Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10416800B2 (en)2015-08-102019-09-17Apple Inc.Devices, methods, and graphical user interfaces for adjusting user interface objects
US10203868B2 (en)2015-08-102019-02-12Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10963158B2 (en)2015-08-102021-03-30Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9880735B2 (en)2015-08-102018-01-30Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US12386501B2 (en)2015-08-102025-08-12Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10248308B2 (en)2015-08-102019-04-02Apple Inc.Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures

Similar Documents

Publication | Publication Date | Title
WO2009155981A1 (en) Gesture on touch sensitive arrangement
CN103262008B (en) Intelligent wireless mouse
US20090066659A1 (en) Computer system with touch screen and separate display screen
US20100259499A1 (en) Method and device for recognizing a dual point user input on a touch based user input device
US20160162064A1 (en) Method for actuating a tactile interface layer
US20110187652A1 (en) Bump suppression
CN110069178B (en) Interface control method and terminal equipment
EP3612920A1 (en) Electronic device response to force-sensitive interface
WO2006020305A2 (en) Gestures for touch sensitive input devices
KR20150101213A (en) Electronic device, wearable device and method for the input of the electronic device
WO2018194718A1 (en) Force-sensitive user input interface for an electronic device
US8810529B2 (en) Electronic device and method of controlling same
CN110703972B (en) File control method and electronic equipment
CN108984096A (en) Touch operation method, device, storage medium and electronic device
WO2021196346A1 (en) Capacitive touch device and gesture recognition method thereof, chip and storage medium
US10733280B2 (en) Control of a mobile device based on fingerprint identification
KR20080105724A (en) Communication terminal having a touch panel and a touch coordinate value calculation method thereof
CN103809894B (en) A kind of recognition methods of gesture and electronic equipment
CN111176512B (en) Icon processing method and electronic equipment
CN108427534B (en) Method and device for controlling screen to return to desktop
CN113407066B (en) Touch controller of handheld device and control method thereof
CN108459818A (en) The method and apparatus for controlling unlocking screen
EP3528103B1 (en) Screen locking method, terminal and screen locking device
KR100859882B1 (en) Method and device for recognizing dual point user input on touch based user input device
CN111124204B (en) Application control method and electronic device

Legal Events

Date | Code | Title | Description
121 | Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08761387

Country of ref document: EP

Kind code of ref document: A1

NENP | Non-entry into the national phase

Ref country code: DE

