This specification was published by the Touch Events Community Group. It is not a W3C Standard nor is it on the W3C Standards Track.

Please note that under the W3C Community Final Specification Agreement (FSA) other conditions apply.

This version of the specification includes fixes and improvements to Touch Events Level 1, and incorporates the features previously published as Touch Event Extensions.

There is currently no intention to carry on further work on the Touch Events specification. This document has been maintained up to this point to reflect additions and changes made in user agents since the release of the original Touch Events Level 1 specification. The Community Group considers Touch Events a legacy API – authors are strongly encouraged to adopt Pointer Events instead.

The Touch Events specification defines a set of low-level events that represent one or more points of contact with a touch-sensitive surface, and changes of those points with respect to the surface and any DOM elements displayed upon it (e.g. for touch screens) or associated with it (e.g. for drawing tablets without displays). It also addresses pen-tablet devices, such as drawing tablets, with consideration toward stylus capabilities.

Introduction

User Agents that run on terminals which provide touch input to use web applications typically use interpreted mouse events to allow users to access interactive web applications. However, these interpreted events, being normalized data based on the physical touch input, tend to have limitations on delivering the intended user experience. Additionally, it is not possible to handle concurrent input regardless of device capability, due to constraints of mouse events: both system level limitations and legacy compatibility.

Meanwhile, native applications are capable of handling both cases with the provided system APIs.

The Touch Events specification provides a solution to this problem by specifying interfaces to allow web applications to directly handle touch events, and multiple touch points for capable devices.

This specification defines conformance criteria that apply to a single product: the user agent that implements the interfaces that it contains.

WindowProxy is defined in [[!HTML5]].

WebIDL Conformance

The IDL blocks in this specification are conforming IDL fragments as defined by the WebIDL specification [[!WEBIDL]].

A conforming user agent must also be a conforming JavaScript implementation of the IDL fragments in this specification, with the following exception:

Note: Both ways of reflecting IDL attributes allow for simply getting and setting the property on the platform object to work. For example, given a Touch object aTouch, evaluating aTouch.target would return the EventTarget for the Touch object. If the user agent implements IDL attributes as accessor properties, then the property access invokes the getter which returns the EventTarget. If the user agent implements IDL attributes as data properties on the platform object with the same behavior as would be found with the accessor properties, then the object would appear to have an own property named target whose value is an EventTarget object, and the property access would return this value.

Touch Interface

This interface describes an individual touch point for a touch event. Touch objects are immutable; after one is created, its attributes must not change.

enum TouchType {
    "direct",
    "stylus"
};

dictionary TouchInit {
    required long        identifier;
    required EventTarget target;
             double      clientX = 0;
             double      clientY = 0;
             double      screenX = 0;
             double      screenY = 0;
             double      pageX = 0;
             double      pageY = 0;
             float       radiusX = 0;
             float       radiusY = 0;
             float       rotationAngle = 0;
             float       force = 0;
             double      altitudeAngle = 0;
             double      azimuthAngle = 0;
             TouchType   touchType = "direct";
};

[Exposed=Window]
interface Touch {
    constructor(TouchInit touchInitDict);
    readonly attribute long        identifier;
    readonly attribute EventTarget target;
    readonly attribute double      screenX;
    readonly attribute double      screenY;
    readonly attribute double      clientX;
    readonly attribute double      clientY;
    readonly attribute double      pageX;
    readonly attribute double      pageY;
    readonly attribute float       radiusX;
    readonly attribute float       radiusY;
    readonly attribute float       rotationAngle;
    readonly attribute float       force;
    readonly attribute float       altitudeAngle;
    readonly attribute float       azimuthAngle;
    readonly attribute TouchType   touchType;
};
identifier

An identification number for each touch point.

When a touch point becomes active, it must be assigned an identifier that is distinct from any other active touch point. While the touch point remains active, all events that refer to it must assign it the same identifier.
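The identifier is what lets script correlate the same physical touch across touchstart, touchmove, and touchend events. A minimal sketch of this bookkeeping, using a Map keyed by identifier (the handler names and plain-object touch shapes here are illustrative, not part of this specification):

```javascript
// Active touch points, keyed by Touch.identifier. Touch objects are
// immutable, so each event delivers a fresh Touch for the same identifier.
const ongoingTouches = new Map();

// Called with event.changedTouches on touchstart.
function handleStart(changedTouches) {
  for (const touch of changedTouches) {
    ongoingTouches.set(touch.identifier, { x: touch.pageX, y: touch.pageY });
  }
}

// Called with event.changedTouches on touchmove.
function handleMove(changedTouches) {
  for (const touch of changedTouches) {
    const prev = ongoingTouches.get(touch.identifier);
    if (prev) {
      // e.g. draw a line from (prev.x, prev.y) to the new position
      ongoingTouches.set(touch.identifier, { x: touch.pageX, y: touch.pageY });
    }
  }
}

// Called with event.changedTouches on touchend or touchcancel.
function handleEnd(changedTouches) {
  for (const touch of changedTouches) {
    ongoingTouches.delete(touch.identifier);
  }
}
```

Handling touchcancel the same way as touchend keeps the map from leaking entries when the user agent aborts a touch.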

target

The EventTarget on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of that element.

Some implementations alter the target element to correct for the imprecision of coarse input. Therefore, the target element may not necessarily be the element directly at the coordinates of the event. The methods used to target/disambiguate coarse input are out of scope for this specification.

screenX

The horizontal coordinate of the touch point relative to the screen, in pixels.

screenY

The vertical coordinate of the touch point relative to the screen, in pixels.

clientX

The horizontal coordinate of the touch point relative to the viewport, in pixels, excluding any scroll offset.

clientY

The vertical coordinate of the touch point relative to the viewport, in pixels, excluding any scroll offset.

pageX

The horizontal coordinate of the touch point relative to the viewport, in pixels, including any scroll offset.

pageY

The vertical coordinate of the touch point relative to the viewport, in pixels, including any scroll offset.

radiusX

The radius of the ellipse which most closely circumscribes the touching area (e.g. finger, stylus) along the axis indicated by rotationAngle, in CSS pixels (as defined by [[!CSS-VALUES]]) of the same scale as screenX; 0 if no value is known. The value must not be negative.

radiusY

The radius of the ellipse which most closely circumscribes the touching area (e.g. finger, stylus) along the axis perpendicular to that indicated by rotationAngle, in CSS pixels (as defined by [[!CSS-VALUES]]) of the same scale as screenY; 0 if no value is known. The value must not be negative.

rotationAngle

The angle (in degrees) that the ellipse described by radiusX and radiusY is rotated clockwise about its center; 0 if no value is known. The value must be greater than or equal to 0 and less than 90.

If the ellipse described by radiusX and radiusY is circular, then rotationAngle has no effect. The user agent may use 0 as the value in this case, or it may use any other value in the allowed range. (For example, the user agent may use the rotationAngle value from the previous touch event, to avoid sudden changes.)
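For illustration, the three ellipse attributes map directly onto the Canvas 2D ellipse() method; note that ellipse() expects the rotation in radians while rotationAngle is reported in degrees. The helper names below are hypothetical:

```javascript
// rotationAngle is in degrees; CanvasRenderingContext2D.ellipse()
// takes the rotation in radians.
function degreesToRadians(deg) {
  return deg * Math.PI / 180;
}

// Sketch: outline the contact area reported by a Touch object on a
// 2D canvas context. radiusX/radiusY may be 0 when unknown, in which
// case nothing visible is drawn.
function drawContactEllipse(ctx, touch) {
  ctx.beginPath();
  ctx.ellipse(touch.clientX, touch.clientY,
              touch.radiusX, touch.radiusY,
              degreesToRadians(touch.rotationAngle),
              0, 2 * Math.PI);
  ctx.stroke();
}
```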

force

A relative value of pressure applied, in the range 0 to 1, where 0 is no pressure, and 1 is the highest level of pressure the touch device is capable of sensing; 0 if no value is known. In environments where force is known, the absolute pressure represented by the force attribute, and the sensitivity in levels of pressure, may vary.

altitudeAngle

The altitude (in radians) of the transducer (e.g. pen/stylus), in the range [0,π/2] — where 0 is parallel to the surface (X-Y plane), and π/2 is perpendicular to the surface. For hardware and platforms that do not report tilt or angle, the value MUST be 0.

The default value defined here for altitudeAngle is 0. This differs from the Pointer Events - Level 3 [[POINTEREVENTS]] specification's definition for the altitudeAngle property, which has a default value of π/2, which positions the transducer as being perpendicular to the surface.
[Figure: altitudeAngle explanation diagram. Example: altitudeAngle of π/4 (45 degrees from the X-Y plane).]
azimuthAngle

The azimuth angle (in radians) of the transducer (e.g. pen/stylus), in the range [0, 2π] — where 0 represents a transducer whose cap is pointing in the direction of increasing X values (point to "3 o'clock" if looking straight down) on the X-Y plane, and the values progressively increase when going clockwise (π/2 at "6 o'clock", π at "9 o'clock", 3π/2 at "12 o'clock"). When the transducer is perfectly perpendicular to the surface (altitudeAngle of π/2), the value MUST be 0. For hardware and platforms that do not report tilt or angle, the value MUST be 0.

[Figure: azimuthAngle explanation diagram. Example: azimuthAngle of π/6 ("4 o'clock").]
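As a non-normative illustration of how these spherical angles relate to the tiltX/tiltY representation used by Pointer Events: treating the transducer as a unit vector with x = cos(altitude)·cos(azimuth), y = cos(altitude)·sin(azimuth), z = sin(altitude), the tilts are the signed angles of that vector's projections onto the X-Z and Y-Z planes. The helper below is an assumption-laden sketch, not part of this specification:

```javascript
// Illustrative conversion from altitudeAngle/azimuthAngle (radians)
// to tiltX/tiltY (degrees). A perpendicular transducer
// (altitudeAngle = pi/2) yields tiltX = tiltY = 0.
function sphericalToTilt(altitudeAngle, azimuthAngle) {
  // Unit vector along the transducer, pointing away from the surface.
  const x = Math.cos(altitudeAngle) * Math.cos(azimuthAngle);
  const y = Math.cos(altitudeAngle) * Math.sin(azimuthAngle);
  const z = Math.sin(altitudeAngle);
  const radToDeg = 180 / Math.PI;
  return {
    tiltX: Math.atan2(x, z) * radToDeg, // lean toward +X is positive
    tiltY: Math.atan2(y, z) * radToDeg  // lean toward +Y is positive
  };
}
```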
touchType

The type of device used to trigger the touch.

TouchType

An enumeration representing the different types of possible touch input.

direct

A direct touch from a finger on the screen.

stylus

A touch from a stylus or pen device.

TouchList Interface

This interface defines a list of individual points of contact for a touch event. TouchList objects are immutable; after one is created, its contents must not change.

A TouchList object's supported property indices ([[!WEBIDL]]) are the numbers in the range 0 to one less than the length of the list.

[Exposed=Window]
interface TouchList {
    readonly attribute unsigned long length;
    getter Touch? item (unsigned long index);
};
length

Returns the number of Touch objects in the list.

item

Returns the Touch at the specified index in the list, or null if the index is not less than the length of the list.
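Because TouchList exposes only length and item(index) (it is not an Array and has no forEach or map), iterating it takes a plain index loop. A minimal sketch, usable with any list-shaped object:

```javascript
// Copies a TouchList-shaped object (anything with length and item(i))
// into a real Array so Array methods like forEach/filter can be used.
function touchListToArray(list) {
  const result = [];
  for (let i = 0; i < list.length; i++) {
    result.push(list.item(i));
  }
  return result;
}

// Usage (inside a touch event handler):
// const touches = touchListToArray(ev.touches);
// touches.forEach(t => console.log(t.identifier));
```

In modern engines Array.from(ev.touches) achieves the same result, since TouchList is iterable there.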

TouchEvent Interface

This interface defines the touchstart, touchend, touchmove, and touchcancel event types. TouchEvent objects are immutable; after one is created and initialized, its attributes must not change. TouchEvent inherits from the UIEvent interface defined in [[!DOM-LEVEL-3-EVENTS]].

The TouchEventInit dictionary is used by the TouchEvent interface's constructor to provide a mechanism by which to construct untrusted (synthetic) touch events. It inherits from the EventModifierInit dictionary defined in [[!DOM-LEVEL-3-EVENTS]]. The steps for constructing an event are defined in [[!DOM4]]. See the example for sample code demonstrating how to fire an untrusted touch event.

dictionary TouchEventInit : EventModifierInit {
    sequence<Touch> touches = [];
    sequence<Touch> targetTouches = [];
    sequence<Touch> changedTouches = [];
};

[Exposed=Window]
interface TouchEvent : UIEvent {
    constructor(DOMString type, optional TouchEventInit eventInitDict = {});
    readonly attribute TouchList touches;
    readonly attribute TouchList targetTouches;
    readonly attribute TouchList changedTouches;
    readonly attribute boolean   altKey;
    readonly attribute boolean   metaKey;
    readonly attribute boolean   ctrlKey;
    readonly attribute boolean   shiftKey;
    getter boolean getModifierState (DOMString keyArg);
};
touches

A list of Touch objects for every point of contact currently touching the surface.

targetTouches

A list of Touch objects for every point of contact that is touching the surface and started on the element that is the target of the current event.

changedTouches

A list of Touch objects for every point of contact which contributed to the event.

For the touchstart event this must be a list of the touch points that just became active with the current event. For the touchmove event this must be a list of the touch points that have moved since the last event. For the touchend and touchcancel events this must be a list of the touch points that have just been removed from the surface, with the last known coordinates of the touch points before they were removed.

altKey

true if the alt (Alternate) key modifier is activated; otherwise false.

metaKey

true if the meta (Meta) key modifier is activated; otherwise false. On some platforms this attribute may map to a differently-named key modifier.

ctrlKey

true if the ctrl (Control) key modifier is activated; otherwise false.

shiftKey

true if the shift (Shift) key modifier is activated; otherwise false.

getModifierState(keyArg)

Queries the state of a modifier using a key value. Returns true if it is a modifier key and the modifier is activated, false otherwise.

TouchEvent Implementer's Note

User agents should ensure that all Touch objects available from a given TouchEvent are all associated to the same document that the TouchEvent was dispatched to. To implement this, user agents should maintain a notion of the current touch-active document. On first touch, this is set to the target document where the touch was created. When all active touch points are released, the touch-active document is cleared. All TouchEvents are dispatched to the current touch-active document, and each Touch object it contains refers only to DOM elements (and coordinates) in that document. If a touch starts entirely outside the currently touch-active document, then it is ignored entirely.

Usage Examples

The examples below demonstrate the relations between the different TouchList members defined in a TouchEvent.

touches and targetTouches of a TouchEvent

This example demonstrates the utility and relations between the touches and targetTouches members defined in the TouchEvent interface. The following code will generate different output based on the number of touch points on the touchable element and the document:

<div id="touchable">This element is touchable.</div>
<script>
document.getElementById('touchable').addEventListener('touchstart', function(ev) {
    if (ev.touches.item(0) == ev.targetTouches.item(0))
    {
        /**
         * If the first touch on the surface is also targeting the
         * "touchable" element, the code below should execute.
         * Since targetTouches is a subset of touches which covers the
         * entire surface, touches.length >= targetTouches.length
         * is always true.
         */
        document.write('Hello Touch Events!');
    }
    if (ev.touches.length == ev.targetTouches.length)
    {
        /**
         * If all of the active touch points are on the "touchable"
         * element, the length properties should be the same.
         */
        document.write('All points are on target element');
    }
    if (ev.touches.length > 1)
    {
        /**
         * On a single touch input device, there can only be one point
         * of contact on the surface, so the following code can only
         * execute when the terminal supports multiple touches.
         */
        document.write('Hello Multiple Touch!');
    }
}, false);
</script>

changedTouches of a TouchEvent

This example demonstrates the utility of changedTouches and its relation to the other TouchList members of the TouchEvent interface. The code is an example which triggers whenever a touch point is removed from the defined touchable element:

<div id="touchable">This element is touchable.</div>
<script>
document.getElementById('touchable').addEventListener('touchend', function(ev) {
    /**
     * Example output when three touch points are on the surface,
     * two of them being on the "touchable" element and one point
     * in the "touchable" element is lifted from the surface:
     *
     * Touch points removed: 1
     * Touch points left on element: 1
     * Touch points left on document: 2
     */
    document.write('Touch points removed: ' + ev.changedTouches.length);
    document.write('Touch points left on element: ' + ev.targetTouches.length);
    document.write('Touch points left on document: ' + ev.touches.length);
}, false);
</script>

Firing a synthetic TouchEvent from script

This example demonstrates how to create and fire a TouchEvent from script.

if (Touch.length < 1 || TouchEvent.length < 1)
  throw "TouchEvent constructors not supported";

var touch = new Touch({
    identifier: 42,
    target: document.body,
    clientX: 200,
    clientY: 200,
    screenX: 300,
    screenY: 300,
    pageX: 200,
    pageY: 200,
    radiusX: 5,
    radiusY: 5
});

var touchEvent = new TouchEvent("touchstart", {
    cancelable: true,
    bubbles: true,
    composed: true,
    touches: [touch],
    targetTouches: [touch],
    changedTouches: [touch]
});

document.body.dispatchEvent(touchEvent);

List of TouchEvent types

The following table provides a summary of the TouchEvent event types defined in this specification. All events should accomplish the bubbling phase. All events should be composed [[WHATWG-DOM]] events.

Event Type  | Sync / Async | Bubbling phase | Composed | Trusted proximal event target types | Interface  | Cancelable | Default Action
touchstart  | Sync         | Yes            | Yes      | Document, Element                   | TouchEvent | Varies     | undefined
touchend    | Sync         | Yes            | Yes      | Document, Element                   | TouchEvent | Varies     | Varies: user agents may dispatch mouse and click events
touchmove   | Sync         | Yes            | Yes      | Document, Element                   | TouchEvent | Varies     | undefined
touchcancel | Sync         | Yes            | Yes      | Document, Element                   | TouchEvent | No         | none

Cancelability of touch events

Canceling a touch event can prevent or otherwise interrupt scrolling (which could be happening in parallel with script execution). For maximum scroll performance, a user agent may not wait for each touch event associated with the scroll to be processed to see if it will be canceled. In such cases the user agent should generate touch events whose cancelable property is false, indicating that preventDefault cannot be used to prevent or interrupt scrolling. Otherwise cancelable will be true.

In particular, a user agent should generate only uncancelable touch events when it observes that there are no non-passive listeners for the event.
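The listener side of this contract can be sketched as follows; the { passive } option comes from the DOM standard's addEventListener, and the helper name is illustrative:

```javascript
// Illustrative helper: a handler should only attempt to block
// scrolling when the user agent marked the event cancelable.
function shouldPreventScroll(ev) {
  return ev.cancelable === true;
}

if (typeof document !== 'undefined') {
  // Passive listener: promises never to call preventDefault(), so the
  // user agent can start scrolling without waiting for this handler.
  document.addEventListener('touchmove', function (ev) {
    // read-only tracking work here; preventDefault() would be ignored
  }, { passive: true });

  // Non-passive listener: may cancel the event, but only when the user
  // agent marked it cancelable.
  document.addEventListener('touchmove', function (ev) {
    if (shouldPreventScroll(ev)) {
      ev.preventDefault(); // blocks scrolling for this touch sequence
    }
  }, { passive: false });
}
```

Registering the tracking-only work as passive is what allows the user agent to keep scrolling smooth while script runs.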

The touchstart event

A user agent must dispatch this event type to indicate when the user places a touch point on the touch surface.

The target of this event must be an Element. If the touch point is within a frame, the event should be dispatched to an element in the child browsing context of that frame.

If this event is canceled, it should prevent any default actions caused by any touch events associated with the same active touch point, including mouse events or scrolling.

The touchend event

A user agent must dispatch this event type to indicate when the user removes a touch point from the touch surface, also including cases where the touch point physically leaves the touch surface, such as being dragged off of the screen.

The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.

If this event is canceled, any sequence of touch events that includes this event must not be interpreted as a click.

The touchmove event

A user agent must dispatch this event type to indicate when the user moves a touch point along the touch surface.

The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

Note that the rate at which the user agent sends touchmove events is implementation-defined, and may depend on hardware capabilities and other implementation details.

A user agent should suppress the default action caused by any touchmove event until at least one touchmove event associated with the same active touch point is not canceled. Whether the default action is suppressed for touchmove events after at least one touchmove event associated with the same active touch point is not canceled is implementation-dependent.

The touchcancel event

A user agent must dispatch this event type to indicate when a touch point has been disrupted in an implementation-specific manner, such as a synchronous event or action originating from the UA canceling the touch, or the touch point leaving the document window into a non-document area which is capable of handling user interactions (e.g. the UA's native user interface, or an area of the document which is managed by a plug-in). A user agent may also dispatch this event type when the user places more touch points on the touch surface than the device or implementation is configured to store, in which case the earliest Touch object in the TouchList should be removed.

The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.

Retargeting

The following section describes retargeting steps, defined in [[!WHATWG-DOM]].

A Touch object has an associated unadjustedTarget (null or EventTarget). Unless stated otherwise it is null.

A TouchEvent's retargeting steps, given a touchEvent, must run these steps:

  1. For each Touch touch in touchEvent's touches, targetTouches, and changedTouches members:

    1. Set touch's unadjustedTarget to touch's target if touch's unadjustedTarget is null.
    2. Set touch's target to the result of invoking retargeting touch's unadjustedTarget against touchEvent's target.
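The two steps can be sketched as follows. Here `retarget(a, b)` stands in for the shadow-root-aware retargeting algorithm from the DOM standard, and the lists are treated as plain iterables for illustration; in an implementation these are internal mutations, since Touch objects are immutable from script:

```javascript
// Non-normative sketch of the retargeting steps above.
function retargetTouchEvent(touchEvent, retarget) {
  const lists = [touchEvent.touches,
                 touchEvent.targetTouches,
                 touchEvent.changedTouches];
  for (const list of lists) {
    for (const touch of list) {
      // Step 1: remember the original target the first time through.
      // The null guard makes the steps idempotent when the same Touch
      // appears in more than one list.
      if (touch.unadjustedTarget == null) {
        touch.unadjustedTarget = touch.target;
      }
      // Step 2: retarget the original target against the event's
      // (possibly shadow-adjusted) target.
      touch.target = retarget(touch.unadjustedTarget, touchEvent.target);
    }
  }
}
```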

Conditionally exposing legacy touch event APIs

User agents have an associated boolean expose legacy touch event APIs whose value is implementation-defined.

Existing web content often use the existence of these APIs as a signal that the user agent is a touch-enabled "mobile" device, and therefore exposing these APIs on non-mobile devices, even if they are touch-enabled, could lead to a suboptimal user experience for such web content.

Extensions to theGlobalEventHandlers mixin

The following section describes extensions to the existing GlobalEventHandlers mixin, defined in [[!HTML5]], to facilitate event handler registration. For user agents where expose legacy touch event APIs is false, this mixin must not be implemented.

partial interface mixin GlobalEventHandlers {
    attribute EventHandler ontouchstart;
    attribute EventHandler ontouchend;
    attribute EventHandler ontouchmove;
    attribute EventHandler ontouchcancel;
};
ontouchstart

The event handler IDL attribute (see [[!HTML5]]) for the touchstart event type.

ontouchend

The event handler IDL attribute (see [[!HTML5]]) for the touchend event type.

ontouchmove

The event handler IDL attribute (see [[!HTML5]]) for the touchmove event type.

ontouchcancel

The event handler IDL attribute (see [[!HTML5]]) for the touchcancel event type.

Interaction with Mouse Events andclick

The user agent may dispatch both touch events and (for compatibility with web content not designed for touch) mouse events [[!DOM-LEVEL-2-EVENTS]] in response to the same user input. If the user agent dispatches both touch events and mouse events in response to a single user action, then the touchstart event type must be dispatched before any mouse event types for that action. If touchstart, touchmove, or touchend are canceled, the user agent should not dispatch any mouse event that would be a consequential result of the prevented touch event.

If a Web application can process touch events, it can cancel the events, and no corresponding mouse events would need to be dispatched by the user agent. If the Web application is not specifically written for touch input devices, it will react to the subsequent mouse events instead.

User agents will typically dispatch mouse and click events only for single-finger activation gestures (like tap and long press). Gestures involving movement of the touch point or multi-touch interactions – with two or more active touch points – will usually only generate touch events.

If the user agent interprets a sequence of touch events as a tap gesture, then it should dispatch mousemove, mousedown, mouseup, and click events (in that order) at the location of the touchend event for the corresponding touch input. If the contents of the document have changed during processing of the touch events, then the user agent may dispatch the mouse events to a different target than the touch events.

The default actions and ordering of any further touch and mouse events are implementation-defined, except as specified elsewhere.

The activation of an element (e.g., in some implementations, a tap) would typically produce the following event sequence (though this may vary slightly, depending on specific user agent behavior):

  1. touchstart
  2. Zero or more touchmove events, depending on movement of the finger
  3. touchend
  4. mousemove (for compatibility with legacy mouse-specific code)
  5. mousedown
  6. mouseup
  7. click

If, however, either the touchstart, touchmove or touchend event has been canceled during this interaction, no mouse or click events will be fired, and the resulting sequence of events would simply be:

  1. touchstart
  2. Zero or more touchmove events, depending on movement of the finger
  3. touchend

Even if a user agent supports Touch Events, this does not necessarily mean that a touchscreen is the only input mechanism available to users. Particularly in the case of touch-enabled laptops, or traditional "touch only" devices (such as phones and tablets) with paired external input devices, users may use the touchscreen in conjunction with a trackpad, mouse or keyboard. For this reason, developers should avoid binding event listeners with "either touch or mouse/keyboard" conditional code, as this results in sites/applications that become touch-exclusive, preventing users from being able to use any other input mechanism.

// conditional "touch OR mouse/keyboard" event binding
// DON'T DO THIS, as it makes interactions touch-exclusive
// on devices that have both touch and mouse/keyboard
if ('ontouchstart' in window) {
  // set up event listeners for touch
  target.addEventListener('touchend', ...);
  ...
} else {
  // set up event listeners for mouse/keyboard
  target.addEventListener('click', ...);
  ...
}

Instead, developers should handle different forms of input concurrently.

// concurrent "touch AND mouse/keyboard" event binding
// set up event listeners for touch
target.addEventListener('touchend', function(e) {
  // prevent compatibility mouse events and click
  e.preventDefault();
  ...
});
...
// set up event listeners for mouse/keyboard
target.addEventListener('click', ...);
...

To avoid processing the same interaction twice for touch (once for the touch event, and once for the compatibility mouse events), developers should make sure to cancel the touch event, suppressing the generation of any further mouse or click events. Alternatively, see the InputDeviceCapabilities API for a way to detect mouse events that were generated as a result of touch events.
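A sketch of the detection approach: sourceCapabilities and its firesTouchEvents flag come from the InputDeviceCapabilities API, and may be absent in user agents that do not implement it, so the helper falls back to treating the event as a genuine mouse event.

```javascript
// Returns true when a mouse/click event was synthesized from touch
// input (per InputDeviceCapabilities); false when the API is
// unsupported or the event came from a real mouse.
function isCompatibilityMouseEvent(ev) {
  return !!(ev.sourceCapabilities && ev.sourceCapabilities.firesTouchEvents);
}

// Usage:
// element.addEventListener('click', function (ev) {
//   if (isCompatibilityMouseEvent(ev)) return; // already handled via touch
//   // handle genuine mouse/keyboard activation here
// });
```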

Glossary

active touch point
A touch point which is currently on the screen and is being tracked by the user agent. The touch point becomes active when the user agent first dispatches a touchstart event indicating its appearance. It ceases to be active after the user agent dispatches a touchend or touchcancel event indicating that the touch point is removed from the surface or no longer tracked.
touch point
The coordinate point at which a pointer (e.g. finger or stylus) intersects the target surface of an interface. This may apply to a finger touching a touch-screen, or a digital pen writing on a piece of paper.
canceled event
An event whose default action was prevented by means of preventDefault(), returning false in an event handler, or other means as defined by [[!DOM-LEVEL-3-EVENTS]] and [[!HTML5]].

Acknowledgements

Many thanks to the WebKit engineers for developing the model used as a basis for this spec, Neil Roberts (SitePen) for his summary of WebKit touch events, Peter-Paul Koch (PPK) for his write-ups and suggestions, Robin Berjon for developing the ReSpec.js spec authoring tool, and the WebEvents WG for their many contributions.

Many others have made additional comments as the spec developed, which have led to steady improvements. Among them are Matthew Schinckel, Andrew Grieve, Cathy Chan, Boris Zbarsky, Patrick H. Lauke, and Simon Pieters. If we inadvertently omitted your name, please let me know.

The group acknowledges the following contributors to this specification's test suite: Matt Brubeck, Olli Pettay, Art Barstow, Cathy Chan and Rick Byers.

Changes Since Last Publication

This is a summary of the major changes made since the 10 October 2013 Recommendation was published. Full commit history is also available.

