Copyright © 2020-2021 W3C® (MIT, ERCIM, Keio, Beihang). W3C liability, trademark and permissive document license rules apply.
This document outlines various accessibility related user needs, requirements and scenarios for real-time communication (RTC). These user needs should drive accessibility requirements in various related specifications and the overall architecture that enables them. It first introduces a definition of RTC as used throughout the document and outlines how RTC accessibility can support the needs of people with disabilities. It defines the term user needs as used throughout the document and then goes on to list a range of these user needs and their related requirements. Following that, some quality related scenarios are outlined and, finally, a data table maps the user needs contained in this document to related use case requirements found in other technical specifications.
This document is most explicitly not a collection of baseline requirements. It is also important to note that some of the requirements may be implemented at a system or platform level, and some may be authoring requirements.
This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.
This is a W3C Working Group Note produced by the Accessible Platform Architectures (APA) Working Group with support from the Research Questions Task Force. This document provides a set of user needs and requirements collected by the task force and refined from public comments on the Second Public Working Draft of 7 December 2020. These requirements represent a consensus within the Working Group and form a solid basis from which developers of real-time communication (RTC) applications should consider user accessibility requirements.
As a Working Group Note this content is stable, and the Working Group does not plan to make further changes. Should the need arise, however, the document could be updated. Comments received on this document will help the Working Group to decide if updates are needed, or will be taken into account should a republication be planned.
To comment, file an issue in the W3C apa GitHub repository. If this is not feasible, send email to public-apa@w3.org (comment archive). In-progress updates to the document may be viewed in the publicly visible editors' draft.
This document was published by the Accessible Platform Architectures Working Group as a Working Group Note.
GitHub Issues are preferred for discussion of this specification.
Publication as a Working Group Note does not imply endorsement by the W3C Membership.
This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.
This document was produced by a group operating under the 1 August 2017 W3C Patent Policy.
This document is governed by the 15 September 2020 W3C Process Document.
Real-time communication (RTC) is an evolution beyond the traditional data exchange model of client to server, resulting in real-time peer-to-peer audio, video and data exchange directly between supported user agents. This allows instantaneous applications for video, text and audio calls, text chat, file exchange, screen sharing and gaming, all without the need for browser plug-ins. While real-time communication (RTC) applications are enabled in the main by specifications like WebRTC, WebRTC is not the sole specification with responsibility to enable accessible real-time communication applications. The use cases and requirements are broad - for example, as outlined in the IETF RFC 7478 'Web Real-Time Communication Use Cases and Requirements' document. [ietf-rtc] [webrtc]
RTC accessibility is enabled by a combination of technologies and specifications such as those from the Media Working Group, Web and Networks Interest Group, Second Screen, and Web Audio Working Group, as well as AGWG and ARIA. The Accessible Platform Architectures Working Group (APA) hopes this document will inform how these groups meet various responsibilities for enabling accessible RTC, as well as updating related use cases in various groups. For example, view the current work on the WebRTC Next Version Use Cases First Public Working Draft. [webrtc-use-cases]
This document outlines various accessibility related user needs for RTC accessibility. The term 'user needs' in this document relates to what people with various disabilities need to successfully use RTC applications. These needs may relate to having particular supports in an application, being able to complete tasks or access other functions. These user needs should drive accessibility requirements for RTC accessibility and its related architecture.
User needs are presented here with their related requirements; some in a range of scenarios (which can be thought of as similar to user stories).
The following outlines a range of user needs and requirements. The user needs have also been compared to existing use cases for real-time text (RTT) such as the IETF 'Framework for Real-Time Text over IP Using the Session Initiation Protocol (SIP)' RFC 5194 and the European Procurement Standard EN 301 549. [rtt-sip] [EN301-549]
Not all atomic items are necessarily pinned next to other atomic elements; some may be dependent, related or updated synchronously. An example is multiple atomic data points destined for an 80-character braille display that has been sectioned to display 4 atomic items in up to 19 cells each (leaving at least one blank cell for spacing).
Here the term atomic relates to small pieces of data. For the purposes of accessibility conformance testing, the definitions and use of the terms 'atomic' and 'atomic rules' may also be useful. [applicability-atomic] [rule-types]
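The following is a minimal, illustrative sketch (not taken from any specification) of the braille layout described above: four atomic items placed on an 80-cell display, each given up to 19 cells of content followed by a blank separator cell. The function name layoutBrailleLine and the item values are hypothetical.

// Illustrative only: divide an 80-cell braille line into 4 fields of 20 cells,
// 19 cells of content plus at least one blank cell as a separator.
const DISPLAY_CELLS = 80;
const FIELD_COUNT = 4;
const FIELD_WIDTH = DISPLAY_CELLS / FIELD_COUNT; // 20 cells per field
const CONTENT_WIDTH = FIELD_WIDTH - 1;           // up to 19 cells of content

function layoutBrailleLine(items: string[]): string {
  return items
    .slice(0, FIELD_COUNT)
    .map(item => item.slice(0, CONTENT_WIDTH).padEnd(FIELD_WIDTH, " "))
    .join("");
}

// Hypothetical call-status items that are dependent and updated together,
// so the user always reads a consistent snapshot of the call.
const line = layoutBrailleLine(["Alice (relay)", "Connected", "00:42", "Captions: on"]);
console.log(line.length); // 80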
Operations required for acting on incoming calls, finding out who the caller is, and connecting relay services should be designed so that they do not require complicated sequences of user actions.
Moving beyond mono in this context is also important, as the stereo spread allows audio descriptions to be sound staged. Applications should also inherit customization settings from the user's operating system.
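As a sketch of what such sound staging could look like in a web-based RTC application, the following uses the Web Audio API StereoPannerNode to pan a separate audio description track slightly to one side of the main programme audio. How the two MediaStreams are obtained, and the function name stageDescription, are assumptions for illustration only.

// Sketch only: pan an audio description stream to the right of the main
// programme audio so the two sources can be distinguished spatially.
// Obtaining programmeStream and descriptionStream is application-specific.
function stageDescription(
  audioCtx: AudioContext,
  programmeStream: MediaStream,
  descriptionStream: MediaStream
): void {
  const programme = audioCtx.createMediaStreamSource(programmeStream);
  const description = audioCtx.createMediaStreamSource(descriptionStream);

  const centre = new StereoPannerNode(audioCtx, { pan: 0 });   // main audio, centred
  const right = new StereoPannerNode(audioCtx, { pan: 0.6 });  // description, staged right

  programme.connect(centre).connect(audioCtx.destination);
  description.connect(right).connect(audioCtx.destination);
}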
This user need may also indicate necessary support for 'Total conversation' services as defined by ITU in WebRTC applications. These are combinations of voice, video, and RTT in the same real-time session. [total-conversation]
Successfully connecting video or text relay services should not require a complicated sequence of user actions.
This relates to cognitive accessibility requirements. For related work at W3C see the 'Personalization Semantics Content Module 1.0' and 'Media Queries Level 5'. [personalization] [media-queries]
There are potential real-time communication application issues that may only apply in immersive environments or augmented reality contexts.
For example, if anRTC application is also an XR application then relevant XR accessibility requirements should be addressed as well. [xaur]
Scenario: A deaf user watching a signed broadcast needs a high-quality frame rate to maintain legibility and clarity in order to understand what is being signed.
EN 301 549 Section 6 recommends that WebRTC applications should support a frame rate of at least 20 frames per second (FPS). More details can be found in the Accessible Procurement standard for ICT products and services EN 301 549 (PDF) and ITU-T Series H Supplement 1 "Sign language and lip-reading real-time conversation using low bit-rate video communication".
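As an illustration only, a WebRTC application could ask the browser for a capture frame rate in line with this recommendation using standard MediaTrackConstraints; the rate actually delivered still depends on the camera, the encoder and network conditions. The function name captureForSigning and the resolution values are assumptions.

// Sketch: request at least 20 FPS (ideally more) for a signed conversation.
async function captureForSigning(): Promise<MediaStream> {
  return navigator.mediaDevices.getUserMedia({
    video: {
      frameRate: { min: 20, ideal: 30 },
      width: { ideal: 1280 },
      height: { ideal: 720 }
    },
    audio: true
  });
}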
Scenario: A hard of hearing user needs better stereo sound to have a quality experience in work calls or meetings with friends or family. Transmission aspects, such as the decibel range for audio, need to be of high quality. For calls, the industry allows higher audio resolution, but still mostly in mono only.
EN 301 549 Section 6 recommends that, for WebRTC enabled conferencing and communication, the application shall be able to encode and decode communication with a frequency range with an upper limit of at least 7 kHz. More details can be found in the Accessible Procurement standard for ICT products and services EN 301 549 (PDF).
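As a rough sketch of how an application might aim for this range: audio content up to 7 kHz requires a sample rate of at least 14 kHz (the Nyquist rate), so the capture constraints below ask for 16 kHz or better and stereo where available. Browser support for the sampleRate and channelCount constraints varies, and the function name captureWidebandAudio is hypothetical, so this is an illustration rather than a guaranteed way to meet the recommendation.

// Sketch only: request wideband capture so frequencies up to ~7-8 kHz
// can be represented; constraint support differs between browsers.
async function captureWidebandAudio(): Promise<MediaStream> {
  return navigator.mediaDevices.getUserMedia({
    audio: {
      sampleRate: { min: 16000, ideal: 48000 },
      channelCount: { ideal: 2 }
    }
  });
}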
Scenario: A hard of hearing user needs better stereo sound so they can have a quality experience in watching HD video or having an HD meeting with friends or family. Similarly for video quality, transmission aspects such as frames per second need to be of high quality.
A hard of hearing user often combines their perception of speech from audio with their perception of lip movement and other visual clues to create an overall understanding of speech. For the visual parts, the requirements on video are the same as expressed in '5.1 Deaf users: Video resolution and frame rates' about perception of sign language because lip movements are also part of sign language, equally rapid and as detailed as the other parts of sign language.
EN 301 549 Section 6 recommends that, for WebRTC enabled conferencing and communication, the application shall be able to encode and decode communication with a frequency range with an upper limit of at least 7 kHz. More details can be found in the Accessible Procurement standard for ICT products and services EN 301 549 (PDF).
The following is a list of new user needs and requirements since the publication of the previous working draft:
The following is a list of updated requirements to existing user needs:
The following are other changes in this document:
This user need may also indicate necessary support for 'Total conversation' services as defined by ITU in WebRTC applications. These are combinations of voice, video, and RTT in the same real-time session. [total-conversation]
This document has been updated based on document feedback, discussion and Research Questions Task Force consensus.
This work is supported by the EC-funded WAI-Guide Project.