In the field of human–computer interaction, taxonomies are used to classify and describe interaction (i.e. input and output) modalities, methods, technologies and devices. However, most existing taxonomies and classification schemes consider only a subset of modalities and related methods, often reducing them to vision, audition and touch. Additionally, they are usually technology- or task-centered rather than human-centered and are thus vulnerable to rapid outdating as technology advances and novel sensor and actuator technologies emerge. To tackle both problems, we propose a novel taxonomy designed around the human and the human capabilities to sense output from and provide input to computer systems. We argue that although knowledge about the human sensory system may change as well, it does so considerably more slowly than technology advances. Further, we restrict the taxonomy to what humans can actively and consciously sense or produce, so that novel findings on human perception are unlikely to compromise its validity immediately. This article motivates the need for a novel taxonomy in the light of how computers and humans are able to perceive each other. It discusses existing taxonomies and introduces a new one that is intended to be (i) centered around the human and (ii) as holistic and timeless as possible. Finally, the new taxonomy was evaluated with six human–computer interaction experts with regard to its practical use for researchers and different application scenarios.
We present and discuss a new taxonomy for input/output modalities and devices that is based on humans’ capabilities to provide input to and perceive output from computer systems.
The taxonomy is rooted in the physiological and cognitive characteristics of the human body.
The taxonomy can be used by researchers as well as practitioners: (i) as a human-centered and (nearly) holistic overview of interaction modalities and categories of devices, (ii) for the classification of interaction modalities and (iii) for the description of interaction devices.
The taxonomy has been qualitatively evaluated with six HCI experts.