Human error is an action that has been done but that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits".[1] Human error has been cited as a primary cause of, and a contributing factor in, disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation (e.g., United Airlines Flight 173), space exploration (e.g., the Space Shuttle Challenger and Space Shuttle Columbia disasters), and medicine. Prevention of human error is generally seen as a major contributor to the reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.

Human error refers to something having been done that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits".[1] In short, it is a deviation from intention, expectation or desirability.[1] Logically, human actions can fail to achieve their goal in two different ways: the actions can go as planned, but the plan can be inadequate (leading to mistakes); or the plan can be satisfactory, but the performance can be deficient (leading to slips and lapses).[2][3] However, a mere failure is not an error if there was no plan to accomplish something in particular.[1]

Human error and performance are two sides of the same coin: "human error" mechanisms are the same as "human performance" mechanisms, and performance is categorized as "error" only in hindsight.[3][4] Actions later termed "human error" are therefore part of the ordinary spectrum of human behaviour. The study of absent-mindedness in everyday life provides ample documentation and categorization of such aspects of behavior. A sense of awareness is needed to recognize that one is dealing with a potential danger and thus be able to correct for it.[5] While human error is firmly entrenched in the classical approaches to accident investigation and risk assessment, it has no role in newer approaches such as resilience engineering.[6]
There are many ways to categorize human error.[7][8]
The cognitive study of human error is a very active research field, including work related to limits of memory and attention and also to decision-making strategies such as the availability heuristic and other cognitive biases. Such heuristics and biases are strategies that are useful and often correct, but can lead to systematic patterns of error.
Misunderstandings as a topic in human communication have been studied in conversation analysis, for example through the examination of violations of the cooperative principle and Gricean maxims.
Organizational studies of error or dysfunction have included studies of safety culture. One technique for analyzing complex systems failure that incorporates organizational analysis is management oversight risk tree analysis.[14][15][16]

Some researchers have argued that the dichotomy of human actions as "correct" or "incorrect" is a harmful oversimplification of a complex phenomenon.[17][18] A focus on the variability of human performance, and on how human operators (and organizations) can manage that variability, may be a more fruitful approach. Newer approaches, such as the resilience engineering mentioned above, highlight the positive roles that humans can play in complex systems. In resilience engineering, successes (things that go right) and failures (things that go wrong) are seen as having the same basis, namely human performance variability. A specific account of that is the efficiency–thoroughness trade-off principle,[19] which can be found at all levels of human activity, in individuals as well as in groups.