Use error


The term use error was introduced in the 1990s to replace the commonly used terms human error and user error. The new term, which has already been adopted by international standards organizations for medical devices, suggests that accidents should be attributed to the circumstances, rather than to the human beings who happened to be there.

The need for the terminological change

The term "use error" was first used in May 1995 in an MD+DI guest editorial. Traditionally, human errors have been considered a special aspect of human factors and, accordingly, have been attributed to the human operator, or user.
When taking this approach, we assume that the system design is perfect and that the only source of use errors is the human operator. For example, the U.S. Department of Defense Human Factors Analysis and Classification System (HFACS) classifies use errors as attributable to the human operator, disregarding improper design and configuration settings, which often result in missing alarms or in inappropriate alerting.
The need to change the term arose from a common malpractice by stakeholders in cases of accidents: instead of investing in fixing the error-prone design, management attributed the error to the users.
Accident investigators have also pointed out the need for the change, noting that a mishap is typically attributed either to a use error or to a force majeure.
In 1998, Cook, Woods and Miller, in a workgroup report on patient safety, presented the concept of hindsight bias, exemplified by celebrated accidents in medicine.
The workgroup pointed to the tendency to attribute accidents in health care to isolated human failures.
They cite early research on the effect that knowledge of the outcome, unavailable beforehand, has on later judgement about the processes that led up to that outcome. They explain that, in looking back, we tend to oversimplify the situation that the actual practitioners faced, and they conclude that focusing on hindsight knowledge prevents us from understanding the richer story: the circumstances of the human error.
In line with this position, the term use error is formally defined in several international standards, such as IEC 62366, ISO 14155 and ISO 14971.
ISO standards on medical devices and procedures provide examples of use errors attributed to human factors, including slips, lapses and mistakes. In practice, this means that the errors are attributed to the user, implying the user's accountability.
The U.S. Food and Drug Administration glossary of medical devices explains the term in similar fashion.
With this interpretation by ISO and the FDA, the term ‘use error’ is effectively synonymous with ‘user error’.
Another approach, which distinguishes ‘use errors’ from ‘user errors’, is taken by IEC 62366, whose Annex A includes an explanation justifying the new term.
This explanation is consistent with “The New View”, which Sidney Dekker suggested as an alternative to “The Old View”. This interpretation favors investigations intended to understand the situation, rather than to blame the operators.
In a 2011 draft report on health IT usability, the U.S. National Institute of Standards and Technology defines "use error" in healthcare IT this way: “Use error is a term used very specifically to refer to user interface designs that will engender users to make errors of commission or omission. It is true that users do make errors, but many errors are due not to user error per se but due to designs that are flawed, e.g., poorly written messaging, misuse of color-coding conventions, omission of information, etc.”.

Example of user error

An example of an accident due to a user error is the ecological disaster of 1967 caused by the Torrey Canyon supertanker. The accident was due to a combination of several exceptional events, the result of which was that the supertanker was heading directly toward the rocks. At that point, the captain failed to change course because the steering control lever had inadvertently been set to the Control position, which disconnected the rudder from the wheel at the helm.

Examples of user failure to handle system failure

Examples of user failure to handle a system failure include the Three Mile Island accident, the New York City blackout following a storm, and the chemical plant disaster in Bhopal, India.

Classifying use errors

The URM model characterizes use errors in terms of the user's failure to manage a system deficiency. Six categories of use errors are described in a URM document (a sketch of how the taxonomy might be encoded follows the list):
  1. Expected faults with risky results;
  2. Expected faults with unexpected results;
  3. Expected user errors in identifying risky situations;
  4. User errors in handling expected faults;
  5. Expected errors in function selection;
  6. Unexpected faults, due to operating in exceptional states.
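
The URM document presents these categories verbally and defines no code or notation. As a minimal illustrative sketch, a hypothetical incident-logging tool might encode the taxonomy as an enumeration; all identifiers below are assumptions made for illustration, not part of the URM model:

    from enum import Enum

    class URMUseErrorCategory(Enum):
        """Hypothetical encoding of the six URM use-error categories
        (names are illustrative; the URM document defines no identifiers)."""
        EXPECTED_FAULT_RISKY_RESULT = 1         # expected faults with risky results
        EXPECTED_FAULT_UNEXPECTED_RESULT = 2    # expected faults with unexpected results
        EXPECTED_ERROR_RISK_IDENTIFICATION = 3  # expected user errors in identifying risky situations
        ERROR_HANDLING_EXPECTED_FAULT = 4       # user errors in handling expected faults
        EXPECTED_ERROR_FUNCTION_SELECTION = 5   # expected errors in function selection
        UNEXPECTED_FAULT_EXCEPTIONAL_STATE = 6  # unexpected faults, due to operating in exceptional states

    # Usage: tagging a hypothetical incident record with one category.
    incident = {
        "description": "alarm missed while the device was in an exceptional state",
        "category": URMUseErrorCategory.UNEXPECTED_FAULT_EXCEPTIONAL_STATE,
    }
    print(incident["category"].name, incident["category"].value)

The numbering of the members simply follows the order of the list above; no further structure is implied.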

Critics

Erik Hollnagel argues that going from an 'old' view to a 'new' view is not enough: one should go all the way to a 'no' view. This means that the notion of error, whether user error or use error, might be destructive rather than constructive.
Instead, he proposes to focus on the performance variability of everyday actions, on the basis that this performance variability is both useful and necessary. In most cases the result is that things go right; in a few cases, things go wrong, but the reason is the same.
Hollnagel expanded on this in his writings about the efficiency–thoroughness trade-off principle, resilience engineering, and the Resilient Health Care Net.