‘The human element is the most flexible, adaptable and valuable part of the … system, but it is also the most vulnerable to influences which can adversely affect its performance’.
(ICAO circular 216-AN/131, 1989)
Error is an ever-present and natural part of life, and it is generally accepted that we all make errors daily. Each of us already has an intuitive understanding of the term human error, but it is useful to define it in order to better understand its nature and to develop mechanisms for effective error management.
Human error is a generic term that covers all those instances where a planned activity fails to achieve its intended outcome (1). For example, forgetting to set the park brake in your car, or misapplying your vehicle's brakes in wet and slippery road conditions.
Regardless of the definition used, human error is best characterised by its outcome. That is, the undesirable consequences that are produced or the potential for undesirable consequences.
In any organisation, human error is a regular occurrence. The good news is that most of the time, these errors are self-corrected, and they have little consequence. Other interesting facts about error include:
- The average error rate is roughly constant at 1–3 errors per hour, regardless of expertise (except for beginners).
- Errors tend to decrease in more demanding situations (due to increased cognitive control), but the recovery rate collapses (due to insufficient spare mental capacity).
- The nature of errors changes with expertise. Routine-based errors (slips and lapses) increase with expertise, while knowledge-based errors (mistakes) decrease.
- Motor-skill (learning) errors are more frequent for beginners and decrease with experience.
- Experts are characterised by their ability to recover from error (through anticipation).
Figure 1. Contrasting perspectives of human error
| Old View | New View |
| --- | --- |
| Human error is a cause of trouble | Human error is a symptom of deeper system trouble |
| Need to find and fix people's mistakes and inaccurate assessments | Need to understand that mistakes and actions by people made sense to them at the time – errors have context |
| Complex systems are basically safe | Complex systems are basically unsafe |
| People are erratic and unreliable and undermine system safety | Complex systems are trade-offs between the competing goals of safety and efficiency |
| Make a system safer by restricting the human contribution | Safe systems are maintained through practice at all levels within an organisation |
Error rate, detection, and performance
‘To err is human; to blame it on the other guy is even more human.’
Bob Goddard, rocket scientist
The figure below shows the typical error rate during 'standard performance' at 1–3 errors per hour. The error rate increases at both ends of the performance spectrum: when we are most relaxed and on autopilot (inattentive), and when we are operating with maximum conscious effort (high workload). The implication for safety professionals, whether examining human behaviour in the aftermath of an incident or proactively designing a new task, is to understand the level of workload faced by the directly involved parties. Was it too high or too low, and would others have coped any differently?
It is a common misconception that bad outcomes always result from errors, carelessness or negligence. In fact, the consequences of human error depend on the circumstances or context in which the error occurs, as illustrated below.
Figure 2. Error rate, detection, and performance graph
A relatively small error can trigger a very serious accident if other parts of the system fail. Many of the errors we make every day have no serious consequences, either because other elements of the system detected and corrected them (such as aural or visual warning systems), or because they did not coincide with the presence of an active hazard (such as a wet, slippery road surface).
Social and cultural norms underlying human error
Our views about error are heavily influenced by social and cultural norms. Errors are often seen as 'bad', and media reports tend to sensationalise the coverage of accidents, reinforcing the need to find a culprit. Some of the more pervasive social views about error include:
- The right and wrong principle. Professional standards of conduct are based on moral principles and values, which guide our decision-making about what is right and wrong and what is accepted by an individual or a social group. For example, a driver charged over a motor vehicle accident, in which they were found to be sending a text message on their phone while driving, must be in the wrong. But how many times have you sent a quick text message while driving yourself, with the best of intentions?
- Near miss. These are often viewed as unimportant because nothing bad happened; but they represent missed opportunities to learn.
- Looking good / avoiding looking bad. People do not like to admit failure for fear of embarrassment or being harshly judged.
- Taking responsibility means admission of failure. We are conditioned from early childhood not to fail, as failure is seen as a sign of weakness.
- Fundamental attribution error. When examining an event, we are conditioned to over-emphasise dispositional (personality) explanations for behaviour, rather than fully comprehend the situational and environmental conditions people faced at the time.
Figure 3. Causes in context
Error Producing Conditions
Error-producing conditions (EPCs) increase the probability of error when a task is performed in their presence. Some EPCs are more powerful than others. The figure below shows estimates of how different conditions increase the risk of error. The EPCs are ranked in order of their known effects, and the numbers in parentheses indicate the risk factor (that is, the amount by which the nominal error rate should be multiplied under the worst conditions). For example, while poor instructions or procedures may triple the rate of error, lack of familiarity with a task is estimated to produce a 17-fold increase in the risk of error.
Figure 4. Error producing conditions
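The risk-factor arithmetic described above can be sketched in a few lines of code. This is an illustrative sketch only: the two multipliers are the examples cited in the text (17 for task unfamiliarity, 3 for poor instructions), while the function name, dictionary keys and the 2-errors-per-hour baseline are assumptions for the example, not values from the figure.

```python
# Illustrative sketch: scaling a nominal error rate by EPC risk factors.
# Multipliers are the two examples cited in the text; all names and the
# baseline rate are hypothetical choices for this example.

NOMINAL_ERRORS_PER_HOUR = 2  # midpoint of the 1-3 errors/hour cited above

# Worst-case risk multipliers for selected error-producing conditions
EPC_MULTIPLIERS = {
    "unfamiliarity_with_task": 17,  # cited: 17-fold increase in risk
    "poor_instructions": 3,         # cited: may triple the error rate
}

def adjusted_error_rate(nominal, conditions):
    """Multiply the nominal error rate by each applicable EPC risk factor."""
    rate = nominal
    for condition in conditions:
        rate *= EPC_MULTIPLIERS[condition]
    return rate

# A worker unfamiliar with the task: 2 errors/hour becomes 34 errors/hour.
print(adjusted_error_rate(NOMINAL_ERRORS_PER_HOUR, ["unfamiliarity_with_task"]))
```

Note that multiplying several factors together, as this sketch does when more than one condition applies, is a simplifying assumption about how EPCs combine; it illustrates why the worst conditions dominate the overall risk.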
Several conclusions can be drawn from this list of EPCs:
- Three of the best-researched factors, namely sleep disturbance, hostile environment, and boredom, carry the smallest penalties.
- The EPCs at the top of the list lie squarely within the organisational sphere of influence. Managers and administrators rarely, if ever, have the opportunity to jeopardise a system's safety directly; their influence is more indirect, with top-level decisions creating the conditions that promote unsafe acts.
- Departures from routine and changes in the circumstances in which actions are normally performed constitute a major influence in the occurrence of slips and lapses.
Want to know more?
For more in-depth information about human factors solutions or practical human factors training for your workplace, contact Leading Edge Safety Systems. We are a group of highly qualified and experienced human factors experts, with extensive experience across a range of industries and a proven track record of delivering practical solutions to key safety, risk and human factors challenges in the workplace.
Reason, J. (1990). Human error. New York: Cambridge University Press.