Error

Human error is not a cause!

Error, mistake, faux pas, gaffe, blunder, lapse, slip, goof, oops, blooper: how many words do we have to express the idea that things don’t always happen as we expect or as we would prefer? At the 2009 CEO Conference of the Institute of Nuclear Power Operations (INPO), one CEO stated that the most important change in the commercial nuclear industry in the past decade was the recognition that people do not intentionally commit errors. INPO’s training reference guide that introduced the commercial nuclear power industry’s Human Performance Improvement (HPI) initiative stated that HPI represented “a new way of thinking.” So the question is: how might we think differently about this concept of error that seems an inevitable aspect of the human condition?

The “fact” that some 80% of accidents are “caused” by human error appears throughout the safety literature. Formal accident investigations have used attributions of error as cause to justify blame and punishment, ostensibly to “prevent” recurrence of similar accidents. Yet after decades of labeling human error as a cause, what do we really know scientifically about this fundamental human concept?

Much of the scientific work on accident causation can be traced to the aftermath of the Three Mile Island accident. Woods and Cook [1] describe the situation: “At that time, the folk model of accident causation was firmly in place among researchers and error seemed a plausible target for work on safety. It was only after a long period of empirical research on human performance and accidents that it became apparent that answering the question of what is error was neither the first step nor a useful step, but only a dead end.”

As James Reason explains in his book Human Error, error means different things to different people and depends on context. The Latin root, errare, means “to wander.” In baseball, an error is the act, in the judgment of the official scorer, of a fielder misplaying a ball in a manner that allows a batter or base runner to reach one or more additional bases, when such an advance should have been prevented given ordinary effort by the fielder. In computing, an error is an unexpected condition encountered during operation.

The utility of error as a cause is further complicated because error cannot be isolated as a particular psychological or behavioral phenomenon. Addressing efforts by cognitive psychologists to identify error types, Reason states that “Far from being rooted in irrational or maladaptive tendencies, these … error forms have their origin in fundamentally useful psychological processes.” He continues by quoting Ernst Mach (1905): “knowledge and error flow from the same mental sources, only success can tell one from the other.”

So it seems that what may be called error is distinguishable only retrospectively in the presence of an undesirable outcome. Absent such an outcome, error is not observable. And if error is not observable without an outcome, is there any utility to a concept so rooted in cultural views of causality yet so lacking in scientific validity?

Returning to Woods and Cook: “Error is not a fixed category of scientific analysis. It is not an objective, stable state of the world. Instead, it arises from the interaction between the world and the people who create, run, and benefit (or suffer) from human systems for human purposes – a relationship between hazards in the world and our knowledge, our perceptions, and even our dread of the potential paths toward and forms of failure. … To use ‘error’ as a synonym for harm gives the appearance of progress where there is none.”

If the concept of error has no particular value in the analysis of failure, and may indeed be counterproductive there, perhaps its value lies elsewhere. Viewing error as a fuzzy concept, rather than an absolute one, provides a basis for proceeding. William James’s philosophy of Pragmatism relates a concept’s meaning to its purpose. Operationalization is the process of defining a fuzzy concept so as to make it measurable, in the form of variables consisting of specific observations. W. Edwards Deming explains that “An operational definition is a procedure agreed upon for translation of a concept into measurement of some kind.”
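
To make operationalization concrete, here is a minimal sketch in Python of an operational definition in Deming’s sense applied to error. Every name, the measurement scenario, and the 0.5 mm tolerance are illustrative assumptions of mine, not anything drawn from Deming:

    # A sketch of an operational definition: an agreed-upon procedure
    # that translates the fuzzy concept "error" into a measurement.
    # Names and the 0.5 mm tolerance are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Observation:
        measured_mm: float  # what was actually observed
        target_mm: float    # what the plan called for

    TOLERANCE_MM = 0.5      # the "agreed upon" part, fixed in advance

    def is_error(obs: Observation) -> bool:
        """An 'error' is any observation whose deviation from the
        target exceeds the agreed tolerance."""
        return abs(obs.measured_mm - obs.target_mm) > TOLERANCE_MM

    print(is_error(Observation(measured_mm=10.4, target_mm=10.0)))  # False
    print(is_error(Observation(measured_mm=10.8, target_mm=10.0)))  # True

Note that the same observation is or is not an “error” only relative to the agreed procedure; widen the tolerance and the error disappears, which is exactly what makes the concept fuzzy rather than absolute.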

How might we understand error in a purposeful sense, one that promotes the human condition? Consider, as an example, physical pain. Pain may be understood as a negative consequence, something to be avoided or even feared. Alternatively, pain may be understood as one of the body’s key defense mechanisms, whose purpose is to alert us to a threat to the body’s safety or survival. Similarly, we may shift the meaning of error from harm to a warning of harm. Error thus becomes a signal that prompts protective action.

Reason offers three related “working” definitions of error, each predicated on a retrospective judgment that a predetermined course of action did not achieve the desired outcome. He then suggests that error be understood in terms of intentions, actions, and consequences, and that it be extended from a purely individual phenomenon to include organizational phenomena. If we understand error as a signal operating among intentions, actions, and consequences, this formulation becomes equivalent to Deming’s description of the Shewhart Cycle of Plan, Do, Study, Act. Errors become signals that enable individuals and organizations to monitor how their doing compares with the plan and with anticipated outcomes, and then to adjust both plan and actions based on the feedback that error provides, as the sketch below illustrates.
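
Read this way, the cycle can be sketched as a simple feedback loop. The Python fragment below is only an illustration, with the function names, the toy “do” step, and the proportional adjustment all my own assumptions rather than anything from Deming or Reason; what it shows is error serving as the signal that drives the next Plan rather than as a verdict on the last one:

    # A sketch of Plan-Do-Study-Act with error as feedback, not verdict.
    # All names and the adjustment rule are illustrative assumptions.
    def pdsa(plan: float, do, goal: float, cycles: int = 5, gain: float = 0.5):
        """Run a few PDSA cycles: Do executes the plan, Study computes
        error = goal - outcome, and Act adjusts the plan with that signal."""
        for cycle in range(cycles):
            outcome = do(plan)          # Do: carry out the plan
            error = goal - outcome      # Study: error as a signal
            print(f"cycle {cycle}: plan={plan:.2f} "
                  f"outcome={outcome:.2f} error={error:.2f}")
            plan += gain * error        # Act: adjust the plan, not the blame
        return plan

    # A toy Do step in which execution falls somewhat short of the plan.
    pdsa(plan=1.0, do=lambda p: 0.8 * p, goal=2.0)

Each pass narrows the gap between outcome and goal precisely because the error is fed back into the plan rather than treated as a terminal judgment.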

Error is life providing feedback on our interactions with the environment.  By shifting our paradigm of error from one of fear and blame to one of system feedback, we find that error is nature’s way of helping us proceed incrementally toward our goals while coping with an uncertain universe.

8 Responses to Error

  1. wroege says:

    To err is human and even the smartest humans err…

    In my own experience, I find that recognizing errors, even when there are no consequences, is an important way of considering new habits or barriers that will prevent future consequences from similar errors.

    I think it is also critical that we as a group always keep in mind that error is unintentional and should not be punished.

  2. GSearfoss says:

    When I was a Shift Supervisor at the Susquehanna Steam Electric Station, a two-unit boiling water reactor plant, we recognized that errors were going to be made, even in an industry that strove for zero errors and zero defects. It was extremely important that we learned from our mistakes and were candid about our problems and weaknesses. Everyone was open to evaluation and everyone was encouraged to identify potential problems. Everyone’s professional opinion was respected and no one was discouraged from raising concerns. Our approach was simple – we can learn from anyone.

    • wecarnes says:

      So Skip, what do you think we need to learn to create the type of culture you had at Susquehanna that encouraged this learning approach? As a shift supervisor, how did you communicate and reinforce that this openness was desirable?

      Thanks for joining in Skip.

      Earl

  3. Bill Corcoran says:

    Earl said:

    “So it seems that what may be called error is distinguishable only retrospectively in the presence of an undesirable outcome. Absent such an outcome, error is not observable.”

    This seems to fly in the face of decades of aviation, submarine, and nuclear power training during which instructors can see and correct errors in real time.

    Take care,

    Bill Corcoran
    Mission: Saving lives, pain, assets, and careers through thoughtful inquiry.
    Motto: If you want safety, peace, or justice, then work for competency, integrity, and transparency.

    W. R. Corcoran, Ph.D., P.E.
    Nuclear Safety Review Concepts Corporation

    • wecarnes says:

      For any who might not know Bill, he is one of the thought leaders in causal analysis in high hazard industries and I have high regard for his work.

      So, two replies, Bill. First, what was it they were correcting? How are you defining error? Do people correct error, or behavior?

      Second, you picked out one statement for reply; what about the larger argument that error is not a cause?

      Finally, this is exactly the type of comment I hope to get; it stimulates thought and discussion. I really thank you, Bill, for joining in and hope you will continue to contribute.

      All the best,

      Earl

  4. wroege says:

    I think the points laid out by Bill and Earl are key as we go forward. Language is very important, and the term “error” has many definitions depending on the community. I have heard Earl discuss this many times, some of them with me, and I understand the very fine point that he makes. In many ways, understanding and accepting that point helps one move forward in the HPI concept framework. Sometimes I wish we had a new word to describe the concept, as I do wonder whether there is such a thing as error if there is no consequence.

    The key concept to me is that error is inadvertent and therefore should not be used for blame. A blame culture is a cancer that eats away at otherwise healthy organizations.

  5. Bill Corcoran says:

    Wroege,

    Thanks for your input.

    I would agree that much of what is called error is “inadvertent.”

    All of the errors I have seen in practice were the product of a set of factors that determined the exact nature, magnitude, location, and timing of the error.

    These factors included what set up the person for the error, what triggered it, what made it as bad as it was, and what kept it from being worse.

    Once the direct factors, and the underlying factors that produced them, are known, it is possible to consider corrective measures intelligently.

    Error should be used as an entrée into peeling the onion to find the basic, fundamental underlying vulnerabilities of the organization. There should be no sacred cows, not even the regulators or the victims.

    Take care,

    Bill Corcoran
    Mission: Saving lives, pain, assets, and careers through thoughtful inquiry.
    Motto: If you want safety, peace, or justice, then work for competency, integrity, and transparency.

    W. R. Corcoran, Ph.D., P.E.
    Nuclear Safety Review Concepts Corporation
