Before continuing with discussions of assessment in high reliability organizations, I want to share some thoughts prompted by a recent meeting with one of the leading HRO researchers.
Dr. Kathleen Sutcliffe of the University of Michigan spoke last week at the Dept. of Interior University Senior Executive Series. She is the co-author, with Karl Weick, of Managing the Unexpected: Resilient Performance in an Age of Uncertainty, and of its 2001 first edition. Much of her recent work focuses on health care; see for example the excellent paper on safety culture in health care she co-authored with Weick and Tim Vogus, “Doing No Harm”: http://owen.vanderbilt.edu/vanderbilt/data/research/2267full.pdf
Dr. Sutcliffe’s presentation highlighted a few points about the uniqueness of HROs:
- They are distinguished not by absolute error rates but by what they do and how they prepare to handle the unexpected
- The only way to become reliable in complex organizations is through resilience; yes, try to anticipate and prevent, but also, and uniquely, develop refined skills to catch and connect the small occurrences
- A lot of what you “see” in HROs is what people talk about: mistakes, alternatives, what could go wrong
- Most organizations focus on getting it right; HROs hate getting it wrong
On this last point, Dr. Sutcliffe mentioned the “positive asymmetry” research by Karen Cerulo of Rutgers. The general thrust of Cerulo’s work is that most people (and organizations) operate based on “blind optimism – a tunnel-vision directed to best case scenarios and an accompanying disregard for worst case scenarios.” How much of this is physiological predisposition and how much is socialized is a current debate among cognitive neuroscientists and sociologists. (A three-part discussion of these concepts, begun in December 2009, is available at: http://www.ibiblio.org/culture/?q=node/47)
We will need to pay attention to what these research efforts may say about high reliability organizations. But fundamentally, for now, I think the HRO research and this new research should make us particularly sensitive to recent trends in safety culture. There is a new awareness that while surveys may give some insight into perspectives on culture, the real question is what to do to actively create and maintain cultures in which safety and production are seamless – cultures in which value conflicts do not exist because the behaviors that result in high productivity are the same behaviors that create safe operations.
The new trend in high hazard organizations is the recognition that resilient cultures can be created and must be managed. The potential downside of this recognition is that culture management is often equated with skills and mindsets from MBA or engineering management perspectives – heavy on the management of things, dollars, and productivity. While the overall directions are positive, we can become so immersed in the mechanics that we lose sight of the fundamental lessons. We have to continually work on building and maintaining individual and organizational mindsets that avoid the “complacency” of success, and to remain mindful of the second law of thermodynamics as it applies to complex systems: things are always deteriorating from order to disorder. That’s just natural. So what keeps HROs reliable is resilience – a unique focus on watching for signs of disorder and being prepared to respond when such signs are detected.
All this came to mind when, in searching for articles by Cerulo, I stumbled upon a blog posting on Fukushima: http://sociologicalimagination.org/archives/3929 The blog’s author, Casey Brienza (PhD candidate in Sociology at the University of Cambridge), poses this question: “So why, in a place that ought to understand best the double-edged blade of atomic energy, didn’t [the Japanese] do a better job at expecting the worst?” Excerpting from the blog, Brienza offers the following:
“In Never Saw It Coming: Cultural Challenges to Envisioning the Worst (Chicago, 2006), Rutgers sociologist Karen A. Cerulo theorizes that human cognition and cultural practice are mutually-reinforcing and can prevent societies from fully evaluating worst possible outcomes. She calls this phenomenon ‘positive asymmetry’ and concludes that the United States with its brand of optimism has a surfeit of it. Indeed, her argument seems peculiarly wedded to a point in recent American history, the first decade of the twenty-first century, when traumatic events such as 9/11 and the NASA Columbia shuttle crash seemed to threaten some of the most potent symbols of what the United States believes about itself as a nation. Does her theory apply equally to the Japanese? …
I do not presume to fully understand the source of Japan’s troubles. However, I would argue that we must not reduce this crisis to a freak act of nature on the one hand or the fault of particular individuals and/or agencies on the other. Like the tree falling in the forest that nobody hears, the earthquake that nobody feels matters little in our everyday lives. Thus is a ‘natural disaster’ like this one actually a socially- and culturally-mediated event, caused by a confluence of geological processes and societal realities. I hope that in the years to follow—after the region has rebuilt and recovered—that we look carefully at the interaction between cognition and culture as we formulate a nuanced sociological account of what actually happened…and how to prevent this tragedy from ever happening again.”
The next blog posting will resume the discussion on assessment for high reliability. Meanwhile, thanks to Kathleen Sutcliffe and the others credited in this blog for enriching our conversations.