For this, the first posting of 2012, here’s wishing you a safe, healthy and joyous year. In case you haven’t noticed, I haven’t posted in a while. That’s because I’ve been fully occupied with safety culture assessments. But you’ve still been reading, which seems to indicate that it’s time for some more reflections. If you are interested in our safety culture assessments, see our recently issued report, “Independent Oversight Assessment of the Nuclear Safety Culture and Management of Nuclear Safety Concerns at the Hanford Site Waste Treatment and Immobilization Plant – January 2012,” and the Supplement to the report, posted on the DOE Health, Safety and Security web page at www.hss.doe.gov.
For readers not in the DOE community, the report may be of most interest for the methodology our team used. The methodology was developed through extensive research on management and organizational factors, originally sponsored by the U.S. Nuclear Regulatory Commission. And for those of you particularly interested in high reliability theory, the NRC research program led by Dr. Sonja Haber supported some of the early field work of the U.C. Berkeley high reliability team. So, just in case there is any confusion that high reliability and safety culture are somehow different, allow me to dispel that notion.
Some of you are aware that I have a mania for valid methodology. While I do respect personal opinion (and those of you who know me understand that I hold strong opinions of my own), when it comes to organizational safety assessments there is no substitute for rigorous method grounded in sound research. My activities over the past few months with safety culture assessments of high consequence projects have only solidified this methodological predisposition.
So it is with this perspective in mind that I reflected on a new report on a “Health & Safety Information Gap” in the oil & gas industry: http://www.rgu.ac.uk/news/rgu-research-for-aveva-identifies-health-and-safety-information-gap-
Some of the conclusions:
Developing a ‘culture of personal responsibility’ and ‘human behaviour’ were identified as the top two challenges to improving safety.
“When asked what their priorities were for improving safety, many felt that management commitment to safety needs to be improved, and that there needs to be greater accountability and individual responsibility for safety. Particular priorities included:
- Changing employee safety behaviours (66%)
- Changing the safety culture of the organisation (61%)
- Improving employee awareness of safety (53%)
- Recording and auditing improvements (51%)
- Meeting regulatory compliance (50%)
- Improving information systems (45%)
- Demonstrating employee competency (35%)
- Addressing new/changed regulations (31%)”
Just looking at the identified challenges and summary results, what do you think is their model of safety? WYLFIWYF – what you look for is what you find. Our models create our reality; unspoken, unrecognized assumptions about life and everything in it control how we think and act. So how many Chernobyls, Texas City Refineries or Deepwater Horizons have to happen before we in a technology-centric world are forced to understand that the focus on ‘human behavior’ and ‘personal responsibility’ is a failed model, virtually useless for complex technologies operated by complex organizations?
There are better models, and there are sound methodologies based on extensive research on severe accidents and reliability-seeking organizations. Late last year researchers at VTT published two documents that give very clear pictures of the better models and better methodologies, as well as of the fallacies (or biases) of simplistic linear-causality thinking.
“Human behavior needs to be understood in the context of people attempting (together) to make sense of themselves and their environment, and act based on perpetually incomplete information while relying on social conventions, affordances provided by the environment and the available cognitive heuristics. In addition, a move toward a positive view of the human contribution to safety is needed. Systemic safety management requires an increased understanding of various normal organizational phenomena … coupled with a systemic safety culture that encourages and endorses a holistic view of the workings and challenges of the socio-technical system in question.”*1
“The purpose of an organisational evaluation is not usually to explain what has happened but to judge whether an organisation is capable of managing risks and creating sufficient safety in its activities. The focus of an organisational safety evaluation is on the future – to assess the organisation’s potential for safe performance.”*2
The scholarship of the VTT group, and their communicative style, is of the highest quality, and the knowledge they share builds upon years of sound research; it is by no means novel or esoteric. The knowledge is available; the methods are developed, tested and reliable. What is wanting is only the willingness to study, understand and apply. That way, and that way only, lies the path toward the promise of safe, sustainable technological deployment.
As a good friend, a nuclear engineer, phrased it so eloquently, “once you understand this stuff, you can’t look at any organizational dysfunction without seeing it through the eyes of safety culture (or just culture).”
As always, comments welcome.
*1 Teemu Reiman and Carl Rollenhagen, “Human and organizational biases affecting the management of safety,” Reliability Engineering and System Safety, vol. 96, no. 10 (2011), pp. 1263-1274.
*2 Pia Oedewald, Elina Pietikäinen and Teemu Reiman, “A Guidebook for Evaluating Organizations in the Nuclear Industry – an Example of Safety Culture Evaluation,” VTT Technical Research Centre of Finland, June 2011, Report number 2011:20, ISSN 2000-0456.