Last week I was at the Probabilistic Safety Analysis and Management conference. We had 400 attendees from over 20 countries. Notably, this was the first time in 20 years that there was a full track on Safety Culture, with about 40 papers presented. The topics included development of new theoretical models, case studies of safety culture analysis and interventions, and research underway in aviation, medicine, and nuclear power. The conference keynote was delivered by Dr. George Apostolakis, formerly of MIT and a new Commissioner of the Nuclear Regulatory Commission; he chose to speak on safety culture. Over the next few weeks I'll be summarizing a sampling of the papers presented.
Meanwhile, over the next week or so I invite you to comment on the Deepwater disaster. True, the full facts are not yet in and will likely be a while in coming. But as I was flying around the country I re-read "Streetlights and Shadows" by Gary Klein, his latest book on decision making. In it he discusses fallacies about how people make decisions and elaborates on his extensive research on how experts make decisions based on pattern matching. He does an excellent job of relating his decision research to high reliability. From what we know about high reliability and organizational failure, I think we have a sense of some of the fundamental organizational and cultural issues surrounding the catastrophe. The President's Deepwater Commission will consider what should be done to prevent such events in the future. So how would you advise the Commission on what to consider? What should government and industry do to transform petroleum drilling and production into a highly reliable operation? Is it even possible?
I look forward to reading your suggestions.