Models, methodologies and meanderings

For this, the first posting of 2012, here’s wishing you a safe, healthy and joyous year. In case you haven’t noticed, I haven’t posted in a while. That’s because I’ve been fully occupied with safety culture assessments. But you’ve still been reading, which seems to indicate that it’s time for some more reflections.

If you are interested in our safety culture assessments, see our recently issued report, “Independent Oversight Assessment of the Nuclear Safety Culture and Management of Nuclear Safety Concerns at the Hanford Site Waste Treatment and Immobilization Plant – January 2012,” and the Supplement to the report, posted on the DOE Health, Safety and Security web page at www.hss.doe.gov.

For readers not in the DOE community, you may find the report of most interest for the methodology our team used. The methodology was developed through extensive research on Management and Organizational factors, originally sponsored by the U.S. Nuclear Regulatory Commission. And for those of you particularly interested in high reliability theory, the NRC research program led by Dr. Sonja Haber supported some of the early field work of the U.C. Berkeley high reliability team. So just in case there is confusion that high reliability and safety culture are somehow different, allow me to dispel that notion.

Some of you are aware that I have a mania for valid methodology. While I do respect personal opinion (and those of you who know me understand that I hold strong opinions of my own), when it comes to organizational safety assessments there is no substitute for rigorous method grounded in sound research. My activities over the past few months with safety culture assessments in high consequence projects have further solidified this methodological predisposition.

So it is with this perspective in mind that I reflected on a new report on the “Health & Safety Information Gap” in the Oil & Gas industry: http://www.rgu.ac.uk/news/rgu-research-for-aveva-identifies-health-and-safety-information-gap-

Some of the conclusions:

Developing a ‘culture of personal responsibility’ and ‘human behaviour’ were identified as the top two challenges to improving safety.

 “When asked what their priorities were for improving safety, many felt that management commitment to safety needs to be improved, and that there needs to be greater accountability and individual responsibility for safety.  Particular priorities included:

  • Changing employee safety behaviours (66%)
  • Changing the safety culture of the organisation (61%)
  • Improving employee awareness of safety (53%)
  • Recording and auditing improvements (51%)
  • Meeting regulatory compliance (50%)
  • Improving information systems (45%)
  • Demonstrating employee competency (35%)
  • Addressing new/changed regulations (31%)”

Just looking at the identified challenges and summary results, what do you think is their model of safety? WYLFIWYF – what you look for is what you find. Our models create our reality; unspoken, unrecognized assumptions about life and everything in it control how we think and act. So how many Chernobyls, Texas City Refineries or Deepwater Horizons have to happen before we in a technology-centric world are forced to understand that the focus on ‘human behavior’ and ‘personal responsibility’ is a failed model that is virtually useless for complex technologies operated by complex organizations?

There are better models; there are sound methodologies based on extensive research on severe accidents and reliability-seeking organizations. Late last year researchers at VTT published two documents that give very clear pictures of the better models and better methodologies, as well as the fallacies (or biases) of simplistic linear causality thinking.

“Human behavior needs to be understood in the context of people attempting (together) to make sense of themselves and their environment, and act based on perpetually incomplete information while relying on social conventions, affordances provided by the environment and the available cognitive heuristics. In addition, a move toward a positive view of the human contribution to safety is needed. Systemic safety management requires an increased understanding of various normal organizational phenomena … coupled with a systemic safety culture that encourages and endorses a holistic view of the workings and challenges of the socio-technical system in question.”*1

“The purpose of an organisational evaluation is not usually to explain what has happened but to judge whether an organisation is capable of managing risks and creating sufficient safety in its activities. The focus of an organisational safety evaluation is on the future – to assess the organisation’s potential for safe performance.”*2

The scholarship of the VTT group and their communicative styles are of the highest quality, and the knowledge they share builds upon years of sound research that is by no means novel or esoteric. The knowledge is available; the methods are developed, tested and reliable. What is wanting is only the willingness to study, understand and apply. That way, and that way only, lies the path toward the promise of safe, sustainable technological deployment.

 As a good friend, a nuclear engineer, phrased it so eloquently, “once you understand this stuff, you can’t look at any organizational dysfunction without seeing it through the eyes of safety culture (or just culture).”

 As always, comments welcome.

*1 Teemu Reiman and Carl Rollenhagen, “Human and organizational biases affecting the management of safety,” Reliability Engineering and System Safety, vol. 96, no. 10 (2011), pp. 1263–1274.

http://www.vtt.fi/inf/julkaisut/muut/2011/Reiman_Rollenhagen.pdf

*2 Pia Oedewald, Elina Pietikäinen and Teemu Reiman, “A Guidebook for Evaluating Organizations in the Nuclear Industry – an example of safety culture evaluation,” VTT Technical Research Centre of Finland, Report 2011:20, ISSN 2000-0456, June 2011.

 http://www.stralsakerhetsmyndigheten.se/Global/Publikationer/Rapport/Sakerhet-vid-karnkraftverken/2011/SSM-Rapport-2011-20.pdf


5 Responses to Models, methodologies and meanderings

  1. Earl,

    Thanks.

    I wonder if you would care to comment on the following as a way to move forward?
    ____________________________________

    Peer-to-peer Accountability Coaching
    Q. What is it?
    A. Peer-to-peer Accountability Coaching is the process of one team member inviting another team member to be accountable for living up to the organization’s Excellence Model.
    Q. What does it look like?
    A. The “coach” has an extremely clear and detailed mind picture of what the Excellence Model demands in terms of conditions, behaviors, actions, and inactions.
    The coach communicates respectfully, clearly and credibly.
    The coach leads by example.
    The coach is mindfully engaged in situational awareness at the facility.
    The coach is constantly comparing the current situation to what the situation would look like if the Excellence Model had been achieved.
    The coach steps up to the challenge of stating to the relevant team member what is going on that does not match the Excellence Model.
    The person receiving the coaching repeats it back, takes the appropriate immediate action, and asks the coach for an evaluation of the immediate action. When appropriate the person receiving the coaching writes a Condition Report.
    Q. Please give me an example.
    A. Several team members, including a job supervisor, are getting ready to move a component with lifting and rigging equipment. During the Pre-job Brief the supervisor mentions that the work instruction says that this is a “Critical Lift,” but it is actually only a “Heavy Lift,” and indicates that it will be done as a “Heavy Lift.”
    The Peer-to-peer Accountability Coach reminds the Supervisor that 1) the work should not proceed in nonconformance with the work instruction, 2) the right thing to do is to get the work instruction changed, and 3) that a CR needs to be written on the defective work instruction, the defective reviews that allowed it into the Pre-job Brief stage, and the avoidable rework associated with stopping the job and getting the errors corrected.

  2. wecarnes says:

    Bill
    Thank you for the reply and the model.

    Albert Bandura’s Social Learning Theory posits that people learn from one another, via observation, imitation, and modeling. The theory has often been called a bridge between behaviorist and cognitive learning theories because it encompasses attention, memory, and motivation.

    If we are very fortunate in our careers we luck into positive mentors and coaches. Yet this remains the exception, not the norm. How can an organization possibly expect anyone to perform well unless they are taught and coached? We do this for Little League sports – but only the very best organizations do this for their people. Yet when something happens it’s the workers, not the management, who get the blame. One way to understand the culture is simply to observe how managers interact with the staff, and how peers interact with each other. So thanks, Bill, for providing us with a positive example.

  3. Bill Mullins says:

    Earl,
    As you know I have been engaged with or followed the saga at the Hanford River Protection Program for more than 15 years – it certainly qualifies as among the most Complex, High-Consequence Circumstances in the world of environmental protection and probably among engineered pursuits of any type.

    At the present moment all the parties to the issues of WTP Safety Culture, and to the implications those issues could have as indicators of challenges across the DOE Nuclear Security and Safety enterprise, are awash in assessments of the circumstances of the RPP. The just-issued report of HSS to which you refer is one more set of “expert opinions” about causal factors, extent of conditions, and evidence of remedial progress – but toward what end?

    It is evident that you and your HSS colleagues made a very concerted effort to introduce a research-credible methodological approach to this most recent assessment. You’ve indicated above your professional support for such defensible bases as essential to providing DOE decision-makers with sound assessment conclusions and informed recommendations.
    And so I’m wondering how I might conclude that your methods have been as successful as you would hope?

    I read the Executive Summary listing of shortcomings in the areas of Safety Culture and Management of Safety Concerns (12 Significant Conditions Adverse to Quality in my count) and I’m crestfallen – 16 months since the disruptive challenges raised in the DNFSB public hearings in October 2010 and the sense of the composite of yet another knowledgeable review is that “They Just Don’t Get It.”

    I’m left to wonder: How bad does it have to be before a seasoned team such as yours develops some pervasive sense of credulity for the proposition that the Concept of the RP Program is fatally flawed?

    What prevents the observation that there is lacking, in the basic governance architecture of the RPP, suitable respect for the substantial unconventional uncertainty present throughout this entire program – that the “Fast-Track Design-Build” construct was a defective Acquisition Strategy for the WTP? Or that an exceedingly complex program serving two masters is beyond imprudent?

    What will it take for the consistent evidence, of review after review, to reach the light of day: that unrealistic commitments to scheduled milestones decades out are not working to provide accountability to the Tri-Party Cleanup Agreement?

    Who will be the first to say that by all appearances, that naïve Acquisition Strategy is the root cause of all the reluctance to raise issues, or to address them in a timely fashion, or to conduct realistic technology development programs until prodded by the DNFSB, or to put a life cycle integrated, System of Systems development organization in place, and to…?

    My conclusion for some time has been that all concerned – from DNFSB to local stakeholders – are trapped in a Vicious Cycle in which symptoms of unrealistic and imprudent Mission Management (e.g. the Fast-Track Design-Build or the forthcoming One System concepts) are being interpreted by default as evidence of a much more nebulous entity – weakness measured against the “Nuclear Safety Culture” meme. With all the hand-wringing over the symptoms of “defective safety culture,” the underlying progression of the presenting disease – under-capable Acquisition Strategy – goes unaddressed and largely unexamined.

    In a project (i.e. the WTP) whose only mission is High Hazard Reduction, it is inevitable that every aspect of design will involve significant questions of both production and protection. But it seems a mistake to assume that the report of concerns (cf. Thomasitus) about the effectiveness of Pulse-Jet Mixing isn’t first and foremost a physical chemistry process engineering challenge – that the ability to assure adequate protection should be deferred for the most part pending confirmed evidence that the needed handling of non-Newtonian fluids can be achieved at scale.

    To offer just one example of how putting Nuclear Safety Culture in the foreground of attention is problematic, I suggest that a review be made of reports from the many different reviews and surveys conducted in the RPP over the past two years. Attention should be focused on the variation in key terminology and definitions applied in regard to “safety culture.” Such a review will clearly indicate that there is no unanimity among those offering “very important views” about what common subject is being examined.

    With some distance from the direct conduct of these reviews it is not surprising that such inconsistency of assumptions, definitions, analytical frameworks and such is to be found. In reality there is no agreed upon Nuclear Safety Culture Body of Knowledge. Just with IAEA alone there are dozens of guidance publications and no critical literature from which to question these “authoritative” documents – ones that are largely crowd-sourced by nuclear insiders with next to no domain knowledge in Sociology, Anthropology and contemporary Cognitive Psychology.

    This situation conspicuously and adversely impacts the artifact of NRC’s published Nuclear Safety Culture Policy; in comparison to NEI, INPO and IAEA guidance and a raft of other sources, such as the VTT ones you cite, NRC’s “expectations” are as superficial as they are “authoritative.” With some reviewers citing NRC stances on Safety Culture in the RPP setting, the movement away from DOE’s own, substantially more performance-based ISM experience is very disturbing to me.

    All this and much more leads me back to what beliefs I take that you and I share:
    • Do the Mission Safely – as implemented in ISM DEAR and Doctrine – remains the central measure of DOE commitment to balanced resource application between the protection and production aspects of Acceptable Performance.
    • The High Reliability Body of Knowledge, as it has evolved from observational research and the field work of those like Reason, Dekker, Weick, Sutcliffe, Hollnagel and many others, points to the importance of prudent wariness and uniform mindfulness to the Mission Done Safely.
    • Effective Program Management practice is that in which salient concerns of any type gain the attention they warrant as part of a wholesome Concept of the Enterprise – not an approach riven by professional narrowness, stovepiped histories, and “not playing nice together.”
    • The value of principle-based experience in large Discover and Develop programs such as Naval Reactors, or NASA (and even the Stockpile Stewardship Program) in which the reality of consistent attention to total performance is recognized as the key to reducing unwanted outcomes – of all types – to the practical minimum.

    I wish that I was seeing these well-established precepts and practices dominating the evaluation and analysis of “what’s amiss” at the RPP but I’m not.

    Those dozen SCAQs in the HSS Executive Summary should be setting off all the alarms on the 7th floor of the Forrestal Building. It is time to reboot the entire project before it turns out that the abandoned WPPSS nuclear power plant down the road gets a sister in the WTP. I believe the S-1/S-2 memo of December 5, 2011 is a sound statement of corporate intentionality and is the place to start a recovery, but it must be returned to the ISM evolutionary track and spared submergence in the Cargo Cult Sociology of “positive nuclear safety culture traits.”

    And the foregoing is before I get to the ongoing absurdity of using DOE STD 3009 as the basis for Safety in Design of the WTP! Hanford Tank Waste is a physical chemistry challenge of the first magnitude – aside from direct radiation from large waste volumes – the real hazards of the WTP are the chemicals. I don’t see that getting the attention that eventually it must.

    Best wishes,
    Bill Mullins, Principal, Better Choices Consulting

  4. Bob Roulston says:

    Mr. Mullins:
    By your statements – “To offer just one example of how putting Nuclear Safety Culture in the foreground of attention is problematic, I suggest that a review be made of reports from the many different reviews and surveys conducted in the RPP over the past two years. Attention should be focused on the variation in key terminology and definitions applied in regard to ‘safety culture.’ Such a review will clearly indicate that there is no unanimity among those offering ‘very important views’ about what common subject is being examined.” – are you getting back to the concept that the people studying an issue MUST come to some agreement on their terminology, i.e. resolve Humpty Dumpty’s challenge of “the words I use mean exactly what I pay them to mean and nothing more” (paraphrase)?
    In our complex world, if I refer to a statement by the NRC, you might think I was talking about the Nuclear Regulatory Commission, whereas I (an environmental professional) am referring to the National Response Center. Until we set the boundaries on study and terminology and agree to them, we are talking but not communicating.

    • Bill Mullins says:

      Bob,

      I appreciate that question and the reference to Humpty Dumpty’s take on operational lexicons. Applying the HRO precepts in a Complex, High-Consequence Circumstance enterprise, what I look for first is the provisions for dealing with the unanticipated.

      It is when a significant Vulnerability (i.e. unanticipated unwelcome outcome) emerges from the flow of the Program or Project that Humpty Dumpty’s usage kicks in. If the King’s Horsemen haven’t been trained and rehearsed at Egg Reconstruction (i.e. paid forward in readiness terms), Humpty recovery becomes a very long shot.

      In a sense, if we reverse engineer what kind of readiness it takes for the Humpty definition algorithm to work effectively, we would end up with all the characteristic “no fatal surprises” attributes of an HRO. To the extent that practice makes for muscle memory without elaborate conversation, reliability increases.

      So, does it follow that I need a standard glossary to communicate my findings about what I observe the enterprise resilience provisions to be? Probably not, but depending on the audience, I will likely need to do some tailoring of my descriptions to create the level of understanding I’m targeting with my Conversation. (BTW – when I capitalize like this, I intend that the meaning in a good English dictionary will get you through to my intent.)

      We know that in a broadcast report like the HSS review of the RPP, assumptions are made about target audiences and then terms of usage are employed to reach those audiences as effectively (i.e. thoroughly and efficiently) as possible.

      This becomes pretty practical to achieve with a standard scheduled audit (and this seems applicable all the way up to the GAO level). Such things are well-standardized – e.g. conventions exist regarding “observation” and “finding” so any dispute can move on to the action-provoking significance of applying one term or another in a particular case.

      But when the latest HSS review is the sixth or seventh documented assessment of what are some very high level concepts and principles, this question of lexicon gets pretty messy in my view. So messy, in fact, that without some operational (as opposed to prescriptive) norms for judging significance, I find each new report adding more chaos to the task of defining appropriate responses going forward than it provides risk insights with its specific details.

      I was looking at the IAEA standard Glossary the other day – as I recall it runs on for 137 pages. I was looking specifically at the descriptions of the terms “safety,” “quality,” “risk,” and “culture” in comparison to the diverse usages of those terms in the various documents related to the state of the River Protection Program.

      The IAEA Glossary is laid out in traditional alphabetical order as if it were a general purpose dictionary – except that for the most part, each term has just one (presumably normative) description. In that mode, it obliterates the overall IAEA standards architecture – is that a quality alternative to a text hyper-linked database of the standards themselves? It seems a little obsolescent to me.

      You observed: “Until we set the boundaries on study and terminology and agree to them, we are talking but not communicating.”

      Here’s my take-home message about that – once we get to the steep side of the exponential curve of oversight demand as a function of observable complexity (e.g. where the RPP is), context and judgment regarding meaning demand a more negotiated approach than any glossary can prescribe.

      We attempt to circumvent this practical communication constraint all the time – to the peril of useful understanding!

      In the instance of nuclear energy applications, I find that all the parochial definitions of “nuclear safety” – ones that circumscribe the meaning of “risk,” “quality,” and “culture” – are doing more harm than good, despite the considerable and well-meaning efforts of the various standards setters in our field. That needs to change.
