Tag Archives: Narrative

The Future of Safety

Today I had the privilege and pleasure of speaking at the BCCGA AGM.  A copy of the slides presented can be downloaded here. In my conclusion I posed 4 questions for the BCCGA and its member organizations to consider.

1. What paradigm(s) should our safety vision be based upon?

The evolution of safety thinking can be viewed through 4 Ages.

The recurring theme is how humans were treated as new technologies were implemented into business practices. It’s logical that changes in safety thinking mirror the evolution of Business Practices. The Ages of Technology, Human Factors, and Safety Management are rooted in an Engineering paradigm.

It’s Systems Thinking with distinct parts: People, Process, Technology. Treat them separately and then put them together to deliver a Strategy. Reductionism works well when the system is stable, consistent, and relatively fixed by constraints imposed by humans (e.g., regulations, policies, standards, rules). However, in addition to ORDER, the real world contains 2 other types of system: COMPLEX and CHAOTIC. These two are constantly changing, so a reductionistic approach is not appropriate. One must work holistically with an Ecological paradigm.

This diagram from the Cynefin Centre shows the relative sizing of the 3 systems. Complexity is by far the largest and continues to grow. All organizations are complex adaptive systems. A worthy safety vision must include the Age of Cognitive Complexity and view Safety as an emergent property of a complex adaptive system. This different thinking means rules don’t create safety; they create the conditions that enable safety to emerge. Now we can understand why piling on more and more rules can lead to cognitive overload in workers and enable danger, not safety, to emerge.

2. How should we treat workers – as problems to be managed or solutions to be harnessed?

The Age of Technology and the Age of Human Factors treated workers as problems – as cogs in a machine and as hazards to be controlled. The Age of Safety Management recognizes that rules cannot cover every situation. Variability isn’t a threat but a necessity. We need to trust that humans always try to do what they think is right in the situation. The Age of Cognitive Complexity appreciates that humans do not think like the logical information-processing machines of an Engineering paradigm. Humans are not rational thinkers; decisions are based on emotional reactions and heuristic shortcuts. As storytellers, people can articulate thick data that a typical report is unable to provide. As solution providers, workers can call upon tacit knowledge – knowledge that is difficult to transfer to another person by writing it down or verbalizing it. Workers who feel like cogs or hazards tend to keep what they know to themselves for fear of punishment. Knowledge is volunteered, never conscripted.

3. What safety heuristics can we share?

While Best Practices manuals are beneficial, heuristics play on a bigger stage when it comes to decisions. Humans make 95% of their decisions using heuristics. Heuristics are mental shortcuts that help people make quick, satisfactory but not perfect decisions.

They are the rules of thumb that Masters pass on to their Apprentices. Organizations ought to have a means to collect Safety-II success stories and apply pattern recognition tools. Heuristics that emerge can then be passed to the Masters to scrutinize for accuracy.
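
To make this concrete, here is a minimal sketch (in Python, with entirely hypothetical story data and tag names) of what such pattern recognition over collected success stories might look like: tag pairs that recur across many stories become candidate heuristics for the Masters to scrutinize.

```python
# Minimal sketch: surfacing candidate heuristics from collected success stories.
# Assumes each story has been captured with a few self-assigned tags describing
# what the worker did; frequent tag pairs are treated as candidate rules of thumb
# for the Masters to scrutinize. All names and data here are hypothetical.
from collections import Counter
from itertools import combinations

stories = [
    {"id": 1, "tags": ["pre-job walkdown", "called dispatcher", "delayed start"]},
    {"id": 2, "tags": ["pre-job walkdown", "delayed start"]},
    {"id": 3, "tags": ["called dispatcher", "swapped task order"]},
    {"id": 4, "tags": ["pre-job walkdown", "delayed start", "extra spotter"]},
]

pair_counts = Counter()
for story in stories:
    for pair in combinations(sorted(story["tags"]), 2):
        pair_counts[pair] += 1

# Tag pairs that recur across stories are candidate heuristics, not proven rules.
for pair, count in pair_counts.most_common(3):
    print(f"{pair[0]} + {pair[1]}: seen in {count} stories")
```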

4. How can we get more safety stories like these, fewer stories like those?

This question pertains to a new way of shaping a safety vision through the use of narratives (stories, pictures, voice recordings, drawings, sketches, etc.).

Narratives are converted into data points to generate a 2D contour map or fitness landscape.
Each dot is a story, and seen together they form patterns. The map shows the general direction we want to head – the top right corner (High compliance with rules & High level of getting the job done). Clearly we want more safety stories in the Green area. We also want fewer in the Red and Brown areas. Here’s the rub: if we try to go directly for the top right corner, we won’t get there. This is ATTITUDE mapping at a level far deeper than observable BEHAVIOUR. Instead we head for an Adjacent Possible.
We get people to tell more stories here, fewer there, by changing a human constraint. It might be loosening a controlling constraint like a rule or practice. It could also be introducing an enabling constraint like a new tool or process.
We gather more stories and monitor how the clusters are changing in real time. The evolving landscape maps a new Present state – a new starting point. We then change another constraint. Since we can’t predict outcomes, both positive and unintended negative consequences might emerge. We accelerate the positives and dampen the negatives. In essence we co-evolve our way to the top right corner of the map. This is how we shape our Safety Culture.
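
As an illustration of the mechanics, here is a rough sketch assuming each story has already been self-signified on two 0–10 scales (compliance with rules, and level of getting the job done); the data below is a randomly generated stand-in, not real narrative material. The density of dots becomes the contour map, and re-plotting after each constraint change shows whether the clusters are drifting toward the Adjacent Possible.

```python
# Minimal sketch of turning signified stories into a 2D "fitness landscape".
# Assumes each story has already been self-signified on two scales of 0-10:
# compliance with rules (x) and getting the job done (y). The axis names and
# sample numbers are illustrative only, not real data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
# Hypothetical stand-in for a few hundred signified stories.
compliance = np.clip(rng.normal(loc=6, scale=2, size=300), 0, 10)
job_done = np.clip(rng.normal(loc=5, scale=2, size=300), 0, 10)

# Density of stories becomes the contour map; clusters show where the culture sits now.
density, xedges, yedges = np.histogram2d(compliance, job_done, bins=20, range=[[0, 10], [0, 10]])
xcenters = (xedges[:-1] + xedges[1:]) / 2
ycenters = (yedges[:-1] + yedges[1:]) / 2

plt.contourf(xcenters, ycenters, density.T, levels=10, cmap="RdYlGn")
plt.scatter(compliance, job_done, s=5, c="black", alpha=0.3)  # each dot is one story
plt.xlabel("Compliance with rules")
plt.ylabel("Level of getting the job done")
plt.title("Story landscape: re-plot after each constraint change")
plt.show()
```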

7 Implications of Complexity for Safety

One of my favourite articles is The Complexity of Failure, written by Sidney Dekker, Paul Cilliers, and Jan-Hendrik Hofmeyr. In this posting I’d like to shed more light on the contributions of Paul Cilliers.

Professor Cilliers was a pioneering thinker on complexity, working across both the humanities and the sciences. In 1998 he published Complexity and Postmodernism: Understanding Complex Systems, which offered implications of complexity theory for our understanding of biological and social systems. Sadly, he passed away suddenly in 2011, at the much too early age of 55, from a massive brain hemorrhage.

My spark for writing comes from a blog recently penned by a complexity colleague, Sonja Blignaut. I am following her spadework by exploring the implications of complexity for safety. Cilliers’ original text is in italics.

  1. Since the nature of a complex organization is determined by the interaction between its members, relationships are fundamental. This does not mean that everybody must be nice to each other; on the contrary. For example, for self-organization to take place, some form of competition is a requirement (Cilliers, 1998: 94-5). The point is merely that things happen during interaction, not in isolation.
  • Because humans are natural storytellers, stories are a widely used form of interaction among fellow workers, supervisors, management, and executives. We need to pay attention to the stories told about daily experiences since they provide a strong signal of the present safety culture.
  • We should devote less time to trying to change people and their behaviour and more time to building relationships. Despite what psychometric profiling offers, humans are too emotional and unpredictable to figure out accurately. In my case, I am not a trained psychologist, so my dabbling in trying to change how people tick might be dangerous, bordering on practising pseudoscience. I prefer to stay with the natural sciences (viz., physics, biology) and the understanding of phenomena in Nature that have evolved over thousands of years.
  • If two workers are in conflict, don’t demand that they both smarten up. Instead, change the nature of the relationship so that their interactions are different or even extinguished. Simple examples are changing the task or moving one worker to another crew.
  • Interactions go beyond people. Non-human agents include machines, ideas (rules, policies, regulations), and events (meetings, incidents). A worker following a safety rule can create a condition that enables safety to emerge. Too many safety rules can overwhelm and frustrate a worker, enabling danger to emerge.

2. Complex organizations are open systems. This means that a great deal of energy and information flows through them, and that a stable state is not desirable.

  • A company’s safety management system (SMS) is a closed system.  In the idealistic SMS world,  stability, certainty, and predictability are the norms. If a deviation occurs, it needs to be controlled and managed. Within the fixed boundaries, we apply reductionistic thinking and place information into a number of safety categories, typically ranging from 4 to 10. An organizational metaphor is sorting solid LEGO bricks under different labels.
    In an open system, it’s different. Think of boundary-less fog and irreducible mayonnaise. If you outsource to a contractor or partner with an external supplier, how open is your SMS? Will you insist on their compliance or draw borders between firms? Do their SMS safety categories blend with yours?
  • All organisations are complex adaptive systems. Adaptation means not lagging behind and plunging into chaotic fire-fighting. It means looking ahead, not only trying to avoid things going wrong but also trying to ensure that they go right. In the field, workers confronted by unexpected, varying conditions will adjust and adapt their performance to enable success (and safety) to emerge.
  • When field adjustments occasionally fail, the result is new learning to be shared as a story. This is also why a stable state is not desirable. In a stable state, very little learning is necessary; you just repeat doing what you already know.

3. Being open more importantly also means that the boundaries of the organization are not clearly defined. Statements of “mission” and “vision” are often attempts to define the borders, and may work to the detriment of the organization if taken too literally. A vital organization interacts with the environment and other organizations. This may (or may not) lead to big changes in the way the organization understands itself. In short, no organization can be understood independently of its context.

  • Mission and Vision statements are helpful in setting direction. A vector, or North Arrow, if you like. They become detrimental if communicated as some idealistic future end state the organization must achieve.
  • Being open is different from “thinking out of the box” because there really is no box to start with. It’s a contextual connection of relationships with other organizations. It’s also foggy because some organizations are hidden. You can impact organizations that you don’t even know about and, conversely, their unbeknownst actions can constrain you.
    The smart play is to be mindful by staying focused on the Present and monitor desirable and undesirable outcomes as they emerge.

4. Along with the context, the history of an organization co-determines its nature. Two similar-looking organizations with different histories are not the same. Such histories do not consist of the recounting of a number of specific, significant events. The history of an organization is contained in all the individual little interactions that take place all the time, distributed throughout the system.

  • Don’t think about creating a new safety mission or vision by starting with a blank page, a clean sheet, a greenfield.  The organization has history that cannot be erased. The Past should be honoured, not forgotten.
  • Conduct an ongoing challenge of best practices and Life-saving rules. Remember the historical reasons why these were first put in place. Then question whether those reasons remain valid.
  • Be aware of the part History plays when rolling out a safety initiative across an organization.
    • If it’s something that everyone genuinely agrees to and wants, then just clone & replicate. Aggregation is the corollary of reductionism and it is the common approach to both scaling and integration. Liken it to putting things into org boxes and then fitting them together like a jigsaw. The whole is equal to the sum of its parts.
    • But what if the initiative is controversial? Concerns are voiced, pushback is felt, resistance is real. Then we’re facing complexity, where the properties of the safety system as a whole are not the sum of the parts but are unique to the system as a whole.
      If we want to scale capabilities we can’t just add them together. We need to pay attention to history and understand reactions like “It won’t work here”, “We tried that before”, “Oh no! Not again!”
      The change method is not to clone & replicate.  Start by honouring local context. Then decompose into stories to make sense of the culture. Discover what attracts people to do what they do. Recombine to create a mutually coherent solution.

5. Unpredictable and novel characteristics may emerge from an organization. These may or may not be desirable, but they are not by definition an indication of malfunctioning. For example, a totally unexpected loss of interest in a well-established product may emerge. Management may not understand what caused it, but it should not be surprising that such things are possible. Novel features can, on the other hand, be extremely beneficial. They should not be suppressed because they were not anticipated.

  • In the world of safety, failures are unpredictable and undesirable. They emerge when a hidden tipping point is reached.
    As part of an Emergency Preparedness plan, recovery crews with well-defined roles are designated. Their job is to fix the system as quickly as possible and safely restore it to its previous stable state.
  • Serendipity is an unintended but highly desirable consequence. This implies an organization should have an Opportunity crew ready to activate. Their job is to explore the safety opportunity, discover new patterns which may lead to a new solution, and exploit their benefits.
    At a tactical level, the new solution may be a better way of achieving the Mission and Vision. In the same direction but a different path or route.
    At a strategic level, the huge implication is that a new opportunity may lead to a better future state than the existing carefully crafted, well-intentioned one. Decision-makers are faced with a dilemma: do we stay the course, or do we adapt and change our vector?
  • Avoid introducing novel safety initiatives as big events kicked off with a major announcement. These tend to breed cynicism, especially if the company’s history includes past blemished efforts. Novelty means you honestly don’t know what the outcomes will be, since it will be a new experience both to those you know (identified stakeholders) and to those you don’t know in the foggy network.
    Launch as a small experiment.
    If desirable consequences are observed, accelerate the impact by widening the scope.
    If unintended negative consequences emerge, quickly dampen the impact or even shut it down.
    As noted in (2), constructively de-stabilize the system in order to learn.

6. Because of the nonlinearity of the interactions, small causes can have large effects. The reverse is, of course, also true. The point is that the magnitude of the outcome is not only determined by the size of the cause, but also by the context and by the history of the system. This is another way of saying that we should be prepared for the unexpected. It also implies that we have to be very careful. Something we may think to be insignificant (a casual remark, a joke, a tone of voice) may change everything. Conversely, the grand five-year plan, the result of huge effort, may retrospectively turn out to be meaningless. This is not an argument against proper planning; we have to plan. The point is just that we cannot predict the outcome of a certain cause with absolute clarity.

  • The Butterfly effect is a phenomenon of a complex adaptive system. I’m sure many blog writers like myself are hoping that our safetydifferently cause will go viral, “cross the chasm”, and be adopted by the majority. Sonja in her blog refers to a small rudder that determines the direction of even the largest ship. Perhaps that’s what we are: trimtabs!
  • On the negative side, think of a time when an elected official or CEO made a casual remark about a safety disaster only to have it go viral and backfire. During the 2010 Deepwater Horizon disaster, then-CEO Tony Hayward called the amount of oil and dispersant “relatively tiny” in comparison with the “very big ocean”. Hayward’s involvement has left him a highly controversial public figure.
  • Question: Could a long-term safety plan to progress through the linear stages of a Safety Culture Maturity model be a candidate as a meaningless five-year plan?
    If a company conducts an employee early retirement or buy-out program, does it regress and fall down a stage or two?
    If a company deploys external contractors with high turnover, does it ever get off the bottom rung?
    Instead of a linear progression model, stay in the Present and listen to the stories internal and external workers are telling. With the safety Vision in mind, ask what we can do to hear more stories like these, fewer stories like those.
    As the stories change, so will the safety culture.  Proper planning is launching small experiments to shape the culture.

7. Complex organizations cannot thrive when there is too much central control. This certainly does not imply that there should be no control, but rather that control should be distributed throughout the system. One should not go overboard with the notions of self-organization and distributed control. This can be an excuse not to accept the responsibility for decisions when firm decisions are demanded by the context. A good example here is the fact that managers are often keen to “distribute” the responsibility when there are unpopular decisions to be made—like retrenchments—but keen to centralize decisions when they are popular.

  • I’ve noticed safety professionals are frequent candidates for organization pendulum swings. One day you’re in Corporate Safety. Then an accident occurs and in the ensuing investigation a recommendation is made to move you into the field to be closer to the action. Later a new Director of Safety is appointed and she chooses to centralize Safety.
    Pendulum swings are what Robert Fritz calls Corporate Tides, the natural ebb and flow of org structure evolution.
  • Shifts between central and distributed control are more about governance and audit than about workflow. No matter what control mechanism is in vogue, it should enable stigmergic behaviour, the natural forming of network clusters to share knowledge, processes, and practices.
  • In a complex adaptive system, each worker is an autonomous decision-maker, a solution not a problem. Decisions are based on information at hand (aka tacit knowledge) and, if that’s not available, on knowing who, where, and how to access it. Every worker has a knowledge cluster in the network. A safety professional positioned in the field can mean quicker access but, more importantly, stronger in-person interactions. This doesn’t discount a person in Head Office who has a trusting relationship from being a “go to” guy. Today’s video conferencing tools can place the Corp Safety person virtually on site in a matter of minutes.
Thanks, Sonja. Thanks, Paul.
Note: If you have any comments, I would appreciate it if you would post them at safetydifferently.com.


Safety Culture, the Movie

It’s the holiday season. One terrific way to celebrate as a family is to see a movie together. Our pick? Star Wars: The Force Awakens.
Well, that was an easy decision. The next one is harder though…what sort of experience do we want? Will it be UltraAVX, D-Box, IMAX, 3D, VIP, Dolby ATMOS or Surround sound, or standard digital? While sorting through the movie options, for some reason I began thinking about safety culture and had an epiphany. Safety culture is a movie, not a photo.

A photo would be a Star Wars poster, a single image that a designer has artistically constructed. It’s not the full story, just a teaser aimed at influencing people to buy a ticket. We understand this and don’t expect to comprehend the entire picture from one poster. A safety culture survey or audit should be treated in the same fashion. All we see is a photo, a snapshot capturing a moment in time. Like the poster artist, a survey designer also has a preconceived idea; the influence comes in the form of the questions asked. The range of questions extends from researched “best practices” to personal whim.

I believe this is a major limitation of the survey/audit/poster: it could totally miss what people actually experience as a movie. A movie introduces visual motion and audible sound to excite our human senses and release emotions and feelings. We can watch behaviours as well as the positive and negative consequences being delivered. With the flow of the movie we are able to sense the operating point moving and drifting into the zone of complacency and eventual failure. A safety culture has sound that a photo cannot reveal. In a movie you can hear loud communication, quiet conversations, or the lack of either (e.g., a cone of silence).

If we were to create “Safety Culture, the Movie”, what would we need to consider? I’ve compiled a short list. What would you add?

  • Human actor engagement
    • Actors on screen – lead characters, supporting players, cameo appearances, cast in crowds; front-line workers, supervisors, safety professionals, public at large
    • Actors behind the screen – investors, producer, director, music arranger, theatre owners, craft guilds; Board, execs, project managers, suppliers, unions
    • Actors in front of the screen – paying audience, theatre staff, movie critics; customers, safety associations, regulatory inspectors
  • Story line
    • Safety culture is one big story
    • The safety culture movie is never-ending
    • Within the one big story are several side stories, episodes, subplots
  • Relationships between characters and roles played
    • Heroes, villains, maidens in distress, comic relief, clones
    • Contact is continuous and relationships can shift over time (compare to a snapshot audit focusing on one scene at a particular time slot)
    • What seem at the beginning to be independent interactions are often interconnected (“Luke…I am your father”) and may lead to a dilemma or paradox later
  • Theme
    • Overt message of the safety culture movie – Good triumphs over Evil? Might makes Right? Focus on what goes wrong? Honesty? Respect?
    • Hidden messages – Resistance is futile? Pay attention to outliers? Do what I say, not what I do? It’s all about the optics?
    • Significance of myths, legends, rituals in the safety culture – the Dark side, Jedi order, zero harm workplace, behaviour-based safety
  • Critique
    • What can we learn if our movie is scored on a Flixster (Rotten Tomatoes) scale out of 100?
      • What does a score of 30, 65, 95 tell us about performance? Success?
      • We can learn from each critic and fan comment standing on its own rather than dealing with a mathematical average
    • Feedback will influence and shape the ongoing movie
      • Too dark, not enough SFX, too many safety rules, not enough communication
    • Artifacts
      • A poster provides a few, while a movie contains many for the discerning eye
      • Besides the artifacts displayed on screen, many are revealed in the narratives actors share with each other off screen during lunch hours, breaks, commutes
      • What might we further learn by closely examining artifacts? For instance, what’s the meaning behind…
        • Leia’s new hairdo (a new safety compliance policy)?
        • The short, funny-looking alien standing next to R2-D2 and C-3PO (a safety watcher?)
        • Why Han Solo can’t be separated from his blaster (just following a PPE rule)?
        • Rey using some kind of staff weapon, perhaps similar to the staffs used by General Grievous’ body guards in Episode III (is it SOP certified)?
        • The star destroyers eliminating the control towers which caused them so many problems in the Battle of Endor (implementation of an accident investigation recommendation)?
        • Improvement in the X-Wing fighters (a safety by design change after an action review with surviving pilots?)

If you’d like to share your thoughts and comments, I suggest entering them at the safetydifferently.com website.

Season’s greetings, everyone! And may the Force be with you.

Do you lead from the Past or lead to the Future?


Recently Loren Murray, Head of Safety for Pacific Brands in Australia, penned a thought-provoking blog on the default future, a concept from the book ‘The Three Laws of Performance’. I came across the book a few years ago and digested it from a leader-effectiveness standpoint. Loren does a nice job applying it to a safety perspective.

“During my career I noticed that safety professionals (and this included myself) have a familiar box of tricks. We complete risk assessments, enshrine what we learn into a procedure or SOP, train on it, set rules and consequences, ‘consult’ via toolboxes or committees and then observe or audit.

When something untoward happens we stop, reflect and somehow end up with our hands back in the same box of tricks writing more procedures, delivering more training (mostly on what people already know), complete more audits and ensure the rules are better enforced….harder, meaner, faster. The default future described in The Three Laws of Performance looked a lot like what I just described!

What is the default future? We like to think our future is untold, that whatever we envision for our future can happen….However for most of us and the organisations we work for, this isn’t the case. To illustrate. You get bitten by a dog when you are a child. You decide dogs are unsafe. You become an adult, have kids and they want a dog. Because of your experiences in the past it is unlikely you will get a dog for your kids. The future isn’t new or untold it’s more of the past. Or in a phrase, the past becomes our future. This is the ‘default future’.

Take a moment to consider this. It’s pretty powerful stuff with implications personally and organisationally. What you decide in the past will ultimately become your future.

How does this affect how we practice safety? Consider our trusty box of tricks. I spent years learning the irrefutable logic of things like the safety triangle and iceberg theory. How many times have I heard about DuPont’s safety journey? Or the powerful imagery of zero harm. The undeniable importance of ‘strong and visible’ leadership (whatever that means) which breeds catch phrases like safety is ‘priority number one’.

These views are the ‘agreement reality’ of my profession. These agreements have been in place for decades. I learnt them at school, they were confirmed by my mentors, and given credibility by our regulators and schooling system. Some of the most important companies in Australia espouse it, our academics teach it, students devote years to learning it, workers expect it…. Our collective safety PAST is really powerful.”

 
Loren’s blog caused me to reflect on the 3 laws and how they might be applied in a complexity-based safety approach. Let’s see how they can help us learn so that we don’t keep on repeating the past.
First Law of Performance
“How people perform correlates to how situations occur to them.”
It’s pretty clear that the paradigms which dominate current safety thinking view people as error-prone or as problem creators working in idealistic technological systems, structures, and processes. Perplexed managers get into a “fix-it” mode by recalling what worked in the past and assuming that is the solution going forward. This law is about being mindful of perception blindness and opening both eyes.
Second Law of Performance
“How a situation occurs arises in language.”

As evidence-based safety analysts, we need to hear the language and capture the conversations. One way is the Narrative approach where data is collected in the form of stories. We may even go beyond words and collect pictures, voice recordings, water cooler snippets, grapevine rumours, etc. When we see everything as a collective, we can discover themes and patterns emerging. These findings could be the keys that lead to an “invented” future.

Third Law of Performance
“Future-based language transforms how situations occur to people.”

Here are some possible yet practical shifts you can start with right now:

  • Let’s talk less about inspecting to catch people doing the wrong things and talk more about Safety-II; i.e., focusing on doing what’s right.
  • Let’s talk less about work-as-imagined deviations and more about work-as-done adjustments; i.e., less blaming and more appreciating and learning how people adjust performance when faced with varying, unexpected conditions.
  • Let’s talk less about past accident statistics and injury reporting systems and talk more about sensing networks that trigger anticipatory awareness of non-predictable negative events.
  • Let’s talk less about some idealistic Future state vision we hope to achieve linearly in a few years and talk more about staying in the Present, doing more proactive listening, and responding to the patterns that emerge in the Now.
  • And one more…let’s talk less about being reductionists (breaking down a social-technical system into its parts) and talk more about being holistic and understanding how parts (human, machines, ideas, etc.) relate, interact, and adapt together in a complex work environment.

The “invented” future conceivably may be one that is unknowable and unimaginable today but will emerge with future-based conversations.

What are you doing as a leader today? Leading workers to the default future or leading them to an invented Future?

Click here to read Loren’s entire blog posting.

When thinking of Safety, think of coffee aroma

Safety has always been a hard sell to management and to front-line workers because, as Karl Weick put forward, Safety is a dynamic non-event. Non-events are taken for granted. When people see nothing, they presume that nothing is happening and that nothing will continue to happen if they continue to act as before.

I’m now looking at Safety from a complexity science perspective as something that emerges when system agents interact. An example is aroma emerging when hot water interacts with dry coffee grounds. Emergence is a real-world phenomenon that Systems Thinking does not address.

Safety-I and Safety-II do not create safety but provide the conditions for Safety to dynamically emerge. But as a non-event, it’s invisible and people see nothing. Just as safety can emerge, so can danger as an invisible non-event. What we see is failure (e.g., accident, injury, fatality) when the tipping point is reached. We can also reach a tipping point when we do too much of a good thing. Safety rules are valuable, but if a worker is overwhelmed by too many, danger in the form of confusion and distraction can emerge.

I see great promise in advancing the Safety-II paradigm to understand what are the right things people should be doing under varying conditions to enable safety to emerge.

For further insights into Safety-II, I suggest reading Steven Shorrock’s posting What Safety-II isn’t on Safetydifferently.com. Below are my additional comments under each point made by Steven with a tie to complexity science. Thanks, Steven.

Safety-II isn’t about looking only at success or the positive
Looking at the whole distribution and all possible outcomes means recognizing there is a linear Gaussian and a non-linear Pareto world. The latter is where Black Swans and natural disasters unexpectedly emerge.
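
A small numerical illustration of the difference, with arbitrary parameters chosen only to show the shape of the two worlds: under a thin-tailed Gaussian, an outcome ten times the typical size is essentially impossible, while under a heavy-tailed Pareto it happens often enough to matter.

```python
# Minimal sketch contrasting a "linear Gaussian" world with a "non-linear Pareto" one.
# The parameters are illustrative, not fitted to any real safety data; the point is
# how much more probability the heavy tail gives to extreme, Black-Swan-like outcomes.
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

gaussian = rng.normal(loc=1.0, scale=1.0, size=n)   # thin-tailed world
pareto = rng.pareto(a=1.5, size=n) + 1.0            # heavy-tailed world, mean ~3

threshold = 10.0  # an "extreme event" ten times the typical outcome
print("P(extreme) under Gaussian:", np.mean(gaussian > threshold))
print("P(extreme) under Pareto:  ", np.mean(pareto > threshold))
# The Gaussian probability is effectively zero; the Pareto one is not.
```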

Safety-II isn’t a fad
Not all Safety-I foundations are based on science. As Fred Manuele has proven, Heinrich’s Law is a myth. John Burnham’s book Accident Prone offers a historical rise and fall of the accident-proneness concept. We could call them fads, but it’s difficult to do so since they have been blindly accepted for so long.

This year marks the 30th anniversary of the Santa Fe Institute, where Complexity science was born. At the May 2012 Resilience Lab I attended, Erik Hollnagel and Richard Cook introduced the RMLA elements of Resilience engineering: Respond, Monitor, Learn, Anticipate. They fit with Cognitive-Edge’s complexity view of Resilience: Fast recovery (R), Rapid exploitation (M, L), Early detection (A). This alignment has led to one way to operationalize Safety-II.

Safety-II isn’t ‘just theory’
As a pragmatist, I tend not to use the word “theory” in my conversations. Praxis is more important to me than spewing theoretical ideas. When dealing with complexity, the traditional Scientific Method doesn’t work. It’s neither deductive nor inductive reasoning but abductive: the logic of hunches based on past experiences and making sense of the real world.

Safety-II isn’t the end of Safety-I
The focus of Safety-I is on robust rules, processes, systems, equipment, materials, etc. to prevent a failure from occurring. Nothing wrong with that. Safety-II asks what can we do to recover when failure does occur plus what can we do to anticipate when failure might happen.

Resilience can be more than just bouncing back. Why return to the same place only to be hit again? Early exploitation means finding a better place to bounce to. We call it “swarming” or Serendipity if an opportunity unexpectedly arises.

Safety-II isn’t about ‘best practice’
“Best” practice does exist, but only in the Obvious domain of the Cynefin Framework. It’s the domain of intuition, of the “thinking fast” in Daniel Kahneman’s book Thinking, Fast and Slow. What’s the caveat with best practices? There’s no feedback loop, so people just carry on as they did before. Some best practices become good habits. On the other hand, danger can emerge from the bad ones and one will drift into failure.

Safety-II and Resilience are about catching yourself before drifting into failure: being alert to detect weak signals (e.g., surprising behaviours, strange noises, unsettling rumours) and having physical systems and people networks in place to trigger anticipatory awareness.
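
One possible way to sketch such a trigger, assuming stories arrive as a stream and each is signified with a rough concern level (the window size and threshold below are arbitrary illustrations, not recommendations):

```python
# Minimal sketch of a weak-signal monitor. Assumes stories arrive as a stream and
# each is signified with a simple concern level from 0 (routine) to 3 (unsettling).
from collections import deque

WINDOW = 20          # look at the last 20 stories
THRESHOLD = 0.25     # alert if >25% of recent stories carry a strong concern signal

recent = deque(maxlen=WINDOW)

def ingest(concern_level: int) -> bool:
    """Add one story's concern level; return True if an anticipatory alert should fire."""
    recent.append(concern_level)
    strong = sum(1 for c in recent if c >= 2)
    return len(recent) == WINDOW and strong / WINDOW > THRESHOLD

# Hypothetical stream: mostly routine stories with a growing run of concerning ones.
stream = [0, 1, 0, 0, 2, 0, 1, 0, 0, 0, 2, 2, 0, 3, 2, 0, 2, 3, 2, 2, 3, 2]
for i, level in enumerate(stream):
    if ingest(level):
        print(f"Anticipatory alert after story {i + 1}: weak signals are clustering")
        break
```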

Safety-II isn’t what ‘we already do’
“Oh, yes, we already do that!” is typically expressed by an expert. It might be a company’s line manager or a safety professional. There’s minimal value in challenging the response. You could execute an “expert entrainment breaking” strategy. The preferred alternative? Follow what John Kay describes in his book Obliquity: Why Our Goals Are Best Achieved Indirectly.

Don’t even start by saying “Safety-II”. Begin by gathering stories and making sense of how things get done and why things are done a particular way. Note the stories about doing things the right way. Chances are pretty high most stories will be around Safety-I. There’s your data, your evidence that either validates or disproves “we already do”. Tough for an expert to refute.

Safety-II isn’t ‘them and us’
It’s not them/us, nor either/or, but both/and.  Safety-I+Safety-II. It’s Robustness + Resilience together.  We want to analyze all of the data available, when things go wrong and when things go right.

The evolution of safety can be characterized by a series of overlapping life cycle paradigms. The first paradigm was Scientific Management followed by the rise of Systems Thinking in the 1980s. Today Cognition & Complexity are at the forefront. By honouring the Past, we learn in the Present. We keep the best things from the previous paradigms and let go of the proven myths and fallacies.

Safety-II isn’t just about safety
Drinking a cup of coffee should be a total experience, not just the tasting of the liquid. It includes smelling the aroma, seeing the Barista’s carefully crafted cream design, hearing the first slurp (okay, I confess). Safety should also be a total experience.

Safety can emerge from efficient as well as effective conditions.  Experienced workers know that a well-oiled, smoothly running machine is low risk and safe. However, they constantly monitor by watching gauges, listening for strange noises, and so on. These are efficient conditions – known minimums, maximums, and optimums that enable safety to emerge. We do things right.

When conditions involve unknowns, unknowables, and unimaginables, the shift is to effectiveness. We do the right things. But what are these right things?

It’s about being in the emerging Present and not worrying about some distant idealistic Future. It’s about engaging the entire workforce (i.e., wisdom of crowds) so no hard selling or buying-in is necessary.  It’s about introducing catalysts to reveal new work patterns.  It’s about conducting small “safe-to-fail” experiments to  shift the safety culture. It’s about the quick implementation of safety solutions that people want now.

Signing off and heading to Starbucks.

Safety-I + Safety-II

At a conference hosted on July 03, Dave Snowden and Erik Hollnagel shared their thoughts about safety. Dave’s retrospective on their meeting is captured in his blog posting. Over the next few blogs I’ll be adding my reflections as a co-developer of Cognitive-Edge’s Creating and Leading a Resilient Safety Culture course.

Erik introduced Safety-II to the audience, a concept based on an understanding of what work actually is rather than what it is imagined to be. It involves placing more focus on the everyday events when things go right rather than on the errors, incidents, and accidents when things go wrong. Today’s dominant safety paradigm is based on the “Theory of Error”. While Safety-I thinking has advanced safety tremendously, its effectiveness is waning and is now on the downside of the S-curve. Erik’s message is that we need to escape and move to a different view based on the “Theory of Action”.

Erik isn’t alone. Sidney Dekker’s latest presentation on the history of safety reinforces how little safety thinking has changed and how we are plateauing. Current programs such as Hearts & Minds continue to assume people have physical, mental, and moral shortcomings, just as was done back in the early 1900s.

Dave spoke about Resilience and why it is critical, as it’s in the outliers where you find threat and opportunity. In our CE safety course, we refer to the Safety-I events that help prevent things from going wrong as Robustness. This isn’t an Either/Or situation but a Both/And. You need both Robustness + Resilience.

As a young electrical utility engineer, the creator of work-as-imagined, I really wanted feedback but struggled to obtain it. It wasn’t until I developed a rapport with the workers that I was able to close the feedback loop and become a better designer. Looking back, I realize how fortunate I was since the crews were in close proximity and exchanges were eye-to-eye.

During these debriefs I probably learned more from the “work-as-done” stories. I was told changes were necessary due to something that I had initially missed or overlooked. But more often it was due to an unforeseen situation in the field, such as a sudden shift in weather or unexpected interference from other workers at the job site. Crews would make multiple small adjustments to accommodate varying conditions without fuss, bother, and, okay, the occasional swear word.

I didn’t know it then but I know now: these were the adjustments one learns to anticipate in a complex adaptive system. It was also Safety-II and Resilience in action, experienced in the form of narratives (aka stories).

A pathetic safety ritual endlessly recycled

Dave Johnson is Associate Publisher and Chief Editor of ISHN, a monthly trade publication targeting key safety, health and industrial hygiene buying influencers at manufacturing facilities of all sizes.  In his July 09 blog (reprinted below), he laments how the C-suite continues to take a reactive rather than proactive approach to safety. Here’s a reposting of my comments.

Let’s help the CEOs change the pathetic ritual

Dave: Your last paragraph says it all. We need to change the ritual. The question is not why or what, but how. One way is to threaten CEOs with huge personal fines or jail time. For instance, in New Zealand a new Health and Safety at Work Act is anticipated to be passed in 2014. The new law will frame duties around a “person conducting a business or undertaking”, or “PCBU”. The Bill as currently drafted does not neatly define “PCBU”, but the concept would appear to cover employers, principals, directors, even suppliers; that is, people at the top. A tiered penalty regime under the new Act could see a maximum penalty of $3 million for a body corporate and $600,000 and/or 5 years’ imprisonment for an individual. Being thrown into jail due to unsafe behaviour by a contractor’s new employee whom you’ve never met would certainly get your attention.

But we know the pattern: initially CEOs will order more compliance training, more inspections, more safety rules. Checkers will be checking checkers. After a few months of no injuries, everyone will relax and, as Sidney Dekker cautioned, complacency will set in and the organization will drift into failure. Another way is to provide CEOs with early detection tools with real-time capability. Too often we read comments in an accident report like “I felt something ominous was about to happen” or “I told them but nobody seemed to listen.”

CEOs need to be among the first, not the last, to hear about a potential hazard that has been identified but is not being addressed. We now have the technology to allow an organization to collect stories from the front line and immediately convert them to data points which can be visually displayed. Let’s give CEOs and higher-ups the ability to walk the talk. In addition, we can apply a complexity-based approach where traditional RCA investigative methods are limited. Specifically, we need to go “below the water line” when dealing with safety culture issues to understand why rituals persist.

Gary Wong
July 16, 2014

G.M.’s CEO is the latest executive to see the light

By Dave Johnson July 9, 2014

Wednesday, June 11, 2014, at the bottom right-hand corner of the section “Business Day” in The New York Times, is a boxed photograph of General Motors’ chief executive Mary T. Barra. The headline: “G.M. Chief Pledges A Commitment to Safety.”

Nothing against Ms. Barra. I’m sure she is sincere and determined in making her pledge. But I just shook my head when I saw this little “sidebar” box and the headline. Once again, we are treated to a CEO committing to safety after disaster strikes, innocent people are killed (so far G.M. has tied 13 deaths and 54 accidents to the defective ignition switch), and a corporation’s reputation is dragged through the media mud. The caption of Ms. Barra’s pic says it all: “…Mary T. Barra told shareholders that the company was making major changes after an investigation of its recall of defective small cars.”

Why do the commitments, the pledges and the changes come down from on high almost invariably after the fact?

You can talk all you want about the need to be proactive about safety, and safety experts have done just that for 20 or 30 or more years. Where has it gotten us, or more precisely, what impact has it had on the corporate world?

Talk all you want
Talk all you want about senior leaders of corporations needing to take an active leadership role in safety. Again, safety experts have lectured and written articles and books about safety leadership for decades. Sorry, but I can’t conjure the picture of most execs reading safety periodical articles and books. I know top organization leaders have stressful jobs with all sorts of pressures and competing demands. But I have a hard time picturing a CEO carving out reading time for a safety book in the evening. Indeed a few exist; former Alcoa CEO Paul O’Neill is the shining example. But they are the exceptions that prove the rule. The National Safety Council’s Campbell Institute of world class safety organizations and CEOs who “get it” are the exceptions, too, I’d assert.

And what is the rule? As a rule, proven again and again ad nauseam, top leaders of large corporations only really get into safety when they’re forced into a reactive mode. For the sake of share price and investor confidence, they speak out to clean up a reputational mess brought about by a widely publicized safety tragedy. Two space shuttles explode. Refineries blow up. Mines cave in. The incident doesn’t have to involve multiple fatalities and damning press coverage. I’ve talked with and listened to more than one plant manager or senior organization leader forced to make that terrible phone call to the family of a worker killed on the job, and who attended the funeral. The same declaration is stressed time and again: “Never again. Never again am I going to be put in the position of going through that emotional trauma. Business school never prepared me for that.”

“In her speech to shareholders, Ms. Barra apologized again to accident victims and their families, and vowed to improve the company’s commitment to safety,” reported The New York Times. “Nothing is more important than the safety of our customers,” she said. “Absolutely nothing.”

Oh really? What about the safety of G.M.’s workers? Oh yes, it’s customers who drive sales and profits, not line workers. This is cold business reality. Who did G.M.’s CEO want to get her safety message across to? She spoke at G.M.’s annual shareholder meeting in Detroit. Shareholders’ confidence needed shoring up. So you have the tough talk, the very infrequent public talk, about safety.

Preaching to the choir
I’ve just returned from the American Society of Safety Engineers annual professional development conference in Orlando. There was a raft of talks on safety leadership, what senior leaders can and should do to get actively involved in safety. There were presentations on the competitive edge safety can give companies. If an operation is run safely, there are fewer absences, better morale, good teamwork, workers watching out for each other, cohesiveness, strong productivity and quality and brand reputations. The classic counter-argument to the business case was also made: safety is an ethical and moral imperative, pure and simple.

But who’s listening to this sound advice and so-called thought leadership? As NIOSH Director Dr. John Howard pointed out in his talk, the ASSE audience, as with any safety conference audience, consists of the true believers who need no convincing. How many MBAs are in the audience?

Too often the moral high ground is swamped by the short-term, quarter-by-quarter financials that CEOs live or die by. Chalk it up to human nature, perhaps. Superior safety performance, as BST’s CEO Colin Duncan said at ASSE, results in nil outcomes. Nothing happens. CEOs are not educated to give thought and energy to outcomes that amount to nothing. So safety is invisible on corner office radar screens until a shock outcome does surface. Then come the regrets, the “if only I had known,” the internal investigation, the blunt, critical findings, the mea culpas, the “never again,” the pledge, the commitment, the vow, the tough talk.

There’s that saying, “Those who do not learn from history are bound to repeat it.” Sadly, and to me infuriatingly, a long history of safety tragedies has not proven to be much of a learning experience for many corporate leaders. “Ah, that won’t happen to us. Our (injury) numbers are far above average.” Still, you won’t have to wait long for the next safety apology to come out of mahogany row. It’s a pathetic ritual endlessly recycled.