Archive | Organizational Change

Lessons from an Oil Rig for Change and Leadership

11 Apr

A recent opinion article (full article here) highlights the need for safety culture to address human needs. Oil rigs are incredibly high-risk environments where decision making, experience, expertise, procedures and leadership are essential to staying safe and completing tasks effectively. However, the article identifies a trap people and organisations can easily fall into when confronted with high-risk environments: the sensation that greater levels of procedure, micromanagement and bureaucracy reduce mistakes, errors and accidents. This trap contains the implicit assumption that narrowing the capacity for people to exercise judgement improves performance. However, this approach has negative consequences for safety, leadership and managing change, whilst building up potentially much bigger problems for the future (see Taleb, 2012, for a good overview of this concept).

The article attends to how overuse of procedures and technology dulls the attentiveness of human operators. The author contrasts two types of oil rig work to make the point:

“Rig roughnecks and roustabouts repeat the same procedure over again for 12 hours straight without mistake, partly because the type of work has enough mix of eye, hand, foot and body movement to keep their mind occupied”

Compared to

“…systems operators no longer have a chance to patrol the plant; they must sit still in the same spot, staring into computer screens for hours at a time.”

The first description provides an opportunity for frontline workers to develop sense making of complex, non-routine situations, situational awareness and role expertise. These types of role create tacit skills, as frontline workers develop fine-tuned heuristic methods of problem solving.

The second description represents distance between the human worker and their environment. This situation creates the opposite conditions for developing expertise. The human worker is cut off from the high-quality environmental feedback stemming from their decisions. This separates the worker from the fine discriminations available from being directly on the frontline, leaving them with nothing to do but follow abstract procedure.

When an organisation introduces systems which potentially wash out expertise, it also dulls the organisation to small deviations in routines which could lead to large consequences, along with the methods to tackle these deviations in their infancy. It’s similar to beautifully redecorating a building whilst removing all the fire detectors and fire extinguishers. Things look a lot better to the casual observer, but are in fact significantly more dangerous. The improvements are superficial and aesthetic.

The introduction of systems which separate workers from the opportunity to develop expertise also reduces improvisation and innovation, and this has significant consequences for leadership. All procedures and routines will hit their limit. When they do, they often require intuitive leadership: people who use experience and expertise to take control of a situation, and who have the confidence and mandate to improvise when necessary. When procedures take precedence, problems migrate up the hierarchy. This leaves frontline workers waiting for instructions from someone separated from a developing situation, and robs them of the opportunity to develop and practise leadership.

The article continues:

“Leaders know that people want a sense of control of their work environment. Workers don’t want top down edicts telling them how they MUST conduct their work, to be announced by a memo stuck on a notice board. Work procedures must be constantly questioned, reviewed and modified”.

If leaders are looking to effectively manage change and build an adaptive (and resilient) organisation, then the experience and expertise of frontline workers need to play a central role. Change works most effectively when it supports and enhances what people do well, rather than washing it out with top-down, abstract systems and structures.

To achieve this, frontline workers need to operate in conditions which allow them to develop meaningful expertise (see above), and this expertise then needs to be collected, analysed and used to direct change, and to constantly adapt, review and modify procedures. This places frontline experience at the heart of change. And this type of change provides people with clear meaning and significance: it’s their skills and experience which are driving the direction.

These arguments do not call for the abandonment of procedures; procedures have enormous value. Nor do they call for constant disruptive change in which nothing can get done effectively. Instead, they call for natural human strengths to be supported and enhanced by placing frontline workers in contact with the consequences of their decisions, allowing them to develop expertise, and giving them the authority to lead and improvise.

If expertise is collected and analysed in a way which doesn’t increase workload (beyond the minimal), then it can be used to modify, support and enhance procedures, systems and structures. And this leads to greater levels of safety, innovation and motivation. This way, the organisation is shaped and changed by experience, not abstract theory masquerading as rigorous micromanagement.


Taleb, N. N. (2012) Antifragile: Things That Gain from Disorder. New York: Random House.

Change and Resilience

15 Mar

How could an organisation improve its resilience? Its safety record? Its ability to manage uncertainty and identify risks? The temptation is to head into a change programme, adopting an external model based on success elsewhere, or creating a plan largely defined by senior management expectations and top-down in nature. Although these approaches can work, they also carry risks.

Continue reading

Case Study: When Experts Assume Too Much

11 Feb

Another one of our projects written up as a case study. The project illustrated how success, coupled with routine and familiarity, can sometimes increase a team’s vulnerability to sudden change…

When a team performs well, communication frequently acts as the glue which binds the performance. Members of a top-performing team understand how each other make sense of situations, what words and phrases mean within a context, and pass on instructions and intent clearly. When a team member tells a colleague that the risks in a certain procedure are “significant”, they both understand what degree of risk “significant” means in relation to the procedure. There are no mismatched assumptions and no second guessing. But problems occur when a regular team member leaves or is sick, and is replaced with someone new to the team. Communication can no longer be taken for granted, but it frequently is in these situations. Assuming someone new will understand the intent behind communication in the same way as someone who has been a team member for two years can be costly at best and catastrophic at worst. Inside two days we were able to provide a solution which ensured that new and temporary staff working in the NHS could pick up the intent of a well-established team immediately.

Continue reading

Transforming Expertise

28 Jan

McKinsey and Company recently published an interesting article entitled “Transforming Expert Organizations”. The article identified an interesting “expertise paradox”: expertise builds up within an organization and is highly effective and transformational, but then becomes increasingly difficult for outsiders to understand and access. The authors of the article (Bollard et al, 2016) refer to this as an “expertise silo” and describe the concept below

Continue reading

Frozen Decisions

7 Jan

When we read a report, analyse data or examine a survey, how long is it valid as a basis for decision making? Five months? Five weeks? Of course, there isn’t a right answer. Every data source requires a degree of vigilance as time and activity chip away at its relevance. I’ll share an example below where a report I wrote for the purposes of decision making was invalid the moment it was printed.

Continue reading

Doing More With Less

28 Oct

Whenever you read a government strategy or an economic outlook for companies across the West, there is a consistent underlying theme: we need to do more with less. This is particularly the case with health and social care within the UK, where competing priorities and “doing more with less” are leading to increased pressures on service delivery and the search for new ways of working. Any opportunities which may emerge from examining and testing how to do more with less will first be met with inevitable resistance, as the natural human process of loss aversion takes hold. Then there is the flip side of loss aversion, over-optimism, as a strategic idea is used to explain away contradictions, obstacles and difficulties; this can lead to the view that doing more with less is going to be far easier than the reality will allow. I’m going to take a brief look at a potential method of tackling loss aversion and over-optimism whilst aiming to boost the improvisation required to “do more with less”.

Kahneman and Tversky (1992) identified a human bias toward avoiding losses as opposed to securing gains. This means that when a person has something removed from them (or faces the threat of removal), they will feel the loss, and fight harder to avoid it, more than they would an equivalent gain. Therefore, the mere notion of “doing more with less” invokes a strong psychological reaction. The optimism bias is associated with the more abstract task of strategic planning (see Kahneman and Tversky, 1979, for the original research). The designers of plans have a potential bias to overestimate their ability (and the ability of their available resources) to complete a task, whilst underestimating the difficulties, both anticipated and unanticipated, which could derail it.
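For readers who want the shape of this bias in concrete terms, the asymmetry can be sketched with the prospect theory value function. This is a minimal illustration, not a calibrated model; the parameter values are the median estimates published in the 1992 paper, and the function name is my own:

```python
def prospect_value(x, alpha=0.88, beta=0.88, loss_aversion=2.25):
    """Subjective value of a gain or loss x, relative to a reference point.

    Parameters are the Tversky & Kahneman (1992) median estimates:
    gains and losses are valued with diminishing sensitivity (alpha, beta < 1),
    and losses are scaled by a loss-aversion factor (lambda > 1).
    """
    if x >= 0:
        return x ** alpha                      # gains: concave valuation
    return -loss_aversion * (-x) ** beta       # losses: steeper, so they loom larger

# Losing 100 "feels" more than twice as bad as gaining 100 feels good:
gain = prospect_value(100)    # roughly 57.5
loss = prospect_value(-100)   # roughly -129.5
```

This is why "doing more with less" meets resistance out of proportion to the resources actually removed: the felt loss is weighted far more heavily than an equivalent gain would be.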

As a result, human beings are demonstrably poor at forecasting (Tetlock and Gardner, 2015), consistently and significantly overweighting or underweighting the impact of future events. This can trap people into a cycle of emotional (as opposed to intuitive) decision making, predicting that “less” will be relatively easy to cope with, or that “less” will be impossibly difficult. The reality may exist in a large space in between, and not remain in that space for very long.

Kahneman and Klein (2009) observe that forecasting improves if the future outcome is already known. In the absence of knowing the future, the next best solution is to imagine one. However, the imagined future must take the form of a catastrophe. For example, imagine you have been asked to do more with less, and it is now six months into the future. The request has resulted in a complete catastrophe. Now take five minutes to write down the history of that catastrophe. Then ask: what could you do? What links could be made? What software or online resources could be leveraged? Which partners or colleagues could help? And what would you need to let go of? This exercise enables leverage points and cross-links to be located. It also separates what is realistically less from what is impossibly less.

The above can be carried out in teams and groups, and findings should be shared to highlight potential innovation and high risks. The exercise enables people to spot previously hidden risks and previously hidden opportunities, and it also shares a lot of learning and ideas. This leaves a question: why focus on catastrophe, a worst-case scenario? Worst-case scenarios are far from the average experience; as a result, people draw from their most testing previous experiences and the reservoirs of their knowledge. The result should highlight both experience and creativity.


Kahneman, D. & Tversky, A. (1992). “Advances in prospect theory: Cumulative representation of uncertainty”. Journal of Risk and Uncertainty, 5(4), 297–323.

Tetlock, P. E. & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. New York: Crown.

Kahneman, D. & Klein, G. (2009). “Conditions for intuitive expertise: A failure to disagree”. American Psychologist, 64(6), 515–526.

Kahneman, D. & Tversky, A. (1979). “Intuitive prediction: Biases and corrective procedures”. TIMS Studies in Management Science, 12, 313–327.

Working On and Off the Edge

9 Jul

“You don’t turn a crew onto a score and not expect any heat”. These are the words, more or less, of the Grand Theft Auto anti-hero Trevor Philips, as he shares his wisdom on decision making, risk and uncertainty. Trevor, an experienced bank robber, hijacker and drug dealer (amongst other things), is reflecting on the consequences of making dramatic decisions which carry huge amounts of risk and lead to a very uncertain future. In Trevor’s world, the upside of decisions is closely intertwined with the downside: if you want the rewards, you personally have to endure the risks; you have to be prepared to take the heat. Being exposed to heat (in Trevor’s context, the inevitable FBI investigation) is a drastic lesson in self-awareness; if it’s you having to deal personally with the consequences of your own decisions, then you’re going to be very focused on the quality of your plans and your ability to cope, if and when the heat arrives. This sharp and immediate connection between decision and action is “working on the edge”.

By contrast, in organisational life, decisions can be implemented with significant distance between those who make decisions and those who carry them out, and this includes the consequences. In other words, not everyone who makes the decisions gets to feel the heat. This lack of connection between decision and consequence can have a number of negative effects. The ability to adapt becomes blunted, plans become too abstract, communication slows, and certain types of culture can edit feedback and erase bad news stories. It’s a little like trying to explain what a landscape looks like to someone on the end of a phone. The listener gets a general image of the landscape, but they are essentially matching the description to the nearest available analogue from memory, thinking “it sounds like landscape B, which I saw last year”. Without actually seeing the landscape, they are robbed of the subtle and nuanced detail. This detail might be vital feedback, but if the person describing the landscape misses some of it out, for whatever reason, then feedback is compromised. The describer could be describing the landscape they feel the listener wants to hear, they could be focused only on the detail which has important meaning to them, or they could be editing the scene in a variety of different ways and for a variety of different reasons. When there is distance between the planner and the implementer, the same limits are in place as those experienced by our landscape describer and listener. This is “working off the edge”. Here is a quote from a senior manager who was on the receiving end of a change management strategy which was designed off the edge:

“It will never work (the strategy) because it has no bearing on how things actually work in this department. It’s been designed by the executive like you’re pushing around shapes and thinking: this could go here, and then we can put this one in there. Our director knows this, but the executive went ahead and said this is how things are going to be, and too many important people have too much of a stake in looking competent to the board, so we’ve been told we’re stuck with it”.

Examining people and populations who work on the edge can reveal both interesting and useful lessons for anyone who manages uncertainty and change. The people who work on the edge have stories about their experiences which reveal the risks of a closed mind and the benefits of adaption. Because these people have personally experienced the downside of their decisions, they are able to provide lessons which can be used to bring decision making closer to the edge. So who works on the edge, in addition to Trevor Philips? Start-up founders work on the edge, as do extreme athletes, branches of the military and police, areas of medicine and health, and explorers. Below are two articles written by the explorer Cathy O’Dowd which have been featured on the Echo Breaker blog. In these articles are examples of how luck can be mistaken for skill, how examining worst-case scenarios can be used to generate adaptive strategies and, significantly, how making some decisions must involve an acknowledgement and acceptance that the ultimate price could be paid.

It’s not necessary to work on the edge to create an adaptive mind-set. It’s possible to improve our ability to adapt by moving our decisions closer to the edge. We can achieve this by reducing the distance between planners and implementers, and by fully understanding the environment which our decisions affect. But most of all, we should imagine that we are personally affected by the worst consequences of each decision we make; we all need to ask if we could deal with the heat if and when it comes.


Some risks can’t be managed away; if you’re going to make decisions in environments which carry dread risk, you should be prepared for the consequences

The benefits of an adaptive strategy in high-risk and fast-changing environments




The Value of Inarticulate Knowledge

7 Jul

Two people are in a garage. In front of them is a drum kit; one garage, two people (let’s call them person A and person B), and one drum kit. Person A picks up the drum sticks and produces one of the most awesome displays of drumming person B has ever seen, including a trick that person B believes could only be performed by a superhuman. This dialogue follows:

Person B- how did you do that trick?

Person A- the one with both hands?

Person B- Yeah, that one

Person A- I don’t know really, just practice I guess, I made it up

The above exchange is typical when someone tries to find out how someone else performed something which exceeded their expectations. What person A performed was beyond routine, and beyond good. It broke the limits of person B’s technical knowledge; they (B) simply didn’t recognise the skills required to perform “that trick”. Person B was so amazed by what they saw because they didn’t recognise it; it broke their norms and expectations; it was a surprise event. Immediately, person B attempted to adapt by trying to make sense of “that trick”, asking person A the question: how did you do it? However, person B was unable to adapt because person A simply did not know how to describe it; it was a tacit skill acquired through practice.

There is a general dilemma contained above: when human beings are confronted with events which are beyond routine, experience and their current limits, they attempt to adapt their skill and knowledge base to incorporate this “new event”; human beings need to know how to make sense of new events, how to frame them, understand them and improve as a result of them. However, this understanding is frequently bound to be limited, as so much of what we see is tacit, inarticulate knowledge. And this leads us into another set of questions.

Why do we trust some people more than others in a professional environment? What seems to separate the best performers from those who seem to struggle, and how can this gap be closed? These questions are asked quite frequently in organisations, professions and skill-based activities. The questions aren’t necessarily asked or framed in this way, but appraisals, seminars, training and many other regular activities all seek to take the lessons from the “best” and give them to others to use and improve. In some cases it’s obvious what separates the good from the outstanding, but in most cases the interaction between an individual’s reasoning and their environment creates tacit skills which are incredibly difficult to locate, collect and articulate in any meaningful way. This produces a fundamental of the human condition: we frequently know more than we can say. For researchers in my field, and similar fields, the aim is to close the gap between inarticulate and articulate knowledge.

Every piece of research should have some benefit, so what benefit is there in drawing out lessons from people who have adapted to non-routine situations or redefined limits? I’ll start with an example that is probably familiar, almost routine. Imagine a CEO about to carry out a pretty simple piece of change management: take a highly successful team which operates in building A and move the team to building B; seems straightforward. A relatively simple change like this can actually have large consequences if the CEO doesn’t have some basic knowledge of how the team interact with their environment. Is this team successful because they have open-plan offices and so frequently discuss their projects, allowing everyone to pick up hints and tips and absorb tacit skills? Or do they each have their own offices but communal lunch and common room facilities where similar learning can take place? If the CEO only understands that the team are successful, as opposed to why they are successful (and that includes some basic knowledge of how the environment supports this success), a simple act like an office move could have a profound effect if the new work space has some innocent, but vital, differences which influence communication. Sometimes office moves are designed solely by architects and removed executive teams, and this is why tacit skills can be innocently washed out within a couple of weeks.

Another example is being able to turn emotional responses, such as trust, into usable information and ultimately knowledge. I was asked by a construction site manager to increase the efficiency and effectiveness of their teams’ performance. The site manager “virtually” managed specialist teams who operated across six separate sites at any one time. So, the site manager had six teams who were assigned to construction sites, anywhere across the UK, to carry out specialist projects. When the projects were completed, the teams would move on to another project located at another site. For the site manager, this was the problem: three of the teams would work independently, use their initiative, problem solve and move smoothly between projects. By contrast, the other three teams would consistently run into problems, contact the site manager repeatedly about how to fix these problems, seek to consult on every decision, and were causing the site manager “a lot of stress”. An aggravating problem with the latter three teams was that even when things were going well, the site manager couldn’t help but over-manage them with constant phone calls and checks, further cementing a culture of dependency.

When the site manager was explaining the above situation, they talked about which teams they trusted and which teams they didn’t. This is a familiar human method for making sense of other people: carving people into “trust” and “don’t trust” groups. The challenge is to identify what beliefs and behaviours people are recognising which generate these emotional responses. So, what was the site manager recognising which led them to trust someone? Before each site job, a strategy for getting the job done was drawn up by the site manager and the team who would be carrying the project out. This strategy was a sketch, meant as a guideline for carrying out the job. It was at this point the two sets of teams separated in their approach.

When the teams which the site manager trusted arrived on site, they would weigh up the situation and visualise how the strategy could run into trouble based on what they were seeing. The “trusted teams” would then adapt the plan based on perceived problems, develop contingency plans, and arrange extra or alternative resources up front. This kept surprises to a minimum, and even when they did occur, they were far easier to recover from. The “untrusted teams”, alternatively, would start work immediately by implementing the sketched-out strategy. This led to fast and frequent problems, and a total inability to recover from unexpected events; for example, a job would have to close down whilst they waited for alternative resources. The suggestion? Get the “untrusted teams” to adopt a simple, three-stage pre-planning routine when they arrived on site for each new project. This immediately got the teams focused on adaption, rather than frequently asking “what do I do next?” Identifying how people make sense of events, teams and other people through the casual use of everyday words like “trust” can provide simple lessons which accelerate performance; so ask: what are people recognising which makes them trust someone else’s performance?

The eminent psychologist Gerd Gigerenzer (2008) provides a fascinating example which illustrates how adaption can sometimes be achieved by letting go of information, as opposed to gathering more. Gigerenzer wished to test the role of recognition in decision making. The recognition heuristic is based on research which suggests that when human beings are faced with two or more competing choices, they will select the one they recognise the most. Like many heuristics, its potential origin lies with how our ancestors made sense of uncertainty: trust friends over strangers, and so on.

Gigerenzer and his colleagues entered a stock-picking contest in 2000 organised by the magazine Capital. The editor chose 50 internet equities and, over a period of six weeks, each contestant could buy, sell or hold any of these equities to produce a profit. To pick their portfolio, Gigerenzer et al asked 50 Berlin men and women with no professional knowledge of equities which stock names they recognised out of the 50. Choosing the 10 stocks most frequently recognised, the Gigerenzer portfolio was held for the six-week tournament; no changes were made. The competition benchmark was the performance of the editor; at the end of the six weeks the Gigerenzer portfolio was up by 2.5% whilst the benchmark editor portfolio was down 18.5% (see Gigerenzer, 2008, for a full account). What broader conclusions can be drawn?

The novice group selected their stock picks on recognition. They had limited knowledge of stocks, so simply picked the names which the Berliners recognised. The experts used very complex models and analysis. The problem for experts occurs when so much data from previous performance (which the experts had access to) is used to forecast future performance. No one is capable of predicting the future, so it is impossible to tell which data from the past will be relevant to future performance, and this causes errors to be factored into the analysis. In other words, the experts simply knew too much, and name recognition proved a far more reliable method for anticipating future performance. This finding has been repeated across multiple domains, and Gigerenzer himself invested a significant amount of money in a “novice portfolio”, making substantial gains.
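The selection rule itself is almost trivially simple, which is the point. Here is a minimal sketch of recognition-based portfolio selection in the spirit of the contest Gigerenzer describes; the survey data and names below are invented for illustration, not taken from the actual study:

```python
from collections import Counter

def recognition_portfolio(surveys, k=10):
    """Pick the k names most frequently recognised across lay survey responses.

    surveys: a list of sets, one per respondent, each containing the
    stock names that respondent recognised.
    """
    counts = Counter(name for recognised in surveys for name in recognised)
    return [name for name, _ in counts.most_common(k)]

# Hypothetical responses from three lay respondents:
surveys = [
    {"Alpha", "Beta", "Gamma"},
    {"Alpha", "Gamma"},
    {"Alpha", "Delta"},
]
print(recognition_portfolio(surveys, k=2))  # ['Alpha', 'Gamma']
```

No price history, no forecasting model: the only input is whether lay people have heard of a name, and the portfolio is then simply held unchanged.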

The lesson is that complexity can frequently derail intuition, and can blunt adaption to fast-changing events. The point is that the Gigerenzer portfolio was selected intuitively, using a more productive method than complex data collection and analysis. This isn’t always the case, of course, but it’s important for an organisation to know what highly effective short cuts exist which can cut down endless, time-consuming and unproductive data collection. The point equally applies to procedures and processes. Procedures and processes should ensure that fundamentals are being attended to (airplane checklists, basic patient information etc.) but should not become so overly complex, bureaucratic and burdensome that fast, effective intuition, such as the recognition heuristic, becomes impossible. This was something my colleagues and I discovered when investigating how experts produced excellent care for patients who had dementia. The experts would use heuristics, experience-based rules of thumb, to provide simple and productive solutions. Too much bureaucracy would not only make these short cuts hard to share but nearly impossible to develop.

In this article I’ve taken a brief look at why it’s worth examining adaptive thinkers. A lot of what we do is tacit: known, but not articulated. Without taking simple steps to uncover tacit knowledge, we can take it for granted until the people we rely on leave, or we can wash it out with seemingly innocent changes, or we can make it impossible to surface with too much complexity, or we can mistake an adaptive mind for an open mind by constantly searching for more data and the gold at the end of the rainbow. Back to the scenario at the start: it’s not just about marvelling at the drumming skills of person A, it’s about giving person B a few tools which they can apply to learn those skills. Below are articles from Echo Breaker which have discussed the role and collection of tacit knowledge.

A quick look at expertise in action

What tacit sense making goes on when we read and respond to emails? This article takes a look at the consequences of making sense of emails

A look at some of the conclusions Echo Breaker research drew toward the end of 2013

Why exploring decision making has value not just for fulfilling research agendas but also for what it tells us about how we interact with, and make sense of our environment

Taking the emotion of trust and turning that into a resource which has use for management, planning and innovation


Gigerenzer, G. (2008). Gut Feelings: Short Cuts to Better Decision Making. Penguin.

Stories, Feedback and Adapting to Change

3 Jul

This article has a simple aim: it attempts to demonstrate the link between the stories we tell ourselves, the role of feedback when we use these stories to make sense of situations, and how it’s not so much the initial story which counts as how adaptable that story is.

When your mind is closed or fixed, it becomes incredibly difficult to generate alternative stories about what could happen next, or in some extreme cases any story at all, and this manifests itself in stress, panic, anxiety and, in some cases, depression. When you are faced with a situation which means something to you, is challenging, new, novel or beyond anything encountered before, generating a plausible story creates action. How adaptive that story is often separates success from failure, expert from novice.

The ability to adapt a story, to absorb new information, is crucial to creating an adaptive mind-set. When Klein et al (1989) researched firefighter decision making, they discovered that when facing a blaze, firefighters, instead of comparing a couple of viable options, immediately generated a plausible story of how to tackle the fire. This story was generated from the firefighter’s extensive reservoir of experiences, allowing them to match a previous story to the current situation. The firefighters experienced this as intuition, “knowing just what to do”, and found it difficult to articulate. Once the plausible story had been generated, they would simulate the actions in their mind, looking for barriers and solutions, subtle cues and patterns which might turn out to present big risks. After the mental simulation they would have an adapted story, customised for the situation right now. As the firefighters tackled the blaze the process would be constantly repeated; the story was tacitly organic; they were prepared to let go of initial assumptions.

Karl Weick and Kathleen Sutcliffe (see 2007) found similar results when they studied smoke jumpers during their work on sense making. The vast amounts of experience firefighters and smoke jumpers have allow them to generate plausible stories which in turn generate the necessary fast actions, but do so in a way which is adaptable. Colleagues and I discovered something similar during a study of dementia specialists and how they meet patient and family needs: excellent patient care was frequently intuitive, as opposed to just the following of procedures. In other words, the frame of mind is not fixed; it adapts to the environment.

Let’s contrast the above with the CEO I talked with in a previous article. To recap, this was the CEO of a large UK organisation (total anonymity was agreed, so no more information than that) who had been forced into sudden and drastic changes by the restructuring of government funding. The CEO didn’t have experience of managing rapid change and, by their own admission, was not doing a good job. What was really causing this executive to reflect, however, was that the funding changes had been signposted long ago. Instead of being proactive and taking the opportunity to make some subtler tweaks, they had simply explained it all away, imagining it would soon blow over and that, in that always lethal phrase, “things would soon get back to normal”. The CEO had been undone by their own sense of stability; years of seemingly stable operating had left both the mind-set and the business model completely unprepared for changes in the environment.

There are many obvious differences between CEOs, smoke jumpers, firefighters and dementia specialists. However, I’d like to focus on one difference in particular: to be successful in firefighting, for example, you HAVE to adapt to the environment, and you have to adapt quickly. Each story is generated with an adaptive frame of mind; each action provides feedback, and this feedback adapts the story further. The stakes are high in firefighting, cues and patterns are regular, and actions can be traced to outcomes; you get to know the environment. The quality of experience and the speed of feedback produce, in environments like firefighting, intuitive decision makers who can quickly generate plausible stories and then keep on adapting them.

Contrast the above with our CEO, where cues and patterns did not HAVE to be acted on immediately, and there was plenty of time to explain changes away. Because of the distance between actions and feedback in the CEO’s world, it is easy to mistake cause and effect and let false assumptions take hold, such as “nothing has happened yet, so things will probably get back to normal soon”. Yet the stakes are still high for a CEO; people can lose their jobs, and economies can potentially collapse when an entire network turns toxic. The key difference between the firefighter and our CEO is the distance between action and feedback. Without a tight loop in place, it is far more difficult to extract lessons about the relationship between decisions and their effect on the environment.

Without the benefit of immediate feedback, which builds the quality and relevance of experience about what works and what doesn’t, life can become a guessing game. A period of success and stability can be taken as a signal that things are going in the right direction. Unlike in firefighting, there is no urgency to focus on the subtle cues or patterns which may signal potential risks; adaptation is not high on the agenda. As an alternative, we could turn again to the Rudolph study (2003) and examine what her adaptive decision makers did.

An adaptive decision maker generates a plausible story which makes sense of the current situation, an expectation of how that situation could unfold, and an action script of what to do next (Klein, 2009). The adaptive decision maker then tests this plausible story, rigorously examining how they could be wrong and searching for weak signals which may have large consequences. If an element of the initial story is contradicted in this search, it gets adapted, not explained away; they create feedback loops.

In conversation with our CEO, I suggested there was nothing inherently wrong with the story “things will get back to normal”; it was a worldview which had been justified over the 20 years they had been in post, and it was plausible. The problem was that the story no longer fitted the environment; its time had passed. The challenge was to assess the reasons why the story no longer fitted. In the gap between the story and the environment lay the feedback which could be used to adapt.

Experiment and tinker with stories, don’t simply dig in and defend them.

The articles below cover the subject of the adaptive mind-set.

What beliefs are generating your plausible story? This article takes a look at some of the basic psychology we use to justify actions, and argues that even though technology has developed beyond comprehension over the past few hundred years, perhaps our methods of making sense of information haven’t.

A look at how people and organisations make sense of, and respond to, risks

High Reliability Organisations (HROs) are organisations which are incredibly adaptive when it comes to recovering from shocks, surprises and unexpected events. I take a brief look at the work of Karl Weick and Kathleen Sutcliffe through the lens of the TV show, Prison Break.

There are few minds more adaptive than Breaking Bad’s Walter White. In this article I take a look at how the writers of the show created the mind of Walter and how you could apply similar methods to improve your own adaptive frame of mind.

A look at the 2015 UK election and another look at the outstanding work of Karl Weick and Kathleen Sutcliffe

There are strong links between breaking stability and increasing innovation, but doing so goes against our nature. Innovation requires disruption, yet we like, and strive for, stability.

The role of trust in decision making and how it affects adaptation. This is a summary of the field work I carried out in two very different environments: a clinical setting and a construction site.

Another article on trust: what does it mean in a work context when we say we trust somebody to get things done? In this article I take a look at some of my field work and draw a link between trusting someone and their ability to adapt and improvise independently.


Rudolph, J. (2003) Into the Big Muddy and Out Again: Error Persistence and Crisis Management in the Operating Room. Dissertation, Boston College.

A summary of the above can be found in:

Klein, G. (2009) Streetlights and Shadows. Bradford Books (see Klein also for plausible stories).

Weick, K. and Sutcliffe, K. (2007) Managing the Unexpected. Jossey-Bass (also for plausible stories).

*If you are wondering what a smoke jumper is, click below

Going into and Coming out of a Closed Mind

23 Jun

In the previous article I discussed two panels from the graphic novel Watchmen. In one panel we see two lovers: happy, carefree, potentially at the peak of their attraction; in the second we see the outcome, a painful break-up. The first panel sets the expectation; the second violates it. The more naïve the initial forecast, the greater the damage when it fails to manifest. In 1922, John Dewey wrote that life is interruptions and recoveries. It follows that if our forecasts and hopes for the future will inevitably be interrupted, then we can improve our recovery from those interruptions by improving our ability to forecast. In the language of the last article, this means avoiding, and recovering from, a fixed frame of mind.

The two panels from Watchmen provide a clue as to how to improve forecasting, how to temper our expectations of the future. Achieving this improves not only “recovery from interruptions” but our ability to improvise, adapt, innovate and develop tacit skills. In the first panel (the two optimistic lovers) we have emotionally charged expectations and hopes for the future. Similar emotions appear at the start of a new project, company or strategy; the outlook is positive, and anything which might challenge this brightly lit horizon is quickly explained away. Unless, of course, we had access to the second panel, the tragic end, a million miles and more from the expectations of the first, and could see the outcome. This would present us with two bookends: the outcome we desired, and the real outcome, a catastrophe. With these two bookends in place we would then need to use our imagination to fill in the history: what events could have brought such high hopes crashing down?

Challenges of all types we had not previously considered might appear, as we are forced to look for potential events and flaws missed during the anticipation stage. We are being forced to look for what we haven’t seen; and what we haven’t seen or anticipated is what hurts us and our plans the most. When we imagine what led to a potential catastrophe, we are forced to plan for resilience, prepare for recovery, and begin improvising and innovating as a means of managing the unexpected.

What I have described above is prospective hindsight, or the pre-mortem exercise (Klein, 2003). Without the ability to see the actual future, we can instead draw ourselves that second panel from Watchmen. We can create a scenario in which the strategy we invested so much in, and feel so positive about, has been a catastrophe. This provides our two bookends: one is our forecast, the other is that forecast lying in ruins. All we need to do is fill in the history and focus on what we are not seeing. This exercise draws us out of potentially fixed frames of mind; it makes us consider what we are prepared for and what we are not, where the opportunities to innovate lie, and what we need to let go of.

The following articles discuss how we get caught up in fixed frames of mind by being positive. This is counterintuitive: asking someone to tone down enthusiasm and think about what could go wrong isn’t always well received in some group and organisational cultures. However, the articles also cover the other direction. People who believe they won’t be able to cope, when persuaded to think about how they might survive if the worst-case scenario came true, are generally surprised by the resources available to them, both physical and mental. This never ceases to amaze me when I run the pre-mortem or similar exercises with organisational groups: a general feeling of helplessness and gloom is quickly replaced by the discovery of exciting opportunities and tactics which had not previously been considered. Why movement, attempting something new, is so essential to personal and organisational success is the topic for a future collection, but this is what you’ll find at the links below.


This first article discusses Wilson and Gilbert’s (2003) work on affective forecasting. It explains how current emotional states influence our expectations of the future. Wilson and colleagues also provide mechanisms for improving forecasting, which we might now say fit under the “second panel” genre.

A slant on the Scottish referendum and the use of worst-case scenarios

The relationship between insight and innovation when events don’t go according to plan

Linking emotion to uncertainty and then to recovery. This was a collaboration with my friend and colleague, Professor Marc Jones, where we used lab results carried out by Marc and his team to look at wider points regarding the management of uncertainty.

An organisational response to unexpected events can lead to a closed frame of mind in decision makers, which produces more procedures and processes; in other words, all effort is focused on avoiding future mistakes. Avoiding mistakes is no bad thing, but placing procedures and processes at the top of the priority list means the frame of reference is doomed to eventually become too narrow. When a non-routine event is encountered in a culture with a fixed frame of mind, the default setting is “follow procedures and processes”. This might avoid blame now, but it can hide a bigger problem for later (see the earlier article on Rudolph’s 2003 study of clinical decision making). If an event doesn’t make sense, there are two broad options. The first is to investigate the limits of the procedures and processes in dealing with the non-routine occurrence and then update them; in other words, treat the event as genuinely different. The second, because the occurrence is “new”, is to consider the event unimportant, a fluke, and then apply the existing procedures and processes. This second approach absorbs the novelty and explains it away; it is not advised, but sometimes a variety of pressures makes it inevitable.

Letting go of a belief we are invested in can be incredibly difficult. This article discusses the benefits of working through that difficulty.


Klein, G. (2003) The Power of Intuition. Currency Books.

Dewey, J. (1922/2002) Human Nature and Conduct. Mineola, NY: Dover.

Moore, A. (1986) Watchmen. DC Comics/Titan.

Rudolph, J. (2003) Into the Big Muddy and Out Again: Error Persistence and Crisis Management in the Operating Room. Dissertation, Boston College.

A summary of the above can be found in:

Klein, G. (2009) Streetlights and Shadows. Bradford Books.