
Apprenticeships and Zen

14 Jun

I recently watched a programme on Japanese culture which touched on the subject of Zen. A Buddhist priest explained that meditation is a vital form of practice: it develops knowledge through self-awareness, and applying this knowledge to mindful activities such as cooking and gardening produces insight and, ultimately, wisdom.

This combination of knowledge and application to develop insight and wisdom is integral to Zen as an ancient system of thought. So it is no surprise that there are strong similarities between how wisdom is acquired in Zen and how it is developed through an ancient, and highly successful, form of education: apprenticeships. I would argue that what makes apprenticeships so successful is the same foundation as Zen: the daily combination of formal knowledge with practice.

The apprenticeship method, like Zen, trains the mind to interact with its environment, assuming less, noticing more and adapting accordingly. Both Zen and apprenticeships aim to harmonise the mind with the environment.

Relying solely on knowledge can have the opposite effect, with the mind attempting to control the environment through the application of abstract theories and procedures. This reduces attention to environmental changes and over-emphasises perceived control.

Taking the mind off the environment and relying on pure knowledge is a major source of organisational error (see Taleb, 2012, for good examples). Developing methods with frequent feedback on the effect of knowledge upon the environment, as in the Zen and apprenticeship models, is an effective way of avoiding these errors, acquiring wisdom and increasing creativity.

Reading

Taleb, N. N. (2012) Antifragile: Things That Gain from Disorder. Random House.


Developing A Challenge Mindset

19 May

Last week I had the pleasure of presenting at the Elevate conference, held at ExCeL, London, along with my colleague Professor Marc Jones and Dr Hannah Macleod, a gold medal winner with the GB women’s hockey team at the Rio Olympics. Our talk was about developing a challenge mindset, and I’ll summarise some of the key points below from my perspective.

Success is an interaction of skills and the environment (with some luck thrown in). Success can make us focus too much on our skills whilst paying little attention to the environment. This results in a belief that our skills operate independently of the environment and/or control it.

If we accept that success is an interaction of skills with the environment, then changes to the environment, no matter how small, can begin to affect the outcome of those skills. If the effects are not initially significant, they can be explained away, reinforcing the over-focus on skills and the continued lack of attention to the environment. This can take a group’s culture from a positive place to a closed, defensive one which resists change and alternative perspectives.

To avoid this situation, befriend negativity. This means that even when performance is going well and the environment is stable, imagine what could go wrong, no matter how big or small, and practice how to deal with those situations. In other words, befriend your worst fears. Doing so reduces aversion when faced with unexpected events, maintains an open mind, and places a mindful focus on the relationship between skills and the environment.

Hannah provided excellent examples of how GB women’s hockey constantly generated “what if” scenarios to plan for unexpected and negative events. The results for Hannah and her team speak for themselves.


Why Focusing On Catastrophe Is So Effective

3 Oct

Gary Klein’s pre-mortem technique has a long and effective history of improving forecasts, plans and decisions (Kahneman, 2011; Klein, 2007). The technique is incredibly simple, as the example below illustrates:

You and your team are about to agree a decision. Before you do, imagine that the decision has turned out to be a complete catastrophe. Everyone, working alone, takes five minutes to write down the history of this catastrophe. Each individual history is then shared with the team.
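
If you want to run the exercise with a distributed team, the protocol translates naturally into a tiny script. The sketch below is only an illustration of the structure that matters (histories are written independently first, then shared all at once); every name and narrative in it is hypothetical.

```python
# Minimal sketch of a remote pre-mortem. All names and narratives are
# hypothetical; the point is the protocol: collect every failure history
# independently *before* any of them are shared.

from dataclasses import dataclass

@dataclass
class FailureHistory:
    author: str
    narrative: str  # "the history of this catastrophe", written alone

def share_histories(histories: list[FailureHistory]) -> None:
    """Reveal every individually written history to the whole team at once,
    so nobody anchors on a colleague's story before writing their own."""
    for h in histories:
        print(f"--- {h.author} ---\n{h.narrative}\n")

histories = [
    FailureHistory("Alice", "The vendor slipped three months and we had no fallback."),
    FailureHistory("Bob", "Nobody owned the data migration; launch day was chaos."),
]
share_histories(histories)
```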

I recently wrote about an interview featured on McKinsey Classic with Gary Klein and the Nobel Laureate Daniel Kahneman. The two psychologists discussed the role of intuition in executive decision making. Naturally, the pre-mortem technique came up as a highly effective method of improving decisions.

The logic behind why the technique works so well has been covered several times on this blog, and extensively across the research and corporate literature. However, Klein’s simple explanation in the McKinsey interview of what lies behind the technique’s success is incredibly insightful, and worth sharing:

“The logic is that instead of showing people that you are smart because you can come up with a good plan, you show you’re smart by thinking of insightful reasons why this project might go south. If you make it part of your corporate culture, then you create an interesting competition: “I want to come up with some possible problem that other people haven’t even thought of.” The whole dynamic changes from trying to avoid anything that might disrupt harmony to trying to surface potential problems”.

Reading

Kahneman, D. (2011) Thinking, Fast and Slow. Penguin.

Klein, G. (2007) The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work. Currency.

http://www.mckinsey.com/business-functions/strategy-and-corporate-finance/our-insights/strategic-decisions-when-can-you-trust-your-gut?cid=other-eml-cls-mkq-mck-oth-1609


Our Future Selves and Decision Making

7 Jul

Below is a link to a TED Talk by the Harvard psychologist Dan Gilbert. The talk, entitled The Psychology of Your Future Self, illustrates how we, as human beings, can get our expectations of the future badly wrong. Gilbert addresses some key reasons why anticipations of future states can be so far adrift, and in this article I’m going to use these reasons to highlight how experience and imagination can significantly improve our ability to forecast, acquire expertise and make better decisions. But first, a small detour to ancient Greece.


“The Power of Negative Thinking”

23 Feb

Positive thinking only gets you so far; it’s negative thinking which really defines success. This is the argument put forward in an interview between the Canadian astronaut Chris Hadfield and The Red Bulletin (Red Bull’s magazine). Hadfield explains the point:

“Self-help gurus are always advising us to think positively and envisage success, but it’s about as helpful as thinking about cupcakes. Just thinking about them isn’t going to help. It’s more important to think what could go wrong with a mission. Visualize failings, not success. That’s what’s essential to survival as an astronaut. I was an astronaut for 21 years, but I only spent six months in space. The rest of the time, I was looking into every detail that might have gone wrong during a mission. Once you’ve understood all the potential risks and you’re forewarned against them, fear no longer plays a part in your thought process”.

In my research, and the research I draw upon, this argument runs like a red thread through accounts of decision making, planning and adaptation. For example, Crandall et al. (2006) argue that, across a variety of professional fields, experts have a far greater knowledge of “what could go wrong” with decisions, plans and strategies than less experienced and accomplished staff.

Weick and Sutcliffe (2007), in their analysis of resilient organisations, which includes NASA, identify that resilient organisations have an obsession with the question “what could go wrong?” In other words, they are prepared for failure and far more likely to learn from it.

In Jim Paul’s account (written with Moynihan, 1994) of lessons learned losing large sums of money on the trading floor, the authors cite “avoiding losses” as the most significant strategy for success. By focusing on failure, on what NOT to do, the chances of success significantly increase because, at the very least, a trader will stay in the game longer.

The “power of negative thinking” counter-intuitively increases confidence, as people, teams and organisations become far more prepared for, and positive about, their ability to absorb failure and adapt. I’ve researched and seen the above manifest in fields as diverse as clinical decision making and construction site management (examples are here).

Weick (2009) refers to an organisation’s ability to adapt through adverse circumstances as having “requisite variety”. Requisite variety is the sum of an organisation’s systematic learning from failure: lessons analysed and then shared. A learning organisation focused on “negative thinking” creates a reservoir of responses, both formal and tacit, which can be applied to complex, surprising and uncertain events. Chris Hadfield, in the quote below, sums the concept up perfectly:

“I never experienced any fear when I got into a spacecraft— not because I was brave, but because I’d practiced solving every problem, thousands of times. Being well prepared makes all the difference. It minimizes any fear and gives you confidence”.

Reading

Paul, J. & Moynihan, B. (1994) What I Learned Losing a Million Dollars. Columbia Business School Publishing.

Crandall, B., Klein, G. & Hoffman, R. (2006) Working Minds: A Practitioner’s Guide to Cognitive Task Analysis. The MIT Press.

Weick, K. & Sutcliffe, K. (2007) Managing the Unexpected: Resilient Performance in an Age of Uncertainty. Jossey-Bass.

Weick, K. (2009) Making Sense of the Organization, Volume 2: The Impermanent Organization. John Wiley & Sons.

The Chris Hadfield interview with The Red Bulletin is at the link below:

https://www.redbulletin.com/us/us/lifestyle/astronaut-chris-hadfield-explains-the-power-of-negative-thinking


Chaos, Construction Sites, and Bayes

21 Jan

The last article was a case study of construction site decision making. To summarise briefly: a project manager was in charge of six construction sites, each run by a site manager. Three of the site managers were “trusted” by the project manager and had excellent performance records. The other three were “not trusted”, took up large amounts of senior management time and required significant performance improvement. We conducted research to uncover the difference between “trusted” and “untrusted”, and then turned this information into a resource which could be used to improve performance.
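
As a loose illustration of the Bayesian angle in the title, here is a minimal sketch of how a project manager’s belief that a site manager is “trusted” might be updated as performance evidence arrives. All probabilities and numbers are invented for the example.

```python
# Hypothetical sketch of Bayesian belief updating about a site manager.
# All probabilities are invented for illustration.

def update(prior: float, p_if_trusted: float, p_if_untrusted: float) -> float:
    """Posterior P(trusted | evidence) via Bayes' rule."""
    numerator = p_if_trusted * prior
    evidence = numerator + p_if_untrusted * (1.0 - prior)
    return numerator / evidence

belief = 0.5  # start neutral: P(trusted) = 0.5

# Three successive "site hit its weekly targets" reports, each assumed more
# likely from a trusted manager (0.8) than an untrusted one (0.4).
for week in range(1, 4):
    belief = update(belief, p_if_trusted=0.8, p_if_untrusted=0.4)
    print(f"Week {week}: P(trusted) = {belief:.2f}")

# Prints 0.67, 0.80, 0.89 -- the label hardens after only a few observations,
# which is one reason "trusted"/"not trusted" judgements form so quickly.
```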


Working On and Off the Edge

9 Jul

“You don’t turn a crew onto a score and not expect any heat”. These are the words, more or less, of the Grand Theft Auto anti-hero Trevor Philips, as he shares his wisdom on decision making, risk and uncertainty. Trevor, an experienced bank robber, hijacker and drug dealer (amongst other things), is reflecting on the consequences of making dramatic decisions which carry huge amounts of risk and lead to a very uncertain future. In Trevor’s world the upside of decisions is closely intertwined with the downside: if you want the rewards, you personally have to endure the risks; you have to be prepared to take the heat. Being exposed to heat (in Trevor’s case, the inevitable FBI investigation) is a drastic lesson in self-awareness; if it’s you who has to deal personally with the consequences of your own decisions, then you’re going to be very focused on the quality of your plans and on your ability to cope if and when the heat arrives. This sharp and immediate connection between decision and action is “working on the edge”.

By contrast, in organisational life, decisions can be implemented with significant distance between those who make them and those who carry them out, including the consequences. In other words, not everyone who makes the decisions gets to feel the heat. This lack of connection between decision and consequence can have a number of negative effects: the ability to adapt becomes blunted, plans become too abstract, communication slows, and certain types of culture can edit feedback and erase bad news. It’s a little like trying to describe a landscape to someone on the other end of a cell phone. The listener gets a general image of the landscape, but they are matching the description to the nearest available analogue from memory; they are essentially thinking “it sounds like landscape B, which I saw last year”. Without actually seeing the landscape, they are robbed of the subtle and nuanced detail. That detail might be vital feedback, but if the describer misses some of it out, for whatever reason, then feedback is compromised. The describer could be describing the landscape they feel the listener wants to hear, focusing only on the detail which has important meaning to them, or editing the scene in a variety of ways and for a variety of reasons.

When there is distance between the planner and the implementer, the same limits apply as those experienced by our landscape describer and listener. This is “working off the edge”. Here is a quote from a senior manager who was on the receiving end of a change management strategy designed off the edge:

“It will never work (the strategy) because it has no bearing on how things actually work in this department. It’s been designed by the executive like you’re pushing around shapes and thinking: this could go here, and then we can put this one in there. Our director knows this, but the executive went ahead and said this is how things are going to be, and too many important people have too much of a stake in looking competent to the board, so we’ve been told we’re stuck with it”.

Examining people and populations who work on the edge can reveal interesting and useful lessons for anyone who manages uncertainty and change. The people who work on the edge have stories about their experiences which reveal the risks of a closed mind and the benefits of adaptation. Because these people have personally experienced the downside of their decisions, they can provide lessons which bring decision making closer to the edge. So who works on the edge, in addition to Trevor Philips? Start-up founders work on the edge; so do extreme athletes, branches of the military and police, areas of medicine and health, and explorers. Below are two articles written by the explorer Cathy O’Dowd, which have been featured on the Echo Breaker blog. In these articles are examples of how luck can be mistaken for skill, how examining worst-case scenarios can generate adaptive strategies and, significantly, how making some decisions must involve an acknowledgement and acceptance that the ultimate price could be paid.

It’s not necessary to work on the edge to create an adaptive mindset. It’s possible to improve our ability to adapt by moving our decisions closer to the edge. We can achieve this by shortening the distance between planners and implementers and by fully understanding the environment which our decisions affect; but most of all, we should imagine that we are personally affected by the worst consequences of each decision we make. We all need to ask whether we could deal with the heat if and when it comes.


Some risks can’t be managed away: if you’re going to make decisions in environments which carry dread risk, you should be prepared for the consequences.

http://wp.me/p3jX7i-54

The benefits of an adaptive strategy in high risk and fast changing environments

http://wp.me/p3jX7i-4Z


Facing up to Change

22 May

Fairly early on in my research career I designed a model for capturing the reasoning people applied to make sense of an initiative, a change project, anything that involved moving from A to B. The model had to be very simple, because quite often you don’t get long with respondents (they’re busy), and it had to be usable by non-researchers to do research (spreading limited resources). This was the reality of contract research: tight deadlines, scarce resources and a lot to do. However, the model proved pretty effective at creating a simple picture of how well a person, organisation or environment would deal with future or current change. The model was called the Beliefs, Barriers and Control (BBaC) model, and applied three simple steps which I’ll outline below.

The first stage, Beliefs, was a series of questions which drew a picture of the respondent’s worldview: what the respondent thought would occur when X happened, how they would cope, what values they associated with objects and projects. The second stage, Barriers, focused on how rigid those beliefs were and how worldviews contradicted the perceived goal of a change programme; this stage was essentially looking for clashes between beliefs. The final stage, Control, sought to capture the tactics, resources and strategies respondents felt were available to them to reinforce their own beliefs. Overall, the model provided a useful picture of how well a person, an area or an organisation would deal with change and with challenges to their existing worldview. I’ll give an example below.

Imagine a hospital that adopts this worldview: hit targets at any cost. The role of care becomes a threat to this mindset, as it can contradict the dominant goal of achieving targets; quite often, providing excellent care means authorising procedures which take a department over budget, which in turn means a target is missed. As a result, the delivery of care begins to cut corners; the target-driven mindset digs in and defends itself with a range of tactics: leaning excessively on statistical performance measures, and tightening procedures, processes and discipline to the point of stifling insight, initiative and, eventually, compassion. In this environment small errors accrue, but the barriers put in place to defend the worldview keep them hidden. An outsider casually looking in sees only great-looking performance figures. This is a rigid set of Beliefs, Barriers and Controls, ready to shatter at any minute.

If you’d like to measure the flexibility of your own worldview, you can apply a version of the model using these questions:

When you have a new idea, plan, strategy etc. who do you ask first for feedback?
When your beliefs are challenged how do you react?
When you’re challenged, how do you use data, information, procedures, rules and regulations?
When did you last let a strongly held belief go?

If you ask first only people who are likely to approve your ideas (because of your relationship with them, their time constraints, their constant positivity, etc.), this can quickly reinforce your beliefs and entrench them. This feeds into the second question: do you instinctively defend? Do you view opposing worldviews as threatening or as challenging? When you are challenged, how do you use data, procedures, rules and so on? Do you use them to reinforce your beliefs and to point out errors in other people’s? Or do you use data to proactively challenge and test your beliefs, to look for new insights? Finally, how often do you let go of something you were heavily invested in? This is incredibly difficult to do at the psychological level but essential for resilience, innovation and recovering from surprising, unexpected events.
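
As a rough sketch only (the BBaC model is qualitative, and this yes/no scoring is my own invention for illustration), the four questions could be turned into a quick self-check like this:

```python
# Hypothetical self-check based on the four questions above. The scoring
# is invented for illustration; the underlying model is qualitative.

WARNING_SIGNS = [
    "First feedback comes only from people likely to approve",
    "Challenges to your beliefs are met with instinctive defence",
    "Data, rules and procedures are used mainly to reinforce beliefs",
    "No strongly held belief has been let go in over a year",
]

def rigidity_score(answers: list[bool]) -> int:
    """Count 'yes' answers; each marks one rigidity warning sign."""
    return sum(answers)

answers = [True, False, True, False]  # example responses
print(f"Rigidity warning signs: {rigidity_score(answers)} of {len(WARNING_SIGNS)}")
```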

The Key Problem with Uncertainty: People Just Don’t Like It

5 May

Managing uncertainty is becoming a higher and higher priority for businesses around the world. It’s easy to see why: political, economic and social instability is everywhere, and highly unlikely to end in the medium term at least. Making decisions, plans and strategies which have any chance of a positive outcome is harder than ever. Because it is harder, businesses now want to focus more on managing uncertainty effectively, and are consequently looking for methods to improve this ability. However, putting aside the geopolitical situation and all the socio-economic threats, the real problem is that human beings just don’t like facing up to uncertainty, and that is the biggest threat to effectively operating a business over the next five to ten years.

The UK is days away from a general election. All the major and minor parties seem focused on offering retail deals to voters: you vote for me, you get this in return. This is politics, but the focus has meant the world outside the UK is largely ignored; attention rests entirely on domestic policy. I read a theory why in one of the UK broadsheets: every time a focus group is asked about global and economic uncertainty it makes them anxious; the group simply doesn’t want to acknowledge it. So the events which will define the world, business and individuals at the global level over the next few years are simply being ignored. The entire election is based on promises and perceptions of stability: stability of business performance, stability of income, stability of prices, stability to pay the mortgage. If this stability doesn’t exist, if it is replaced by uncertainty, then the very fabric of people’s lives is perceived to be unravelling.

This drive toward stability, and with it smooth, linear solutions, plans and strategies, means acknowledging and managing uncertainty gets shut out. A quote from Karl Weick and Kathleen Sutcliffe’s (2007) Managing the Unexpected illustrates the default business position when stability is interrupted. After an unexpected event breaks stability, learning takes place: the chaos, and how it was dealt with, provides clear lessons on the flaws and limits of current processes, and of the thinking strategies used to cope while events were stable. However:

“…it won’t be long before candour gives way to moments of normalizing that protect reputations, decisions, and styles of management. As soon as the official stories get “straightened out” and repeated, learning stops”. (Weick and Sutcliffe, 2007, p. 109).

Stability and “getting back to normal” are very comforting. However, they shut out the lessons people learn and the skills they develop while managing unexpected events. So, if these skills are so valuable and increasingly necessary, why do businesses and people shut them out, even to their own detriment?

Human beings are pattern-recognition machines (see Kahneman, 2011): a person sees a complex mass of information and is driven to find some pattern in it, some narrative or plausible story that integrates what is being seen and experienced into what is already known. This drive can work well for us in environments where cues and patterns are reliable (aspects of clinical reasoning and firefighting; see Weick, 2009), but it can work against us when we attempt to make sense of unreliable environments, such as what other human beings are about to do next. The drive produces constructs like stereotypes, and mistakes like taking a run of luck for skill, both of which leave people floundering and confused when they are contradicted. This psychological drive explains why the new is so frequently rejected, why innovation can be dismissed as merely “disruptive”, and why change is viewed as a threat: they simply don’t fit the current way of making sense of events. Stability is seen as the normal pattern and uncertainty as a threat; to alleviate the psychological discomfort this threat causes, it is explained away, absorbed within the pattern of stability. Most significantly, uncertainty becomes something it is OK to ignore. When uncertainty can eventually no longer be ignored, far fewer people than necessary have the skills and resilience to cope effectively.

One method of addressing the above is to draw lessons from “reverse domains”, where uncertainty is the norm and any attempt to impose stability could be fatal. These include some clinical environments, where initial diagnoses need to be kept flexible; firefighting, where conditions such as wind direction might suddenly change and the situation with them; and alpine skiing, where threats such as avalanches give so few cues that extreme caution and a focus on discrete changes are essential to survival. These domains provide techniques for assessing the external environment in a “critical” way, because in a reverse domain, encountering stability sets the alarm bells ringing. How to explore and develop these techniques for business is something I’ll discuss in a future blog.

Reading

Weick, K. & Sutcliffe, K. (2007) Managing the Unexpected. Jossey-Bass (Wiley imprint).

Weick, K. (2009) Making Sense of the Organization, Volume 2: The Impermanent Organization. John Wiley & Sons.

Kahneman, D. (2011) Thinking, Fast and Slow. Penguin.

Comebacks, Decisions and Resilience

23 Jan

There is a cultural fascination with comebacks, against-the-odds wins and resurrection. For thousands of years cultures have been inspired by tales and mythologies of heroes facing quests whose consequences are so brutal and dangerous it’s impossible to imagine success. Within these mythologies are crucial lessons about decision making, risk and the role of resilience. The Greek hero Odysseus, for example, and his journey home after the Trojan War, is a story of courageous decision making backed up with an array of innovation, tactics and strategy so adaptable in the face of uncertainty that any CEO would be in awe of it. In this article I’m going to try to deconstruct episodes which could be labelled comebacks but are far more illustrative of resilience and adaptability.

When presenting my research I went through a stage of describing, briefly, what I thought a “good” decision was. I suggested that the emphasis should be taken away from searching for perfect or near-perfect decisions and placed instead on the question “could I handle the consequences of this decision?”. My thinking was this: once that question is asked, thinking starts to focus on “what could those consequences be?” and, crucially, “what could go wrong and what would, or could, I do about it?”. The sum of these questions, hopefully, starts to produce the potential problems which could occur, and an inventory of resources which could be used to face them. Some caution is needed here, as two layers of bias can occur. Firstly, the problems identified are based only on what has been encountered in the past (risk, as opposed to uncertainty); secondly, there can be overconfidence in the resources available to overcome any problem identified. These two layers of bias can leave people and organisations ambushed by uncertainty: a version of a risk never before encountered, which the resources aren’t set up to deal with. How well this ambush is dealt with is the comeback zone, and it requires adaptability and resilience. I’ll illustrate with an example from a few thousand years ago by returning to Odysseus.

When Odysseus finally headed off on his journey home he couldn’t have imagined what waited for him. He had a general idea, but he backed himself to handle the consequences: he had the confidence to keep going, kept himself ready for anything, was prepared to be knocked off course, and so kept his options open to adapt. If Odysseus had been overly rigid in his risk analysis, basing it on his previous journeys alone, he would have been left a gibbering and confused wreck when confronted with the Cyclops. Instead, he took the encounter pretty much in his stride. In other words, Odysseus remained ready for shocks, surprises and unexpected events; and this, I’m suggesting, is the anatomy of a comeback.

Gigerenzer (2014) describes the role and function of “adaptive toolkits” in dealing with the unexpected: an array of practical and cognitive coping mechanisms which allow individuals and organisations to adapt and roll with the punches. Odysseus is a good example. Adaptive toolkits are in many ways the opposite of carefully constructed and modelled risk registers. If the risk strategy encounters the unexpected and the adaptive toolkit is absent, the individual or organisation is sunk. Adaptive toolkits are like arsenals which can be drawn on when under attack: the better stocked the arsenal, the higher your confidence in fending off the attack. And the higher the confidence, the greater the cognitive space to innovate and adapt in the face of uncertainty. This is the comeback: an array of tacit skills and coping mechanisms which can be used when the unexpected is encountered, a flexible approach which absorbs shock and bounces back.

Most organisations don’t take the same approach as Odysseus. Instead they rely on modelling risk and stocking up on procedures and processes. These approaches don’t lend themselves to a comeback. If you Google business comebacks you’ll find some great, inspiring stories (Apple in 2001, for example), but in the grand scheme of things there aren’t that many. The reason: most organisations aren’t designed to come back.

References

Gigerenzer, G. (2014) Risk Savvy: How to Make Good Decisions. Penguin.