Tag Archives: Risk analysis

Why Focusing On Catastrophe Is So Effective

3 Oct

Gary Klein’s pre-mortem technique has a long and effective history of improving forecasts, plans and decisions (Kahneman, 2011; Klein, 2007). The technique is incredibly simple, as the example below illustrates:

You and your team are about to agree a decision. Before you do so, imagine the decision has turned out to be a complete catastrophe. Everyone, working on their own, takes 5 minutes to write down the history of this catastrophe. Each individual history is then shared with the team.

I recently wrote about an interview featured on McKinsey Classics with Gary Klein and Nobel laureate Daniel Kahneman. The two psychologists discussed the role of intuition in executive decision making. Naturally, the pre-mortem technique came up as a highly effective method of improving decisions.

The logic behind why the technique works so well has been covered several times on this blog, and extensively across the research and corporate literature. However, Klein’s simple explanation in the McKinsey interview of what lies behind the technique’s success is incredibly insightful, and worth sharing:

“The logic is that instead of showing people that you are smart because you can come up with a good plan, you show you’re smart by thinking of insightful reasons why this project might go south. If you make it part of your corporate culture, then you create an interesting competition: “I want to come up with some possible problem that other people haven’t even thought of.” The whole dynamic changes from trying to avoid anything that might disrupt harmony to trying to surface potential problems.”


Kahneman, D. (2011) Thinking, Fast and Slow. Penguin.

Klein, G. (2007) The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work. Currency.




Human Judgement and Cognitive Computing

13 Sep

McKinsey have published an outstanding interview with Gary Klein and Daniel Kahneman. The interview is a reflection on Klein and Kahneman’s classic paper, Conditions for intuitive expertise: A failure to disagree (2009). Whilst the interview reflects on the two authors’ positions on intuitive decision making, the prime focus is executive judgement: is intuition a good basis for top-level business decision making? In this article I’ll briefly reflect on some of the key points raised by Kahneman and Klein, and on how aspects of cognitive computing could potentially support some of the authors’ suggestions.


Getting Answers from Unusual Places

30 Mar

Sometimes the answer to a problem comes from an unusual place that is right in front of us. This article covers how we can potentially transfer expertise from one discipline to another.

Many organisations contain expertise which stretches across multiple disciplines. For example, construction, engineering, and research and development all contain reservoirs of expertise particular to their discipline’s training, experience, and cultural sense making. All these disciplines apply their expertise to design, deliver and problem solve during the completion of tasks. Problem solving, and dealing with tough non-routine cases, are generally the points where expertise becomes most active and innovation takes place (Taleb, 2012; Klein, 2014).


Labels and Accidents

7 Mar

Organisations, projects and people who operate in dynamic, high-risk environments constantly need to update their understanding of the situation, because such environments constantly change and continually surprise.

Fighting a fire, building a hospital or managing diverse projects are all environments where plans and expectations become derailed by reality. Scanning an environment for even the smallest deviation from plans and expectations can ensure that small incidents do not explode into catastrophes. However, one of the biggest barriers to scanning and updating in a dynamic, high-risk environment is the very set of techniques we use to simplify our world and make it more manageable: plans and labels. This article discusses how plans and labels can turn a dynamic situation into a potentially dangerous one.


Risks Which are Simply Too Big

5 May

The link below is to an article by the explorer Cathy O’Dowd, in which she discusses the Nepal earthquake disaster and the role of planning and preparing for massive risk events when exploring. When it comes to exploring environments with massive risk potential (Everest, for example), the bottom line seems to be this: outside of a general emergency plan, some unlikely but possible risks will simply end your life regardless of planning.

It’s a stark insight, and it highlights how risk strategies should begin, and sometimes end, with two simple questions: what’s the absolute worst-case scenario? And if that scenario takes place, can we handle the consequences? Explorers, unlike certain corporate positions, are directly and fatally affected by the consequences of their decisions. Appreciating the cost of massive risk on a personal level exposes true faith and conviction in plans.


Prediction, Tolstoy and New Technology

27 Mar

Way back in the 1800s, decision makers were wrestling with the notion that you could plan and predict outcomes which involved people. Pivotal figures from Tolstoy to William James (regarded by many as the father of modern psychology) devoted some of their best work to explaining why the prediction of human outcomes was an unachievable goal. If you’re not going to read Tolstoy’s War and Peace, a masterpiece of decision-making critique, James provided an outstanding framework for pragmatic thinking when it came to planning in the face of uncertainty.

James argued that no plan or decision was true as a fact; it simply became true, or otherwise, as the consequences of action unfolded. Uncertainty never gave you enough to predict; you simply had to think through the consequences of actions carefully and be ready to adapt. Like a scientific hypothesis, you explored whether your expectation stood up and, if it didn’t, it was adapted.

Tolstoy’s work uses literature to account for the arrogance of people who think they can predict and plan with certainty, and I often wonder what Tolstoy would have thought about big data, new technology and the notion of prediction. I’ll never know, but I do think the work of both Tolstoy and James is highly relevant to the technology-enabled decision makers of today.

In my mind, whether you were planning in the 1800s or you are in 2014, using technology for predictive purposes, and making decisions off the back of these predictions, is something to be approached cautiously. When you predict, especially with confidence, you expose yourself to fixation: confirmation bias. When decisions resting on confident predictions encounter the complexity of human behaviour and are contradicted, the investment (psychological, resources and time) in the initial plan and decision cognitively pulls at you to avoid the loss of that investment. The consequence is loss aversion (https://echobreaker.wordpress.com/2013/09/16/confirmation-bias-and-data-analysis/), resulting in the explaining away of contradictory data and fixation on the initial decision. So more data can actually accentuate biases by increasing subjective self-confidence (https://echobreaker.wordpress.com/2013/09/24/the-more-information-the-worse-the-decision/).

A better use of big data is discovery, and I think this works well with a form of risk-primed decision making. To echo James, thinking through the consequences of a decision before it’s implemented is a good starting point. However, I think this can be taken a step further. When thinking about the consequences of action, there is a tendency to frame and anchor consequences on the worst of the experienced past. This is good, but it doesn’t prepare you or an organisation for the shock of a potential “even worse” scenario. A method to counter this is to take the worst historic event, which methods like stress tests focus on, and use data to construct even worse scenarios. There will always be a point in a crisis where the crisis hits a floor (in 2008 it was a bailout, for example); data can be used to model how bad things could have got if this floor hadn’t materialised when it did.
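The “remove the floor” idea can be sketched as a toy Monte Carlo exercise. Everything in the sketch below is a hypothetical assumption, not data from any real crisis: the heavy-tailed drawdown distribution, the 40% “historic worst” where a floor appeared, and the sample size are all illustrative.

```python
import random

random.seed(42)

HISTORIC_WORST = 0.40  # hypothetical worst observed drawdown, where a "floor" (e.g. a bailout) appeared

def simulate_drawdown():
    """Draw one crisis-scenario drawdown from an illustrative heavy-tailed distribution."""
    # Pareto-style tail: most simulated crises are moderate, a few are extreme
    return min(1.0, 0.10 * random.paretovariate(2.0))

scenarios = [simulate_drawdown() for _ in range(100_000)]

# A stress test anchored to history effectively caps every scenario at the observed worst...
capped = [min(d, HISTORIC_WORST) for d in scenarios]

# ...whereas leaving the floor out shows how bad things could have got.
print(f"mean drawdown, capped at historic worst: {sum(capped) / len(capped):.3f}")
print(f"mean drawdown, no floor:                 {sum(scenarios) / len(scenarios):.3f}")
print(f"scenarios worse than the historic worst: {sum(d > HISTORIC_WORST for d in scenarios)}")
```

The point of the sketch is the comparison, not the numbers: the uncapped distribution always contains scenarios beyond the historic worst, which is exactly the “even worse” territory the paragraph above argues data should be used to explore.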

This combination of data and imagination can produce thinking on: how would I/we have dealt with that? What fire doors or communication links could be built in to avoid this? Imagining risks unlocks creativity and future problem solving, and enhances the longer-term adaptation of decisions. Placing risk first, as opposed to prediction, is a lesson from Tolstoy and James, but it’s a lesson which can drive discoveries today.

Making Decisions under Pressure

12 Mar

Pressure has a tendency to narrow focus and fixate your view. When your ancestors felt pressure from hunger or cold, the narrowing of focus and the related physiological responses, the motivation to action, were a good thing. When this natural response kicks in at the office, pressure can tunnel your vision onto a particular narrative, data set or belief. However, unlike for your ancestors, the focus may not be as simple as getting food and warmth. It could involve where to invest, which product to back, who to promote or where to start drilling. When faced with pressure in a complex domain it’s vital to maintain a broad awareness of the situation; if not, pressure can leave you fixated and making poor decisions, a form of cognitive bias.

When you are faced with organisational pressure to “get something done”, it pushes you to find an answer quickly. This isn’t necessarily a bad thing; great things can happen under pressure. However, awareness of the risks in your domain is crucial to reacting positively: you shouldn’t trade risk for speed, but you can learn to speed up your ability to identify and manage risk.

Pressure will speed up your pattern recognition, the desire to find and justify an answer or solution. If you are dealing with a lot of data and/or uncertainty (for example), then you run a greater risk of finding a pattern simply to deal with the pressure, as opposed to using the pressure to find a superior option, and consequently of taking on board hidden risks. With this in mind, how can you improve decision making under pressure? Below are a couple of methods.

Klein’s (2003) prospective hindsight is a method where, once you’ve completed your plan or reached your decision, you imagine it’s 12 months (or 10 minutes) into the future and your plan or decision has been a complete catastrophe; then, in no more than 5 minutes, you jot down the history of that failure. This works because it’s far easier to anticipate problems when the outcome is known. It insures against the desire to prove yourself right and explain away contradictions.

The shock method (which can be used after prospective hindsight) is when you imagine your decision or plan has run into immediate trouble and you analyse the fire doors you have available to contain the risk and adapt the decision: how fast can you react and adapt? How quickly can you communicate to the necessary people or rehash the data? In other words, how aware are you of your domain’s ability to cope with the risks your decision could bring? If you can’t adapt the decision and contain the risk, you need to start revising your plans.

Using these methods when making decisions or constructing plans under pressure can give you some degree of mitigation against the cognitive biases pressure can bring. Experienced high performers (within certain professional domains) can intuitively analyse risk by having a very strong sense of what could go wrong, and they can do it fast. To develop greater sense-making skills when it comes to risk, practise (using case scenarios) your pressurised decision making using the above methods and you’ll improve your ability to naturally detect errors, crucial when deciding under pressure.

Loss Aversion and Horizon

3 Mar

Like a lot of people I watched Horizon’s report on decision making last week, and really enjoyed it. The research was presented in a skewed but entertaining way, and it’s motivated me to write down my thoughts on loss aversion as a cognitive bias, as it provides a great illustration of the tension between a bias and a heuristic in decision making. For the purposes of this article I’ll define loss aversion quite crudely as people’s strong preference for avoiding losses over achieving gains. This preference can result in people declining options which would yield gains when the option is framed in terms of losses: you stand a 10% chance of losing 70% of what you have, for example.
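This preference can be made concrete with Kahneman and Tversky’s prospect-theory value function, under which losses loom larger than gains. The sketch below is illustrative only: the parameters (loss-aversion coefficient λ = 2.25, curvature α = 0.88) are the commonly cited estimates from Tversky and Kahneman’s 1992 paper, the 90% chance of a 10% gain paired with the loss is my own hypothetical framing of the gamble, and probability weighting is deliberately left out for simplicity.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, steeper and convex for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# The gamble from the text: a 10% chance of losing 70% of what you have,
# paired (hypothetically) with a 90% chance of gaining 10%.
stake = 100.0
expected_value = 0.10 * (-0.70 * stake) + 0.90 * (0.10 * stake)
felt_value = 0.10 * prospect_value(-0.70 * stake) + 0.90 * prospect_value(0.10 * stake)

print(f"expected value: {expected_value:+.1f}")  # positive: the gamble pays on average
print(f"felt value:     {felt_value:+.1f}")      # negative: the loss looms larger, so it is declined
```

The gamble has a positive expected value, yet its prospect-theoretic “felt” value is negative, which is exactly the loss-framed refusal the paragraph above describes.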

A cognitive bias is a systematic, reliably predictable error made when a decision is taken and the decision maker lacks the information needed to frame the question appropriately. A heuristic is more or less its opposite: a rule of thumb, an imperfect but good-enough shortcut which works well most of the time in a particular domain. In a very simple form we can think of a bias as a mistake and a heuristic as a rule of thumb. Loss aversion, I would argue, is something which has become a bias. The reason, I think, is that loss aversion began life as a heuristic and became a bias as a result of changes in the human risk domain.

And this is why:

Way back in our history, survival depended upon meeting our most basic needs: food, shelter, warmth. If you were able to secure yourself a cave (for example), with some furs and a fire for warmth, and some food, you were more likely to survive, at least in the short term. The loss of these items meant that you were far less likely to survive. So, once these items were secured, holding onto them was essential; it paid to be loss averse. Holding onto them also allowed you to minimise future risk taking, further increasing your survival chances, since the consequences of risk were very often life and death. In a domain defined by scarce resources, where securing those resources carries great risk, loss aversion operates as a heuristic.

Back to today. When you have a surplus of warmth, shelter and food (in some economies, and only in sections within them), the risk domain has changed, and you can probably afford to take a few well-thought-out risks to try and get a little extra. But the old heuristic now operates as a bias: when you are faced with a loss, or a perceived loss, even from a relative position of strength or security, you’ll still default to the shortcut of avoiding any loss, unless you know the risk domain, which enables you to frame decisions more appropriately and reverse the bias. So why is it important to point out that this bias used to be a heuristic? Because the role of information and environment in shaping perception is crucial. Knowing the boundary conditions for when a heuristic can become a bias, and vice versa, is essential to making decisions involving different or changing domains, and especially when seeking to transfer expertise from one domain to another. In short, knowledge of the risk domain is crucial to framing decisions appropriately; if you don’t remain sensitive to changes in risk, a heuristic can easily become a bias, sometimes with disastrous consequences.

Confidence, Risk and Decision Making

5 Feb

A sudden lack of self-confidence is not uncommon in professional sportsmen and sportswomen, and there are now highly experienced and effective sports psychologists around to help; they have achieved some exceptional results, and many a professional athlete has credited psychology with either taking their career to the next level or simply saving it. So, given the effectiveness of these techniques in managing stress and building confidence in very challenging situations, is there a place for importing them into business? Would it be helpful for someone in business to think like an elite athlete?

No doubt confidence is essential to the professional athlete, and confidence is important in business; so far, so good, but the two part company on the subject of risk. In sport, risks are well known; in some sports you may actually not come home, but it’s known what the risks are, the worst-case scenarios, and they can be managed, more or less. Sport is known risk, and it has a floor. And the upside, which is the application of confidence, talent and luck, can sometimes be, or at least seem to be, limitless. It is a good thing to have very confident sportspeople.

In business, risk is very different. Technology, globalisation and financial interdependencies, amongst others, all mean that risk is very hard to know and contain. Problems occur when organisations feel they can model and predict risk (2008 is an easy example to call on) and make decisions on this basis, with confidence. Complex environments are characterised by the lack of a floor to risk; no one knows quite how bad things could get, but risk models can leave you thinking you have an answer. The validity of this statement can easily be checked against the number of bankers post-2008 who said it had never happened before. That comment is reminiscent of Peter Drucker’s observation that basing the future on past events is like driving a car forward while looking through the rear-view mirror. Confidence within a business environment needs to be of a different type because it operates in a different risk domain: if you’re feeling confident, then it’s time to be cautious.

This issue of confidence in the business environment was addressed by Danny Kahneman, who observed that subjective self-confidence is no reliable indicator of ability or expertise. In sport you get positive proof of expertise very quickly; in business you don’t, or it takes a long time to find out, and I’m sure everyone has a story to support Kahneman’s observation. Decision making in business requires a grasp of context, and this should include a very strong sense, and accompanying analysis, of what could go wrong. Thinking about what could go wrong basically prepares you for the consequences of your decision: if this goes sour, can I/we handle it? Taking this approach is far superior to the cognitive strain of trying to make perfect decisions; they don’t exist.

By contrast, in sport it’s not a good idea to be thinking about what could go wrong (at least not most of the time), but in business it’s essential. The two domains are defined and separated by their levels of uncertainty and by whether or not the risk has a known floor.

A rule of thumb: if your domain has regular cues and patterns which can be linked to reasonably consistent outcomes, then a sense of control is something which can be relied upon, cautiously. If your domain contains large amounts of random noise which could lead anywhere, well, I’ll leave you with a question from Danny Kahneman: “What makes him believe he is smarter than the market? Is this an illusion or skill?”

A Role for Vision in Decision Making

22 Jan

I had the pleasure of speaking at a clinical reasoning conference yesterday, and a question from one of the delegates really stood out: how much is decision making affected when an organisation’s vision isn’t shared by senior managers and executives? The delegate was referring to how decision making by non-managerial professionals is affected, but the answer is relevant across all domains: it is affected significantly, and with potentially catastrophic results.

Quite often we hear that for a job or career to be rewarding, two things need to be in place: a level of autonomy and a sense of meaning. You can have a level of autonomy in your work, but without a sense of meaning, of being part of a bigger purpose, it’s possible to feel what the philosopher Michel Foucault observed: certain types of freedom can themselves be oppressive. Being able to answer, at least to some degree, the question “what’s the point of all this?” is essential to a sense of satisfaction in work. It’s also essential to decision making and innovation.

I’ve pointed out frequently in previous articles that a key difference between an expert and a novice in their decision making is the ability to assess risk, to identify what could go wrong. This could be simply expressed as anti-goals: things you don’t want to happen. Sometimes focusing solely on anti-goals is all that’s required to plan positive outcomes. The ability to develop a keen sense of anti-goals only comes with a deep understanding of the domain in which you operate and of what success or good outcomes look like; then you can identify what you want to avoid.

To understand what a good outcome or success looks like, you need to visualise it when faced with a challenging situation. You need to be able to draw from experience and see what the situation needs to become, how to get there and, most importantly, the things to avoid on the way. This simple process allows you to adapt previous methods to the current situation, spot leverage points and innovate, and keep a plan on track in the face of distractions. However, none of this is possible without vision.

It’s vision which provides the background of meaning; it allows people to understand the consequences of their actions in relation to achieving higher-level goals. It also provides them with a strong sense of what to avoid. When an organisation tries to cope without vision, quite often people turn to routine, custom and practice and/or bureaucracy, as the default mode of avoiding blame sets in. What goes out of the window is the ability of people to make decisions swiftly by sizing up a situation or opportunity in relation to the organisational vision, along with the ability to spot leverage points and innovate. It happens because everyone is fumbling about in the dark and so tries simply to stay safe. Everyone could feel autonomous in this situation, but it would feel like a straitjacket, with no one certain what to do with it.

It’s far better to just turn the lights on.