Making Sense of Risk

2 Feb

How do people think about risks? Last year I wrote a few articles on the Spiral methodology I designed: a method of explaining and analysing how a person thinks about and responds to risk. The Spiral grew out of investigations into commercial decision making, contrasted and compared with interviews carried out with 50 respondents who had been diagnosed with mental health difficulties. Summarised simply, I suggested that the process of making sense of events, data and interruptions can dysfunctionally “narrow frame”, driving both the person and the organisation into a way of framing risk that shrinks their world view into a defensive shell. In light of recent research, I’m going to use this article to re-explore the role of “rules of thumb”, or heuristics, in this process.

The Spiral is a cognitive process in which an individual or an organisation, in light of recent experiences or overly ingrained routines, customs or plans, frames a growing share of the cues, interruptions and surprises they encounter as threatening risks. Think of it like this: a person travels to work by car every day, always taking the same route. One day, while taking this route, they see an accident happen ahead of them; nothing dramatic, nobody hurt, no cars written off. The person had also been feeling particularly overloaded at work recently, as both the volume and the complexity of their work had increased. Seeing this accident take place on a route they took every day, and at a time when another area of their life had become more challenging, the person framed a route they had taken for years as far too risky to take again. That day at work they used a website to find an alternative route, and from that point on, that’s the route they took. The door had shut on this particular risk.

Imagine applying this strategy over and over again: the frame could narrow to such a degree that leaving the house becomes an unbearable risk. Eventually, when a challenge, event or interruption occurs that cannot be avoided, being dug in with reduced coping strategies is the worst place to be. By reducing risk exposure, vulnerability to uncertainty is greatly increased.

This also occurs in organisations. Unexpected and/or unwanted interruptions are tackled with an increase in procedures and processes, which dull sensitivity to operational discrepancies and reduce tactical responses (Klein, 2011; see also Starbuck et al., 2008; Weick, 1995). This process takes eyes off small cues, and only puts eyes “back on” again once that small cue has become a full-blown catastrophe and survival seems like a far-off fantasy. Organisations also “spiral” when they over-plan. As events bypass the plan, the organisation can either adapt the plan, or even change the objectives, or it can dig in and defend the plan’s relevance to a much-changed context. As Starbuck et al. (2008) observe, rigid strategic planning can make organisations “option blind”, and this is the bottom of the spiral: dug in deep, starved of risk exposure to such a degree that improvisation tactics are non-existent, and eventually overrun by a metaphorical snowball.

Both people and organisations are adaptive: both are capable of processing stressors as information and developing adaptive learning (see Taleb, 2012). This is the opposite of creating distance between stressors and cognition (creating procedures and processes which dull sensemaking) and distance between experience and heuristics (avoiding non-routine and/or unplanned events). To apply this adaptive quality to improving risk responses, I suggest (in agreement with Taleb and Weick, ibid., and Gigerenzer, 2014) exposure to risk and focused development of heuristics as the answer.

In a nutshell, this means identifying rules of thumb which, though not perfect, serve to extract vital cues from complex, noisy and volatile situations (see Weick, 1995). This approach beats the search for “perfect” responses (which frequently push action into the future and responsibility down the chain of command) because it leads to action quickly. Acting quickly carries risk, and it sounds risky too, but action also provides feedback through exposure. Provided the feedback loops are in place (I touch something hot, it burns), the stressors which meet action can be quickly processed into information. This means the initial plan can be adapted quickly, driven by the environment’s responses to action. In this suggestion we see: heuristic, action/risk exposure, adaptation.
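The heuristic → action/risk exposure → adaptation loop can be made concrete with a toy simulation. Everything in this sketch is hypothetical and purely illustrative (the two “agents”, the learning rate, the assumed “true risk” of the routine action); it is not a model from the Spiral work itself, just one way of showing why exposure plus feedback beats withdrawal:

```python
import random

random.seed(42)  # make the illustration repeatable

def adaptive_agent(trials=100, learning_rate=0.2):
    """Act on a rough heuristic, then update it from feedback.

    The heuristic here is a running estimate of how risky a routine
    action is (0 = safe, 1 = dangerous). Acting generates exposure,
    exposure generates feedback, and the stressor is processed as
    information: the estimate adapts toward the environment.
    """
    risk_estimate = 0.5   # initial rule of thumb: "could go either way"
    true_risk = 0.1       # hypothetical: the routine is actually fairly safe
    for _ in range(trials):
        bad_outcome = random.random() < true_risk  # act despite uncertainty
        # Adapt the heuristic using the feedback from action.
        risk_estimate += learning_rate * (bad_outcome - risk_estimate)
    return risk_estimate

def narrow_framing_agent(trials=100):
    """Stop acting after the first bad outcome: no exposure, no update."""
    risk_estimate = 0.5
    true_risk = 0.1
    for _ in range(trials):
        if random.random() < true_risk:
            return risk_estimate  # the frame shuts; the estimate is frozen
    return risk_estimate

print(round(adaptive_agent(), 2))  # drifts toward the true risk over time
print(narrow_framing_agent())      # stuck at the initial guess: 0.5
```

The point of the contrast: the adaptive agent’s heuristic converges toward what the environment actually is, while the narrow-framing agent’s first bad experience shuts the door, leaving it with the same crude estimate it started with and no way to improve it.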

Heuristics, or rules of thumb, are essentially super-condensed expertise which appears intuitively in consciousness. Where to look for rules of thumb which are (a) credible and (b) provide a framework that can be shared will be the subject of a future article.


Weick, K.E. (1995) Sensemaking in Organizations. Sage

Taleb, N.N. (2012) Antifragile: Things That Gain From Disorder. Penguin

Klein, G. (2011) Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making. MIT Press

Starbuck, W.H., Barnett, M.L., et al. (2008) Payoffs and Pitfalls of Strategic Learning. Journal of Economic Behavior and Organization 66 (1): 7–21

Gigerenzer, G. (2014) Risk Savvy: How to Make Good Decisions. Allen Lane
