Mistakes of Imagination

19 Nov

I once read a book by Philip K. Dick, the science fiction writer, and then another 30 of his books one after the other. One of the main themes in Dick’s work was mental health, or rather a questioning of how mentally healthy his characters were. As a consequence, some form of therapy and therapist was frequently present in his books, particularly in a book called Valis. In this book there is a remark to the effect that there is a sentence out there which can cure everyone. Not a universal sentence, but a particular sentence for each person, and so the task becomes trying to find this sentence.

I like to think that this one sentence is a rule of thumb or heuristic: a simple sentence that suddenly makes everything slot into place. The result can be new insight, increased performance, an equation transformed from impossible to easy, or a task taken to the next level. It can also mean letting go of a process, an idea or a course of action which once seemed vital; but essentially, this sentence leads to improvement.

A colleague of mine recently uttered the sentence “most mistakes are of imagination”, and this instantly gave me a slightly different slant on decision making and sense-making. My colleague was referring to a limit in thinking which can be defined as the inability to imagine what you do not know. To unpack the sentence: mistakes occur when people, plans and organisations rely too heavily on what they know and underweight what they don’t know.

The two-system way of thinking (see Kahneman, 2011) helps explain this conceptually. Human beings make decisions using two systems: system one and system two. System one is fast; it is pattern recognition, this looks like that. System two is slow; it verifies the pattern which system one has identified. To paraphrase Kahneman (ibid.), problems occur because system two is lazy: if the conclusions of system one are in any way plausible, then system two seems to say “that’ll do, go with it”.

When a person is inexperienced, verifying initial conclusions this casually is a major source of error. Alternatively, field studies have demonstrated (see Weick, 2009, and Klein, 2007, for examples) that when people are experienced in certain types of task (nursing, firefighting and craft work, for example), system two is able to locate subtle cues and patterns, adapt the conclusion and change course accordingly. The environments where these critical system twos are developed and operate are ones where the cues and patterns lead to fairly reliable results, and where strong, fast feedback loops are in place: you know exactly what impact your actions have had.

When a task, plan or organisation is affected by human behaviour, what you don’t get is cues and patterns which lead to reliable results, or fast feedback loops (“real time” data only tells you one version of events). As a consequence, these tasks, plans and organisations are vulnerable to uncertainty, and overly weighting “what is known” increases this vulnerability. To mitigate this vulnerability what is needed is imagination: the ability to think critically about what is not known. And this is why many mistakes happen: a failure to imagine future states beyond what is known.

References

Klein, G. (2007). The Power of Intuition. Currency Books

Weick, K.E. (2009). Making Sense of the Organization, Volume Two: The Impermanent Organization. Wiley

Kahneman, D. (2011). Thinking, Fast and Slow. Allen Lane
