Our Future Selves and Decision Making

7 Jul

Below is a link to a TED Talk by the Harvard psychologist Dan Gilbert. The talk, entitled The Psychology of Your Future Self, illustrates how we, as human beings, have the capacity to get our expectations of the future badly wrong. Gilbert addresses some key reasons why anticipations of future states can be so adrift, and in this article I’m going to use those reasons to highlight how experience and imagination can significantly improve our ability to forecast, acquire expertise and make better decisions. But first, a small detour to ancient Greece.

In Homer’s Iliad, we meet Achilles, the world’s premier warrior and demigod. Earlier in his life, Achilles was given a choice: he could live a long and peaceful life, or a short, glorious life in which his achievements would live forever. Achilles unquestioningly chose the latter. Fast forward to the Odyssey, and Odysseus is making his slow return home after the Trojan War. During his journey, he encounters the spirit of the now-dead Achilles in the underworld. Achilles declares to Odysseus:

“Glorious Odysseus: don’t try to reconcile me to my dying. I’d rather serve as another man’s labourer, as a poor peasant without land, and be alive on Earth, than be lord of all the lifeless dead” (The Odyssey, Book XI, lines 465–540)

The meaning of the above quote is open to debate, but it jars with Achilles’ earlier choice. And this is the crux of Gilbert’s argument: human beings are poor judges of what the future will bring, and of how we will feel in the future about our current decisions.

Gilbert highlights two reasons for this. First, we underestimate the degree of future change in both our environment and our own personality and values. Second, when we think about the future we use memory rather than imagination. The second reason partly explains the first, and explains why we expect the future to resemble the past. In other words, our past becomes a forecasting template for the future (Kahneman, 2011).

Over-reliance on the past shows up as predictions of overly stable systems, in which both the environment and the way we think about life, and value objects and relationships, are expected to last forever. If you, your team or your organisation hasn’t changed much in years, it becomes increasingly difficult to imagine change. This can leave strategies and risk analyses literally “stuck in the past”, unprepared for change, new events and unexpected risks (Taleb, 2012; Kahneman, 2011; Gilbert, 2013; Wilson et al., 2003).

The same situation can leave both people and organisations slow to grasp opportunities, and quick to explain away the potential for positive change (Kahneman, 2011; Klein, 2014). From this perspective, the future looks predictable and free of the unexpected, yet is simultaneously treated as full of risk and feared change.

Alternatively, when imagination is used to think about the future, dramatic improvements take place (Kahneman, 2011; Syed, 2015). Klein’s prospective hindsight method, also known as the pre-mortem method, is the definitive example:

“An example is the premortem method (Klein, 2007) for reducing overconfidence and improving decisions. Project teams using this method start by describing their plan. Next they imagine that their plan has failed and the project has been a disaster. Their task is to write down, in two minutes, all the reasons why the project failed” (Kahneman and Klein, 2009).

The process of imagining an outcome, and then imagining the reasons for that outcome, shifts the emphasis away from relying solely on memory to think about the future. It also provides an excellent template for thinking about decision making and risk. Imagining failure may seem negative, but imagining ways to counter potential failure can generate innovation, and can also feel liberating.

This is the heart of improved decision making: identifying worst-case scenarios through imagination and asking, “If this happens, can I/we recover?” If the answer is yes, the decision can be made; if the answer is no, the decision should be avoided. The process can also unlock aspiration, spark innovation and identify new opportunities.
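The “can we recover?” rule above is simple enough to sketch in a few lines of code. This is only an illustration of the logic, not software from any of the cited authors, and all the names and scenarios are hypothetical:

```python
# Minimal sketch of the pre-mortem recovery test described above.
# Each imagined worst-case scenario is marked True if we believe we
# could recover from it, False otherwise.

def premortem_decision(worst_case_scenarios):
    """Approve a decision only if every imagined failure is recoverable."""
    unrecoverable = [scenario for scenario, recoverable
                     in worst_case_scenarios.items() if not recoverable]
    if unrecoverable:
        return "avoid", unrecoverable
    return "proceed", []

# Hypothetical example: one imagined failure has no recovery path,
# so the rule says avoid the decision.
verdict, blockers = premortem_decision({
    "key supplier fails": True,
    "budget overruns by 50%": True,
    "regulator blocks launch": False,
})
print(verdict, blockers)  # avoid ['regulator blocks launch']
```

The point of the sketch is that a single unrecoverable scenario is enough to veto the decision; everything else is detail.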

Imagination also has a relationship with experience. To illustrate, imagine person A and person B, who both work for the same company. We ask each of them to make a forecast about a strategy their organisation has just written.

If person A has experienced a lot of change in their professional environment, been through numerous trials and tests, built a tacit repertoire of what works and, more importantly, an in-depth knowledge of what could go wrong with decisions, then their imagination will deliver rich data (Kahneman and Klein, 2009).

By contrast, if person B has experienced only a relatively stable professional environment, is separated from the direct consequences of organisational decisions and has little knowledge of risks, then their imagination will not yield the same quality of data. However, we could improve person B’s forecasting by exposing them to person A’s experience.

Gilbert (2013) argues that a person’s forecasting (and correspondingly their decision making) improves if they speak to someone who has already lived their future. In other words, current memory is replaced by the experiences of someone who has already made the decision they are considering. This can reduce cognitive bias (Wilson et al., 2003) and provide our potential decision maker with “surrogate experience” to boost their imagination.

Returning to person A and person B: if we evaluated person B’s forecasting by asking them to imagine that the organisational strategy had been a complete failure (that is, to complete a pre-mortem), their lack of experience would limit their imagination. If we then gave person B some strategic scenarios drawn from person A’s experience, we could provide B with “surrogate experience”. Finally, we could ask person B to repeat the pre-mortem and re-examine the quality of their forecasting.

This model, of taking experience, turning it into scenarios and using it to improve forecasting in less experienced staff, has been used with outstanding results by Klein et al. (2013). Their model takes existing experience and memory and transfers it between members of an organisation. The approach was developed by Hintze (2008) to enable novice firefighters to learn from expert firefighters, and I’ll summarise the process below.

Both expert and novice firefighters were asked how they would manage a testing firefighting scenario. The experts had seen similar situations before, and so could imagine a wide range of options and risks. The novices had yet to experience similar situations. Once the novices had outlined their decisions, they were immediately given access to the expert decision making for comparison. This quick transfer of surrogate experience improved decision making by 18–30% within half a day.
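The comparison step above can be pictured as scoring a novice’s option list against an expert’s. The sketch below is purely illustrative (it is not the ShadowBox software, and the scenarios are invented), but it shows the shape of the before-and-after measurement:

```python
# Illustrative sketch: score how much of an expert's option set a
# novice also identified, before and after seeing expert reasoning.

def overlap_score(novice_options, expert_options):
    """Fraction of the expert's options the novice also identified."""
    expert = set(expert_options)
    if not expert:
        return 0.0
    return len(expert & set(novice_options)) / len(expert)

# Hypothetical fire-ground options.
expert = ["ventilate roof", "check floor", "stage backup crew", "cut power"]

before = overlap_score(["ventilate roof"], expert)
# After reviewing the expert's decisions, the novice's repertoire widens.
after = overlap_score(["ventilate roof", "check floor", "cut power"], expert)
print(before, after)  # 0.25 0.75
```

A simple overlap fraction like this is one plausible way to quantify the kind of improvement the study reports; the actual study used its own evaluation criteria.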

This article has highlighted how experience and imagination can overcome the natural human bias to underestimate change and to rely too heavily on the past to make sense of the future. On this note, current advances in cognitive computing could soon deliver methods for accessing “expert swarms” to gain surrogate experience and improve decisions and forecasting. Until then, it’s not a bad idea to take Kahneman’s advice when you are about to make a big decision or are dealing with the consequences of a significant change: spend time with friends; they give you a perspective on events that is very easy to lose.


If transferring surrogate memory is effective, why does it not fall into the trap identified by Gilbert, of relying too much on the past to make forecasts and predictions? Expertise (Kahneman and Klein, 2009) is frequently defined by knowledge of what could go wrong. This manifests in professional situations as experts intuitively subjecting plans to a pre-mortem: imagining the plan has failed and asking how the situation could be recovered or changed. Lack of experience in an area, by contrast, frequently shows as attempting to force an initial idea through regardless of contradictory data (ibid).




Gilbert, D. (2013) “Affective Forecasting … or … The Big Wombassa: What You Think You’re Going to Get, and What You Don’t Get, When You Get What You Want”, p. 45 in Brockman, J. (ed.) Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction. Harper Collins.

Hintze, N. R. (2008) First Responder Problem Solving and Decision Making in Today’s Asymmetrical Environment. Unpublished Master’s thesis, Naval Postgraduate School, Monterey, CA.

Homer. The Iliad. Penguin.

Homer. The Odyssey. Penguin.

Kahneman, D. (2011) Thinking, Fast and Slow. Penguin.

Kahneman, D. and Klein, G. (2009) “Conditions for intuitive expertise: a failure to disagree”. American Psychologist, 64(6): 515–526.

Klein, G. (2014) Seeing What Others Don’t: The Remarkable Ways We Gain Insights. Public Affairs.

Klein, G., Hintze, N. and Saab, D. (2013) “Thinking Inside the Box: The ShadowBox Method for Cognitive Skill Development”. International Conference on Naturalistic Decision Making, Marseille, France.

Syed, M. (2015) Black Box Thinking. John Murray.

Taleb, N. N. (2012) Antifragile: Things That Gain from Disorder. Penguin.

Wilson, T. D. and Gilbert, D. T. (2003) “Affective forecasting”. Advances in Experimental Social Psychology, 35: 345–411.


