Detecting Errors in Strategy

19 Sep

This is a quick introduction to the Spiral methodology we’ve designed, applied here to strategy error detection. When any organisation, policy unit or research team is putting together a strategy or plan aimed at realising future goals, there are four levels of potential error to be mindful of.

Level one is the acquisition of knowledge. The first port of call when making a decision on future direction is the collection of data. If you’re getting data from people, or from any complex system (systems which involve people), then you need to avoid potential sources of bias such as memory decay or data fixation (Wilson and Gilbert, 2003), and to capture expertise (Hoffman et al., 2006). The Spiral seeks to compensate for bias by applying research methods for data collection from complex systems which are designed to minimise its presence. This wouldn’t be the primary means of data collection, but it would subject all data to assumption testing and support insight.

Level two is the analysis of this knowledge. What scripts are people using to make sense of it? There is always a script at work when reading data: the script of the individual (the CEO, for example) and/or the script of the organisational culture. The methodological goal here is to ensure that the data is read with a broad frame, one which considers other potential scripts, as opposed to a narrow frame, one which analyses only through a single script, such as ‘the more complex the analysis, the less chance of blame’. Changing the perspective on data, by simply converting statistics to frequencies for example, has incredibly powerful effects on the effectiveness of decision making (Kahneman, 2011; Gigerenzer, 2014).
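As a toy illustration of the frequency reframing Gigerenzer advocates, the sketch below re-expresses a probability as a natural frequency. The figures and function name are invented for the example; the point is only how differently ‘risk rises by 50%’ reads once baseline and new risk are shown as counts of people:

```python
def as_natural_frequency(probability: float, population: int = 1000) -> str:
    """Re-express a probability as a natural frequency,
    e.g. 0.003 -> '3 out of 1,000 people'."""
    count = round(probability * population)
    return f"{count:,} out of {population:,} people"

# Hypothetical baseline risk, and the same risk after a '50% increase':
baseline = 0.002
increased = baseline * 1.5

print(as_natural_frequency(baseline))   # 2 out of 1,000 people
print(as_natural_frequency(increased))  # 3 out of 1,000 people
```

Framed as a relative risk, a 50% increase sounds alarming; framed as frequencies, it is one extra person per thousand, which readers tend to judge far more accurately.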

Level three is the gap between planners and implementers. A simple way to illustrate this level is through communication. Within an organisation there is always formal, explicit knowledge, such as guidelines and procedures. There is also informal knowledge: how things really get done. Plans which deal only with the structural side of the organisation, yet expect appropriate changes in behaviour, are doomed to fail (see Klein, 2013; Weick, 2009).

These plans fail because expertise and insight become diluted by procedures and processes. What manifests are large gaps in reasoning between planners and implementers. The aim here is to deploy simple but effective research methods which close this gap and bring reasoning onto a similar level. The pay-off is clear intent: the ability of people to achieve objectives independently, even when things go wrong. This increases innovation and reduces the cognitive load on managers by reducing the frequency of the question ‘what should I do next?’

Level four is forecasting, the anticipation of future states. The other three levels all feed into forecasting, since perception is information based (Hoffman et al., 2006). Research suggests that forecasting is particularly susceptible to bias (see Wilson and Gilbert, 2003, and Kahneman, 2011, for some good examples). With this in mind, the goal is to use methods which reduce bias in anticipation, such as prospective hindsight (Klein, 2007). Not only can these methods make forecasts more realistic, they can also improve general risk management, resilience and adaptability.

This has been a very short introduction to the four levels where errors can occur in strategies and plans. In a future article I’ll provide more detail on some of the research methods we use to locate and address these errors.

References
Wilson, T. D. and Gilbert, D. T. (2003). “Affective Forecasting”. Advances in Experimental Social Psychology, 35: 345–411.
Kahneman, D. (2011). Thinking, Fast and Slow. Allen Lane.
Gigerenzer, G. (2014). Risk Savvy: How to Make Good Decisions. Allen Lane.
Klein, G. (2013). Seeing What Others Don’t: The Remarkable Ways We Gain Insights. PublicAffairs.
Weick, K. (2009). Making Sense of the Organization, Volume 2: The Impermanent Organization. Wiley-Blackwell.
Hoffman, R., Crandall, B. and Klein, G. (2006). Working Minds: A Practitioner’s Guide to Cognitive Task Analysis. Cambridge, MA: A Bradford Book.
Klein, G. (2007). The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work. Currency.
