Explaining the Spiral: Our Current R&D Project

22 Aug

This article is an overview of where I’m at in developing a toolkit (working title: The Spiral) for improving decision making for individuals, groups, and organisations. Some elements are still at the proof of concept stage whilst others are well developed, but overall the Spiral is a fast, effective means of detecting decision errors and correcting them.
The Spiral is based on the theory that individuals, groups, organisations and teams can all become focused on a narrow frame which ultimately threatens their survival, frequently without knowing it, or so incrementally that they fail to notice.

The theory is well illustrated by some recent research we conducted with people (n=50) who had been diagnosed with mental health difficulties and were about to receive treatment. A strong pattern emerged across the respondents relating to how the mental health difficulties developed and what the respondents hoped to achieve from treatment. The pattern we found can be summarised in the following bullet points:
• The respondent was initially affected by an incident, or a series of incidents, which affected their reasoning in a specific way: the respondent began to increase the number of external factors they categorised as threats and risks
• The respondent would then attempt to “de-risk” by removing contact with these threats; however, the “risks” removed included contact with friends, the cessation of hobbies, and so on
• As a consequence, the respondent had significantly reduced the size of their “world”: they had moved from a broad to a very narrow frame of reference
• The narrow frame actually increased the level of perceived risk in the respondents’ lives by reducing coping mechanisms and alternative sources of support; perceiving constant external threat became a behavioural norm
• As a result, what the respondents wanted from treatment was their “lives back” and “more confidence”: a return to a broader frame (a toy model of this feedback loop is sketched below)
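
The dynamic in these bullet points is a feedback loop: perceived risk drives withdrawal, withdrawal removes support and coping mechanisms, and their loss drives perceived risk higher still. As a purely illustrative sketch (the variable names, starting values and coefficients below are my own assumptions, not figures fitted to the study), the loop can be written in a few lines of code:

```python
# Toy model of the spiral described in the bullet points above.
# All parameters are illustrative assumptions only; nothing here
# is fitted to the n=50 study data.

def simulate_spiral(shock: float, steps: int = 12) -> list[float]:
    """Return the width of a person's frame of reference over time.

    frame: 1.0 = broad frame of reference, near 0.0 = the bottom
    of the spiral. Each step, lost support amplifies perceived
    risk, and the person "de-risks" by narrowing the frame further.
    """
    frame = 1.0
    perceived_risk = shock
    history = [frame]
    for _ in range(steps):
        support = frame                          # support shrinks with the frame
        perceived_risk += 0.5 * (1.0 - support)  # less support, more perceived risk
        frame = max(0.05, frame - 0.1 * perceived_risk)  # withdrawal response
        history.append(round(frame, 3))
    return history

print(simulate_spiral(shock=0.3))
```

Even a modest initial shock ratchets the frame downward: each withdrawal raises perceived risk, which drives further withdrawal, until the frame settles near its floor.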

This is an extreme and discrete but common example; I argue it is also factual: a recurring pattern at both individual and system levels, evident in matters big and small.

In summary, the spiral is a generally defensive response to external factors which produces a narrow frame for processing the internal and external world. This is the bottom of the spiral: an entrenched, dug-in position which can be easily overrun; a state of unpreparedness for change.

The spiral applies equally to groups and organisations and to individual decision making. A well-documented example at the group level is Janis’s (1989) investigation of the Kennedy administration’s decision-making process around the Bay of Pigs invasion (1961), where the mental model for assessing the crisis was based on a very fixed view of Castro’s and the Cubans’ mental model. Evidence and contradictions that challenged this fixed view were rejected; the spiral had taken hold as the frame of reference became increasingly narrow and defensive.

I would suggest the chances of becoming locked into the spiral at the individual, group and organisational level have increased with the amount of information available. Once a position has been adopted, it is now relatively easy to find information to support it and to explain away contradictory evidence. In other words, huge increases in data can function to narrow the frame and increase the risk of the spiral. This is a paradox: more data is touted as producing better decisions, yet in practice more data narrows the frame of reference, because people instinctively lock onto a pattern as a means of heuristically managing the data and then seek to justify it.
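
A toy simulation makes the paradox concrete. In the sketch below (the discount factor and stream size are my own illustrative assumptions), an unbiased reader and a confirmation-biased reader process the same genuinely mixed evidence stream; the more data arrives, the further the biased reader’s confidence drifts from the truth:

```python
import random

# Illustrative sketch of the data paradox: the evidence stream is
# genuinely mixed (roughly 50/50 for and against an adopted
# position), but a reader who discounts contradictory items grows
# MORE confident as more data arrives. All numbers are assumptions.

random.seed(1)
evidence = [random.choice([+1, -1]) for _ in range(1000)]

def cumulative_confidence(stream, discount_contrary=1.0):
    """Running support for the adopted position (+1 = supporting).

    discount_contrary < 1.0 models "explaining away" contradictory
    evidence; 1.0 weighs all evidence equally.
    """
    score, trace = 0.0, []
    for item in stream:
        score += item if item > 0 else item * discount_contrary
        trace.append(score)
    return trace

unbiased = cumulative_confidence(evidence)
biased = cumulative_confidence(evidence, discount_contrary=0.2)
for n in (100, 500, 1000):
    print(f"after {n:4d} items: unbiased={unbiased[n - 1]:7.1f}  "
          f"biased={biased[n - 1]:7.1f}")
```

The unbiased score hovers near zero; the biased score grows with every batch, so more information produces a narrower, more confidently defended frame.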

A further example of narrow framing can be found in Wilson and Gilbert’s (2003) work on affective forecasting. Affective forecasting is predicting how future events will make us feel, and Wilson and Gilbert demonstrated experimentally that people significantly overestimate or underestimate the emotional outcomes of future events, whether positive or negative. Although an affective forecast can start from a positive emotional position (for example, “buying this new car will make me very happy”), the prediction is still rigorously defended against threats which might suggest the car will not make the predictor as happy as they think. In other words, the predictor is likely to dig in and defend their position.
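
To make the bias concrete, here is a minimal numerical sketch of the “new car” example (the happiness scale, boost size and adaptation rate are invented for illustration, not taken from Wilson and Gilbert’s data): the forecast assumes the emotional boost will last, while actual affect adapts back toward baseline, so the forecast error grows over time:

```python
# Minimal illustration of the impact bias in affective forecasting.
# All values are invented for demonstration purposes only.

BASELINE = 5.0         # everyday happiness on a 0-10 scale
PREDICTED_BOOST = 3.0  # "this new car will make me very happy"
ADAPTATION_RATE = 0.5  # hedonic adaptation: the boost halves each month

for month in range(7):
    predicted = BASELINE + PREDICTED_BOOST  # forecast: the boost lasts
    actual = BASELINE + PREDICTED_BOOST * (ADAPTATION_RATE ** month)
    print(f"month {month}: predicted={predicted:.1f}  "
          f"actual={actual:.1f}  error={predicted - actual:.1f}")
```

The gap between forecast and outcome widens month by month, yet, as above, the original prediction tends to be defended rather than revised.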
The spiral can also be located within science. Kuhn’s (1962) work on paradigms argued that scientific discovery was as much the product of political and power struggles as of planned, well-executed lab experiments (although these clearly have a role!). The physician and epidemiologist John Snow’s work on cholera (Hempel, 2013) illustrates this well. Although Snow demonstrated in 1849 that cholera was spread not by foul air but by oral contraction, his findings were rejected by the then UK government on the grounds of being publicly unpalatable. Galileo’s work on heliocentrism, and the controversy it caused as the Catholic Church faced having to reconsider its then-current mental model of the relationship of the Earth to the Sun, is an even starker example, but it illustrates the principle well: mental models can become dug in and defended vehemently against all perceived threats, regardless of the quality of contrary evidence.

A refusal, despite evidence to the contrary, to adjust from a current scientific mental model to a demonstrably improved one is well explored by Klein (2011) in his research into insight. Thinking can become so routinised and taken for granted that sometimes only a surprise or shock can adjust the mental model, literally kicking the thinking out of the bottom of the spiral. Kahneman (2011) identifies something similar when he observes that “you can’t teach psychology”. The argument is that a person can agree with and understand a theory and/or principle in one domain (the classroom) and then go away and completely ignore its application in another domain, unless the person is in some way surprised. For example, the statistical statement “20% of people who cross the road outside this building we are in get run over” does not result in more cautious behaviour, but the statement “Jane’s friend Jill got run over outside this building last week” does result in such a change.

Something we encountered in our recent research (and covered in a recent article) was the degree of trust placed in colleagues during clinical decision making in teams. When the teams remained intact (the norm/routine) things went well. The risks in this arrangement emerge when the team is disrupted by routine breakers such as sickness. The extent of these trust dependencies remains largely unknown to the clinical teams, and the routine has become something in which a large degree of faith is placed. The teams were, however, in a spiral, a narrow frame, and the role of the research is to “kick them out” (surprise the team) and get them to appreciate some of the risks this trust routine carries, whilst keeping intact the practices which work well.

What about a specific example from the domain of business? The subprime mortgage market is a good one. Despite repeated, and scientifically valid, warnings from many observers, the banking industry and some members of the economics community simply explained the warnings away or ignored them; everyone seemed to be getting richer at the time, and that was enough defensive evidence. This was a classic spiral: the frame was so narrow that nothing could kick it out other than a crash, and even then…
So, what can be done to monitor, evaluate and kick people, groups and organisations out of the spiral? This is what we are currently proof-of-concept testing: a toolkit which identifies spiral positions and selects the most appropriate method for the “kick out”.

Some Kick-Out Methods

Visualisation: life-size simulations of the environments in which people actually practise have huge potential benefits in identifying expert/novice differences in decision making and in how tacit skills are acquired and applied. Defence, system requirements acquisition, health, data analysis and inventory/construction site management can all benefit from the coding of tacit decision making at expert/novice levels; it can deliver in hours performance gains that might otherwise have taken years.

Assessing Reasoning Frames: experimentally testing how people reason at different levels of an organisation benefits those seeking to manage change (for example) more effectively. I would argue that a crucial aggravating factor in why change projects fail is the difference in reasoning frames (particularly moral reasoning) between planner and implementer levels.

Affective Forecasting Applications: exposing decision makers and forecasters to data and knowledge elicited from those “who have lived their future” significantly increases forecast accuracy (Wilson and Gilbert, 2003). We’re currently researching how this knowledge should be represented to decision makers in a commercial setting, but we know it will need to combine data with emotional, tacit experiences.

Prospective Hindsight: in all the research I have done I have never (to date) come across a more powerful method for generating insight and adjusting decisions, plans and strategies into more manageable, realistic forms. The method was developed and pioneered by Klein (2007), and in a nutshell it is this: imagine it is 12 months into the future and your plan has gone completely wrong, a catastrophe. Take five minutes and write down the history of this future catastrophe.

These are just some of the methods we are developing and using, but they are all based around giving the user insight: something which was hidden or unknown and which improves the mental model by broadening the frame of reference. In a future article I’ll provide an overview of how we assess an individual’s, group’s or organisation’s spiral position.

To summarise: routine, tradition, procedures and processes have their place; in fact, they are essential. But constant adherence to these same necessities in changing, uncertain and dynamic environments can plunge people and systems to the bottom of the spiral, often without any awareness. It’s worth checking regularly that today is similar enough to yesterday to make sure these staples are still relevant and not dragging you down.

References
Hempel, S. (2013). John Snow. The Lancet, 381(9874), 1269–1270.
Janis, I. L. (1989). Crucial decisions: Leadership in policymaking and crisis management. New York, NY: Free Press.
Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.
Klein, G. (2007). Performing a project premortem. Harvard Business Review, 85(9), 18–19.
Klein, G., & Jarosz, A. (2011). A naturalistic study of insight. Journal of Cognitive Engineering and Decision Making, 5(4), 335–351.
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago, IL: University of Chicago Press.
Wilson, T. D., & Gilbert, D. T. (2003). Affective forecasting. Advances in Experimental Social Psychology, 35, 345–411.
