Archive | Risk

Why Focusing On Catastrophe Is So Effective

3 Oct

Gary Klein’s pre-mortem technique has a long and effective history of improving forecasting, plans and decisions (Kahneman, 2011; Klein, 2007). The technique is incredibly simple, as the example below illustrates:

You and your team are about to agree a decision. Before you do so, imagine the decision has turned out to be a complete catastrophe. Everyone, working on their own, takes 5 minutes to write down the history of this catastrophe. Each individual history is then shared with the team.

I recently wrote about an interview featured on McKinsey Classic with Gary Klein and Nobel Laureate Daniel Kahneman. The two psychologists discussed the role of intuition in executive decision making. Naturally, the pre-mortem technique came up as a highly effective method of improving decisions.

The logic behind why the technique works so well has been covered several times in articles on this blog, and extensively across the research and corporate literature. However, Klein’s simple explanation in the McKinsey interview of what lies behind the technique’s success is incredibly insightful, and worth sharing:

“The logic is that instead of showing people that you are smart because you can come up with a good plan, you show you’re smart by thinking of insightful reasons why this project might go south. If you make it part of your corporate culture, then you create an interesting competition: “I want to come up with some possible problem that other people haven’t even thought of.” The whole dynamic changes from trying to avoid anything that might disrupt harmony to trying to surface potential problems”

Reading

Kahneman, D. (2011) Thinking, Fast and Slow. Penguin

Klein, G. (2007) The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work. Currency

http://www.mckinsey.com/business-functions/strategy-and-corporate-finance/our-insights/strategic-decisions-when-can-you-trust-your-gut?cid=other-eml-cls-mkq-mck-oth-1609

Human Judgement and Cognitive Computing

13 Sep

McKinsey have published an outstanding interview with Gary Klein and Daniel Kahneman. The interview is a reflection on Klein and Kahneman’s classic paper, Conditions for intuitive expertise: A failure to disagree (2009). Whilst the interview reflects on the two authors’ positions on intuitive decision making, the prime focus is executive judgement: is intuition a good basis for top-level business decision making? In this article I’ll briefly reflect on some of the key points raised by Kahneman and Klein, and on how aspects of cognitive computing could potentially support some of the authors’ suggestions.

Continue reading

When Expertise Works And When It Doesn’t

25 Jul

At this link is a Google talk delivered by psychologist and Nobel Laureate Daniel Kahneman. The topic of the talk is expert judgement in decision making, and Kahneman discusses the collaborative work he carried out with Gary Klein.

Continue reading

Our Future Selves and Decision Making

7 Jul

Below is a link to a TED Talk by the Harvard psychologist Dan Gilbert. The talk is entitled The Psychology of Your Future Self, and illustrates how we, as human beings, have the capacity to get our expectations of the future badly wrong. Gilbert addresses some key reasons why anticipations of future states can be so far adrift, and in this article I’m going to use these reasons to highlight how experience and imagination can significantly improve our ability to forecast, acquire expertise and make better decisions. But first, a small detour to ancient Greece.

Continue reading

5 Golden Rules for Flexible Project Management

13 Jun

I previously wrote about methods to improve two aspects of project management:

  1. Knowledge capture, and
  2. Communication

The focus on the importance of capturing knowledge and improving communication was inspired by a recent publication by Raconteur (Project Management, raconteur.net, #0376, 22/05/2016). Within this Raconteur publication was a piece entitled “The Five Golden Rules of Project Flexibility” (p.4), which provides the inspiration for this article.

Flexibility is essential for sustainable success, but it can be very difficult for human beings to think and behave flexibly. Below, I’ll outline some reasons behind this difficulty, before revealing Raconteur’s five Golden Rules.

Continue reading

Improving Project Management

6 Jun

The special interest publisher Raconteur recently produced a paper on Project Management (raconteur.net, #0376, 22/05/2016). Two recurring themes ran through the collection of articles:

  1. The importance of knowledge capture in project management
  2. Effective communication when managing complex projects

As both of these themes frequently appear throughout the articles and research featured on this blog, it’s a good opportunity to share methods we’ve applied to improve knowledge capture and communication across projects, and with teams and organisations. So firstly, knowledge capture.

In his Raconteur contribution, Jim McClelland addressed the importance of knowledge capture in an article entitled “Mindset and not toolset - it’s all about people…”. If a project is complex (multiple partners, sites, borders, regular surprises, a changing environment etc.) then the degree of learning is potentially high. Managing a complex project involves multiple frontline adaptations as initial plans and strategies run into the “friction” of everyday life (Freedman, 2013).

The problem identified in the McClelland article is that the knowledge gained from managing project friction remains the tacit property of frontline workers and/or project managers. This problem is aggravated when project managers and key staff are transient: they move from project to project, company to company, taking their knowledge with them.

A potential solution is the implementation of methods which capture knowledge. This means frequently capturing and evaluating what project managers and frontline workers notice and prioritise in a work situation, how they notice contradictions to initial plans (when something starts to go wrong), how the contradictions are made sense of, and what adaptations take place to course-correct or innovate around problems (see Starbuck, 2001; Klein, 2007; Rankin et al., 2014 for examples of this in various project environments).

Below is a quick example of such a method, a debriefing questionnaire, focused on capturing knowledge in a fast-paced project environment. The questionnaire is designed to capture problem solving, with an emphasis on changing expectations, situation recovery and risk analysis:

What did you notice?

What surprised you?

What did you do?

How would you advise someone else to tackle a similar situation?

What should they avoid doing?
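
For teams that want to use a questionnaire like this routinely, it helps to capture every debrief in the same structured form so responses can be compared and revisited across projects. Below is a minimal sketch, in Python, of one way that might look; the record structure, field names and the run_debrief helper are illustrative assumptions rather than part of any published method.

```python
# Illustrative sketch: capturing the debrief questionnaire above as structured records.
# The field names and the run_debrief() helper are assumptions, for illustration only.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

DEBRIEF_QUESTIONS = [
    ("noticed", "What did you notice?"),
    ("surprised", "What surprised you?"),
    ("actions", "What did you do?"),
    ("advice", "How would you advise someone else to tackle a similar situation?"),
    ("avoid", "What should they avoid doing?"),
]

@dataclass
class DebriefRecord:
    respondent: str
    project: str
    timestamp: str
    answers: dict  # question key -> free-text answer

def run_debrief(respondent: str, project: str) -> DebriefRecord:
    """Ask each question at the console and return a structured record."""
    answers = {key: input(f"{prompt}\n> ") for key, prompt in DEBRIEF_QUESTIONS}
    return DebriefRecord(
        respondent=respondent,
        project=project,
        timestamp=datetime.now(timezone.utc).isoformat(),
        answers=answers,
    )

if __name__ == "__main__":
    record = run_debrief("J. Smith", "Site handover")
    # Append to a JSON Lines log so debriefs accumulate and can be reviewed later.
    with open("debriefs.jsonl", "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")
```

Keeping the questions and answers together in one record is the point: the value comes from reviewing the accumulated answers over time, not from any single form.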

The second theme is communication. I would argue that a key component of effective communication is shared sense making: being able to shadow the thinking of someone else (see Klein et al., 2013 for examples).

Communication between different partners, professions, organisations, sites etc. is inherently problematic. The instruction “fast”, for example, has a lot of potential responses, all dependent on individual sense making. Unless organisations, teams and individuals develop methods which allow the intention behind plans to be fully understood, regular problems will occur.

Communication problems are particularly acute when teams are distributed across organisations, geography, professions etc. As discussed earlier, plans encounter friction and need to be adapted. Projects run more efficiently when these adaptations are carried out by frontline workers with intimate knowledge of the current situation and a real-time view of what’s going on in the environment. Without clear intentions, adaptations can be completely out of kilter with the plan, or frontline workers lack the clarity to deal with a problem and keep referring back to management for further instructions.

Both of these intention problems have significant consequences for the success and safety of a project. For example, adaptations in the wrong direction can correct a local deviation but weaken the broader project. A project may be running over budget and a team leader is told to cut costs. During this period an engineer gets a new job and leaves the project. To save money, the team leader decides not to re-recruit and adapts by carefully reorganising the remaining team members’ roles. Short term, it’s a success. Then an unexpected event occurs and the team lacks the flexibility to absorb and correct the shock.

The other side of the problem occurs when no adaptations take place. This happens when frontline workers lack the clarity to adapt to changes in local circumstances. As a result, when an unexpected event or obstacle occurs on a project, instead of applying initiative, the frontline seeks instructions from further up the hierarchy. This eats into time, reduces the number of available options for tackling the problem, and places responsibility in the hands of someone who is nowhere near the situation and has only a limited understanding of it. All of these issues create extra demands and increase management pressure, destabilising the project further.

Communication problems can be avoided by applying methods which calibrate sense making. A useful method of communicating intent is a script developed by Karl Weick (see Weick and Sutcliffe, 2007 for examples). Below is a version of Weick’s intent script, similar to versions we’ve used in our work with clinical decision making and organisational change:

This is what I think we face

This is what I think we should do

These are the reasons why

This is what we need to look out for

Now talk to me
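
To make the script a routine habit rather than a one-off, the five statements can also be treated as a simple reusable template for written briefings. Below is a minimal sketch in Python of one way to do that; the IntentBrief class, its field names and the example content are illustrative assumptions, not Weick’s own formulation.

```python
# Illustrative sketch: treating the intent script above as a reusable briefing template.
# The IntentBrief class, field names and example content are assumptions for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class IntentBrief:
    situation: str        # "This is what I think we face"
    proposal: str         # "This is what I think we should do"
    reasons: List[str]    # "These are the reasons why"
    watch_for: List[str]  # "This is what we need to look out for"

    def render(self) -> str:
        """Assemble the five statements into a single briefing message."""
        return "\n".join([
            f"This is what I think we face: {self.situation}",
            f"This is what I think we should do: {self.proposal}",
            "These are the reasons why: " + "; ".join(self.reasons),
            "This is what we need to look out for: " + "; ".join(self.watch_for),
            "Now talk to me.",
        ])

brief = IntentBrief(
    situation="the supplier delay puts the installation two weeks behind",
    proposal="re-sequence the fit-out and bring the testing phase forward",
    reasons=["testing has no supplier dependency", "it protects the handover date"],
    watch_for=["fatigue in the testing team", "any further supplier slippage"],
)
print(brief.render())
```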

This article has featured quick examples of how to improve knowledge capture and communication. I would strongly agree with McClelland that successful project management is a mindset. I would also add that applying simple methods designed to collect knowledge and improve communication develops and supports the best conditions for project management success.

Reading

Starbuck, W.H. Hedberg, B. (2001) in Dierkes, M. Berthoin Antal, A. Child, J. Nonaka, I. (eds.) Handbook of Organizational Learning and Knowledge. Oxford University Press

Klein, G. (2007) The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work. Currency

Rankin, A. Woltjer, R. Rollenhagen, C. Hollnagel, E. (2014) Resilience in Everyday Operations: A Framework for Analyzing Adaptations in High-Risk Work. Journal of Cognitive Engineering and Decision Making, 8(1), 78–97

Freedman, L. (2013) Strategy: A History. OUP USA

Klein, G. Hintze, N. Saab, D. (2013) Thinking Inside the Box: The ShadowBox Method for Cognitive Skill Development. International Conference on Naturalistic Decision Making 2013, Marseille, France.

Why Instructions Rarely Get Followed

19 Apr

All situations involve change. Yet most instructions, plans and procedures are static, and bear little resemblance to how frontline workers actually perform and behave. This could be because most instructions, plans and procedures assume that frontline workers passively process data. Instead, frontline workers interact with data in dynamic ways which adapt plans and instructions to meet the challenges of specific situations. This can leave what actually works well in an organisation invisible.

Continue reading

Lessons from an Oil Rig for Change and Leadership

11 Apr

A recent opinion article (full article here) on Oilpro.com highlights the need for safety culture to address human needs. Oil rigs are incredibly high-risk environments where decision making, experience, expertise, procedures and leadership are essential to staying safe and completing tasks effectively. However, the Oilpro.com article identifies a trap people and organisations can easily fall into when confronted with high-risk environments: the belief that greater levels of procedures, micromanagement and bureaucracy reduce mistakes, errors and accidents. This trap contains the implicit assumption that narrowing the capacity for people to exercise judgement improves performance. However, this approach has negative consequences for safety, leadership and managing change, whilst building up potentially much bigger problems for the future (see Taleb, 2012, for a good overview of this concept).

The Oilpro.com article addresses how overuse of procedures and technology dulls the attentiveness of human operators. The author contrasts two types of oil rig work to make the point:

“Rig roughnecks and roustabouts repeat the same procedure over again for 12 hours straight without mistake, partly because the type of work has enough mix of eye, hand, foot and body movement to keep their mind occupied”

Compared to

“…systems operators no longer have a chance to patrol the plant; they must sit still in the same spot, staring into computer screens for hours at a time.”

The first description provides an opportunity for frontline workers to develop sense making of complex, non-routine situations, situational awareness and role expertise. These types of roles create tacit skills, as frontline workers develop fine-tuned heuristic methods of problem solving.

The second description represents distance between the human worker and their environment. This situation creates the opposite conditions for developing expertise. The human worker is cut off from the quality environmental feedback stemming from decisions. This separates the worker from the fine discriminations available from being directly on the frontline, leaving them with nothing to do but follow abstract procedure.

When an organisation introduces systems which potentially wash out expertise, it also dulls the organisation to small deviations in routines which could lead to large consequences, along with the methods to tackle these deviations in their infancy. It’s similar to beautifully redecorating a building whilst removing all the fire detectors and fire extinguishers. Things look a lot better to the casual observer, but are in fact significantly more dangerous. The improvements are superficial and aesthetic.

The introduction of systems which separate workers from the opportunity to develop expertise also reduces improvisation and innovation, and this has significant consequences for leadership. All procedures and routines will hit their limit, and when they do, they often require intuitive leadership: people who use experience and expertise to take control of a situation, and who have the confidence and mandate to improvise when necessary. When procedures take precedence, problems migrate up the hierarchy. This leaves frontline workers waiting for instructions from someone separated from a developing situation, and robs them of the opportunity to develop and practice leadership.

The article continues

“Leaders know that people want a sense of control of their work environment. Workers don’t want top down edicts telling them how they MUST conduct their work, to be announced by a memo stuck on a notice board. Work procedures must be constantly questioned, reviewed and modified”.

If leaders are looking to effectively manage change and build an adaptive (and resilient) organisation, then the experiences and expertise of frontline workers need to play a central role. Change works most effectively when it supports and enhances what people do well, rather than washing it out with top-down, abstract systems and structures.

To achieve this, frontline workers need to operate in conditions which allow them to develop meaningful expertise (see above), and this expertise then needs to be collected, analysed and used to direct change, and to constantly adapt, review and modify procedures. This places frontline experience at the heart of change. And this type of change provides people with clear meaning and significance: it’s their skills and experience which drive the direction.

These arguments do not call for the abandonment of procedures; they have enormous value. Nor do they call for constant disruptive change in which nothing can get done effectively. Instead, they call for natural human strengths to be supported and enhanced by placing frontline workers in contact with the consequences of their decisions, allowing them to develop expertise, and giving them the authority to lead and improvise.

If expertise is collected and analysed in a way which doesn’t increase workload (beyond the minimal), then it can be used to modify, support and enhance procedures, systems and structures. And this leads to greater levels of safety, innovation and motivation. This way, the organisation is shaped and changed by experience, not by abstract theory masquerading as rigorous micromanagement.

Reading

http://oilpro.com/post/23150/opinion-top-down-safety-culture-fails-to-address-human-needs?utm_source=DailyNewsletter&utm_medium=email&utm_campaign=newsletter&utm_term=2016-03-15&utm_content=Feature_1_txt

Taleb, N. N. (2012) Antifragile: Things That Gain from Disorder. New York: Random House.

Labels and Accidents

7 Mar

Organisations, projects and people who operate in dynamic, high-risk environments constantly need to update their understanding of a situation. The reason is that dynamic, high-risk environments constantly change and continually surprise.

Fighting a fire, building a hospital or managing diverse projects are all environments where plans and expectations become derailed by reality. Scanning an environment for even the smallest deviation from plans and expectations can ensure that small incidents do not explode into catastrophes. However, one of the biggest barriers to scanning and updating a dynamic, high-risk environment is the set of techniques we use to simplify our world and make it more manageable: plans and labels. This article discusses how plans and labels can turn a dynamic situation into a potentially dangerous one.

Continue reading

“The Power of Negative Thinking”

23 Feb

Positive thinking only gets you so far. It’s negative thinking which really defines success. This is the argument put forward in an interview between Canadian astronaut Chris Hadfield and The Red Bulletin (Red Bull’s magazine). Hadfield explains the point:

“Self-help gurus are always advising us to think positively and envisage success, but it’s about as helpful as thinking about cupcakes. Just thinking about them isn’t going to help. It’s more important to think what could go wrong with a mission. Visualize failings, not success. That’s what’s essential to survival as an astronaut. I was an astronaut for 21 years, but I only spent six months in space. The rest of the time, I was looking into every detail that might have gone wrong during a mission. Once you’ve understood all the potential risks and you’re forewarned against them, fear no longer plays a part in your thought process”

In my research, and the research I draw upon, this argument runs like a red thread through accounts of decision making, planning and adaptation. For example, Crandall et al. (2006) argue that, across a variety of professional fields, experts have a far greater knowledge of “what could go wrong” with decisions, plans and strategies than less experienced and accomplished staff.

Weick and Sutcliffe (2007), in their analysis of resilient organisations, which includes NASA, identify that resilient organisations have an obsession with the question “what could go wrong?” In other words, they are prepared for failure and far more likely to learn from it.

In Jim Paul’s account (written with Moynihan, 1994) of lessons learned losing large sums of money on the trading floor, the authors cite “avoiding losses” as the most significant strategy for success. By focusing on failure, on what NOT to do, the chances of success significantly increase because, at the very least, a trader will stay in the game longer.

The “power of negative thinking” counterintuitively increases confidence, as people, teams and organisations are far better prepared for, and more positive about, their ability to absorb failure and adapt. I’ve researched and seen the above manifest in fields as diverse as clinical decision making and construction site management (examples are here).

Weick (2009) refers to the ability of an organisation to adapt through adverse circumstances as having “requisite variety”. Requisite variety is the sum of an organisation systematically learning from failure, then analysing and sharing the lessons. A learning organisation focused on “negative thinking” creates a reservoir of responses, both formal and tacit, which can be applied to complex, surprising and uncertain events. Chris Hadfield, in the quote below, sums the concept up perfectly:

“I never experienced any fear when I got into a spacecraft— not because I was brave, but because I’d practiced solving every problem, thousands of times. Being well prepared makes all the difference. It minimizes any fear and gives you confidence”.

Reading

Paul, J. Moynihan, B. (1994) What I Learned Losing a Million Dollars. Columbia Business School Publishing

Crandall, B. Klein, G. Hoffman, R. (2006) Working Minds: A Practitioner’s Guide to Cognitive Task Analysis. The MIT Press

Weick, K. Sutcliffe, K. (2007) Managing the Unexpected: Resilient Performance in an Age of Uncertainty. Jossey-Bass.

Weick, K. (2009) Making Sense of the Organization, Volume 2: The Impermanent Organization. John Wiley & Sons

The Chris Hadfield interview with The Red Bulletin is at the link below:

https://www.redbulletin.com/us/us/lifestyle/astronaut-chris-hadfield-explains-the-power-of-negative-thinking