Tag Archives: Data Analysis

How Polls May Have Missed President Trump

10 Nov

The USC/LA Times Daybreak tracking poll was one of the few polls that predicted the outcome of the US presidential election correctly. The article at the link below (by David Lauter) outlines why this poll may have succeeded rather than just got lucky. The most intriguing aspect, from my perspective, was that the poll weighted for “undercover voters”.

Undercover voters are a variable not explored by the majority of polls, but they were explored by the USC/LA Times Daybreak tracking poll. This group is a segment of the population who did not vote in 2012; the poll asked them- if you did vote in 2016, who would you vote for?

Undercover voters are also people who are not comfortable revealing their voting intentions to various groups of other people. For example, Trump voters were reported as being uncomfortable revealing their voting intentions during telephone surveys. If a poll weights for these and similar interrelations between variables- who didn’t vote in 2012 but could vote in 2016, and for whom; and the degree of discomfort in discussing voting intentions with a stranger- then the polling result starts to look different.

If we examine this relationship between Trump voting intentions and preparedness to admit voting intentions to a stranger, then it becomes easier to see how polls could be missing out on crucial data in the analysis.
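To make the weighting point concrete, here is a minimal sketch of how including a reweighted subgroup can shift a headline poll number. The shares and weights below are invented for illustration and are not the Daybreak poll's actual methodology.

```python
# Illustrative only: how weighting an "undercover" subgroup can shift a
# headline poll estimate. All numbers below are hypothetical.

def weighted_support(groups):
    """Each group is (population_weight, share_supporting_candidate)."""
    total_weight = sum(w for w, _ in groups)
    return sum(w * share for w, share in groups) / total_weight

# Conventional poll: only likely voters reached by phone.
conventional = weighted_support([
    (0.9, 0.46),
])

# Reweighted poll: adds 2012 non-voters who may turn out in 2016
# and, hypothetically, lean heavily towards one candidate.
reweighted = weighted_support([
    (0.9, 0.46),   # likely voters reached by phone
    (0.1, 0.60),   # "undercover" 2012 non-voters
])

print(round(conventional, 3))  # 0.46
print(round(reweighted, 3))    # 0.474
```

Even a small, strongly leaning subgroup moves the estimate by more than a typical polling margin, which is the crux of the article's argument.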

The results illustrate the value of a broad perspective when analyzing data and drawing conclusions. Examining not just voter intention, but the relationship of the voter with their chosen candidate (is it a candidate they are comfortable discussing with strangers?) can potentially reveal more accurate polling results. This principle underpins so much of systems thinking in decision making. Knowledge of relationships and interactions can frequently beat the sheer number of variables.

Reading

Link to Article by David Lauter

Why Instructions Rarely Get Followed

19 Apr

All situations involve change. Yet most instructions, plans and procedures are static, and bear little resemblance to how frontline workers actually perform and behave. This could be because most instructions, plans and procedures assume that frontline workers passively process data. Instead, frontline workers interact with data in dynamic ways which adapt plans and instructions to meet the challenges of specific situations. This can leave what actually works well in an organisation invisible.

Continue reading

The Rise of Unstructured Data

17 Feb

IBM’s website features an interesting statistic- 80% of new data is unstructured. This means that new data is largely in the form of blogs, tweets, white papers and articles, among many other types. The vast amounts of unstructured data could have, and are having, quite profound effects on the way people and organisations view and analyse “soft data”.

Continue reading

Stories, Data and Decision Making

3 Dec

Stories certainly set a scene. Stories enable the listener or reader to locate themselves within a context, and then see that context through the eyes of the storyteller. A story allows the listener and reader to shadow the thinking, problem solving and emotional recollections of the storyteller. In other words, a story allows two or more people to attach someone else’s experience onto their own experiences. Continue reading

Forecasting- Why you don’t get what you thought you would

5 Aug

Wilson and Gilbert’s (see Wilson et al, 2003) work on affective forecasting is not only fascinating but also incredibly powerful when it comes to improving judgements and decisions. With data and options in greater supply, but also risks, taking measures to improve the quality of forecasts, the basis of decisions, seems like a sensible step to take. I’ll briefly attempt two things in this article- explain what affective forecasting is and how it could be applied.

Affective forecasting, in a nutshell, is the prediction of one’s emotional state in the future. For example- this car will make me very happy. If a person goes ahead and buys the car, there’ll be an initial euphoria followed by a relatively quick return to the person’s baseline emotional state. So- yes, the car did produce some happiness, but not for as long, or as much, as predicted (see Kahneman, 2011). This is the bias in affective forecasting- the overweighting of the emotional effect of future events, for better or worse; another example- things rarely turn out, emotionally, as badly as predicted.

The bias in affective forecasting is extremely strong and influences decision making. It is also a difficult bias to shake off. Whether we like to admit it or not, our emotional states drive our decisions. In business, decisions are made all the time on predictions which are overweighted. This can be aggravated by “data based decision making”. Once a prediction is made, the sheer volume of data available makes it very easy to cherry pick data to support the prediction whilst explaining any contradictions away. So, how can you improve judgements by taking account of this bias?

Gilbert (2013) suggests consulting someone who has lived your future. I’ll give you an example- a friend of a relative was fixated on investing in a property deal. The relative, with decades of project management experience behind them, argued it was a very bad idea, but the friend dismissed it; they had affectively forecasted the outcome favourably. I suggested the friend first watch a week of the TV programme Grand Designs, where all sorts of people take on major building projects (you could probably get 10 episodes in a week), and make a note of how many projects come in on time and on budget, and of what the investors go through emotionally. Twenty-four hours, and two episodes of Grand Designs, later, my relative got a phone call explaining the friend was going to pull out of the deal.

So, when making a decision based on how you think things will turn out in the future, don’t consult bland data, information sheets or similar. Instead get a comprehensive account from someone who has lived that future; it will improve decisions both for the individual and a business. This could be as simple as watching the right documentary, and is potentially a better use of some of the massive databases we now have access to.

References
Wilson, T. D. & Gilbert, D. T. (2003). “Affective Forecasting”. Advances in Experimental Social Psychology, 35, 345–411
Kahneman, D. (2011). Thinking, Fast and Slow. Penguin
Gilbert, D. (2013). “Affective Forecasting… or… The Big Wombassa: What You Think You’re Going to Get, and What You Don’t Get, When You Get What You Want” (p. 45) in Brockman, J. (Ed.), Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction. Harper Collins

Decision Making- Some end of year field notes and thoughts

22 Nov

Health- How do experts produce outstanding care?

When we research people who work in health and are renowned for providing excellent care, the value of experience becomes apparent. Experts have developed rules of thumb which allow them to size up scenarios quickly, picking up cues and patterns; formulating a global picture of what is unfolding and how the scenario is likely to play out.

This cognitive process allows experts to operate beyond formal procedures and processes, anticipating potential problems and producing flexible plans, all with the goal of providing excellent care. For example, a highly experienced nurse may look first at the physical signs of a patient before looking at the patient’s chart, applying years of experience to recognise subtle signs and cues which may have been missed or may have recently developed. Simply following procedures and processes may miss cues which could be very meaningful to the patient’s care.

Information is crucial in health environments, but it loses its potential when it is not accompanied by analysis and sense making. Analysis and sense making are what separates health experts from novices, and so, when more information is introduced into an environment it runs the risk of being subjected to poor analysis.

Our recommendation- To enhance information’s value, first elicit the current knowledge of experts and ensure the information supports their experience-based analysis and sense making skills. If not, information runs the risk of drowning out the tacit skills which deliver outstanding care outcomes.

Decision challenges when using big data and new technology

We find that when a new technology aimed at improving decision making is introduced into a workplace, the technology runs into an immediate problem- world visualisation. World visualisation means that the introduction of a new technology fundamentally changes the role which it is meant to support. This creates a contradiction- the technology is designed to improve decision making in a job, but the technology fundamentally changes the job it’s meant to support. You’ve created a new job with a new set of cognitive challenges, some of which may remain hidden for some time.

And more significant questions- has the technology supported the decision making processes or created a new set of skill requirements? Does the technology support the existing skills of the best members of staff? Do the cues and patterns which the best staff notice become more apparent, or do they get drowned out by the noise? Hoffman et al (2003) point out that when pilots were introduced to sophisticated new autopilot systems, designed to make flying easier, a whole new set of cognitive tasks was created. How were they to integrate the new technology into their current methods of flying? How should they attend to the data? What should they pay more attention to, and when and where?

I read recently that no one needed to be instructed in how to use Google; however, this misses the point- Google fundamentally changed the way a lot of people searched for information. It created a new set of reasoning strategies. As researchers who use Google constantly, we always have to ask- how valid is this source? Where did it come from? What patterns are hidden in this text which could produce my next search terms? Not everyone thinks like this, and one of our tasks is to instruct our trainee researchers in the cognitive skills required to place the masses of information Google produces in context; without them we just have easy, cherry-picked answers.

For organisations like intelligence agencies this is a fundamental challenge; there is no shortage of data, but the sense making and analysis- the ability to join the dots creatively and effectively- is what prevents terrorist attacks. Data without knowledge of the cognitive skills needed to analyse it is worthless.

Concept Mapping- Integrating Technology into Decision Making

6 Nov

Whether it’s open source data, big data, BI or any similar technology, the key point of using any of these is to make “better” decisions. So, the present assumptions which underpin this point are that more data equals better decisions, and that with more data you stand a better chance of making discoveries.

The problem is, how do you effectively integrate these technologies into a current decision making process? Given the capacity for human error, even by experts, and that for any decision aid to be successful it MUST allow meaning to be drawn in a way which acknowledges current practices and culture, how do you leverage new technology as a decision aid?

This is where the marketing and industry reports tend to stop; they fail to address how you integrate new technology into the cognitive process of decision making. One such method is concept mapping, and it has been used to great effect in weather forecasting.

Concept mapping requires the elicitation and representation of the knowledge people draw upon, in a profession, to make a decision. It’s skilled work, as it aims to draw out tacit and semi-tacit knowledge- rules of thumb which people apply with regularity but remain largely unaware of. As an example, try tying your shoelaces whilst talking through the reasons for each step; it’s harder than you think!

The extracted knowledge is then formed into short concept nodes. Drawing from Crandall et al’s (2006) example, a concept for the weather forecasting domain, “Gulf of Mexico Effects”, would be simply “Fog”. The concepts are then assembled into a hierarchy and linked together by causal relationships. For example-

Node- Fog

Link- Leads to

Node- Rain

So, when you see fog, it has the potential to be an indicator for rain. The next step is to verify this indicator, to examine how applicable it is within this context; at this point it’s possible you’ll need more data.

The concept map will take you to a link which has the real time data feed (for example) of the current weather system. The map will also contain the concepts and links which other forecasters have used to confirm/disconfirm similar predictions. Our example forecaster may “discover” something new- a novel way of examining the data which produces a faster, more accurate test for assessing the potential of fog to produce rain. This gets added to the concept map and builds up the domain knowledge.
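The node-link structure above can be sketched as a simple data structure. This representation is my own illustration of the idea, not a tool from the concept-mapping literature; only the fog/rain node names come from the example.

```python
# A minimal sketch of a concept map: nodes joined by labelled causal links.
# The representation is illustrative, not Crandall et al's actual tooling.

class ConceptMap:
    def __init__(self):
        self.links = []  # each link is (source, relationship, target)

    def add_link(self, source, relationship, target):
        self.links.append((source, relationship, target))

    def indicators_for(self, target):
        """Which concepts link into the target node, and via what relationship?"""
        return [(s, rel) for s, rel, t in self.links if t == target]

weather = ConceptMap()
weather.add_link("Gulf of Mexico Effects", "produces", "Fog")
weather.add_link("Fog", "leads to", "Rain")

print(weather.indicators_for("Rain"))  # [('Fog', 'leads to')]
```

A forecaster who “discovers” a new indicator would simply add another link, which is how the map accumulates domain knowledge over time.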

Concept maps are a method of integrating expertise, current decision making frameworks and new technology. They capture rules of thumb, case studies and analytical tips, and most importantly they capture and integrate discoveries. As the user learns, so does the organisation, and- from my perspective the most important point- the method supports what people currently do well and doesn’t drown out expertise via a technology change programme.

The More Information the Worse the Decision

24 Sep

A great article by Marty Kaplan at the link below, in which he reviews recent decision research out of Yale. In summary, the research concludes that when a person is confronted with data which goes against their beliefs, they’ll simply ignore or alter the information to suit those beliefs (Kahan et al, 2013). With the vast amounts of data and information available at all levels of an organisation and society, this might well pose a problem for the purity of data-informed decision making.

Increasing information has proven in many studies to be detrimental to decision making; the Kahan study is one of the latest. Big data sets have added a new dimension to this- with so many correlations available, you can “prove” whatever you like. You don’t have to just dismiss facts anymore; you can simply move on to some more which suit you better. There are many psychological drivers behind this- it’s quite instinctive, and it’s much easier to argue that you’re right rather than wrong. So, without throwing the baby out with the bathwater, what is the best way to utilise information effectively rather than with prejudice?
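The “prove whatever you like” point is easy to demonstrate with a toy simulation: generate enough purely random variables and some pairs will correlate strongly by chance alone. The numbers of metrics and observations below are arbitrary choices for the demonstration.

```python
# Toy demonstration: with many unrelated variables, the strongest pairwise
# correlation looks impressive despite there being no real relationship.
import random

random.seed(42)

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# 40 unrelated "metrics", each observed 20 times: 780 possible pairs.
metrics = [[random.gauss(0, 1) for _ in range(20)] for _ in range(40)]

best = max(
    abs(pearson(metrics[i], metrics[j]))
    for i in range(len(metrics))
    for j in range(i + 1, len(metrics))
)
print(round(best, 2))  # typically well above 0.5, from pure noise
```

The more metrics an organisation collects, the more of these chance “findings” are available to cherry pick, which is exactly the trap the Kahan research describes.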

Information takes many forms- visual, auditory, statistical and so on- so one way to explore effective use of information is to look across to domains which are fast paced, constantly changing, saturated with data and require multiple decisions. Klein et al’s study of fire-fighters springs to mind; being crudely simplistic, the fire-fighters made good decisions through highly sophisticated and rapid analysis. The quick lesson- when using data to make decisions, imagine it’s a burning building; raise the stakes and ask yourself- what could go wrong here?

http://www.alternet.org/media/most-depressing-discovery-about-brain-ever

Confirmation Bias and Data Analysis

16 Sep

In the autumn edition of IT NOW, Adam Davison writes a very interesting article on Big Data. One point I particularly picked up on was the risk of confirmation bias when analysing data; Adam questioned what processes need to be in place to ensure this doesn’t happen. A little context- the confirmation bias is the human tendency to look for information which confirms our beliefs, whilst explaining away anything which might contradict those beliefs.

The confirmation bias, or fixation, can be a serious issue in data analysis and decision making; the dots simply don’t get joined, or the discoveries don’t get made. A counter to it is not bias-removal software or procedures but an increase in situational awareness that endures across time (fast and slow) and across scenarios. A focus on increased situational awareness will beat any process or procedure and greatly increase analytical rigour.

Finding Data Analysts in Unexpected Ways

1 Sep

In this article I’m taking a look at how data from seemingly mundane sources can be used to create situational awareness, improved decisions and increase organisational wisdom.

I’ve recently been doing some research/consultancy with a health-based organisation around rolling out a new management program. The program has implications for how numerous teams work, collect and analyse data. The research involved the application of a prediction/anticipation methodology to identify the domain the teams would be operating in, and from this the development of means to maintain situational awareness in a changing environment. As I analysed some of the challenges the program faced, I was once again struck by how many people involved in any project which aims to extend across a timeline identify the same hotspots- changing priorities, project creep, differing agendas, a volatile political climate and so on. This is probably very familiar to any reader, but it illustrates the complexity and ambiguity of people, organisations and the future.

The future is so complex and so overwhelming that change can be perceived as threatening; it most frequently gets viewed in terms of losses. Most people have not been trained in how to manage complexity; they are instead exposed to strategies which aim to identify and remove threats. The vulnerability of linear, integrated plans is well documented- you simply don’t know where the biggest risks will come from, and this locks plans into pass/fail situations. An alternative is more flexible plans with adaptable goals, with the biggest emphasis placed on developing team and individual situational awareness.

I’ll return to my email study to illustrate (apologies for the repetition, but I hope it proves the point), so I’ll just recap what this was- a two-year investigation of 16,000 organisational emails to explore how power and influence were communicated electronically. There was one very successful respondent during the study who was able to “get things done quickly” via email by adapting his email style to match the style (an insight into the mental model) of whoever he was communicating with. The methods used to achieve this can be summarised below-

Recognise the pattern of the email sender (for example- opens with Hi, closes with Cheers)
Copy the pattern of the email sender (for example- reply with Hi and close with Cheers)
Adapt the pattern to establish a rapport or shared pattern
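The steps above can be sketched as a toy pattern-matcher. The parsing rules and function names are invented for illustration; the study's respondent did this by eye, not with software.

```python
# Toy sketch of the recognise/copy/adapt steps for email style matching.
# The "pattern" here is just the greeting and sign-off; real style matching
# is of course richer than this.

def recognise_pattern(email_text):
    """Step 1: treat the first and last non-empty lines as the sender's pattern."""
    lines = [l.strip() for l in email_text.strip().splitlines() if l.strip()]
    return {"greeting": lines[0], "signoff": lines[-1]}

def mirror_reply(pattern, body):
    """Steps 2 and 3: copy the sender's pattern and adapt it around our own body."""
    return f'{pattern["greeting"]}\n\n{body}\n\n{pattern["signoff"]}'

incoming = """Hi,

Could you send over the figures?

Cheers"""

pattern = recognise_pattern(incoming)
print(mirror_reply(pattern, "Figures attached."))
# Hi,
#
# Figures attached.
#
# Cheers
```

The point of the sketch is the respondent's insight: the incoming email is data about the sender's mental model, and the reply is adapted to it rather than written in a fixed house style.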

Respondents in the study who weren’t able to achieve the completion of tasks/requests via email presented the following-

Did not recognise the sender’s email as a pattern which could provide insight into the sender’s mental model
Remained fixated on a single email pattern
Demonstrated no adaption

The successful respondent used email as data which was then used to lever an analogue; he viewed the email as a basis for decision and then adapted, maintaining situational awareness. Now, to return to my original topic of linear plans and perceived political futures- the reason these (linear plans) frequently do not meet expectations is that the people subject to the plans do not have enough available analogues. The plan or strategy aims to remove threats from a complex environment (very difficult), making people passive.

The proliferation of data and electronic communication has created vast amounts of adaptable analogues- if you look. What might seem like just text could in fact be valuable data which takes the analyst beyond passivity and on to proactive decision making through situational awareness. Increased visibility has meant data is now, to quote Heidegger, “at hand”. Taking an analytical perspective on what is “at hand” can produce the means to lever political and sensitive situations, to broaden domain understanding and to improve situational awareness. Quite often we don’t need more information, just an analytical perspective and understanding.

The subject of top-down power and order has been extensively researched and written about- the people at the top control the people down below. Strategies are a by-product of this thinking, the use of language and formal authority creating an impression of control and order over the environment. The complexity of life derails this thinking, but the ability to use what is “at hand” to analytically navigate the environment is an underused resource. Encouraging people to be more analytical, instead of data/strategy consumers, will produce more analogues. The creation of more analogues will increase the problem solving initiative of people. When this starts happening, goals can be reached through adaption rather than control. The by-product is a vast increase in organisational wisdom- the ability to anchor, simulate and adapt through complexity.