Archive | Big Data

The Rise of Unstructured Data

17 Feb

IBM’s website features an interesting statistic: 80% of new data is unstructured. This means that new data largely takes the form of blogs, tweets, white papers and articles, among many other types. The vast amounts of unstructured data could have, and are having, quite profound effects on the way people and organisations view and analyse “soft data”.


Keeping an Open Mind- Possibly a Bad Decision

14 Jul

This article seeks to explain the problems of keeping an open mind, and to offer a few tips on how to avoid the cognitive mazes an open mind can lead to.

Several articles ago, I said I would be covering the three frames of mind which determine the quality and relative success of decision making, managing uncertainty, and strategy. The three frames of mind were closed, adaptable and open; the last of these I’ll discuss below.

Keeping an open mind, on the surface, sounds like a good idea. An open mind suggests that you’re keeping your options open, not being rushed into making a decision, and carefully weighing up evidence. On the other hand, being open minded can also mean being uncommitted, and this results in a great many options being identified but none of them being explored. This returns us to the Rudolph (2007) study of clinical decision making. To recap, Rudolph identified three frames of mind used to generate decision making strategies when medical professionals were confronted with a complex and changing clinical scenario: closed, where a diagnosis was made and any contradictions were explained away; adaptable, where a diagnosis was made, explored, and adapted as new information presented itself; and open, where a professional flitted from one diagnosis to the next, never fully testing a single hypothesis. Rudolph referred to an open mind as “vagabonding”. Given the array of options which face all of us every day, whatever type of decision we are looking to make, plus the masses of data which can be used to generate even more options, an open mind can become a maze; anyone is susceptible to vagabonding, regardless of situation or task.

Herb Simon, the economist, psychologist, AI pioneer and Nobel laureate, highlighted that there are essentially two types of decision making strategy available to people: with potentially masses of data to process, we can either satisfice or maximize. Simon argued that it is not possible to weigh up all the many options and their consequences when we are confronted with decisions. No one is capable of reading the future, so our anticipation of how decisions will play out is based on limited knowledge; and because of this limited knowledge, a certain amount of intuition, or gut feeling, comes into the equation.

As a result, Simon found that people would, to some degree, follow that gut feeling and satisfice: choose an option which was “good enough”. However, problems occur when people instead seek to maximize: endlessly search for non-existent perfect solutions. If we now link the work of Simon to Rudolph, we can see that the medical professionals who selected a hypothesis which seemed plausible and then tested it (the professionals Rudolph categorised as adaptable) satisficed. The satisficers made a decision, tested it and adapted it.

By contrast, we can see that the medical professionals who maintained an open mind sought to maximize, pursuing a perfect diagnosis whilst making no diagnosis at all. Satisficing as an effective method of decision making has been observed in other domains, among them Klein et al’s (1989/2009) study of firefighter decision making. Instead of weighing up options when confronted with a blaze, firefighters would intuitively select the first “good enough” option and adapt it. Applying an open mind to such a situation would significantly delay action rather than improve the outcome.
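The gap between the two strategies can be sketched as a toy simulation. This is purely illustrative, not anything from Simon or Klein: a satisficer stops at the first option that clears a “good enough” bar, while a maximizer inspects every option looking for the single best one.

```python
import random

def satisfice(options, good_enough):
    """Take the first option whose payoff clears the 'good enough' bar."""
    for checked, payoff in enumerate(options, start=1):
        if payoff >= good_enough:
            return payoff, checked
    # Nothing cleared the bar: fall back to the best seen (still a decision).
    return max(options), len(options)

def maximize(options):
    """Examine every option in search of the single best payoff."""
    return max(options), len(options)

random.seed(1)
options = [random.random() for _ in range(10_000)]  # 10,000 choices to sift

s_payoff, s_cost = satisfice(options, good_enough=0.9)
m_payoff, m_cost = maximize(options)

print(f"satisficer: payoff {s_payoff:.2f} after checking {s_cost} options")
print(f"maximizer:  payoff {m_payoff:.2f} after checking {m_cost} options")
```

The satisficer typically commits after a handful of checks at a small cost in payoff; the maximizer pays the full search cost of every option, which is exactly the “endless search mode” described below.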

Although the majority of us do not put out fires or make clinical decisions, we may be at greater risk of becoming open minded maximizers, because there are even fewer boundaries to sift options. The vast amounts of data and choice we can factor into any decision can put us into endless search mode, able to locate many things but critically evaluate nothing. This encourages an open minded approach to decision making that is passive and results in being “stuck”. If the search for the perfect option involves discovering new information, which is no bad thing in itself, the process can become counter-productive as a faulty feedback loop gets inserted into it: the payoff of discovery becomes a reason to keep searching rather than to make a decision (search > find > search > find). If the priority is making a decision, don’t substitute searching for that priority because it’s easier in the short term. To put this in some context, below is a quote from a company I worked with which perfectly sums up the problems of an open mind whilst “marketing” the concept as a positive:

“We needed to make a decision about 9 months…where the company was heading. Any decision would have done, we (the company) just needed some leadership. Instead we got this-wait and see strategy. The leadership decided we needed to see more data, which the customer results in Q3 would tell us what we should do, that, and this is the best…that this would reflect that we were customer orientated…so everyone has spent the last few months just waiting and guessing. It hasn’t done morale any good and leadership has just looked confused and weak”

When we are faced with selecting from a range of options we can become preoccupied with maximizing the selection. Imagine an everyday example: which dress, which watch, which car. In order to try and make the best choice, you set up criteria in your head, but even when these criteria produce a selection you are still not happy, because you know how shaky the criteria were; they haven’t convinced you that the best decision was made, there is still doubt, and so there are feelings of regret, anxiety and reflection on what could have been. This is what Klein (2003) referred to as the zone of indifference: a state where any choice would have done, but a feeling that the optimal choice had to be produced took over.

In these situations it is important to understand the role of satisficing. If each of the options contains a worst-case scenario which is tolerable/manageable, then gut instinct, or selecting at random, is as good a guide as any to choosing between a range of options where each would have been good enough. Organisational decision making may seem far too serious for this apparent flippancy, but the same rules essentially apply. Making a decision and then adapting it provides feedback on what is working and what isn’t; it quickly builds up a reservoir of experience and the ability to critically appraise future options. Passively keeping an open mind provides very little feedback; it can give you a lot of surface detail, but very little else. In short, satisficing beats open minded maximizing.
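The zone-of-indifference rule above reduces to a very small decision procedure. The option names and numbers here are invented for illustration; the point is the shape of the rule, not the values:

```python
import random

# Each option carries the worst outcome it could produce (illustrative figures).
options = {
    "dress A": {"worst_case_cost": 40},
    "dress B": {"worst_case_cost": 55},
    "dress C": {"worst_case_cost": 35},
}
tolerable = 60  # the worst outcome we could live with

if all(o["worst_case_cost"] <= tolerable for o in options.values()):
    # Zone of indifference: every worst case is manageable, so any choice
    # is good enough. Picking at random beats endless deliberation.
    choice = random.choice(list(options))
else:
    # An intolerable worst case means we are not ready to choose:
    # it's back to problem solving, not decision making.
    choice = None

print("choice:", choice)
```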

The following articles all discuss an open frame of mind:

I began the Echo Breaker blog with a series of articles on Big Data, an area I was exploring with a colleague. A lot of what I wrote centred on how large data sets can create an illusion of a “one best answer”. This isn’t really an attack on Big Data but rather an acknowledgement of the limits of its use. A potential problem for the future is that endless exploration of data sets could substitute for making a decision. Endless exploration might lead to some vital discoveries or, better still, disprove some beliefs and theories, but it cannot substitute for decision making. Two articles of this type are below:

http://wp.me/p3jX7i-2

A polemic which is focused on anti-maximizing when using large data sets

http://wp.me/p3jX7i-D

An article on the topic of the zone of indifference. When making a decision, the first step is to analyse whether you/your organisation can handle the worst-case scenario which could arise from that decision. If you/your company cannot handle the worst case of any option, then you are not in a position to make a decision; it’s back to problem solving.

http://wp.me/p3jX7i-1A


Passive decision making, the result of an open mind, and how the concept produces cultures which are focused on taking orders as opposed to making decisions

http://wp.me/p3jX7i-1N

Reading

Rudolph, J. (2003). Into the Big Muddy and Out Again: Error Persistence and Crisis Management in the Operating Room. Dissertation, Boston College

Summary of the above can be found in

Klein, G. (2009). Streetlights and Shadows. Bradford Books

Klein, G. (2003). The Power of Intuition. Currency Books

Klein, G. A. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.), Advances in man-machine systems research (Vol. 5, pp. 47–92). Greenwich, CT: JAI Press

Simon, H. (1976). Administrative Behavior (3rd ed.). New York: The Free Press

Prison Break and Uncertainty

10 Nov

It might seem strange to highlight the television series Prison Break as a meditation on the relationship between risk, data and uncertainty, but I like to think it’s a near perfect example. Prison Break had a simple premise: a genius engineer (Michael) purposely gets convicted of armed robbery with the aim of breaking his brother (Lincoln), wrongly convicted and on death row, out of prison, with the prison blueprints and a plan tattooed onto his body. Michael entered the prison with a big data set, a plan, and the necessary skills. However, the unexpected inevitably got in the way.

Michael’s data and plans certainly took care of most risks; when things went wrong he had plans B and C. The data and plans accounted for the risks he had anticipated and calculated; however, things went really off course when Michael encountered uncertainty: events he couldn’t possibly have predicted. Prison Break did a great job of demonstrating not only the complexity of prison life, with its constantly shifting allegiances, but also the complexity of the world everyone involved lived in. Plans were constantly being adapted due to events which began many miles away from the prison and snowballed into huge problems, only visible at the last minute.

To adapt effectively to uncertainty, Michael often had to rely on the tacit skills of his criminal allies. Their collective years of operating in the underworld had given his fellow escapees a host of craft skills they could call upon to jury-rig solutions fast; skills and experience Michael simply did not have. Eventually this combination of data, risk (contingency plans) and uncertainty (the ability to tacitly adapt to the unexpected) got Michael and his brother out of prison.

If you imagine Michael as the CEO of Prison Break Inc., then he ran his organisation in the High Reliability style (Weick & Sutcliffe, 2007). Michael understood the value of planning, but was not so wedded to his plans that he was unable to adapt when they became impossible. It would have been pretty easy, given the investment/sacrifices he had made and the constant stresses of the various scenarios, to fall into the spiral of the confirmation bias: a narrow frame of reference where contradictory evidence and interpretations are simply explained away (Kahneman & Tversky, 2000). More significantly, he deferred to the tacit skills of those around him when his plans and data had reached their limits.

To highlight this point, I’ll match Michael’s leadership of Prison Break Inc. against the features of High Reliability Organising:

  • A preoccupation with failure: Michael was focused on learning from his mistakes and constantly tweaking his contingencies
  • A reluctance to simplify explanations: if something obvious presented itself it was explored and researched, not just accepted. This helped buy time when things went wrong and avoided errors
  • Sensitivity to operations: the strategy was to get Lincoln out of the prison and off death row, but actions were always focused on what was going on right now. This enabled practices which weren’t working to be abandoned and new options explored
  • Commitment to resilience: Michael’s leadership was epitomised by his ability to recover from interruptions. He did not have just one big plan to deliver the strategy; he kept numerous options, a vast adaptive tool kit which made recovery from shocks and surprises that much easier (Gigerenzer, 2014)
  • Deference to expertise: when Michael’s plans had hit their limits, or he was indisposed, the tacit skills of local experts took up the slack. Neither Michael nor his team relied only on the formal plan and data.

The extreme environment Prison Break was set in, and the immediacy of feedback on actions, made it a decision making battlefield. It’s pure fantasy, but it still provides a useful analogy for examining how fast well thought out plans can recover from unexpected shocks and surprises.

References

Weick, K. E., & Sutcliffe, K. M. (2007). Managing the Unexpected: Resilient Performance in an Age of Uncertainty (2nd ed.). San Francisco, CA: Jossey-Bass

Kahneman, D., & Tversky, A. (Eds.) (2000). Choices, Values and Frames. New York: Cambridge University Press

Gigerenzer, G. (2014). Risk Savvy: How to Make Good Decisions. Allen Lane

Question-Can New Technology Improve Decision Making?

2 Oct

Having recently received funding for research into technology-enabled decision making, I thought I’d address a question an equity trader asked me the other day: can the latest technology actually improve decision making? What follows is not an answer but some insight into decision making and technology; the “answers” will appear soon.

Underpinning decision making is situational awareness (SA). SA is a person’s real-time, ever evolving understanding of “what is going on” (Endsley, 1995). Endsley’s (1995) three level model provides both a good introduction and a usable method for understanding SA; the model’s three levels are:

1) Perception of the elements: perceiving the data
2) Comprehension of the current situation: translating level 1 data into elements which allow the data’s relevance to a particular task to be understood
3) Forecasting: the highest stage of “processing”, drawing upon experience-derived mental models to produce a forecast of the likely future states of the situation
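The three levels chain into a pipeline, which can be made concrete with a toy example. Everything here (the temperature feed, the threshold, the forecasting rule) is an invented illustration of Endsley’s structure, not part of the model itself:

```python
from typing import List

def perceive(raw_feed: List[float]) -> List[float]:
    """Level 1: pick up the elements -- here, simply the raw readings."""
    return raw_feed

def comprehend(readings: List[float], limit: float) -> dict:
    """Level 2: translate readings into task-relevant meaning."""
    breaches = [r for r in readings if r > limit]
    return {"latest": readings[-1], "breaches": len(breaches)}

def forecast(meaning: dict) -> str:
    """Level 3: apply a (crude) mental model to project the future state."""
    if meaning["breaches"] >= 3:
        return "overheat likely: intervene"
    return "stable: keep monitoring"

feed = [68.0, 71.5, 74.2, 76.8, 79.1]  # rising temperature readings
meaning = comprehend(perceive(feed), limit=75.0)
print(forecast(meaning))
```

The point of the sketch is the dependency: a forecast is only as good as the comprehension feeding it, which is only as good as what was perceived; over-focusing any one level degrades the others.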

When a person has relevant expert experience they are able to anticipate future states to a high degree of accuracy and so make good decisions. Any new technology creates a new dynamic and so a new experience. This is specific to any task or role, but the most important point is that technology needs to support situational awareness (which comes from experience), not contaminate it.

There are numerous examples of new technology damaging decision making through the corruption of existing, and highly effective, SA. Significantly, new technology can also undermine one of its key aims: discovery. This occurs through over-focusing on the level 1 perception stage, which reduces comprehension and the leverage points which arise from it. In other words, it can narrow the view too much, leading to an over-focus on data and not enough on trial and error analysis. A little like driving off a cliff thanks to your sat nav.

So, the conclusion is this: new technology could improve decision making, but only with knowledge of current task SA requirements, and by using the technology to enhance existing, and effective, SA. The best technologies will be the ones which cognitively support the user.

The More Information the Worse the Decision

24 Sep

A great article by Marty Kaplan at the link below, in which he reviews recent decision research out of Yale. In summary, the research concludes that when a person is confronted with data which goes against their beliefs, they’ll simply ignore or alter the information to suit those beliefs (Kahan et al, 2013). With the vast amounts of data and information available at all levels of an organisation and society, this might well pose a problem for the purity of data-informed decision making.

Increasing information has proven in many studies to be detrimental to decision making; the Kahan study is one of the latest. Big data sets have added a new dimension to this: with so many correlations available you can “prove” whatever you like. You don’t have to just dismiss facts anymore, you can simply move on to some more which suit you better. There are many psychological drivers behind this, it’s quite instinctive, and it’s much easier to argue that you’re right rather than wrong. So, without throwing the baby out with the bathwater, what is the best way to utilise information effectively rather than with prejudice?
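How easily a large data set hands out “proof” can be demonstrated with pure noise. The sketch below is illustrative only: it generates variables with no relationship whatsoever, then counts how many pairs nonetheless correlate strongly, purely by chance.

```python
import random
import statistics

random.seed(42)

def correlation(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 200 variables of pure noise, 20 observations each: no real relationships.
variables = [[random.gauss(0, 1) for _ in range(20)] for _ in range(200)]

# Scan every pair and keep the ones that look strongly related.
spurious = []
for i in range(len(variables)):
    for j in range(i + 1, len(variables)):
        r = correlation(variables[i], variables[j])
        if abs(r) > 0.6:
            spurious.append((i, j, r))

print(f"pairs of pure noise correlating above 0.6: {len(spurious)}")
```

With nearly 20,000 pairs to scan, dozens of “strong” correlations emerge from data that contains no signal at all, which is exactly why a positive correlation pulled from a vast data set proves very little on its own.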

Information takes many forms: visual, auditory, statistical and so on. One way to explore the effective use of information is to look across to domains which are fast paced, constantly changing, saturated with data and require multiple decisions. Klein et al’s study of fire-fighters springs to mind; being crudely simplistic, the fire-fighters made good decisions through highly sophisticated and rapid analysis. The quick lesson: when using data to make decisions, imagine it’s a burning building, raise the stakes and ask yourself what could go wrong here.

http://www.alternet.org/media/most-depressing-discovery-about-brain-ever

Confirmation Bias and Data Analysis

16 Sep

In the autumn edition of IT NOW, Adam Davison writes a very interesting article on Big Data. One point I particularly picked up on was the risk of confirmation bias when analysing data; Adam questioned what processes need to be in place to ensure this doesn’t happen. A little context: the confirmation bias is the human tendency to look for information which confirms our beliefs, whilst explaining away anything which might contradict those beliefs.

The confirmation bias, or fixation, can be a serious issue in data analysis and decision making; the dots simply don’t get joined, or the discoveries made. A counter to it is not bias-removal software or procedures but an increase in situational awareness that endures across time (fast and slow) and across scenarios. A focus on increased situational awareness will beat any process or procedure and greatly increase analytical rigour.

A Possible Life without Data Scientists

15 Aug

I read a very interesting article on smartplanet.com entitled “Why big data means job growth for non-data professionals”, and from my perspective the title alone summed up the data explosion perfectly. The vast increases in data, if used correctly (and I mean a big IF), will have far more value to non-data professionals and to their organisations than anything a data scientist could produce. This is because data represents an opportunity to develop and accelerate human expertise far more than any spurious predictive model.

The quest for data scientists is still built on the belief that complexity is mathematically tractable. The moment a belief in a predictive model takes hold, trouble begins; it’s the banking crisis model all over again. You can’t predict complex systems, but when you believe you can, you think you’re in control, that the model can do it for you, and that’s when human beings become passive. It’s a form of electronic social loafing: when predictive models become part of your cognitive dynamic, expertise takes a back seat and conflicting data or hunches get explained away. Just take a look at the Enron case study: all the data was there, but the problems kept being explained away, with plenty of models, graphs and tables to help.

Better use of data does not lie in the hands of data scientists; it lies in the hands of experts, in any industry, using it to adapt and test their mental models. Good decision makers are not passive; they are adaptive and use vast amounts of tacit skill and heuristics to navigate complexity. Good decision makers also do not predict, they anticipate; cognitively, prediction is what leads you to lock your car keys in your own car, anticipation makes sure you don’t. The problem with following the examples of good decision makers is that very often they know far more than they can say. Unlocking this expertise and transferring it is true high-value information, but you’ve got to know how to do it, otherwise you could be following the wrong cues.

Big data and some of the latest BI platforms represent huge opportunities for experts in any field to operate at more adaptive levels which allow them to identify and leverage risks rather than be buried by them. These technologies also represent the chance to help unlock tacit knowledge, turn novices into experts faster, and broaden the range of expertise. When a focus is finally placed on how people actually use data to make decisions, as opposed to how more people can use data to make decisions, then we’ll finally see some real developments.

All decisions start with a hunch, an intuition. Hours of research have taught me that, and the rules of thumb people generate to manage complexity are just as effective now as they have always been; we just need to identify them, support them and transfer them. So, forget data scientists: in about 5 minutes you can make almost anyone a better decision maker using data via Google, so focus on a usable, simple BI platform and then use it to support expertise.

Final thought: the data explosion represents huge opportunities for non-data professionals, but it needs to be used correctly and responsibly. Remember, the data sets are now so vast that you can prove almost anything you want via correlative statistics, so positive cases are of dubious value. At that point, ask an expert. I’ll explain further next time….

Big Data Doesn’t Mean Better Decisions, Part 2

18 Jul

Big data and business insight will not deliver any benefits unless you know how to use them and what you are going to use them for. In fact, I could make a good case for ignoring big data completely and why a company would be just fine without it. But big data is an opportunity which also currently presents a problem: how does a company get value out of it, and/or how does it demonstrate the value of investing in big data and its analytics?

Big data and analytics promise better decisions, but this will not happen without an examination of HOW these tools will produce better decisions. The business environment is highly complex, influenced by unexpected events and human behaviour; therefore you can’t find definitive answers to questions, but you can manage outcomes in a more favourable direction based on analysis. And this is the key: the decision making process which maximises big data needs to be focused on improving human expertise.

So, the promise of improved decision making won’t be realised until we acknowledge that it’s human beings who choose the questions to ask, where to look, what to consider important and when to stop looking. These are intuitive and analytical decisions, so in order to get the most out of big data analytics we need to focus on building up human expertise in their use. If we don’t, there is a distinct danger users will think more and more information is the answer to complex issues such as: what will our customers want next year? There are so many variables in play influencing this outcome that more information will simply add more complexity. As a user attempts to integrate more and more data points into an issue which has no defined answer, they will simply become confused. Efficiency and outcomes will be greatly improved when intuitive analytics take over.

If a human being is ultimately choosing where to look, then how can we improve the quality of where to search and what to ask? I believe an answer lies in first using beliefs to produce a picture of the future, then using analytics to disconfirm those beliefs. This not only provides an array of options to move between if events start taking apart a business plan, but also improves the ability of users to recognise cues and patterns which can lead to opportunities; decisions need to be dynamic, not static “wait and see”. The result is improved beliefs which guide the initial search points in the future, and faster, more efficient searches and decisions.
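The belief-then-disconfirm loop can be sketched in code. Everything here (the Belief class, the tolerance threshold, the order figures) is an invented illustration of the idea, not a method from the article: encode the belief as a testable prediction, then let the data try to knock it down rather than confirm it.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Belief:
    statement: str
    predicts: Callable[[float], bool]  # what the belief says each observation should look like

def disconfirm(belief: Belief, observations: List[float], tolerance: float = 0.2) -> str:
    """Retain the belief only if the share of contradicting observations stays small."""
    misses = sum(1 for obs in observations if not belief.predicts(obs))
    miss_rate = misses / len(observations)
    if miss_rate > tolerance:
        return f"REVISE: '{belief.statement}' contradicted by {miss_rate:.0%} of the data"
    return f"RETAIN (for now): '{belief.statement}' survived this round"

# Hypothetical belief: "our customers spend at least £50 per order"
belief = Belief("average order >= £50", predicts=lambda spend: spend >= 50)
q3_orders = [62.0, 48.5, 71.2, 44.9, 55.0, 39.8, 80.1, 47.3]  # illustrative data

print(disconfirm(belief, q3_orders))
```

A belief that fails the test is revised and retested, which is the dynamic decision loop described above rather than static “wait and see”.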

The current work of our research company is focused on some honest facts: it’s very difficult to gain expertise in business, and sometimes the confidence of business leaders is misplaced. But you can manage decision making to produce better outcomes in complex environments. Big data has provided an opportunity to test long-held assumptions about an organisation’s environment and customers and, most significantly, to modify the mental models we use to make decisions. This approach has huge implications for how we use analytics, and how we balance the relationship between data and intuition. For example, it’s possible to use a simple 3-step process with the Google search engine to test the assumptions underlying decisions and modify beliefs, in 5 minutes.

I do not think we will ever see true expertise in business decision making, but we can develop increased expertise in analytics which will improve our ability to manage complex domains. To be clear, big data is an opportunity; by using a simple approach which blends the natural decision making process of human beings with sophisticated analytics, we can maximise its value and make the investment work.

Big Data and Intuition- A Case of Babies and Bathwater

31 May

A recent symposium held at MIT reproduced the constant theme around organisational decision making and big data: gut instinct out, data-driven decisions in. Whilst data and information are important and beneficial to organisations, such an anti-intuition message can carry huge amounts of risk, with one particular example springing to mind: the 2008 financial meltdown. In this article I’d like to address the role of intuition in decision making and its relationship with data, because regardless of the current narrative, intuition is vital to good decisions; you “just” need to know the limits of both.

The role of intuition in organisational decision making was defined by the work of Herb Simon, who provided a strong steer on what intuition is: pure and simple, intuition is recognition. Intuition is that feeling something is wrong or right; you’re not sure why, it just feels that way. Psychologists suggest a pattern has been recognised: you may not be aware of the details, but the feeling is one of knowing with no explanation why. It’s experience talking; you’ve seen it before. Researchers in the field, for example Gary Klein, have identified that people operating in fields such as fire fighting use intuition derived from experience to generate mental models, assess them for effectiveness against the current situation, and adapt them. This model of decision making is fast, highly effective, and largely unknown to the person using it; it’s intuitive. This is tacit knowledge in action: highly effective information which exists outside of formal procedures and methodologies and instead exists in people’s heads; the reason why, collectively, an organisation knows far more than it can say.

Data and information are formal in the sense that they sit there in front of you; they’re tangible. And as I discussed in an earlier article, the temptation is to believe that data derived from highly sophisticated technology and methodologies is giving you an answer: the answer you have been looking for, the facts. The problem with just using data and facts to make decisions is that they can sit outside of context, and this is where intuition comes in.

A lot of what is written about big data concerns all the knowledge of the “world out there” it gives you. The data gives you new, faster and larger amounts of information and insight on your customers and competitors; it gives you the map, an answer. However, new technology and new platforms may not have changed the “world inside” your organisation. What the data tells you to do needs to be contrasted against the capabilities inside the organisation, and this is where experience and intuition are vital. If a highly experienced member of staff says it just doesn’t feel right, listen, and put structured effort into finding out why; dig for that tacit knowledge, don’t just listen to the data.

Organisational initiatives fail because we think procedures and algorithms can suddenly turn that member of staff who has never delivered into an energetic go-getter, dominate a market which a company has failed to penetrate for the past 15 years, or predict human behaviour. Data won’t change the internal capabilities of an organisation; it might point to a direction, but it’s the tacit knowledge, the intuition, which can tell you whether you have the legs to get there. The 2008 financial meltdown looked impossible when analysing the data from models such as Value at Risk, but there were many people within the industry and on the street who intuitively knew the models couldn’t stay on top of market volatility.

So, don’t analyse data in isolation; look at it within context, and that includes listening to intuitive doubts. To do this effectively you need to dig for that tacit knowledge, because quite often your best and most experienced staff will know far more than they can say. Effective intuition is linked to expertise, so increasing the visibility of tacit knowledge can increase the level of expertise within an organisation. Expertise, as Gary Klein advocates, is an effective means of harnessing experience into the decision making process. By increasing the level of expertise, as opposed to always increasing methods and procedures, you increase the level of innovation and the number of balanced decision makers in an organisation. I’ll discuss expertise further in my next article, but until then, when it comes to big data, don’t throw the baby out with the bath water.

Answers Crush Options

30 Apr

No matter what the technological advancements in data analytics, for the foreseeable future at least it will still be human beings choosing what to focus on, what to ignore, how to make sense of choices, and making decisions. There is a pervading bias that technology can just do it all for you; in the absence of genuine AI, it can’t.

I believe we need to start acknowledging the limits of expertise in business, because this is what will ultimately make individuals and organisations better decision-makers. In parallel, technology should be a tool used to support our psychology, not replace it, and I’ll expand on these points for the rest of this article.

To begin, I’ll revisit the basics of human decision-making. Human decision-making is designed to make survival-dominated decisions: spot a pattern, jump to a conclusion and react. This process saved our ancestors’ lives (often enough), but it’s essentially instinctive and very fast, not critical. When one of our ancestors was under attack from a predator, they needed to act quickly and make a rapid decision under intense pressure; it was, and is, essentially a means of processing information rapidly.

In the modern world, most people in the west don’t encounter or anticipate such threats, but we respond to information in a similar way. We quickly look for a pattern, rationalise it, and act upon it. Human beings have such a strong tendency toward pattern recognition and rationalisation that we build narratives to support our decisions, and these narratives then defend our choices and justify them to others. The vast amounts of information and data available today mean we are more susceptible than ever to our inherent biases; we’re under attack from information and react quickly to it.

To justify our reactions to data, we have sociologically built a biased narrative to support them: the more information you have, the better your decision. However, a recent study at Harvard University demonstrated the opposite; more information is more likely to result in a poorer quality decision. This happens because a human being, when faced with complexity, immediately and subconsciously attempts to build a pattern to produce an answer quickly. In dynamic environments this process also shuts out options; a conclusion narrows focus. This was something I focused on when doing research with armed police: the need for officers to delay pattern recognition, and delay the narrative, for as long as possible; in other words, to maintain options.

The same holds true with big data and real-time information. When a person sees a trend (trending on Twitter, for example) the bias is to see it as an answer, a direction or a map. But in such a complex and dynamic ecology, the current trend can become irrelevant in seconds, and as a consequence so does your answer, direction or map. If you over-commit or over-invest, psychologically or materially, then you’ve eliminated options: answers crush options.