A Grand Vision… or tunnel vision?

Recently, I attended a live discussion-cum-interview with the two authors of the best-selling books “Freakonomics” and “Super Freakonomics”, Steven Levitt and Stephen Dubner, who were promoting their latest book, “Think Like A Freak.”

Now, while “thinking like a freak” seems to me to be just a new name for the long-established traditions of thinking differently, thinking counter-intuitively and thinking outside the box, something that Steven Levitt, the Freakonomist, said during the discussion caused me to think a little differently about visions. He said, not exactly in these words:

“When considering a situation, initially, put your morality to one side and look at the facts in an unbiased way. See what the evidence really says about the situation. Later, when you are determining what to do, you can bring your morality back into the picture.”

In other words, while it’s important that we are clear on our vision and our values, our morality, and that we make decisions that are aligned to our vision and values (for the reasons why, see my previous blog entry: “A vision of a greater meaning”), we should make sure that we don’t become so focused on our Grand Vision that we develop tunnel vision and stop seeing the world as it really is.

Tunnel vision

Tunnel vision is our tendency to become fixated on a single idea, option or belief to the exclusion of other ideas, options or beliefs, and it can arise and be sustained in a number of ways.

First, our natural default is to consider only a single option when making a decision. Psychologists call this “selective hypothesis testing” and, in their book about managerial decision-making, “Think Again: Why Good Leaders Make Bad Decisions and How to Keep it From Happening to You”, Sydney Finkelstein, Andrew Campbell and Jo Whitehead refer to this psychological phenomenon as “one plan at a time.”

The reason that we’ve evolved to consider just ‘one plan at a time’ is that thinking about several options simultaneously is hard work for us, while considering just one at a time is much easier. However, the ‘one plan at a time’ approach is itself a form of tunnel vision, and problems with it can arise easily.

Evidence from scientific studies shows that when we consider an option in isolation, we perceive it to be better, and are more likely to choose it, than when we consider the same option as one of a number of alternatives.

So we typically consider one plan or option at a time, and we usually over-estimate how good that first option is. But how often will the first option we consider be the best option available? It might not even be a particularly good option, yet we tend to go with it unless we identify a really big problem with it; only then do we look for an alternative.

It’s a lo-o-o-o-ong tunnel…

Once we’ve focused on a single idea or option or made a choice about something, other aspects of our psychology tend to stop us from properly considering alternatives or from changing our mind. In my previous blog entry “Changeable weather and climate change”, I wrote about confirmation bias in the context of climate change denial. However, confirmation bias doesn’t just drive our attachment to really big beliefs; it can cause us to look for confirming information, and to ignore disconfirming information, for all sorts of ideas and choices. Confirmation bias is another form of tunnel vision.

How many people, whether in politics, in business or in other fields, develop visions or beliefs or make decisions that they hold onto so strongly that they develop tunnel vision and stop seeing the world as it really is?

Consider the circumstances under which Tony Blair, then the Prime Minister of the United Kingdom, led the UK into the Iraq War (or second Gulf War) in 2003. Peter Mandelson, one of Blair’s closest and most trusted advisors at the time, wrote in his memoirs that Blair “developed tunnel vision” regarding Iraq and the idea of deposing Saddam Hussein. He wouldn’t accept any challenge to his belief in the rightness of going to war, despite plenty of evidence available at the time that the war was both unnecessary and a bad idea.

More than ten years after the Iraq War, at a time when Iraq has effectively collapsed as a nation state, Tony Blair still believes that it was the right thing to do. The tunnels of our tunnel vision can be very long indeed.

 

[Image: tunnel]

Breaking out of the tunnel

How can we avoid the pitfalls of tunnel vision in our decision-making?

The ELECTIA approach to decision-making helps us to avoid tunnel vision when we make decisions in the following ways:

  • It directs us to Describe and frame the decision by defining clearly the decision to be made, by considering the broader context within which the decision exists, and by determining why it is relevant to make the decision. These last two points, considering the broader context and the decision’s relevance, help us to avoid the narrow focus of tunnel vision.
  • It guides us to make decisions that are Well grounded because they are based on relevant evidence. As Steven Levitt said, we should “initially, put [our] morality to one side and look at the facts in an unbiased way.”
  • It directs us to first Generate ideas and options and to then Rule options in or out, thereby ensuring that we consider more than just ‘one plan at a time’.
  • It asks us to identify our assumptions and our beliefs and to consider whether or not they are reasonable in order to ensure that our decisions are Free of bias.

So it is right that we have a vision and values and it is right that we check the alignment of the choices that we make to these. As Steven Levitt said: “…when you are determining what to do, you can bring your morality back into the picture.”

However, we should also take steps to ensure that our vision hasn’t become tunnel vision, by testing the validity of our vision and being open to evidence that contradicts it. We should be cautious about considering just one plan or option in isolation and should instead identify and consider possible alternatives. We should remember the image of the tunnel, be alert to when we might be suffering from tunnel vision, and take whatever steps are necessary to break out.


Predecisional frameworks

Another blog entry, another great Dilbert cartoon about decision-making within large organisations…

[Image: Dilbert cartoon]

So, does this mean anything? Well, the word “predecisional” isn’t defined by leading dictionaries. However, even if it isn’t a ‘real’ word, its meaning seems fairly easy to guess. If we take “predecisional” to mean something that occurs or exists before a decision, then we can see that the word is actually redundant and the Pointy Haired Boss is simply saying that the Steering Committee spent time in its meeting agreeing a framework for making the decision.

Why did the Steering Committee spend time doing this? Is it a delaying tactic, possibly driven by fear of making the decision? Or maybe it’s unnecessary philosophising or grandstanding by the Steering Committee, possibly the result of a need to look or feel more important?

Since this is a Dilbert cartoon and we expect it to satirise daily life within large companies, it’s easy for us to imagine that Dilbert’s disappointment must somehow be the result of the Steering Committee’s incompetence. But could there be a good reason to delay making a decision in order to agree a “predecisional framework” for making that decision?

A case for standard approaches to decision-making

The results from the diagnostic phase of an action research project that I have recently completed within a large organisation reveal that using standard approaches to support decision-making, such as frameworks or guiding principles, benefits the perceived quality of the decision-making process and, more importantly, both the likelihood of making a decision and the perceived quality of the decision made. For more information, see my previous blog entries: “Dilbert on decision-making” and “Decision-making and Chinese astrology”.

However, if you work, or have worked, in a large organisation and need to make decisions in collaboration with your colleagues, would you want to spend time agreeing a “predecisional framework” for every decision? Why not instead adopt an approach to decision-making that is broad enough and flexible enough to be applied in all situations?

I believe that the ELECTIA approach to decision-making has this breadth and flexibility and so, in the second phase of the action research project, I asked the participants to apply the ELECTIA approach to support decision-making within the organisation and to collect data on the impact of doing so.

The benefits of the ELECTIA approach in action

Comparing the data gathered in the two phases of the action research, the project participants reported the following differences between the first phase (no interventions, or ‘baseline’) and the second phase (active interventions to support decision-making):

  • An increase from just over 60% to nearly 90% in the proportion of decisions where the participants rated the process of making the decision as ‘good’.
  • A reduction from 20% to just 2% in the proportion of decisions where the participants rated the process of making the decision as ‘poor’.
  • A reduction from 15% to 5% in the proportion of decisions that were not made.
  • An increase from 82% to 95% in the proportion of decisions where the participants rated the decision that was made as ‘good’.
  • A total elimination of decisions that the participants rated as ‘poor’.
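
As a quick check on the arithmetic behind these comparisons, here is a small Python sketch that restates the exactly quoted figures above and computes the relative changes (the fall from 15% to 5% in decisions not made is the two-thirds reduction referred to in the next section). The code is purely illustrative; the approximate figures (“just over 60%”, “nearly 90%”) are deliberately left out rather than guessed at.

```python
# Purely illustrative arithmetic on the exactly quoted figures above.
results = {
    "process rated 'poor' (%)":  (20, 2),   # baseline, intervention
    "decision not made (%)":     (15, 5),
    "decision rated 'good' (%)": (82, 95),
}

for measure, (baseline, intervention) in results.items():
    change = intervention - baseline
    relative = change / baseline * 100
    print(f"{measure}: {baseline} -> {intervention} "
          f"({change:+d} points, {relative:+.0f}% relative change)")

# "decision not made (%)": 15 -> 5, i.e. a reduction of two thirds.
```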

Back to the Boardroom

So, poor Dilbert didn’t get a decision on his project because the Steering Committee spent its time agreeing a predecisional framework rather than making the decision. Meanwhile, use of the ELECTIA approach to support decision-making led to a reduction by two thirds in the proportion of times that a decision wasn’t reached. Dilbert might wish that his Steering Committee had adopted the ELECTIA approach!

Decision-making and Chinese astrology

What are the factors that influence decision-making in organisations?

Well, hopefully not…

[Image: Dilbert cartoon]

In my previous blog entry (“Dilbert on decision-making”), I described an action research project that I have recently completed, working within a large organisation. The project was conducted in two phases and the objective of the first phase was to develop a deep understanding of how decisions are made within the organisation, particularly to identify the factors that support or inhibit good decision-making.

The participants in the project, a group of 29 volunteers within the middle management level, all based at the organisation’s global headquarters, captured information about organisational decisions over a period of one month, logging a total of nearly 300 decisions. For each decision, information was captured on the following aspects of the decision:

  • Importance
  • Complexity
  • Who was involved
  • Who the decision-maker was
  • The circumstances of the decision and the factors that influenced it
  • The decision-making process used
  • The quality of the decision-making process
  • Whether or not a decision was made
  • The quality of the decision made
  • The time taken to make the decision
By analysing the resulting mass of information about the organisation’s decisions, it was possible to identify where correlations existed between any pair of the factors listed above.

Of course, correlations do not by themselves provide any indication of causation. However, for some of the correlated pairs of factors it seems reasonable to hypothesise causal relationships, or at least opportunities to influence the outcomes of decision-making processes (such as whether or not a decision is made, or the quality of the decision) by influencing some of the circumstances under which the decisions are made.
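
As a rough sketch of the kind of pairwise analysis described above, the snippet below computes a correlation matrix over a handful of invented, numerically coded decision records using pandas. The column names and values are hypothetical and serve only to show the mechanics, not to reproduce the project’s data.

```python
import pandas as pd

# Invented, numerically coded example data: each row is one logged decision.
decisions = pd.DataFrame({
    "importance":       [4, 2, 5, 3, 1, 4],
    "complexity":       [3, 2, 5, 4, 1, 3],
    "process_quality":  [2, 4, 1, 3, 5, 2],        # 1 = poor ... 5 = good
    "decision_made":    [0, 1, 0, 1, 1, 1],        # 0 = not made, 1 = made
    "decision_quality": [None, 4, None, 3, 5, 3],  # None where no decision was made
})

# Correlation matrix between every pair of factors (Pearson by default).
# A strong correlation is a prompt for a causal hypothesis, not proof of causation.
print(decisions.corr().round(2))
```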

The following diagram represents the main relationships that were identified, with correlations represented by:

  • Orange box: the factors within this box were all positively correlated with each other.
  • Arrows: these show correlations between pairs of variables in different boxes. The associated positive and negative signs indicate a positive or negative correlation.

The assumed causal relationships are indicated by the direction of the arrows.

[Diagram: factors model (generalised)]

What opportunities do these findings suggest for making better decisions within organisations?

Some of the opportunities implied by these findings won’t seem like rocket science to anyone. For example, the research suggests that better decisions will be obtained by:

  • Ensuring that a reasonable amount of rational consideration is given to a decision.
  • Identifying situations involving strong emotions or complex or obstructive organisational politics and taking steps to reduce any negative impact from these on the decision-making process.
  • Ensuring that participants in the decision-making are clear about who will be the decision-maker(s) and what decision-making process will be followed, a finding that I covered in my previous blog entry (“Dilbert on decision-making”).

However, some of the other findings may be less obvious to many people, for example:

  • Decisions that have been discussed previously and are being revisited are associated with poorer decision-making processes, have a lower likelihood of a decision being reached, and result in decisions that are perceived to be poorer on average. We might ask which came first: the need to revisit the decision or the poor decision-making process? Either way, seeing lots of decisions being revisited should be a big warning sign for organisations. Also, for a discussion of decisions that are repeatedly revisited, see my previous blog entry “Battling the zombie horde”.
  • Explicit reference to the organisation’s values was found to improve the perceived quality of the decision-making process, the likelihood that a decision was made, and the perceived quality of the decisions that were made. The ELECTIA approach to decision-making makes alignment to values a central part of decision-making; see my previous blog entry “A vision of a greater meaning”. What more can values-centred organisations do to ensure that employees reflect on and are guided by the organisation’s values?
  • The use of standard approaches to making decisions, such as frameworks or guiding principles, had a positive impact on the perceived quality of the decision-making process, on the likelihood of a decision being made, and on the perceived quality of the decisions that were made. In the first phase of this action research project, no standard decision-making approach was promoted or mandated, and a standard approach was used spontaneously in only around 10% of the decisions recorded, with no consistency in which decision-making approach was adopted. However, in the second phase of the project, the participants were asked to apply the ELECTIA approach in order to assess its impact on organisational decision-making. In my next blog entry I will describe the findings of the second phase of this project.

There are many other specific findings from the first phase of this action research project, and of course there is also much that can be hypothesised or implied based on these results. I will continue to explore some of these other findings in future blog posts.

Finally, it is also worth noting that this project couldn’t and didn’t consider all of the huge number of factors that might have some influence on organisational decision-making, Chinese astrology being just one of many…!

Dilbert on decision-making

I’m a huge fan of Dilbert cartoons, partly because their wry observations on daily life in large companies often cause me to laugh out loud and partly because these same observations often provide valuable warnings about the challenges and pitfalls that face employees. Naturally, many of my favourite Dilbert cartoons relate to decision-making within large companies. Here’s just one example:

[Image: Dilbert cartoon]

In my earlier blog entry, “Leaping lemmings, fads and frenzies”, I discussed decision-making by groups, some of the ways in which it can fail, and how it has the potential to produce better decisions than does decision-making by individuals.

What about decision-making in large organisations, which are made up of many different groups (teams, departments, business units, etc.) working together?

Large organisations are complex social and economic systems in which thousands of decisions are taken every day that will collectively determine the health and the success of the organisation. As Daniel Kahneman writes in his excellent book “Thinking, Fast and Slow”:

“Whatever else it produces, an organisation is a factory that manufactures judgements and decisions. Every factory must have ways to ensure the quality of its products in the initial design, in fabrication, and in final inspections. An organisation that seeks to improve its decision product should routinely look for efficiency improvements at each of these stages.”

I recently completed an action research project with one such large organisation with the aims of:

  1. Developing a deep understanding of how decisions were made within the organisation, particularly to identify factors that supported or inhibited good decision-making.
  2. Testing the benefits of applying the ELECTIA approach to support decision-making by groups within the organisation.

A team of 29 volunteers helped me to gather data on almost 300 organisational decisions and on over 50 applications of the ELECTIA approach.

My action research project generated many detailed findings; however, the main findings, which I believe are likely to be relevant for most large organisations, were as follows:

  • Collective decision-making was rarely guided by either a standard process or by any discussion about the decision-making process to be adopted. As a result, most decision-making involved assumptions made by those involved about the process for making the decision and about who was to be involved, when and on what basis. A lack of clarity on these aspects of decision-making was reported by the project participants in 22% of the decisions that were recorded and analysed.
  • Where there was a lack of clarity about the decision-making process being used and/or about who the decision-maker was, decisions frequently failed to be made and, when they were made, were perceived to be of lower quality.
  • Lack of clarity was particularly prevalent in ‘cross boundary’ situations, e.g. within matrix teams or for decisions involving participants in different departments or parts of the business.
  • Active intervention in collective decision-making by the project participants using the ELECTIA approach substantially increased both the likelihood of making decisions and also the perceived quality of the decisions that were made. In fact, use of the ELECTIA approach completely eliminated decisions that were perceived to be ‘poor’ in this study.

The implications of these results are that, despite being “factories for manufacturing decisions”, organisations often do not attend directly to the quality of their processes for making decisions or to the quality of their “decision products”. Teams within organisations often become focused only on the subject matter of the decisions that must be made (“the what”) and lose sight of and make assumptions regarding the method for making those decisions (“the how”). Leaders don’t act to ensure that effective and efficient approaches to decision-making are applied routinely. As a result, employees may feel just like Dilbert in the cartoon above.

Therefore, substantial benefits for the quality of organisations’ decisions can be achieved by:

  • Focusing the attention of individuals and teams on the quality of decision-making processes and on the quality of decisions that are made. Once attention is focused on these areas, individuals and teams can often identify for themselves some ways in which they can improve the quality.
  • Providing guidance in the form of simple approaches, such as the ELECTIA approach, to help individuals and teams to improve the quality of decision-making processes and the quality of decisions.
  • Building the capability of leaders to create, facilitate and embed high quality decision-making processes through specific training and coaching.

In future blog entries, I’ll describe some of the more specific findings of my action research and share more of my favourite Dilbert cartoons. In the meantime, when you next find yourself involved in some collective decision-making within your organisation, empower yourself to intervene in the discussion with these questions:

  • What is the decision that we need to make? How does it fit into the bigger context?
  • Who should be the decision-maker(s)? Who else needs to be involved, at what stage and on what basis?
  • What should be the process that we follow for making the decision?

My research shows that asking these simple questions will create clarity for you and for the others involved in making the decision, and that this clarity will make it more likely both that you reach a decision and that the decision you make is a better one. Let’s just hope that your own Pointy Haired Boss doesn’t respond this way:

[Image: Dilbert cartoon]

Jumping red traffic lights

I had an interesting and unusual opportunity to Think about thinking whilst driving to work one morning this week; a traffic light at a road junction near my house had broken and, in the direction from which I approached the junction, was stuck on red.

[Image: a traffic signal showing red (photo: Paul Rogers, 19 September 2006)]

Initially, of course, I didn’t know that the traffic light wasn’t working properly. I approached the junction, saw that the light was on red and stopped my car. I sat there for a few minutes, waiting, then for a minute longer, then another minute… At some point, after it seemed that I had sat waiting at the red light for too long, it occurred to me that something might not be right. But I couldn’t be certain that the traffic light was broken since there was no visible sign of damage to it. In this uncertain situation, I had to decide what to do next…

What’s happening with the thinking of someone in this situation?

First of all, there’s the obvious cause-and-effect relationship between the red traffic light (an external stimulus) and stopping or remaining stationary at the junction (an acquired habitual response). For any experienced driver, this learned response to red traffic lights is very deeply ingrained; it’s fast, automatic and subconscious; it’s driven by our thinking System 1 (see “Lottery Logic” for an introduction to our two thinking systems). We see a red light and we stop reflexively.

Next, comes the experience of increasing uncertainty about the working of the traffic light. It’s been red for a long time now… Shouldn’t it have changed already? Could something be wrong with it? These are thoughts that would have started in my subconscious System 1, which continuously compares our current experience to our memories of similar past experiences in order to identify things that are unusual and might therefore be interesting or dangerous.

My System 1 has a sense of how long, typically, I would expect to wait at a red traffic light. As I sat waiting longer and longer at this red light, my System 1 would have identified a growing difference between its expectation and what was actually happening. This would initially have caused me to experience a subconscious sense of discomfort; if anyone had been measuring my heart rate or my skin conductance, those measurements would have been showing signs of increasing anxiety. I might even have begun to fidget and fiddle in the car as this tension grew, however initially I wouldn’t have been consciously aware of this.

At some point, the subconscious anxiety about how long I had been waiting relative to my expectation would have grown large enough to pop into my conscious thinking. Now, my conscious System 2 would be asking the questions: “Hey, shouldn’t this light have changed already? Could something be wrong with it?”

Waiting for the traffic light repairman

How long would you sit at a red traffic light that you thought might not be working properly before you took action? And what action would you take? What options would you even consider?

As we wait longer and longer at the junction, the difference between our current experience and our expectation of what’s normal gets bigger and bigger, and so our uncertainty about the working (or rather, the not working) of the traffic light gets smaller and smaller. At some point, we become sufficiently certain that the traffic light isn’t working properly to start to consider doing something other than just waiting.

What are our options? As I’ve reflected on this experience after the event, I’ve identified four possible options. Before you read them, tune in to your thinking; then notice your instinctive reactions or judgements as you read each one:

  • Wait until the traffic light repairman arrives and fixes the traffic light.
  • Get out of the car and fix the traffic light myself.
  • Turn around and take an alternative route.
  • Drive through the junction even though the traffic light is red.

[Image: traffic light repair]

As I sat at the junction, did I consider each of these four options? My memory is that I only considered one option, the last one. This is the only option that I remember consciously thinking about. So we need to ask another question…

Did my subconscious consider each of these four options? It’s not possible to say for certain; however, given what we know about System 1, it’s reasonable to believe that it did consider, and very quickly discard, the first two options. As you read each of the first two options, you might have experienced yourself making an immediate judgement, something like: “Don’t be ridiculous!” If so, that’s the result of your System 1 very quickly considering and rejecting those options. How?

  • Although we don’t know exactly how long it will take for the traffic light repairman to arrive, our intuition, which in this situation is likely to be based on our past experiences of waiting for repairmen, tells us that it will certainly be longer than we’re willing to wait. This is an easy exercise for System 1 in comparing past experiences with current desires.
  • Intuitively, we know that we have no experience of fixing traffic lights ourselves. We’ve never even seen the internal workings of a traffic light (if you are a traffic light engineer reading this, then I apologise for the generalisation!). We can imagine that the traffic light’s workings might be complicated, even baffling. Our associative System 1 might recall occasions when we’ve looked at the inside of other electronic devices or machines and been aware that we have no idea how they work. This is another easy decision for System 1 based on our past experiences.

The key point for these first two options is that we probably did consider them and reject them, very quickly and without even consciously thinking about them. In our daily lives, how often does our System 1 consider an option and reject it without us being consciously aware of it? Under what circumstances might this cause us a problem?

Answers on a postcard please!

What about your immediate reaction to the third and fourth options? Did you find the third option either surprising or “stupid”? Did you judge the fourth option to be the “obvious” or “correct” one? My guess is that your answer to those last two questions was “yes” in both cases. Why?

Since I’ve described to you a scenario in which someone (and in this case it was me, but it could equally have been you) was sat waiting for a long time at a red traffic light, your subconscious has already made a decision about what you would have done in that situation. And your subconscious has decided that you would drive through the junction in spite of the red traffic light. Having made that decision, the third option should seem either wrong or surprising to you and the last option should seem correct and therefore “obvious”.

What’s most interesting to me is your reaction to the third option. Did you find it surprising or did you think it was stupid? If you found it surprising, then it’s possible, even likely, that your subconscious either never considered or never completed its consideration of this option. Because you hadn’t (fully) considered it, when you read the option it seemed new to your System 1 and that triggered a sensation of surprise.

Alternatively, if you found the third option to be stupid, it’s possible that your subconscious had already considered and rejected this option, so that when you read it you had the same sort of “Don’t be ridiculous!” reaction that you had for the first two options.

My guess is that most people will have found the third option to be surprising because their System 1 settled quickly on the fourth option as the most viable solution to the problem of the broken traffic light and didn’t consider (or didn’t finish thinking about) the third option. However, I’d like to hear from you, so please let me know what your experience was, either by commenting on this blog entry or by emailing me at bruce@electia.co.uk

If my belief that most of us, myself included, didn’t (fully) consider the third option is correct, it raises the following questions: How often do we make a decision without considering all of the available options? Under what circumstances might this cause us a problem?

Jumping red

Finally, what are the factors that shape our thinking as we consider jumping the red traffic light? Here are a few:

  • The automatic association in our subconscious between red lights (external stimulus) and stopping and waiting (learned response) that was described above.
  • Our beliefs about breaking the law (since it’s technically still illegal to jump a red traffic light even if it’s broken and stuck on red!)
  • Our beliefs about ourselves and our lives. (“I mustn’t be late for work!“)
  • Our herd mentality and our predisposition to conform to the social norm. (“Will anyone see me as I jump this red traffic light? What will they think?” For more information, see “Leaping lemmings, fads and frenzies”)
  • Available, relevant information, such as what signals are the traffic lights showing in the other directions, how busy is the junction, what are other cars doing.
  • Our assessment of the possible consequences for each available option. (“If I jump the red light, might I have an accident? What would happen if I did?”)
  • The emotions that we attach to, and experience when we consider, each of the things listed above.

Of course, we’re not (hopefully!) going to encounter many broken traffic lights in our lives. However, the point of using an example like this one is to learn something about how our thinking works and how we can apply that learning to many of our other decisions. The ELECTIA approach to decision-making asks the following questions as part of its process:

  • What options exist? Which have I considered? What options exist that I have not considered?
  • Are there any options that I have ruled out too quickly?
  • What would be the possible consequences of my choosing each option?
  • What factors are influencing my assessment of the different options? Are they reasonable?
  • On what basis should I rule options in or out?

I’ll leave you now to look for opportunities to ask those questions as you make decisions in your daily life. In the meantime, may your traffic lights be green!

Leaping lemmings, fads and frenzies

Zebras do it. Cedar waxwings do it. Herring do it. The late majority and (eventually) even laggards do it. Lemmings even do it over cliffs, at least according to popular myth (see the section “Misconceptions” in Wikipedia’s entry on lemmings to learn about the role that the Walt Disney Company played in creating the popular myth about lemmings committing mass suicide).

[Image: zebras]

The “it” that I’m talking about is ‘following the herd’ (or flock or school), and the tendency that exists in many species of animals to do this is called “herd behaviour”. In turn, herd behaviour is driven by “herd mentality”, the innate desire of individuals to stay with and be a part of the herd.

As is the case for zebras, cedar waxwings and herring, humans are social animals with an evolved preference for living in groups, so herd mentality and herd behaviour are an important part of human nature and they often have a big impact on the choices and decisions that we make.

Since humans have complex social lives and social structures, our herd behaviours also take on complex forms. All human groups develop shared “norms”; that is, commonly agreed standards of acceptable behaviour. Our herd mentality then encourages us to conform to these group norms as a way of being part of the group.

Tribal rituals, good manners and popular fads are all examples of human social norms that are used to define and bind together groups of people. And of course these social norms vary from group to group; what passes as “appropriate attire” in some cultures could lead to verbal or physical abuse in others, particularly for women.

There are good evolutionary reasons for herd mentality and herd behaviour. For many animals there is usually greater safety in greater numbers; think about those zebras while hunting lions are around. Meanwhile, for others, working together means a greater chance of finding or catching food; this time, think about the lions. And finally, if others in your group are doing something, then that thing is likely to be safe and rewarding, whilst if they are not doing something, then there’s a good chance that it’s not safe.

However, this is not to say that all instances of following the herd are rational or lead to good outcomes; history and the modern world offer plenty of examples of the herd being led astray, from witch hunts to irrational fads and frenzies.

[Image: witches]

Herd mentality meets multiple personality disorder

Even psychiatrists, medical professionals specialising in the workings of the human mind, can suffer from erroneous herd mentality and herd behaviour. In 1973, a book titled “Sybil” was published in the US. It claimed to present the true story of a woman who had developed multiple personality disorder, which for her manifested as 16 different personalities, as a psychological defence against the memory of horrific abuse that she had suffered as a child.

Sybil was made into a film of the same name that was released in the US in 1976. Following the film’s release, diagnoses of multiple personality disorder (MPD) soared in the US. In most cases, the condition was said to be the result of the repression of memories of childhood abuse. Tens of thousands of cases of MPD were reported in the 1980s and early 1990s, compared to fewer than 200 cases that had ever been recorded before Sybil was released.

However, in 1994, a patient who had been diagnosed with MPD successfully sued her psychiatrist for implanting false memories and for actually causing her condition. Many similar cases followed soon afterwards and the tide of psychiatric opinion turned against MPD. Leading psychiatrists in the US now regard MPD to be a very rare condition and the huge increase in diagnoses seen in the 1980s and early 1990s to have been the result of an erroneous popular fad among both the public and psychiatrists themselves.

Collective considerations

Herd mentality and herd behaviour describe what happens when individuals make decisions to follow the herd. Since they are common to so many animal species, we can surmise that these preferences arise from our thinking System 1 (for an introduction to our two thinking systems, see “Lottery Logic”). However, humans possess cognitive capabilities not shared by other species; our thinking System 2 allows us to rationalise and this is something that we can do both individually and collectively; that is, humans are uniquely capable of making decisions as a group.

When it comes to making decisions within a group, our System 1 will inevitably be at work, since of course we can’t turn it off, and the extent to which we engage our System 2, through our efforts to rationalise collectively, will have a big impact on the quality of group decisions. Some groups are smarter than others when it comes to making decisions or solving problems and many factors have been shown to influence a group’s collective intelligence.

Collective stupidity: Pearl Harbour and the US government shutdown

One way in which groups can be collectively stupid is the well-known pitfall of groupthink. Here, all members of a group become fixated on a single shared belief, which they strive to conform with and to reinforce. There is a loss of critical thinking and challenge within the group, and the group insulates itself from any information that would contradict the shared belief that it holds. Usually, groups suffering from groupthink become convinced of their superior intelligence and moral authority… which is ironic, since groupthink groups are less intelligent and typically less moral.

There are a number of famous examples of groupthink, including the poor decision-making by the US Navy that permitted the devastating attack on Pearl Harbour, the fiasco of the Bay of Pigs invasion, and the Vietnam War.

[Image: Pearl Harbor]

Groups that must make decisions can also be stupid, and ineffective, when they deadlock. Here, two or more factions within the group take incompatible positions from which they aren’t willing to shift or compromise. An example would be the recent US political deadlock that led to the partial shutdown of the US government.

And although this example involves two groups (Republicans and Democrats) that together form a larger group (the US political establishment) in deadlock, the same paralysis of decision-making can arise even with a single stubborn individual if that person has the power to block a decision being made. It only takes one obnoxious guest to ruin a dinner party.

Collective intelligence: Diversity of views, empathy and turn-taking

So what are the factors that allow groups to be smarter? There are many, some relating to the composition of the group (i.e. who are its members), some to the mindset of group members, and others to the behaviours adopted within the group.

One scientific study identified that the collective intelligence of groups was:

  • Correlated positively with the average social sensitivity (empathy) of group members and also with the number of women in the group, something that is possibly explained by females having, on average, greater social sensitivity.
  • Correlated negatively with the variance in the number of times each member of the group spoke. In other words, groups dominated by one or a few individuals are less collectively intelligent.
  • Not correlated with the average intelligence of the individuals within the group, with the individual intelligence of the smartest member of the group, or with group motivation or cohesion.

In addition, if we again consider the two ways described above in which groups can be stupid, groupthink and deadlock, we can identify some other attributes of groups that should contribute to collective intelligence:

  • Having a range of different perspectives and ensuring that each perspective is heard and considered.
  • Ensuring that the mindset of the group is focused on collaboration and constructive challenge rather than on either unanimity or conflict.

In “The Lucifer Effect: Understanding How Good People Turn Evil”, Philip Zimbardo describes these attributes clearly when he writes:

“Majority decisions tend to be made without engaging the systematic thought and critical thinking skills of the individuals in the group. Given the force of the group’s normative power to shape the opinions of the followers who conform without thinking things through, they are often taken at face value. The persistent minority forces the others to process the relevant information more mindfully. Research shows that the decisions of a group as a whole are more thoughtful and creative when there is minority dissent than when it is absent.”

Making groups smarter: Some ground rules

Since the evidence shows that the smartest groups are those that function well, rather than those that have smart people in them, the ELECTIA approach to decision-making defines the following “ground rules” to support better decision-making by groups:

  • The roles of everyone involved in making the decision should be defined at the start of the process (see “Battling the zombie horde”).
  • All key stakeholders in the decision (i.e. anyone with the power to overturn the decision or to block its subsequent implementation) should either be present or should have provided their input and agreed to abide by the decision that will be made in their absence.
  • The mindset of participants should be one of collaboration and commitment to make and implement a decision, not of seeking unanimity.
  • Participants should be open-minded, flexible, questioning and testing. They should be encouraged to ask questions of others to explore alternative ideas and increase their understanding.
  • The group should focus on identifying common ground first, then identify and focus on differences.
  • The group should focus on identifying “win-win” options.
  • The group should ensure that it listens to dissenting voices and explicitly and rigorously tests its beliefs (see “Changeable weather and climate change”).
  • Everyone within the group should be allowed and required to participate and to express their perspective.
  • Attention should be given to the group’s behaviours and processes, and they should be moderated if necessary.

The lemming that shouted “Stop!”

So, are we all, at least on some occasions, just like the mythical leaping lemmings, mindlessly following the herd? An awareness of herd mentality, combined with the practised ability to Think about thinking, will allow us to identify more often the influence of conformity on our decisions and to be more choiceful about when we follow the herd, for good reasons, and when we don’t.

And what about the dissenting views? What about the lemming that shouts “Stop!”? How should we treat those voices? Can they stop the herd from going over the cliff?

The answer is that we should hear them, ask questions to understand their perspective, and then give their ideas the same rational consideration that we give all available evidence. Some minority voices are misguided and some may even be madmen rather than saviours. However, by bringing alternative perspectives and ideas into our collective decision-making and by allowing ourselves to be challenged constructively by them, we reduce our risk of following erroneous fads, like the craze among psychiatrists in the 1980s for diagnosing multiple personality disorder, or falling into groupthink, like the naval strategists at Pearl Harbour.

The first popular fad that we might seek to resist in this way is the common belief that lemmings are obsessed with killing themselves en masse. Right here, right now, the lemming has shouted: “Stop!”

[Image: lemming]

Battling the zombie horde

Large organisations beware! Hard-to-kill zombie decisions lurch through your meeting rooms and open plan office spaces, mauling your employees and thereby reducing their productivity!

[Image: zombie]

This graphic idea was thrown up by some action research that I’m currently doing with a large organisation. The research found that a small but significant proportion of decision-making conversations: a) were revisiting decisions that had been discussed previously, and b) failed to reach a resolution; that is, they failed to actually make a decision.

The implication is that these decisions are not being made, but are instead being continuously, unsuccessfully revisited. The participants in the action research labelled these as “zombie decisions” and described them as being frustrating and a waste of time.

What makes a zombie decision?

Factors that are likely to create zombie decisions are: lack of clarity on accountabilities and processes, organisational politics, and limiting assumptions or beliefs about the importance of a decision. For example, a decision could become a zombie decision when:

  • Whoever is “holding” the decision (that is, is responsible for it) is not themselves the decision-maker; for example, if they don’t have the authority to take the decision.
  • Who the decision-maker should be isn’t clear. For example, this situation could arise with a decision that sits across two different departments, where it’s not clear which department has the right or the authority to make the decision.
  • Those who should be involved in making the decision either can’t agree on what should be done or simply aren’t engaged in making the decision.
  • There is either no clear path for escalating the decision for resolution or the person holding the decision is unwilling to escalate the decision to their seniors because they feel that the decision is too minor to deserve seniors’ attention and/or that they ought to be able to resolve it on their own.

One of the participants in the action research described these decisions as “getting stuck in the system”.

What can be done to identify and eliminate zombie decisions?

One option for eliminating any zombie decisions that already exist might be to have a “zombie amnesty” or a “bonfire of the zombies”. This could be a departmental or even organisation-wide initiative that takes place once each month, quarter or year. Employees trying and failing to deal with zombie decisions could either be empowered and required to just make the decision themselves, or they could be encouraged to bring the decision to senior management for resolution, with no judgement made about the small-ness or pettiness of the decisions so escalated.

How can organisations prevent zombie decisions arising?

Of course, the ideal situation would be for no zombie decisions to be created in the first place. The most powerful weapon to prevent new zombie decisions arising is to ensure clarity on: what exactly is the decision to be made; who are the decision-owners, the decision-makers, those to be consulted, etc.; and, what is the decision-making process that will be followed.

The ELECTIA approach to decision-making addresses this by asking exactly these questions within the first step of the ELECTIA process: Describe and frame the decision.

  • What is the decision to be made?
  • What is the purpose of the decision?
  • Who is the decision-maker?
  • Who else will be involved and on what basis?
  • What is the process that will be followed? Who will be involved at each stage?
  • For decisions with more than one decision-maker, are we looking for consensus or majority?

For commonly-made decisions, tools such as standard operating procedures (SOPs), project or team charters, RACIs and so on can be used to define and record standard answers to these questions.
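
As a minimal, hypothetical sketch of what such a recorded set of answers might look like for one recurring decision, here is a simple Python representation; the decision, roles and process steps are invented for illustration and are not drawn from the research.

```python
# Hypothetical example only: the decision, roles and process steps are invented.
decision_frame = {
    "decision":       "Approve the quarterly reallocation of the marketing budget",
    "purpose":        "Keep spend aligned with the current sales pipeline",
    "decision_maker": "Head of Marketing",
    "consulted":      ["Head of Sales", "Finance Business Partner"],
    "informed":       ["Regional Marketing Managers"],
    "process":        ["Finance prepares options", "Consultation round",
                       "Decision meeting", "Decision communicated"],
    "decision_rule":  "single decision-maker",   # or "consensus" / "majority"
}

def zombie_risk(frame: dict) -> bool:
    """Flag a frame that leaves obvious room for a zombie decision:
    no named decision-maker or no agreed process."""
    return not frame.get("decision_maker") or not frame.get("process")

print(zombie_risk(decision_frame))   # False: decision-maker and process are defined
```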

A zombie plague

Many modern horror movies and novels, including the recently released “World War Z”, describe a zombie plague or zombie apocalypse in which zombies attack and either kill or infect living humans, and in doing so they create new zombies. “Zombie-ness” therefore spreads like a disease.

It might be taking the zombie analogy one step too far, but is it possible that within large organisations one zombie decision might spawn other zombie decisions? It seems reasonable to believe that one decision that isn’t made might cause other decisions to not be made. Could this happen so often that the whole organisation becomes over-run with productivity-destroying zombie decisions? Large organisations beware!

[Image: zombies]