How to Overcome Analytical Bias to Become a Stronger Decision Maker

Introduction

How would you rate your own decision-making success? Are you a generally good decision maker? Why isn't everyone a great decision maker? What is holding us back? The answer is usually analytical bias and poor decision-making habits.

Without discipline and training, we tend to be lazy decision makers—even when it comes to important decisions.

The natural tendency in decision making is:

  • To consider only those alternatives that are obvious
  • To analyze only the areas of uncertainty with which we are familiar
  • And then to compare the options through a haze of bias and assumptions

Such decisions are made quickly, follow poor practices, and as a result are typically of poor quality. A good decision is made when the decision maker is fully informed about the implications of their choice. The best decision is the choice that offers the greatest chance of the most desirable outcome. Knowing what is best requires that all possible alternatives be identified and then evaluated objectively.

Well-informed, balanced analysis leads to the kind of decision you would want a brain surgeon to make on your behalf. High-quality decisions therefore require that you know as much as possible about the pros and cons of how different choices might turn out. A good decision is one that you make while understanding the risks you are taking.

The bad news is that most people are not very good at objective problem analysis. In the end, problem solving is performed by trial and error. Even highly educated people typically muddle through problem analysis in a haphazard way. Most people are content with an occasional success and assume that no one else could do any better. Becoming a better decision maker begins with an understanding of the forces that are working to undermine the quality of one's decision making. This paper is a discussion of the effect of bias on the way we perceive problems and on the reasoning that leads to our decisions.

Bias

Bias is a viewpoint that prevents people from being objective. Bias is created by experience, education, and genetics. It is the expression of how one thinks and reasons about particular subjects. Bias, in its various forms, discourages us from being thorough in our problem analyses. It exaggerates our understanding of the factors that relate to a decision and encourages quick, poorly informed decisions.

There are several mental forces (influences) that contribute to our bias or viewpoint. These forces are always at play in the back of our minds, undermining the quality of our reasoning. They are the barriers to consistently brilliant decisions that we all encounter.

Instinct and Intuition

With natural or intuitive problem solving and decision making, people are content to make quick decisions based on their biases and beliefs, rather than based on facts. Decisions are made quickly rather than thoroughly. People use their "guts" rather than their brains to arrive at conclusions.

Even important decisions are typically given as little thought as possible. Methodical problem analysis is extremely rare. It's as if additional time and energy spent on understanding a problem could not change the quality of the decision.

Intuitive decision making follows familiar patterns. The more familiar a situation seems the less analysis is applied. For example, in many homes the same remedy, such as chicken soup to cure a cold, is used to treat an illness year after year, even when it does not work.

Typically there is no systematic analysis of the options, alternatives, or implications of a particular decision. We jump to conclusions. If any problem analysis is performed it is used to find evidence that supports a familiar or convenient solution. Analysis tends to be self-satisfying, not self-critical.

In order to develop into effective decision makers, it is necessary to overcome the tendency to choose a convenient solution. We must not let instinct or intuition control our problem analysis.

Barriers to Brilliant Decisions

Broadly speaking, there are four categories of mental influences that affect decision making:

  • mental shortcuts
  • emotions
  • stubbornness
  • focus

These forces are the barriers to objective problem analysis. Collectively, they undermine one's willingness to be thorough and, therefore, one's ability to be objective. Understanding the combined effect of these forces is an important first step toward becoming a better decision maker.

Mental Shortcuts

Strange as it may seem, most routine decisions are made unconsciously. Imagine eating a meal. Other than what to eat and when to begin, the process of eating is automatic. No thought is required for raising a sandwich to your mouth, chewing, and swallowing. All of the decisions about how to eat are made unconsciously. Most routine decisions are dominated by unconscious thought processes. They use mental shortcuts to simplify repetitive actions. Our brains work like planes operating on autopilot.

Imagine seeing someone in a white lab coat jogging across a busy road toward two cars that are stopped by the roadside. People are milling around. Based on what you have seen, you assume that the person in the white coat is a doctor and there has been an accident. Someone is hurt and needs assistance.

With just a few clues, we are able to paint a mental picture of what is happening. Mental shortcuts are used to fill in the details for which we lack any real evidence. Of course, the details we add are imaginary. They are driven by a biased interpretation of what we have witnessed. The point is that much, if not most, of our thinking is done on autopilot. Our mind interprets situations for us by filling in details based on previous experience.

Mental shortcuts are an integral part of routine thinking. They help us cope with a complex world by assimilating thousands of bits of sensory information every day, which might otherwise drive us crazy. They are a necessary convenience. Unfortunately, mental shortcuts encourage superficial thinking. They are the motors of our bias.

Shortcuts, leading to biased analysis of situations and problems, are always at work in the background of our minds. They help us assess situations by simplifying problems and drawing parallels with past experiences, whether this is justified or not. Mental shortcuts are assumptions about situations based on past experience. These assumptions are combined with observations to provide an adequate explanation, and then the mind moves on to another issue.

The interpretation of a situation is presumed to be correct unless proven otherwise, even though it may rest on no actual facts, just suppositions. Mental shortcuts discourage thorough problem analysis. They exaggerate one's understanding of situations and undermine objectivity.

Patterns in Everything

The basis of mental shortcuts is patterning. We think in patterns—our minds use patterns to organize the world around us. Patterns provide explanations for how things work and how they relate to one another. We seek explanations for everything, and therefore we find comfort in patterns. The compulsion to explain, to determine the patterns, is what drives scientific inquiry.

Unfortunately, the compulsion to explain is not bounded by reason. If a logical explanation does not fit, or we do not have enough information to support a logical explanation, the mind will make up its own explanation. When observing a situation, the mind needs only a few clues—evidence of a pattern—in order to fill in the details.

For example, while walking to work you smell smoke and hear a siren in the distance. Instantly, you link the two and imagine a building on fire. However, contrary to the pattern being imposed on the data, there is no fire. The smell of smoke is from a barbecue, and the siren is from a traffic cop—two unrelated events. But most of us do not bother to test our assumptions, preferring to believe what first occurs to us.

Patterning is a reflex—a mental shortcut—and the problem is that patterns imposed on a situation are not necessarily correct. We just assume they are. Later in the same day we may even tell coworkers that we "saw" a fire on the way to work. We believe the conclusion to which our mind jumped and reality gets confused with imagination. The boundary between the two is lost.

Patterns Allow for Mental Shortcuts

Explanations create patterns, and patterns allow our mental autopilots to be programmed to follow familiar routines. Mental shortcuts help us simplify the world. They are essential. The downside is that they simplify our thinking about the world, leading to superficial analysis and poor-quality decisions.

It is not that patterning and mental shortcuts are necessarily bad. We would be unable to function in a complex world without them. For trivial decisions, such as what to eat for lunch, having our thinking driven by mental shortcuts is not a significant weakness. The problem is that because mental shortcuts are unconscious, we are not aware of their impact on the quality of our problem analyses, whether that impact is good or bad. Everyone presumes to look at problems objectively. In reality, truly objective thinking is not possible. No one is a machine. All thinking is influenced by bias.

Bias deepens with experience. Consider your opinion on the subject of decision making. You started reading this paper with beliefs (bias) about how people think. After reading this paper, your modified beliefs (bias) will be a blend of what you already believed plus selected opinions expressed here. Which of the ideas expressed in this paper are you most likely to remember? The ones that support what you already believe. People naturally collect evidence that supports their beliefs. Anything that contradicts those beliefs tends to be treated as unreliable and ignored.

If you had a strong opinion about decision making before reading this paper, the arguments made will have two possible effects:

  1. If you disagree with anything that is said, you will probably ignore it.
  2. If you agree with the arguments made, your bias will become more deeply entrenched.

It is a rare person who truly reads with an open mind.

Any Explanation Will Satisfy

If the mind cannot see a pattern in the information that has been received, it will force an explanation to fit. When presented with bits of information that have no particular relationship, we find one anyway—even if the explanation is not valid. Subconsciously, it is more important to have an explanation than for the explanation to be sensible.

For example, if you smelled smoke and heard a siren, would it not make sense to expect a pillar of smoke to be visible somewhere nearby? Without more evidence of a house fire, the smell of smoke and sound of a siren should not be linked together. But instead of looking for evidence to disprove our assumptions, we accept the idea that first jumps into our heads and leave it at that.

Convenient Explanations Stop the Search for the Best Answer

When it comes to decision making, people tend to be satisfied with the first explanation that fits. Once we have decided on an explanation, we give up the search for alternatives. For example, the newspaper reports an increase in drug-related crimes in your community. In a separate report, you read that there are more immigrants in your community. Using mental shortcuts your mind relates the two stories and comes to a conclusion that you believe is logical because of your biases and assumptions.

Without any real evidence, you start to believe that crime and immigration are related. Over time you pay special attention to news reports that appear to support your position and gradually become ever more convinced. Finding additional evidence that supports your conclusion seems to prove that you were right in the first place. No effort is made to prove that you were wrong. No effort is made to test the validity of the "facts" used as evidence.

Emotions Help Limit the Field of Choices

Mental shortcuts are not the only problem when it comes to intuitive decision making. Emotions, focus, and stubbornness can also lead to distortion. In this context, emotions are defined as anything that affects your mental state; the term does not refer only to emotionally charged decisions.

Whether you're experiencing sadness or spine-tingling elation, or something in between, your emotional state affects your ability to objectively analyze a situation. Emotion limits our ability to reason objectively, making an already dysfunctional process even less effective.

The Ability to Concentrate Despite Distractions

Focus is the ability to concentrate on a task. It is what permits a mother to prepare dinner while six children run rampant at her feet. Focus is the ability to filter out sensory noise and to concentrate on accomplishing a task without being distracted. It is essential to our very survival.

However, when it comes to problem solving, there is a huge downside to the tendency to focus. A narrow focus encourages us to view problems one-dimensionally. We latch on to the first solution that provides an explanation, even though the explanation may not be logical. Having focused on a solution, we become content with it and lose interest in alternatives. Mental laziness kicks in. Any evidence that contradicts our choice is discredited and devalued. A narrow focus in problem analysis limits the field of view. It allows us to see only what we want to see.

Stubbornness

People are stubborn. Once an idea or decision takes root in our minds, it is nearly impossible to dislodge. It is very difficult to change the patterns that people use to explain the world. Many of our most cherished beliefs are simply illogical. We know it, but it does not matter; we rationalize away the disparity.

For example, most people think they are good decision makers. It takes a training course and a persuasive instructor to convince even a few students that this belief may not be correct. A successful training course will convince only a small percentage of students that becoming a skillful decision maker requires considerable effort. Most will assume that they are among the few with natural abilities. Arbitrary beliefs to which we stubbornly adhere wreak havoc on our ability to analyze objectively and to solve problems effectively.

Battling with Intuitive Thinking

The point is that a myriad of natural mental barriers prevent us from being effective decision makers. These mental forces undermine objectivity and result in self-deception about our level of knowledge and perspective. Intuitive decision makers do not actively seek alternatives that would conflict with the preferred, convenient solution. Doing so would slow things down, and the logic is, "We have no reason to believe that a decision made more slowly would be any better."

In order to improve the quality of one's decision making, it is necessary to understand the shortcomings of intuitive, unstructured thinking. The first step in becoming a rational decision maker is therefore to develop a comprehensive understanding of how one's mind operates versus how one assumes it operates. The second step is learning how to dismantle problems into manageable bits that can be more easily and systematically analyzed.

Critical Thinking to the Rescue

Moving beyond the natural boundaries of our own minds requires that we think critically by analyzing our knowledge and that of others. Critical thinking is the discipline of making sure that you use the best thinking of which you are capable in every situation. In order to become a critical thinker, you must understand how the mind reasons, and use that understanding to balance your analyses.

What are the symptoms of being a critical thinker? Critical thinking means questioning your own and other people's assumptions, reasons, motivations, and outlooks. Further, this questioning must not be focused on generating mere contradictions; instead, it should be focused on discovering context, reasoning, and point of view. Critical thinking asks questions to answer questions. It seeks reason and logic as the basis of understanding.

Determining the Extent of Our Ignorance

In effect, critical thinking puts the extent of our real understanding and knowledge into perspective. It illustrates what we do and do not know by revealing the nature and significance of assumptions and gaps in information.

The surprising outcome of critical thinking is not to demonstrate our knowledge of a subject, but rather to illustrate our level of ignorance. And that is the point.

The quality of a decision, judged in foresight rather than in hindsight, is the degree to which one is informed about the risks involved and the uncertainties associated with the alternatives. We never have perfect information. Knowing what we do not know is what leads to the best possible decision.

Conclusion

People are all too willing to come to a conclusion and make decisions on the basis of little or no valid information. We let personal biases and other intuitive mental forces push our problem analysis towards quick decisions.

In contrast, good decision makers battle the tendency toward lazy thinking and make the effort to really analyze issues. To become a better decision maker is to become a critical thinker. Only by being an effective critic of your own reasoning—separating fact from assumption—will you be able to make consistent, high-quality decisions.

About the Author

Brian has been a contract instructor for Global Knowledge since 1999. He divides his time between management consulting, project management, technical writing, and professional development training. Brian is an entrepreneur who has started companies in such diverse fields as fish farming, woodwork, gift manufacturing, and catering. He is the author of numerous training courses relating to professional skills, project management, and decision making.