Not Quite Rational

Cognitive biases – what they are and how they affect our decisions

By Dylan Rose
07/03/2019

Background

Many of us believe ourselves to be mostly rational most of the time. We tend to believe that we are in complete control of what we do and that we make the best decisions we can with the information available to us.

This notion is, however, patently false. As I covered in my previous article, much of what we do is dictated by habit. According to one study published in the Journal of Personality and Social Psychology, between one-third and one-half of our daily activities fall under the category of habits (Wood, Quinn, & Kashy, 2002).

Furthermore, even much of what isn’t technically habit is decided by autonomous processes in our brains without ever reaching conscious thought. In other words, we can do things without any conscious awareness that we are doing them.

System 1 and System 2

Daniel Kahneman, a psychologist, economist, Nobel laureate, and authoritative figure in judgment and decision-making, has covered this topic extensively in his book, Thinking, Fast and Slow.

He draws a distinction between what he refers to as “System 1” and “System 2.” System 1 is a quick, efficient, automatic, and intuitive mode of thinking; System 2 is a slower, more deliberate, effortful, and accurate one. System 1 is in charge of decisions that need to be made quickly, when there is either no time to think effortfully or no perceived reason to do so. System 2 is in charge of more complex analyses that demand special attention and accuracy (Kahneman, 2011, p. 21).

Who's Really in Control?

The thing to note here is that System 2, which we tend to identify with as “us,” actually takes a back seat to System 1, over which we have no control. As Kahneman states, “In the unlikely event of this book being made into a film, System 2 would be a supporting character who believes herself to be the hero… the thoughts and actions that System 2 believes it has chosen are often guided by the figure at the center of the story, System 1” (Kahneman, 2011, p. 31).

Cognitive Biases

Unfortunately, both systems can be consistently fooled. Cognitive biases are systematic errors in our thought processes that influence our judgments and decision-making. They seem to be inherent flaws in the wiring of our brains that lead to consistent, systematic, predictable, reproducible irrationality. We have come a long way in understanding them, but we have not overcome them.

Cognitive biases help explain all sorts of irrationality in the world. How is it that otherwise highly intelligent people can hold onto blatantly irrational, prejudiced thoughts? Why do so many people have difficulty managing their money? Why are politicians elected based on charm, good looks, or anything other than their ability to perform their duties effectively? These little brain malfunctions are so pervasive that they influence everything from our daily decisions to who runs our nations.

Truthfully, you can’t escape cognitive biases. Even learning of their existence and coming to understand them doesn’t save you from their influence. They are the embodiment of the phrase “to err is human.” Coming to understand them and how they influence our decisions, however, allows us to occasionally catch ourselves in the act and mitigate a mistake as it is being made.

Below, I outline some of the most common cognitive biases and how they affect our decisions.

Overconfidence Effect

The overconfidence effect is a bias in which people’s confidence in their own judgments is significantly greater than the objective accuracy of those judgments warrants.

To illustrate this, consider what would happen if you asked someone the likelihood of some event occurring, such as their favorite sports team winning the big game, and they responded with a confident 100%. If human confidence were perfectly calibrated, that 100% assessment would be accurate 100% of the time.

Now, if you went on and asked a fan of the opposing team, they might express 100% confidence in their own team winning. Even in a less extreme example, imagine both fans estimated that their own team had a 60% chance of winning. Both estimates can’t be right; between the two outcomes, there is only 100% of probability to work with.
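
To make the arithmetic concrete, here is a minimal sketch in Python (the numbers are illustrative, not drawn from any study):

# Two mutually exclusive outcomes: one team wins, the other loses.
# Coherent probabilities for the two outcomes must sum to 100%.
fan_a_estimate = 0.60  # fan A: "my team has a 60% chance of winning"
fan_b_estimate = 0.60  # fan B: "no, MY team has a 60% chance of winning"

combined = fan_a_estimate + fan_b_estimate
excess = combined - 1.0  # anything above 100% is overconfidence somewhere

print(f"Combined estimate: {combined:.0%}")  # 120%
print(f"Excess confidence: {excess:.0%}")    # 20% that cannot exist

At least one of the fans, and possibly both, is more confident than the facts allow.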

This effect might explain why so many people go broke opening businesses in risky industries. The U.S. Small Business Administration reports that 50% of new businesses fail during the first five years, and 66% fail during the first ten years.

At this rate of failure, prospective entrepreneurs ought to balance risk against reward and be mindful of where they allocate their capital. Yet many new business owners are 100% confident that they can make their business succeed with the right amount of talent, dedication, and entrepreneurial magic.

They throw all their personal resources at the business, take out a second mortgage on their home, and risk losing everything. The reality is that even if you do everything right, your success still depends heavily on a variable you can’t control: random chance. The market might dip the day you open your doors, and you would have had no way of knowing in advance.
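
To see what balancing risk against reward actually looks like, here is a rough expected-value sketch. The 50% failure rate comes from the SBA figure above, but the dollar amounts are hypothetical, invented purely for illustration:

# Back-of-the-envelope expected value of opening a new business.
# The survival rate is the SBA's five-year figure; the payoffs are made up.
p_survive = 0.50             # roughly half of new businesses last five years
profit_if_success = 200_000  # hypothetical five-year profit
loss_if_failure = -150_000   # hypothetical sunk investment (savings, second mortgage)

expected_value = (p_survive * profit_if_success
                  + (1 - p_survive) * loss_if_failure)
print(f"Expected five-year value: ${expected_value:,.0f}")  # $25,000

Even a venture with a positive expected value like this one carries a coin-flip chance of losing everything, which is exactly the part the overconfident founder discounts.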

Hindsight Bias

Hindsight bias refers to the phenomenon in which people believe that events that have already occurred were more predictable beforehand than they really were. A simple way to sum it up: “I told you so.”

A common example of this involves past elections. When Donald Trump was elected President of the United States in 2016, many people were surprised. Many others, however, claimed that it had been obvious that Trump would win the presidency.

It was, in fact, not obvious. Pre-election polls aren’t always accurate, and there are always unknown factors. Prior to the election, most polls placed Hillary Clinton at the top, with a strong chance of winning. As Election Day began, The New York Times projected Clinton to have over an 80% chance of victory. As more votes rolled in, however, her chances lessened, eventually dipping below five percent. In the end, Donald Trump was the victor, but his win was never obvious.

Availability Heuristic

The availability heuristic is a mental shortcut our brain employs when gauging the frequency and importance of an event. In a nutshell: information that is easily recalled is perceived as more important and more common than it really is.

For example, people tend to fear murder, plane crashes, and terrorism far more than they fear heart disease, diabetes, and Alzheimer’s disease. Murder, plane crashes, and terrorism, however, are exceedingly rare causes of death. Heart disease, diabetes, and Alzheimer’s disease, on the other hand, are some of the leading causes of death in the United States. 

Since violent causes of death lend themselves well to sensationalism, they tend to be covered much more frequently in the media, and the stories elicit powerful emotions. This repeated exposure, combined with the strength of the emotions it evokes, makes the information readily available. Consequently, we tend to believe that these causes of death are significantly more common than they truly are.

Anchoring

Anchoring occurs when a decision is skewed by a piece of information presented to us before we make it.

Imagine, for example, a salary negotiation. If the employer suggests the first figure and lowballs it at, say, $28,000, the employee anchors to that figure and adjusts to one they feel is more reasonable, such as $35,000. After a few rounds of negotiation, the agreed-upon figure is likely to be closer to the first number cited. Let’s assume they end somewhere around $32,000.

Alternatively, imagine that the employee suggests the first figure and opens relatively high, at $42,000. The employer now anchors to that figure and adjusts to something they are more comfortable with. The employer is stuck countering at $35,000, the very figure they deemed too high in the first scenario. After negotiation, they settle on $38,000.

In this example, offering the first figure made a $6,000 difference in the outcome. The anchoring effect influences our decisions in far more circumstances than finance, and it often affects the end result significantly.
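
As a toy illustration, here is a naive “split the difference” model in Python. Real negotiations are messier than this, but even a crude model lands near the figures above and shows how much hinges on who speaks first:

# Toy model of anchoring in a negotiation: each side makes one offer,
# and the settlement lands at the midpoint, dragged toward the anchor.
def settle(opening_offer: float, counter_offer: float) -> float:
    """Naive model: the parties split the difference."""
    return (opening_offer + counter_offer) / 2

employer_anchors = settle(28_000, 35_000)  # employer opens low  -> $31,500
employee_anchors = settle(42_000, 35_000)  # employee opens high -> $38,500

print(f"Employer anchors first: ${employer_anchors:,.0f}")
print(f"Employee anchors first: ${employee_anchors:,.0f}")
print(f"Swing from the opening anchor: ${employee_anchors - employer_anchors:,.0f}")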

Conclusion

Cognitive biases are a well-documented topic of research, with applications in fields such as psychology, sociology, and behavioral economics. Understanding them can help us better understand ourselves, as well as the society we live in.

If you wish to learn more about cognitive biases, you can find a rather complete list of them on Wikipedia here: https://en.wikipedia.org/wiki/List_of_cognitive_biases. If you follow the sources, you will find a great deal of reading material.

I also highly recommend checking out the book Thinking, Fast and Slow, by Daniel Kahneman, as well as Predictably Irrational, by Dan Ariely. 

Sources: 

Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.

Wood, W., Quinn, J. M., & Kashy, D. A. (2002). Habits in everyday life: Thought, emotion, and action. Journal of Personality and Social Psychology, 83(6), 1281-1297. https://doi.org/10.1037/0022-3514.83.6.1281

The New York Times. (2016). 2016 election forecast. https://www.nytimes.com/elections/2016/forecast/president

Dylan Rose

Dylan is a California-born, Utah-based author with a background in business, psychology, and behavioral economics. He likes to write about topics that can provide useful tools and positively impact the lives of his readers.
