"Thinking Fast and Slow"
I’m inspired to write today by the book Thinking, Fast and Slow by Daniel Kahneman.
The author tells a story about a man and his kid who come across a lion for the first time. The man doesn’t know what it is, so he gets too close and the lion eats his kid. Horrible story to start a blog post with, but the point is that the next time he spots a lion, even far away, he immediately hides himself and his remaining kids. Humans aren’t often hunted down by wild predators these days, but there are new ways of becoming prey that make this idea worth writing about today. This story sets up the two types of thinking described by Kahneman, which he calls System 1 and System 2. Personally, I prefer names that mean something so I can remember which one is which more easily, but since it’s only two systems, I’ll let it slide.
System 1 thinking is fast and automatic, much like the father after the trauma of his first lion encounter. This is the system where you make decisions based on past experience. I call this “programming” in my office. In System 1, you make decisions with cognitive ease; you barely need to think at all. This can be good if it’s based on mastery, like a skilled jazz player improvising, or bad if it’s based on emotional trauma.
System 2 thinking is slow and logical. It’s deliberate and takes some effort. This is making decisions with cognitive strain.
Many of us think we are operating in System 2 when it’s actually System 1. We are highly programmed, much more than we realize, and there are many ways we can be taken advantage of because of fast, automatic System 1 thinking. I’ve listed a few below, but there are many more. The goal here, as in all things for me, is to have more peace of mind. I’m not describing these concepts to cause anxiety or fear of being taken advantage of; I’m doing it to encourage empowered decision making.
Anchoring describes how we lean on a reference point, usually whatever number we saw first, to make estimates. Whoever shapes what we anchor to can manipulate our conclusions. For example, you might not have thought a particular jacket from TJ Maxx was worth $30 if I simply held it up and asked you, but if I told you the suggested retail price was $99 before asking, that $30 might look like a great deal. That’s why suggested retail prices are listed above actual prices.
The Availability Heuristic is another one: we judge how likely something is by how easily examples of it come to mind. If you watch the news, you would confidently guess that terrorists are a bigger threat to your life than TVs, because you constantly see terrorism stories on the TV. But what the TV doesn’t want you to know is that, statistically, you are 55 times more likely to be killed by a TV falling on you. Same goes for a falling coconut. I just looked that up, actually. I was afraid to look it up while living in Florida because I didn’t want to know the truth (that’s known as the Ostrich Bias, if you care to know), but I always suspected those things were killers. I thought about it every time I went for a run.
There is also a bias called the Jones Effect, or Bandwagon Effect. This is the idea of “keeping up with the Joneses.” Humans in general want to feel part of a tribe, or maybe more accurately, we don’t want to be left out. Therefore, we can be easily persuaded to choose whatever is popular in our social group: liking the music our friends like, or buying the product all our friends are buying.
The Confirmation Bias is when we seek out information, or filter data, based on what we already believe. This tends to be a big problem when conducting scientific experiments: it’s easy to bias an experiment’s design in order to prove what you already think you know. This is why you should also investigate where the funding behind a study came from, especially when it involves nutrition.
The Framing Bias is the secret to salesmanship: you influence choices with your wording. On average, people think that “98% fat free milk” has less fat than “1% fat milk.” Not kidding about that. Do the math: 98% fat free means 2% fat, twice as much as the 1% milk. People skip the arithmetic and simply assume “fat” milk has more fat than “fat free” milk.
The Sunk Cost Fallacy is about letting what you’ve already spent, and can’t get back, drive your next decision. Think about a poker player who has lost $1000 and keeps playing to win it back, telling himself the odds are now due to shift in his favor (that last part is the closely related gambler’s fallacy). In reality, the odds are the same every time he plays; every game starts fresh, like flipping a coin, where each toss is 50/50 even if the last 10 in a row came up tails. This also applies to someone who decides to get healthier but can’t throw away the junk food in their home; they have to finish it before eating healthy because they don’t want to “waste” it. The moral of this bias is to not let past decisions you can’t undo control your ability to make good decisions in the present.
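If you want to see that coin-flip claim for yourself, here’s a tiny Python sketch (my own illustration, not something from Kahneman’s book). It flips a fair simulated coin a million times and checks how often heads shows up right after a run of ten tails.

```python
import random

# Simulate a fair coin and measure the chance of heads right after
# a run of at least ten tails. (Illustration only; values are simulated.)
random.seed(42)

flips = [random.choice("HT") for _ in range(1_000_000)]

streaks = 0      # how many flips were preceded by 10+ tails in a row
heads_after = 0  # how many of those flips came up heads
tail_run = 0     # current run of consecutive tails

for flip in flips:
    if tail_run >= 10:
        streaks += 1
        if flip == "H":
            heads_after += 1
    tail_run = tail_run + 1 if flip == "T" else 0

print(f"Flips preceded by ten tails: {streaks}")
print(f"Share of those that were heads: {heads_after / streaks:.3f}")
# Comes out right around 0.5 -- the coin has no memory of the streak.
```

However long the losing streak, the next toss is still 50/50.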
Please don’t be an anxious consumer because of this post. Try to be an informed consumer. Slow down and take advantage of System 2 thinking before making big decisions.