7 Biases That Everyone Has (And How They F*ck Up Your Life)




Here’s a newsflash for you: there are 7 billion or so people on this planet, and most of us have done or said something monumentally stupid. Like watching Long Island Medium, thinking it would be funny. Or the time I decided chocolate and spaghetti were really meant to be together, or the time I decided to drink a bottle of wine and woke up 2 hours from home on a train, holding a bunch of roses with the roots still attached.

Life is full of dumb decisions. Our brains are against us. Why?

Cognitive bias. Cognitive bias is an error or glitch in thinking where people make decisions based on their emotions, belief systems, hormones, likes and dislikes rather than through critical evaluation and the scientific method.

Bias does exist for a reason. Every day, we are required to make decisions – often at a moment’s notice. We need a way to filter through the vast volumes of information we’re constantly bombarded with, and bias is the brain’s way of simplifying the decision-making process. And yet, if we aren’t aware of these cognitive biases, we’re more prone to making poor decisions, believing in conspiracy theories and getting ourselves kicked out of that gay bar in town, because vodka makes bad decisions for me.

We’re going to look at the ways our brains trick us into making poor life choices. In particular, I’m going to pick on anti-vaxxers because, well – their choices are monumentally fucking stupid. We’ll follow Heather Dexter, the woman who chose her ego over her children and left them to suffer through whooping cough for 6 months, so we can see how cognitive bias can fuck up everyone’s day.

Affect Heuristic: Never Trust Your Gut


It may not come as a shock to you but humans are emotional creatures, and not surprisingly our emotions tend to influence the decisions we make.

Most of us, at one point in our lives, have used a form of decision-making shortcut called the “affect heuristic” when faced with a difficult decision, and most of us have learned that making a decision in the heat of the moment can be a terrible idea.

If you’ve ever gone “with your gut”, taken a dump on your boss’s desk in anger or experienced road rage, you’ve used this shortcut. It isn’t so much a cognitive bias as a glitch in thinking, and it can lead to good outcomes and some very bad ones.

Researchers have found that if people have positive feelings towards an activity, they’re more likely to judge the risks as low and the benefits as high. On the flip side, if their feelings towards an activity are negative, they’re more likely to perceive the risks as high and the benefits as low. And this, class, is partly how anti-vaxxers operate.

How It Fucks You Over


Anti-vaxxers such as Heather have issues with vaccines – that much is clear. Conspiracy theories and lies have fuelled her negative feelings, largely due to her “training” in the field of naturopathy – the alternative medicine version of “throw shit at a wall and see what sticks” in terms of treatment. Indeed, those ideas fuelled her negativity so much that she perceived a child getting a vaccine to be a higher risk (of what, I do not know) with fewer benefits than contracting a deadly disease.

And the mind-fuckery doesn’t stop there. The anti-vaccine movement actually sees getting preventable diseases as a positive thing, so the risks associated with mumps, measles, rubella, whooping cough or chicken pox are perceived as much lower, and the benefits much higher, than those of getting a vaccination.

Suffice it to say, a choice made from emotion and fuelled by incorrect information is a choice you shouldn’t be making. Did Heather re-evaluate her choices once her children’s illness worsened? Don’t be silly, that would be smart or something.

Confirmation Bias: Think You’re Always Right?


Another fun-filled fact: People like to think they’re correct – All The Time.

We like to feel special or superior to others. We get off on the joy of being right. My wife was right that drinking three jugs of cider at my engagement party was a dumb idea, and the only pleasure that got me through my hangover was knowing I was less drunk than she was. I’m sorry I puked in your letterbox, random stranger.

Confirmation bias is the terrible method we use to prove our “rightness”. It happens when we search for or misinterpret information in order to confirm or strengthen our preconceived ideas or belief systems.

It also explains why people have a tendency to dismiss evidence that doesn’t support their beliefs – it’s easier to feel “right” when you ignore everything that proves you wrong. The information people do accept might be gathered from Facebook groups or forums rather than medical journals, science classrooms and laboratory research.

How It Fucks You Over


Engaging in confirmation bias has far-reaching impacts. If you actively ignore evidence in favour of your own belief systems, not only are you getting incorrect information, but your choices can negatively affect those around you.

In Heather’s case, she just couldn’t seem to understand that her choices were affecting her children. Did she reconsider her ideas? Perhaps for a fleeting second, but she doubled down, ignored her family’s pleas and continued toward her end goal, which I presume was to experience what it’s like to purchase child coffins.

A person may have a belief system they’ve invested themselves in, emotionally or financially, and their ego won’t allow them to push it aside. Confirmation bias is the brain’s way of preserving that belief, because it’s easier to feel confident in your “rightness” when the only people around you are patting you on the back and praising your choices.

Bandwagon Effect: If All Your Friends Jumped Off A Bridge…


Say you’re at a party and everyone there loves peanut butter/hates vegemite, and you love vegemite/hate peanut butter.

Imagine your love of vegemite is found out and everyone at the party starts bagging you out and trying to convince you that peanut butter doesn’t taste like horse shit because clearly they’re wrong in every aspect of their lives.

“I like vegemite (true), peanut butter is gross (also true).” “You’re wrong,” they all reply. After a while you lose your mind, give in and start agreeing that peanut butter is awesome and vegemite is evil; after so many people have repeatedly told you dirty lies, you begin to believe them.

This is a phenomenon called the “bandwagon effect”. Essentially, it’s the tendency to do or believe things because many other people do (or believe) the same thing. Not surprisingly, the more people around you who believe something, the greater the chance you’ll come to believe it too.

Additionally, people can adopt these beliefs regardless of the beliefs they already hold; they may even override their original beliefs with new ones. Now, I’m not certain whether Heather was always an anti-vaxxer, but the phenomenon is pretty well illustrated by the article “Confessions of a (Reformed) Natural Mom” from Amber, over at Go Kaleo. You start out with a bit of Googling, find a peer group who all parrot the same ideas and, without noticing, you begin to believe them. Because there is no one left in your life to tell you anything different.

How It Fucks You Over


Tackling our bandwagon biases is important because it frees us up to be more objective and rational about our decisions, and it really could have saved me from the third Harry Potter movie. I know the filmmakers were keen to make bank from the franchise, but fuck your anthropomorphic werewolf. No. Whichever bunch of people thought that was a good idea needs to start looking for other kinds of work.

Engaging in bandwagon bias and believing things without (or despite) supporting scientific evidence, just because a large group of people feels or parrots the same idea, isn’t a rational method of decision-making, and it doesn’t lend any credibility to the ideas you believe in. You haven’t evaluated any of the assertions made; you’ve just gone with the flow.

This mode of thinking can lead to people not vaccinating their children, to hate crimes stemming from learned bigotry, and to terrible sequels that don’t really exist called “Alien 3” and “Alien Resurrection”.

The Dunning-Kruger Effect: Too Stupid To Know You’re Stupid


Named after David Dunning and Justin Kruger of Cornell University, the Dunning-Kruger effect occurs when unqualified or unskilled individuals hold an illusion of being far more knowledgeable or skilled in an area than they actually are. I refer to it as being too stupid to know you’re stupid.

Dunning and Kruger explained it in their 1999 Journal of Personality and Social Psychology article, “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”:

“…people who lack the knowledge or wisdom to perform well are often unaware of this fact. We attribute this lack of awareness to a deficit in metacognitive skill. That is, the same incompetence that leads them to make wrong choices also deprives them of the savvy necessary to recognize competence, be it their own or anyone else’s.”

How It Fucks You Over


Interestingly, Dunning and Kruger found this lack of skill or knowledge doesn’t leave people feeling confused or unhappy. Someone caught in the Dunning-Kruger effect, like Heather Dexter, a person who clearly has no idea what she is doing, presents as a confident, well-spoken individual. She isn’t bothered by her lack of scientific knowledge because she doesn’t know she lacks any.

You can find this with any science denier; they are so convinced of the rightness of their knowledge that they have no idea, and no ability to admit, when they’re wrong.

One very potent example of this is The Food Babe. Vani Hari has made an empire out of scaring people about chemicals, food, life, the universe and anything else she can add the words “I read this and I was shocked” to. Side note: honestly petal, sit down, have a Snickers and calm the fuck down; your blog posts and website make you seem like a store mannequin with anxiety masquerading as a cult leader. In her !*$%-inducing article “The ‘Food Babe’ Blogger Is Full of Shit” over at Gawker, Yvette d’Entremont, Facebook’s own SciBabe, the dirty-joke-slinging, stylish-hairdo-having chemist, took Vani Hari to task for this face-melting embarrassment:

“If you want proof that Hari doesn’t research anything before she puts it online, look no further than this article on airplanes, which she deleted from her site. She claimed that pilots control the air in an airplane, so you should sit near the front to breathe better air. She wrote that passengers are sometimes sprayed with pesticides before flights, and that airplane air is pumped full of nitrogen.
Please recall high school science, in which you hopefully learned that the atmosphere is 78% nitrogen.”

Two seconds on Google, Vani. Two seconds to find a science site for grade-schoolers. You make me ashamed to share the same type of genitals. One could argue that Vani doesn’t know what she doesn’t know, but that doesn’t excuse her posting it on the Internet and presenting it as fact.

Illusory Correlation Effect: Imaginary Patterns


The human brain is really good at finding patterns, and even better at finding patterns that aren’t there. We also have a need to understand the world, or at least ascribe meaning to the events that happen in our lives – it helps us feel in control. Even if that way of feeling in control is to think that governments the world over are conspiring with every drug company ever to keep us sick and kill us all.

Why accept that there isn’t yet a cure for most cancers, when it’s easier to believe that pharmaceutical companies hide cures and that’s why your loved one died? Why accept that autism is genetic when it’s easier to believe vaccines did it? Or that vaccines cause diabetes? Or SIDS? Or whatever else conspiracy theorists want to blame vaccines for, because they have no fucking clue?

The illusory correlation effect happens when a person falsely correlates two or more events as being related. A child getting a vaccine and, shortly after, presenting more obvious signs of autism is a sequence of events that gets falsely correlated. The two events don’t have a relationship, but some vaccines on the schedule are administered around the time autistic children often begin to present more obvious signs of autism.

Additionally, once the false correlation is made, a false causation may follow. One might feel that presenting with signs of autism so close to getting a vaccine means the vaccine caused the autism. Our naturopath frenemy Heather may feel that because the end of her children’s whooping cough coincided with a placebo she gave them, the useless treatment actually cured the illness.
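You don’t have to take my word for how easily independent events line up by chance. Here’s a toy simulation (a rough sketch; the age windows are illustrative assumptions, not real epidemiological data) showing that when two dates are drawn completely independently, a big chunk of children will still happen to show signs “shortly after” a vaccine:

```python
import random

random.seed(42)

N = 100_000
coincidences = 0
for _ in range(N):
    # Assumed: a vaccine dose given sometime between 12 and 18 months.
    vaccine_month = random.uniform(12, 18)
    # Assumed: signs of autism become noticeable between 12 and 24 months,
    # drawn independently of the vaccine date (no causal link at all).
    signs_month = random.uniform(12, 24)
    # Did the signs happen to appear within 3 months *after* the vaccine?
    if 0 <= signs_month - vaccine_month <= 3:
        coincidences += 1

print(f"{coincidences / N:.0%} of simulated children show signs shortly "
      "after a vaccine, even though the two dates are independent")
```

Under these made-up windows, roughly a quarter of children coincidentally fit the “vaccine, then signs” story by pure chance, which is exactly the raw material the illusory correlation effect feeds on.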

How It Fucks You Over


You’re walking down the street and your bag is stolen by a person of a specific demographic. You’re pissed because your iPod was in there, and it was autographed by the only man you would go heterosexual for. You falsely correlate that demographic with bag snatchers and decide that all people in that demographic are terrible human beings who deserve nothing less than having every parking space that looks empty filled with motorcycles they only notice at the last moment, just after they’ve slowed down to turn.

Due to the false correlation you’ve created, you now hug your bag close each time you see a person of that demographic, and thus the sweeping generalisation of bigotry is born. This false thinking is pretty useful, if you consider a poor life choice to be useful. It’s not a stretch to see how this glitch in thinking can lead to poor medical choices, and for Heather Dexter, our naturopath with the qualifications of a tapeworm, her children faced the consequences of her poor choices.

I wrote an article last week discussing alternative medicine, and one point I raised was that alternative medicine typically targets illnesses that resolve on their own. Tummy aches, headaches, hangovers and sometimes – whooping cough. Heather Dexter’s children did get through their whooping cough, but many children do not, especially infants and immunocompromised children. The illusory correlation effect accounts for why people still insist that alternative treatments, like the magic ghost water that is homeopathy (a treatment so thoroughly debunked its continued existence is embarrassing), have played some part in their recovery.

You can see where I’m going with this. The illusion that something has helped you recover from an illness, when it has done nothing at all, can lead to several things: you may continue to waste money on something that isn’t working; you may forego actual medical treatment when the need arises, resulting in harm or death; and/or you might make the first two choices for your children.

Hindsight Bias: It’s Much Easier To Predict The Past


You may have heard of Nostradamus, the French (alleged) seer who published collections of prophecies that have been the bane of my existence since before I was born.

I love my family, but seriously guys, the X-Files wasn’t a fucking documentary, psychics don’t exist and Nostradamus, much like the Long Island Medium, just made a bunch of vague statements that rely on the reader making connections based on their own subjective interpretations.

Hindsight bias is the reason people have a tendency to make connections between things after an event has occurred. We may look back at an event feeling it was predictable, despite having no objective evidence that we could have predicted it.

Hindsight bias is also the reason the saying “things happen for a reason” exists. No. No. Sometimes that reason is that you’re stupid and made a poor choice, and now you need to justify it so you can feel less bad about making it. Jerk.

How It Fucks You Over


Hindsight bias makes it easy to formulate or reinforce conspiracy theories: people make relationships between events where none exist, to the point they feel they could have predicted an event occurring. You can find examples of this on nearly every anti-vaxxer or conspiracy website, like infowars.com: “The NRA Warned LGBT Community One Week Before Massacre” or “Memo: CDC Officials Warned Of Safety Risks, Diseases Amid Immigrant Influx”.

These headlines are deeply upsetting. They serve not only to illustrate that Alex Jones, star of Infowars, must have been born on a highway, because that’s where most accidents happen; they also show how outright lies can lead people to formulate connections and relationships in their minds, making the lies they’re told seem plausible. I’m not going to link the articles here because I’m not in the habit of giving jerks traffic – take a look and you’ll see why his listeners believe in things that can only be cleansed with fire.

Seriously, Alex Jones needs to walk around wearing a condom because if he’s going to continue acting like a dick – he should at least dress like one.

Cognitive Dissonance: Your Inner Hypocrite


Cognitive dissonance. We’ve all heard about this one.

Cognitive dissonance is what happens when a person holds two or more contradictory beliefs. It’s why anti-vaxxers are okay with alternative medicine making billions of dollars in profit per year but vilify the pharmaceutical companies that do the same, it’s why people against genetically modified food are okay with using insulin, it’s how homophobic gay people become state senators and it’s how anti-abortionists rationalise having abortions.

The term cognitive dissonance is actually used to describe the feelings of discomfort that result from holding conflicting beliefs. When there is a discrepancy between beliefs, something must change in order to eliminate or reduce that feeling of discomfort.

This typically results in an individual tying themselves in knots to explain away their contradictory ideas until they feel comfortable holding onto them. It’s also a survival mechanism for those who feel the need to protect their beliefs. A person might rationalise their belief by shifting the blame, deflect by shifting the burden of proof, or cite a conspiracy theory.

How It Fucks You Over


For people like Heather Dexter, the cognitive dissonance is clear. She was able to rationalise that her approach was correct, despite the contradictory evidence her children’s worsening illness was presenting.

Whether it was to preserve her belief system, to protect her ego, or simply because she valued being perceived as right over the safety of her children, who knows. But the point is clear: cognitive dissonance is a dangerous mindfuck. Our brains are more prone to faulty thinking than we realise.

Any one of these forms of faulty thinking can lead to a bad outcome, but put them all together and you have a dangerous combination of emotional response, false pattern recognition, ego protection, a need to be correct and an inability to see when you’re wrong. When we base our decisions on our biases or faulty thinking, we end up with people who keep their children sick, or movements that lie about the preventative medicine protecting us against deadly diseases.

Or throwing up in letter-boxes the morning after.
