Tuesday, February 2, 2016

Denial 101: Something I find hard to believe ...

Previously we looked at the effect of peer pressure when we try to adopt the new idea that's "all the rage".  Today, as promised, we're going to look at the impact of denial, an effect I find very much paired with the "group delusion".

As originally planned, I would have dived right in, exploring the places where we find denial as a very real effect within IT departments.  However, as I wrote and expanded my ideas, I found I had a little too much material - so I ran it by a friend, who said it really deserved to be split over several posts ... so, change of plan!


Although I talked a little about denial in my piece on the psychology of The Force Awakens (and believe me, have I seen that piece validated in recent weeks), I want to spend this first post going further down the rabbit hole to understand more about it.



Like any of the effects we've been talking about, it's easy to feel a little smug and superior as we discuss it.  As if these issues are something that "happen to other people".  But denial has such a powerful hold on us as human beings because it works on our emotions (like many of the psychological effects we're looking at).  We'd love to think that we use rational thinking to control our emotions, but often it works the other way around - our thinking tends to be enslaved to rationalising the emotional outcome we want.

We are all led astray in our thinking when emotions are entangled.  The point of this series of articles is to shed a bit of light on the common traps we fall into, so we can be a little wiser - maybe pausing to ask ourselves, "is this a MacGuffin effect?".

In a nutshell, denial is the rejection of facts because of an emotional reaction.  I like how this is covered in the Your Deceptive Mind chapter on denial, which says that people commonly fall into a denial trap: they start with their desired (emotional) outcome, and use it to systematically reject any evidence which does not support that outcome.  This form of reasoning increasingly requires the presence of conspiracies to support their model of thinking.

And indeed a really good example of this is the piece I wrote in 2014, where we tried to convince someone who believed in a flat Earth that the Earth was round.  In that piece they initially respond to evidence with vague science, saying "the Earth is round ... but flat ... like a plate".

Then it's suggested they could try Skyping someone in another part of the world, to see if it's night there whilst it's day where the sceptic is - however, they're convinced the other party will be part of the conspiracy, and so they debunk the experiment.

And of course there's the line which several people have said is their favourite: "When someone flies from London to Auckland via America, it's okay, because that route exists.  But if they go via Asia, then the pilot flies around Africa for a bit until the passengers get disorientated and it gets dark.  He then heads via America to New Zealand, stopping off at a secret replica of Singapore they've built deep in the Andes".  More conspiracy.

So what leads so many people to go down that path?  Whenever we make any kind of decision, we essentially make an investment of time, money, ego, and pride in that decision - we're obviously committed to that decision "coming out alright".

But sometimes pride and ego mean we just stick to that decision, even in the face of new and glaring evidence.  We want to be seen as someone consistent, never as someone who is wrong.

So rather than rectify our decision, we'll choose to undermine any contrary evidence, just like our flat earther.  This is known as "escalation of commitment" - where people continue to justify committing time and money based on an initial decision, because "we've already invested in this course of action".  The phrase "throwing good money after bad" perfectly sums up this behaviour.




Here's an everyday example for those of you who can remember driving before the days of SatNav.  A couple of friends, Thelma and Louise, are driving to Mexico City.  They're supposed to be stopping at the Grand Canyon as they do so, and they passed a sign that said they'd see it in 10 miles ... but that was 15 miles ago.

Thelma wants to go back as she thinks they've missed a turning.

Louise says she has an excellent sense of direction, and is sure they'll find the Grand Canyon soon enough.

Thelma says she's seen a sign saying they're heading towards a place called Bitter Springs, which means they're heading in the opposite direction to both the Grand Canyon and Mexico.

Louise is sure that Thelma is just reading the map wrong and is not about to turn around now.  This way has to end up in Mexico eventually.

Right?


A couple having an argument about directions, where someone just won't turn back, because "we've come this far".  Sound at all familiar?

Our flat earther has invested time and pride in his worldview, and because conceding that worldview means conceding his pride, he refuses to.  Even to the point of turning down a "round the world cruise" he won in a lottery, because accepting it would mean admitting he was wrong.



So - the question is, where does escalation of commitment happen in testing?  And I'm afraid to say, it happens everywhere!  Pretty much anywhere you've sunk time and money into doing something, there's a state of mind that wants to continue that course of action ... because it has to pay off eventually, right?  Oh dear God please, it has to pay off!!!



Other everyday examples of denial and escalation of commitment you might want to think about ...

  • A friend pointed out that the Vietnam War followed this pattern all too chillingly.  It started out as a small involvement of US forces.  But as things went badly, more and more forces were brought in on the American side, because "we've come this far".  You see something similar in the "one big push" mentality of The Great War, particularly in The Battle Of The Somme.  I talked about the Battle Of The Somme, and "sticking with the plan" in the face of changing evidence, here back in 2013.
  • "I read about this amazing diet last week.  I mean, I'd tried a few other diets over the years, but this one actually works.  I read it in a magazine."  You can substitute the word "diet" with any piece of revolutionary fitness equipment which "is like having a gym in your own home", and which has so changed the world it's not available in shops ... only to order over the phone during a midnight infomercial slot.  [It's like the gyms are conspiring to keep your membership]
  • The right wing American politician whose response to this month's school shooting death toll is "the victims and their families will be in my prayers".  Just like they were last month - except this time they'll pray really hard.
  • The Aztecs used human sacrifice to appease the god of rains.  If there was no rain, then obviously they hadn't sacrificed enough people.  I talked about this here in 2014.
  • Variants on this meme, when politicians have committed to a policy despite it not producing the results they promised ...



If any of those points made you squirm uncomfortably, then well done, you're waking up.



[By the way - I've taken a few shots at the right wing there, and I'm an out-and-out socialist at heart.  But critical thinking still applies to me - especially if someone is sharing on social media a "news item" which aligns with my beliefs.  I'm often somewhat suspicious if it falls into the camp of "I knew it", and start checking up on it.  To be honest, such fake stories annoy the heck out of me, because they undermine my political stance, and make it just way too easy for friends who hold different opinions to "score cheap points" over me.]
