Thursday, June 18, 2009

We’re All That Gullible

Paper we’re looking at: "You Can't Not Believe Everything You Read" by Daniel T. Gilbert, Romin W. Tafarodi, and Patrick S. Malone

You know what’s weird? We all use the word “gullible” a lot, but it’s actually not a real word. Not in the dictionary. It’s just one of those things we say that’s not real.

Did you believe me? Was I convincing there? Did you run to dictionary.com to check if I was a lying sack of sugar?

Well, I was lying, of course. Gullible is a word. And chances are, you’ve seen that particular joke a time or 12 before. So why did I bother saying that here?

Because it dovetails PERFECTLY with the paper we’re going to look at. Gilbert & his team show something that’s damn near magical in this paper – namely, that we believe EVERYTHING we read or hear, and then work actively to DISBELIEVE it.

Think about that for a second. For now, it doesn’t matter if you believe it’s true – I’ll show you the studies they did to prove this. But think about the implications.

First, it means that it’s hard to disbelieve. It takes hard mental effort. When we look at some of Baumeister’s papers, we’ll see that effort (mental energy) is a physical quantity, fueled by glucose. But for now, let’s just agree that mental effort IS effort, that it takes energy, and so when we’re tired, or “drained”, or low on mental energy, we don’t have as much to expend.

Fair enough so far, I know.

So what that means, assuming Gilbert is correct in this paper, is that if we’re operating on REALLY low energy, we won’t be able to DISBELIEVE anything. We’ll be the prototypical sucker – believing anything we’re told, because we can’t summon up the energy to disbelieve it.

Now, I don’t know about you, but that’s pretty scary to me.

It also explains a lot.

For example, it explains why cults ALWAYS take at least a few steps towards reducing their followers’ energy, especially in the “early” days. Whether it’s feeding them gruel, or having them go through strenuous physical exertion, it’s pretty clear that they’ve understood this concept for a long time – When people don’t have physical energy, they don’t have mental energy. When they don’t have mental energy, they lose their ability to disbelieve. When they lose their ability to disbelieve, they’ll believe any stupid thing you tell them about your spaceship to God and sign over all their money to you.

Hmmmm – Anybody here hungry, tired, and interested in joining a cult?

I kid, I kid.

It also suggests, though, that if you’re distracted after “learning” something, you’re more likely to believe it. And THIS could be something VERY dangerous for all of us, especially as the world constantly gets more and more distracting.

“The company’s in great shape,” you read in a memo, then the phone rings and you answer it. Now, you’re much more likely to believe that the company’s in great shape, even if you forget about the memo. And even if the memo had gone on to say “Compared to Enron.” You read it, you got distracted, you believe it.

Is it that simple? Are we THAT gullible? Probably not. But let’s take a look at Gilbert’s research here, before we feel too confident…

The first thing to consider is how he assessed how much mental energy the subjects in the studies had. In these studies, he put them under “cognitive load”, a fancy schmancy way of saying that he busied them with other things. Basically, multitasking. In one study, he used time pressure instead of cognitive load. In other words: “Here’s 30 minutes’ worth of information. You’ve got to process it all in 3 minutes.” Gee, that sounds like most companies I’ve worked with….

A quote from the paper: “The most basic prediction of this model is that when some event prevents a person from "undoing" his or her initial acceptance, then he or she should continue to believe the assertion, even when it is patently false.”

That’s a pretty scary prediction.

So let’s back it up.

In the first experiment in this paper, there were 68 subjects who were to act as judges and determine sentencing for a defendant. They were told that the reports they read had both true and false statements in them, and even more helpfully, the false statements were color-coded. (Wouldn’t it be nice if we could pick out falsities & lies in real life so easily?) The false statements exaggerated the severity of the crime.

Here’s where the experimental part comes in. The subjects were split into 2 groups. Group 1 just read the file, and was told to discount the false statements.

Group 2 was given the same instructions, but at the same time they were reading it, they were also subjected to a “digit search” task – basically, they were keeping their brain occupied doing this other search, looking for certain numbers while reading the report. As I said before, the real-world term for this is “Multi-Tasking.” They had to split their attention. Sound familiar?

So what happened? Well, the group that had to split their attention recommended sentencing that was TWICE as severe as the group that didn’t split their attention. In other words, they appeared to believe the “lies”, even though they were conveniently color-coded, because they didn’t have enough mental energy to DISBELIEVE them after reading.

There were a couple of additional experiments done in the paper – if you’re interested in reading about them, you can certainly look at the paper. Now, though, I’m going to move past the experiment section, and pull out some select quotes from the analysis & discussion section of this paper.

“Gilbert et al. found that speed reading a false statement increased the probability that subjects would later recall the statement as true, but that assessing the veracity of a false statement had no such effect. In other words, time pressure affected memory for the veracity of a false statement in much the same way that interruption did.”

If you take the time to assess the veracity of what you’re reading, you’re not going to remember falsehoods as truths later on. But if you’re speed reading, just rolling along without thinking about it too much, or skimming and not taking the time to actually ASSESS what you’re reading – then you’re in a world of trouble.

“Research on human lie detection has consistently uncovered a truthfulness bias, that is, a tendency for people to conclude that others are telling the truth when they are not.”

This makes me feel a lot better – because I KNOW there have been a lot of times in my life where I’ve “believed the best” of people. It’s good to know that I’m not alone in the “sucker” section of life – and, really, I think it speaks volumes about the GOODNESS of most people that we have a bias to assume people are being truthful with us. We’re not as jaded and cynical and hard-hearted as we’re often portrayed.

One final quote from this paper: “People, then, do have the potential for resisting false ideas, but this potential can only be realized when the person has (a) logical ability, (b) correct information, and (c) motivation and cognitive resources.”

I think the first 2 are pretty self-evident – It’s not really a shock to find out that people who have very poor logical reasoning skills, or who are lied to, might buy into some really awful ideas.

But that third point – the “motivation and cognitive resources” part – is a doozy. Because it’s telling us that without enough “mental energy”, or without some stake in the game, we’re much more likely to passively believe something that we’d otherwise be very quick to disbelieve.

As a side note, I’d like to point out that along with cults, hypnotists have been using these concepts for years, often in ways that are very helpful to people. We’ve all heard the stories of people who quit smoking through hypnosis, or who gained confidence, or who had some other positive effect (though we usually hear these stories through second- or third-hand accounts – for some reason, few people ever tell the story about how THEY decided they needed hypnosis…)

Hypnotists have long understood the power of the “confusion induction” – in other words, they’re overloading the cognitive faculties of the subject. That way, they’re FAR more likely to be “believed”, and when the subject believes they don’t crave cigarettes anymore, or when they believe that they enjoy exercise, then they’re far more likely to act accordingly.

Last words on this paper: It’s REALLY important that you be wary of putting yourself in a situation where you might “learn” falsehoods, especially if you’re in a low mental energy state, or you know you won’t have the time to properly evaluate the material being presented.

Because time pressure and low energy are both factors in believing things that we’d otherwise quickly label as false, we need to be especially aware of when those things are acting on us, and take steps to ensure they don’t hurt us in situations where “learning” something false could have negative consequences on our lives or in our jobs.
