How To Outsmart The 'Boomerang Effect' And Get People To Believe The Truth Instead Of A Lie

I used to play the game "telephone" with my friends when I was younger. We'd sit in a circle and one girl would whisper something in the ear of the girl next to her, who would then pass it on to the next girl, and so on and so on. By the time it got to the last girl in the circle, it barely resembled the original whispered message. Yet when the original message was revealed, some of the girls did a head-smack and some dug their heels in and insisted the message they passed on was accurate. Sometimes I was even one of those girls. It's okay; you can admit it, once in a while you were, too. Even you guys out there have fallen into this trap.

Those of us who keep believing the wrong message is the right one, even after the right message has been clarified, are experiencing what's known as "The Boomerang Effect."

The Boomerang Effect is a social psychology term for what happens when you try to change someone's mind by using a fact to dispute a current belief: instead of weakening, that current belief actually becomes stronger.

I know! Absurd, isn't it?

Why can't people just say, "Oh! I see that now. Thank you so much for teaching me something new!"?

Well, when was the last time you said that to a political opponent? I'm betting it's been a while, even if they've sometimes actually given you food for thought.

The Power of Political Misinformation ~ by Shankar Vedantam for The Washington Post

As the (2008) presidential campaign heats up, intense efforts are underway to debunk rumors and misinformation. Nearly all these efforts rest on the assumption that good information is the antidote to misinformation.

But a series of new experiments show that misinformation can exercise a ghostly influence on people's minds after it has been debunked -- even among people who recognize it as misinformation. In some cases, correcting misinformation serves to increase the power of bad information.

In experiments conducted by political scientist John Bullock at Yale University, volunteers were given various items of political misinformation from real life. One group of volunteers was shown a transcript of an ad created by NARAL Pro-Choice America that accused John G. Roberts Jr., President Bush's nominee to the Supreme Court at the time, of "supporting violent fringe groups and a convicted clinic bomber."

A variety of psychological experiments have shown that political misinformation primarily works by feeding into people's preexisting views. People who did not like Roberts to begin with, then, ought to have been most receptive to the damaging allegation, and this is exactly what Bullock found. Democrats were far more likely than Republicans to disapprove of Roberts after hearing the allegation.

Bullock then showed volunteers a refutation of the ad by abortion-rights supporters. He also told the volunteers that the advocacy group had withdrawn the ad. Although 56 percent of Democrats had originally disapproved of Roberts before hearing the misinformation, 80 percent of Democrats disapproved of the Supreme Court nominee afterward. Upon hearing the refutation, Democratic disapproval of Roberts dropped only to 72 percent.

...

OUCH! That's not good. That's not good at all. The misinformation swung Democratic disapproval of Roberts from 56 percent to 80 percent, a 24-point jump; after the refutation, disapproval fell only to 72 percent. In other words, two-thirds of those whose opinions had been swayed by misinformation (16 of those 24 points) held onto their belief in that misinformation even after hearing the corrected information! And that proportion appears to hold up across several study groups. But there's good news in that, too. A full third of those who were swayed will actually abandon the false information in favor of the truthful information once they've been given the correction.

So How Do We Convince People To Believe The Truth Instead Of The Lie?

If we want to refute a specious claim, or, even more importantly, convince “non-believers” that the Democratic or Progressive approach to governance is better for our society than the Republican or Conservative approach, and we want to avoid the deadly Backfire Effect (another name for the Boomerang Effect), how do we do it?

Robert Todd Carroll, author of “The Skeptic’s Dictionary,” gives us a starting point for formulating a solution by explaining several of the possible reasons this effect occurs in the first place:

“Another explanation involves communal reinforcement and the assumption that there is more information you don’t have that supports your belief.  If one knows that there is a community of believers who share your beliefs and one believes that there is probably information [they] don’t have but which would outweigh the contrary information provided, rationalization becomes easier. It is possible that the rationalization process leads one to give more weight to reinforcement by the community of believers.

“How much play one’s belief gets in the media, versus the play of contrary information may also contribute to the backfire effect. If messages supporting your belief are presented far more frequently in the media than messages contrary to your belief, or presented repeatedly by people you admire, the tendency might be to give those supportive messages even more weight than before.”

BINGO!

This theory ties in perfectly with both George Lakoff’s and Jonathan Haidt’s teachings that one of the strongest pulls in forming political ideology is the sacredness of the group to which we belong. As Haidt puts it:

Despite what you might have learned in Economics 101, people aren’t always selfish. In politics, they’re more often groupish. When people feel that a group they value is under attack — be it racial, religious, regional or ideological — they rally to its defense, even at some cost to themselves. We evolved to be tribal, and politics is a competition among coalitions of tribes.

The key to understanding tribal behavior is uncovering what the group holds as sacred. The great trick that humans developed at some point in the last few hundred thousand years is the ability to circle around a tree, rock, ancestor, flag, book or god, and then treat that thing as sacred. People who worship the same idol can trust one another, work as a team and prevail over less cohesive groups. So if you want to understand politics, and especially our divisive culture wars, you must follow the sacredness.

And this is why at The Winning Words Project we like to put an emphasis on framing Democratic and Progressive policies within a moral context. For instance, Republicans have long argued that what we earn is “ours” and that the portion the government collects in taxes is being immorally used to support others who didn’t earn it (those others generally being people they see as not being part of their group). So when we at The Winning Words Project suggest calling what have historically been known as “Entitlement Programs” by the term “Earned Benefits Programs,” we wrap those programs in the same moral frame conservatives attach to taxes in general: Medicare and Social Security are monies withheld from our paychecks by the government, invested on our behalf, and returned to us during retirement as either income (Social Security) or healthcare coverage (Medicare).

It is in this way that we can say to our Republican or Conservative friends or family, “We agree with you at least to some extent. We, too, want to keep as much of what we earn for ourselves as possible, and we want you to be able to, too. That’s why we advocate for not gutting the rate at which the government has to return your Earned Benefits! We all earn that money through our work and we want it back when it was promised to us. It’s immoral for the government to promise to return it to us at a certain rate and at a certain age, then renege on the deal. We don’t like being cheated out of our income any more than you do!”

This approach begins to let that person see themselves in the same group as us: people who want our money back from the government, and people who don’t want the government cheating us out of what’s ours. And in this way they are more likely to agree that it is morally wrong for Republican representatives to try to take it away from us, because now us = them.

Of course we know that we aren’t likely to win over every Republican in this country, particularly because we recognize the reality of the Boomerang Effect. But there are plenty of people — especially those in the middle — who can be convinced … if we tell our story using the right moral frames.

 


 
