* Why It’s Hard to Change People’s Minds *

2008-10-09

Richard Moore

http://www.alternet.org/story/101973/

Why It’s Hard to Change People’s Minds

By Sean Gonsalves, AlterNet
Posted on October 7, 2008, Printed on October 8, 2008
http://www.alternet.org/story/101973/

A long time ago, Mark Twain told us: “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

Entwined in Twain’s train of thought is an implicit, and important, distinction: the difference between being uninformed and being misinformed.

Today, there’s scholarship to back up Twain’s theory that being ignorant isn’t as troublesome as being certain about something that “just ain’t so.”

Ignorance can be educated. But what’s the antidote to misinformation? Correct information?

Not exactly — according to political scientists Brendan Nyhan and Jason Reifler, co-authors of one of the few academic studies on the subject, “When Corrections Fail: The Persistence of Political Misperceptions.”

While it may seem like common sense to think misinformation can be countered by giving people the real 411, Nyhan and Reifler’s research indicates that correct information often fails to reduce misperceptions among the ideologically committed, particularly doctrinaire conservatives.

That’s something many readers of this column understand intuitively, having seen false claims like Obama-is-a-Muslim refuted over and over again, only to watch them, unbelievably, somehow persist.

There’s lots of research on citizen ignorance, but only a handful of studies focus on misinformation and the effect it has on political opinions. Nyhan and Reifler’s work adds to what Yale University political scientist John Bullock has found: it’s possible to correct and change misinformed political opinions, but the truth (small ‘t’) ain’t enough.

In Bullock’s experimental study, participants were shown the transcript of an ad created by a pro-choice group opposing the Supreme Court nomination of John Roberts. The ad falsely accused Roberts of “supporting violent fringe groups and a convicted clinic bomber.”

What Bullock found was that 56 percent of the Democratic participants disapproved of Roberts before hearing the misinformation. After they saw the attack ad, disapproval jumped to 80 percent.

When they were then shown an ad that refuted the misinformation and were also told the pro-choice group had withdrawn the original ad, the disapproval rating didn’t drop back down to 56 percent; it fell only to 72 percent.

Nyhan and Reifler conducted a series of studies in which subjects were presented with mock news articles on “hot button” issues that included demonstrably false assertions: that Iraq possessed WMD immediately before the U.S. invasion; that tax cuts lead to economic growth; and that Bush banned stem cell research, as Sens. Kerry and Kennedy claimed during the 2004 presidential campaign.

For the Iraq-possessed-WMD-immediately-before-the-invasion assertion, participants were shown mock news articles supporting the unfounded Bush administration claim and then given the refutation by way of the Duelfer Report, which authoritatively documents the lack of WMD, or even an active production program, in Iraq just before the invasion.

But instead of changing the minds of ideologically committed war backers, Nyhan and Reifler found a “backfire effect”: supporters of the Iraq invasion held on to the misinformation, modifying their view only slightly by saying “Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.” Sigh.

Nyhan and Reifler attribute that kind of “thinking” to the effects of “motivated reasoning,” which can distort how people process information.

“As a result (of motivated reasoning), the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups.”

Now you know why those back-and-forth online debates so often prove fruitless. Unfortunately, none of the researchers, Bullock, Nyhan or Reifler, suggests a way to successfully counter misinformation clung to by those who hold their political opinions with an air of certitude.

Washington Post columnist Shankar Vedantam suggests wrapping refutations in language that enhances the self-esteem of the misinformed.

Whatever you do, just don’t forget Twain’s timeless advice: “tell the truth or trump — but get the trick.”

Sean Gonsalves is a syndicated columnist and news editor with the Cape Cod Times.

© 2008 Independent Media Institute. All rights reserved.
View this story online at: http://www.alternet.org/story/101973/