
StephanieVanbryce

07/13/10 10:58 PM

#102131 RE: Alex G #102108

Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

that's a good article ..I think we have proved that theory right here on ihub.. that it's totally true. (teabaggers)

Bernstein, Sides, Henry Farrell and Matt Yglesias were all following it .. and added to it .. if interested, it's all combined here:

http://plainblogaboutpolitics.blogspot.com/2010/07/democracy-with-open-eyes.html




fuagf

07/14/10 9:06 PM

#102232 RE: Alex G #102108

Chicken or Egg: Consciousness strikes again
Apr 25, 2008

Many more links inside .. http://blogs.abc.net.au/allinthemind/2008/04/a-77-year-old-l.html

Three days ago there was an interesting interview on ABC Australian radio, re research
on how much of our actions and thoughts really are a result of a conscious 'me', now.

Along the lines, i think, that once mind-sets are fixed, then
much of what many think is conscious, 'aware' thinking now is not.

A 77 year old All in the Mind .. http://www.abc.net.au/rn/allinthemind/ ..
listener made contact, long troubled by a matter of chicken and egg:



Which comes first - he asks - thought or brain (electro) activity?

The conundrum of consciousness strikes again, i.e., do mental events correspond directly
with neural events? And, if one dictates the other, are the mind and brain not the same thing?


So...one rather intriguing brain phenomenon is often rolled out in this ever lively debate over consciousness and free will.

The question at the heart of the free will question is: do we really have any conscious control over our thoughts and actions, or is that just an illusion? Instead, does our brain wholly run the show, without consultation with a "me" within? Brain = mind.

Physiologist Benjamin Libet (1916-2007) made a striking discovery when he found that the brain registers a stimulus to act before we're actually consciously aware of that stimulus. There's an ever so small delay of a few hundred or so milliseconds between when our brain itself knows something, and when we know we know that something.

This electrical activity registered in the brain before an action is called Bereitschaftspotential or readiness potential.

Libet's provocative experiment suggested that the brain leads, and the mind follows. If we possessed free
will over and above our biology, you'd think it was the other way round. Mind = boss. Brain/body = servant.


Of course all this has led to much philosophical tossing and turning over whether the mind and the brain are indeed the same thing, or does dualism rule the roost? Neuroscientists, who spend their days up to their elbows in nerves and neurons, tend to think that's all a bit too kooky and metaphysical.

Libet published a book about his experiments, Mind Time: The Temporal Factor in Consciousness (Harvard University Press, 2004).

Here's an obit for Libet, who died last year at 91, outlining his legacy.

And the Wikipedia entry on Libet looks reasonably thorough.

Here's one take on Libet's work by UK philosopher Prof Ted Honderich (whose own thinking has in turn been taken on by others, notably Dan Dennett here)


Past All in the Mind shows I've done exploring consciousness include:

* The Mind Body Problem Down Under
* The Nature of Consciousness Debate - Part 1 and Part 2 (at the Australian Science Festival)
* David Chalmers on the Big Conundrum: Consciousness
* Susan Greenfield Contemplates Consciousness
* Zombies and Human Consciousness
* Is the Visual World a Grand Illusion?
* Radiant Cool: Detective Thriller takes on Consciousness
* Meditation and the Mind: Science Meets Buddhism
* An Intimate History of the Unconscious

Don't be losing any precious neurons over it. A few hundred more generations worth of debate is surely ahead of us. We seem to like it that way.

P.S. Thanks to PhD student Patrick Hopkinson for helping me out on the Libet reference - I was racking my brains to remember again. Colleague Dr Karl rang me about this last year, wanting to repeat Libet's experiments for a Sleek Geeks episode on ABC TV.

Patrick submitted me to an EEG a few years back for a show (Listening to the Mind Listening). EEG = electroencephalogram (attractive cap sprouting with wires to measure brain electrical activity).

http://blogs.abc.net.au/allinthemind/2008/04/a-77-year-old-l.html

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”


It seems clear that teabaggers are servant to the “natural defense mechanism to avoid that cognitive dissonance.”

fuagf

01/23/13 10:33 PM

#197585 RE: Alex G #102108

Diss Information: Is There a Way to Stop Popular Falsehoods from Morphing into "Facts"?

False information is pervasive and difficult to eradicate, but scientists are developing new
strategies such as "de-biasing," a method that focuses on facts, to help spread the truth

By Carrie Arnold - October 4, 2012


President Obama's Certificate Of Live Birth. Image: Flickr/Talk Radio News Service

A recurring red herring in the current presidential campaign is the verity of President Barack Obama's birth certificate. Although the president has made this document public, and records of his 1961 birth in Honolulu have been corroborated by newspaper announcements, a vocal segment of the population continues to insist that Obama's birth certificate proving U.S. citizenship is a fraud, making him legally ineligible to be president. A Politico survey .. http://www.politico.com/news/stories/0211/49554.html .. found that a majority of voters in the 2011 Republican primary shared this clearly false belief.

Scientific issues can be just as vulnerable to misinformation campaigns. Plenty of people still believe that vaccines .. http://www.scientificamerican.com/topic.cfm?id=vaccines .. cause autism and that human-caused climate change is a hoax. Science has thoroughly debunked these myths, but the misinformation persists in the face of overwhelming evidence. Straightforward efforts to combat the lies may backfire as well. A paper published on September 18 in Psychological Science in the Public Interest (PSPI) says that efforts to fight the problem frequently have the opposite effect.

"You have to be careful when you correct misinformation that you don't inadvertently strengthen it," says Stephan Lewandowsky, a psychologist at the University of Western Australia in Perth and one of the paper's authors. "If the issues go to the heart of people's deeply held world views, they become more entrenched in their opinions if you try to update their thinking."

Psychologists call this reaction belief perseverance: maintaining your original opinions in the face of overwhelming data that contradicts your beliefs. Everyone does it, but we are especially vulnerable when invalidated beliefs form a key part of how we narrate our lives. Researchers have found that stereotypes, religious faiths and even our self-concept are especially vulnerable to belief perseverance. A 2008 study .. http://www.sciencedirect.com/science/article/pii/S0022103107000686 .. in the Journal of Experimental Social Psychology found that people are more likely to continue believing incorrect information if it makes them look good (enhances self-image). For example, if an individual has become known in her community for purporting that vaccines cause autism, she might build her self-identity as someone who helps prevent autism by helping other parents avoid vaccination. Admitting that the original study linking autism to the MMR (measles–mumps–rubella) vaccine was ultimately deemed fraudulent would make her look bad (diminish her self-concept).

In this circumstance, it is easier to continue believing that autism and vaccines are linked, according to Dartmouth College political science researcher Brendan Nyhan. "It's threatening to admit that you're wrong," he says. "It's threatening to your self-concept and your worldview." It's why, Nyhan says, so many examples of misinformation are from issues that dramatically affect our lives and how we live.

Ironically, these issues are also the hardest to counteract. Part of the problem, researchers have found, is how people determine whether a particular statement is true. We are more likely to believe a statement if it confirms our preexisting beliefs, a phenomenon known as confirmation bias. Accepting a statement also requires less cognitive effort than rejecting it. Even simple traits such as language can affect acceptance: Studies have found that the way a statement is printed or voiced (or even the accent) can make those statements more believable. Misinformation is a human problem, not a liberal or conservative one, Nyhan says.

Misinformation is even more likely to travel and be amplified by the ongoing diversification of news sources and the rapid news cycle. Today, publishing news is as simple as clicking "send." This, combined with people's tendency to seek out information that confirms their beliefs, tends to magnify the effects of misinformation. Nyhan says that although a good dose of skepticism doesn't hurt while reading news stories, the onus to prevent misinformation should be on political pundits and journalists rather than readers. "If we all had to research every factual claim we were exposed to, we'd do nothing else," Nyhan says. "We have to address the supply side of misinformation, not just the demand side."

Correcting misinformation, however, isn't as simple as presenting people with true facts. When someone reads views from the other side, they will create counterarguments that support their initial viewpoint, bolstering their belief of the misinformation. Retracting information does not appear to be very effective either. Lewandowsky and colleagues published two papers in 2011 that showed a retraction, at best, halved the number of individuals who believed misinformation.

Combating misinformation has proved to be especially difficult in certain scientific areas such as climate science. Despite countless findings to the contrary, a large portion of the population doesn't believe that scientists agree on the existence of human-caused climate change, which affects their willingness to seek a solution to the problem, according to a 2011 study .. http://www.nature.com/nclimate/journal/v1/n9/full/nclimate1295.html .. in Nature Climate Change. (Scientific American is part of Nature Publishing Group.)

"Misinformation is inhibiting public engagement in climate change in a major way," says Edward Maibach, director of the Center for Climate Change Communication at George Mason University and author of the Nature article, as well as a commentary that accompanied the recent article in PSPI by Lewandowsky and colleagues. Although virtually all climate scientists agree that human actions are changing the climate and that immediate action must be taken, roughly 60 percent of Americans believe that no scientific consensus on climate change exists.

"This is not a random event," Maibach says. Rather, it is the result of a concerted effort by a small number of politicians and industry leaders to instill doubt in the public. They repeat the message that climate scientists don't agree that global warming .. http://www.scientificamerican.com/topic.cfm?id=global-warming-and-climate-change .. is real, is caused by people or is harmful. Thus, the message concludes, it would be premature for the government to take action and increase regulations.

To counter this effort, Maibach and others are using the same strategies employed by climate change deniers. They are gathering a group of trusted experts on climate and encouraging them to repeat simple, basic messages. It's difficult for many scientists, who feel that such simple explanations are dumbing down the science or portraying it inaccurately. And researchers have been trained to focus on the newest research, Maibach notes, which can make it difficult to get them to restate older information. Another way to combat misinformation is to create a compelling narrative that incorporates the correct information, and focuses on the facts rather than dispelling myths—a technique called "de-biasing."

Although campaigns to counteract misinformation can be difficult to execute, they can be remarkably effective if done correctly. A 2009 study found that an anti-prejudice campaign in Rwanda aired on the country's radio stations successfully altered people's perceptions of social norms and behaviors in the aftermath of the 1994 tribally based genocide of an estimated 800,000 minority Tutsi. Perhaps the most successful de-biasing campaign, Maibach notes, is the current near-universal agreement that tobacco smoking .. http://www.scientificamerican.com/topic.cfm?id=smoking .. is addictive and can cause cancer .. http://www.scientificamerican.com/topic.cfm?id=cancer . In the 1950s smoking was considered a largely safe lifestyle choice—so safe that it was allowed almost everywhere and physicians appeared in ads to promote it. The tobacco industry carried out a misinformation campaign for decades, reassuring smokers that it was okay to light up. Over time opinions began to shift as overwhelming evidence of ill effects was made public by more and more scientists and health administrators.

The most effective way to fight misinformation, ultimately, is to focus on people's
behaviors, Lewandowsky says. Changing behaviors will foster new attitudes and beliefs.

http://www.scientificamerican.com/article.cfm?id=how-to-stop-misinformation-from-becoming-popular-belief

fuagf

12/29/15 10:04 PM

#242196 RE: Alex G #102108

How to debunk false beliefs without having it backfire

"How facts backfire
Researchers discover a surprising threat to democracy: our brains
"

.. this is posted as a reply to Alex G's so it might be read directly beneath the other reply in which psychologist Stephan Lewandowsky gets a mention ..

Updated by Susannah Locke on December 22, 2014, 10:40 a.m. ET @susannahlocke


This doesn't look good. Shutterstock

There's nothing worse than arguing with someone who simply refuses to listen to reason. You can throw all the facts at them you want, and they'll simply dig in their heels deeper.

Over the past decade, psychologists have been studying why so many people do this. As it turns out, our brains have glitches that can make it difficult to remember that wrong facts are wrong. And trying to debunk misinformation can often backfire and entrench that misinformation even more deeply. The problem is even worse for emotionally charged political topics — like vaccines and global warming.

So how can you actually change someone's mind? I spoke to Stephan Lewandowsky, a psychologist at the University of Bristol and co-author of The Debunking Handbook .. http://www.skepticalscience.com/Debunking-Handbook-now-freely-available-download.html , to find out:

Susannah Locke: There’s evidence that when people stick with wrong facts, it isn't just stubbornness — but actually some sort of brain glitch. Why is it so difficult to change people’s minds?

Stephan Lewandowsky: It’s not an easy task to update people’s memories. That’s a very clear result that even happens with completely innocuous items. It's a fundamental problem for our cognitive apparatus to update what’s in our head.

What people have suggested — and what I think is going on — is that what people remember is the information, and then they attach a tag, "Oh no it’s not." And the problem is that often this tag can be forgotten. So you remember the misinformation, but not the fact that it’s false.

Now, one of the ways to get around that is to tell people not just that something is false, but tell them what’s true. Alternative information makes it much easier to update your memory.

---
"you remember the misinformation, but not the fact that it’s false"
---

That’s a classic study where people are told there’s a fire in a warehouse, and we found oil paints or flammable materials in the wiring cabinet. Then, later on, it will say, by the way, the wiring cabinet was empty. Now, if that's all you do, people will still think that there was oil paint in the wiring cabinet. Just simply saying something isn't true doesn't do the trick.

But instead, if you say the wiring cabinet was empty, and we found some petrol-soaked rags [elsewhere] at the scene, then people forget about the wiring cabinet because they have an alternative explanation for the fire. You need an alternative to let people let go of the initial information.

Locke: What’s the biggest thing people do wrong when trying to change other people’s minds?

Lewandowsky: The moment you get into situations that are emotionally charged, that are political, that are things that affect people’s fundamental beliefs — then you've got a serious problem. Because what might happen is that they’re going to dig in their heels and become more convinced of the information that is actually false. There are so-called backfire effects that can occur, and then the initial belief becomes more entrenched.

Locke: How can people prevent these backfire effects on political issues?

Lewandowsky: It’s very difficult. A lot of this stuff is about cultural identity and people's worldviews. And you've got to take that into account and gently nudge people out of their beliefs. But it’s a difficult process.

One [solution] is to give people an opportunity to self-affirm their beliefs ahead of time. Let's talk about weapons of mass destruction in Iraq. They didn’t exist, right? After Iraq was invaded, they didn't show up. And yet I think to this date about 30 percent of the public believes in the existence of weapons of mass destruction, and that’s sharply along partisan lines. If you get Republicans into the laboratory, and you say hey, there weren’t any weapons of mass destruction, that may strengthen their incorrect belief. We’ve done exactly that study .. http://pss.sagepub.com/content/16/3/190 .

---
"You get a liberal to talk to liberals and a conservative to talk to conservatives"
---

There’s some evidence that you can avoid that if you ask people to tell us [about] an occasion when you felt really good about your fundamental beliefs in free enterprise (or whatever is important to the person in question). Then they become more receptive to a corrective message. And the reason is that it’s less threatening in that context. Basically, I make myself feel good about the way I view the world, and then I can handle that because it’s not threatening my basic worldview.

The other is you can have a messenger who is consonant with your beliefs. You get a liberal to talk to liberals and a conservative to talk to conservatives.

Locke: Have psychologists completely thrown out the information-deficit model — the idea that you can change people's understanding by giving them the correct information?

Lewandowsky: It’s a nuanced issue. A couple of years ago, people basically said the information-deficit model is dead — it’s all basically about culture. Now I think that’s an oversimplification. It’s a combination of two factors. Culture is extremely important. But it’s also true that in some circumstances providing people with information is beneficial. That is, more information does enable people to sort out what's going on.

---
"superficially just throwing information at people probably will make them tune out"
---

Now, the trick appears to be that you’ve got to give people the opportunity to deal with information in great depth. If you have a situation like a classroom where people are forced to sit down and pay attention, that’s when more information is helpful. There's a lot of evidence of this in educational psychology.

Now the problem is in a sort of casual situation, people listening to the radio or having a superficial conversation — that's where the information deficit model doesn’t apply. And superficially just throwing information at people probably will make them tune out. So you’ve got to be careful when you’re talking about public discourse, TV, radio, media.

Locke: Let’s say I’m going home for the holidays and have an uncle who doesn’t believe in climate change. How can I change his mind?

Lewandowsky: It’s difficult. There’s a couple of things I can suggest. The first thing is to make people affirm their beliefs. Affirm that they’re not idiots, that they're not dumb, that they’re not crazy — that they don't feel attacked. And then try to present the information in a way that’s less conflicting with [their] worldview.

One of the problems I've been working with is people's attitudes toward climate change. For a lot of people, the moment they hear the words "climate change," they just shut down. But there are ways that you can get around that. For example, it’s been shown that if you show the health consequences of climate change or if you can have market-based solutions to the problem, that does not challenge their worldview too much.

---
"Affirm that they’re not idiots ... SO that they don't feel attacked"
.. the SO is as [SO], but couldn't get the color to work ..
---

If you tell people that there is an overwhelming scientific consensus that 97 out of 100 climate scientists agree on the basic notion of global warming, it seems that is a gateway belief that enables people to recognize the importance of the issue.

More often than not, that is effective with people who are ideologically disposed to reject global warming as a fact. In general, people are very sensitive to what they perceive to be the majority opinion around them.

Locke: If you throw too much information at people, are they more likely to reject your stance?

Lewandowsky: That’s quite nuanced, and it depends on how much time people are willing to invest in processing the information. If people sit down with the intention of listening and trying to undo the problem, then we have no evidence for an overkill backfire effect.

However, there's plenty of evidence that in a casual context — turning on the TV or whatever — you can dilute the message by putting too much information in it. This whole information-overload issue is more critical in a more casual context. And that's always important.

Most of the research on misinformation has mimicked casual situations. People just sit there and read something like a newspaper article, and that’s when you get backfire effects and people are very susceptible to misinformation.

Locke: What about the "familiarity effect," in which just mentioning the wrong information could make it stick even harder?

Lewandowsky: As recently as two or three years ago, I would have assumed that it exists. Now, it’s beginning to look like that’s not terribly robust. We’ve had a hard time trying to reproduce it. It sometimes occurs and sometimes doesn’t. I’m inclined to think it will turn out to be quite infrequent.

Locke: What’s your favorite experiment that shows the difficulty of debunking?

Lewandowsky: The one study .. http://pss.sagepub.com/content/16/3/190 .. I like a lot is the one I did about the Iraq war that was published in 2005. And what we did there was to look at people’s processing of information related to the Iraq war and the weapons of mass destruction. We ran the study in three different countries: in the US, in Germany and in Australia — at the same time.

And what we found is that Americans who knew something was false continued to believe in it, which makes no sense. We said, here’s this piece of information and asked them if they knew it was retracted. And a minute later, we asked them whether they believed the information. And they continued to believe it. The Germans and Australians did not.

Now, at first glance, that makes it sound as though there's something weird about Americans compared to the other two nationalities. But what’s really interesting is that’s not the case at all. What drove this effect was the skepticism [of the reasons why the war was being fought in the first place]. It turns out that when we asked people if they thought the war at the time was fought over weapons of mass destruction, that item could predict whether people would continue to believe things that are false.

When you control for skepticism, all those differences between Americans and Germans and Australians disappear. There was an underlying cognitive variable that explains it. It just so happened that there were far more skeptics in Germany and Australia at the time.

Locke: How has psychology’s understanding of debunking shifted since you first started studying it?

Lewandowsky: Over the past 10 years or so that I’ve been doing this, the role of cultural worldviews and people's identification with their own culture has been realized more and more. And equally, we know that skepticism is extremely important. People who are skeptical about the motives of someone telling us something — that’s very important and fairly new.

Another thing that’s emerged more and more over time is the existence of backfire effects: if you tell people one thing, they’ll believe the opposite. That finding seems to be pretty strong.

Locke: Have you seen people changing their messages in response to this new research?

Lewandowsky: The Debunking Handbook .. http://www.skepticalscience.com/Debunking-Handbook-now-freely-available-download.html — that’s been downloaded at least half a million times. So that message is getting out, I think. I’ve seen a lot of reference to that handbook, and I think some people in the media are now aware of how difficult it is to remove information from public discourse.

I’m vaguely optimistic that this research is having an impact. And certainly when it comes to government and large organizations, I think they’re beginning to be fairly savvy in what they say and how they do it, in part because of the research.

Locke: Is there anything else important that people should know?

Lewandowsky: One thing that I would point out is that it’s very important for people to be skeptical and anticipate that people will be misleading to the public. Some of the misinformation that’s out there is not accidental. I think there’s quite a bit that’s put into the public discourse in order to have a political effect. It’s supposed to be wrong, but effective.

What our research shows is that if people are aware of the possibility that they might be misled ahead of time, then they’re much better at recognizing corrections later on.

This interview has been lightly edited and condensed for length and clarity.

Further reading

How politics makes us stupid
http://www.vox.com/2014/4/6/5556462/brain-dead-how-politics-makes-us-stupid .. [also to be posted in full]

Card 1 of 11

How can you tell if scientific evidence is strong or weak?

The world abounds with evidence and studies, some of it good and some of it poor. How can you know what to trust?
This card stack aims to guide you through tricky issues that can cloud your understanding of scientific findings.

One major problem is that scientific lingo often means something different in common parlance. And these words can insidiously sneak
into media coverage. Simple words such as theory, significant, and control have totally different meanings in the realm of science.

http://www.vox.com/2014/12/22/7433899/debunk-how-to

Note: there are 10 more cards inside.

Interesting stuff, eh .. :)

fuagf

01/01/16 5:19 AM

#242261 RE: Alex G #102108

How politics makes us stupid

i truly hope there are many who find this most insightful article from Ezra as interesting as i did .. it
adds yet another dimension to the damaging and dangerous situation all countries experience ..


"How facts backfire"

.. sorry, i couldn't make the images smaller, as they are inside ..

by Ezra Klein on April 6, 2014

There’s a simple theory underlying much of American politics. It sits hopefully at the base of almost every speech, every op-ed, every article, and every panel discussion. It courses through the Constitution and is a constant in President Obama’s most stirring addresses. It’s what we might call the More Information Hypothesis: the belief that many of our most bitter political battles are mere misunderstandings. The cause of these misunderstandings? Too little information — be it about climate change, or taxes, or Iraq, or the budget deficit. If only the citizenry were more informed, the thinking goes, then there wouldn’t be all this fighting.

It’s a seductive model. It suggests our fellow countrymen aren’t wrong so much as they’re misguided, or ignorant, or — most appealingly — misled by scoundrels from the other party. It holds that our debates are tractable and that the answers to our toughest problems aren’t very controversial at all. The theory is particularly prevalent in Washington, where partisans devote enormous amounts of energy to persuading each other that there’s really a right answer to the difficult questions in American politics — and that they have it.

But the More Information Hypothesis isn’t just wrong. It’s backwards. Cutting-edge research shows that the more information partisans get, the deeper their disagreements become.


In April and May of 2013, Yale Law professor Dan Kahan — working with coauthors Ellen Peters, Erica Cantrell Dawson, and Paul Slovic — set out to test a question that continuously puzzles scientists: why isn’t good evidence more effective in resolving political debates? For instance, why doesn’t the mounting proof that climate change is a real threat persuade more skeptics?

The leading theory, Kahan and his coauthors wrote, is the Science Comprehension Thesis, which says the problem is that the public doesn’t know enough about science to judge the debate. It’s a version of the More Information Hypothesis: a smarter, better educated citizenry wouldn’t have all these problems reading the science and accepting its clear conclusion on climate change.

But Kahan and his team had an alternative hypothesis. Perhaps people aren’t held back by a lack of knowledge. After all, they don’t typically doubt the findings of oceanographers or the existence of other galaxies. Perhaps there are some kinds of debates where people don’t want to find the right answer so much as they want to win the argument. Perhaps humans reason for purposes other than finding the truth — purposes like increasing their standing in their community, or ensuring they don’t piss off the leaders of their tribe. If this hypothesis proved true, then a smarter, better-educated citizenry wouldn’t put an end to these disagreements. It would just mean the participants are better equipped to argue for their own side.

Kahan and his team came up with a clever way to test which theory was right. They took 1,000 Americans, surveyed their political views, and then gave them a standard test used for assessing math skills. Then they presented them with a brainteaser. In its first form, it looked like this:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Medical researchers have developed a new cream for treating skin rashes. New treatments often work but sometimes make rashes worse. Even when treatments don't work, skin rashes sometimes get better and sometimes get worse on their own. As a result, it is necessary to test any new treatment in an experiment to see whether it makes the skin condition of those who use it better or worse than if they had not used it.

Researchers have conducted an experiment on patients with skin rashes. In the experiment, one group of patients used the new cream for two weeks, and a second group did not use the new cream.

In each group, the number of people whose skin condition got better and the number whose condition got worse are recorded in the table below. Because patients do not always complete studies, the total number of patients in each of the two groups is not exactly the same, but this does not prevent assessment of the results.

Please indicate whether the experiment shows that using the new cream is likely to make the skin condition better or worse.


What result does the study support?

* People who used the skin cream were more likely to get better than those who didn't.
* People who used the skin cream were more likely to get worse than those who didn't.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

It’s a tricky problem meant to exploit a common mental shortcut. A glance at the numbers leaves most people with the impression that the skin cream improved the rash. After all, more than twice as many people who used the skin cream saw their rash improve. But if you actually calculate the ratios the truth is just the opposite: about 25 percent of the people who used the skin cream saw their rashes worsen, compared to only about 16 percent of the people who didn’t use the skin cream.
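To make that arithmetic concrete, here is a minimal sketch in Python. The counts are illustrative assumptions, chosen only to be consistent with the roughly 25 percent versus 16 percent figures quoted above; the actual 2x2 table in Kahan's study appeared as an image and is not reproduced in this post.

# Illustrative counts only, picked to match the ~25% vs ~16% figures cited above;
# the study's actual table was an image and is not reproduced in this post.
results = {
    "used the new cream":    {"better": 223, "worse": 75},
    "did not use the cream": {"better": 107, "worse": 21},
}

for group, counts in results.items():
    total = counts["better"] + counts["worse"]
    worse_rate = counts["worse"] / total
    print(f"{group}: {counts['worse']} of {total} got worse ({worse_rate:.0%})")

# The tempting shortcut compares raw counts ("more cream users improved"),
# but the correct reading compares the rates of worsening:
#   used the new cream:    75 / 298  = about 25% got worse
#   did not use the cream: 21 / 128  = about 16% got worse
# so the data actually support the second option: the cream made things worse.

The politicized version described next uses the same numbers; only the labels change, from skin cream and rashes to concealed-carry bans and crime.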

This kind of problem is used in social science experiments to test people’s abilities to slow down and consider the evidence arrayed before them. It forces subjects to suppress their impulse to go with what looks right and instead do the difficult mental work of figuring out what is right. In Kahan’s sample, most people failed. This was true for both liberals and conservatives. The exceptions, predictably, were the people who had shown themselves unusually good at math: they tended to get the problem right. These results support the Science Comprehension Thesis: the better subjects were at math, the more likely they were to stop, work through the evidence, and find the right answer.

But Kahan and his coauthors also drafted a politicized version of the problem. This version used the same numbers as the skin-cream question, but instead of being about skin creams, the narrative set-up focused on a proposal to ban people from carrying concealed handguns in public. The 2x2 box now compared crime data in the cities that banned handguns against crime data in the cities that didn’t. In some cases, the numbers, properly calculated, showed that the ban had worked to cut crime. In others, the numbers showed it had failed.

Presented with this problem a funny thing happened: how good subjects were at math stopped predicting how well they did on the test. Now it was ideology that drove the answers. Liberals were extremely good at solving the problem when doing so proved that gun-control legislation reduced crime. But when presented with the version of the problem that suggested gun control had failed, their math skills stopped mattering. They tended to get the problem wrong no matter how good they were at math. Conservatives exhibited the same pattern — just in reverse.

Being better at math didn’t just fail to help partisans converge on the right answer. It actually drove them further apart. Partisans with weak math skills were 25 percentage points likelier to get the answer right when it fit their ideology. Partisans with strong math skills were 45 percentage points likelier to get the answer right when it fit their ideology. The smarter the person is, the dumber politics can make them.

[ INSERT: .. i wonder if a person's ability at math is necessarily a measure of their overall smarts .. i don't
think so .. just a thought, albeit only a relatively insignificant qualification there ]

Consider how utterly insane that is: being better at math made partisans less likely to solve the problem correctly when solving the problem correctly meant betraying their political instincts. People weren’t reasoning to get the right answer; they were reasoning to get the answer that they wanted to be right.

The skin-rash experiment wasn’t the first time Kahan had shown that partisanship has a way of short-circuiting intelligence. In another study, he tested people’s scientific literacy alongside their ideology and then asked about the risks posed by climate change. If the problem was truly that people needed to know more about science to fully appreciate the dangers of a warming climate, then their concern should’ve risen alongside their knowledge. But here, too, the opposite was true: among people who were already skeptical of climate change, scientific literacy made them more skeptical of climate change.

"Individuals subconsciously resist factual information that threatens their defining values."

This will make sense to anyone who’s ever read the work of a serious climate change denialist. It’s filled with facts and figures, graphs and charts, studies and citations. Much of the data is wrong or irrelevant. But it feels convincing. It’s a terrific performance of scientific inquiry. And climate-change skeptics who immerse themselves in it end up far more confident that global warming is a hoax than people who haven’t spent much time studying the issue. More information, in this context, doesn’t help skeptics discover the best evidence. Instead, it sends them searching for evidence that seems to prove them right. And in the age of the internet, such evidence is never very far away.

In another experiment Kahan and his coauthors gave out sample biographies of highly accomplished scientists alongside a summary of the results of their research. Then they asked whether the scientist was indeed an expert on the issue. It turned out that people’s actual definition of "expert" is "a credentialed person who agrees with me." For instance, when the researcher’s results underscored the dangers of climate change, people who tended to worry about climate change were 72 percentage points more likely to agree that the researcher was a bona fide expert. When the same researcher with the same credentials was attached to results that cast doubt on the dangers of global warming, people who tended to dismiss climate change were 54 percentage points more likely to see the researcher as an expert.

Kahan is quick to note that, most of the time, people are perfectly capable of being convinced by the best evidence. There’s a lot of disagreement about climate change and gun control, for instance, but almost none over whether antibiotics work, or whether the H1N1 flu is a problem, or whether heavy drinking impairs people’s ability to drive. Rather, our reasoning becomes rationalizing when we’re dealing with questions where the answers could threaten our tribe — or at least our social standing in our tribe. And in those cases, Kahan says, we’re being perfectly sensible when we fool ourselves.

[ there is a separation symbol (square) here ]


Sean Hannity. Uri Schanker/WireImage

Imagine what would happen to, say, Sean Hannity if he decided tomorrow that climate change was the central threat facing the planet. Initially, his viewers would think he was joking. But soon, they’d begin calling in furiously. Some would organize boycotts of his program. Dozens, perhaps hundreds, of professional climate skeptics would begin angrily refuting Hannity’s new crusade. Many of Hannity’s friends in the conservative media world would back away from him, and some would seek advantage by denouncing him. Some of the politicians he respects would be furious at his betrayal of the cause. He would lose friendships, viewers, and money. He could ultimately lose his job. And along the way he would cause himself immense personal pain as he systematically alienated his closest political and professional allies. The world would have to update its understanding of who Sean Hannity is and what he believes, and so too would Sean Hannity. And changing your identity is a psychologically brutal process.

Kahan doesn’t find it strange that we react to threatening information by mobilizing our intellectual artillery to destroy it. He thinks it’s strange that we would expect rational people to do anything else. "Nothing any ordinary member of the public personally believes about the existence, causes, or likely consequences of global warming will affect the risk that climate changes poses to her, or to anyone or anything she cares about," Kahan writes. "However, if she forms the wrong position on climate change relative to the one that people with whom she has a close affinity — and on whose high regard and support she depends on in myriad ways in her daily life — she could suffer extremely unpleasant consequences, from shunning to the loss of employment."

"Kahan’s research tells us we can’t trust our own reason. How do we reason our way out of that?"

Kahan calls this theory Identity-Protective Cognition: "As a way of avoiding dissonance and estrangement from valued groups, individuals subconsciously resist factual information that threatens their defining values." Elsewhere, he puts it even more pithily: "What we believe about the facts," he writes, "tells us who we are." And the most important psychological imperative most of us have in a given day is protecting our idea of who we are, and our relationships with the people we trust and love.

Anyone who has ever found themselves in an angry argument with their political or social circle will know how threatening it feels. For a lot of people, being "right" just isn’t worth picking a bitter fight with the people they care about. That’s particularly true in a place like Washington, where social circles and professional lives are often organized around people’s politics, and the boundaries of what those tribes believe are getting sharper.

[ separation symbol again ]

In the mid-20th century, the two major political parties were ideologically diverse. Democrats in the South were often more conservative than Republicans in the North. The strange jumble in political coalitions made disagreement easier. The other party wasn’t so threatening because it included lots of people you agreed with. Today, however, the parties have sorted by ideology .. http://www.vox.com/cards/congressional-dysfunction/what-is-political-polarization , and now neither the House nor the Senate has any Democrats who are more conservative than any Republicans, or vice versa. This sorting has made the tribal pull of the two parties much more powerful because the other party now exists as a clear enemy.

"The ice caps don’t care if it’s rational for us to worry about our friendships."

One consequence of this is that Washington has become a machine for making identity-protective cognition easier. Each party has its allied think tanks, its go-to experts, its favored magazines, its friendly blogs, its sympathetic pundits, its determined activists, its ideological moneymen. Both the professionals and the committed volunteers who make up the party machinery are members of social circles, Twitter worlds, Facebook groups, workplaces, and many other ecosystems that would make life very unpleasant for them if they strayed too far from the faith. And so these institutions end up employing a lot of very smart, very sincere people whose formidable intelligence makes certain that they typically stay in line. To do anything else would upend their day-to-day lives.

The problem, of course, is that these people are also affecting, and in some cases controlling, the levers of government. And this, Kahan says, is where identity-protective cognition gets dangerous. What’s sensible for individuals can be deadly for groups. "Although it is effectively costless for any individual to form a perception of climate-change risk that is wrong but culturally congenial, it is very harmful to collective welfare for individuals in aggregate to form beliefs this way," Kahan writes. The ice caps don’t care if it’s rational for us to worry about our friendships. If the world keeps warming, they’re going to melt regardless of how good our individual reasons for doing nothing are.

[ separation symbol ]

To spend much time with Kahan’s research is to stare into a kind of intellectual abyss. If the work of gathering evidence and reasoning through thorny, polarizing political questions is actually the process by which we trick ourselves into finding the answers we want, then what’s the right way to search for answers? How can we know the answers we come up with, no matter how well-intentioned, aren’t just more motivated cognition? How can we know the experts we’re relying on haven’t subtly biased their answers, too? How can I know that this article isn’t a form of identity protection? Kahan’s research tells us we can’t trust our own reason. How do we reason our way out of that?

The place to start, I figured, was talking to Dan Kahan. I expected a conversation with an intellectual nihilist. But Kahan doesn’t sound like a creature of the abyss. He sounds like, well, what he is: a Harvard-educated lawyer who clerked for Thurgood Marshall on the Supreme Court and now teaches at Yale Law School. He sounds like a guy who has lived his adult life excelling in institutions dedicated to the idea that men and women of learning can solve society’s hardest problems and raise its next generation of leaders. And when we spoke, he seemed uncomfortable with his findings. Unlike many academics who want to emphasize the import of their work, he seemed to want to play it down.

"We fixate on the cases where things aren’t working," he says. "The consequences can be dramatic, so it makes sense we pay attention to them. But they’re the exception. Many more things just work. They work so well that they’re almost not noticeable. What I’m trying to understand is really a pathology. I want to identify the dynamics that lead to these nonproductive debates." In fact, Kahan wants to go further than that. "The point of doing studies like this is to show how to fix the problem."


A nurse loads a syringe with a vaccine against hepatitis. Robyn Beck/AFP/Getty Images

Consider the human papillomavirus vaccine, he says. That’s become a major cultural battle in recent years with many parents insisting that the government has no right to mandate a vaccine that makes it easier for teenagers to have sex. Kahan compares the HPV debacle to the relatively smooth rollout of the hepatitis B vaccine.

"The point of doing studies like this is to show how to fix the problem."

"What about the hepatitis B vaccine?" he asks. "That’s also a sexually transmitted disease. It also causes cancer. It was proposed by the Centers for Disease Control as a mandatory vaccine. And during the years in which we were fighting over HPV the hepatitis B vaccine uptake was over 90 percent. So why did HPV become what it became?"

Kahan’s answer is that the science community has a crappy communications team. Actually, scratch that: Kahan doesn’t think they have any communications team at all. "We don’t have an organized science-intelligence communication brain in our society," he says. "We only have a brainstem. We don’t have people watching for controversies over things like vaccines and responding to them."

In Kahan’s telling, the HPV vaccine was a symphony of missteps. Its manufacturer, Merck, wanted it fast-tracked onto shelves. They sponsored legislative campaigns in statehouses to make it mandatory. In order to get it to customers quicker they began by approving it for girls before it was approved for boys. All this, he believes, allowed the process to be politicized, while a calmer, slower, more technocratic approach could have kept the peace. "I think it would’ve been appropriate for the FDA, in considering whether to fast-track the vaccine, to consider these science communication risks."

[ separation symbol ]

Kahan’s studies, depressing as they are, are also the source of his optimism: he thinks that if researchers can just develop a more evidence-based model of how people treat questions of science as questions of identity then scientists could craft a communications strategy that would avoid those pitfalls. "My hypothesis is we can use reason to identify the sources of the threats to our reason and then we can use our reason to devise methods to manage and control those processes," he says. That’s a lot of reasoning. As a concrete example, he offers the government’s approach to regulatory decisions. "There is a process in the Office of Management and Budget where every decision has to pass a cost-benefit test," he says. "Why isn’t there a process in the FDA evaluating every decision for science-communication impact?"

But when I ask him whether advocates for the HPV vaccine would really stay quiet if the FDA refused to fast-track a lifesaving treatment on grounds that a slower roll-out would be better PR, he doesn’t have much of an answer. Indeed, pressing him on these questions makes me wonder whether Kahan isn’t engaged in a bit of identity-protective cognition of his own. Having helped uncover a powerful mental process that undermines the institutions he’s most devoted to, he’s rationalized the problem away as a mere artifact of a poor communications strategy.

Kahan’s answers also take as a premise that scientists play a very powerful role in driving the public discourse. That is, to say the least, debatable. "If you taught everyone who cares about science to communicate properly you still couldn’t control Fox News," says Chris Mooney, a Mother Jones writer who focuses on the intersection between science and politics. "And that matters more than how individual scientists communicate."

But Kahan would never deny that identity-protective cognition afflicts him too. In fact, recognizing that is core to his strategy of avoiding it. "I’m positive that at any given moment some fraction of the things I believe, I believe for identity-protective purposes," he says. "That gives you a kind of humility."

Recognizing the problem is not the same as fixing it, though. I asked Kahan how he tries to guard against identity protection in his everyday life. The answer, he said, is to try to find disagreement that doesn’t threaten you and your social group — and one way to do that is to consciously seek it out in your group. "I try to find people who I actually think are like me — people I’d like to hang out with — but they don’t believe the things that everyone else like me believes," he says. "If I find some people I identify with, I don’t find them as threatening when they disagree with me." It’s good advice, but it requires, as a prerequisite, a desire to expose yourself to uncomfortable evidence — and a confidence that the knowledge won’t hurt you.

[ separation symbol ]


(Dan Kahan. Wikimedia Commons)

At one point in our interview Kahan does stare over the abyss, if only for a moment. He recalls a dissent written .. http://www.theatlantic.com/national/archive/2013/08/the-irony-of-justice-scalias-california-prison-rant/278349/ .. by Supreme Court Justice Antonin Scalia in a case about overcrowding in California prisons. Scalia dismissed the evidentiary findings of a lower court as motivated by policy preferences. "I find it really demoralizing, but I think some people just view empirical evidence as a kind of device," Kahan says.

But Scalia’s comments were perfectly predictable given everything Kahan had found. Scalia is a highly ideological, tremendously intelligent individual with a very strong attachment to conservative politics. He’s the kind of identity-protector who has publicly said he stopped subscribing to the Washington Post because he "just couldn’t handle it anymore," and so he now cocoons himself in the more congenial pages of the Washington Times and the Wall Street Journal. Isn’t it the case, I asked Kahan, that everything he’s found would predict that Scalia would convince himself of whatever he needed to think to get to the answers he wanted?

The question seemed to rattle Kahan a bit. "The conditions that make a person subject to that way of looking at the evidence," he said slowly, "are things that should be viewed as really terrifying, threatening influences in American life. That’s what threatens the possibility of having democratic politics enlightened by evidence."

[ not only in American life, of course ]

The threat is real. Washington is a bitter war between two well-funded, sharply-defined tribes that have their own machines for generating evidence and their own enforcers of orthodoxy. It’s a perfect storm for making smart people very stupid.

The silver lining is that politics doesn’t just take place in Washington. The point of politics is policy. And most people don’t experience policy as a political argument. They experience it as a tax bill, or a health insurance card, or a deployment. And, ultimately, there’s no spin effective enough to persuade Americans to ignore a cratering economy, or skyrocketing health-care costs, or a failing war. A political movement that fools itself into crafting national policy based on bad evidence is a political movement that will, sooner or later, face a reckoning at the polls.

At least, that’s the hope. But that’s not true on issues, like climate change, where action is needed quickly to prevent a disaster that will happen slowly. There, the reckoning will be for future generations to face. And it’s not true when American politics becomes so warped by gerrymandering, big money, and congressional dysfunction that voters can’t figure out who to blame for the state of the country. If American politics is going to improve, it will be better structures, not better arguments, that win the day.

Editor: Eleanor Barkhorn
Designer: Georgia Cowley
Illustrator: Warren Schultheis

http://www.vox.com/2014/4/6/5556462/brain-dead-how-politics-makes-us-stupid