orgtheory.net

when evidence isn’t convincing

A couple of weeks ago I linked to a blog post by Brad DeLong about the future of economics education. While most of the comments to my post were about the content of economics education, what really struck me in DeLong’s original post was how academic majors reinforce students’ pre-existing political biases, rather than informing or changing them, as we like to believe a good liberal arts education will do. Right-leaning students leave economics feeling justified in their belief that the market will solve every social problem (or, if it doesn’t, that it’s not a social problem we ought to do anything about). Left-leaning students leave sociology feeling justified in their belief that the state ought to do more to resolve social problems. This is a problem of confirmation bias. Our brains are not very good at evaluating evidence that doesn’t conform to our pre-existing beliefs.

New research by legal scholar Dan Kahan shows that political ideology strongly shapes our willingness to believe scientific evidence. It turns out that it’s not just a problem among Republicans. Here’s a summary by Chris Mooney in Mother Jones:

In Kahan’s research (PDF), individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative.

The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

Head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever. In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) (PDF) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be.

I suppose another implication of this is that as society becomes more politically polarized, scientific evidence will become less and less influential in persuading anyone to change their political positions.

Written by brayden king

April 18, 2011 at 7:01 pm

11 Responses


  1. I think it’s an old result that more information tends to make folks adopt more extreme positions. I’m pretty sure Bryan Caplan wrote about it in the run-up to “The Myth of the Rational Voter”.


    teageegeepea

    April 19, 2011 at 2:00 am

  2. In the economists’ frame, choices, opportunities, and competition are everywhere, so no one owes you anything.

    In the sociologists’ frame, individual endowments and social rules are arbitrary and easily manipulated by those in power, so substantial effort in social movements is necessary.

    Is that a good summary of the mindset of these two groups of people?


    passerby

    April 19, 2011 at 2:57 am

  3. Passerby – no, I don’t think those are good summaries. Economists recognize monopolies exist and that markets misprice stuff because of lack of competition. Sociologists don’t think rules are arbitrary. There are often good reasons they’re in place. People are certainly not easily manipulated.

    I think there’s a great deal of heterogeneity in both camps when you look at the evidence. The point is that most people/students start with a particular mindset and they don’t allow evidence to change those mindsets. Perhaps they start with the idea that they’ve figured everything out (e.g., I took ECON 101, what more is there to learn about markets?), or they had a race and ethnicity course in their sociology major that seemed to make sense of the world and so they believe they now have the answers to the world’s race problems.

    The problem is that none of us have figured out how things work yet. There are lots and lots of holes to be filled and that’s why we do research. We’re still trying to develop better explanations, find more precise mechanisms, etc., but you can’t do that without good empirical evidence to guide you. Evidence should be guiding your theory/beliefs, not the other way around.

    I think you missed the point of my post.


    brayden king

    April 19, 2011 at 4:03 am

  4. The post is certainly interesting, Brayden. I cannot tell from the descriptions of the experiments how much time elapsed between the steps, but I am curious if there is a priming effect that may contribute to the outcomes.


    Randy

    April 19, 2011 at 4:33 am

  5. I don’t want to rain on anyone’s parade. But A LOT of economists are liberal Democrats.


    David Hoopes

    April 19, 2011 at 6:56 pm

  6. David – very true. I know many of them.

    The study (and the ones linked by Omar above) explains why students and the public don’t believe or retain evidence that departs from their viewpoint; it’s not trying to predict the ideological viewpoints of the scientists themselves.


    brayden king

    April 19, 2011 at 7:09 pm

  7. Thank you for the link to the Kahan study. It runs 72 pages, a bit of reading. The experiment seems valid: “Book excerpts attributed to fictional authors. One of two opposing excerpts were randomly assigned to fictional authors (Figure 1) whose expertise was evaluated by subjects.” The same expert was attributed to opposing positions, but subjects evaluated that expert’s expertise differently depending on their own prior opinions.

    I am presenting “Fraud in Science” to groups of middle schoolers next month at the U of M Flint campus. This could make an interesting highlight to Feynman’s “Cargo Cult Science.”


    Michael E. Marotta

    April 20, 2011 at 5:07 pm

  8. This is a very interesting post, and one that applies to certain professions as well as to politics.
    For example, I frequently adduce social scientific and medical evidence to mental health professionals showing that most mental health treatments 1) have small effects, 2) have effects that last only a short time, and 3) are not always positive. Aside from the “rent-seeking” aspects of mental health, it is simply a truism that people are not interested in hearing that what they believe is not categorically correct.


    Brian A. Pitt

    April 21, 2011 at 2:42 am


