When evidence isn’t convincing
A couple of weeks ago I linked to a blog post by Brad DeLong about the future of economics education. While most of the comments on my post were about the content of economics education, what really struck me in DeLong’s original post was how academic majors reinforce students’ pre-existing political biases, rather than informing or changing them, as we like to think a good liberal arts education should. Right-leaning students leave economics feeling justified in their belief that the market will solve every social problem (or, if it doesn’t, that it’s not a social problem we ought to do anything about). Left-leaning students leave sociology feeling justified in their belief that the state ought to do more to resolve social problems. This is a problem of confirmation bias: our brains are not very good at evaluating evidence that doesn’t conform to our pre-existing beliefs.
New research by legal scholar Dan Kahan shows that political ideology strongly shapes our willingness to believe scientific evidence. It turns out that it’s not just a problem among Republicans. Here’s a summary by Chris Mooney in Mother Jones:
In Kahan’s research (PDF), individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)
In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views, and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) could lead to outcomes deleterious to society, whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science,” not in their own minds, anyway; it’s just that “science” was whatever they wanted it to be. Worse, head-on attempts to persuade can sometimes trigger a backfire effect, in which people not only fail to change their minds when confronted with the facts but hold their wrong views more tenaciously than ever.
I suppose another implication of this is that the more politically polarized society becomes, the less influential scientific evidence will be in persuading anyone to change their political positions.