That’s the name of an article in the New Yorker that explores the work of my good friend, political scientist Brendan Nyhan. The essence is pretty simple: people don’t change their beliefs if doing so would somehow challenge their identity:
Last month, Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds.
The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet—focussed on a lack of evidence connecting vaccines and autism—seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines. “It’s depressing,” Nyhan said. “We were definitely depressed,” he repeated, after a pause.
It’s the realization that persistently false beliefs stem from issues closely tied to our conception of self that prompted Nyhan and his colleagues to look at less traditional methods of rectifying misinformation. Rather than correcting or augmenting facts, they decided to target people’s beliefs about themselves. In a series of studies that they’ve just submitted for publication, the Dartmouth team approached false-belief correction from a self-affirmation angle, an approach that had previously been used for fighting prejudice and low self-esteem. The theory, pioneered by Claude Steele, suggests that, when people feel their sense of self threatened by the outside world, they are strongly motivated to correct the misperception, be it by reasoning away the inconsistency or by modifying their behavior. For example, when women are asked to state their gender before taking a math or science test, they end up performing worse than if no such statement appears, conforming their behavior to societal beliefs about female math-and-science ability. To address this so-called stereotype threat, Steele proposes an exercise in self-affirmation: either write down or say aloud positive moments from your past that reaffirm your sense of self and are related to the threat in question. Steele’s research suggests that affirmation makes people far more resilient and high performing, be it on an S.A.T., an I.Q. test, or at a book-club meeting.
Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would. On all issues, attitudes became more accurate with self-affirmation, and remained just as inaccurate without. That effect held even when no additional information was presented—that is, when people were simply asked the same questions twice, before and after the self-affirmation.
Read the whole thing.
As many of you know, Washington University decided to reestablish a sociology department after notoriously shutting its own down some two decades ago. The Chronicle of Higher Ed has reported that the university has chosen the department’s first chair and associate chair — Steven Fazzari, a macroeconomist at Wash U., and Mark Rank, who started in Washington’s sociology department before moving to the School of Social Work in 1989.
This seems like a surprising decision. The Chronicle writes:
Administrators had considered appointing a senior figure in American sociology to be chair, but, “lacking an obvious candidate,” as Mr. Fazzari puts it, they turned to him. Along with several teaching awards, he has six years of experience as chair of the economics department, and has done stints on campus-planning and hiring committees. He was a member of the campus advisory panel formed last year to consider how to revive sociology.
“There is much overlap between the problems addressed by economics and sociology,” he says. “Economics also provides a firm grounding in technical modeling and data analysis that is part of much advanced work in many social sciences, including sociology.”
I can imagine various reasons they might have taken this approach. Luring a top senior person in to build a department from scratch has to be a challenge. Still, Washington has a lot of resources and is a highly respected university (outside of sociology, where it has no presence). And there are some definite downsides to launching the department without a highly visible sociologist at the helm. I’m curious what the back story is here but, having no inside information, will leave it to you to speculate.
Psych experiments show that we tend to overvalue objects that we possess – in the classic coffee mug experiment, owners demanded a higher price to sell a mug they had been given than buyers were willing to pay for the identical mug. What happens when the object is a non-human family member?
When negotiating the sale of their home, one Australian family was willing to give up their cat Tiffany to the new homeowners for $140,000 (about $120K in US dollars). Some readers of the article announcing this exchange felt their pets were priceless, while others pointed out that cats are territorial and may not tolerate moves.
Don’t expect some cats to reciprocate your affectionate feelings – according to one medical examiner, cats will consume your lips and other edibles should you expire in your home. Sweet dreams, kitty owners.
A problem with a lot of introductory level courses is that they attract heterogeneous students. In sociology, this is very apparent in the introduction to sociology class. It is not uncommon to get, in the same class, a graduating senior who wants to put in the minimal amount of effort and a very aggressive freshman who wants that 4.0 GPA for that Harvard law application. The heterogeneous class presents problems on many levels – the presentation of materials, classroom management, and so forth. In this post, a few comments on how to handle this class.
- Cut the class in half. A few people have told me that it is effective to treat the first half as a chance to make sure everyone is on the same page. Then, in the second half, you can move into material that will be new for almost everyone.
- Active learning: People have also suggested that you stop lecturing. Instead, have students do in-class work. This helps reduce boredom for more advanced students and, at the least, gives them something to do.
- A third strategy is to stratify assignments. Older students can get more involved and challenging assignments. This depends on the nature of the course and if you have the patience to grade multiple assignments at once.
Use the comments to discuss your own teaching strategies for heterogeneous classes.
So the stock market has been freaking out a bit the last couple of weeks. Secular stagnation, Ebola, a five-year bull market—who knows why. Anyway, over the weekend I was listening to someone on NPR explain what the average person should do under such circumstances (answer: hang tight, don’t try to time the market). This reminded me of one of my pet quibbles with financial advice, which I think applies to a lot of social science more generally.
For years, the conventional wisdom around what ordinary folks should do with their money has gone something like this. Save a lot. Put it in tax-favored retirement accounts. Invest it mostly in index funds—the S&P 500 is good. Don’t mess with it. In the long run this is likely to net you a reliable 7% return after inflation, about the best you’re likely to do.
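To make the arithmetic behind that conventional wisdom concrete, here is a minimal sketch of how a steady real return compounds. The 7% figure comes from the text; the $10,000 starting balance and 30-year horizon are hypothetical numbers chosen for illustration, and actual returns of course vary year to year.

```python
def compound(principal, annual_rate, years):
    """Grow `principal` at a constant `annual_rate` (e.g. 0.07) for `years` years."""
    return principal * (1 + annual_rate) ** years

# $10,000 left untouched at a steady 7% real return for 30 years:
balance = compound(10_000, 0.07, 30)
print(round(balance))  # roughly $76,000 in today's dollars
```

The point of the "don't mess with it" advice is visible in the exponent: most of the growth arrives in the later years, which is also why the strategy's appeal depends so heavily on the long horizons discussed below.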
Now, it’s not that I think this is bad advice. In fact, this is pretty much exactly what I do, with some small tweaks.
But it has always struck me how, in news stories and advice columns and talk shows, people talk about how this is a good strategy because it’s worked for SO LONG. For 30 years! Or since 1929! Or since 1900! (Adjust returns accordingly.)
And yes, 30 years, or 85, or 114, are all a long time relative to human life. And we have to make decisions based on the knowledge we’ve got.
But it’s always seemed to me that if what you’re interested in is what will happen over the 30+ years of someone’s earning life (more if you’re not in academia!), you’ve basically got an N of 1 to 4 here. I mean, sure, this may be a reasonable guess, but I don’t think there’s any strong reason to believe that the next 100 years are likely to look very similar to the last 100. Odds are better if you’re just interested in the next 30, but even then, I’m always surprised by just how confident the conventional wisdom is that the market’s always coming out ahead over a 25- or 30-year period—going ALL THE WAY BACK TO 1929—is rock-solid evidence that it will do so in the future.
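The "N of 1 to 4" point can be checked with integer division: counting how many independent, non-overlapping 30-year investing horizons fit in each of the historical records the advice columns invoke. The record lengths below are the ones mentioned in the text (30 years, since 1929, since 1900).

```python
def independent_windows(record_years, horizon_years):
    """Number of non-overlapping horizons of length `horizon_years`
    that fit inside a historical record of `record_years`."""
    return record_years // horizon_years

for record in (30, 85, 114):  # "for 30 years", "since 1929", "since 1900"
    n = independent_windows(record, 30)
    print(f"{record} years of data -> {n} independent 30-year period(s)")
```

Overlapping windows give you more data points on paper, but they are far from independent observations, which is the statistical heart of the worry here.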
Of course, there are lots of people who don’t believe this, too, as evidenced by what happened to gold prices after the financial crisis. Or by, you know, survivalists.
Anyway, I think this overconfidence in the lessons of the recent past is something we as social scientists tend to be susceptible to. The study that comes most immediately to mind here is the Raj Chetty study on value-added estimates of teachers (paper 1, paper 2, NYT article).
The gist of the argument is that teachers’ effects on student test scores, net of student characteristics (their value added), predict students’ eventual income at age 28. Now, there’s a lot that could be discussed about this study (latest round of critique, media coverage thereof).
But I just want to point to it—or rather, broader interpretations of it—as illustrating a similar overconfidence in the ability of the past to predict the future.
Here we have a study based on a massive (2.5 million students) dataset over a twenty-year period (1989-2009). Just thinking about the scale of the study and taking its results at face value, it’s hard to imagine how much more certain one could be in social science than at the end of such an endeavor.
And much of the media coverage takes that certainty and projects it into the future (see the NYT article again). If you replace a low value-added teacher with an average one, the classroom’s lifetime earnings will increase by more than $250,000.
And yet to make such a leap, you have to be willing to assume so many things about the future will be like the past: not only that incentivizing teachers differently and making tests more important won’t change their predictive effects (which the papers acknowledge), but, just as importantly, that the effect of education on earnings—or, more specifically, of teacher value-added on earnings—will be similar in future 20-year periods to what it was from 1989 to 2009. And that nothing else meaningful about teachers, students, schools, or earnings will evolve over the next 20 years in ways that mess with that relationship in a significant way.
I think we do this a lot—project into the future based on our understanding of a past that is, really, quite recent. Of course knowledge about the (relatively) recent past still should inform the decisions we make about the future. But rather a lot of modesty is called for when making blanket claims that assume the future is going to look just like the past. Maybe it’s human nature. But I think that modesty is often missing.