inside higher ed discusses replication in psychology and sociology

Science just published a piece showing that only about a third of findings from major psychology journals could be replicated. That is, when the experiments were rerun, only about a third produced statistically significant results. The details of the studies matter as well: the higher the original p-value, the less likely a result was to replicate, and “flashy” results were less likely to replicate.

Inside Higher Ed spoke to me and other sociologists about the replication issue in our discipline. A major issue is that there is no incentive to actually assess published research, since it seems to be nearly impossible to publish replications and statistical criticisms in our major journals:

Recent research controversies in sociology also have brought replication concerns to the fore. Andrew Gelman, a professor of statistics and political science at Columbia University, for example, recently published a paper about the difficulty of pointing out possible statistical errors in a study published in the American Sociological Review. A field experiment at Stanford University suggested that only 15 of 53 authors contacted were able or willing to provide a replication package for their research. And the recent controversy over the star sociologist Alice Goffman, now an assistant professor at the University of Wisconsin at Madison, regarding the validity of her research studying youths in inner-city Philadelphia lingers — in part because she said she destroyed some of her research to protect her subjects.

Philip Cohen, a professor of sociology at the University of Maryland, recently wrote a personal blog post similar to Gelman’s, saying how hard it is to publish articles that question other research. (Cohen was trying to respond to Goffman’s work in the American Sociological Review.)

“Goffman included a survey with her ethnographic study, which in theory could have been replicable,” Cohen said via email. “If we could compare her research site to other populations by using her survey data, we could have learned something more about how common the problems and situations she discussed actually are. That would help evaluate the veracity of her research. But the survey was not reported in such a way as to permit a meaningful interpretation or replication. As a result, her research has much less reach or generalizability, because we don’t know how unique her experience was.”

Readers can judge whether Gelman’s or Cohen’s critiques are correct. But the broader issue is serious. Sociology journals simply aren’t publishing error corrections or replications, with the honorable exception of Sociological Science, which published a replication/critique of the Brooks and Manza (2006) ASR article. For now, debate on the technical merits of particular research seems to be the purview of blog posts and book reviews that are quickly forgotten. That’s not good.



Written by fabiorojas

August 31, 2015 at 12:01 am

4 Responses


  1. There is a similar problem in economics. This is why I have been calling for a journal dedicated to replications.


    Christian Zimmermann

    August 31, 2015 at 12:58 am

  2. Thanks for this. It’s hard as a layperson to know what “studies” are based in actual research. I keep looking at the footnotes. I guess I just assumed that if something has been published in a journal it meant it was the Real Deal, all peer-reviewed and replicable and reputable.



    August 31, 2015 at 11:59 am

  3. In grad school I used to joke that we needed a journal dedicated purely to replications in sociology – I called it a joke because I doubted anyone else in the discipline would have taken the idea seriously. Maybe it could publish small review articles as well, under the name “Reviews and Replications,” so that someone could say they had an R&R at R and R.



    August 31, 2015 at 1:40 pm

  4. I’ve had several studies rejected at major journals because they were “just replications,” but fortunately, they were accepted elsewhere. One classic was a replication of a study I’d originally published in AJS, actually done with substantially better methods the 2nd time around (in a different country & with finer-grained geographical location data), but rejected anyway b/c it was “only a replication.” The eventual lucky recipient journal? Social Forces! Thank you, Dick Simpson!

    Recently, I’ve gone through something similar: telling colleagues about results replicating a very-well received original publication (by someone else) & being told by them “well, don’t we know that already?” It’s enough to make a guy want to take up fly fishing….
    oh, never mind…


    Howard Aldrich

    September 5, 2015 at 11:55 pm

