orgtheory.net

Archive for the ‘research’ Category

design-focused review: a guest post by samuel r. lucas

Samuel R. Lucas is professor of sociology at the University of California, Berkeley. He works on education, social mobility, and research methods. This guest post proposes a reform of the journal review process.

On-going discussion about the journal publication process is laudable. I support many of the changes that have been suggested, such as the proposal to move to triple-blind review, and implemented, such as the rise of new journals that reject “dictatorial revi”–oops, I mean “developmental review.” I suggest, however, that part of the problem is that reviewers are encouraged to weigh in on anything–literally anything! I’ve reviewed papers and later received others’ reviews only to find that a reviewer had ignored almost all of the paper, weighing in on such issues as punctuation and choice of abbreviations for some technical terms. Although such off-point reviews are rare, they indicate that reviewers perceive it as legitimate to weigh in on anything and everything. But a system allowing unlimited bases of review is part of the problem with peer review, for it shifts too much power to reviewers while providing insufficient guidance on what will actually be helpful. I contend that we need to dispense with our kitchen-sink reviewing system by removing two aspects of papers from reviewer consideration: framing and findings.

Framing is a matter of taste and, as there is no accounting for taste, framing offers fertile ground for years of delay. Framing is an easy way to hold a paper hostage, because most solid papers could be framed in any one of several ways, and often multiple frames are equally valid. Authors should be allowed to frame their work as they see fit, not be forced to alter the frame because a reviewer reads the paper differently than the author. A reviewer who feels a paper should be framed differently should wait for its publication and then submit a paper that notes that the paper addressed Z but missed its connection to Q. Such an approach would make any worthwhile debate on framing public while freeing authors to place their ideas into the dialogue as well.

As for findings, peer review should be built on the following premise: if you accept the methods, then you accept the findings enough for the paper to enter the peer-reviewed literature. Thus, reviewers should assess whether the paper’s (statistical, experimental, qualitative) research design can answer the paper’s research question, but not the findings produced by that design. Allowing reviewers to evaluate findings lets them (perhaps inadvertently) scrutinize papers differently depending on the findings. To prevent such possibilities, journals should allow authors to request a findings-embargoed review, for which the journal would remove the findings section of the paper as well as the findings from 1) the abstract and 2) the discussion/conclusion section before delivering the paper for review. As some reviewers may regard reading soon-to-be-published work early as a benefit of reviewing, reviewers could be sent full manuscripts if the paper is accepted for publication.

A review system in which reviewers do not review framing and findings is a design-focused review system. Once a paper passes a design-focused review, editors can conduct an in-house assessment to assure findings are accurately conveyed and the framing is coherent. The editors, unlike reviewers, see the population of submissions, and thus, unlike reviewers, are well-placed to fairly and consistently assess any other issues. Editors will be even better placed to make such calls if they make them only for papers that reviewers have determined satisfy the basic criterion of having a design solid enough to answer the question the paper poses.

The current kitchen-sink review system has become increasingly time-consuming and perhaps capricious, hardly positive features for effective peer review. If findings were embargoed and reviewers were discouraged from treating their preferred frame as essential to a quality paper, review times could be chopped dramatically and revise and resubmit processes would be focused on solidifying design. As a result, design-focused review could lower our collective workload by reducing the number of taste-driven rounds of review we experience as authors and reviewers, while simultaneously reducing authors’ potentially paralyzing concern that mere matters of taste will block their research from timely publication. Design-focused review may thus make peer review work better for everyone.

50+ chapters of grad skool advice goodness: Grad Skool Rulz ($2!!!!)/From Black Power/Party in the Street

Written by fabiorojas

April 28, 2016 at 12:02 am

how scientists can help us avoid the next flint

In a story full of neglect and willful ignorance, there are a few heroes. One is Mona Hanna-Attisha, the Flint pediatrician and Michigan State professor who raised the alarm with data on kids’ blood-lead levels from the local hospital. Another is Marc Edwards, the Virginia Tech environmental engineer who took on the Michigan Department of Environmental Quality after a Flint resident sent him a lead-rich water sample for testing.

Hanna-Attisha and Edwards provide shining examples of how academics can use science to hold the powers-that-be accountable and make meaningful change.

Taking on the status quo is hard. But as Edwards discusses in the Chronicle, it’s becoming ever harder to do that from within universities:

I am very concerned about the culture of academia in this country and the perverse incentives that are given to young faculty. The pressures to get funding are just extraordinary. We’re all on this hedonistic treadmill — pursuing funding, pursuing fame, pursuing h-index — and the idea of science as a public good is being lost….What faculty person out there is going to take on their state, the Michigan Department of Environmental Quality, and the U.S. Environmental Protection Agency?…When was the last time you heard anyone in academia publicly criticize a funding agency, no matter how outrageous their behavior? We just don’t do these things….Everyone’s invested in just cranking out more crap papers.

When faculty defend academic freedom, tenure is often the focus. And certainly tenure provides one kind of protection for scientists like Hanna-Attisha (though she doesn’t yet have it) or Edwards who want to piss off the powerful.

But as this interview — and you should really read the whole thing — makes clear, tenure isn’t the only element of the academic ecosystem that allows people to speak out. Scientists can’t do their work without research funding, or access to data. When funders have interests — whether directly economic, as when oil and gas companies fund research on the environmental impacts of fracking, or more organizational, as when environmental agencies just don’t want to rock the boat — that affects what scientists can do.

So in addition to tenure, a funding ecosystem that includes multiple potential sources and that excludes the most egregiously self-interested will encourage independent science.

But beyond that, we need to defend strong professional cultures. Hanna-Attisha emphasizes how the values of medicine both motivated her (“[T]his is what matters. This is what we do … This is why we’re here”) and prompted her boss’s support (“Kids’ health comes first”), despite the “politically messy situation” that might have encouraged the hospital’s silence. Edwards lectures his colleagues about “their obligation as civil engineers to protect the public” and says, “I didn’t get in this field to stand by and let science be used to poison little kids.”

Intense economic pressures, though, make it hard to protect this kind of idealism. As market and financial logics come to dominate institutions like hospitals and universities, professional values gradually erode. It takes a concerted effort to defend them when everything else encourages you to keep your head down and leave well enough alone.

Promoting academic independence isn’t without its downsides. Scientists can become solipsistic, valuing internal status over real-world impact and complacently expecting government support as their due. The balance between preserving a robust and independent academic sector and ensuring scientists remain accountable to the public is a delicate one.

But if I have to choose between two risks—that science might be a bit insular and too focused on internal incentives, or that the only supporters of science have a one-sided interest in how the results turn out—I’ll take the first one every time.

Written by epopp

February 12, 2016 at 4:54 pm

that chocolate milk study: can we blame the media?

A specific brand of high-protein chocolate milk improved the cognitive function of high school football players with concussions. At least that’s what a press release from the University of Maryland claimed a few weeks ago. It also quoted the superintendent of the Washington County Public Schools as saying, “Now that we understand the findings of this study, we are determined to provide Fifth Quarter Fresh [the milk brand] to all of our athletes.”

The problem is that the “study” was not only funded in part by the milk producer, but is unpublished, unavailable to the public and, based on the press release — all the info we’ve got — raises immediate methodological questions. Certainly there are no grounds for making claims about this milk in particular, since the control group was given no milk at all.

The summary also raises questions about the sample size. The total sample of 474 high school football players included both concussed and non-concussed players. How many of these got concussions during one season? I would hope not enough to provide statistical power — this NAS report suggests high schoolers get 11 concussions per 10,000 football games and practices.
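The power concern can be made concrete with a back-of-the-envelope calculation. Everything here except the NAS rate is an assumption for illustration — the press release does not report how many games and practices the players had, so the exposures-per-player figure is invented:

```python
# Rough expected concussion count in the sample, using the NAS rate of
# 11 concussions per 10,000 athletic exposures (games and practices).
# The exposures-per-player figure is purely an assumption.
players = 474
rate_per_exposure = 11 / 10_000
exposures_per_player = 120  # assumed games + practices per player per season

expected_concussions = players * exposures_per_player * rate_per_exposure
print(round(expected_concussions, 1))  # ≈ 62.6 under these assumptions
```

Even under these generous assumptions, only a few dozen concussed players would end up split across treatment and control groups — a thin basis for claims about cognitive function.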

And even if the sample size is sufficient, it’s not clear that the results are meaningful. The press release suggests concussed athletes who drank the milk did significantly better on four of thirty-six possible measures — anyone want to take bets on the p-value cutoff?

Maryland put out the press release nearly four weeks ago. Since then there’s been a slow build of attention, starting with a takedown by Health News Review on January 5, before the story was picked up by a handful of news outlets and, this weekend, by Vox. In the meantime, the university says in fairly vague terms that it’s launched a review of the study, but the press release is still on the university website, and similarly questionable releases (“The magic formula for the ultimate sports recovery drink starts with cows, runs through the University of Maryland and ends with capitalism” — you can’t make this stuff up!) are up as well.

Whoever at the university decided to put out this press release should face consequences, and I’m really glad there are journalists out there holding the university’s feet to the fire. But while the university certainly bears responsibility for the poor decision to go out there and shill for a sponsor in the name of science, it’s worth noting that this is only half of the story.

There’s a lot of talk in academia these days about the status of scientific knowledge — about replicability, bias, and bad incentives, and how much we know that “just ain’t so.” And there’s plenty of blame to go around.

But in our focus on universities’ challenges in producing scientific knowledge, sometimes we underplay the role of another set of institutions: the media. Yes, there’s a literature on science communication that looks at the media as an intermediary between science and the public. But a lot of it takes a cognitive angle on audience reception, and it’s got a heavy bent toward controversial science, like climate change or fracking.

More attention to media as a field, though, with rapidly changing conditions of production, professional norms and pathways, and career incentives, could really shed some light on the dynamics of knowledge production more generally. It would be a mistake to look back to some idealized era in which unbiased but hard-hitting reporters left no stone unturned in their pursuit of the public interest. But the acceleration of the news cycle, the decline of journalism as a viable career, the impact of social media on news production, and the instant feedback on pageviews and clickthroughs all tend to reinforce a certain breathless attention to the latest overhyped university press release.

It’s not the best research that gets picked up, but the sexy, the counterintuitive, and the clickbait-ish. Female-named hurricanes kill more than male hurricanes. (No.) Talking to a gay canvasser makes people support gay marriage. (Really no.) Around the world, children in religious households are less altruistic than children of atheists. (No idea, but I have my doubts.)

This kind of coverage not only shapes what the public believes, but it shapes incentives in academia as well. After all, the University of Maryland is putting out these press releases because it expects to benefit, either from the perception that it is having a public impact, or from the goodwill the attention generates with Fifth Quarter Fresh and other donors. Researchers, in turn, will be similarly incentivized to focus on the sexy topic, or at least the sexy framing of the ordinary topic. And none of this contributes to the cumulative production of knowledge that we are, in theory, still pursuing.

None of this is meant to shift the blame for the challenges faced by science from the academic ecosystem to the realm of media. But if you really want to understand why it’s so hard to make scientific institutions work, you can’t ignore the role of media in producing acceptance of knowledge, or the rapidity with which that role is changing.

After all, if academics themselves can’t resist the urge to favor the counterintuitive over the mundane, we can hardly blame journalists for doing the same.

Written by epopp

January 18, 2016 at 1:23 pm

why do universities salivate over money-losing grants?

Happy new year. Guess what my New Year’s resolution is. To that end, a few quick thoughts on universities and the grant economy to dip a toe back in the water.

We all know that American universities (well, not only American universities) are increasingly hungry for grants. When state funding stagnates, and tuition revenues are limited by politics or discounting, universities look to their faculty to bring in money through grants. Although this may be a zero-sum game across universities (assuming total funding is fixed), it is unsurprising that administrations would intensify grant-seeking when faced with tight budgets.

Of course, it’s only unsurprising if grants actually make money for the university. But a variety of observers, from the critical to the self-interested, have argued that the indirect costs that many grants bring in – the part that pays not for the direct cost of research, but for overhead expenses like keeping the network running, the library open, and the heat and electricity on – don’t actually cover the full expense of conducting research.

Instead, they suggest that every grant the university brings in costs it another 9% or so in unreimbursed overhead. In addition, about 12% of total research spending consists of universities spending their own money on research. While some of this goes to support work unlikely to receive external funding (e.g. research in the humanities), I think it’s safe to assume that most of it is related to the search for external grants – it’s seed funding for projects with the potential for external funding, or bridge funding for lab faculty between grants. (These numbers come from the Council on Government Relations, a lobbying organization of research universities.)

If that’s the case, it means that when faculty bring in grants, even federal grants that come with an extra 50% or so to pay for overhead costs, it costs the university money. Money that could be spent on instruction, or facility maintenance, or even on research itself. So how can we make sense of the fact that universities are intensifying their search for grants, even as the numbers suggest that grants cost universities more than they gain from them?
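A stylized example may help fix the arithmetic. The dollar figures are invented; the 50% overhead rate and the roughly 9% unreimbursed share come from the estimates above, with the 9% read here as a share of the grant's total value (one plausible reading of the claim):

```python
# Hypothetical federal grant: the university recovers overhead at 50% of
# direct costs, but (per the estimates above) still absorbs roughly 9%
# of the grant's total value in unreimbursed overhead.
direct_costs = 100_000
indirect_rate = 0.50
unreimbursed_share = 0.09

grant_total = direct_costs * (1 + indirect_rate)       # 150,000 received
university_subsidy = grant_total * unreimbursed_share  # ~13,500 paid by the university
print(int(grant_total), int(university_subsidy))
```

On this reading, a $150,000 grant leaves the university roughly $13,500 out of pocket — money that has to come from tuition, state appropriations, or endowment income.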

I can think of at least three reasons this might be the case:

1.  The numbers are wrong.

It is notoriously difficult to estimate the “real” indirect costs of research. How much of the library should your grant pay for? How much of the heat, if it’s basically supporting a grad student who would be sitting in the same shared office with or without the grant? There are conventions here, but they are just that – conventions. And maybe universities have a better sense of the “real” costs, which might be lower than standard accounting would suggest. COGR has an interest in making research look expensive, so that government will be generous about covering indirect costs. And critics of the university (with whom I sympathize) have a different interest in highlighting the costs of research, since they see a heavy grants focus as coming at the cost of education and of the humanities and social sciences. (See e.g. this recent piece by Chris Newfield, which inspired the line of thought behind this post.)

Certainly the numbers are squishy, and the evidence that grant-seeking costs universities more than it gains them isn’t airtight. But I haven’t seen anyone make a strong case that universities are actually making money from indirect costs. So I’m skeptical that these numbers are out-and-out wrong, although open to better evidence.

2. It’s basically political and/or symbolic, not financial.

A second possibility is that the additional dollars aren’t really the point. The point is that universities exist in a status economy in which having a large research enterprise is integral to many forms of success, from attracting desirable faculty and students, to appearing in a positive light to politicians (more relevant for public than private universities), to attracting donations from those who want to give to an institution that is among the “best”. Or, in a slight variation, maybe the perceived political benefits of having a large grant apparatus – of being on the cutting edge of science, of being seen as economically valuable – are seen as outweighing any extra costs. After all, what’s an extra 10% per grant if it makes the difference between the state increasing or cutting your appropriations over the next decade? (Again, most relevant for publics.)

These dynamics are real, but they don’t explain the intensification of the search for grants in response to tight budgets, except insofar as tight budgets also intensify the status competition. But it really seems to me that administrators see grants as a direct financial solution, not an indirect one. So I think that symbolic politics is a piece of the puzzle, but not the only one.

3.  Not all dollars are created equal.

Different dollars have different values to different people. Academic scientists often like industry grants because they tend to be more flexible than government money. Administrators, on the other hand, don’t, since such grants typically don’t cover overhead expenses.

Perhaps something related is going on with the broader search for grants. Maybe, even if grants really do cost more than they bring in for universities, administrators don’t perceive the revenues and the expenses in parallel ways. After all, those indirect costs provide identifiable extra dollars the university wouldn’t have seen otherwise. But the “excess” expenses are sort of invisible. The university is going to pay for the heat and the library either way; even if you know the research infrastructure has to be supported, you might assume that the marginal overhead cost of an additional grant doesn’t make that much difference. (Maybe you’d even be right.) And people might not see some costs – like university seed funding for potentially fundable research – as an expense of grant-seeking, even if that’s why they exist.

I think this is probably a big part of the explanation. The extra revenues of grants are visible and salient; the extra costs are hidden and easy to discount. So, rightly or wrongly, administrators turn to grant-seeking in tight times despite the fact that it actually costs universities money.

There are some other possibilities I’m not considering here. For example, maybe this is about the interests of different specific groups within the organization – e.g. about competitions among deans, or between upper administration and trustees. But I think #2 and #3 capture a lot of what’s going on.

So, if you think this dynamic (the intensification of grant-seeking) is kind of dysfunctional, what do you do? Well, pointing out how much research really costs the university – loudly and repeatedly – is probably a good idea. Make those “extra” costs as visible and salient as the revenues. (Though it would be SO NICE if the numbers were better.)

But don’t discount #2 – even if any extra costs of grants are made clear, universities aren’t going to give up the search for them. Because while the money grants bring in matters, they also have value as status capital, and that outweighs any unreimbursed costs they incur. Grants may not quite cover those pesky infrastructure costs. But the legitimacy they collectively confer is, quite literally, priceless.

Written by epopp

January 4, 2016 at 1:54 pm

new book Handbook of Qualitative Organizational Research: Innovative Pathways and Methods (2015, Routledge) now available

At orgtheory, we’ve had on-going discussions about how to undertake research.  For example, I’ve shared my own take on dealing with the IRB, gaining access to organizations, undertaking ethnography, timing and pacing research, writing for wider audiences, and what is ethnography good for?  Guest blogger Ellen Berrey elaborated her thoughts on how to get access to organizations, and we’ve had at least three discussions about the challenges of anonymizing names and identities of persons and organizations, including guest blogger Victor Tan Chen’s post, guest blogger Ellen Berrey’s post, and Fabio’s most recent post here.

Looking for more viewpoints about how to undertake organizational research?  Preparing a research proposal?  Need a new guide for a methods or organizations class?  Rod Kramer and Kim Elsbach have co-edited the Handbook of Qualitative Organizational Research: Innovative Pathways and Methods (2015, Routledge).

In the introduction, Kramer and Elsbach describe the impetus for the volume:

There were several sources of inspiration that motivated this volume. First and foremost was a thoughtful and provocative article by Jean Bartunek, Sara Rynes, and Duane Ireland that appeared in the Academy of Management Journal in 2006. This article published a list of the 17 most interesting organizational papers published in the last 100 years. These papers were identified by Academy of Management Journal board members—all of whom are leading organizational scholars cognizant of  the best work being done in their respective areas. A total of 67 board members nominated 160 articles as exceptionally interesting; those articles that received two or more nominations were deemed the most interesting. Of these exceptional articles, 12 (71%) involved qualitative methods.

This result strongly mirrors our own experience as organizational researchers. Although both of us have used a variety of methods in our organizational research (ranging from experimental lab studies and surveys to computer-based, agent simulations), our favorite studies by far have been our qualitative studies (including those we have done together). One of the qualities we have come to most appreciate, even cherish, about qualitative research is the sense of discovery and the opportunity for genuine intellectual surprise. Rather than merely seeking to confirm a preordained hypothesis or “nail down” an extrapolation drawn from the extant literature, our inductive studies, we found, invariably opened up exciting, unexpected intellectual doors and pointed us toward fruitful empirical paths for further investigation. In short, if life is largely all about the journey rather than destination, as the adage asserts, we’ve found qualitative research most often gave us a road we wanted to follow.

Written by katherinechen

December 18, 2015 at 5:27 pm

stuff that doesn’t replicate

Here’s the list (so far):

Some people might want to hand wave the problem away or jump to the conclusion that science is broken. There’s a more intuitive explanation – science is “brittle.” That is, once you get past some basic and important findings, you get to findings that are small in size, require many technical assumptions, or rely on very specific laboratory/data collection conditions.

There should be two responses. First, editors should reject submissions that depend on “local conditions” or report very small effects, or send them to lower-tier journals. Second, other researchers should feel free to attempt replications. This is appropriate work for early-career academics who need to learn how research is done. Of course, people who publish in top journals, or obtain famous results, should expect replication requests.

Written by fabiorojas

October 13, 2015 at 12:01 am

movements and inhabited institutions: the case of latino student groups

A key insight from research on student activism is that the college environment has a strong influence on how that activism expresses itself. We saw that in Amy Binder and Kate Wood’s study of conservative groups. Daisy Reyes has an article in Sociology of Education that explores this issue with Latino groups and links it to institutional theory:

To comply with ideals of multiculturalism and diversity, postsecondary institutions incorporate Latino students into distinct campus cultures. These cultures influence how students interact with one another, the university community at large, and communities outside of campus, ultimately shaping how students inhabit Latino politics. Drawing on data from 20 months of ethnographic fieldwork with six student organizations and 60 in-depth interviews, I compare Latino student organizations in a liberal arts college, a research university, and a regional public university. Building on inhabited institutional theory, I identify dimensions of campus cultures that work in interaction with students to produce three divergent forms of ethnic political expression: deliberative, divisive, and contentious. Inhabited institutionalism helps explain why Latino politics takes distinct forms in specific academic contexts and suggests that strong collegiate incorporation may paradoxically serve to suppress Latino student engagement in political activism outside the campus gates.

Read the entire article here. Recommended.

Written by fabiorojas

September 29, 2015 at 12:01 am
