please, no more homework
The Washington Post just ran an article about research showing that homework isn’t particularly effective. A clip from the article:
Let’s start by reviewing what we know from earlier investigations. First, no research has ever found a benefit to assigning homework (of any kind or in any amount) in elementary school. In fact, there isn’t even a positive correlation between, on the one hand, having younger children do some homework (vs. none), or more (vs. less), and, on the other hand, any measure of achievement. If we’re making 12-year-olds, much less five-year-olds, do homework, it’s either because we’re misinformed about what the evidence says or because we think kids ought to have to do homework despite what the evidence says.
Second, even at the high school level, the research supporting homework hasn’t been particularly persuasive. There does seem to be a correlation between homework and standardized test scores, but (a) it isn’t strong, meaning that homework doesn’t explain much of the variance in scores, (b) one prominent researcher, Timothy Keith, who did find a solid correlation, returned to the topic a decade later to enter more variables into the equation simultaneously, only to discover that the improved study showed that homework had no effect after all, and (c) at best we’re only talking about a correlation — things that go together — without having proved that doing more homework causes test scores to go up. (Take 10 seconds to see if you can come up with other variables that might be driving both of these things.)
Third, when homework is related to test scores, the connection tends to be strongest — or, actually, least tenuous — with math. If homework turns out to be unnecessary for students to succeed in that subject, it’s probably unnecessary everywhere.
Along comes a new study, then, that focuses on the neighborhood where you’d be most likely to find a positive effect if one was there to be found: math and science homework in high school. Like most recent studies, this one by Adam Maltese and his colleagues doesn’t provide rich descriptive analyses of what students and teachers are doing. Rather, it offers an aerial view, the kind preferred by economists, relying on two large datasets (from the National Education Longitudinal Study [NELS] and the Education Longitudinal Study [ELS]). Thousands of students are asked one question — How much time do you spend on homework? — and statistical tests are then performed to discover if there’s a relationship between that number and how they fared in their classes and on standardized tests.
Was there a correlation between the amount of homework that high school students reported doing and their scores on standardized math and science tests? Yes, and it was statistically significant but “very modest”: Even assuming the existence of a causal relationship, which is by no means clear, one or two hours’ worth of homework every day buys you two or three points on a test. Is that really worth the frustration, exhaustion, family conflict, loss of time for other activities, and potential diminution of interest in learning? And how meaningful a measure were those tests in the first place, since, as the authors concede, they’re timed measures of mostly mechanical skills? (Thus, a headline that reads “Study finds homework boosts achievement” can be translated as “A relentless regimen of after-school drill-and-skill can raise scores a wee bit on tests of rote learning.”)
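The "modest correlation" point above is worth making concrete: a correlation coefficient must be squared to get the share of variance it explains, so even a seemingly respectable r shrinks to a small explanatory footprint. Here's a minimal sketch with invented numbers (not the study's actual data) showing how a weak link between homework hours and test scores yields an r around 0.25 that explains only about 6% of the variance in scores:

```python
import numpy as np

# Hypothetical illustration only -- synthetic data, not NELS/ELS figures.
rng = np.random.default_rng(0)

n = 5000                                   # large sample, like the surveys
homework_hours = rng.uniform(0, 3, n)      # self-reported hours per day
noise = rng.normal(0, 1, n)

# Build scores so homework has a real but small effect relative to
# everything else that drives test performance (the noise term).
scores = 50 + 3.0 * homework_hours + 10 * noise

r = np.corrcoef(homework_hours, scores)[0, 1]
r_squared = r ** 2

print(f"correlation r       = {r:.2f}")    # modest, yet "significant" at this n
print(f"variance explained  = {r_squared:.1%}")
```

With a sample this large the correlation is easily statistically significant, which is exactly the trap: significance says the relationship probably isn't zero, while r-squared says it's barely worth mentioning.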
Education researchers have long known that homework doesn’t lead to improved learning. Back in 2006, I blogged about The Battle over Homework, which lays out the case. Hey, teacher, leave us kids alone!