orgtheory.net

college rankings are a fraud, but I like princeton review

Fabio

Maybe that’s a little too harsh. College rankings can be thoughtful and useful, but I am often reminded of the poor quality of many of them. For example, the LA Times education blog (School Me! Adventures in Education) had a recent post on how the US News & World Report college rankings are derived primarily from exclusivity, campus wealth, and prior reputation. The LA Times blog rightly notes that many college rankings don’t bother to ask the obvious question about the *quality* of the education or social experience at a college. Why should we care that an elite private college has a 10% acceptance rate? And should we be surprised at all that wealthy schools often have the biggest libraries?

Furthermore, these rankings seem designed to automatically put Harvard, Yale, or Princeton at the top, regardless of the advantages of other elite privates (Chicago, Caltech), liberal arts colleges (Amherst, Swarthmore), and major public schools (Berkeley, Michigan). I urge readers to look at an old article in Slate about how Caltech – whose students are probably stronger than those at any Ivy League school – briefly held a top ranking in 1999 and was then unceremoniously bumped to make way for Harvard/Yale/Princeton. As a Berkeley grad, I was surprised to see my college fluctuate from a high of #5 in the late 1980s down to around #22 in the late 1990s, even though the college really hadn’t changed much. Bruce Gottlieb, the author of the Slate article, says the following about Caltech’s treatment and how the rankings work:

But the real reason Caltech jumped eight spaces this year is that the editors at U.S. News fiddled with the rules. The lead story of the “best colleges” package says that a change in “methodology … helped” make Caltech No. 1. Buried in a sidebar is the flat-out concession that “[t]he effect of the change … was to move [Caltech] into first place.” No “helped” about it. In other words, Caltech didn’t improve this year, and Harvard, Yale, and Princeton didn’t get any worse. If the rules hadn’t changed, HYP would still be ahead. If the rules had changed last year, Caltech would have been on top a year earlier.

The US News rankings seem designed to (a) reinforce the position of HYP as the most desirable schools & (b) produce poorly designed lists for nervous high school kids and their folks. The US News ranking books tell you that there is more to college than rankings, but aside from showcasing a few non-Ivy League schools each year, they do little to back up the rhetoric.

In that spirit, I recommend the Princeton Review’s recent The Best 361 Colleges. If you are serious about college quality, it is far superior to the US News & World Report rankings. The Princeton Review gives you basic statistics about competitiveness (average SAT, acceptance rate) and interview clips from students, so you get a sense of what each school is like. Also, they actually survey students on how much they liked their education and other experiences. Sure, it’s not a perfect measure, but it’s a step in the right direction.

The book then creates top & bottom 10 lists – schools that make you study the most/least, schools that have the best/worst classroom experience, schools with the best/worst food, etc. And surprise – on many important measures, the HYP trio does not come out on top, or even in the top ten. For example, among the colleges with the best classroom teaching, you get a mix of liberal arts colleges and research schools with strong teaching missions (e.g., Chicago). Readers obsessed with rankings might also want to look at the Washington Monthly’s ranking, which emphasizes how much a school helps low-income kids move up and whether graduates go into public service. I can only commend the Princeton Review & the Washington Monthly for thinking outside the box.

Written by fabiorojas

September 24, 2006 at 6:05 pm

Posted in education, fabio

7 Responses

  1. If US News & World Report didn’t fiddle with their ranking system each year, they would have virtually the same rankings year after year, and then it would be much more difficult to sell the new rankings as news.

    Jeremy

    September 24, 2006 at 8:17 pm

  2. Well, it’s a little more subtle – a college’s position in the ranking stays about the same – you won’t see HYP drop out of the top five – but you see a lot of fluctuation within micro-categories. For example, no HYP school has ever been ranked lower than #5. Chicago (my grad institution) always fluctuates somewhere between #8 and #15, etc.

    The range of “acceptable” rankings remains the same for most schools (see the original post for exceptions), but US News probably fiddles with the rules to create news-grabbing “micro-fluctuations.” (“Princeton grabs the #1 position!!!”)

    What they do to sell more copies is create new rankings – breaking the research schools into private/public, creating categories for other universities, occasional grad school ratings, etc. The micro-fluctuations among HYP are for wealthy suburbanites and their kids; the other stuff is for other readers.

    From a marketing standpoint, brilliant. From an education standpoint, “room for improvement!”

    Fabio Rojas

    September 24, 2006 at 8:28 pm

  3. 1. Isn’t it better to divide the pool of institutions into two (or more) categories? Public and private universities, for example. Among the latter, perhaps a subdivision into universities and liberal arts colleges is also desirable. Without this subdivision, don’t these rankings end up comparing apples and oranges?

    2. Within these categories, are the composite scores of the top 10 (or top 5) different enough to support any sensible ranking scheme?

    3. There are several ‘rankings’ now available for universities worldwide (the (London) Times and Shanghai Jiao Tong rankings are popular; even a recent issue of Newsweek had a list of the top 100 ‘global’ universities). With these, the apples-and-oranges problem is compounded globally!

    Abi

    September 25, 2006 at 6:19 am

  4. The Shanghai Jiao Tong University rankings are pretty objective, and they only use outcomes, not inputs. They are biased in favor of science. I recalculated their rankings using the score per faculty full-time equivalent. On the original list Berkeley was number 4; if you take size into account, it is number 7. My list is here: http://conservationfinance.wordpress.com/2006/08/22/the-best-universities-in-the-world/
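
    A minimal sketch of that per-FTE adjustment (in Python, with made-up scores and faculty counts standing in for the actual Shanghai figures):

        # Divide each school's composite score by its faculty full-time-
        # equivalent (FTE) count, then re-rank. All numbers here are
        # hypothetical placeholders, not real ranking data.
        schools = {
            "School A": {"score": 100.0, "faculty_fte": 2400},
            "School B": {"score": 71.0, "faculty_fte": 2100},
            "School C": {"score": 65.0, "faculty_fte": 900},
        }

        # Composite score per 1,000 faculty FTE.
        per_fte = {
            name: 1000 * info["score"] / info["faculty_fte"]
            for name, info in schools.items()
        }

        # Re-rank: a small school with a strong per-capita record can now
        # beat a larger school with a bigger raw score.
        for name, value in sorted(per_fte.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name}: {value:.1f} points per 1,000 faculty")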

    Lars Smith

    September 25, 2006 at 4:16 pm

  5. […] important. They are based on some questionable criteria and data that is often manipulated (also here). But the colleges do consider their rankings […]

  6. […] long time readers know, I believe that most college rankings are garbage because they use dubious measures of performance and quality. Also, the leading magazines tend to […]

  7. […] to the failings of USNWR, it’s worth thinking about what would be included in an analysis that did a better job than […]


Comments are closed.