Students Find Glaring Discrepancy in US News Rankings
Study reveals the steep penalty that the rankings giant imposes on Reed.
For years, I’ve written about the hidden penalty that U.S. News & World Report imposes on Reed and other rebel colleges that refuse to cooperate with the rankings giant. Now a team of Reed students has come up with a way to estimate the magnitude of the hit.
Their conclusion? If USN faithfully followed its own formula in the 2019 rankings, Reed would be ranked at #38, rather than its assigned rank of #90. In other words, USN pushed the college down a whopping 52 rungs on the ladder because Reed wouldn’t fill out their form.
The Reed team—consisting of Bailee Cruger ’19, Huaying Qiu ’20, and Wenxin Du ’20—dug into the USN system for a project in a statistics class with Prof. McConville. They began by analyzing public data sources, specifically a database known as IPEDS (the Integrated Postsecondary Education Data System), run by the National Center for Education Statistics.
They quickly spotted a red flag. The USN rank depends on a complex scoring system based on dozens of factors, such as SAT scores and class size. One of those factors is how much a college spends per student on instruction, research, and student services. (The basic idea is that top colleges spend more money on their students.) Our researchers noticed that USN ranked Reed very low on this factor—#169 out of 172 schools—even though Reed spends $54,566 per student, more than many schools in the Top 50.
This discrepancy aroused their suspicions and inspired them to perform an astonishing feat: they reverse-engineered USN’s ranking system.
Taming the Leviathan
The project required some statistical modeling that I won’t pretend to understand.
But the upshot is simple: starting with the IPEDS data and using the formulas they worked out, the students were able to predict USN’s rankings with 94% accuracy. In other words, they could make a highly accurate prediction about where a given college would land in the USN rankings, based on its IPEDS data, with one striking exception—Reed.
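To give a flavor of the approach, here is a rough sketch, in Python, of the kind of model the students describe: fit a regression on IPEDS-derived variables and see how well it predicts the published USN scores. This is my own illustration, not their code, and the file and column names are made up.

# Minimal sketch, not the students' actual code: predict a college's
# published USN overall score from IPEDS-derived variables.
# The file name and column names below are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("ipeds_features.csv")        # one row per college
features = ["sat_75th_percentile", "pct_classes_under_20",
            "faculty_salary", "spend_per_student", "grad_rate"]

X = df[features]
y = df["usn_overall_score"]                   # scores of colleges that report to USN

model = LinearRegression().fit(X, y)
df["predicted_score"] = model.predict(X)

# A fit statistic along these lines is one way the reported
# "94% accuracy" could be measured.
print(f"R^2 = {model.score(X, y):.2f}")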
To understand how this works, take a look at USN’s “financial resources” score, which is based on factors like faculty compensation and expenditure per student (which we saw earlier). The students plugged IPEDS data for all colleges into the formula they worked out for financial resources and compared their predictions with the actual scores. In most cases, their estimates were very close. Some were a little over, some a little under. Reed was off by more than 100 places.
The overall rankings for 2019 show the same pattern. Using IPEDS data, the students were able to predict a college’s overall score with a high degree of accuracy—until it came to Reed.
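For the curious, one way to make “until it came to Reed” concrete is to ask whether a college’s actual score falls outside the model’s prediction interval. The sketch below uses the statsmodels library purely as an illustration; the students’ exact methods and variable names may differ.

# Sketch only: flag colleges whose published USN score falls outside
# the model's 95% prediction interval. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ipeds_features.csv")
X = sm.add_constant(df[["sat_75th_percentile", "faculty_salary", "spend_per_student"]])
y = df["usn_overall_score"]

fit = sm.OLS(y, X).fit()
intervals = fit.get_prediction(X).summary_frame(alpha=0.05)

# A college far outside its own prediction interval is the red flag.
outliers = df[(y < intervals["obs_ci_lower"]) | (y > intervals["obs_ci_upper"])]
print(outliers[["college", "usn_overall_score"]])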
They reckoned that Reed should have been ranked at #38 in 2019. USN actually assigned it a rank of #90.
This isn’t a fluke. The students also checked the data for 2009. According to their calculations, Reed should have ranked at #37. But USN actually assigned it a rank of #54.
The students are careful to note that their results are not perfect. “It is important to emphasize that all results are based on the data available to us and the models we used,” they told me. “The uncertainty—the prediction interval which can be found in the slides—should also be mentioned.”
Nevertheless, their conclusions are persuasive. “I was extremely impressed with how deeply the students dug into the USN methodology,” says Prof. McConville. “They worked hard to understand all the various pieces that go into the USN ranking model. They also followed good statistical practices with their work and tried various modeling approaches to determine how robust their findings were and each time made sure to quantify the uncertainty in their estimates. From a statistical perspective, I found their arguments that Reed is under-ranked to be very compelling.”
The discrepancy is glaring but should come as no surprise. In 1995, after a string of high-profile scandals where colleges misreported their data to USN, Reed abandoned the system, and other colleges followed suit. To prevent similar defections, USN established an array of statistical penalties designed to punish refuseniks and prevent others from leaving the fold. Most colleges fell back in line; Reed stuck to its guns and has held out against the system for a remarkable 24 years.
In 2014, Robert Morse, the director of data research for USN, made a presentation to the Association for Institutional Research in Orlando, Florida, where he claimed that his company treats non-responders like Reed in a straightforward manner, based on numbers they obtain from public databases. The implication was that refuseniks get a fair shake from the system, and that any griping should be written off as sour grapes. But the Reed paper makes clear that USN maintains a steep penalty for non-compliance, and that the statistical punishment is actually getting more severe.
What’s the Solution?
Of course, you can argue that the solution is obvious—Reed should simply agree to play the game. In 2014, St. John’s College in Annapolis, Maryland, did exactly that; its rank that year vaulted from #123 to #56.
Interestingly, however, none of the Reed students thought the college should change its policy. Says Huaying:
“I like Reed because I like the professors here—not because of their PhD degrees, but their personalities, styles of teaching, etc. I like the Sakura trees in Eliot Circle and the restaurants in Portland; I like my classmates. I can come up with dozens of other reasons why I like Reed, but you get the idea. How you like your school eventually comes down to these very personal things rather than numbers and these unmeasurable human feelings would be included in the error term. When the error term has the dominating effect, you know you won't have a good model. So, why should we take a relatively trivial variable, USN rankings, seriously?”
Experts over the years have highlighted the flaws in the USN system, but their warnings have done little to dent its popularity. The truth is that high-school seniors (and their families) suffer from information overload when it comes to choosing a college; they want a way to make easy comparisons. USN makes that possible—even if it sometimes means squeezing apples into oranges.
I'll close with an inspiring thought from Wenxin. “It’s easy to make up stories,” she says. “Except that between rows of data there lies truth. We may never unearth the absolute reality, but we can get close to it—like what we did here. So next time I’m asked, ‘What do you do in statistics?’ I’ll reply ‘Well… the first duty of us statisticians is seeking after the truth.’”
Tags: Editor's Picks, Institutional, Research, Students