Remember when the Mafia fixed the World Series? That scandal so shocked America that it showed up in The Great Gatsby and gave us the heart-rending story of a little boy pleading with "Shoeless" Joe Jackson (one of the cheaters) to "say it ain't so, Joe." And of course, we Dodgeball fans have barely gotten over Lance Armstrong's lies. Now we learn that another beloved American competition has been fixed: the U.S. News and World Report college rankings.
Yes, it turns out that some college administrators have been feeding U.S. News doctored numbers to nudge up their schools on that hallowed list.
Let's get two things straight: We all love rankings, top ten lists, and contests. And we all know that most of them are kind of stupid. There are some contests where coming in second is a lot worse than coming in first, like presidential elections and World Wars. (It was small consolation to the Kaiser in 1918 that he had almost defeated England.) But what exactly does it mean when Entertainment Weekly ranks Scarlett Johansson as "hotter" than Megan Fox? Exactly as much as it means when U.S. News rejiggers its annual lineup of colleges to place Princeton ahead of Harvard: nothing, not one thing at all.
And that was the case well before we learned that colleges and universities were inflating their statistics to try to get a higher ranking in the supposedly authoritative U.S. News rankings. The truth is that U.S. News, a news magazine that went bankrupt and stopped reporting news, now makes a huge annual business out of ranking American colleges based on criteria that are totally arbitrary.
Look at the "formula" U.S. News uses to rank colleges:
"Undergraduate academic reputation." Some 22.5 percent of each school's ranking is based on ... surveys emailed to administrators at other schools, inquiring about "intangibles such as faculty dedication to teaching." So to find out how dedicated the teachers are at Swarthmore, they ask the provost of Stanford.
"Freshman retention and graduation rate." For U.S. News, "The higher the proportion of freshmen who return to campus the following year and eventually graduate, the better a school is apt to be at offering the classes and services students need to succeed." Or it might mean that a school is timid about what kind of students it admits, and that easy grading makes it very hard to fail. (I went to Yale; just try flunking out of that place. They will do everything short of sending tutors to your dorm room.)
"Faculty resources," which smooshes together numbers like average class size, faculty qualifications, the use of adjunct and part-time teachers, and even faculty salaries, but leaves out how many classes are taught by grad students with halting English.
"Student selectivity." This doesn't measure how smart the students are, but how prestigious the school already is. So a famous school that's the automatic first choice of valedictorians will far outscore another college whose students have equal SAT scores, simply because it's more popular among guidance counselors.
"Financial resources." So schools that charge very high tuition, then plow it back into indoor rock-climbing facilities, will outscore cheap schools that spend their money on books.
"Graduation rate performance." This is a weird one, where U.S. News looks at how well it bet, in a previous edition, that the school would do at improving its graduation rate, and sees if the school beat the point spread. Huh?
"Undergraduate academic reputation ratings," based on ... surveys of high school guidance counselors. So people who have been out of college for many years, who now work in high schools, are providing the "news" about what's happening in universities? How would they know?
"Percentage of alumni who give money to the school." This makes some sense, since it reflects how happy graduates are with what they paid for. A pity it's the last, least important criterion.
What standards should we be using for judging colleges? I can suggest a few: How solid are the school's "core" or "general education" requirements for every graduate? How rigorous are the course requirements for individual majors? For instance, must English majors study Shakespeare, and history majors the U.S. founding? How much intellectual freedom is there in the classroom? Is political speech free on campus? How sane is the dorm life? How safe are students on campus? These are just a few of the questions that college rankings pros seem never to ask. And yet they are the ones that really matter. Why not ask current students and faculty, confidentially, to rate their institutions, and report what they actually say? Their answers will be fudge-proof, free of administrators' tinkering and, as I've often found, sobering.
John Zmirak is Editor-in-Chief of Choosing the Right College and Collegeguide.org, and Senior Editor of The Intercollegiate Review.