A Better Way to Rank Colleges?
Amidst another scandal surrounding U.S. News & World Report’s college rankings, researchers Christopher N. Avery, Mark E. Glickman, Caroline M. Hoxby, and Andrew Metrick have proposed an alternative: rankings based on students’ revealed preferences. Here’s the abstract:
We present a method of ranking U.S. undergraduate programs based on students’ revealed preferences. When a student chooses a college among those that have admitted him, that college “wins” his “tournament.” Our method efficiently integrates the information from thousands of such tournaments. We implement the method using data from a national sample of high-achieving students. We demonstrate that this ranking method has strong theoretical properties, eliminating incentives for colleges to adopt strategic, inefficient admissions policies to improve their rankings. We also show empirically that our ranking is (1) not vulnerable to strategic manipulation; (2) similar regardless of whether we control for variables, such as net cost, that vary among a college’s admits; (3) similar regardless of whether we account for students selecting where to apply, including Early Decision. We exemplify multiple rankings for different types of students who have preferences that vary systematically.
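For readers curious about the mechanics, here is a rough sketch of how a revealed-preference ranking of this kind might be computed. The paper’s actual statistical model is considerably richer and is fit to a national survey of high-achieving students; this toy Python version simply treats each enrollment decision as a set of pairwise “wins” for the chosen college over every other college that admitted the student, then fits a standard Bradley-Terry paired-comparison model. The colleges and admit sets below are made up for illustration.

```python
from collections import defaultdict

def revealed_preference_ranking(choices, iters=200):
    """choices: list of (college_enrolled, set_of_colleges_that_admitted_the_student).

    Each enrollment is treated as the chosen college "winning" a pairwise
    tournament against every other admitting college. Strengths are fit with
    the standard Bradley-Terry minorization-maximization (MM) updates.
    """
    wins = defaultdict(float)      # total pairwise wins per college
    games = defaultdict(float)     # head-to-head comparison counts
    colleges = set()
    for chosen, admitted in choices:
        colleges.update(admitted)
        for other in admitted:
            if other != chosen:
                wins[chosen] += 1.0
                games[frozenset((chosen, other))] += 1.0

    # A tiny pseudo-count keeps colleges with zero wins from collapsing to 0.
    wins = {c: wins[c] + 0.5 for c in colleges}

    strength = {c: 1.0 for c in colleges}
    for _ in range(iters):
        new = {}
        for c in colleges:
            denom = sum(
                games[frozenset((c, d))] / (strength[c] + strength[d])
                for d in colleges
                if d != c and games[frozenset((c, d))] > 0
            )
            new[c] = wins[c] / denom if denom > 0 else strength[c]
        total = sum(new.values())
        strength = {c: v / total for c, v in new.items()}  # normalize
    return sorted(strength.items(), key=lambda kv: -kv[1])

# Hypothetical data: (college the student enrolled in, colleges that admitted them)
sample = [
    ("Harvard", {"Harvard", "Yale", "Notre Dame"}),
    ("Yale", {"Yale", "Notre Dame", "UVA"}),
    ("Notre Dame", {"Notre Dame", "UVA"}),
    ("Harvard", {"Harvard", "UVA"}),
]
for college, s in revealed_preference_ranking(sample):
    print(f"{college:12s} {s:.3f}")
```

One appeal of this kind of model, which the authors emphasize, is that a college can only climb the ranking by actually being chosen by admitted students, not by turning away applicants to lower its admission rate.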
Eric Hoover of The Chronicle of Higher Education comments on another interesting finding in the paper:
The authors also challenge the assumption that an admission rate is an indicator of desirability. Half of the top 20 colleges in the revealed-preferences list, they found, would fall outside the top 20 if one ranked them only according to their admission rates (the lower the rate, the better, conventional wisdom holds). Notre Dame, for instance, placed 13th on the desirability list, but its admission rate was only the 58th lowest. The University of Virginia placed 20th on the desirability list, but it had only the 76th lowest admission rate.
(HT: Seth Matthew Fishman)