Wednesday, October 04, 2006

US News Rankings and Learning: Statistical Evidence

By Richard Vedder

As readers know from previous epistles, I very much liked the Intercollegiate Studies Institute (ISI) study of civic learning in American universities, because it is one of the very few attempts to explicitly report, school by school, the "value added" by university training with respect to one area of knowledge. I would like the testing expanded both in terms of content and the number of schools examined, and I hope to explore, with other educational reformers whose thinking mirrors my own, ways of making this happen.

I put the CCAP Whiz Kids (Jonathan Leirer, Matt Denhart, James Woodward) to work on some statistical analysis of the relationship between learning in college (as measured by the ISI study) and a school's position in the US News & World Report (hereafter USN&WR) rankings. We looked specifically at the 29 schools for which we had ISI data and that appeared on the national rankings lists for national universities or liberal arts colleges. Did the top-ranked schools (e.g., Princeton, Harvard, Yale, Williams, Amherst) show higher levels of "value added" learning by their students than the kids going to lower-ranked (but generally pretty good) schools?

We found no statistically significant relationship between the USN&WR rankings and student learning in college. To be sure, the raw ISI test results are biased against the prestigious, highly ranked schools: those schools get students who, as freshmen, already do relatively well on the test, and who thus have less room to gain before reaching a perfect score. But even after correcting for that in various ways, no statistically significant relationship emerges. Kids at the top USN&WR schools did somewhat better on the ISI test in their senior year than students at other schools, but that is solely because they knew more as freshmen. Dollar for dollar, learning was far greater at the lesser-ranked schools than at the prestige universities.
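For readers curious about the mechanics, here is a minimal sketch of the kind of regression involved: the freshman-to-senior gain is regressed on a school's rank while controlling for the freshman starting score, which is one way of correcting for the ceiling bias just described. The numbers below are hypothetical placeholders, not the actual ISI or USN&WR data, and the variable names are my own illustration rather than anything from the ISI study itself.

```python
# A minimal sketch of a "value added" regression, using hypothetical
# placeholder data in place of the actual 29-school ISI/USN&WR dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_schools = 29

# Hypothetical inputs: USN&WR rank (1 = best), plus average freshman
# and senior ISI scores for each school.
usnwr_rank = np.arange(1, n_schools + 1)
freshman_score = rng.uniform(50, 70, n_schools)
senior_score = freshman_score + rng.uniform(0, 10, n_schools)

# "Value added" is the senior-minus-freshman gain. Including the
# freshman score as a control addresses the ceiling bias: schools whose
# entering students already score high have less room left to gain.
gain = senior_score - freshman_score
X = sm.add_constant(np.column_stack([usnwr_rank, freshman_score]))
model = sm.OLS(gain, X).fit()

# A coefficient on rank that is indistinguishable from zero (large
# p-value) would mirror the finding reported above.
print(model.summary())
```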

To be sure, the ISI test examined only one area of learning, albeit an important one. The sample of schools was pretty small. The testing might be criticized because the same students were not examined three (or four) years apart, first as freshmen and then as seniors -- different students were being tested. Nonetheless, the results show that the metrics typically used to measure collegiate excellence, such as the USN&WR rankings, are very poor, and we need to take the ISI methodological approach and expand it to provide useful information to parents and policymakers about our universities -- we need to create a good "bottom line."

The Spellings Commission liked that idea, as does, at least rhetorically, Secretary of Education Spellings herself (and presumably the Bush administration). I hope the private sector can run an expanded version of the ISI test, in the hopes of shaming the body politic and the universities into developing a truly national system measuring "value added" in college. This would be a giant step forward in making the universities more accountable, and in making them suffer market consequences where learning is meager in relation to the costs of college. This type of "value added" testing would also be a great way for private for-profit providers to demonstrate that they offer a quality product and deliver learning results.