By Richard Vedder
One of the best things, arguably the best thing, that the Spellings Commission recommended was the development of value-added measures of learning by colleges, and the publishing of those results in an easy-to-understand format on the Internet and in other forms to aid parents, students, taxpayers, and policymakers. There are three possible responses from the academy to this recommendation: one, we will not do it because it is inappropriate or impossible to do; two, we agree with you and will work to implement the recommendation; or three, we are already doing it, so this is much ado about nothing.
In today's INSIDE HIGHER ED, Scott Jaschik reports on a meeting of the National Symposium on Postsecondary Student Success, where a number of administrators claimed that the third option above applies. "All these national calls would make you think nothing is happening," according to Jon Young of Fayetteville State University. Well, if it IS being done, why are we not hearing about it? Why are we not working to get a relatively small number of standardized instruments to measure "value added" and report the results IN AN EASY-TO-UNDERSTAND fashion to real people, not academic administrators? I think Margaret Spellings is absolutely right to champion this issue, and I am underwhelmed by what I read about what is really going on.
To be sure, there are issues. I think we should measure the "value added" of what students learn rather than how they feel or how "engaged" they are. There is a case for making critical thinking a second metric worth evaluating. Also, I was pleasantly surprised to read that the North Central Association, a major regional accreditor, is offering an alternative accreditation review that seems to emphasize regular evaluation of "value added" type measures. Although the devil is in the details, this is an excellent move in principle. I hope this idea spreads.