Now that the U.S. Department of Education has decided to ditch the ratings part of its college-ratings system in favor of a customizable, consumer-focused website, plenty of big questions remain.
What’s the legacy of the nearly two-year effort? What lessons were learned? What opportunities were lost?
We asked several ratings watchers for their views on the department’s change of course. Here’s some of what we heard.
It may have "poisoned" future efforts for college accountability.
When the ratings were first proposed, they were conceived as a comprehensive system to evaluate colleges of all stripes on a host of measures of interest to both students and policy makers, making the ratings both a consumer and an accountability tool.
Andrew P. Kelly, director of the American Enterprise Institute’s Center on Higher Education Reform, says he’s not surprised the department abandoned that approach. "Elaborate federal accountability systems" like the proposed ratings system and the No Child Left Behind proposals for elementary and secondary schools are often unworkable, he says. But when they fail, they can "poison the well for other ideas that are simpler," such as "skin-in-the-game" proposals that the institute, among others, has suggested.
It may have been a useful threat of action that had some positive outcomes.
Mr. Kelly contends that the role the Education Department will now be playing — using federal authority to marshal data from a variety of sources to create an information system "that doesn’t rely on judgment calls by bureaucrats" — is more appropriate for a government agency than what department officials had initially set out to do. "They could have started there and not wasted so much time and energy," he says.
But Robert Kelchen, an assistant professor of higher education at Seton Hall University who advised the department on ratings, says he’s pleased that the process could result in a new set of useful consumer metrics about colleges.
"If there’s good new data, I’d use it in heartbeat," says Mr. Kelchen, who also helped develop the rankings in the annual college guide produced byThe Washington Monthly.
Although the Education Department has not said what data from other federal and private sources it plans to include in its new site, Mr. Kelchen says he’d be especially interested in data on the employment rates of a college’s graduates, the proportion of them who are repaying their loans, and the graduation rates for recipients of Pell Grants.
Some higher-education groups contend that the prospect of a federal ratings system prompted many more colleges to become more publicly accountable about their student outcomes. Mr. Kelchen, for one, doesn’t buy that. "Regardless of what the government does," he says, for one segment of colleges, actions are driven by a desire to look good in U.S. News & World Report's rankings, and for another, "the goal is to stay open."
Terry W. Hartle, a senior vice president at the American Council on Education, believes the prospect of ratings did change some colleges’ behavior, but he says some of that activity was as much a product of the times as a response to the proposed ratings.
As for the Education Department, Mr. Hartle says the fact that it continued listening to "anybody with a point of view" throughout the past 22 months and then made "a very rational decision" to change course was a positive sign. Had it pressed ahead with a poorly designed policy, "it would have hurt their credibility," he says.
It put a spotlight on existing problems with federal data on students and colleges, but it didn't resolve those problems.
From the moment the ratings were first proposed, in a speech by President Obama in August 2013, college officials, researchers, and leaders of higher-education associations questioned the accuracy of the data available to the government to create a ratings system, particularly if they were to be used as the basis for a rating that would eventually affect the awarding of federal student aid.
With so much potentially at stake, "it helped to clarify how incomplete and uninformative" some of the federal data are, says Mr. Kelly. He cites data on colleges’ "average net price" as an example of the latter, since most actual students will pay more or less than that amount, depending on their circumstances.
Even without those high stakes, the adequacy of data "will be an issue in the reauthorization of the Higher Education Act," says Mr. Hartle.
It’s not just the data held in the department’s Integrated Postsecondary Education Data System, or Ipeds, that will be at issue. With the department planning to integrate data from other agencies, assuring the accuracy of that information will be an added challenge. As Mr. Kelly notes, "making new data available that are incomplete or inaccurate is not helpful."
Still, ratings insiders point to a little-noticed part of the process that could yield an important side benefit: In the course of their work, the teams working on ratings discovered that the data in Ipeds were not sufficient for the kind of system they initially hoped to build. That led to the breaking down of some bureaucratic barriers to allow more sharing of data from outside agencies and from within the department itself.
The experience helped demonstrate the value of using government data from several sources to evaluate the impact of potential policies, says Stephen L. DesJardins, a professor of education and public policy at the University of Michigan at Ann Arbor, who worked for the department on the ratings. "Maybe," says Mr. DesJardins, that "moves the needle a bit more" toward better government decision making. Ultimately, he hopes that will make it easier for researchers like him to more routinely get access to such data.
It could create more pressure for the establishment of a student "unit record" data system.
Ratings or no ratings, many higher-education reformers contend that colleges cannot be held accountable without some system that tracks what happens to their students both during and after college. Congress has so far rejected the creation of such a system. "It’s clear we still need greater accountability in higher education," says Ben Miller, senior director for postsecondary education at the Center for American Progress. States are already collecting better data in many cases, he says.
It may have emboldened the so-called higher-education lobby, which includes many who oppose such a system.
David L. Warren, president of the National Association of Independent Colleges and Universities and one of the most vocal critics of the ratings system and other Obama-administration proposals on colleges (he once swore to stand "shoulder to shoulder" with colleagues in opposition to proposals tying federal student aid to colleges’ performance), says the department’s policy shift shows the power of grass-roots opposition.
For months, he says, the department was insisting that it would be imposing a ratings system. "If you stand on principle and you are politically organized," says Mr. Warren, "you can alter the direction of public policy."
It highlighted the challenges of trying to reform something as complex and diverse as American higher education.
"Getting a sense of what you want to accomplish at the outset is the lesson," says Mr. Miller. "It’s very hard to reform the entire sector at once." What’s more, he notes, defining, much less measuring, the nature of "quality" in high education "is even more difficult than we thought."
Even with the change in course, the department still needs to show it can create a system that will actually be of use to, and be used by, students. Having better data will be really valuable, says Mamie Voight, director of policy research at the Institute for Higher Education Policy. "The next important step" is getting people to use it, particularly low-income and first-generation students "who most need this information."
Existing department sites, like College Navigator and the College Scorecard, don’t allow side-by-side comparisons, she says. She hopes that’s a feature of the new consumer tool, and that it doesn’t become just another underutilized government website.