We applaud the desire of university presidents to improve student learning and accountability on their campuses, although the idea is hardly new. A number of institutions in the alliance, in fact, distinguished themselves by their work in assessment and accountability for student learning before the organization was formed.
Does progress in these areas require another coalition? The alliance offers a four-point plan for presidents to gather, use, report and publicize student learning outcomes. What’s been stopping all institutions from doing this long before now?
The availability of appropriate tools has not been an obstacle. Hundreds of institutions around the nation—indeed, many within the alliance—use effective and remarkably inexpensive instruments to evaluate student learning, such as the Collegiate Learning Assessment (CLA), the Measure of Academic Proficiency and Progress (MAPP), and the Collegiate Assessment of Academic Proficiency (CAAP).
The CLA in particular has generated important findings about undergraduate cognitive growth in different academic programs, and it is widely used among participants in the Association of Public and Land-Grant Universities’ Voluntary System of Accountability. For some years the South Dakota Board of Regents has used the CAAP to make general education assessment of students at its state colleges and universities a mandatory element of performance-based funding.
Rather than creating a new coalition and spending money on national conferences to celebrate their successes, wouldn’t it be better for campus leadership to devote limited time and resources to the hard but productive work of serious data collection and analysis? By serious data collection, we don’t mean portfolios—which, notably, are among the options the alliance considers. Portfolios, labor-intensive as they are, offer at best supplementary material for understanding an undergraduate’s development. At worst, they evade and distract from clear measures of academic progress. What higher-education leadership needs is objective assessment of academic value added, along with outcome measures such as licensure and professional examination results. That is well within reach.
Finally, it does not suffice to “[ensure] that at least once a year the governing board…receives and discusses a report on efforts,” as the outline states. Governing boards and the campus community need clear, reliable data that they can see at a glance in dashboard indicators as well as interpreted in a full report. Underlying any report on student learning must be accountability based on performance data. Press releases, national associations, and conferences don’t count.