The Education Research Program at the SSRC began organizing the Measuring College Learning (MCL) project in 2013 as an opportunity to engage faculty and the broader higher education community in an effort to develop tools to understand and improve discipline-specific student learning. Faculty are often absent from policy debates about student learning, and most current attempts to articulate and demonstrate learning have focused on generic competencies, not the discipline-specific learning that occurs in students’ majors. In the MCL project, faculty and other experts came together to consider what students should learn in their majors and how that learning should be measured. Panels of experts from six disciplines participated in the project: biology, business, communication, economics, history, and sociology. A volume on the work of the expert panels, Improving Quality in American Higher Education: Learning Outcomes and Assessments for the 21st Century, was published by Jossey-Bass in late May 2016.

“The learning outcomes frameworks represent a significant national consensus among faculty today and aptly showcase the kinds of high-level skills that society demands of today’s graduates.”

The MCL project was informed by the work of the CLA Longitudinal Study, an earlier SSRC project which measured generic competencies. The books Academically Adrift and Aspiring Adults Adrift charted the findings of the study, arguing that college students were not making the critical gains expected from a college education. Critiques of the work noted that the CLA, which measures generic skills, did not consider the role of discipline-specific learning. The MCL project emerged from that feedback, attempting to define “essential concepts and competencies” for undergraduate discipline-specific learning. The essential concepts and competencies frameworks, as defined and articulated by our faculty expert panels, outline ideas that are fundamental to the disciplines. The skills and practices articulated are by no means comprehensive, but take an important step in restructuring how assessment is thought of: placing faculty, rather than administrators, at the center. The learning outcomes frameworks represent a significant national consensus among faculty today and aptly showcase the kinds of high-level skills that society demands of today’s graduates. The Social Science Research Council has a rich history of disciplinary partnership, and the MCL project echoes that larger mission.

In addition to addressing discipline-specific learning rather than generic skills, the MCL project brings attention back to student learning, rather than the “value of college” as measured by labor market outcomes. By reframing how the value of college is understood and articulated, the project asks whether increased attention to student learning could better prepare students for life after college, rather than looking solely to employment and retention rates.

Given legitimate concerns and sensitivities to counter-productive accountability regimes that have been foisted upon educators, the MCL project was carefully organized around the following five principles: faculty should be at the center of defining and developing learning goals for undergraduates; students from all backgrounds and institutions should be given a fair opportunity to demonstrate their knowledge and skills when transferring from one institution to another and when transitioning into the workforce; measures of student learning should be rigorous and high quality and should yield data that allow for comparisons over time and across institutions; assessment tools should be used by institutions on a voluntary basis; and any single measure of student learning should be part of a larger holistic assessment plan.

Generic and discipline-specific competencies

In our work we were surprised by the degree of agreement among faculty as they defined learning outcomes, drawing on prior work and their own deliberations. Our faculty expert panels were quite adept at identifying and defining the essential elements of student learning in their disciplines. Additionally, faculty were eager to move beyond rote content knowledge and the memorization of facts.

“The first step in moving beyond content knowledge was to look, instead, at concepts and competencies.”

The faculty panels in our project recognized that the first step in moving beyond content knowledge was to look, instead, at concepts and competencies. The MCL project’s articulation of learning outcomes exists at the intersection of concepts (what students know and understand) and competencies (what students are able to do). Collectively, concepts and competencies articulate the habits of mind and practices characteristic of a specific discipline. While our MCL expert panels were not asked explicitly to address values, in many ways concepts and competencies nonetheless serve as a barometer for disciplinary values.

Given our focus on discipline-specific learning, we anticipated that a primary challenge would be navigating the overlap between generic and discipline-specific competencies. Since a number of existing initiatives are dedicated to defining and measuring generic competencies, we encouraged the MCL faculty panels to clearly define disciplinary habits of mind. The concepts that emerged are discipline-specific, but the competencies, although embedded in the disciplines, tend to overlap and resonate with more generic skills, such as critical thinking, analytical writing, quantitative reasoning, and problem solving.

“To what extent can certain higher-order skills be measured in the absence of specific domains?”

There is a long-standing debate in the teaching, learning, and assessment community about the role of the disciplines in the assessment of college learning: that is, to what extent can certain higher-order skills be measured in the absence of specific domains? Although generic tools attempt to measure critical thinking skills, college-level courses are, by and large, discipline-specific. As such, measurement that accounts for disciplinary habits of mind is much needed. While students are developing broader skills, such as analysis, writing, and critical thinking, they are typically learning those skills in a discipline-specific context. Practicing skills such as problem solving and analytical writing in discipline-specific contexts enables students to become strong disciplinary actors, while also providing a solid educational foundation for tasks they will face outside disciplinary domains. The contribution of the MCL project to this debate is to illuminate the connections between generic and discipline-specific competencies and to highlight how disciplinary ways of thinking align with broader concerns about developing students’ ability to analyze, integrate, and problem solve.

The white papers

The published volume comprises a series of white papers, each articulating the essential concepts and competencies in one of the six MCL disciplines. These white papers promise to serve as valuable resources for department-, discipline-, and institution-level efforts to enhance the quality and intentionality of undergraduate education. In addition to the white papers themselves, the project demonstrated its openness to critique by inviting experts on learning in higher education to publish essay commentaries on the project as a whole within the larger volume.

Essential Concepts & Competencies - MCL Framework

Each of the MCL white papers is unique, but all of them share a similar overarching structure. They begin with an overview of prior and ongoing efforts to define and measure learning outcomes in the discipline. Next, building on these efforts and integrating new insights from the MCL panel discussions, each paper articulates the essential concepts and competencies for undergraduate student learning in the discipline. Each white paper then presents a persuasive and creative vision for the future of assessment in the discipline. While we found overlap in more generic competencies across disciplines (e.g., quantitative reasoning), there was also distinctive variation in the competencies that the groups articulated (e.g., “appreciate and apply the interdisciplinary nature of science” is unique to biology, while “evaluate historical accounts” is unique to history). Despite differences in the competencies emphasized in each learning outcome framework, all six disciplines advocated for a greater emphasis on competencies and disciplinary habits of mind rather than content knowledge. The essential concepts and competencies frameworks highlight the extent to which education is best understood not as a series of facts to be memorized, but rather as the development in students of deeper ways of understanding and acting in the world.

Overall, the papers advocate for the development of rigorous, twenty-first century assessment tools that are closely aligned with the concepts and competencies that are valued by faculty, students, and society. The papers encourage educators to address calls for accountability head-on, using multiple forms of evidence to drive continuous improvement at the classroom, department, and institutional level. In sum, the MCL white papers are a series of forward-thinking accounts of how undergraduate education should be organized and evaluated in contemporary US society.

The Measuring College Learning project is committed to the idea that the articulated frameworks are part of an iterative process that will evolve and change over time. As part of our plans for phase two of the project, we envision building out and piloting assessments based on the learning outcomes frameworks presented in the white papers. We look forward to the work to come, as part of a collaborative process with faculty, disciplinary associations, and other stakeholders concerned with improving student learning in higher education.

Under a special arrangement with the publisher, the SSRC is making all the materials from the volume available for free download via the Education Research Program’s website.

Posted on June 21, 2016