Collecting Data about Student Learning


"For assessment to be successful, it is necessary to put aside the question, 'What’s the best possible knowledge?' and instead to ask, 'Do we have good enough knowledge to try something different that might benefit our students?'"

-Blaich, C. F., & Wise, K. S. (2011). From gathering to using assessment results: Lessons from the Wabash National Study (NILOA Occasional Paper No. 8). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

“[Assessments] are enablers of communication back and forth between students and instructors as the former demonstrates and the latter provides feedback upon learning milestones. The assessments are thus a means to an end, not an end in and of themselves.”

-Trogden, B.G. (2021). Using your sphere of influence to impact culturally responsive assessment. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.


Key Definitions & Frameworks

Three types of data sources are useful to consider when assessing student learning:

  1. Evidence of learning outcomes

    Direct measures of learning

    • These allow students to demonstrate their learning so that faculty can assess how well a program's students are meeting the expected level of proficiency in key skills or knowledge. Examples include capstone projects, papers, standardized tests, observations of students in a clinical setting, and quiz questions aligned with a key area of required knowledge.
      • Embedded assessments are direct measures of student learning that serve two purposes: as a course requirement for students (i.e., a normal work product such as a paper, quiz, or design project) and as evidence for a program's assessment processes. They are most frequently collected from required courses, capstone courses, or key classes where students must demonstrate mastery of a specific learning objective important to the department.

    Indirect measures of learning

    • These gather students' perceptions of and satisfaction with their learning. Common examples are focus groups and student and alumni surveys.

    Multiple methods of assessment

    • Approaches that pair direct and indirect measures are most valuable because:
      • Research indicates that students are not always able to accurately self-assess their learning, so indirect measures alone may give an inaccurate picture.
      • Some outcomes (e.g., attitudes) can be assessed only through surveys, interviews, or focus groups.
      • Indirect measures (e.g., student dissatisfaction with some aspect of their learning experience) can help explain results seen in direct measures (e.g., low student performance on a key learning outcome).
  2. Background information about the students in the course or curriculum (i.e., inputs)
    • For example, what academic preparation do students bring into a course or academic program? What are the demographics of students in a class or program? What are their career or post-U-M educational aspirations? 
  3. Documentation of the learning experience
    • In other words, what is the nature of the learning experience for students?

U-M Data Sources on Student Learning in the Curriculum and Co-Curriculum

This section lists measurement tools that faculty and administrators can use to assess student learning indirectly and directly, for both classroom- and program-level assessment.

Indirect Evidence of Learning

Large Student Surveys Used at U-M

UMAY (University of Michigan Asks You): This survey is open to all U-M undergraduates and reaches a broad cross-section of the student body. Each respondent answers a core set of questions about time use, academic/personal development, academic engagement, evaluation of the major, overall satisfaction, and climate.

UMAY is part of a nationwide study, the Student Experience in the Research University (SERU) survey, based at UC Berkeley. The project is coordinated locally by the Office of Budget and Planning. Annual UMAY reports are shared on the Office of Budget and Planning website.

U-M's 2015 UMAY response rate was approximately 20%.

A sample recent UMAY instrument is available.

Destination Surveys: These studies examine the first experiences of LSA alumni, medical school applicants, and law school applicants after college, looking at the "first destination" of a job, graduate school, volunteer service, or family, and how the university prepared them for that step.

 

The LSA study is conducted annually by the University Career Center. Select findings are presented on the University Career Center website.

Alumni and Exit Surveys

  • Sample exit surveys from U-M departments
  • CRLT also consults with many departments on customized exit/alumni survey design and analysis for assessment. To learn more, please contact Malinda Matney, Managing Director, Educational Development and Assessment Services, at mmatney@umich.edu.

Other validated surveys (some are fee-based)

Focus groups

  • A focus group brings together 8-10 students for a guided discussion in which they reflect on the curriculum. Focus groups can be useful for allowing students to collectively hear other students' experiences and reflect on achievement of key learning goals for a course, curriculum, or educational innovation. CRLT has conducted numerous focus groups for departments and for postsecondary educational grant evaluation. (For a full list, please see CRLT's recent assessment project list.) To learn more, please contact Malinda Matney, Managing Director, Educational Development and Assessment Services, at mmatney@umich.edu.

Sample assessment project at U-M using indirect evidence of learning

  • In Fall 2009, CRLT collaborated with LSA to assess its Quantitative Reasoning requirement. The evaluation was based on a survey of LSA first- and second-year students about the quantitative reasoning gains they reported making in their QR1 or non-QR Fall Term courses. Most of the survey was derived from a University of Wisconsin assessment study of its QR requirement, which validated a survey about student self-reported learning gains against pre- and post-tests of authentic QR-related problems (Halaby, 2005). The instrument was developed by a study team that included UW’s Director of Testing & Evaluation and other quantitative researchers. In addition to the 14 UW gains items, the U-M survey asked students whether they felt that the course met LSA’s goals for the QR requirement, whether they could give an example of an application of the course, and what instructional methods helped them learn. Key findings of the survey are available.

Direct Evidence of Learning

Rubrics are commonly used as an assessment tool for papers or projects.

Quiz questions, linked to specific key learning objectives
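
When each quiz item is tagged with the learning objective it targets, per-objective proficiency can be summarized by pooling item scores across students. Below is a minimal Python sketch of that bookkeeping; the item tags, scores, and the 70% benchmark are invented for illustration and do not come from any U-M system.

    # Minimal sketch: summarizing quiz results by learning objective.
    # Item-objective tags, scores, and the 70% benchmark are hypothetical.
    from collections import defaultdict

    # Map each quiz item to the learning objective it assesses.
    item_objective = {"q1": "LO1", "q2": "LO1", "q3": "LO2", "q4": "LO3"}

    # Each student's score per item, as a fraction of available points.
    scores = {
        "student_a": {"q1": 1.0, "q2": 0.5, "q3": 1.0, "q4": 0.0},
        "student_b": {"q1": 1.0, "q2": 1.0, "q3": 0.5, "q4": 1.0},
    }

    # Pool all item scores under their objective, then average.
    by_objective = defaultdict(list)
    for items in scores.values():
        for item, score in items.items():
            by_objective[item_objective[item]].append(score)

    for objective in sorted(by_objective):
        mean = sum(by_objective[objective]) / len(by_objective[objective])
        status = "meets" if mean >= 0.70 else "below"
        print(f"{objective}: mean {mean:.0%} ({status} the 70% benchmark)")

A report like this can flag objectives where a cohort falls short, which is the program-level signal embedded assessments are meant to provide.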

Concept inventories

  • Concept inventories are reliable and valid tests that are designed to test students' knowledge of key concepts in a field. Often, they can be used to make comparisons in student learning over time (e.g., a student's performance at the beginning and end of a course) or between students at different universities. They are most commonly used in science, math, and engineering.
  • Examples of concept inventories from engineering have been collected by the Foundation Coalition.
  • A list of concept inventories developed for scientific disciplines has been collected by Julie Libarkin, MSU.
  • An example of a department using this type of assessment data is the Mathematics Department. In Fall Term 2008, the department administered the Calculus Concept Inventory, a nationally validated test designed to assess concepts of differential calculus. The test was given in all sections of Math 115 with a pre-/post- design. On the post-test, students were also asked to rate the interactivity level of the classroom and the percentage of time spent on interactively engaged activities. Summary findings are available.
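
For pre-/post- designs like the Calculus Concept Inventory study above, a common summary statistic is the normalized gain, g = (post - pre) / (100 - pre): the fraction of the possible improvement that students actually achieved (Hake, 1998). The Python sketch below shows the calculation; the section names and score values are invented for illustration and are not CCI results.

    # Minimal sketch: normalized gain for pre-/post- concept inventory scores.
    # Section names and scores (percent correct, 0-100) are hypothetical.

    def normalized_gain(pre: float, post: float) -> float:
        """Return (post - pre) / (100 - pre), the share of possible gain achieved."""
        if pre >= 100:
            raise ValueError("Pre-test score leaves no room for gain.")
        return (post - pre) / (100.0 - pre)

    # Hypothetical section-average scores (pre, post).
    sections = {"section_1": (35.0, 55.0), "section_2": (40.0, 72.0)}

    for name, (pre, post) in sections.items():
        print(f"{name}: g = {normalized_gain(pre, post):.2f}")

Because the normalized gain adjusts for where students start, it allows rough comparisons across sections or terms with different pre-test averages.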

Background information about the students in a course or curriculum

LSA Academic Reporting Toolkit (ART) (Requires authentication)

  • An information service that enables LSA faculty to create course-specific data reports on a range of topics related to student performance, such as enrollment and grade histories of students in a given course, enrollment and grade connections between courses, and relations of course grades to pre-college measures (ACT/SAT scores and AP exams). Each tool has customizable input, and each is designed to return anonymous data (no student names or IDs) in both graphical and tabular form. Access to the site is restricted and requires authentication. To request access, please contact Rob Wilke, LSA Management and Information Systems, Dean's Office.

U-M Data Warehouse

  • The U-M Data Warehouse is a collection of data that supports reporting activity for University business. The M-Pathways Student Records Data Set contains academic data for students who have matriculated at the University of Michigan, Ann Arbor. The data includes students' personal information (demographic data), enrollment, courses, grades, degrees, and transfer credit. For more information on the data available, see the student records data dictionary. To request access to these data, instructors should contact their school/college's data steward or the Office of the Registrar.

Documentation of the learning experience 

Common measures to document learning activities include:

  • Syllabi (LSA Syllabus Archive)
  • Instructor reports of key instructional activities

To obtain useful data from these sources, contact Steve Lonn at the USE Lab or Dan Kiskis at ITS.

CRLT staff work with groups of faculty in departments or schools/colleges to collect assessment data that will be useful for educational grant evaluation or curricular decisions. For example, CRLT staff use interviews, focus groups, and surveys of students, faculty, and alumni to provide feedback about:

  • the effectiveness of a curriculum/program as a whole or a particular sequence of courses; 
  • the effectiveness of a unit's training program for graduate student instructors; or
  • a unit's climate for teaching and learning.
More information about the assessment and evaluation services CRLT can provide is available on the CRLT website.

Source URL: https://dev.crlt.umich.edu/assessment-evaluation/collecting-assessment-data