Student Learning Outcomes Assessment

By Dr. James E. Mackin

Assessment

The assessment of student learning outcomes is arguably the most important type of assessment that an institution of higher education carries out.  By performing student learning outcomes assessments, the institution is asking:

    1. What have students learned?
    2. How can we tell what students have learned?
    3. How can we use what we know about students’ learning to provide better instruction and assessment?

In order to begin to address these questions, it is imperative that all of your faculty members understand how to express measurable student learning outcomes, and you will likely need to conduct numerous workshops on the topic before you can even begin to organize your student learning outcomes assessment protocols and processes.  A good starting point for the discussion is Bloom’s Taxonomy, which organizes learning objectives into a hierarchy of cognitive levels and suggests the measurable action verbs used to write student learning outcomes.

Student learning outcomes assessment will need to occur in four different contexts at the institution: institutional, general education, academic program, and individual courses.  The assessments can be direct (e.g., some percentage of students performed at a satisfactory level on a multiple-choice question on a test) or indirect (e.g., information gathered from surveys), and they can be course-embedded (e.g., a presentation in a class) or they can be external to classes (e.g., results from implementation of a standardized test).  The question is, how do we put it all together to make some sense out of what we think students are actually learning at our institution?

It is very important that all learning outcomes are embedded in the planning and assessment process.  This means that a learning outcomes assessment plan should be included in each department’s strategic plan for each program that the department offers.  In addition, the department’s strategic plan should include departmental learning goals and objectives that are broadly connected to the learning outcomes in its program-based learning outcomes assessment plans.

A general template for the learning outcomes assessment plans is provided below.

Normally, sections 1-3 of the template will be completed only at the beginning of the life of the relevant strategic plan or upon initial establishment of a new program, while sections 4-6 will be updated on every assessment cycle.  Also, graduate programs will not use section 1, where the program learning outcomes are linked to the institutional learning outcomes.  Graduate programs are more specialized than undergraduate programs and, although we need to ensure that they are delivering on their promises, these programs generally fall outside the realm of institutional learning outcomes.

There are a variety of tools that can be used to assess student learning outcomes in a program, depending on the outcome, and these tools will be expressed in section 2 of the template.  Examples include:

    • Course-embedded rubrics;
    • Applied learning projects;
    • Standardized tests;
    • Capstone presentations;
    • Course projects;
    • Senior surveys;
    • Employer surveys;
    • Alumni surveys;
    • Jury panel surveys;
    • External examiner surveys;
    • Pre- and post-test comparisons;
    • Selected examination questions;
    • Comprehensive exams;
    • Senior theses.

In addition, student learning outcomes will be embedded in each course in an academic program, although the nature of the outcomes and the level of student achievement expected will vary from course to course.  Section 3 of the template provides space for you to compile the level of learning outcome achievement expected in each course in the program: introductory, intermediate, or mastery.  Although not all courses will address all learning outcomes, the combination of all courses in the program should give students the opportunity to achieve mastery of all of the program student learning outcomes.  Ideally, lower level courses provide the opportunity for introductory achievement of learning outcomes, while intermediate and upper level courses provide opportunities for intermediate and mastery achievement, respectively.
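
As a minimal illustration of how a section 3 curriculum map can be checked for coverage, the Python sketch below records the expected achievement level for each outcome in each course and flags any program outcome that never reaches the mastery level anywhere in the program.  The course numbers, outcome labels, and levels shown are illustrative assumptions, not values taken from the template.

    # Hypothetical sketch of a section 3 curriculum map.
    # Course numbers, outcome labels, and levels are illustrative only.
    LEVELS = {"introductory": 1, "intermediate": 2, "mastery": 3}

    # Expected achievement level for each program learning outcome, by course.
    curriculum_map = {
        "BIO 101": {"PLO 1": "introductory", "PLO 2": "introductory"},
        "BIO 250": {"PLO 1": "intermediate", "PLO 3": "introductory"},
        "BIO 380": {"PLO 2": "intermediate", "PLO 3": "intermediate"},
        "BIO 490": {"PLO 1": "mastery", "PLO 2": "mastery", "PLO 3": "mastery"},
    }

    def coverage_gaps(cmap, outcomes):
        """Return outcomes that never reach the mastery level in any course."""
        highest = {o: 0 for o in outcomes}
        for course_levels in cmap.values():
            for outcome, level in course_levels.items():
                highest[outcome] = max(highest[outcome], LEVELS[level])
        return [o for o, level in highest.items() if level < LEVELS["mastery"]]

    # An empty list means every program outcome can be achieved at mastery.
    print(coverage_gaps(curriculum_map, ["PLO 1", "PLO 2", "PLO 3"]))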

A space for the raw data that is used in assessments is not included in the template; however, that data should be stored in easily accessible locations.  Again, as in the case of strategic planning, learning outcomes assessment typically involves massive amounts of data, and it would behoove the institution to invest in software that is specifically designed for outcomes assessments.

For undergraduate programs, every context of learning outcomes assessment should be connected to every other context.  This is the reason that the template includes the section (section 1) where program learning outcomes are related to the institutional learning outcomes.  Not all institutional learning outcomes will be addressed at all levels (introductory, intermediate, mastery) by an individual major program, but the combination of the major program, the general education program and any additional graduation requirements must address the institutional learning outcomes at the mastery level for every student.  This requirement is one reason that many institutions have chosen to address their institutional learning outcomes completely through the general education program.

Because the program learning outcomes are embedded in the strategic planning process, all program learning outcomes will be carried through the review cycle shown in Figure 1.

Figure 1. An Idealized Planning and Assessment Review Sequence.
In this sequence, the academic department plans are reviewed by a group of department chairs/heads in a school/college and then the plans are reviewed by the relevant dean, who incorporates appropriate changes into the school/college plan. By the same token, the school/college plans are reviewed by Deans Council and then by the Provost for incorporation into the Academic Affairs Plan. The institutional oversight committee (in this case, the Budget and Planning Committee) reviews all plans on an annual basis.

A faculty subset of the institution-wide committee (the “Budget and Planning Committee” depicted in Figure 1), or perhaps another committee, should then be tasked with reviewing the status of the institution with respect to the institutional student learning outcomes.  The data for this review would come from the information submitted for the program student learning outcomes assessments.

If there are additional graduation requirements that lie outside the major and general education programs, the learning outcomes assessment data for those requirements will need to be collected and analyzed.

It is possible to address the student learning outcomes for the general education program using the template.  However, it is important to recognize that the general education program is a unique concept in the sense that, at most institutions, there is no single entity, like an academic department, that oversees general education.  It is for this reason that most institutions have the equivalent of a “General Education Committee” and many have a “General Education Assessment Committee” that deals with general education for the institution as a whole.  These committees consist of faculty members from across all the areas encompassed by the general education program, and they usually also include accountable administrators.  As a general rule, the General Education Committee will be the originator of the program (i.e., the committee that approves all new requirements and any changes needed in the program), while the General Education Assessment Committee will be responsible for program review.  Because the responsibilities of the two types of committees are intimately related – one creates and the other reviews – it is often desirable to combine the two into a single committee that handles both sets of responsibilities.

Reviewing the general education program and making changes (i.e., “closing the loop”) is an area where most institutions struggle, simply because of the wide variety of interests that are involved.  The situation is simplified when courses are specifically designed for the general education program, but that scenario is rare in higher education.  More often, courses serve the dual purposes of meeting academic program requirements and general education requirements.  This is one reason that you will want to tie all of your programs, including the general education program, to the institutional learning outcomes.  That strategy gives you the flexibility that you need to essentially serve all masters, while at the same time simplifying and constraining approaches to learning outcomes assessment.

Your General Education (Assessment) Committee can come in at the end of the chain in Figure 1 and specifically review the general education learning outcomes while the program and institutional learning outcomes are also being evaluated, probably by other committees.  In principle, if all of the general education course requirements are also program requirements, it should be possible to compile the assessment information for the general education program from the information that is submitted for the academic programs.  If there are courses in the general education program that are not part of any academic program, then the assessment information will need to be solicited from the relevant course instructors, and separate forms should be developed for that purpose. 
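
Where the general education assessment information is compiled from the academic program submissions, the logic is essentially a bookkeeping exercise.  The Python sketch below, using purely hypothetical course codes and results, pulls the general education courses that already appear in program-level submissions and flags those whose data would need to be solicited separately from course instructors.

    # Hypothetical sketch: compiling general education assessment results from
    # program-level submissions. Course codes and result values are illustrative.

    # Assessment results submitted by academic programs, keyed by course.
    program_submissions = {
        "ENG 101": {"written communication": 0.82},   # proportion satisfactory
        "MATH 110": {"quantitative reasoning": 0.74},
    }

    gen_ed_courses = ["ENG 101", "MATH 110", "PHIL 105"]

    compiled = {c: program_submissions[c] for c in gen_ed_courses if c in program_submissions}
    missing = [c for c in gen_ed_courses if c not in program_submissions]

    print("Compiled from program submissions:", compiled)
    print("Solicit directly from course instructors:", missing)  # e.g., PHIL 105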

A Simple, Robust Assurance of Learning (AoL) System for a Doctor of Business Administration (DBA) Program

By Vlad Krotov, PhD

Client

A young, large private university in the Middle East undergoing initial accreditation with WASC and AACSB.  

Client Requirements

The engagement called for the design of a simple and robust Assurance of Learning (AoL) system for a Doctor of Business Administration (DBA) executive doctoral program offered by the university. The system had to meet the following requirements:

a) Compliant with AACSB Standards. The College of Business where the program was offered was going through initial accreditation with AACSB; therefore, the system had to meet all of the AACSB requirements related to AoL.

b) Simple. The system had to be simple enough that the DBA faculty could quickly understand and contribute to the continuous quality improvement program built on this AoL system. This was important given ongoing changes in the roster of faculty teaching in the DBA program, as well as general changes in the policies and curriculum of this newly established program.

c) Reliable. The system had to produce reliable, useful results. It was important for the system to include a “pre-test” and a “post-test” so that results could be interpreted meaningfully in relation to the program learning goals. The measurement tools also had to capture both quantitative and qualitative results to inform further improvements.

Solution

As the first step, the curriculum of the DBA program was aligned with the five Program Learning Outcomes (PLOs). The results of the curriculum alignment process are provided in the Course Alignment Matrix (CAM) below (see Figure 1):

Figure 1. Course Alignment Matrix (CAM) for the DBA Program

The extent to which Doctor of Business Administration (DBA) students have mastered the learning outcomes of the program is assessed at three strategic points: the METH 1 "Introduction to Business Research" course (Assessment Point 0), RSCH 1 "Research Proposal" (Assessment Point 1), and RSCH 2 "Dissertation" (Assessment Point 2) (see Figure 2 below).

Figure 2. AoL System for the DBA Program

At each of the assessment points, the Research Evaluation Rubric is used to assess student performance. The rubric relies on two other rubrics developed by the College: the General Analytical Writing Assessment Rubric and the Oral Presentation Rubric. In the METH 1 course, the basis for assessment is a student's "mock" dissertation proposal – an exercise in which students, drawing on their limited knowledge of their research domain and a high-level understanding of research methods, describe and present a rough plan for their possible future dissertation research. The assessment is done by the instructor teaching the course, and this assessment point establishes a baseline for student knowledge and skills in relation to the program learning outcomes shortly after students join the DBA program. In RSCH 1, the basis for assessment is the actual research proposal document and presentation. Finally, in RSCH 2, the basis for assessment is the final dissertation document and the final dissertation defense. In RSCH 1 and RSCH 2, assessment is done by the dissertation committee chair.  In both cases, assessment results are submitted to the Program Director for analysis. Subsequent changes in the curriculum are subject to the standard curriculum revision process implemented by the College and presided over by the College Curriculum Committee.
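
As a rough sketch of how the pre-test/post-test logic of the three assessment points can be summarized, the Python fragment below averages rubric scores per PLO at each assessment point and reports the change from the Assessment Point 0 baseline.  The PLO labels, the rubric scale, and the sample scores are assumptions for illustration only, not actual program data.

    # Hypothetical sketch: summarizing rubric scores at the three assessment points.
    # PLO labels, the 1-4 rubric scale, and the sample scores are illustrative only.
    from statistics import mean

    # Rubric scores per student, keyed by assessment point and PLO.
    scores = {
        "Point 0 (METH 1)": {"PLO 1": [2, 1, 2], "PLO 2": [1, 2, 2]},
        "Point 1 (RSCH 1)": {"PLO 1": [3, 2, 3], "PLO 2": [2, 3, 3]},
        "Point 2 (RSCH 2)": {"PLO 1": [4, 3, 4], "PLO 2": [3, 4, 4]},
    }

    # Assessment Point 0 serves as the "pre-test" baseline.
    baseline = {plo: mean(vals) for plo, vals in scores["Point 0 (METH 1)"].items()}

    for point, by_plo in scores.items():
        for plo, vals in sorted(by_plo.items()):
            avg = mean(vals)
            print(f"{point} {plo}: mean {avg:.2f} "
                  f"(change from baseline {avg - baseline[plo]:+.2f})")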

Results

With this simple, robust AoL system, the College was able to “close the loop” for the DBA program in just one year (see Figure 3 below):

Figure 3. Closing the Loop

The results of the newly designed AoL system indicate noticeable growth in the level at which doctoral students master the PLOs across semesters (see Figure 4 below). These improvements are largely the result of recommendations for curriculum and policy changes submitted by the DBA faculty participating in the AoL process, informed by the rubrics used in assessment.

Figure 4. AoL Results Across Semesters

The College believes that this assessment method allows for closer monitoring of individual students with respect to their achievement of the learning goals of the program. The AoL system was received positively by the AACSB Review Team.