Risk Management for Business Schools

By Dr. Vlad Krotov and Dr. Jacob Chacko

The 2020 Business Accreditation Standards by AACSB require a business school to “maintain an ongoing risk analysis, identifying potential risks that could significantly impair its ability to fulfill the school’s mission, as well as a contingency plan for mitigating these risks.” With the recent events surrounding the COVID-19 pandemic and its impact on educational institutions around the globe, there is a growing realization among business schools and their leaders of the importance and usefulness of Risk Management in their organizations.  In this article, we briefly discuss the Risk Management Process and offer simple, practical guidelines on how to identify, analyze, and mitigate risks with the help of a formal Risk Management Plan that is aligned with the broader Strategic Management Plan devised by a business school.

Simplifying Assumptions

In this article, we make a number of assumptions in relation to Risk Management (see Figure 1). We believe that these assumptions simplify the Risk Management Process and make it more effective in mitigating identified future risk events.

Figure 1. Risk Management Assumptions

First, we believe that Risk Management is not a “bulletproof shield” that protects a business school against all possible risks. It is rather a tool or method that, if used effectively, can reduce the negative impact of risk on the organization. Risk Management can also be misused and turn into a vain exercise. This usually happens when the Risk Management Process is (a) based on flawed analysis that does not properly identify and analyze important risks, (b) too complex and, thus, impractical, or (c) not backed by the resources required for risk mitigation.

Second, we believe that Risk Management is subjective. It is much closer to art than to science; it is based on subjective reasoning and viewpoints, requires imagination for proper risk identification, and is heavily affected by the “unknowns.” Because of that, we strongly caution against a naïve, overly quantitative approach to Risk Management. We do support a formal, structured approach to risk analysis that makes use of appropriate quantitative and qualitative factors.

Third, we believe that simplicity is the most effective response to the inherent complexity and serendipity of the environment in which many business schools operate. Overly complex, highly structured plans are inherently “fragile” in the face of the uncertain, highly complex, and turbulent conditions that many business schools increasingly find themselves in. Simple, agile plans and structures are more robust and effective during times of turbulence and uncertainty.

Risk Management Process

Risk Management can be defined as a continuous process comprising the following steps or phases: (1) analysis of strategic priorities and relevant internal and external factors; (2) identification and definition of risk-bearing events; (3) analysis of risks based on the likelihood and severity of their impact; (4) mitigation of risks by devising response strategies and actions and assigning people responsible for these actions; and (5) monitoring of risks and periodic reporting on these risks to key stakeholders (see Figure 2). Each of these steps is discussed in more detail in the sections below.

Figure 2. Risk Management Process

Analyzing Strategic Priorities and Relevant Internal and External Factors

Risk Management starts with the analysis of the current strategic priorities. As explained in Standard 1 of the 2020 Business Accreditation Standards published by AACSB, risk management is a part of a broader strategic management process and should be carried out in a way that supports a business school in attaining its strategic goals and objectives. Many of the internal and external risks can be identified by analyzing an organization’s internal strengths and weaknesses together with external opportunities and threats (the so-called SWOT analysis).

Identifying and Defining Risks

After this analysis, the organization should be able to identify and clearly describe the important risk-bearing events it faces in its internal and external environments. Examples of external risks include:

    • Growing competition for students among existing educational institutions
    • Drops in enrollment due to demographic changes
    • Deficit of resources due to worsening economic conditions

Examples of internal risks include:

    • Decreases in funding due to budgeting changes at the university level
    • Inadequate staffing
    • Turnover in leadership

A table with clear descriptions of identified risks should be the main deliverable of the risk identification and definition phase.

Analyzing Risks Based on Likelihood and Impact

While all kinds of risks can and should be identified as part of the Risk Management Process, not all risks have the same estimated likelihood and potential impact. Thus, each risk should be carefully analyzed to determine (1) the likelihood of the event occurring and (2) the severity of its impact (see Figure 3).

Figure 3. Risk Categories

This categorization allows one to prioritize attention and resources across possible future events. Events that are very likely to occur and that could have a great impact on the organization should be treated as critical; they require special attention and resources to prevent their negative impact on the organization. Possible future events with moderate likelihood and moderate-to-high impact should be treated as important; while they deserve adequate attention and resources, as a rule they require less of both than critical events with high likelihood and high impact.  Low-likelihood events with moderate-to-high impact require a moderate level of attention and resources. Events with moderate-to-high likelihood but low impact should be acknowledged and dealt with, but with a minimum level of resources. Finally, events with low likelihood and low impact should be discussed but probably excluded from a formal Risk Management Plan to keep it simple.
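To make the categorization concrete, here is a minimal sketch in Python, with hypothetical ratings, category labels, and risk names (none of this comes from the AACSB standards), of one way the likelihood and impact ratings could be combined into the priority categories described above:

```python
# A minimal sketch (hypothetical ratings and category names) of the
# likelihood/impact categorization described in the text.

LOW, MODERATE, HIGH = 1, 2, 3  # ordinal ratings

def categorize(likelihood: int, impact: int) -> str:
    """Map a risk's likelihood and impact ratings to a priority category."""
    if likelihood == HIGH and impact == HIGH:
        return "critical"        # special attention and resources
    if impact == LOW:
        if likelihood == LOW:
            return "excluded"    # discuss, but keep out of the formal plan
        return "acknowledged"    # deal with, using minimal resources
    if likelihood == LOW:
        return "moderate"        # moderate attention and resources
    return "important"           # adequate, but less than critical

# Hypothetical risk register entries: name -> (likelihood, impact)
risks = {
    "Drop in enrollment due to demographic changes": (HIGH, HIGH),
    "Turnover in leadership": (MODERATE, MODERATE),
}
for name, (likelihood, impact) in risks.items():
    print(f"{categorize(likelihood, impact):>12}: {name}")
```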

Mitigating Risks by Devising Mitigation Actions and Assigning Responsibilities

After analyzing each possible risk event in terms of its likelihood and impact on the organization, possible actions for mitigating these risks should be devised. It is important to assign each risk event a “risk owner,” a person responsible for taking the lead on these risk mitigation actions. More thought and extra planning should go into critical and important events. Key organizational leaders should not be “overextended”; they should be assigned as “leads” only to critical and important risk events.

Monitoring Risks and Establishing Periodic Reporting to Key Stakeholders

People in charge of specific risks should be formally tasked with monitoring the internal and external environment of the business school and with carrying out mitigation actions: proactive actions designed to protect the organization from the possible negative impact of an event, or emergency actions designed to minimize the impact of an event that has already occurred. Without a person responsible for monitoring and mitigating a potential risk event, the organization may find itself in a situation where the event is not identified or dealt with in a timely fashion. Periodic updates by the people assigned to risk events should be sent to the dean. The dean can compile these reports into a formal Risk Management Plan update that is sent to all key stakeholders quarterly, biannually, or annually, depending on the complexity and uncertainty of the environment in which the school operates.

Risk Management Plan

The most important deliverable of the Risk Management Process is a formal Risk Management Plan that is updated periodically, depending on the length of a business school’s Strategic Planning cycle. The main elements of an effective Risk Management Plan are summarized in Table 1 below.

Strategic Goal 1 – Emphasize Faculty & Staff Development

Risk Description | Importance | Risk Owner | Mitigation Actions | Reporting Timeline | Status Updates
Inferior instructional quality in online courses | Critical | Dept. Chairs, Faculty | Comprehensive faculty training, audit of online classes | On-going | All online courses have been reviewed using a standard quality rubric
Failure to attract and retain qualified faculty | Important | Dean, Dept. Chairs | Faculty development opportunities, faculty satisfaction survey | Annual | A formal business faculty development program was established in collaboration with the Faculty Development Center
Failure to maintain appropriate portfolio of qualified faculty | Important | Dept. Chairs, Assoc. Dean | Develop and maintain a faculty resource plan | Annual | A faculty resource plan has been designed in accordance with AACSB definitions
Failure to maintain AACSB accreditation | Moderate | Dean, Assoc. Dean | Ensure adherence to AACSB standards, focus on continuous improvement | On-going | Faculty sufficiency issue has been communicated to the university’s senior leaders
Table 1. Elements of a Risk Management Plan

Note that the plan contains all the outcomes or deliverables of the steps or phases of the Risk Management Process discussed above. Periodic status updates reported by the people in charge of the risk events are appended to each of the identified risks. Another important characteristic of this Risk Management Plan summary is that it is explicitly linked to Strategic Goal 1 found in the Strategic Plan of the business school.
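For schools that prefer to also keep the plan in a structured, machine-readable form, here is a minimal sketch, assuming hypothetical field names, of how one row of Table 1 might be represented as a record to which the risk owner appends periodic status updates:

```python
from dataclasses import dataclass, field

# A minimal sketch (hypothetical field names) of one row of the
# Risk Management Plan in Table 1, kept as a record so that the
# risk owner can append periodic status updates.

@dataclass
class RiskEntry:
    strategic_goal: str
    description: str
    importance: str            # e.g., "Critical", "Important", "Moderate"
    risk_owner: str
    mitigation_actions: str
    reporting_timeline: str    # e.g., "On-going", "Annual"
    status_updates: list[str] = field(default_factory=list)

entry = RiskEntry(
    strategic_goal="Strategic Goal 1 - Emphasize Faculty & Staff Development",
    description="Inferior instructional quality in online courses",
    importance="Critical",
    risk_owner="Dept. Chairs, Faculty",
    mitigation_actions="Comprehensive faculty training, audit of online classes",
    reporting_timeline="On-going",
)
entry.status_updates.append(
    "All online courses have been reviewed using a standard quality rubric"
)
```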

Student Learning Outcomes Assessment

By Dr. James E. Mackin

Assessment

The assessment of student learning outcomes is probably the most fundamentally important type of assessment that an institution of higher education carries out.  By performing student learning outcomes assessments, the institution is asking:

    1. What have students learned?
    2. How can we tell what students have learned?
    3. How can we put our knowledge about what students have learned to use in providing better instruction/assessment of students?

In order to begin to address these questions, it is imperative that all of your faculty members understand how to express measurable student learning outcomes, and you will likely need to conduct numerous workshops on the topic before you can even begin to organize your student learning outcomes assessment protocols and processes.  A good starting point for the discussion is Bloom’s Taxonomy, which classifies levels of learning and supplies the measurable action verbs used to express student learning outcomes.

Student learning outcomes assessment will need to occur in four different contexts at the institution: institutional, general education, academic program, and individual courses.  The assessments can be direct (e.g., some percentage of students performed at a satisfactory level on a multiple-choice question on a test) or indirect (e.g., information gathered from surveys), and they can be course-embedded (e.g., a presentation in a class) or they can be external to classes (e.g., results from implementation of a standardized test).  The question is, how do we put it all together to make some sense out of what we think students are actually learning at our institution?

It is very important that all learning outcomes are embedded in the planning and assessment process.  This means that a learning outcomes assessment plan should be included in each department’s strategic plan for each program that is offered by the department.  In addition, the department’s strategic plan should have departmental learning goals and objectives that are broadly connected to the learning outcomes in their program-based learning outcomes assessment plans.

A general template for learning outcomes assessment plans is provided below.

Normally, sections 1-3 of the template will be completed only at the beginning of the life of the relevant strategic plan or upon initial establishment of a new program, while sections 4-6 will be updated on every assessment cycle.  Also, graduate programs will not use section 1, where the program learning outcomes are linked to the institutional learning outcomes.  Graduate programs are more specialized than undergraduate programs and, although we need to ensure that they are delivering on their promises, these programs generally fall outside the realm of institutional learning outcomes.

There are a variety of tools that can be used to assess student learning outcomes in a program, depending on the outcome, and these tools will be expressed in section 2 of the template.  Examples include:

    • Course-embedded rubrics;
    • Applied learning projects;
    • Standardized tests;
    • Capstone presentations;
    • Course projects;
    • Senior surveys;
    • Employer surveys;
    • Alumni surveys;
    • Jury panel surveys;
    • External examiner surveys;
    • Pre- and post-test comparisons;
    • Selected examination questions;
    • Comprehensive exams;
    • Senior theses.

In addition, student learning outcomes will be embedded in each course in an academic program, although the nature of the outcomes and the level of student achievement expected will vary from course to course.  Section 3 of the template provides space for you to compile the level of learning outcome achievement expected in each course in the program: introductory, intermediate, or mastery.  Although not all courses will address all learning outcomes, the combination of all courses in the program should give students the opportunity to achieve mastery of all of the program student learning outcomes.  Ideally, lower-level courses provide the opportunity for introductory achievement of learning outcomes, while intermediate and upper-level courses provide opportunities for intermediate and mastery achievement, respectively.
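As an illustration, the following minimal sketch (with hypothetical course numbers and outcome names) shows how a Section 3 curriculum map might be checked to confirm that the combination of all courses offers mastery-level coverage of every program learning outcome:

```python
# A minimal sketch (hypothetical courses and outcomes) of a coverage
# check over a Section 3 curriculum map: does the combination of all
# courses let students reach mastery of every program outcome?

LEVELS = {"introductory": 1, "intermediate": 2, "mastery": 3}

# course -> {program learning outcome: expected achievement level}
curriculum_map = {
    "BUS 101": {"PLO1": "introductory", "PLO2": "introductory"},
    "BUS 310": {"PLO1": "intermediate", "PLO3": "intermediate"},
    "BUS 495": {"PLO1": "mastery", "PLO2": "mastery", "PLO3": "mastery"},
}

program_outcomes = {"PLO1", "PLO2", "PLO3"}

# Highest level at which each outcome is addressed anywhere in the program
highest = {plo: 0 for plo in program_outcomes}
for levels in curriculum_map.values():
    for plo, level in levels.items():
        highest[plo] = max(highest[plo], LEVELS[level])

gaps = [plo for plo, lvl in highest.items() if lvl < LEVELS["mastery"]]
print("Outcomes lacking mastery-level coverage:", gaps or "none")
```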

A space for the raw data that is used in assessments is not included in the template; however, that data should be stored in easily accessible locations.  Again, as in the case of strategic planning, learning outcomes assessment typically involves massive amounts of data, and it would behoove the institution to invest in software that is specifically designed for outcomes assessments.

For undergraduate programs, every context of learning outcomes assessment should be connected to every other context.  This is the reason that the template includes the section (section 1) where program learning outcomes are related to the institutional learning outcomes.  Not all institutional learning outcomes will be addressed at all levels (introductory, intermediate, mastery) by an individual major program, but the combination of the major program, the general education program and any additional graduation requirements must address the institutional learning outcomes at the mastery level for every student.  This requirement is one reason that many institutions have chosen to address their institutional learning outcomes completely through the general education program.

Because the program learning outcomes are embedded in the strategic planning process, all program learning outcomes will be carried through the review cycle shown in Figure 1.

Figure 1. An Idealized Planning and Assessment Review Sequence.
In this sequence, the academic department plans are reviewed by a group of department chairs/heads in a school/college and then the plans are reviewed by the relevant dean, who incorporates appropriate changes into the school/college plan. By the same token, the school/college plans are reviewed by Deans Council and then by the Provost for incorporation into the Academic Affairs Plan. The institutional oversight committee (in this case, the Budget and Planning Committee) reviews all plans on an annual basis.

A faculty subset of the institution-wide committee (the “Budget and Planning Committee” depicted in Figure 1), or perhaps another committee, should then be tasked with reviewing the status of the institution with respect to the institutional student learning outcomes.  The data for this review would come from the information submitted for the program student learning outcomes assessments.

If there are additional graduation requirements that lie outside the major and general education programs, the learning outcomes assessment data for those requirements will need to be collected and analyzed. It is possible to address the student learning outcomes for the general education program using the template.  However, it is important to recognize that the general education program is a unique concept in the sense that, at most institutions, there is no single entity, like an academic department, that oversees general education.  It is for this reason that most institutions have the equivalent of a “General Education Committee” and many have a “General Education Assessment Committee” that deals with general education for the institution as a whole.  These committees consist of faculty members from across all the areas encompassed by the general education program, and they usually also include accountable administrators.  As a general rule, the General Education Committee will be the originator of the program (i.e., the committee that approves all new requirements and any changes needed in the program), while the General Education Assessment Committee will be responsible for program review.  Because the responsibilities of the two types of committees are intimately related – one creates and the other reviews – it is often desirable to combine the two into a single committee that handles both sets of responsibilities.

Reviewing the general education program and making changes (i.e., “closing the loop”) is an area where most institutions struggle, simply because of the wide variety of interests that are involved.  The situation is simplified when courses are specifically designed for the general education program, but that scenario is rare in higher education.  More often, courses serve the dual purposes of meeting academic program requirements and general education requirements.  This is one reason that you will want to tie all of your programs, including the general education program, to the institutional learning outcomes.  That strategy gives you the flexibility that you need to essentially serve all masters, while at the same time simplifying and constraining approaches to learning outcomes assessment.

Your General Education (Assessment) Committee can come in at the end of the chain in Figure 1 and specifically review the general education learning outcomes while the program and institutional learning outcomes are also being evaluated, probably by other committees.  In principle, if all of the general education course requirements are also program requirements, it should be possible to compile the assessment information for the general education program from the information that is submitted for the academic programs.  If there are courses in the general education program that are not part of any academic program, then the assessment information will need to be solicited from the relevant course instructors, and separate forms should be developed for that purpose. 
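That compilation step can be as simple as filtering the course-level results that programs already submit down to the list of general education courses. A minimal sketch, with hypothetical courses and benchmark rates:

```python
# A minimal sketch (hypothetical data) of compiling general education
# assessment results from information already submitted for academic
# programs: filter program-level course results down to the gen ed
# course list.

gen_ed_courses = {"ENG 101", "MAT 110", "HIS 120"}

# (program, course) -> share of students meeting the benchmark
program_results = {
    ("English BA", "ENG 101"): 0.82,
    ("History BA", "HIS 120"): 0.78,
    ("Math BS", "MAT 110"): 0.88,
    ("Math BS", "MAT 300"): 0.74,  # not a gen ed course; excluded below
}

gen_ed_results = {
    course: rate
    for (program, course), rate in program_results.items()
    if course in gen_ed_courses
}
print(gen_ed_results)
```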

A Simple, Robust Assurance of Learning (AoL) System for a Doctor of Business Administration (DBA) Program

By Vlad Krotov, PhD

Client

A young, large private university in the Middle East undergoing initial accreditation with WASC and AACSB.  

Client Requirements

The client needed a simple, robust Assurance of Learning (AoL) system for a Doctor of Business Administration (DBA) executive doctoral program. The system had to meet the following requirements:

a) Compliant with AACSB Standards. The College of Business offering the program was going through initial accreditation with AACSB; therefore, the system had to meet all AACSB requirements in relation to AoL.

b) Simple. The system had to be simple enough for the DBA faculty to quickly understand and contribute to the continuous quality improvement program based on it. This was important to accommodate changes in the roster of faculty teaching in the DBA program, as well as general changes in the policies and curriculum of this newly established program.

c) Reliable. The system had to produce reliable, useful results. It was important for the system to include a “pre-test” and a “post-test” to produce meaningful results in relation to the program learning goals. The measurement tools also had to capture both quantitative and qualitative results to inform further improvements.

Solution

As the first step, the curriculum of the DBA program was aligned with the five Program Learning Outcomes (PLOs). The results of the curriculum alignment process are provided in the Course Alignment Matrix (CAM) below (see Figure 1):

Figure 1. Course Alignment Matrix (CAM) for the DBA Program

The extent to which Doctor of Business Administration (DBA) students have mastered the learning outcomes of the program is assessed at three strategic points: the METH 1 “Introduction to Business Research” course (Assessment Point 0), RSCH 1 “Research Proposal” (Assessment Point 1), and RSCH 2 “Dissertation” (Assessment Point 2) (see Figure 2 below).

Figure 2. AoL System for the DBA Program

At each assessment point, the Research Evaluation Rubric is used to assess student performance. The rubric relies on two other rubrics developed by the College: the General Analytical Writing Assessment Rubric and the Oral Presentation Rubric. In the METH 1 course, the basis for assessment is the student’s “mock” dissertation proposal, an exercise in which students, based on their limited knowledge of their research domain and a high-level understanding of research methods, describe and present a rough plan for their possible future dissertation research. The assessment is done by the instructor teaching the course. This assessment point establishes a baseline for student knowledge and skills in relation to the program learning outcomes shortly after students join the DBA program. In RSCH 1, the basis for assessment is the actual research proposal document and presentation. Finally, in RSCH 2, the basis for assessment is the final dissertation document and the final dissertation defense. In RSCH 1 and RSCH 2, assessment is done by the dissertation committee chair.  In both cases, assessment results are submitted to the Program Director for analysis. Subsequent changes to the curriculum are subject to the standard curriculum revision process implemented by the College and presided over by the College Curriculum Committee.
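As an illustration of how such a pre-test/post-test design can be summarized, the minimal sketch below (with hypothetical rubric scores on an assumed 1-4 scale) compares average rubric results per PLO between the METH 1 baseline (Assessment Point 0) and the dissertation (Assessment Point 2):

```python
# A minimal sketch (hypothetical rubric scores) comparing average
# rubric results per PLO across the three assessment points to gauge
# growth from the METH 1 baseline to the dissertation.

from statistics import mean

# assessment point -> PLO -> rubric scores for a cohort (1-4 scale assumed)
scores = {
    0: {"PLO1": [2, 2, 3], "PLO2": [1, 2, 2]},  # METH 1 "mock" proposal
    1: {"PLO1": [3, 3, 3], "PLO2": [2, 3, 3]},  # RSCH 1 research proposal
    2: {"PLO1": [4, 3, 4], "PLO2": [3, 3, 4]},  # RSCH 2 dissertation
}

for plo in sorted(scores[0]):
    baseline = mean(scores[0][plo])
    final = mean(scores[2][plo])
    print(f"{plo}: baseline {baseline:.2f} -> final {final:.2f} "
          f"(gain {final - baseline:+.2f})")
```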

Results

With this simple, robust AoL system, the College was able to “close the loop” for the DBA program in just one year (see Figure 3 below):

Figure 3. Closing the Loop

The results of the newly designed AoL system indicate noticeable growth in the level at which doctoral students master the PLOs across semesters (see Figure 4 below). These improvements are largely the result of recommendations for curriculum and policy changes submitted by the DBA faculty participating in the AoL process.

Figure 4. AoL Results Across Semesters

The College believes that this assessment method allows for closer monitoring of individual students with respect to their achievement of the learning goals of the program. The AoL system was received positively by the AACSB Review Team.