An overview of CU-Boulder assessment activities
For NCA self study LMcC, PBA, April 15, 1999
- Assessment in general. See the listing of activities at
http://www.colorado.edu/pba/qis/current/cc8.htm, written for the CU-Boulder submission to
the Colorado Commission on Higher Education (CCHE) quality indicator system. This document,
written in spring 1998, is organized according to the topics CCHE lists under assessment but
includes other topics as well, for a comprehensive overview. Topics discussed include activities
in, and uses of information on or from, the following areas:
- Undergraduate and graduate student learning
- Student persistence and completion of educational objectives
- Placement rates and after-graduation performance
- Student satisfaction
- College, school, and department committees
- Academic program review
- Accreditation reviews
- The CCHE quality indicator system is itself another form of assessment. The commission
specifies 10 "indicators" that all institutions report on; assessment and accountability is one of
these, along with graduation rates, alumni satisfaction, and others. Institutions may also add
their own indicators, and CU-Boulder's submission provides extensive information on indicators
the campus has judged as important. See http://www.colorado.edu/pba/qis for the entire
submission.
- For more extensive information on CU-Boulder's undergraduate outcomes assessment process
and results, see http://www.colorado.edu/outcomes.
Analysis of strengths, weaknesses, opportunities, and threats
Strengths
- Significant activity and scrutiny at multiple levels: course, department/major, college, and campus
- A long-standing, well-established system of academic program review (PRP, program review
panel), which has survived several personnel changes
- Long time series of student survey responses and retention and graduation rates available for
analyses relative to student characteristics and to University actions
- Detailed goal statements for undergraduates in individual majors, published since the early 1990s
in the catalog and in departmental materials
- Departmental assessments of achievements in the major using many different methods, some
exemplary, and consequent changes in undergraduate programs made by some departments.
Full departmental "ownership" of goal-setting, methods of assessment, and assessment use.
- A web site on undergraduate learning outcomes assessment reporting activities by individual
units; the site has been important in earning CU-Boulder a national reputation for undergraduate
outcomes assessment.
Weaknesses
- Compartmentalization: Many on and off campus see departmental activities reported as
"outcomes assessment," department and college curriculum or grad/undergraduate education
committees, retention studies, student and alumni surveys, and PRP as completely independent.
The "big picture" is difficult to see at the faculty, student, and administrative levels. In addition,
activities that could fit under 'assessment' are sometimes not reported as such because they
are labeled differently. We suspect that accreditation activities in engineering (for ABET 2000)
and business are examples.
- Administrative structure: PRP and curriculum committees have well-established administrative
rules and changing personnel, and graduate-student assessments are coordinated by a
committee of the graduate school. In contrast, no formal campus-wide oversight committees
for assessment of courses, assessment in undergraduate majors or general education, or for
guiding survey use and design, have existed for years. In addition, no college-wide committee
in Arts and Sciences examines or even receives undergraduate assessment reports from all A&S
departments. The retention subcommittee of the now-defunct enrollment management team
did lead retention studies through late 1997.
- Note that PRP procedures direct department self-studies to examine their assessment
processes, results, and uses. If this part of PRP were taken more seriously by the review
panel, departments would be more likely to link their various assessment activities and make
explicit use of results.
- An unfortunate reporting cycle: In 1997 the cycle for department reports on undergraduate
assessment efforts was changed from one to two years, with the expectation that each
department would engage in assessment (but not reporting) in both years of the cycle. This
may have saved departments some time spent in pro-forma reporting, but it has caused
widespread confusion and difficulties in continuity, tracking, funding, and reporting.
- Uneven results: Commitment to, use of, and creativity in implementing assessments of student
learning and satisfaction vary enormously across departments, schools, and colleges. These are
especially weak in general education areas that are not the responsibility of a single department.
- A lack of campuswide stated educational goals for undergraduates. The Arts and Sciences core
curriculum requirement listing is the closest thing available to a published statement of
campuswide goals.
- Dependence on external requirements for motivation
- CU-Boulder's undergraduate assessment program started in 1988-89 in response to a state
requirement. The program plan states that the goal is assessments useful for faculty in
individual units as they consider changes in curriculum, teaching, and course design.
Nevertheless, many departments saw their primary purpose as reporting to CCHE, the state
coordinating board. With new legislation, CCHE
now requires only that the campus document the "existence and operation of a formal,
comprehensive, and effective institutional assessment and accountability program" -- not
report on individual departments. Conversations with departments now often begin "now
that we don't have that legislative requirement any more . . . ."
- The graduate school survey was initiated in 1997 in direct response to comments from North
Central Association (NCA, our accrediting agency). While graduate school officials are
interested in and may take action based on some survey results, they often say that the
results are "for NCA."
- This dependence leads to a focus on reporting instead of use. As an example, one
department surveyed its graduating seniors, produced frequencies of all items, and sent that
listing as its report, with no evidence that anyone in the department had attempted any
interpretation or use.
- A change in the campus-wide undergraduate assessment coordinator in early 1998 led to a
natural experiment in which departments were not prompted, nagged, or even reminded
about reporting requirements after May 1998. Sixteen of the 24 departments with reports due by
9/15/98 had not reported by the end of February 1999; some of these have since done so,
after a reminder.
Opportunities and threats (all items listed are both!)
- Increasing external pressures for assessment and accountability from accrediting agencies, the
public, and the state. Opportunity: External pressure may mean more activity. Threat:
Resistance or compliance that is simply pro forma.
- Frequent changes in state directions and directives. We have seen major changes over the last
ten years; with a new CCHE executive director and new governor these could well accelerate.
Opportunity: To move to more meaningful indicators of learning and success. Threat: Make-work activity to meet ever-changing rules.
- A move to use department, school, and college performance indicators in determining internal
budget allocations. These might include whether a unit conducts and uses student-learning
assessments, as well as student satisfaction measures. Opportunity: To provide meaningful
incentives and rewards for departments to engage in true assessment, to make use of results,
and to focus on program-level activities (vs. course level). Could also help align University and
faculty/department goals for academic programs. Threat: Meaningless activities designed
simply to fill slots in tables of measures. Could also lead to the next item.
- Pressures to use assessment measures that are easy to collect and quantify, and are
comparable across departments or institutions. Opportunity: More "user-friendly" information.
Threat: Less meaningful measures.
- Increasing availability of information from peer institutions about activities and outcomes at a
department level. Opportunity: Ability to measure departments vs. similar departments
elsewhere. Threat: Overwhelming amounts of information, with "peer groups" tailored to make
every CU-Boulder department look good.