
Assessment for CU-Boulder Units: A Brief Guide
Prepared by the Assessment Oversight Committee, Spring 2002

"It has been said that universities have three undergraduate curricula: The one that appears in the catalog, the one that professors teach, and the one that students actually learn. To what degree does the curriculum asserted on paper or imagined by deans and dons accurately portray what goes on in the minds of students? Making the three curricula visible so they can be brought into register is the business of assessment..." (Southern Illinois University assessment web page). Few documents have produced as much thoughtful change in undergraduate education as has Ernest Boyer's Carnegie Commission report 'Reinventing Undergraduate Education: A Blueprint for America's Research Universities', a document highly relevant to CU Boulder. Through authentic assessment, Boyer's scholarships of discovery, integration, and application may be effectively transformed into the scholarship of teaching.

One of the major trends in undergraduate education across the U.S. is broad recognition of the value of assessment activities that supplement traditional measures of student achievement such as course work and grades. Carefully constructed assessment tools, designed for specific student populations, frequently give faculty extremely valuable information about their educational goals and objectives, information not obtainable by examining student transcripts. As a result of the 1999-2000 North Central Association (NCA) accreditation process for CU-Boulder, only one specific interim report was assigned to the campus: assessment. The Colorado Commission on Higher Education has also included assessment plans as part of its Quality Indicator System.

An Assessment Oversight Committee has been given the task of critiquing, supporting, and revitalizing unit assessment efforts and of preparing a report to the NCA on progress and plans in this area. Our premise is that outcomes assessment should, first and foremost, be useful to CU-Boulder and should help individual academic units

  • evaluate their curricula,
  • plan improvements where necessary, and
  • evaluate the effects of the changes.

Assessment leads to curriculum change

As the examples of student achievement here at CU illustrate, outcomes assessments help units affirm aspects of their curricula and courses that are going well. They also help identify aspects that are not going so well, and they often suggest the kinds of changes that might be needed. About half of the CU-Boulder units doing serious outcomes assessment (and about 40% of all units reporting through the late 1990s) have identified areas for improvement. Seniors' performance in some areas may not be up to the faculty's expectations, or other aspects of the data may suggest changes that would strengthen the unit's undergraduate program.

For example:

  • On exit surveys, Chemistry/Biochemistry students reported that they valued hands-on experiences in advanced labs and independent study courses. In response, more experiments were added to lower-division courses to provide more and earlier hands-on experience, and the faculty now more vigorously encourage students to do independent study. As a result, the number of students doing independent studies has increased by 60%.


  • Faculty reviews of Theatre and Dance senior seminar essays and results on faculty-developed diagnostic exams showed that students were not as well prepared as faculty expected them to be in theatre history or dramatic literature. The faculty revised the sequence of courses in these areas and now emphasize them in the senior seminar.


  • In response to early assessment results, Classics faculty added more sight-reading exercises in introductory Latin and Greek courses. As a result, translation grades in those courses rose noticeably.


  • The Department of English, disturbed by the number of minor grammatical errors in even the best student writing, added a writing component to two introductory courses.

Differences between units must be acknowledged

The diversity of CU-Boulder units is sometimes a problem in an enterprise like this one, but it is more often a benefit. Based on several years of assessment work on this campus, we have learned several things.

  • Units are too different, and too independent, for common methods to be imposed on them. Letting units experiment with methods and discover what provides useful information for them takes time and requires unit-specific discussions, feedback, and consultation. The individual unit summaries show great variation in how units gather and use outcomes information.


  • Units make the best use of approaches that fit their own, and their discipline's, style of gathering and using information.


  • Units learn from each other. One unit's experiences with a method can help another unit decide to try it, or to avoid it.


  • Some central oversight and support are necessary. The process would not have grown as well as it has without a certain amount of central monitoring, prodding, cajoling, assistance, and constructive feedback. However, the benefits of acknowledging units' differences and giving them control of their own outcomes assessment processes would be lost if there were too much central control. Keeping a balance requires constant attention.

  • It is helpful if the coordinators have, or have access to people with, skills in data gathering and interpretation. Units may want or need help with planning assessment methods, gathering information, and making effective use of it. This can be particularly true for units using surveys or student records data.


  • It is important to adapt assistance and feedback to the unit's own research styles and skills. Faculty in some units are very experienced in designing surveys and using survey data, for example. Those in other units may not be, but may still find survey data useful. Some are quite skilled at holistic scoring of essays. Others are not familiar with these techniques. And so on. It is not appropriate, or useful, to treat units with different sets of skills the same way.

Examples of Assessment Findings

The following examples illustrate findings from a few units' most recent annual assessment reports. They are only examples; more details, and results from every unit, appear in the individual unit summaries.

  • In the writing skills component of the College of Arts and Sciences' general education program, internal and external raters agree that student writing at the end of the semester shows clear improvement over work early in the term.


  • The Departments of Computer Science, Mathematics, and Sociology report that their majors' performance on the nationally standardized Major Field Achievement Tests (MFAT) is at or above the national norms for their areas.


  • Upper-division Communication majors' average scores on a specially-constructed test of knowledge in the area are considerably higher than those of non-majors in the same upper-division courses and beginning majors in lower-division courses.


  • Approximately half of the papers from senior History courses reviewed each year by a faculty committee typically receive ratings in the highest category on scales for the department's knowledge goals.


  • Senior-level students in the College of Engineering and Applied Science who take the Engineer-in-Training exam of the National Council of Examiners for Engineering and Surveying pass at rates consistently above the national passing rate.


  • Internship evaluations for Journalism seniors are consistently high, and the raters' narrative comments are quite positive.

Designing, Improving, and Implementing Assessment

The methods, styles, strategies, and aims for undergraduate student assessment can and should encompass a full spectrum of options: standardized national tests, special assignments embedded within senior capstone courses, portfolios and videotapes, specially designed competency exams, evaluation by faculty outside the unit or the Boulder campus, and so on. For units that need to revitalize their assessment processes, we strongly encourage careful examination of the wide range of successful examples posted on the web. Members of the campus Assessment Oversight Committee can also serve as valuable sources of information.

Key elements for a successful assessment effort include (1) unit faculty buy-in plus a competent, conscientious unit assessment leader, (2) regularity of the process, (3) responsiveness to the results, and (4) adequate resources to accomplish the tasks at hand. Assessment goes to the heart of the academic enterprise: what have students accomplished academically, and how well does that match what the unit states, for example, in the catalog? Clearly, faculty must control nearly all aspects of assessment, for it is they who are charged with fostering academic achievement. If the faculty in a particular unit treat assessment as unimportant make-work, the practice will be a waste of time and other resources. On the other hand, units with strong faculty commitment demonstrate decided advantages and improvements in their programs. The effort needs stable, competent faculty leadership within each unit. When assessment results point up weaknesses, the unit faculty must identify actions to remedy them, or the assessment effort goes to waste. Assessment cannot be effective as an occasional or irregularly implemented process. When assessment produces changes, the results of those changes also need to be assessed in a spirit of continual monitoring and follow-up. Lastly, the campus administration must provide adequate resources so that units can implement assessment plans and consequent actions, all aimed at improved undergraduate education.

Some General Considerations and Difficulties Commonly Encountered in Senior Assessment Processes (Draft)

The PBA outcomes assessment website contains many links to materials and models that may be useful to individual units as they work out the most appropriate and effective strategy for themselves. In very general terms, here are some of the most common challenges.

  1. Outcomes assessment focuses on the particular skills, knowledge, and capabilities normally described in the catalog for each major, and on students who are nearing completion of their degree in that major. It does not focus on measures of student 'satisfaction'; other instruments address those issues.


  2. Large departments typically do not have any single course (or even a small group of courses) that enrolls all of their seniors. Getting seniors to demonstrate their mastery of the skills or knowledge desired by the department can therefore be quite difficult. 'Volunteer' exams, for example, whether standardized or custom constructed, draw low participation and commitment even with incentives such as cash prizes, and consequently they may provide a very poor indicator of students' accomplishments. The most effective departmental tool in this area is generally to tie exam performance to a course grade in a substantive manner. For example, a national standardized exam might be given as the 'final exam' in one or more senior-level courses; if it counts for, say, 20% or more of the course grade, students take it seriously. Alternatively, the department might write its own exam and administer it in the same way.


  3. Departments that stress critical thinking, abstract reasoning, problem solving, and similar capabilities as key educational goals may not be satisfied with standardized exams and will want to create their own instruments. Small departments may require a common senior course that all students must take; such courses provide an excellent opportunity for outcomes assessments focused on the department's general goals for students. A key part of this process, however, should be a focus on skills or knowledge not restricted to a single course, and the evaluation of student performance on these assessment instruments should involve multiple faculty members, some of whom should be from outside the unit. Large departments may want to distribute the assessment materials across several senior-level courses.


  4. Assessment has costs. The costs can be direct, in terms of money (e.g., purchase of national standardized exams), or indirect, in terms of clerical, administrative, and faculty time (especially if the process takes place mostly outside of normal course work). On the other hand, there seems to be a strong consensus among specialists in assessment that this is potentially the most powerful, most useful tool available to departments for determining the extent to which they are achieving their common objectives. The Assessment Oversight Committee has a budget to assist departments in meeting some of these costs.


Last revision 03/24/03

