Assessment methods used by academic departments and programs


This section summarizes and gives brief examples of the assessment methods that CU-Boulder units have used.

Units use many different assessment methods. As with teaching and classroom assessment methods, they reflect differences in normal practice from discipline to discipline, differences in the particular faculty members involved, and practical constraints such as the number of majors a unit has and its typical class sizes.

In their reports, units provide evidence that their procedures have "face validity." Simply put, they must demonstrate that their efforts are not merely self-serving. This usually involves separating the role of evaluator from that of instructor. For example:
  • Faculty at institutions comparable to CU-Boulder review some units' goal statements and assessment procedures and assist in evaluating student papers, portfolios and test results.

  • Questions embedded in classroom exams and assignments are often developed by faculty committees, providing assessment beyond that devised for the particular course by its instructor(s).

  • Student papers and portfolios and performance on embedded questions are independently evaluated by more than one faculty member, at least one of whom is not an instructor in the course from which the material came. In some units, a separate outcomes assessment committee or undergraduate curriculum committee is responsible for these evaluations.

  • Some units use external tests with national norms, such as the Engineer-in-Training Exam or the ETS Major Field Achievement Test in their subject area.

  • Some units developed their assessment methods or instruments with the help of professional staff from Planning, Budget & Analysis (PBA).

The examples that follow illustrate the general types of methods used and various ways of establishing validity. They are only examples--they describe neither every unit's full approach nor everything a unit does. The table lists all the methods reported by each unit since 1989-90 and indicates which are currently used. Details are in the individual unit summaries.

Embedded testing

Questions intended to assess outcomes goals are incorporated into exams or other assignments. Student performance on these items is evaluated as part of outcomes assessment. Because the items are also graded by the instructor as part of the course, students make a genuine effort without facing extra assessment-specific requirements. Some specific examples of this technique are:
  • Classics faculty review students' translations of selected passages in the final exams of classes at various levels, from beginning courses to advanced ones. The students have not seen these passages before. The translations are evaluated by at least two faculty members, one of whom is the course instructor.

  • Assessment questions for several of the Geography department's goals were developed and embedded in the final exams of advanced courses. For each course, at least two faculty members develop the embedded questions and score the students' answers. The responses of senior majors in these courses are given a separate score for each of the department's goals.

Student papers and projects

Samples of student work from a variety of courses are evaluated by faculty or by external reviewers to see how well the students are meeting program knowledge and skills goals. This is a form of embedded assessment--the papers and projects are also graded by the instructor as part of the course, ensuring good effort by the students without the burden of extra assessment-specific requirements. For example:
  • A sample of senior-level art history papers is collected from graduating Fine Arts majors and presented to outside evaluators. Each year's evaluators write comments about the overall quality of the sample, then select twelve papers and discuss them in depth as a team. They rate each paper on a five-point scale for each relevant program goal, and submit an individual evaluation for each student along with a final written report summarizing their findings.

  • Each year, fifteen percent of the papers submitted by senior History majors in 3000-level courses are selected for evaluation by the department's undergraduate studies committee (UGS). Three-member subcommittees of the UGS evaluate the papers against the department's goals, with each member independently rating all the papers in the subcommittee's set. The UGS has refined the rating procedure over the years, increasing rater agreement by defining the scoring categories more rigorously and by systematically modifying the rating scale.

Capstone projects

A capstone course is designed for senior majors in a department. It integrates the knowledge, concepts and skills associated with the entire sequence of study in the program.
  • Independent projects in capstone Chemistry laboratory courses are evaluated by multiple readers. Students report their work both in the form of a written journal article and orally. Written reports are evaluated on the basis of form and quality of writing as well as on content.

  • All French majors must complete a senior research project, the senior essay, which is the culmination of work in all the program's knowledge and skill areas. The oral presentation includes time for specific discussion of the topic in French. The senior essay and oral presentation are evaluated by a committee consisting of the student's essay advisor and one other faculty member.

  • Graduating Women Studies majors taking a capstone critical thinking seminar are asked to collect a portfolio of all materials from their Women Studies courses, and to write a short summary for each course. Specific questions about learning styles and experiences are provided to help guide their reflections. They write a full paper about their experience as a Women Studies major, tracing the main themes and content of their learning experience. In addition, each student is asked to design and carry out a research project on a topic that extends their learning experience. Three faculty independently rate the reflective and research papers, using guidelines and specific questions about how the students' performance reflects each of the program's knowledge or skill goals.

Portfolio evaluation

A wide variety of material such as tests, papers, and research or artistic output is collected during a student's course of study. This portfolio is evaluated at the end of the student's career by an independent jury.
  • Each graduating Theatre and Dance major presents what he or she has done in connection with theatre and dance productions during his or her time at CU-Boulder. A panel of two or three faculty conducts each review, interviewing the student and reviewing any videotapes of the student's performances. Senior seminar essays are also reviewed.

  • The Women Studies assessment described above as using a capstone project is also a good example of portfolio assessment.

Nationally-normed tests

These tests are developed by a subject area's professional association or by professional testing agencies. They are used at many colleges and universities, so CU-Boulder students' scores can be compared to national norms.
  • Educational Testing Service's Major Field Achievement Test (MFAT) for an area is a nationally-standardized multiple-choice exam based on the Graduate Record Exam (GRE). It is designed to assess graduating seniors' knowledge and skills in that area. The Computer Science, Mathematics, and Sociology departments regularly use the MFAT in their areas.

  • The departments of Chemistry, EPO-Biology, and Physics examine the GRE scores of their seniors who take the exam because they are applying to graduate school. All graduating seniors in the Applied Mathematics program are asked to take the GRE in their area.

Other broadly-normed tests used in CU-Boulder undergraduate outcomes assessment include

  • the National Greek Examination made available to graduating Classics majors, the internationally-normed Zertifikat Deutsch taken by graduating German majors, and the ETS Russian Proficiency exam taken by graduating Russian majors.

  • the Fundamentals of Engineering Examination of the National Council of Examiners for Engineering and Surveying. This is also called the "Engineer-in-Training" exam. Seniors in the College of Engineering take this exam as one step in qualifying for professional registration.

  • foreign language exams for Linguistics and Spanish majors, scored against objective criteria defined by the American Council on the Teaching of Foreign Languages (ACTFL).

  • a variety of measures required by the Colorado Department of Education and reported as part of outcomes assessment by CU-Boulder's School of Education.

Unit-specific tests

Some units develop their own diagnostic exams and use them as part of outcomes assessment. Each year's results are compared to those from previous years and to faculty expectations.
  • The Communication faculty have developed a multiple-choice exam covering all of the program's knowledge goals. The test is administered in a required senior-level course, and a faculty committee reviews and updates the exam as needed.

  • Diagnostic exams developed by the department for Theatre and Dance seniors are administered during the required senior seminar.

  • Germanic and Slavic Languages faculty have developed an essay exam on Russian literature and culture that graduating seniors take along with the nationally-normed Russian Proficiency exam.

Pre/post testing

A test or assignment is given at the beginning of a course or program. A similar test or assignment is given at the end to determine student improvement.
  • The University Writing Program collects representative samples of students' initial and final essays each year for evaluation by a panel of UWRP instructors and outside experts in student writing. The UWRP instructors do not evaluate papers from their own course sections. The evaluators rate each of several main characteristics such as the student's ability to state the issue that is to be discussed, to state his or her position on the issue, to give a preliminary summary of the argument to be made, and then to develop the argument.

  • Students entering the College of Music take tests in music history and music theory. These two subject areas represent a common set of courses required of all music majors. When the students complete their music theory and music history sequences, they are given exit exams to determine their growth in the two areas. Scores from the entering and exiting exams for a random sample of each year's graduating seniors are compared as part of each year's outcomes assessment report.

Surveys and interviews

Surveys and interviews are used to gather students' opinions about their educational experiences and experts' opinions about the students' competence.
  • The departments of Chemistry, Communication, Speech, Language, & Hearing Sciences, EPO-Biology, Fine Arts, Geological Sciences, International Affairs, Linguistics, Mathematics, Physics, Psychology, and Theatre and Dance, and the Schools of Education and Journalism and Mass Communication survey or interview their graduating seniors.

  • As part of the self-study required by CU-Boulder's formal program review process, units may ask PBA to survey alumni about their post-graduation experiences and satisfaction with their CU-Boulder education. Some units also use the survey information as part of that year's outcomes assessment.

The department of Communication and the Schools of Education and Journalism and Mass Communication survey student internship supervisors.
  • Education seniors' student-teaching terms are supervised by both university faculty and cooperating teachers (the schoolteachers to whom the student teacher is assigned). At the end of each student-teaching assignment, the cooperating teacher and the university supervisor each complete a final evaluation of the student teacher, rating the student on 18 factors including consistency, knowledge, maturity, planning, responsibility, and speaking and writing skills.

Changes in assessment methods

Units may change their assessment methods as they gain experience. Some units found that their original choice of assessment methods did not produce the most useful information about their particular curriculum. Others found that they needed to focus on a different or broader set of goals. Sometimes a new method simply fits a unit's academic style better than the old one. The examples that follow show some of the methodological changes that have occurred. The table and individual unit summaries identify additional changes.
  • Units such as the departments of Geography and Oriental Languages have regularly expanded their pools of embedded exam questions so as to evaluate more of their programs' goals.

  • The department of History and the School of Journalism are among those that have revised and refined the rating scales used in their assessments.

Some units experimented with a method for a year or so before replacing it with one they felt was more appropriate for their program. Examples:
  • The Department of Communication began by having external reviewers evaluate samples of student work from upper-level exams, but switched to a "home-grown" exam.

  • Women Studies at first used external review of final exams in a capstone course and then, partly at the external reviewer's suggestion, developed its current extensive portfolio review.

EPO-Biology (EPOB) and the Department of Speech, Language, & Hearing Sciences (SLHS) are examples of units that have systematically expanded their outcomes assessment process by adding new methods and modifying old ones. EPOB started with faculty-developed embedded exam questions and later added to and improved the pool of questions, but eventually dropped this approach. Meanwhile, it added external review of seniors' research papers, analysis of GRE and MCAT scores, and an exit survey. SLHS began with questions embedded in course exams, switched to portfolio reviews, and then added review of materials produced by students in a capstone senior course, as well as interviews of exiting seniors.


Last revision 06/02/04

