Assessment Oversight Committee (AOC) meeting notes 3/22/01
Merrill Lessley - EPOB, Geography, Psychology, Computer Science.
Can see why NCA concluded not doing what we should be. No connection to previous reports. No evaluation of data. Nothing on how assessment has changed curricula. Lots of satisfaction surveys, little else.
Computer Science: Use nationally normed test, MFAT. Compare scores to other institutions' - students perform well. Nothing on what needs improving, nothing on how it's used. ABET more insistent, especially 2nd time - how have they improved.
EPOB: Looks at GRE scores as indicators. Self-selected group. Little meaning re: curriculum performance. No tools for evaluating curriculum. No comparative data with other institutions or with own past performance. Mostly use satisfaction survey.
Geography: Fairly interesting. Embedded questions in senior exam. Faculty grades. Map back to goals. Fac. teams assess how responses relate to curriculum. No clear rationale linking assessment to curriculum changes. They do it, but don't say how. Could use triangulation with other methods. Do annually.
Psychology: Embarrassing. Worst I've ever seen. Complete waste of time. Raw results of satisfaction survey posted now.
Kumiko Takahara - Economics, Engineering, International Affairs, SLHS
Other comments (in email sent to Perry Sailor 1/31/01):
International Affairs Program: Rather unstructured. Graduating seniors' assessment reports draw heavily on the program review report. The department received (1993-4) Student Affairs Research Services assistance to conduct telephone interviews of senior majors and ex-majors. For 1996-7 the department committee members read some of the term papers and group projects from upper-division courses and conducted exit interviews of graduating seniors to monitor their satisfaction.
Economics: Uses a variety of measures, including a national economics exam and portfolios of seniors containing term papers, written exams, and other materials, which are reviewed by a retired economics professor from outside. During 1998-9 they added a survey of graduating economics majors and of Economics departments across the country, and individual exit surveys by the undergraduate advisor. The questions focus on graduating seniors' immediate plans and career goals, student satisfaction, and comments on and evaluation of the faculty.
Speech, Language, and Hearing Sciences: Used portfolio review, exit interviews, capstone course for developing personal portfolios. Seems to change combinations and assessment methodology almost yearly.
College of Engineering and Applied Science: Uses the Fundamentals of Engineering Exam (National Council of Examiners for Engineering and Surveying) except for Applied Math and Computer Science senior majors. This exam and a B.S. from an Engineering program accredited by the Accreditation Board for Engineering and Technology are the first steps toward professional registration. The exam is nationally developed and nationally normed. [Note: Currently the college requires students in only two majors - mechanical and civil engineering - to take the FE exam. PS]
Elease Robbins - Classics, History, Sociology, Business
Classics: Self-regulating. Assess assessment. Internal and external reviewers (external only once, 10 years ago); tools, for Greek and Latin separately, seem to be in constant revision. Thorough, lots of variety of methods. Not up to date. [Grant: Important thing is are they using assessment for change.]
History: Internal and external evaluators. Through 98. Divided into areas of emphasis. American, world, European. Look at which parts stronger. Look at variation between jr and sr faculty, TTT and honorarium faculty. [Kinney: focus evaluation on skills, not knowledge. Based on senior seminar papers.]
Sociology: 1997-98. Use MFT. Compare by area of emphasis within dept. Compare their grads w. others. [universities?] Offered MFT on voluntary basis - only 12, 13 kids when voluntary. Did in upper div. seminar classes earlier. No student evaluations.
Business: No results ever posted.
Gordon Brown - Ethnic Studies, UWRP, Linguistics, Women's Studies
Ethnic Studies: Small number of students, lots of divisions. Only thing done was alumni questionnaire in response to program review. Small response rate. No possible comparison to other universities. [Grant: connected to goals?] Thrust of questionnaires was whether program provided preparation for wide variety of fields.
UWRP: Very serious about evaluation. Change every year, trying to find better way. (Gave list of techniques used over years.)
Linguistics: Evaluate proficiency in 2nd language (one of goals). (Listed measures used over time for proficiency in language structures.) Use standardized tests for proficiency in 2nd language.
Women's Studies: Assessment originally focused on goals - evaluation of exams, projects by internal, external reviewers. Now - portfolio, project evaluated by 3 faculty members. Have faculty meeting to discuss assessment results. [McClelland: not only do faculty discuss what students have learned, but so do students.]
Stephen Jones - American Studies, Fine Arts, Physics, Applied Math
American Studies Program: Faculty evaluate research papers and projects from senior capstone courses required of their majors. The faculty evaluators are not the instructors of the courses. I could not determine when the last evaluation was done or whether it is done every year.
Fine Arts: The two programs are Studio Arts and Art History. In Studio Arts, outside experts are brought in to evaluate and critique the artwork of graduating seniors. In addition to the external review, exit survey forms are solicited from students. This appears to have been in place for some time; there are data for AY 1999-2000. In Art History, there are exit surveys only; there are data for the last three years.
Department of Physics: Since 1989-90 the department has evaluated the knowledge and skill goals by analyzing term papers and project reports in senior-level capstone courses. Some graduating students have been interviewed to hear their assessments. The most recent date listed was '95-96. Physics also does outcome assessment activities relating to non-major undergraduate offerings. The first analysis is done through the Medical College Admissions Test (MCAT). Data here are as recent as 1998.
College of Engineering/Applied Science: With the exception of Applied Mathematics and Computer Science, the College has used the Fundamentals of Engineering Examination of the National Council of Examiners for Engineering and Surveying. This is also called the Engineering in Training exam. The exam is nationally developed and nationally normed. The last data point listed was '96. [Note: Currently the college requires students in only two majors - mechanical and civil engineering - to take the FE exam. PS] There is a new engineering accreditation process to be instituted in the next three years "which is based heavily on a variety of outcome assessment measures." The College is re-examining its outcomes assessment in light of that over the next year.
Applied Mathematics Program: The department has had graduating seniors take the math subject examination of the GRE, but is considering dropping that and participating with the College of Engineering in planning new ways to do outcomes assessment. They have had students participate in a national competition for mathematical modeling and used that as part of their outcomes assessment.
Department of Computer Science: The Department has been using the "ETS Major Field Achievement Test (MFAT) in computer science." It is a nationally standardized test based on the GRE. The last testing was done in 1999.
Mike Grant - EALL, Chem, Theater/Dance
EALL: describe what they do. Good data. Works in the spirit of what we want.
Chemistry: Pre, post knowledge test. Embedded in course exams. Also assess skills in lab. Authentic assessment. Outside reviewers. Only posted up to 1993.
Theater/Dance: Followed professional association's assessment guidelines. Claim improvement as a result of assessment. Results posted through 97-98. First class job.
English: Outside reviews of student essays. No mention of changes made as result of assessment.
Sam Fitch - Film Studies, Math, Critical Thinking (forgot notes; did from memory!)
Film Studies: Did use outside reviewers of student works (films). Reviewers need criteria/guidelines; they don't have any, so all reviewers use their own criteria.
Math: Standardized test. Conclusion: compared to national avg., we're above. They did note problem with norm group comparability. Did not note tracking (or not) of year to year trends. No linkage to curricular changes.
Critical Thinking: General cross-department goals. Only assessment was a one-time shot, early on (while developing requirements). Cornell Critical Thinking Test. Compared performance of students in non-CT courses to those in CT courses. Does this test still exist? Done ca. 1991. [Kinney: was on assessment committee for CT - gathered syllabi, tried to see what CT courses did re: CT.]
Padraic Kinney - Anthropology, French/Italian, Political Science, Applied Math
Anthropology: Did do report in 2000. Look at knowledge. Described methods. Results seem to map grades in courses. Did this through 96. Switched to questionnaire in lower division course (2020). Pre-post.
French/Italian: French: "Ideal for type." Up to '96. Ideal because small program, small number of students, all known to faculty. Senior essay or honors thesis. Present before faculty. Part conducted in French. Italian: Assessment at its worst. In '93, said they'd do what French does. But no report since.
Political Science: Amazing document. Reads like scientists planning attack on cancer. Multiple stages, plans of attack. [Fitch: surveys most useful tools, not just satisfaction - have list of skills gained, needed. But students say (on survey) that they have learned critical thinking. But faculty doesn't think they do. So asking students may not be best way to assess. One year to get faculty to agree on scoring critical thinking.] [Someone: different for departments where students typically declare major as frosh (pol. sci.) vs. jr/sr (history).]
Applied Math: Find out what best students do. GRE, students who go to some competitions. When? '96? Don't look at entire population.
PBA: PS-- L:\IR\OUTCOMES\aoc\notes010322a.doc - 5/14/01.
Last revision 03/01/02