Department of Speech, Language, and Hearing Sciences
In some summaries of assessment activity, goals are referred to by number (e.g., K-2 is knowledge goal 2).
The department has experimented with several assessment methods, finally settling on portfolio review. The first year's assessment, in 1989-90, focused on knowledge and skill goals K-5 and S-1. Second-semester seniors took a 48-item "practice" test prepared for post-M.A. students who would be taking a national accreditation exam. The test was developed by a national group of professionals. In addition, seniors' videotaped oral presentations were evaluated by an outside expert and an SLHS faculty member on scales of 1 (lowest) to 7 (best) for several criteria. The students averaged above the midpoint on the written test, but faculty found it hard to interpret differences among undergraduates' performance on a test devised for post-M.A. students. The ratings of the oral presentations averaged above the midpoints on the 7-point scales, at levels that were acceptable but not as high as the faculty wanted. That year's report indicated that faculty would place greater emphasis on oral presentations in the appropriate courses.
The 1990-91 assessment used questions embedded in course exams to evaluate progress toward goals K-3 and S-3. Multiple evaluators scored answers to questions related to these goals. Again, while the students' performances were judged to be good, the faculty were not satisfied that the assessments gave them the most useful kinds of information. In 1991-92, they began a portfolio assessment, using the expertise of department personnel involved in a federally funded project that used the portfolio approach extensively. This seemed a viable approach for SLHS since the department typically has only 20-25 graduating seniors and faculty are able to meet with each of them individually during the assessment. Three faculty members, at least one outside reviewer, and the coordinator of undergraduate studies provided free-form remarks for each of the portfolios. These remarks were then categorized and summarized. (It became clear during this process that reviewers should be provided with definitive guidelines for evaluation. Developing a clearer rating scale became a goal for the future.) In addition, the department conducted exit interviews with the graduating seniors and obtained GRE scores for six of them.
According to the portfolio reviewers, the students ranged from "good" to "outstanding" in their understanding of clinical principles and the scientific basis of the profession, and in their clinical interactions. They were less satisfied with the students' written and oral communication skills and abilities in critical analysis and synthesis of materials. One departmental response to this in the following years was to urge faculty to require more significant written assignments.
In their exit interviews some students indicated that they would like to have had more direct clinical practica available to them in regular required coursework. Students who expressed this concern to the academic advisor at earlier stages of their program had been advised to seek out an independent study or elective undergraduate internship course, and several followed through on this option. The six students who provided GRE scores in 1991-92 did best on the test's verbal section, with average scores at the 60th percentile on the national scale. They did less well on the analytical (average: 45th percentile) and quantitative (average: 39th percentile) sections. The 15 students who provided GRE scores in 1992-93 performed similarly on the analytical (48th percentile) and quantitative (34th percentile) sections, but less well on the verbal section (41st percentile).
During 1991-92 and 1992-93 the department incorporated its outcomes assessment experience and information in a formal self-study and Program Review. Focus in 1992-93 was on Program Review rather than on the details of outcomes assessment. Portfolio review was continued, but emphasis was more on students' future plans and progress in the profession than on the detailed characteristics of the work in their portfolios. For example, nine (36%) of the 25 seniors had been accepted into graduate programs by the end of the spring semester. Two had elected to postpone attending graduate school and indicated intention to apply for entry fall 1994. An additional four students had completed their major course work in SLHS and were in the process of completing requirements for the School of Education and eventual teaching licenses.
The more thorough portfolio evaluation resumed in 1993-94. This was the first time that fully representative portfolios were available for the graduating seniors. The portfolios included 2 to 3 years of collected documents reflecting the students' written work--samples of essay question responses from in-class exams, responses to take-home exams, research papers, and article critiques. The 1993-94 portfolios were evaluated by an SLHS faculty member and one outside evaluator. A composite score, on a scale of 1 (worst) to 9 (best), was based on separate ratings of accuracy and depth of content information, evidence of integration across content areas, cohesion and clarity in writing, and evidence of critical analysis. The mean rating for the 26 portfolios was 6.75 (range 4.7-8.8).
In 1994-95 the department significantly revised both the undergraduate curriculum and the outcomes assessment process. The faculty felt that the portfolio approach provided a global assessment rather than an evaluation of specific components of the major program. In the new curriculum, a required senior course, Introduction to Clinical Practice, serves as a capstone course stressing the preprofessional nature of the program. Assignments and experiences involving both faculty and professionals in the community address the program's various knowledge and skills goals. The material for evaluation includes written reports of clinical observations and critical evaluations of clinical practices. Informal exit interviews and exit questionnaires provide additional insight from the student perspective. Approaches to using the capstone and exiting-senior information were designed and tested during 1994-95.
The department continued to use a combination of exit interview and capstone course information as the major elements of the outcomes assessment process in 1995-96. This year the capstone experience involved students' developing personal portfolios, which served to supplement evaluation of the cumulative portfolio kept in the departmental advisor's office. Two faculty members independently rated each student's portfolios. The 27 graduating seniors achieved an average rating of 8.7 (range 8.2-9.7, on a scale of 1-10).
The information from the exit questionnaires/interviews provided some insight into the effects of the new undergraduate curriculum, which became effective in the fall of 1994. Students' feedback indicated concerns about the reduction of two courses from three credits to two credits. Students felt that the material warranted a full three credit hours and that they would be less well prepared for graduate work, since other undergraduate programs require a three-hour commitment to these areas.
The questionnaires and interviews also indicated that students were enthusiastic about the recently added undergraduate internship option.
Last revision 07/12/02