
Pilot Study of Online Administration of the Boulder Faculty Course Questionnaire in Spring, 2003

Perry Sailor, PBA
October, 2003

Summary

Selected University of Colorado Boulder full professors and senior instructors were invited to participate in a pilot study of online administration of Faculty Course Questionnaires (FCQs) in spring, 2003. Students in classes whose instructors participated were sent e-mails telling them how they could complete their FCQs on the Web. Ratings and response rates for online administration were compared with ratings and response rates for the usual paper-and-pencil, in-class method used in prior terms by the same instructors in the same courses.

Course and instructor rating averages and standard deviations were no different with online administration than with paper-and-pencil administration. Response rates were substantially lower online: 56%, compared to 74% with paper administration. Open-ended comments indicate that students who responded online much preferred that method; positive comments outnumbered negative ones 78% to 9%, with another 3% neutral and 5% mixed. Students liked the ease and convenience, the opportunity to assign ratings and write comments without time pressure, and the fact that class time was not used. The relatively few negative and mixed comments most often expressed concern about anonymity under online administration, or resentment at having to complete FCQs on the students' own time.

Background and Method

Faculty course questionnaires (FCQs) give students the opportunity to rate their courses and instructors at the end of each term. The Office of Planning, Budget, and Analysis (PBA) administers FCQs for the Boulder, Denver, and Colorado Springs campuses. FCQs are traditionally administered on paper (on scannable forms), in class, during the last week of classes. However, PBA has administered FCQs online, via the Web, for several years for courses that are themselves online or that opt for this format. Most of these courses have been at CU-Denver or in Boulder continuing education.

See the paper form of the FCQ, and the online form. The questions are identical, with the addition of one open-comment question on the online form, "What did you like and not like about this method of collecting course evaluations?"

In October 2002, PBA adopted a goal of moving the Boulder campus entirely to online FCQs within a 10-year period to reduce costs and paper use, reduce processing time, and improve convenience. A number of issues surround online administration. These include system capacity; security or anonymity (and perceived anonymity) of student responses; methods of preventing multiple responses by a single student and responses by individuals not enrolled in a course; student and instructor reactions; response rates; and effects on course ratings. Many instructors believe that moving FCQ administration out of the classroom will lower response rates and will lower course ratings because disgruntled students will continue to respond but more-satisfied students will not.

PBA has worked on the security and technical issues with CU-Denver online sections and other volunteer sections. To begin exploring the other issues, we launched a Boulder campus pilot in spring 2003.

The pilot was by invitation only. We selected a set of course sections for the pilot, notified the instructors, and used online FCQ administration in the selected sections unless the instructor requested otherwise. A handful of other Boulder sections used online administration in spring 2003, at instructor request. Results for these "volunteer" sections are not discussed here.

We selected course sections for the pilot as follows:

  • At least 20 students enrolled. All but one were undergraduate sections.
  • Taught by full professors - because lowered ratings were a risk, and full professors are under consideration for neither tenure nor promotion. Instructors also needed at least two years of previous FCQ data, for comparative purposes. It was subsequently decided at the analysis stage that the best comparisons were to previously taught sections of the same course, but initial selection did not require this.
  • Using WebCT, a campus-supported Web course-management system - because students in courses using WebCT would have at least some familiarity and facility with using the Web for course-related work. ITS provided lists of course sections (including instructor names) using WebCT in spring 2003.

These selection criteria yielded too few sections for the pilot, so we added sections meeting all other criteria but taught by senior instructors. This gave us 27 sections total, covering 23 different courses taught by 22 individuals, all full professors or senior instructors. These individuals are hereafter called "instructors."

We notified the 22 instructors around March 1 by e-mail (see Display A for notification letter text). Three instructors requested that we not use online administration in their courses, and a fourth requested that we remove one of his two courses. We honored these requests. One instructor did not offer a reason for opting out. Reasons given by the other three included:

  • Belief that in a large section in which many students did not attend class regularly, administering the FCQ online would lead to many such students completing the FCQ
  • Concern that response rate would be very low, because WebCT was not actually used in the course
  • Concern that because WebCT use was unpopular in the course, online FCQ administration would draw negative responses

Removing the instructors who opted out left 19 different courses, comprising 23 sections taught by 19 individuals (see Display B for the list).

Analysis

The unit of analysis was the section. The analysis variables were the average (mean) course and instructor ratings for each section in the pilot. A "pretest" average was also calculated by averaging means for all sections of the same course taught by the same instructor between fall, 1998 and fall, 2002. The sections of the four instructors who had not taught the course in that period were dropped from the analysis. In addition, one section was dropped because it was team-taught with an instructor not in the sample and the online form did not reflect this, confusing students as to whom they were supposed to rate; two more sections were dropped because they were team-taught but the online forms named only one instructor. The final set for analysis consisted of 16 paired averages, each spring 2003 (online administration) section rating paired with the average of all sections of the same course taught by the same instructor during the nine previous terms (excluding summers).

Paired t-tests were then done on the following variables: average overall course rating (FCQ item 11, "This course, compared to all your other university courses"), average overall instructor rating (item 12, "This instructor, compared to all your other university instructors"), the average standard deviations within course sections of each, and response rate (completed forms as a percentage of students enrolled). Class size was also checked, since it is known to be strongly related to FCQ ratings, and we wanted to make sure that possible differences in class size between the sections with online FCQs and prior sections wouldn't bias the ratings. Average class size between online-FCQ and previous sections did not differ significantly.
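
As an illustration of the analysis just described, here is a minimal Python sketch (using scipy) that runs a paired t-test on synthetic placeholder values standing in for the actual section means, which are not reproduced here:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Placeholder data for the 16 analyzed sections: each spring 2003
    # (online) course rating paired with the prior paper-administered
    # average for the same course and instructor.
    prior_paper = rng.normal(2.82, 0.35, size=16)
    online = prior_paper + rng.normal(0.05, 0.30, size=16)

    # Paired t-test on the section-level means.
    t, p = stats.ttest_rel(online, prior_paper)
    print(f"paired t = {t:.2f}, p = {p:.3f}")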

Average course ratings and instructor ratings in the online-FCQ sections were slightly higher than in the prior, paper-administered sections, by .05 and .04 points, respectively, but the differences were not statistically significant. Effect sizes were minuscule - less than a tenth of a standard deviation on each measure. The standard deviations themselves also did not differ, so online administration apparently did not affect variability of the ratings. However, the response rate was 19 percentage points lower in the online sections, 54% compared to a prior average of 73%, a statistically significant difference (p<.001). Treating the means as independent, a more conservative test, gave the same results, except the p value for the difference in response rates was .003.
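
The report does not give its effect-size formula; the sketch below uses one common choice, the mean difference scaled by the standard deviation of the prior ratings, again with placeholder data, along with the more conservative independent-samples check mentioned above:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    prior_paper = rng.normal(2.82, 0.35, size=16)   # placeholder values
    online = prior_paper + rng.normal(0.05, 0.30, size=16)

    # Standardized effect size: mean difference in units of the prior SD.
    d = (online.mean() - prior_paper.mean()) / prior_paper.std(ddof=1)
    print(f"effect size = {d:.2f}")   # "minuscule" above means < 0.1

    # More conservative check: treat the two sets of means as independent.
    t_ind, p_ind = stats.ttest_ind(online, prior_paper)
    print(f"independent t = {t_ind:.2f}, p = {p_ind:.3f}")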

Table 1. Averages of analysis variables, and p values associated with mean differences.

Variable               Spring '03 Average (Online)   Prior Average (Paper)   Difference       p
Course rating          2.87                          2.82                        .05         ns
Instructor rating      3.14                          3.10                        .04         ns
Course rating SD        .95                           .94                        .01         ns
Instructor rating SD    .87                           .89                       -.02         ns
Response rate          54%                           73%                        -19%      <.001
N of instructors       12                            12
N of sections          16                            54
N of completed forms   888                           5,714

With the relatively small number of sections with usable data (16, taught by 12 instructors), the possibility of a few extreme cases exerting undue influence on the averages, and thus giving misleading results, must be checked. There was no evidence of that here. Differences between prior- and online-FCQ course ratings for the same courses and instructors ranged from -.82 to +.88, with a balanced distribution between those two endpoints: 56% of the online-administration sections were rated higher and 44% lower than their prior paper-administration counterparts. The pattern was similar for instructor ratings, with 50% of online-administration sections rated higher and 40% lower. And the lower response rate for online-administration sections was found across the board, characterizing all 16 sections (albeit one by only 0.3 percentage points), with 11 of 16 having a gap of 10 percentage points or more.
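
A check like this amounts to inspecting the distribution of the paired differences directly; a minimal sketch, again with placeholder values:

    import numpy as np

    rng = np.random.default_rng(2)
    # Placeholder paired differences (online minus prior paper average).
    diff = rng.normal(0.05, 0.45, size=16)

    print(f"range: {diff.min():+.2f} to {diff.max():+.2f}")
    print(f"rated higher online: {np.mean(diff > 0):.0%}, "
          f"lower: {np.mean(diff < 0):.0%}")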

We computed intercorrelations among class size, response rate, and ratings of courses and instructors, to see if they were different with online administration than with paper. Table 2 shows the results.

Table 2. Correlations among key variables.

Prior to Spring '03 (paper administration)
                     Response Rate   Instructor Rating   Course Rating
Class Size               -.84             -.77               -.73
Response Rate                              .82                .81
Instructor Rating                                             .93

Spring '03 (online administration)
                     Response Rate   Instructor Rating   Course Rating
Class Size               -.44             -.82               -.75
Response Rate                              .53                .55
Instructor Rating                                             .96
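
For readers replicating this kind of analysis, intercorrelation matrices like those in Table 2 are simple to compute with pandas; the records below are invented placeholders, not the study's data:

    import pandas as pd

    # Invented section-level records; not the study's data.
    sections = pd.DataFrame({
        "class_size":        [25, 40, 60, 80, 120, 200],
        "response_rate":     [0.85, 0.80, 0.72, 0.65, 0.55, 0.48],
        "instructor_rating": [3.4, 3.3, 3.1, 3.0, 2.8, 2.7],
        "course_rating":     [3.2, 3.0, 2.9, 2.8, 2.6, 2.5],
    })

    # Pearson intercorrelations among the four variables, as in Table 2.
    print(sections.corr().round(2))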

The correlation between class size and response rate was -.84 with paper administration (i.e., the larger the class, the lower the response rate), but only -.44 with online administration. This is obviously a sizable difference, although due to the small N it is not quite statistically significant at the conventional .05 level (p = .056). The correlation is lower because response rates tended to drop more in smaller classes than in larger ones: the drop averaged 25 percentage points in classes of fewer than 50 students, and 12 percentage points in classes of more than 50. It is probable that the usual high correlation between class size and response rate is due to larger classes having lower attendance, for a variety of reasons. Since completing the FCQ online doesn't require attendance, one would expect a lower correlation, as was in fact the case. That the negative correlation is still somewhat sizable is perhaps due to larger classes containing more non-majors and other students with relatively lower interest, which might translate to a lower likelihood of completing the FCQ.
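
The report does not state how this p value was computed; one standard approach is Fisher's r-to-z transformation, sketched below under the assumption that both correlations are treated as based on the 16 paired sections (an assumption that happens to reproduce a p of about .056):

    import math
    from scipy.stats import norm

    def compare_correlations(r1, n1, r2, n2):
        """Two-tailed test of the difference between two independent
        Pearson correlations via Fisher's r-to-z transformation."""
        z1, z2 = math.atanh(r1), math.atanh(r2)
        se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
        z = (z1 - z2) / se
        return z, 2 * norm.sf(abs(z))

    # Class size vs. response rate: -.84 on paper, -.44 online.
    z, p = compare_correlations(-0.84, 16, -0.44, 16)
    print(f"z = {z:.2f}, two-tailed p = {p:.3f}")   # about -1.91 and .056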

The correlation between response rate and course/instructor ratings also dropped with online administration, from the .8 range to around .5. The usual high correlation between response rate and instructor/course rating is probably due to high response rates being associated with smaller classes, which are more often populated largely by majors and/or upperclassmen who bring a higher interest level to the course and thus may be more predisposed to rate the course and the instructor highly. With the class size-response rate relationship attenuated a great deal by online administration, it is not surprising that the response rate-rating relationship would be attenuated as well, as seems to be the case here.

Open-ended student comments

Content: In addition to the regular FCQ items, there was an open-ended item that asked respondents, "What did you like and not like about this method of collecting course evaluations?" We placed each of the 470 responses received into one of five categories - positive, negative, neutral, mixed, and nonresponsive - according to the judgment of the author of this report. Using these codes, 78% of responses were positive, 9% negative, 3% neutral, 5% mixed, and 4% nonresponsive (i.e., the response did not address the method of FCQ administration, but rather the course itself or something else). By this measure, online administration was overwhelmingly popular. Of course, these responses came entirely from students who did complete the form online; it is possible that students who did not complete the form would have been more negative about online administration.

Positive comments mostly centered on ease and convenience. Many students also expressed appreciation for being able to do the survey on their own time rather than class time, for not feeling rushed and being able to take time to fully articulate their thoughts, and for having more privacy. Several made an interesting point about being able to take more time: while they appreciated it for the class they happened to be rating, they weren't sure their appreciation would extend to having several classes to rate online at their leisure. That, a few students said, would be too much of a good thing, and they would be less likely to complete several online forms than to complete just one.

Ironically, one of the main dislikes cited by the relatively few students who made negative comments was the same as one of the main likes cited in the positive comments: the fact that filling out the FCQ online doesn't take class time. Some students admitted that they like getting out of class, or at least class work, early for FCQ administration; others simply resented having to do FCQs on their own time. A second negative mentioned by many students was the method of notification: many said they almost missed the message with FCQ instructions, either because they rarely check e-mail sent to their university accounts or because they took it for junk mail.

The other main issue stated in the negative (and mixed) comments was anonymity - a number of students were not convinced that responses given over the Web were truly anonymous and could not be traced to them by the instructor. Only two students said they felt that online FCQs were more anonymous than paper ones. In earlier testing some students said that handwriting on the paper forms could threaten anonymity.

Finally, though not really a dislike per se, a great many students correctly surmised that response rates would be lower with online administration. Many also guessed (incorrectly, as far as we can tell) that responses would be pulled toward the extremes with online administration, because only students with very positive or very negative feelings would be motivated enough to respond.

Volume of comments: With online administration we can easily measure the volume of student responses to the four standard open-comment items:

  • The most effective aspects of this course were:
  • The least effective aspects of this course were:
  • The best ways to improve this course would be to:
  • Further comments:

In the 16 sections, an average of 52%, 49%, 49%, and 30% of respondents wrote something on these four items, respectively. Those who did write something averaged 14, 21, 19, and 32 words per response.
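
Statistics like these are simple to tally once responses are stored electronically. A minimal sketch, assuming one item's responses are held as a list of strings (empty where a student wrote nothing):

    # Hypothetical responses to one open-comment item; "" = wrote nothing.
    responses = ["Great lectures and clear notes.", "",
                 "More examples would help.", ""]

    wrote = [r for r in responses if r.strip()]
    share = len(wrote) / len(responses)
    avg_words = sum(len(r.split()) for r in wrote) / len(wrote)

    print(f"{share:.0%} wrote something, averaging {avg_words:.1f} words")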

We do not have similar statistics for the paired on-paper sections. However, informal assessments by FCQ staff suggest that both the proportion of students writing anything, and the length of responses, in online administration exceed those for paper.

Next steps

Some results of the pilot are encouraging for expansion of online FCQ administration:

  • Most invited instructors agreed to participate (or did not disagree)
  • Most students liked doing FCQs online
  • There is no evidence that online administration yields lower (or higher) ratings, or that online FCQs are completed only by students with more extreme views about a course

Other results are discouraging:

  • Some instructors declined to participate - 4 of 22, or 18%, is not insignificant
  • Some students found online administration less convenient, and worried about security
  • Response rates were definitely lower, and some students who did respond said they might not if faced with online FCQs for all their classes

We will work with Michael Grant, associate vice chancellor for academic affairs, on further tests, which may involve random assignment of sections, instructors, or departments to online FCQ administration. We will also more actively solicit and encourage individual instructors and sections to voluntarily opt for online administration.
