Based on the faculty/staff survey administered in September 2016, the FCQ Redesign Committee compiled frequently asked questions (FAQs) in an effort to provide more information to faculty, staff and students.

Submitted by a Boulder faculty member from the College of Arts and Sciences (Roughly 20% of survey respondents mentioned this in comments)

The FCQ Redesign Project Committee is aware that systemic bias, i.e., the inherent tendency of a process to support predetermined conclusions, can be a problem in instruments that gather student feedback. No FCQ-type instrument has ever fully eliminated bias, because student self-report data will always be susceptible to it. One of the committee's primary goals, therefore, is to analyze data from the new instruments to ensure that any instrument we adopt does not increase bias in professors' ratings based on ethnicity, gender or other characteristics. If possible, we would like the new instrument to lessen the chance that professors of any background receive lower ratings for anything other than the construct we are attempting to measure: teaching quality. The committee is committed to choosing an instrument that improves on the present instrument on several criteria. One of our primary analyses (expected to be finished in late November) will examine the present FCQ for evidence of bias, so that if bias exists in the traditional tool, we can make sure the new instruments are an improvement.

Submitted by a Denver faculty member from the College of Liberal Arts and Sciences (Roughly 15% of survey respondents mentioned this in comments)

One of the challenges in seeking student feedback is that students may not always be objective in the feedback they provide. The FCQ Redesign Project Committee hopes to balance the positive intention of Regent Policy 4-B, which is to provide professors with meaningful and actionable feedback they can use to improve teaching, against the potential for students to base their feedback on factors other than what they objectively observed. One way to accomplish this is to focus students on specific behaviors they have observed in the classroom. Some research shows that student perceptions of teacher effectiveness negatively correlate with other measures of teacher effectiveness (see, for example: Carroll and West (2008); Weinberg, Hashimoto, and Fleischer (2009); and Braga, Paccagnella, and Pellizzari (2014)). It is our hope that steering students away from perception questions will make the feedback provided more actionable and less likely to be clouded by popularity or grade inflation. The analyses of the first and second semesters' pilots will test these assumptions.

Submitted by a Colorado Springs faculty member from School of Public Affairs (Roughly 9% of survey respondents mentioned this in comments)

Research shows that response rates often decrease when surveys are moved to an online format (see Nulty (2008) for an overview). However, the technology vendor we are piloting, Campus Labs (www.campuslabs.com), assures us that it attains high response rates (roughly 80%) when professors follow the same in-class process we have used with paper FCQs across campuses for more than 20 years. By giving students time in class and reiterating how important FCQ information is to faculty development and course improvement, classes using Campus Labs tend to achieve higher response rates than our current online FCQs, which are answered outside of class. Because Campus Labs has optimized the FCQs for mobile devices such as cell phones and tablets, students can respond on any device they have handy. We will develop a full process for encouraging students to respond to the FCQ and will test that process in pilot classes.

For qualitative information, we will continue to provide an open comment box for students to submit additional feedback. Campus Labs also allows professors to add questions to the FCQs, which could let professors seek more focused qualitative information. That said, considerable research shows that adding text-based survey questions has a negative effect on response rates, so professors should be judicious when adding comment-box style questions.

Submitted by a Colorado Springs staff member (Roughly 7% of survey respondents mentioned this in comments)

By focusing on teaching behaviors that correlate strongly with student learning, the FCQ Redesign Project Committee hopes to improve the instrument's ability to provide meaningful and actionable feedback. A body of research shows that evaluations of specific teaching behaviors may correlate more strongly with student learning (see Spooren, Brockx, and Mortelmans (2013)). Further, research shows that students are more accurate at rating specific classroom behaviors than at representing their own learning. The committee hopes that focusing student attention on observed classroom behaviors will make the feedback provided more actionable.

Submitted by a Colorado Springs faculty member from the College of Letters, Arts and Sciences (Roughly 6% of survey respondents mentioned this in comments)

The FCQ Redesign Project Committee has examined the research on students' self-reported perceptions of learning, and a consistent body of work across multiple disciplines suggests that students are not very reliable at expressing how much they have learned (see Carroll and West (2008); Weinberg, Hashimoto, and Fleischer (2009); Spooren, Brockx, and Mortelmans (2013); and Braga, Paccagnella, and Pellizzari (2014)). Because of this research, we are presently exploring instruments that focus on the teaching behaviors professors exhibit in class. Asking students whether they observed behaviors that correlate with good teaching practice should provide professors with more actionable feedback and should reduce the reliability problems of student self-reported perceptions.

Submitted by a Boulder faculty member from the College of Arts and Science (Roughly 4% of survey respondents mentioned this in comments)

Until we analyze data from the new FCQ instruments, we will not have a solid answer to this question. The Boulder and Denver campus pilots ask each student to respond to the old FCQ along with one of the pilot FCQ instruments, which will improve the committee's ability to examine how data from the new tools resemble and differ from data from the present tool. As we complete our analyses, we will post them on this website so that anyone with concerns can review them.

Submitted by a Boulder staff member (Roughly 4% of survey respondents mentioned this in comments)

Multiple professors and staff members wrote variations of this statement. Some professors were concerned that the FCQ is a general tool, and they would prefer more specific tools because all classes are different. We must keep a general tool that allows comparisons across all campus classes, given the spirit of Regent Policy 4-B, which mandates FCQs across all campuses; however, we hope to achieve the requested specialization by giving professors an easier way to ask course-specific questions. The Campus Labs tool we are piloting (www.campuslabs.com) has a much easier interface for professors, departments, or schools to add questions to the FCQs. This functionality will be tested, hopefully in the spring pilot, and, assuming Campus Labs is the tool we ultimately choose for the final FCQ instrument, it should be rolled out to all faculty as soon as viable after testing.

The reduction in paper, and the corresponding elimination of the need to scan up to 600,000 pieces of paper, will reduce processing time and have a huge impact on how quickly results can be turned around. The FCQ Redesign Project Committee expects to build an automated analysis algorithm that will further improve the turnaround time of results. The creation of this algorithm cannot begin until an instrument is selected, so this work will happen later in the project.

Submitted by a Boulder faculty member from the Leeds School of Business (Roughly 4% of survey respondents mentioned this in comments)

The FCQ Redesign Project is limited in scope to the process of administering the FCQ and the instrument itself. Regent Policy 4-B mandates the use of FCQs in system-wide professor evaluations. At present, how those FCQs are used in evaluations is specific to each school and department. There has been discussion of documenting the variations in how FCQs are used across the CU campuses, but that is not presently within the span of control of the FCQ Redesign Project Committee.

The FCQ Redesign Project Committee is committed to making the instrument the best it can be. By focusing on teaching behaviors, we hope to improve how useful the feedback is. Further, by giving professors the option to use the tool formatively, i.e., to distribute FCQs multiple times each semester, we hope to empower professors to adjust how they teach throughout the semester and to engage students in more of a dialogue about how classes are progressing. By engaging students earlier and making them feel more connected to their professors, we believe teaching will improve and students will learn more.

Submitted by a Colorado Springs faculty member from the College of Business and Administration. ​(Roughly 4% of survey respondents mentioned this in comments)

Some professors commented that the FCQ is a general instrument and that they would prefer an instrument tailored to different classes. By leveraging the Campus Labs (www.campuslabs.com) platform, we hope to eventually enable professors to customize their FCQs by adding questions to the instrument. Also, by focusing on teaching behaviors that are good practice across all classes, we hope to reduce the likelihood that questions will not pertain to a given class.

Submitted by a Boulder faculty member from the College of Arts and Sciences. (Roughly 3% of survey respondents mentioned this in comments)

The new technology we are testing does shift FCQs to online forms optimized for mobile devices, but the Campus Labs staff (www.campuslabs.com) recommends that professors allocate class time for students to fill out the FCQs in order to ensure good response rates (Campus Labs reports roughly 80%). Further, there is evidence that using the tool more than once per semester, in a formative fashion, engages students in the process of improving their classes. When students see that professors are making changes based on the feedback provided, they become more likely to provide feedback and to make that feedback more meaningful. We expect to test this formative assessment functionality in later pilot semesters.

The FCQ Redesign Committee acknowledges that our campuses all have increasing numbers of online courses, pre-collegiate courses, and other course types that may require special consideration in the questions we ask and in issues related to response rates. To this end, we hope to find a solution that allows some degree of customization based on course and program needs. We will also be testing online course response rates, specifically at CU Denver, in the hope of addressing the inherently lower response rates seen in these types of classes.

Submitted by a Boulder faculty member from the Program in Environmental Design (Roughly 3% of survey respondents mentioned this in comments)

At present, the difficulty with a mandatory policy is twofold: 1) Can the technology be modified to link FCQ submission to grade release on each campus? 2) Would the leadership of each campus and the system agree that requiring students to respond to the FCQ is appropriate? Alternatively, could simple incentives be developed to encourage higher response rates?

Submitted by a Boulder faculty member from the College of Arts and Sciences (Roughly 3% of survey respondents mentioned this in comments)

The FCQ Redesign Project Committee will work to build a more effective process for communicating both the purpose and the importance of the FCQs to students and faculty. We will also explore ways to communicate aggregated data to students so that they see the results of filling out the FCQ. Research suggests that students are more likely to fill out surveys and FCQs if they see the results of their time and contributions in the aggregate report.