Mike Hannigan
Mechanical Engineering
Course Context:
Data Analysis, MCEN 3037, a junior-level required course that focuses on the assessment of measurement uncertainty and on the assessment and analysis of data, including confidence intervals, hypothesis testing, regression/correlation, and analysis of variance. This was the third time that I taught this class. There were 105 students in the class.
What did I do?
Target: Improve student engagement with the course and topic. This has been a class where students don't understand why they have to take it, as it seems to dangle by itself in our curriculum (at least to the students). I also wanted to move more students to a higher level of knowing (cognition), from memorized use of equations to application of concepts in open-ended situations.
What did I do? (1) Using the idea of a 'threshold concept', I redesigned my syllabus to include information about the threshold concept, the learning objectives, and how the course was designed to meet those learning objectives. I used the syllabus throughout the term to reinforce those ideas. (2) I implemented several classroom assessment techniques (CATs) with the hope of having students more engaged with the course; in other words, having them read, think about, and apply the content more routinely, not just when they do homework assignments or take exams. The CATs employed were 5-minute papers, surveys, online discussion questions/responses, in-class think-alouds, and in-class workshops. (3) To emphasize the role this course plays in their broader education, I had the students write reports about historical figures in the field, in the first person, with the goal of describing their work (not writing a biography). I also had the students peer-assess those reports; my goal for this exercise was to expose them to more of each other's work, not really to use their assessments. Finally, I had them interview a practicing engineer to learn more about what life after college is like and to hear what practicing engineers think about their education.
What happened?
I am curious to see the FCQ results, but I did take a survey on the last day of class to gauge students' perceptions of the effectiveness of the new CATs and of the more typical interactions. A summary table appears below. For the first five columns, students rated each course interaction from 1 to 5, with 5 being incredibly effective, while in the final column students ranked the interactions, with 1 being the best. The interactions are in the rows, with the first four rows (shaded in the original table) representing the typical interactions used previously. The first column gauged students' perception of how well each interaction kept them up to date with the course throughout the term. Students thought that homework and exams kept them up to date, with workshops the most effective new CAT at doing so. The second and third columns assessed students' interest in the course and in an engineering career. Lectures were the most effective at keeping/getting the students interested in the course content – nice to see all the hard work of lecture prep with examples coming through! Workshops were the most effective of the new CATs. As for interest in an engineering career, the interview was easily the highest rated, with lectures a distant second. The fourth and fifth columns assess the ability of an interaction to help in understanding the course concepts and the engineering career. Students thought that homework was the most effective interaction at facilitating understanding of the course concepts, but the text, exams, lectures, and workshops were all rated highly. As for understanding of the engineering career, the interview was highly rated, with lectures a distant second. Overall, the students found the four typical interactions (homework, text, lectures, and exams) to be the most effective, with the workshop the best of the new CATs.
(Ratings are 1 to 5, with 5 = incredibly effective; the Overall column is a rank, with 1 = best. The first four rows are the typical interactions used previously.)

                         Up to    Interest         Understanding     Overall
Interaction              date     Course  Career   Course  Career    rank
Homework                 4.1      3.0     2.6      4.2     2.8       2.6
Text                     3.6      2.7     2.4      3.9     2.5       3.3
Lectures                 4.0      3.4     2.9      3.7     3.0       3.5
Exams                    4.1      2.9     2.3      3.9     2.6       3.5
Workshops                3.8      2.9     2.5      3.7     2.6       4.1
Discussion Questions     3.2      2.5     2.3      2.9     2.3       5.9
Think Aloud              2.9      2.3     2.0      2.8     2.0       6.3
Interview                2.0      2.8     3.4      2.1     3.8       7.2
Alias Name Report        1.6      2.6     2.1      1.7     2.0       8.1
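The best-rated interaction in each category can be pulled straight from the table. As a quick sketch (the values are transcribed from the table above, in the same column order), a few lines of Python recover the category winners noted in the discussion:

```python
# Survey summary: ratings 1-5 (5 = most effective). Values transcribed
# from the summary table; "Overall" rank column omitted here.
ratings = {
    "Homework":             [4.1, 3.0, 2.6, 4.2, 2.8],
    "Text":                 [3.6, 2.7, 2.4, 3.9, 2.5],
    "Lectures":             [4.0, 3.4, 2.9, 3.7, 3.0],
    "Exams":                [4.1, 2.9, 2.3, 3.9, 2.6],
    "Workshops":            [3.8, 2.9, 2.5, 3.7, 2.6],
    "Discussion Questions": [3.2, 2.5, 2.3, 2.9, 2.3],
    "Think Aloud":          [2.9, 2.3, 2.0, 2.8, 2.0],
    "Interview":            [2.0, 2.8, 3.4, 2.1, 3.8],
    "Alias Name Report":    [1.6, 2.6, 2.1, 1.7, 2.0],
}
columns = ["Up to date", "Interest: Course", "Interest: Career",
           "Understanding: Course", "Understanding: Career"]

# Highest-rated interaction per category (ties broken by dictionary order,
# e.g. Homework and Exams are tied at 4.1 for "Up to date").
for i, col in enumerate(columns):
    best = max(ratings, key=lambda name: ratings[name][i])
    print(f"{col}: {best} ({ratings[best][i]})")
```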
Backing up and looking at the big-picture view, the class was more enjoyable than in past years. I had more fun during lecture and with the course in general, and the students seemed happier on the whole. It felt more like teamwork than us against them. I think the syllabus redesign set a great tone for the course, as I found us having more in-class discussions about 'why' and 'how' we are learning this stuff. The fact that I was employing more CATs than ever emphasized to the students that I was on their side. I am not sure if they got to a higher level of knowing, but ... it was more enjoyable.
I did do a midterm survey to ask the students what they thought about the effectiveness of the CATs, most specifically the most involved one: the think-aloud. I learned that the students in the audience found the think-alouds ineffective, as they got confused by their fellow students, although the participating students found them a useful tool for keeping up to date with the content. This caused me to adapt the think-aloud to a more structured format. After two more, I found that it was still lacking, so I stopped using that tool. In addition, I found that the discussion questions/responses were effective at engaging the students for the first few weeks, but then the students realized that I was not keeping up with them, and their level of effort (and engagement with the content) dropped off. As such, I also dropped that tool.
What did I learn?
Three of the CATs seemed to be effective at engaging the students: discussion questions/responses, workshops, and 5-minute papers, although the discussion questions/responses lost effectiveness through the semester. After talking to my colleagues here, I think that breaking my online work up into smaller cohorts might solve some of these issues, as the students may find these activities more helpful in and of themselves and not just something that the teacher uses to keep them involved. I really like the workshops, as the students get to apply the concepts we just covered, and I would like to make the 5-minute papers more about application of the concept than just 'what did we talk about today'. For example, I might ask, “Can you describe an experiment about … (strength of bone) where you would use a paired data design?” As such, the 5-minute papers will merge with the workshops.
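The paired-data question above also has a natural computational counterpart that could serve as a workshop follow-up. As a sketch only (the bone-strength numbers below are invented for illustration, not real data), a paired design boils down to testing the per-specimen differences:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical paired data (values invented for illustration): bone strength
# (MPa) for the same eight specimens measured before and after a treatment.
before = [112.0, 98.5, 105.3, 120.1, 101.7, 96.4, 110.2, 108.9]
after  = [115.2, 101.0, 104.8, 124.5, 105.3, 99.1, 113.0, 111.6]

# Paired design: each specimen serves as its own control, so the test
# statistic is built from the per-specimen differences rather than from
# the two samples treated as independent.
diffs = [a - b for a, b in zip(after, before)]
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))  # t with n-1 = 7 degrees of freedom
print(f"mean difference = {mean(diffs):.3f} MPa, t = {t_stat:.2f}")
```

The design choice is the point of the exercise: pairing removes the specimen-to-specimen variability from the differences, which is exactly why an open-ended question about when to use it probes application rather than memorization.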
From my fellow FTEP participants I learned a great deal about the implementation of different CATs. For example, one colleague created learning cohorts on CULearn and administered discussion questions by cohort; the smaller groups allowed for more effective engagement. Another colleague put the TA in charge of the discussion questions and got the TA to really engage with the students in that setting; she felt that this led to a more effective learning environment. During the May small-group discussions, I also learned more about clickers and their effective use. Clickers work best if you can ask the students questions that aren't simply “what is the correct answer” but more engaging questions like “if someone answered a question with …, what grade would you give them?” Clickers can also be used to survey students and generate data sets automatically; it would be great to then have the students apply a data analysis tool to the data that they just generated.
Based on the results from class and the discussion during the FTEP workshop, I am excited to make a few more changes to the course next year. I am going to drop the alias report and the think-alouds, but add clicker questions and make the workshops more frequent and engaging. I am still debating the discussion questions and their implementation. In general, I found that the improvements to the syllabus and a more open dialogue about the rationale for the class and the class assessments were very helpful, and I will apply these ideas to all of my courses.