Link to Resource: Gender Bias in Test Item Formats: Evidence from PISA 2009, 2012, and 2015 Math and Reading Tests
Author: Benjamin Shear, University of Colorado
This paper investigates differences between male and female student performance on multiple-choice (MC) versus constructed-response (CR) items used on the large-scale PISA math and reading tests. The results, based on data for high-school-aged students in 35 countries, including the US, provide consistent evidence that male students tend, on average, to earn relatively higher scores on MC test items, whereas female students tend to earn relatively higher scores on CR test items. The pattern was consistent across countries, although the magnitude of the differences varied by country and was larger, on average, in reading than in math. Policymakers, researchers, and other audiences who use test scores to compare student achievement across gender groups should consider the item formats used on a test when interpreting results.
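
To make the comparison concrete, the sketch below (Python, assuming pandas and NumPy) shows one way to compute a standardized male-minus-female score gap separately for MC and CR items. This is illustrative only, not the paper's analysis: the column names (gender, format, score) and the toy data are hypothetical.

# Illustrative only: not the paper's method. Computes a standardized
# male-minus-female mean score gap per item format, assuming a hypothetical
# item-level data frame with columns "gender", "format", and "score".
import numpy as np
import pandas as pd

def gender_gap_by_format(df: pd.DataFrame) -> pd.Series:
    """Return the standardized male-minus-female mean score gap for each item format."""
    gaps = {}
    for fmt, sub in df.groupby("format"):          # e.g., "MC" vs. "CR"
        male = sub.loc[sub["gender"] == "M", "score"]
        female = sub.loc[sub["gender"] == "F", "score"]
        pooled_sd = np.sqrt((male.var(ddof=1) + female.var(ddof=1)) / 2)
        gaps[fmt] = (male.mean() - female.mean()) / pooled_sd
    return pd.Series(gaps)

# Toy example with fabricated data (not PISA data):
toy = pd.DataFrame({
    "gender": ["M", "M", "F", "F", "M", "M", "F", "F"],
    "format": ["MC", "CR", "MC", "CR", "MC", "CR", "MC", "CR"],
    "score":  [1, 0, 0, 1, 1, 1, 1, 1],
})
print(gender_gap_by_format(toy))  # positive values favor males, negative favor females

A positive gap for MC items alongside a negative gap for CR items would correspond to the pattern the paper reports; real analyses of PISA data would, of course, require the appropriate sampling weights and scaling procedures.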