Issues of Question Equivalence in Online Exam Pools

Date

2023-04

Citation of Original Publication

Goolsby-Cole, Cody, et al. "Issues of Question Equivalence in Online Exam Pools." Journal of College Science Teaching 52, no. 4 (March/April 2023): 24-30. https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-marchapril-2023/issues-question.

Rights

This item is likely protected under Title 17 of the U.S. Copyright Law. Unless the item is covered by a Creative Commons license, uses protected by copyright law require permission from the copyright holder or the author.

Abstract

During the COVID-19 pandemic, question pools were widely recommended for online testing as a way to mitigate cheating, exposing science, technology, engineering, and mathematics (STEM) students across the globe to this practice. Yet instructors may be unfamiliar with how seemingly small changes between questions in a pool can expose differences in student understanding. In this study, we investigated student performance on questions in online exam pools across several STEM courses: upper-level physiology, general chemistry, and introductory physics. We found that the difficulty of creating analogous questions in a pool varied by question type, with quantitative problems being the easiest to vary without altering average student performance. However, when instructors created pools by rearranging aspects of a question, posing conceptual opposites, or writing different questions to assess the same learning objective, we sometimes discovered differences in student learning between seemingly closely related ideas, illustrating the challenge posed by our own expert blind spots. We offer suggestions for how instructors can improve the equity of question pools, such as limiting the number of variables changed within a given pool and “test driving” proposed questions in lower-stakes assessments.