Issues of Question Equivalence in Online Exam Pools

dc.contributor.author: Goolsby-Cole, Cody
dc.contributor.author: Bass, Sarah M.
dc.contributor.author: Stanwyck, Liz
dc.contributor.author: Leupen, Sarah
dc.contributor.author: Carpenter, Tara
dc.contributor.author: Hodges, Linda C.
dc.date.accessioned: 2023-04-17T14:41:57Z
dc.date.available: 2023-04-17T14:41:57Z
dc.date.issued: 2023-04
dc.description.abstract: During the pandemic, the use of question pools for online testing was recommended to mitigate cheating, exposing multitudes of science, technology, engineering, and mathematics (STEM) students across the globe to this practice. Yet instructors may be unfamiliar with the ways that seemingly small changes between questions in a pool can expose differences in student understanding. In this study, we undertook an investigation of student performance on our questions in online exam pools across several STEM courses: upper-level physiology, general chemistry, and introductory physics. We found that the difficulty of creating analogous questions in a pool varied by question type, with quantitative problems being the easiest to vary without altering average student performance. However, when instructors created pools by rearranging aspects of a question, posing opposite counterparts of concepts, or formulating questions to assess the same learning objective, we sometimes discovered student learning differences between seemingly closely related ideas, illustrating the challenge of our own expert blind spot. We provide suggestions for how instructors can improve the equity of question pools, such as being cautious in how many variables one changes in a specific pool and "test driving" proposed questions in lower-stakes assessments.
dc.description.uri: https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-marchapril-2023/issues-question
dc.format.extent: 7 pages
dc.genre: journal articles
dc.identifier: doi:10.13016/m2ddfh-bd7q
dc.identifier.citation: Goolsby-Cole, Cody, et al. "Issues of Question Equivalence in Online Exam Pools." Journal of College Science Teaching 52, no. 4 (March/April 2023): 24-30. https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-marchapril-2023/issues-question.
dc.identifier.uri: http://hdl.handle.net/11603/27608
dc.language.iso: en_US
dc.publisher: National Science Teaching Association (NSTA)
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Chemistry & Biochemistry Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Physics Department
dc.relation.ispartof: UMBC Mathematics and Statistics Department
dc.relation.ispartof: UMBC Faculty Development Center (FDC)
dc.relation.ispartof: UMBC Biological Sciences Department
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.title: Issues of Question Equivalence in Online Exam Pools
dc.type: Text

Files

Original bundle

Name: JCST_MarApr_2023_24_Hodges.pdf
Size: 435.19 KB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed upon to submission