Examining The Validity Of A Developmental Mathematics Diagnostic Test Using The Rasch Model, Virtual Equating, And Ordinal Regression

Date

2016

Department

Psychology

Program

Doctor of Philosophy

Rights

This item is made available by Morgan State University for personal, educational, and research purposes in accordance with Title 17 of the U.S. Copyright Law. Other uses may require permission from the copyright owner.

Abstract

The purpose of this study was to gather evidence of the validity of a developmental mathematics diagnostic test used at Morgan State University. Regular, local validation of tests is a psychometric best practice. The test validation process involves the definition of a construct by subject matter experts, the creation of items, the evaluation of items for construct relevance by subject matter experts, and the analysis of whether test results support the validity of the test. The diagnostic test was created by faculty experts in the field of developmental mathematics using an item bank provided by the course textbook publisher. The judgment of experts in the field provides evidence of the content validity of test items, while the Rasch Measurement Model provides evidence of the construct validity of a test through fit statistics. Acceptable Rasch fit statistics contributed evidence supporting the construct validity of the diagnostic test. This study also assessed the equivalence of the alternate forms of the diagnostic test using virtual equating, a procedure developed by Luppescu (1996, 2005, 2015). Results indicated that the alternate forms produced equivalent person ability measures, suggesting that all forms measured the same construct. The test was administered in a pretest-posttest design to students enrolled in developmental mathematics during the fall 2014 semester. The difference in Rasch ability measures between the pretest and posttest administrations revealed significant, positive changes, providing evidence that the test items captured class content. The predictive validity of the diagnostic test was assessed through multiple analyses. A correlational analysis between the posttest Rasch ability measures and the course final examination Rasch ability measures revealed a small, positive, and significant correlation. Ordinal regression revealed that the posttest Rasch ability measures were a significant and positive predictor of obtaining a higher final grade in the developmental mathematics class, but were not a significant predictor of course grades in the spring 2015 mathematics classes. The results of this study contribute to the literature on the Rasch Measurement Model as a practical application of locally collected evidence of the validity of a test.
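The Rasch ability measures referenced throughout the abstract come from a model in which the probability of a correct response is a logistic function of the difference between person ability and item difficulty. As an illustration only (the study would have used dedicated Rasch software, and the item difficulties below are hypothetical), a person ability estimate can be sketched in Python via Newton-Raphson maximum likelihood:

```python
import math

def rasch_prob(theta, b):
    """Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iters=50):
    """Newton-Raphson MLE of person ability (theta, in logits) given
    known item difficulties.

    responses: list of 0/1 item scores
    difficulties: Rasch item difficulties in logits (hypothetical here)
    Assumes a mixed response string (at least one 0 and one 1), so the
    maximum likelihood estimate is finite.
    """
    theta = 0.0
    for _ in range(iters):
        probs = [rasch_prob(theta, b) for b in difficulties]
        grad = sum(x - p for x, p in zip(responses, probs))  # score residual
        info = sum(p * (1 - p) for p in probs)               # Fisher information
        theta += grad / info
    return theta

# Hypothetical 5-item form: a student answering 3 of 5 items correctly
# receives a higher ability estimate than one answering 1 of 5.
theta_high = estimate_ability([1, 1, 0, 1, 0], [-1.0, -0.5, 0.0, 0.5, 1.0])
theta_low = estimate_ability([1, 0, 0, 0, 0], [-1.0, -0.5, 0.0, 0.5, 1.0])
```

Because pretest, posttest, and final examination scores are all expressed on this common logit scale, differences between administrations (as in the pretest-posttest comparison above) are directly interpretable as ability change.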