Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity
dc.contributor.author | Sherman, Alan T. | |
dc.contributor.author | Herman, Geoffrey L. | |
dc.contributor.author | Oliva, Linda | |
dc.contributor.author | Peterson, Peter A. H. | |
dc.contributor.author | Golaszewski, Enis | |
dc.contributor.author | Poulsen, Seth | |
dc.contributor.author | Scheponik, Travis | |
dc.contributor.author | Gorti, Akshita | |
dc.date.accessioned | 2020-11-03T16:20:55Z | |
dc.date.available | 2020-11-03T16:20:55Z | |
dc.date.issued | 2020-09-09 | |
dc.description | 2020 National Cyber Summit, Huntsville, Alabama, USA, June 2–4, 2020 | |
dc.description.abstract | We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better. The CATS Project is creating and validating two concept inventories—conceptual tests of understanding—that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of varying difficulty that target the same five core concepts, but the CCA assumes greater technical background. Key steps include defining project scope, identifying the core concepts, uncovering student misconceptions, creating scenarios, drafting question stems, developing distractor answer choices, generating educational materials, performing expert reviews, recruiting student subjects, organizing workshops, building community acceptance, forming a team and nurturing collaboration, adopting tools, and obtaining and using funding. Creating effective MCQs is difficult and time-consuming, and cybersecurity presents special challenges. Because cybersecurity issues are often subtle, and because the adversarial model and details matter greatly, it is challenging to construct MCQs for which there is exactly one best but non-obvious answer. We hope that our experiences and lessons learned may help others create more effective concept inventories and assessments in STEM. | en_US
dc.description.sponsorship | We thank the many people who contributed to the CATS Project as Delphi experts, interview subjects, Hackathon participants, expert reviewers, student subjects, and former team members, including Michael Neary, Spencer Offenberger, Geet Parekh, Konstantinos Patsourakos, Dhananjay Phatak, and Julia Thompson. Support for this research was provided in part by the U.S. Department of Defense under CAE-R grants H98230-15-1-0294, H98230-15-1-0273, H98230-17-1-0349, and H98230-17-1-0347; and by the National Science Foundation under UMBC SFS grants DGE-1241576 and 1753681, and SFS Capacity Grants DGE-1819521 and 1820531. | en_US
dc.description.uri | https://link.springer.com/chapter/10.1007/978-3-030-58703-1_1 | en_US |
dc.format.extent | 25 pages | en_US |
dc.genre | conference papers and proceedings | |
dc.genre | book chapters | |
dc.genre | preprints | |
dc.identifier | doi:10.13016/m2vkl6-okyy | |
dc.identifier.citation | Sherman A.T. et al. (2021) Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity. In: Choo KK.R., Morris T., Peterson G.L., Imsand E. (eds) National Cyber Summit (NCS) Research Track 2020. NCS 2020. Advances in Intelligent Systems and Computing, vol 1271. Springer, Cham. https://doi.org/10.1007/978-3-030-58703-1_1 | en_US |
dc.identifier.uri | https://doi.org/10.1007/978-3-030-58703-1_1 | |
dc.identifier.uri | http://hdl.handle.net/11603/19999 | |
dc.language.iso | en_US | en_US |
dc.publisher | Springer Nature | en_US |
dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC) | |
dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department Collection | |
dc.relation.ispartof | UMBC Faculty Collection | |
dc.relation.ispartof | UMBC Student Collection | |
dc.relation.ispartof | UMBC Education Department | |
dc.rights | This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author. | |
dc.subject | UMBC Cyber Defense Lab | |
dc.title | Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity | en_US |
dc.type | Text | en_US |
dcterms.creator | https://orcid.org/0000-0003-1130-4678 | |
dcterms.creator | https://orcid.org/0000-0003-0056-7819 | |
dcterms.creator | https://orcid.org/0000-0002-0814-9956 | |
dcterms.creator | https://orcid.org/0000-0002-0821-1123 | |
dcterms.creator | https://orcid.org/0000-0003-2208-165X |