Browsing by Subject "cybersecurity education"
Now showing 1 - 2 of 2
Item
The CATS Hackathon: Creating and Refining Test Items for Cybersecurity Concept Inventories (2019-01-29)
Sherman, Alan T.; Oliva, Linda; Golaszewski, Enis; Phatak, Dhananjay; Scheponik, Travis; Herman, Geoffrey L.; Choi, Dong San; Offenberger, Spencer E.; Peterson, Peter; Dykstra, Josiah; Bard, Gregory V.; Chattopadhyay, Ankur; Sharevski, Filipo; Verma, Rakesh; Vrecenar, Ryan
For two days in February 2018, 17 cybersecurity educators and professionals from government and industry met in a "hackathon" to refine existing draft multiple-choice test items, and to create new ones, for a Cybersecurity Concept Inventory (CCI) and Cybersecurity Curriculum Assessment (CCA) being developed as part of the Cybersecurity Assessment Tools (CATS) Project. We report on the results of the CATS Hackathon, discussing the methods we used to develop test items, highlighting the evolution of a sample test item through this process, and offering suggestions to others who may wish to organize similar hackathons. Each test item embodies a scenario, question stem, and five answer choices. During the Hackathon, participants organized into teams to (1) generate new scenarios and question stems, (2) extend CCI items into CCA items, and generate new answer choices for new scenarios and stems, and (3) review and refine draft CCA test items. The CATS Project provides rigorous evidence-based instruments for assessing and evaluating educational practices; these instruments can help identify pedagogies and content that are effective in teaching cybersecurity. The CCI measures how well students understand basic concepts in cybersecurity---especially adversarial thinking---after a first course in the field.
The CCA measures how well students understand core concepts after completing a full cybersecurity curriculum.

Item
Student Misconceptions about Cybersecurity Concepts: Analysis of Think-Aloud Interviews (DigitalCommons@Kennesaw State University, 2018)
Thompson, Julia D.; Herman, Geoffrey L.; Scheponik, Travis; Oliva, Linda; Sherman, Alan; Golaszewski, Ennis; Phatak, Dhananjay
We conducted an observational study to document student misconceptions about cybersecurity using thematic analysis of 25 think-aloud interviews. By understanding patterns in student misconceptions, we provide a basis for developing rigorous evidence-based recommendations for improving teaching and assessment methods in cybersecurity, and we inform future research. This study is the first to explore student cognition and reasoning about cybersecurity. We interviewed students from three diverse institutions. During these interviews, students grappled with security scenarios designed to probe their understanding of cybersecurity, especially adversarial thinking. We analyzed student statements using a structured qualitative method, novice-led paired thematic analysis, to document patterns in student misconceptions and problematic reasoning that transcend institutions, scenarios, or demographics. Themes generated from this analysis describe a taxonomy of misconceptions but not their causes or remedies. Four themes emerged: overgeneralizations, conflated concepts, biases, and incorrect assumptions. Together, these themes reveal that students generally failed to grasp the complexity and subtlety of possible vulnerabilities, threats, risks, and mitigations, suggesting a need for instructional methods that engage students in reasoning about complex scenarios with an adversarial mindset. These findings can guide teachers' attention during instruction and inform the development of cybersecurity assessment tools that enable cross-institutional assessments that measure the effectiveness of pedagogies.