“It’s all about the start” classifying eyes-free mobile authentication techniques

Date

2018-06-13

Citation of Original Publication

Wolf, Flynn; Aviv, Adam J.; Kuber, Ravi; “It’s all about the start” classifying eyes-free mobile authentication techniques; Journal of Information Security and Applications, Volume 41, August 2018, Pages 28-40; https://www.sciencedirect.com/science/article/abs/pii/S2214212618301315

Rights

This item is likely protected under Title 17 of the U.S. Copyright Law. Unless the item is covered by a Creative Commons license, contact the copyright holder or the author for uses protected by Copyright Law.
Public Domain Mark 1.0
This work was written as part of one of the author's official duties as an Employee of the United States Government and is therefore a work of the United States Government. In accordance with 17 U.S.C. 105, no copyright protection is available for such works under U.S. Law.

Abstract

Mobile device users avoiding observational attacks and coping with situational impairments may employ techniques for eyes-free mobile unlock authentication, in which a user enters a passcode without looking at the device. This study provides an initial characterization of user accuracy in performing this authentication behavior with PIN and pattern passcodes of varying lengths and visual characteristics. We also examine whether tactile-only feedback can provide assistive spatialization, finding that orientation cues prior to unlocking do not help. Accuracy was measured using edit distance and dynamic time warping in a within-group, randomized study of 26 participants. A total of 1,021 passcode entry gestures were collected and classified, identifying six user strategies for using the pre-entry tactile feedback and ten codes for the types of events and errors that occurred during entry. We found that users who focused on using the tactile feedback to orient themselves to the position of the first digit of the passcode performed better at the task. These results could be applied to better characterize eyes-free behavior in further research and to design improved, more secure methods for eyes-free authentication.
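
As context for the two accuracy measures named in the abstract, the sketch below compares an entered passcode to its target with Levenshtein edit distance and compares two gesture traces with dynamic time warping. It is an illustrative Python sketch only, assuming passcodes are treated as digit strings and gestures as sequences of (x, y) touch points; the function names and example values are assumptions, not details taken from the paper.

import math

def edit_distance(target, entered):
    # Levenshtein distance between two digit (or pattern-node) sequences.
    m, n = len(target), len(entered)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if target[i - 1] == entered[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]

def dtw_cost(path_a, path_b):
    # Dynamic time warping cost between two sequences of (x, y) touch points.
    m, n = len(path_a), len(path_b)
    cost = [[math.inf] * (n + 1) for _ in range(m + 1)]
    cost[0][0] = 0.0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d = math.dist(path_a[i - 1], path_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[m][n]

# Hypothetical examples: a PIN entered with one wrong digit, and two identical
# pattern traces.
print(edit_distance("1374", "1384"))                    # -> 1
print(dtw_cost([(0, 0), (1, 1), (2, 2)],
               [(0, 0), (1, 1), (2, 2)]))               # -> 0.0

Lower values indicate entries closer to the intended passcode: edit distance captures digit-level slips, while dynamic time warping captures spatial deviation along a gesture path.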