Leveraging Movement, Posture, and Anthropometric Contexts to Strengthen the Security of Mobile Biometrics
Active authentication is emerging as a promising way to continuously and unobtrusively authenticate smartphone users post-login. Although research in this area has shown that behavioral traits, such as touchscreen gestures and device movements, can be used to distinguish a legitimate user from an attacker, fundamental questions about these traits remain unanswered. These include: how, and to what extent, do posture and movement impact behavioral traits; what is the impact of human variability (anthropometric properties, age, gender, and health conditions) on behavioral traits; to what extent can these traits be spoofed using posture and movement observations; and how can these traits be strengthened against spoofing attacks. In this project, an interdisciplinary team of investigators from the departments of Computer Science, Biomedical Sciences, Physical Therapy, and Art and Media Technologies at NYIT will leverage capabilities in 3D motion capture, behavioral biometric authentication research, and motor control research to address these questions.
The investigators will systematically quantify the impact of fine-grained posture, movement, and anthropometric variables on behavioral traits in adult, elderly, and Parkinson’s disease populations. They will leverage this knowledge to design new behavioral authentication techniques that achieve lower error rates under realistic conditions by adapting to drifts in context and behavior. Further, their research will quantify the susceptibility of behavioral biometric traits to forgery attacks, and introduce novel liveness detection techniques that rely on contextual information, as well as lightweight and unobtrusive user challenges, to mitigate these attacks. They will evaluate these techniques on up to 150 subjects in two settings: (1) in a 3D motion capture lab, and (2) over a period of up to 12 months in each user’s everyday environment.
This research was made possible by NSF Grant CNS-1814846.