Improving Gesture Recognition using Markov Chains
Supervisor
Suitable for
Abstract
Co-supervised by Systems Security Lab
In recent work, we conducted a user study in which users made mobile payments with a smartwatch while we collected inertial sensor data (accelerometer, gyroscope, etc.) from the device. Using this data, we showed that the tap gesture performed by the user while making such payments can be used to authenticate the user. We also found that the tap gesture is sufficiently distinct from other hand movements that, by recognising it, we can infer the user's intention to pay (as opposed to an accidental payment or a skimming attack). We achieved our results by training random forest classifiers on tap gestures represented as sliding windows of between 0.5 and 4 seconds of inertial sensor data.
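As a rough illustration of this baseline, the sketch below classifies sliding windows of inertial data with a random forest. The sampling rate, window length, feature choices and synthetic data are illustrative assumptions, not the actual study setup or dataset.

```python
# Minimal sketch of the baseline: classify tap vs. non-tap windows with a random forest.
# The sampling rate, window length, features and placeholder data are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
SAMPLE_RATE = 50          # Hz (assumed)
WINDOW_SECONDS = 2.0      # within the 0.5-4 s range mentioned above
WINDOW_LEN = int(SAMPLE_RATE * WINDOW_SECONDS)
N_CHANNELS = 6            # 3-axis accelerometer + 3-axis gyroscope

def extract_features(window):
    """Per-channel summary statistics for one window of shape [WINDOW_LEN, N_CHANNELS]."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])

# Placeholder data: 200 windows of random sensor readings with random tap/non-tap labels.
windows = rng.normal(size=(200, WINDOW_LEN, N_CHANNELS))
labels = rng.integers(0, 2, size=200)   # 1 = tap gesture, 0 = other hand movement

X = np.array([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```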
It may be possible to strengthen our intent recognition (gesture recognition) model by treating the tap gesture as a 3-tuple of constituent parts, namely: (i) reaching towards the payment terminal, (ii) aligning the watch face with the terminal to establish an NFC connection, and (iii) withdrawing afterwards. The plan would be to attempt to recognise each constituent part separately and use Markov chains to stitch together the 3-tuples that form a complete gesture. Our dataset will be provided. The research question is whether this approach is more or less effective at recognising tap gestures than our initial random forest (RF) approach.
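One possible shape for the Markov-chain stitching is sketched below: a per-window classifier (not shown; assumed to output a probability for each sub-gesture) labels windows as background, reach, align or withdraw, and a Markov chain over those states decodes the most likely sequence, accepting a complete tap when the decoded path passes through reach, align and withdraw in order. The transition probabilities and emission scores are illustrative assumptions only.

```python
# Hedged sketch: decode sub-gesture states with a simple Markov chain (Viterbi)
# and accept a tap when reach -> align -> withdraw appears in order.
# Transition/emission values below are illustrative assumptions.
import numpy as np

STATES = ["background", "reach", "align", "withdraw"]

# Transition matrix encoding the expected order of the three constituent parts.
TRANS = np.array([
    # background reach  align  withdraw
    [0.90,       0.10,  0.00,  0.00],   # background mostly persists, may start a reach
    [0.05,       0.70,  0.25,  0.00],   # reach continues or moves on to align
    [0.05,       0.00,  0.70,  0.25],   # align continues or moves on to withdraw
    [0.20,       0.00,  0.00,  0.80],   # withdraw continues or returns to background
])
START = np.array([0.85, 0.15, 0.00, 0.00])

def viterbi(emissions):
    """Most likely state path given per-window emission probabilities, shape [T, 4]."""
    T = emissions.shape[0]
    logp = np.log(START + 1e-12) + np.log(emissions[0] + 1e-12)
    back = np.zeros((T, len(STATES)), dtype=int)
    for t in range(1, T):
        scores = logp[:, None] + np.log(TRANS + 1e-12)   # scores[i, j]: from state i to j
        back[t] = scores.argmax(axis=0)
        logp = scores.max(axis=0) + np.log(emissions[t] + 1e-12)
    path = [int(logp.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [STATES[s] for s in reversed(path)]

def is_complete_tap(path):
    """True if reach, align and withdraw all occur, in that order, somewhere in the path."""
    remaining = iter(path)
    return all(part in remaining for part in ("reach", "align", "withdraw"))

# Hypothetical per-window emission probabilities for a 7-window recording.
emissions = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.2, 0.7, 0.1, 0.0],
    [0.1, 0.6, 0.3, 0.0],
    [0.1, 0.1, 0.8, 0.0],
    [0.1, 0.0, 0.5, 0.4],
    [0.2, 0.0, 0.1, 0.7],
    [0.8, 0.0, 0.0, 0.2],
])
path = viterbi(emissions)
print(path, "-> complete tap:", is_complete_tap(path))
```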
Prerequisites: Knowledge of data analysis.