Android Biometric System:
Gestures plus Device Motion/Orientation

Background

Biometric access to mobile devices, which lets users authenticate without entering passwords, and the combination of the different sensor data available on these devices (i.e., sensor fusion) are gaining significant momentum in the wireless world. Users increasingly request biometric access methods (e.g., non-text data entry, gestures, facial recognition) to interact with their devices.

The highest-profile implementations of gesture recognition have been in gaming. Nintendo's Wii changed the gaming market with its Wii Remote controllers. Microsoft subsequently rolled out its Kinect technology, which enables users to play games and make selections using only hand and body gestures, without holding any type of controller.

There is also significant interest in implementing gesture recognition in cars, to enable drivers to control certain car features (making selections from the entertainment system, answering a call, etc.) with just a wave of the hand, which is viewed as much safer than having the driver look down to press buttons on the dashboard. There are a number of ways to implement gesture recognition, such as using a MEMS accelerometer or MEMS gyroscope to detect user movements. There is also growing interest in vision-based gesture recognition systems, in which optical sensors repeatedly capture images of the user and complex software algorithms then determine the user's gestures, without the user having to hold any device.

Project

This will be a great project for those skilled in Android programming and for good programmers who would like to learn Android programming.

This is a continuation of earlier projects; see [1-4]. Most important is to read the technical paper from last semester's project, which did not involve actual Android programming [1]. This project will focus on developing robust machine-learning biometric systems for mobile devices based on device gestures and motion/orientation. For this project the work will be restricted to a particular Android device model, and the gestures will be restricted to scrolling and other non-text input. It is anticipated that possibly three systems, or a combination of the three, will be developed, covering the following input types (a capture sketch follows the list):

  1. non-text-input gestures like scrolling
  2. device motion
  3. device orientation
The literature review should cover gesture recognition involving taps, swipes, and other touch-screen input movements, as well as device motion and orientation.
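
As a concrete illustration of the first input type, the sketch below shows one common way to capture scroll and fling gestures on Android using GestureDetector. It is a minimal sketch only; the class name GestureCaptureActivity and the logging are illustrative assumptions, not part of the project specification.

    import android.app.Activity;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.GestureDetector;
    import android.view.MotionEvent;

    // Illustrative capture of scroll/fling gestures as raw (time, distance, velocity) samples.
    public class GestureCaptureActivity extends Activity {

        private GestureDetector detector;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            detector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
                @Override
                public boolean onDown(MotionEvent e) {
                    return true;  // claim the gesture so later events are delivered
                }

                @Override
                public boolean onScroll(MotionEvent e1, MotionEvent e2,
                                        float distanceX, float distanceY) {
                    // Distance moved since the previous scroll event, in pixels.
                    Log.d("GestureCapture", "scroll dx=" + distanceX + " dy=" + distanceY
                            + " t=" + e2.getEventTime());
                    return true;
                }

                @Override
                public boolean onFling(MotionEvent e1, MotionEvent e2,
                                       float velocityX, float velocityY) {
                    // Fling velocity in pixels per second.
                    Log.d("GestureCapture", "fling vx=" + velocityX + " vy=" + velocityY);
                    return true;
                }
            });
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // Route every touch event through the gesture detector.
            return detector.onTouchEvent(event) || super.onTouchEvent(event);
        }
    }

Raw samples like these (timestamps, distances, velocities) are the kind of data from which gesture features would later be extracted.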

Project Team Skills Required

The key skill desired is Android programming, which is primarily Java, so familiarity with Java is a must. Experience with the Android SDK is also desirable, since there would be a bit of a learning curve for someone who knows only Java. We imagine that most of the programming will be fairly low level, so the sensors API is most important; for someone without Android experience, this would also be a good place to start. Students strong in Java should be able to learn the API in a few weeks (see the Android location and sensors documentation).
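
To give a feel for the sensors API, the sketch below registers a SensorEventListener for the accelerometer (device motion) and the rotation-vector sensor (device orientation). It is a minimal sketch assuming a plain Activity; the class name and log output are illustrative only, not a prescribed design.

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;

    // Illustrative sampling of device motion (accelerometer) and orientation (rotation vector).
    public class MotionCaptureActivity extends Activity implements SensorEventListener {

        private SensorManager sensorManager;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Register for both sensors at a game-rate sampling delay.
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_GAME);
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR),
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);  // stop sampling when not visible
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                // Acceleration (m/s^2) along the device's x, y, z axes.
                Log.d("MotionCapture", "accel " + event.values[0] + ", "
                        + event.values[1] + ", " + event.values[2]);
            } else if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
                // Convert the rotation vector to azimuth/pitch/roll (radians).
                float[] rotation = new float[9];
                float[] orientation = new float[3];
                SensorManager.getRotationMatrixFromVector(rotation, event.values);
                SensorManager.getOrientation(rotation, orientation);
                Log.d("MotionCapture", "azimuth=" + orientation[0]
                        + " pitch=" + orientation[1] + " roll=" + orientation[2]);
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Not needed for this sketch.
        }
    }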

Small Interim Deliverables

With code being developed, your customer Javid recommends using agile methodologies, such as pair programming and test-driven development. He also requires small interim deliverables in the form of project plans, design documents, test script documents, and communications documents. At the end of the semester, prepare a short lessons-learned document (about one page) covering what you learned, what worked, and what could be improved.

Pace University biometric backend classifier

Although the features will be based on Android input, we can use the generic Pace University biometric backend classifier.
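
The classifier's exact input format is not specified here, so the sketch below only illustrates the general idea of flattening captured samples into a fixed-length feature vector and appending it as a CSV row. The feature names and the FeatureVectorWriter class are hypothetical, not the actual Pace classifier interface.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Hypothetical sketch: write one labeled feature vector per CSV row for a generic
    // backend classifier.
    public class FeatureVectorWriter {

        // Example features only; the real feature set comes out of the literature review.
        public static Map<String, Double> exampleFeatures(double meanScrollSpeed,
                                                          double meanAccelMagnitude,
                                                          double meanPitch) {
            Map<String, Double> features = new LinkedHashMap<>();
            features.put("mean_scroll_speed", meanScrollSpeed);
            features.put("mean_accel_magnitude", meanAccelMagnitude);
            features.put("mean_pitch", meanPitch);
            return features;
        }

        // Append one labeled sample (userId plus feature values) as a CSV row.
        public static void writeFeatureVector(String csvPath, String userId,
                                              Map<String, Double> features) throws IOException {
            StringBuilder row = new StringBuilder(userId);
            for (double value : features.values()) {
                row.append(',').append(value);
            }
            row.append('\n');
            try (FileWriter out = new FileWriter(csvPath, true)) {  // append mode
                out.write(row.toString());
            }
        }
    }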

Project Steps

Although the following list has you developing all features simultaneously, it is recommended that you take one set of features through the whole process, then another, and so on.

Project Deliverables

References

  1. Jonathan Lee, Aliza Levinger, Preston Rollins, Beqir Simnica, Javid Maghsoudi, Hui Zhao, Charles Tappert, Vinnie Monaco, and Leigh Anne Clevenger, Fall 2015 Technical Paper
  2. Noufal Kunnathu, Biometric User Authentication on Smartphone Accelerometer Sensor Data, Proc. Research Day Conference, CSIS, Pace University, May 2015.
    Also see associated "Manual For Biometric User Authentication on Smartphone Accelerometer Sensor Data", Fall 2014 Technical Manual
  3. Christopher Carlson, Toney Chen, Jeremy Cruz, Javid Maghsoudi, Hui Zhao, and John V. Monaco, User Authentication with Android Accelerometer and Gyroscope Sensors, Proc. Research Day Conference, CSIS, Pace University, May 2015.
  4. Naif Alotaibi, Richard Barilla, Francisco Betances, Aditya Chohan, Alexander Gazarov, Mantie Reid, Alexandra Scolaro, and Vinnie Monaco, Biometric System Design for Handheld Devices, Proc. Research Day, CSIS, Pace University, May 2014.
  5. Paper 2013BTAS-O12, IEEE 6th Int Conf Biometrics (BTAS 2013), 2013.
  6. Paper 2013BTAS-P27, IEEE 6th Int Conf Biometrics (BTAS 2013), 2013.
  7. Paper 2013BTAS-P30, IEEE 6th Int Conf Biometrics (BTAS 2013), 2013.
  8. Paper 2013BTAS-P35, IEEE 6th Int Conf Biometrics (BTAS 2013), 2013.
  9. Pagemill Partners, a Division of Duff & Phelps Securities, LLC, Next-Generation User Interface Technologies for Mobile and Consumer Devices, n.d. Web.