A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical and experimental research and development in all areas of automation.
Volume 8, Issue 1, Jan. 2021

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 11.8, Top 4% (SCI Q1)
  • CiteScore: 17.6, Top 3% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: Guang Chen, Fa Wang, Xiaoding Yuan, Zhijun Li, Zichen Liang and Alois Knoll, "NeuroBiometric: An Eye Blink Based Biometric Authentication System Using an Event-Based Neuromorphic Vision Sensor," IEEE/CAA J. Autom. Sinica, vol. 8, no. 1, pp. 206-218, Jan. 2021. doi: 10.1109/JAS.2020.1003483

NeuroBiometric: An Eye Blink Based Biometric Authentication System Using an Event-Based Neuromorphic Vision Sensor

doi: 10.1109/JAS.2020.1003483
Funds:  This work was supported by the National Natural Science Foundation of China (61906138), the National Science and Technology Major Project of the Ministry of Science and Technology of China (2018AAA0102900), the Shanghai Automotive Industry Sci-Tech Development Program (1838), the European Union’s Horizon 2020 Research and Innovation Program (785907), and the Shanghai AI Innovation Development Program 2018
Abstract: The rise of the Internet and identity authentication systems has brought convenience to people’s lives but has also introduced the potential risk of privacy leaks. Existing biometric authentication systems based on explicit and static features bear the risk of being attacked by mimicked data. This work proposes a highly efficient biometric authentication system based on transient eye blink signals that are precisely captured by a neuromorphic vision sensor with microsecond-level temporal resolution. The neuromorphic vision sensor transmits only the local pixel-level changes induced by eye blinks as they occur, which leads to advantageous characteristics such as an ultra-low-latency response. We first propose a set of effective biometric features describing the motion, speed, energy, and frequency signals of eye blinks, based on the microsecond temporal resolution of event densities. We then train an ensemble model and a non-ensemble model on our NeuroBiometric dataset for biometric authentication. The experiments show that our system identifies and verifies subjects with an accuracy of 0.948 using the ensemble model and 0.925 using the non-ensemble model. The false positive rates are low (about 0.002), and the highly dynamic features are not only hard to reproduce but also avoid recording visible characteristics of a user’s appearance. The proposed system sheds light on a new path towards safer authentication using neuromorphic vision sensors.
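The central signal in this pipeline is the event density over time: a blink produces a burst of sensor events, and binning their timestamps yields the curve from which the blink features are later derived. Below is a minimal sketch of that binning step, assuming events arrive as microsecond timestamps; the 1 ms window and the synthetic burst are illustrative assumptions, not the authors' exact processing.

```python
import numpy as np

def event_density(timestamps_us, window_us=1000):
    """Bin event timestamps (in microseconds) into fixed windows and
    return the per-window event count, i.e., an event-density curve."""
    timestamps_us = np.asarray(timestamps_us, dtype=np.int64)
    t0, t1 = timestamps_us.min(), timestamps_us.max()
    edges = np.arange(t0, t1 + window_us, window_us)
    counts, _ = np.histogram(timestamps_us, bins=edges)
    return counts  # one sample per window_us

# Example with synthetic timestamps standing in for one blink burst.
rng = np.random.default_rng(0)
fake_blink = np.sort(rng.integers(0, 300_000, size=5_000))  # ~0.3 s of events
curve = event_density(fake_blink, window_us=1_000)          # 1 ms bins
print(curve.shape, int(curve.max()))
```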

     

  • 1 https://github.com/ispc-lab/NeuroBiometrics


    Highlights

    • Proposing the first neuromorphic-sensor-based biometric authentication system, which captures subtle changes in human eye blinks with microsecond-level latency. It outperforms authentication systems based on traditional CMOS cameras in speed without sacrificing accuracy.
    • Offering the first neuromorphic eye-blink dataset, which contains the subtle eyebrow and ocular motions that occur during normal blinks. The male-to-female ratio among the volunteers is close to 1, and all were in a normal state. We used a DAVIS346 sensor with a resolution of 346×260 pixels, a temporal resolution of 1 μs, and an outstanding dynamic range (up to 140 dB) to collect the dataset indoors under natural daylight. To encourage comparison with this work, the dataset can be downloaded from https://github.com/ispc-lab/NeuroBiometrics.
    • Constructing a set of microsecond-level, event-driven eye-blink features (204 in total) from four types of smoothed curves generated from the filtered biometric data in the time and frequency domains. These features are classified into five groups: duration, speed, energy, ratio, and frequency features, which serve as the candidate feature set for subsequent processing (a toy sketch of these feature groups follows this list).
    • Providing effective feature selection and identity recognition approaches for authentication. Recursive feature elimination is first applied to the candidate feature set; a second selection step can then be performed with either a Pearson correlation coefficient-based approach or a coefficient of variation-based approach. Identity recognition uses ensemble and non-ensemble models, and their results show very low false positive rates, which indicates that highly dynamic features are not only hard to fake but also avoid exposing visible characteristics of a user’s appearance (a minimal selection-and-classification sketch follows this list).
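To make the feature-construction highlight concrete, here is a toy sketch computing one representative feature from each of the five groups (duration, speed, energy, ratio, frequency) out of a single smoothed event-density curve. The particular definitions and the 20% activity threshold are illustrative assumptions; the paper's 204 features are more elaborate.

```python
import numpy as np

def blink_features(density, dt_s=0.001):
    """Toy versions of the five feature groups, computed from one
    smoothed event-density curve sampled every dt_s seconds."""
    density = np.asarray(density, dtype=float)
    active = density > 0.2 * density.max()            # crude blink mask (assumed threshold)
    duration = active.sum() * dt_s                    # duration feature [s]
    speed = np.abs(np.diff(density)).max() / dt_s     # speed feature (max rate of change)
    energy = float(np.sum(density ** 2))              # energy feature
    ratio = density.max() / (density.mean() + 1e-9)   # ratio feature (peak-to-mean)
    spectrum = np.abs(np.fft.rfft(density - density.mean()))
    freqs = np.fft.rfftfreq(density.size, d=dt_s)
    dominant_freq = freqs[spectrum.argmax()]          # frequency feature [Hz]
    return np.array([duration, speed, energy, ratio, dominant_freq])
```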
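And a minimal sketch of the selection and recognition stage described in the last highlight, using scikit-learn: recursive feature elimination, followed by a Pearson-correlation filter (only this variant of the second step is sketched), and then an ensemble (random forest) and a non-ensemble (SVM) classifier. The feature counts, correlation threshold, and classifier choices are assumptions for illustration; the paper's actual models may differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# X: (n_samples, n_features) candidate feature matrix, y: subject IDs.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 204))     # placeholder for the 204 candidate features
y = rng.integers(0, 20, size=400)   # placeholder subject labels

# Step 1: recursive feature elimination with an ensemble base estimator.
rfe = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
          n_features_to_select=60)
X_rfe = rfe.fit_transform(X, y)

# Step 2: drop one feature of any highly correlated pair (Pearson |r| > 0.95).
corr = np.corrcoef(X_rfe, rowvar=False)
keep = np.ones(X_rfe.shape[1], dtype=bool)
for i in range(corr.shape[0]):
    for j in range(i + 1, corr.shape[1]):
        if keep[j] and abs(corr[i, j]) > 0.95:
            keep[j] = False
X_sel = X_rfe[:, keep]

# Ensemble (random forest) vs. non-ensemble (SVM) classifiers.
ensemble = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_sel, y)
non_ensemble = SVC(kernel="rbf", probability=True).fit(X_sel, y)
```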
