EyeTrackDL: A Robust Deep Learning Framework for Saccade Detection via Simulated Data Augmentation

Khoan Ngoc Ha, Tran Van Nghia, Le Ky Bien

Abstract


Saccade detection is a fundamental task in visual behavior analysis and vestibular diagnostics. However, video Head Impulse Test (vHIT) recordings are often noisy, heterogeneous, and affected by class imbalance, particularly for covert saccades. In this paper, we propose EyeTrackDL, a lightweight yet effective deep learning framework based on a multilayer perceptron (MLP) architecture for classifying three types of eye movements: non-saccades, overt saccades, and covert saccades. Input signals are preprocessed using a fourth-order Butterworth filter, and two high-level features (onset time and duration) are extracted per saccade. To address data scarcity and imbalance, we apply SMOTE resampling and incorporate synthetic data generated from a kinematic vestibulo-ocular reflex (VOR) model. The model is evaluated using K-fold cross-validation (K = 2 to 10) on both real and simulated datasets. Results show that EyeTrackDL achieves an average accuracy of up to 96.5% on simulated data and approximately 83% on real data, with significant improvements in sensitivity for covert saccades. Our findings demonstrate the potential of integrating simulation-based augmentation and class balancing for robust saccade detection in clinical environments.
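The two preprocessing steps named in the abstract, fourth-order Butterworth filtering and SMOTE resampling, can be sketched as below. This is a minimal illustration, not the authors' implementation: the paper does not state the cutoff frequency or sampling rate, so the 30 Hz cutoff and 250 Hz rate used here are assumptions, and the `smote` helper is a simplified nearest-neighbour interpolation in the spirit of Chawla et al.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(velocity, fs=250.0, cutoff=30.0, order=4):
    """Zero-phase low-pass filtering of an eye-velocity trace.

    fs and cutoff are illustrative assumptions; the paper specifies only
    that the filter is a fourth-order Butterworth.
    """
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, velocity)  # filtfilt avoids phase distortion

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: each synthetic sample is interpolated between
    a random minority-class point and one of its k nearest minority
    neighbours, so new points stay inside the minority manifold."""
    rng = np.random.default_rng(rng)
    out = np.empty((n_new, X_min.shape[1]))
    for i in range(n_new):
        j = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[j], axis=1)
        nn = np.argsort(d)[1:k + 1]          # skip the point itself
        neighbour = X_min[rng.choice(nn)]
        out[i] = X_min[j] + rng.random() * (neighbour - X_min[j])
    return out
```

In a full pipeline, the filtered traces would yield the onset-time and duration features, the minority classes (covert saccades above all) would be oversampled with `smote`, and the balanced set would be fed to the MLP under K-fold cross-validation.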

References


R. J. Leigh and D. S. Zee, The Neurology of Eye Movements. Oxford University Press, 2015.

K. Holmqvist, M. Nyström, R. Andersson, R. Dewhurst, H. Jarodzka, and J. van de Weijer, Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press, 2011.

S. Martinez-Conde, S. L. Macknik, and D. H. Hubel, “Toward a saccade-based biomarker for Parkinson disease,” Frontiers in Neurology, vol. 4, p. 178, 2013.

H. G. MacDougall, K. P. Weber, L. A. McGarvie, G. M. Halmagyi, and I. S. Curthoys, “Covert saccades during head impulse testing: a marker of vestibular compensation?” Neurology, vol. 72, no. 7, pp. 588–593, 2009.

R. Engbert and K. Mergenthaler, “Microsaccades are triggered by low retinal image slip,” Proceedings of the National Academy of Sciences, vol. 103, no. 33, pp. 12349–12353, 2006.

M. Nyström and K. Holmqvist, “Practical robust fixation detection,” Behavior Research Methods, vol. 42, no. 1, pp. 372–384, 2010.

M. E. Bellet, J. Bellet, N. Guyader, M. Boucart, J. Grainger, and F. Vitu, “Human-level saccade detection performance using deep neural networks,” Scientific Reports, vol. 9, no. 1, pp. 1–12, 2019.

M. Startsev, I. Agtzidis, and M. Dorr, “LSTM-based saccade detection,” in Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. ACM, 2018, pp. 1–9.

J. Otero-Millan, X. M. Troncoso, S. L. Macknik, I. Serrano Pedraza, and S. Martinez-Conde, “Oculomotor strategies for fixating and following moving targets,” Vision Research, vol. 49, no. 5, pp. 705–726, 2014.

R. Zemblys, D. C. Niehorster, and K. Holmqvist, “gazeNet: End-to-end eye-movement event detection with deep neural networks,” Behavior Research Methods, vol. 51, no. 2, pp. 840–864, 2018.

A. Mihali and R. C. Muresan, “A Bayesian generative model for microsaccade detection,” Frontiers in Neuroscience, vol. 11, pp. 1–15, 2017.

P. M. Daye and L. M. Optican, “A Bayesian framework for saccade generation: Eye movements in pursuit tracking and interception,” Journal of Vision, vol. 14, no. 12, pp. 1–25, 2014.

J. Pekkanen and O. Lappi, “A new threshold-free algorithm for detecting fixations and saccades in eye-tracking data,” Behavior Research Methods, vol. 49, no. 2, pp. 1234–1250, 2017.

I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT Press, 2016.

D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” arXiv preprint arXiv:1412.6980, 2014.

N. V. Chawla, K. W. Bowyer, L. O. Hall, and W. P. Kegelmeyer, “SMOTE: Synthetic minority over-sampling technique,” Journal of Artificial Intelligence Research, vol. 16, pp. 321–357, 2002.




DOI: http://dx.doi.org/10.21553/rev-jec.409

Copyright (c) 2025 REV Journal on Electronics and Communications


ISSN: 1859-378X
