Gazepoint Citations

We are frequently asked for examples of research papers in which Gazepoint eye-tracking technologies are used, and we are happy to provide this short list of publications found to date. If you have published research that uses a Gazepoint system, please let us know and we will add a citation to your work here.

Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610.
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13.
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Singapore: Springer.
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340.
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019.
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6.
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 2562–2566).
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery.
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158.
Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement.
Hienert, D., Kern, D., Mitsui, M., Shah, C., & Belkin, N. J. (2019). Reading Protocol: Understanding what has been Read in Interactive Information Retrieval Tasks. arXiv:1902.04262 [cs].
Bottos, S., & Balasingam, B. (2019). An Approach to Track Reading Progression Using Eye-Gaze Fixation Points. arXiv:1902.03322 [cs].
Jeong, H., & Liu, Y. (2019). Effects of non-driving-related-task modality and road geometry on eye movements, lane-keeping performance, and workload while driving. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 157–171.
Russell, C. (2019). “I Consent”: An Eye-Tracking Study of IRB Informed Consent Forms.
Ćosić, K., Popović, S., Šarlija, M., Mijić, I., Kokot, M., Kesedžić, I., … Zhang, Q. (2019). New Tools and Methods in Selection of Air Traffic Controllers Based on Multimodal Psychophysiological Measurements. IEEE Access, 7, 174873–174888.
Constantinides, A., Fidas, C., Belk, M., & Pitsillides, A. (2019). “I Recall This Picture”: Understanding Picture Password Selections Based on Users’ Sociocultural Experiences. In IEEE/WIC/ACM International Conference on Web Intelligence (pp. 408–412). New York, NY, USA: ACM.
Gupta, V., Chakraborty, T., Agarwal, M., Singh, R., Arora, M., & Vatsa, M. (2019). Bag-of-Lies: A Multimodal Dataset for Deception Detection.
Yaneva, V., & Eraslan, S. (2019). Adults with High-functioning Autism Process Web Pages With Similar Accuracy but Higher Cognitive Effort Compared to Controls.
Coba, L., Rook, L., Zanker, M., & Symeonidis, P. (2019). Decision Making Strategies Differ in the Presence of Collaborative Explanations: Two Conjoint Studies. In Proceedings of the 24th International Conference on Intelligent User Interfaces (pp. 291–302). New York, NY, USA: ACM.
Coba, L., Zanker, M., & Rook, L. (2019). Decision Making Based on Bimodal Rating Summary Statistics - An Eye-Tracking Study of Hotels. In J. Pesonen & J. Neidhardt (Eds.), Information and Communication Technologies in Tourism 2019 (pp. 40–51). Cham: Springer International Publishing.
Volonte, M., Duchowski, A. T., & Babu, S. V. (2019). Effects of a Virtual Human Appearance Fidelity Continuum on Visual Attention in Virtual Reality. In Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents (pp. 141–147). New York, NY, USA: ACM.
Ohligs, M., Pereira, C., Voigt, V., Koeny, M., Janß, A., Rossaint, R., & Czaplik, M. (2019). Evaluation of an Anesthesia Dashboard Functional Model Based on a Manufacturer-Independent Communication Standard: Comparative Feasibility Study. JMIR Human Factors, 6(2), e12553.
Constantinides, A., Belk, M., Fidas, C., & Pitsillides, A. (2019). On the Accuracy of Eye Gaze-driven Classifiers for Predicting Image Content Familiarity in Graphical Passwords. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 201–205). New York, NY, USA: ACM.
Matthews, O., Eraslan, S., Yaneva, V., Davies, A., Yesilada, Y., Vigo, M., & Harper, S. (2019). Combining Trending Scan Paths with Arousal to Model Visual Behaviour on the Web: A Case Study of Neurotypical People vs People with Autism. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 86–94). New York, NY, USA: ACM.
Obaidellah, U., Raschke, M., & Blascheck, T. (2019). Classification of Strategies for Solving Programming Problems using AoI Sequence Analysis.
Duchowski, A., Krejtz, K., Zurawska, J., & House, D. (2019). Using Microsaccades to Estimate Task Difficulty During Visual Search of Layered Surfaces. IEEE Transactions on Visualization and Computer Graphics, 1–1.
Koury, H. F., Leonard, C. J., Carry, P. M., & Lee, L. M. J. (2018). An Expert Derived Feedforward Histology Module Improves Pattern Recognition Efficiency in Novice Students. Anatomical Sciences Education. Advance online publication.
Eraslan, S., Yaneva, V., Yesilada, Y., & Harper, S. (2018). Web users with autism: eye tracking evidence for differences. Behaviour & Information Technology. Advance online publication.
Notaro, G. M., & Diamond, S. G. (2018). Simultaneous EEG, eye-tracking, behavioral, and screen-capture data during online German language learning. Data in Brief, 21, 1937–1943.
Natraj, N., Alterman, B., Basunia, S., & Wheaton, L. A. (2018). The Role of Attention and Saccades on Parietofrontal Encoding of Contextual and Grasp-specific Affordances of Tools: An ERP Study. Neuroscience, 394, 243–266.
Beattie, K. L., & Morrison, B. W. (2018). Navigating the Online World: Gaze, Fixations, and Performance Differences between Younger and Older Users. International Journal of Human–Computer Interaction. Advance online publication.
Meng, J., Streitz, T., Gulachek, N., Suma, D., & He, B. (2018). Three-Dimensional Brain–Computer Interface Control Through Simultaneous Overt Spatial Attentional and Motor Imagery Tasks. IEEE Transactions on Biomedical Engineering, 65(11), 2417–2427.
Raveh, E., Friedman, J., & Portnoy, S. (2018). Visuomotor behaviors and performance in a dual-task paradigm with and without vibrotactile feedback when using a myoelectric controlled hand. Assistive Technology, 30(5), 274–280.
Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., … Bolia, R. (2018). Avionics Human-Machine Interfaces and Interactions for Manned and Unmanned Aircraft. Progress in Aerospace Sciences, 102, 1–46.
Iskander, J., Jia, D., Hettiarachchi, I., Hossny, M., Saleh, K., Nahavandi, S., … Hanoun, S. (2018). Age-Related Effects of Multi-screen Setup on Task Performance and Eye Movement Characteristics. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3480–3485).
Zhou, H., Wei, L., Cao, R., Hanoun, S., Bhatti, A., Tai, Y., & Nahavandi, S. (2018). The Study of Using Eye Movements to Control the Laparoscope Under a Haptically-Enabled Laparoscopic Surgery Simulation Environment. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3022–3026).
Saleh, K., Iskander, J., Jia, D., Hossny, M., Nahavandi, S., Best, C., … Hanoun, S. (2018). Reliable Switching Mechanism for Low Cost Multi-screen Eye Tracking Devices via Deep Recurrent Neural Networks. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3492–3497).
Nishiyama, M., Matsumoto, R., Yoshimura, H., & Iwai, Y. (2018). Extracting discriminative features using task-oriented gaze maps measured from observers for personal attribute classification. Pattern Recognition Letters, 112, 241–248.
Sibley, C., Foroughi, C., Brown, N., & Coyne, J. T. (2018). Low Cost Eye Tracking: Ready for Individual Differences Research? Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 62(1), 741–745.
Kar, A., & Corcoran, P. (2018). Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations. Sensors, 18(9), 3151.
Morasco Junior, M. A. (2018). Parâmetros gráfico-inclusivos para o desenvolvimento de objetos de aprendizagem digitais voltados ao público infantil [Graphic-inclusive parameters for the development of digital learning objects aimed at children].
Ngan, H. F. B., & Yu, C.-E. (2018). To smile or not to smile – an eye-tracking study on service recovery. Current Issues in Tourism. Advance online publication.
Jaikumar, S. (2018). How Do Consumers Choose Sellers In E-Marketplaces?: The Role of Display Price And Sellers’ Review Volume. Journal of Advertising Research, JAR-2018-028.
Aliyev, F., Ürkmez, T., & Wagner, R. (2018). Luxury brands do not glitter equally for everyone. Journal of Brand Management, 25(4), 337–350.
Chang, C., Chen, C., & Lin, Y. (2018). A Visual Interactive Reading System Based on Eye Tracking Technology to Improve Digital Reading Performance. In 2018 7th International Congress on Advanced Applied Informatics (IIAI-AAI) (pp. 182–187).
Tkach, B. (2018). Neuropsychological features of personalities with deviant behavior. Fundamental and Applied Researches in Practice of Leading Scientific Schools, 27(3), 201–206.
Wijayanto, T., Marcilia, S. R., & Lufityanto, G. (2018). Visual Attention, Driving Behavior and Driving Performance among Young Drivers in Sleep-deprived Condition. KnE Life Sciences, 4(5), 424–434.
Antunes, J., & Santana, P. (2018). A Study on the Use of Eye Tracking to Adapt Gameplay and Procedural Content Generation in First-Person Shooter Games. Multimodal Technologies and Interaction, 2(2), 23.