Gazepoint Citations

We are frequently asked for examples of research papers that use Gazepoint eye-tracking technologies, and we are happy to provide the list of publications we have found to date below. If you have published research that uses a Gazepoint system, please let us know and we will add a citation to your work here.

Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610.
Kim, S., Pollanen, M., Reynolds, M. G., & Burr, W. S. (2020). Problem Solving as a Path to Comprehension. Mathematics in Computer Science.
Eraslan, S., Yesilada, Y., Yaneva, V., & Ha, L. A. (2020). “Keep it simple!”: an eye-tracking study for exploring complexity and distinguishability of web pages for people with autism. Universal Access in the Information Society.
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13.
Acero-Mondragon, E. J., Chaustre-Nieto, L. C., Urdaneta-Paredes, D. A., Cortes-Cabrera, J. A., & Gallego-Correa, J. J. (2020). Left-Right Pupil Diameter Difference During Radiographic Reading of Bronchopulmonary Carcinoma: An Exploration with Cognitive Load Among Novices and Experts. The FASEB Journal, 34(S1), 1–1.
Bottos, S., & Balasingam, B. (2020). Tracking the Progression of Reading Using Eye-Gaze Point Measurements and Hidden Markov Models. IEEE Transactions on Instrumentation and Measurement, 1–1.
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Singapore: Springer.
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340.
Villamor, M. M., & Rodrigo, M. M. T. (2019). Gaze collaboration patterns of successful and unsuccessful programming pairs using cross-recurrence quantification analysis. Research and Practice in Technology Enhanced Learning, 14(1), 25.
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019.
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6.
Bottos, S., & Balasingam, B. (2019). A Novel Slip-Kalman Filter to Track the Progression of Reading Through Eye-Gaze Measurements. ArXiv:1907.07232 [Cs, Eess].
Wibirama, S., Santosa, P. I., Widyarani, P., Brilianto, N., & Hafidh, W. (2019). Physical discomfort and eye movements during arbitrary and optical flow-like motions in stereo 3D contents. Virtual Reality.
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 2562–2566).
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery.
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158.
Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement.
Hienert, D., Kern, D., Mitsui, M., Shah, C., & Belkin, N. J. (2019). Reading Protocol: Understanding what has been Read in Interactive Information Retrieval Tasks. ArXiv:1902.04262 [Cs].
Bottos, S., & Balasingam, B. (2019). An Approach to Track Reading Progression Using Eye-Gaze Fixation Points. ArXiv:1902.03322 [Cs].
Jeong, H., & Liu, Y. (2019). Effects of non-driving-related-task modality and road geometry on eye movements, lane-keeping performance, and workload while driving. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 157–171.
Pfarr, J., Ganter, M. T., Spahn, D. R., Noethiger, C. B., & Tscholl, D. W. (2019). Avatar-Based Patient Monitoring With Peripheral Vision: A Multicenter Comparative Eye-Tracking Study. Journal of Medical Internet Research, 21(7), e13041.
Kannegieser, E., Atorf, D., & Meier, J. (2019). Conducting an Experiment for Validating the Combined Model of Immersion and Flow. In CSEDU.
Neomániová, K., Berčík, J., & Pavelka, A. (2019). The Use of Eye-Tracker and Face Reader as Useful Consumer Neuroscience Tools Within Logo Creation. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, 67(4), 1061–1070.
Swift, D., & Schofield, D. (2019). The Impact of Color on Secondary Task Time While Driving. International Journal of Information Technology, 4(1), 19.
Russell, C. (2019). “I Consent”: An Eye-Tracking Study of IRB Informed Consent Forms.
Ćosić, K., Popović, S., Šarlija, M., Mijić, I., Kokot, M., Kesedžić, I., … Zhang, Q. (2019). New Tools and Methods in Selection of Air Traffic Controllers Based on Multimodal Psychophysiological Measurements. IEEE Access, 7, 174873–174888.
Constantinides, A., Fidas, C., Belk, M., & Pitsillides, A. (2019). “I Recall This Picture”: Understanding Picture Password Selections Based on Users’ Sociocultural Experiences. In IEEE/WIC/ACM International Conference on Web Intelligence (pp. 408–412). New York, NY, USA: ACM.
Gupta, V., Chakraborty, T., Agarwal, M., Singh, R., Arora, M., & Vatsa, M. (2019). Bag-of-Lies: A Multimodal Dataset for Deception Detection, 8.
Yaneva, V., & Eraslan, S. (2019). Adults with High-functioning Autism Process Web Pages With Similar Accuracy but Higher Cognitive Effort Compared to Controls, 4.
Coba, L., Rook, L., Zanker, M., & Symeonidis, P. (2019). Decision Making Strategies Differ in the Presence of Collaborative Explanations: Two Conjoint Studies. In Proceedings of the 24th International Conference on Intelligent User Interfaces (pp. 291–302). New York, NY, USA: ACM.
Coba, L., Zanker, M., & Rook, L. (2019). Decision Making Based on Bimodal Rating Summary Statistics - An Eye-Tracking Study of Hotels. In J. Pesonen & J. Neidhardt (Eds.), Information and Communication Technologies in Tourism 2019 (pp. 40–51). Cham: Springer International Publishing.
Volonte, M., Duchowski, A. T., & Babu, S. V. (2019). Effects of a Virtual Human Appearance Fidelity Continuum on Visual Attention in Virtual Reality. In Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents (pp. 141–147). New York, NY, USA: ACM.
Ohligs, M., Pereira, C., Voigt, V., Koeny, M., Janß, A., Rossaint, R., & Czaplik, M. (2019). Evaluation of an Anesthesia Dashboard Functional Model Based on a Manufacturer-Independent Communication Standard: Comparative Feasibility Study. JMIR Human Factors, 6(2), e12553.
Constantinides, A., Belk, M., Fidas, C., & Pitsillides, A. (2019). On the Accuracy of Eye Gaze-driven Classifiers for Predicting Image Content Familiarity in Graphical Passwords. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 201–205). New York, NY, USA: ACM.
Matthews, O., Eraslan, S., Yaneva, V., Davies, A., Yesilada, Y., Vigo, M., & Harper, S. (2019). Combining Trending Scan Paths with Arousal to Model Visual Behaviour on the Web: A Case Study of Neurotypical People vs People with Autism. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 86–94). New York, NY, USA: ACM.
Obaidellah, U., Raschke, M., & Blascheck, T. (2019). Classification of Strategies for Solving Programming Problems using AoI Sequence Analysis, 10.
Duchowski, A., Krejtz, K., Zurawska, J., & House, D. (2019). Using Microsaccades to Estimate Task Difficulty During Visual Search of Layered Surfaces. IEEE Transactions on Visualization and Computer Graphics, 1–1.
Koury, H. F., Leonard, C. J., Carry, P. M., & Lee, L. M. J. (2018). An Expert Derived Feedforward Histology Module Improves Pattern Recognition Efficiency in Novice Students. Anatomical Sciences Education, 0(0).
Eraslan, S., Yaneva, V., Yesilada, Y., & Harper, S. (2018). Web users with autism: eye tracking evidence for differences. Behaviour & Information Technology, 0(0), 1–23.
Notaro, G. M., & Diamond, S. G. (2018). Simultaneous EEG, eye-tracking, behavioral, and screen-capture data during online German language learning. Data in Brief, 21, 1937–1943.
Natraj, N., Alterman, B., Basunia, S., & Wheaton, L. A. (2018). The Role of Attention and Saccades on Parietofrontal Encoding of Contextual and Grasp-specific Affordances of Tools: An ERP Study. Neuroscience, 394, 243–266.
Beattie, K. L., & Morrison, B. W. (2018). Navigating the Online World: Gaze, Fixations, and Performance Differences between Younger and Older Users. International Journal of Human–Computer Interaction, 0(0), 1–14.
Meng, J., Streitz, T., Gulachek, N., Suma, D., & He, B. (2018). Three-Dimensional Brain–Computer Interface Control Through Simultaneous Overt Spatial Attentional and Motor Imagery Tasks. IEEE Transactions on Biomedical Engineering, 65(11), 2417–2427.
Raveh, E., Friedman, J., & Portnoy, S. (2018). Visuomotor behaviors and performance in a dual-task paradigm with and without vibrotactile feedback when using a myoelectric controlled hand. Assistive Technology, 30(5), 274–280.
Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., … Bolia, R. (2018). Avionics Human-Machine Interfaces and Interactions for Manned and Unmanned Aircraft. Progress in Aerospace Sciences, 102, 1–46.
Iskander, J., Jia, D., Hettiarachchi, I., Hossny, M., Saleh, K., Nahavandi, S., … Hanoun, S. (2018). Age-Related Effects of Multi-screen Setup on Task Performance and Eye Movement Characteristics. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3480–3485).
Zhou, H., Wei, L., Cao, R., Hanoun, S., Bhatti, A., Tai, Y., & Nahavandi, S. (2018). The Study of Using Eye Movements to Control the Laparoscope Under a Haptically-Enabled Laparoscopic Surgery Simulation Environment. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3022–3026).
Saleh, K., Iskander, J., Jia, D., Hossny, M., Nahavandi, S., Best, C., … Hanoun, S. (2018). Reliable Switching Mechanism for Low Cost Multi-screen Eye Tracking Devices via Deep Recurrent Neural Networks. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3492–3497).