Gazepoint Citations

We are frequently asked whether we can provide examples of research papers that use Gazepoint eye-tracking technologies. We are happy to share the list of publications we have found to date. If you have published research using a Gazepoint system, please let us know and we will add a citation to your work here.

Sibley, C., Foroughi, C., Brown, N., Drollinger, S., Phillips, H., & Coyne, J. (2021). Augmenting Traditional Performance Analyses with Eye Tracking Metrics. In H. Ayaz & U. Asgher (Eds.), Advances in Neuroergonomics and Cognitive Engineering (pp. 118–125). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-51041-1_17
Ebaid, D., & Crewther, S. G. (2020). The Contribution of Oculomotor Functions to Rates of Visual Information Processing in Younger and Older Adults. Scientific Reports, 10(1), 10129. https://doi.org/10.1038/s41598-020-66773-5
Biondi, F. N., Balasingam, B., & Ayare, P. (2020). On the Cost of Detection Response Task Performance on Cognitive Load. Human Factors, 0018720820931628. https://doi.org/10.1177/0018720820931628
Pires, L. de F. (2020). Master’s students’ post-editing perception and strategies: Exploratory study. FORUM. Revue Internationale d’interprétation et de Traduction / International Journal of Interpretation and Translation, 18(1), 26–44. https://doi.org/10.1075/forum.19014.pir
Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610. https://doi.org/10.1016/j.appet.2020.104610
Kim, S., Pollanen, M., Reynolds, M. G., & Burr, W. S. (2020). Problem Solving as a Path to Comprehension. Mathematics in Computer Science. https://doi.org/10.1007/s11786-020-00457-1
Eraslan, S., Yesilada, Y., Yaneva, V., & Ha, L. A. (2020). “Keep it simple!”: an eye-tracking study for exploring complexity and distinguishability of web pages for people with autism. Universal Access in the Information Society. https://doi.org/10.1007/s10209-020-00708-9
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13. https://doi.org/10.3389/fnins.2019.01418
Gwizdka, J., & Dillon, A. (2020). Eye-Tracking as a Method for Enhancing Research on Information Search. In W. T. Fu & H. van Oostendorp (Eds.), Understanding and Improving Information Search: A Cognitive Approach (pp. 161–181). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-38825-6_9
Acero-Mondragon, E. J., Chaustre-Nieto, L. C., Urdaneta-Paredes, D. A., Cortes-Cabrera, J. A., & Gallego-Correa, J. J. (2020). Left-Right Pupil Diameter Difference During Radiographic Reading of Bronchopulmonary Carcinoma: An Exploration with Cognitive Load Among Novices and Experts. The FASEB Journal, 34(S1), 1–1. https://doi.org/10.1096/fasebj.2020.34.s1.09819
Bottos, S., & Balasingam, B. (2020). Tracking the Progression of Reading Using Eye-Gaze Point Measurements and Hidden Markov Models. IEEE Transactions on Instrumentation and Measurement, 1–1. https://doi.org/10.1109/TIM.2020.2983525
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Singapore: Springer. https://doi.org/10.1007/978-981-15-1564-4_20
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340. https://doi.org/10.1186/s12883-019-1543-8
Villamor, M. M., & Rodrigo, M. M. T. (2019). Gaze collaboration patterns of successful and unsuccessful programming pairs using cross-recurrence quantification analysis. Research and Practice in Technology Enhanced Learning, 14(1), 25. https://doi.org/10.1186/s41039-019-0118-z
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019. https://doi.org/10.1063/1.5137973
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6. https://doi.org/10.7575/aiac.ijkss.v.7n.3p.6
Bottos, S., & Balasingam, B. (2019). A Novel Slip-Kalman Filter to Track the Progression of Reading Through Eye-Gaze Measurements. arXiv:1907.07232 [cs, eess]. Retrieved from http://arxiv.org/abs/1907.07232
Wibirama, S., Santosa, P. I., Widyarani, P., Brilianto, N., & Hafidh, W. (2019). Physical discomfort and eye movements during arbitrary and optical flow-like motions in stereo 3D contents. Virtual Reality. https://doi.org/10.1007/s10055-019-00386-w
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 2562–2566). https://doi.org/10.1109/ICASSP.2019.8683757
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery. https://doi.org/10.1007/s11548-019-01964-8
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158. https://doi.org/10.1016/j.appet.2018.11.015
Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement. https://doi.org/10.1016/j.measurement.2019.03.032
Hienert, D., Kern, D., Mitsui, M., Shah, C., & Belkin, N. J. (2019). Reading Protocol: Understanding what has been Read in Interactive Information Retrieval Tasks. arXiv:1902.04262 [cs]. Retrieved from http://arxiv.org/abs/1902.04262
Bottos, S., & Balasingam, B. (2019). An Approach to Track Reading Progression Using Eye-Gaze Fixation Points. arXiv:1902.03322 [cs]. Retrieved from http://arxiv.org/abs/1902.03322
Jeong, H., & Liu, Y. (2019). Effects of non-driving-related-task modality and road geometry on eye movements, lane-keeping performance, and workload while driving. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 157–171. https://doi.org/10.1016/j.trf.2018.10.015
Calado, J., Marcelino-Jesus, E., Ferreira, F., & Sarraipa, J. (2019). Eye-Tracking Students' Behaviour for E-Learning Improvement (pp. 8978–8986). Presented at the 11th International Conference on Education and New Learning Technologies, Palma, Spain. https://doi.org/10.21125/edulearn.2019.2221
Pfarr, J., Ganter, M. T., Spahn, D. R., Noethiger, C. B., & Tscholl, D. W. (2019). Avatar-Based Patient Monitoring With Peripheral Vision: A Multicenter Comparative Eye-Tracking Study. Journal of Medical Internet Research, 21(7), e13041. https://doi.org/10.2196/13041
Kannegieser, E., Atorf, D., & Meier, J. (2019). Conducting an Experiment for Validating the Combined Model of Immersion and Flow. In CSEDU. https://doi.org/10.5220/0007688902520259
Neomániová, K., Berčík, J., & Pavelka, A. (2019). The Use of Eye‑Tracker and Face Reader as Useful Consumer Neuroscience Tools Within Logo Creation. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, 67(4), 1061–1070. https://doi.org/10.11118/actaun201967041061
Swift, D., & Schofield, D. (2019). The Impact of Color on Secondary Task Time While Driving. International Journal of Information Technology, 4(1), 19.
Russell, C. (2019). "I Consent": An Eye-Tracking Study of IRB Informed Consent Forms.
Ćosić, K., Popović, S., Šarlija, M., Mijić, I., Kokot, M., Kesedžić, I., … Zhang, Q. (2019). New Tools and Methods in Selection of Air Traffic Controllers Based on Multimodal Psychophysiological Measurements. IEEE Access, 7, 174873–174888. https://doi.org/10.1109/ACCESS.2019.2957357
Constantinides, A., Fidas, C., Belk, M., & Pitsillides, A. (2019). “I Recall This Picture”: Understanding Picture Password Selections Based on Users’ Sociocultural Experiences. In IEEE/WIC/ACM International Conference on Web Intelligence (pp. 408–412). New York, NY, USA: ACM. https://doi.org/10.1145/3350546.3352557
Gupta, V., Chakraborty, T., Agarwal, M., Singh, R., Arora, M., & Vatsa, M. (2019). Bag-of-Lies: A Multimodal Dataset for Deception Detection.
Yaneva, V., & Eraslan, S. (2019). Adults with High-functioning Autism Process Web Pages With Similar Accuracy but Higher Cognitive Effort Compared to Controls.
Coba, L., Rook, L., Zanker, M., & Symeonidis, P. (2019). Decision Making Strategies Differ in the Presence of Collaborative Explanations: Two Conjoint Studies. In Proceedings of the 24th International Conference on Intelligent User Interfaces (pp. 291–302). New York, NY, USA: ACM. https://doi.org/10.1145/3301275.3302304
Coba, L., Zanker, M., & Rook, L. (2019). Decision Making Based on Bimodal Rating Summary Statistics - An Eye-Tracking Study of Hotels. In J. Pesonen & J. Neidhardt (Eds.), Information and Communication Technologies in Tourism 2019 (pp. 40–51). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-05940-8_4
Volonte, M., Duchowski, A. T., & Babu, S. V. (2019). Effects of a Virtual Human Appearance Fidelity Continuum on Visual Attention in Virtual Reality. In Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents (pp. 141–147). New York, NY, USA: ACM. https://doi.org/10.1145/3308532.3329461
Ohligs, M., Pereira, C., Voigt, V., Koeny, M., Janß, A., Rossaint, R., & Czaplik, M. (2019). Evaluation of an Anesthesia Dashboard Functional Model Based on a Manufacturer-Independent Communication Standard: Comparative Feasibility Study. JMIR Human Factors, 6(2), e12553. https://doi.org/10.2196/12553
Constantinides, A., Belk, M., Fidas, C., & Pitsillides, A. (2019). On the Accuracy of Eye Gaze-driven Classifiers for Predicting Image Content Familiarity in Graphical Passwords. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 201–205). New York, NY, USA: ACM. https://doi.org/10.1145/3320435.3320474
Matthews, O., Eraslan, S., Yaneva, V., Davies, A., Yesilada, Y., Vigo, M., & Harper, S. (2019). Combining Trending Scan Paths with Arousal to Model Visual Behaviour on the Web: A Case Study of Neurotypical People vs People with Autism. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 86–94). New York, NY, USA: ACM. https://doi.org/10.1145/3320435.3320446
Obaidellah, U., Raschke, M., & Blascheck, T. (2019). Classification of Strategies for Solving Programming Problems using AoI Sequence Analysis.
Duchowski, A., Krejtz, K., Zurawska, J., & House, D. (2019). Using Microsaccades to Estimate Task Difficulty During Visual Search of Layered Surfaces. IEEE Transactions on Visualization and Computer Graphics, 1–1. https://doi.org/10.1109/TVCG.2019.2901881
Koury, H. F., Leonard, C. J., Carry, P. M., & Lee, L. M. J. (2018). An Expert Derived Feedforward Histology Module Improves Pattern Recognition Efficiency in Novice Students. Anatomical Sciences Education, 0(0). https://doi.org/10.1002/ase.1854
Eraslan, S., Yaneva, V., Yesilada, Y., & Harper, S. (2018). Web users with autism: eye tracking evidence for differences. Behaviour & Information Technology, 0(0), 1–23. https://doi.org/10.1080/0144929X.2018.1551933
Notaro, G. M., & Diamond, S. G. (2018). Simultaneous EEG, eye-tracking, behavioral, and screen-capture data during online German language learning. Data in Brief, 21, 1937–1943. https://doi.org/10.1016/j.dib.2018.11.044
Natraj, N., Alterman, B., Basunia, S., & Wheaton, L. A. (2018). The Role of Attention and Saccades on Parietofrontal Encoding of Contextual and Grasp-specific Affordances of Tools: An ERP Study. Neuroscience, 394, 243–266. https://doi.org/10.1016/j.neuroscience.2018.10.019
Beattie, K. L., & Morrison, B. W. (2018). Navigating the Online World: Gaze, Fixations, and Performance Differences between Younger and Older Users. International Journal of Human–Computer Interaction, 0(0), 1–14. https://doi.org/10.1080/10447318.2018.1541545
Meng, J., Streitz, T., Gulachek, N., Suma, D., & He, B. (2018). Three-Dimensional Brain–Computer Interface Control Through Simultaneous Overt Spatial Attentional and Motor Imagery Tasks. IEEE Transactions on Biomedical Engineering, 65(11), 2417–2427. https://doi.org/10.1109/TBME.2018.2872855
Yaman, C., Küçün, N. T., Güngör, S., & Eroğlu, S. (2018). The Contextual Effect and Measurement of Attention to Advertisements via Eye Tracking Method. Journal of Life Economics, 5(4), 221–232. https://doi.org/10.15637/jlecon.271