Gazepoint Citations

We are frequently asked whether we can provide examples of research papers in which Gazepoint eye-tracking technologies are used. We are happy to share the list of publications we have found to date. If you have published research that uses a Gazepoint system, please let us know and we will add a citation to your work here.

Sibley, C., Foroughi, C., Brown, N., Drollinger, S., Phillips, H., & Coyne, J. (2021). Augmenting Traditional Performance Analyses with Eye Tracking Metrics. In H. Ayaz & U. Asgher (Eds.), Advances in Neuroergonomics and Cognitive Engineering (pp. 118–125). Springer International Publishing. https://doi.org/10.1007/978-3-030-51041-1_17
Lee, T. L., Yeung, M. K., Sze, S. L., & Chan, A. S. (2020). Computerized Eye-Tracking Training Improves the Saccadic Eye Movements of Children with Attention-Deficit/Hyperactivity Disorder. Brain Sciences, 10(12). https://doi.org/10.3390/brainsci10121016
Brand, J., Diamond, S. G., Thomas, N., & Gilbert-Diamond, D. (2020). Evaluating the data quality of the Gazepoint GP3 low-cost eye tracker when used independently by study participants. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01504-2
Hong, W. C. H., Ngan, H. F. B., Yu, J., & Zhao, Y. (2020). An eye-tracking study of exoticism in intra-national destinations in the Greater Bay area of China. Tourism Recreation Research, 0(0), 1–14. https://doi.org/10.1080/02508281.2020.1846431
Destyanto, T. Y. R., & Lin, R. F. (2020). Detecting computer activities using eye-movement features. Journal of Ambient Intelligence and Humanized Computing. https://doi.org/10.1007/s12652-020-02683-8
Jovančić, K., Milić Keresteš, N., & Nedeljković, U. (2020). Influence of white space on text scanning. Proceedings - The Tenth International Symposium GRID 2020, 699–706. https://doi.org/10.24867/GRID-2020-p79
Vladić, G., Mijatović, S., Bošnjaković, G., Jurič, I., & Dimovski, V. (2020). Analysis of the loading animation performance and viewer perception. Proceedings - The Tenth International Symposium GRID 2020, 667–675. https://doi.org/10.24867/GRID-2020-p76
Zuo, C., Ding, L., & Meng, L. (2020). A Feasibility Study of Map-Based Dashboard for Spatiotemporal Knowledge Acquisition and Analysis. ISPRS International Journal of Geo-Information, 9(11), 636. https://doi.org/10.3390/ijgi9110636
Chauhan, H., Prasad, A., & Shukla, J. (2020). Engagement Analysis of ADHD Students using Visual Cues from Eye Tracker. Companion Publication of the 2020 International Conference on Multimodal Interaction, 27–31. https://doi.org/10.1145/3395035.3425256
Karpova, V., Popenova, P., Glebko, N., Lyashenko, V., & Perepelkina, O. (2020). “Was It You Who Stole 500 Rubles?” - The Multimodal Deception Detection. Companion Publication of the 2020 International Conference on Multimodal Interaction, 112–119. https://doi.org/10.1145/3395035.3425638
Pillai, P., Ayare, P., Balasingam, B., Milne, K., & Biondi, F. (2020). Response Time and Eye Tracking Datasets for Activities Demanding Varying Cognitive Load. Data in Brief, 106389. https://doi.org/10.1016/j.dib.2020.106389
Karargyris, A., Kashyap, S., Lourentzou, I., Wu, J., Sharma, A., Tong, M., Abedin, S., Beymer, D., Mukherjee, V., Krupinski, E. A., & Moradi, M. (2020). Creation and Validation of a Chest X-Ray Dataset with Eye-tracking and Report Dictation for AI Development. ArXiv:2009.07386 [Cs]. http://arxiv.org/abs/2009.07386
Crameri, L., Hettiarachchi, I., & Hanoun, S. (2020). Feasibility Study of Skin Conductance Response for Quantifying Individual Dynamic Resilience. 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 1764–1771. https://doi.org/10.1109/SMC42975.2020.9283300
Pirruccio, M., Monaco, S., Della Libera, C., & Cattaneo, L. (2020). Gaze direction influences grasping actions towards unseen, haptically explored, objects. Scientific Reports, 10(1), 15774. https://doi.org/10.1038/s41598-020-72554-x
Clark, S., & Jasra, S. K. (2020). Detecting Differences Between Concealed and Unconcealed Emotions Using iMotions EMOTIENT. Journal of Emerging Forensic Sciences Research, 5(1), 1–24. https://jefsr.uwindsor.ca/index.php/jefsr/article/view/6376
Pritalia, G. L., Wibirama, S., Adji, T. B., & Kusrohmaniah, S. (2020). Classification of Learning Styles in Multimedia Learning Using Eye-Tracking and Machine Learning. 2020 FORTEI-International Conference on Electrical Engineering (FORTEI-ICEE), 145–150. https://doi.org/10.1109/FORTEI-ICEE50915.2020.9249875
Millán, Y. A., Chaves, M. L., & Barrero, J. C. (2020). A Review on Biometric Devices to be Applied in ASD Interventions. 2020 Congreso Internacional de Innovación y Tendencias En Ingeniería (CONIITI), 1–6. https://doi.org/10.1109/CONIITI51147.2020.9240291
Ngan, H. F. B., Bavik, A., Kuo, C.-F., & Yu, C.-E. (2020). Where you look depends on what you are willing to afford: Eye tracking in menus. Journal of Hospitality & Tourism Research, 1096348020951226. https://doi.org/10.1177/1096348020951226
Shi, M., Ming, H., Liu, Y., Mao, T., Zhu, D., Wang, Z., & Zhang, F. (2020). Saliency-dependent adaptive remeshing for cloth simulation. Textile Research Journal, 0040517520944248. https://doi.org/10.1177/0040517520944248
Volonte, M., Anaraky, R. G., Venkatakrishnan, R., Venkatakrishnan, R., Knijnenburg, B. P., Duchowski, A. T., & Babu, S. V. (2020). Empirical evaluation and pathway modeling of visual attention to virtual humans in an appearance fidelity continuum. Journal on Multimodal User Interfaces. https://doi.org/10.1007/s12193-020-00341-z
Bristol, S., Agostine, S., Dallman, A., Harrop, C., Crais, E., Baranek, G., & Watson, L. (2020). Visual Biases and Attentional Inflexibilities Differentiate Those at Elevated Likelihood of Autism: An Eye-Tracking Study. American Journal of Occupational Therapy, 74(4_Supplement_1), 7411505221p1-7411505221p1. https://doi.org/10.5014/ajot.2020.74S1-PO8133
Kuo, C.-F., Bavik, A., Ngan, H. F. B., & Yu, C.-E. (2020). The sweet spot in the eye of the beholder? Exploring the sweet sour spots of Asian restaurant menus. Journal of Hospitality Marketing & Management, 0(0), 1–16. https://doi.org/10.1080/19368623.2020.1790076
Doerflinger, J. T., & Gollwitzer, P. M. (2020). Emotion emphasis effects in moral judgment are moderated by mindsets. Motivation and Emotion. https://doi.org/10.1007/s11031-020-09847-1
Bhowmik, S., Motin, M. A., Sarossy, M., Radcliffe, P., & Kumar, D. (2020). Sample entropy analysis of pupillary signals in glaucoma patients and control via light-induced pupillometry. 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), 280–283. https://doi.org/10.1109/EMBC44109.2020.9176558
Ebaid, D., & Crewther, S. G. (2020). The Contribution of Oculomotor Functions to Rates of Visual Information Processing in Younger and Older Adults. Scientific Reports, 10(1), 10129. https://doi.org/10.1038/s41598-020-66773-5
Biondi, F. N., Balasingam, B., & Ayare, P. (2020). On the Cost of Detection Response Task Performance on Cognitive Load. Human Factors, 0018720820931628. https://doi.org/10.1177/0018720820931628
Pires, L. de F. (2020). Master’s students’ post-editing perception and strategies: Exploratory study. FORUM. Revue Internationale d’interprétation et de Traduction / International Journal of Interpretation and Translation, 18(1), 26–44. https://doi.org/10.1075/forum.19014.pir
Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610. https://doi.org/10.1016/j.appet.2020.104610
Kim, S., Pollanen, M., Reynolds, M. G., & Burr, W. S. (2020). Problem Solving as a Path to Comprehension. Mathematics in Computer Science. https://doi.org/10.1007/s11786-020-00457-1
Eraslan, S., Yesilada, Y., Yaneva, V., & Ha, L. A. (2020). “Keep it simple!”: an eye-tracking study for exploring complexity and distinguishability of web pages for people with autism. Universal Access in the Information Society. https://doi.org/10.1007/s10209-020-00708-9
Sulikowski, P., & Zdziebko, T. (2020). Deep Learning-Enhanced Framework for Performance Evaluation of a Recommending Interface with Varied Recommendation Position and Intensity Based on Eye-Tracking Equipment Data Processing. Electronics, 9(2), 266. https://doi.org/10.3390/electronics9020266
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13. https://doi.org/10.3389/fnins.2019.01418
Kővári, A., Katona, J., & Pop, C. (2020). Evaluation of Eye-Movement Metrics in a Software Debugging Task using GP3 Eye Tracker. Acta Polytechnica Hungarica, 17, 57–76. https://doi.org/10.12700/APH.17.2.2020.2.4
Kővári, A., Katona, J., & Pop, C. (2020). Quantitative Analysis of Relationship Between Visual Attention and Eye-Hand Coordination. Acta Polytechnica Hungarica, 17, 77–95. https://doi.org/10.12700/APH.17.2.2020.2.5
Furukado, R., Hagiwara, G., Ito, T., & Isogai, H. (2020). Comparison of EEG biofeedback and visual search strategies during e-sports play according to skill level.
Malhotra, A., Sankaran, A., Vatsa, M., Singh, R., Morris, K. B., & Noore, A. (2020). Understanding ACE-V Latent Fingerprint Examination Process via Eye-Gaze Analysis. IEEE Transactions on Biometrics, Behavior, and Identity Science, 1–1. https://doi.org/10.1109/TBIOM.2020.3027144
Knogler, V. (2020). Viewing Behaviour and Task Performance on Austrian Destination Websites: Comparing Generation Y and the Baby Boomers. In M. Rainoldi & M. Jooss (Eds.), Eye Tracking in Tourism (pp. 225–241). Springer International Publishing. https://doi.org/10.1007/978-3-030-49709-5_14
Gwizdka, J., & Dillon, A. (2020). Eye-Tracking as a Method for Enhancing Research on Information Search. In W. T. Fu & H. van Oostendorp (Eds.), Understanding and Improving Information Search: A Cognitive Approach (pp. 161–181). Springer International Publishing. https://doi.org/10.1007/978-3-030-38825-6_9
Acero-Mondragon, E. J., Chaustre-Nieto, L. C., Urdaneta-Paredes, D. A., Cortes-Cabrera, J. A., & Gallego-Correa, J. J. (2020). Left-Right Pupil Diameter Difference During Radiographic Reading of Broncopulmonary Carcinoma: An Exploration with Cognitive Load Among Novices and Experts. The FASEB Journal, 34(S1), 1–1. https://doi.org/10.1096/fasebj.2020.34.s1.09819
Bottos, S., & Balasingam, B. (2020). Tracking the Progression of Reading Using Eye-Gaze Point Measurements and Hidden Markov Models. IEEE Transactions on Instrumentation and Measurement, 1–1. https://doi.org/10.1109/TIM.2020.2983525
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Springer. https://doi.org/10.1007/978-981-15-1564-4_20
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340. https://doi.org/10.1186/s12883-019-1543-8
Villamor, M. M., & Rodrigo, Ma. M. T. (2019). Gaze collaboration patterns of successful and unsuccessful programming pairs using cross-recurrence quantification analysis. Research and Practice in Technology Enhanced Learning, 14(1), 25. https://doi.org/10.1186/s41039-019-0118-z
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019. https://doi.org/10.1063/1.5137973
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6. https://doi.org/10.7575/aiac.ijkss.v.7n.3p.6
Bottos, S., & Balasingam, B. (2019). A Novel Slip-Kalman Filter to Track the Progression of Reading Through Eye-Gaze Measurements. ArXiv:1907.07232 [Cs, Eess]. http://arxiv.org/abs/1907.07232
Wibirama, S., Santosa, P. I., Widyarani, P., Brilianto, N., & Hafidh, W. (2019). Physical discomfort and eye movements during arbitrary and optical flow-like motions in stereo 3D contents. Virtual Reality. https://doi.org/10.1007/s10055-019-00386-w
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2562–2566. https://doi.org/10.1109/ICASSP.2019.8683757
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery. https://doi.org/10.1007/s11548-019-01964-8
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158. https://doi.org/10.1016/j.appet.2018.11.015