Gazepoint Citations

We are frequently asked for examples of research papers in which Gazepoint eye-tracking technologies are used, and we are happy to provide this list of publications we have found to date. If you have published research that uses a Gazepoint system, please let us know and we will add a citation to your work here.

Sibley, C., Foroughi, C., Brown, N., Drollinger, S., Phillips, H., & Coyne, J. (2021). Augmenting Traditional Performance Analyses with Eye Tracking Metrics. In H. Ayaz & U. Asgher (Eds.), Advances in Neuroergonomics and Cognitive Engineering (pp. 118–125). Springer International Publishing. https://doi.org/10.1007/978-3-030-51041-1_17
Zuo, C., Ding, L., & Meng, L. (2020). A Feasibility Study of Map-Based Dashboard for Spatiotemporal Knowledge Acquisition and Analysis. ISPRS International Journal of Geo-Information, 9(11), 636. https://doi.org/10.3390/ijgi9110636
Pillai, P., Ayare, P., Balasingam, B., Milne, K., & Biondi, F. (2020). Response Time and Eye Tracking Datasets for Activities Demanding Varying Cognitive Load. Data in Brief, 106389. https://doi.org/10.1016/j.dib.2020.106389
Karargyris, A., Kashyap, S., Lourentzou, I., Wu, J., Sharma, A., Tong, M., Abedin, S., Beymer, D., Mukherjee, V., Krupinski, E. A., & Moradi, M. (2020). Creation and Validation of a Chest X-Ray Dataset with Eye-tracking and Report Dictation for AI Development. ArXiv:2009.07386 [Cs]. http://arxiv.org/abs/2009.07386
Pirruccio, M., Monaco, S., Della Libera, C., & Cattaneo, L. (2020). Gaze direction influences grasping actions towards unseen, haptically explored, objects. Scientific Reports, 10(1), 15774. https://doi.org/10.1038/s41598-020-72554-x
Clark, S., & Jasra, S. K. (2020). Detecting Differences Between Concealed and Unconcealed Emotions Using iMotions EMOTIENT. Journal of Emerging Forensic Sciences Research, 5(1), 1–24. https://jefsr.uwindsor.ca/index.php/jefsr/article/view/6376
Pritalia, G. L., Wibirama, S., Adji, T. B., & Kusrohmaniah, S. (2020). Classification of Learning Styles in Multimedia Learning Using Eye-Tracking and Machine Learning. 2020 FORTEI-International Conference on Electrical Engineering (FORTEI-ICEE), 145–150. https://doi.org/10.1109/FORTEI-ICEE50915.2020.9249875
Millán, Y. A., Chaves, M. L., & Barrero, J. C. (2020). A Review on Biometric Devices to be Applied in ASD Interventions. 2020 Congreso Internacional de Innovación y Tendencias En Ingeniería (CONIITI), 1–6. https://doi.org/10.1109/CONIITI51147.2020.9240291
Ngan, H. F. B., Bavik, A., Kuo, C.-F., & Yu, C.-E. (2020). Where you look depends on what you are willing to afford: Eye tracking in menus. Journal of Hospitality & Tourism Research, 1096348020951226. https://doi.org/10.1177/1096348020951226
Shi, M., Ming, H., Liu, Y., Mao, T., Zhu, D., Wang, Z., & Zhang, F. (2020). Saliency-dependent adaptive remeshing for cloth simulation. Textile Research Journal, 0040517520944248. https://doi.org/10.1177/0040517520944248
Volonte, M., Anaraky, R. G., Venkatakrishnan, R., Venkatakrishnan, R., Knijnenburg, B. P., Duchowski, A. T., & Babu, S. V. (2020). Empirical evaluation and pathway modeling of visual attention to virtual humans in an appearance fidelity continuum. Journal on Multimodal User Interfaces. https://doi.org/10.1007/s12193-020-00341-z
Bristol, S., Agostine, S., Dallman, A., Harrop, C., Crais, E., Baranek, G., & Watson, L. (2020). Visual Biases and Attentional Inflexibilities Differentiate Those at Elevated Likelihood of Autism: An Eye-Tracking Study. American Journal of Occupational Therapy, 74(4_Supplement_1), 7411505221p1-7411505221p1. https://doi.org/10.5014/ajot.2020.74S1-PO8133
Kuo, C.-F., Bavik, A., Ngan, H. F. B., & Yu, C.-E. (2020). The sweet spot in the eye of the beholder? Exploring the sweet sour spots of Asian restaurant menus. Journal of Hospitality Marketing & Management, 0(0), 1–16. https://doi.org/10.1080/19368623.2020.1790076
Doerflinger, J. T., & Gollwitzer, P. M. (2020). Emotion emphasis effects in moral judgment are moderated by mindsets. Motivation and Emotion. https://doi.org/10.1007/s11031-020-09847-1
Bhowmik, S., Motin, M. A., Sarossy, M., Radcliffe, P., & Kumar, D. (2020). Sample entropy analysis of pupillary signals in glaucoma patients and control via light-induced pupillometry. 2020 42nd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC), 280–283. https://doi.org/10.1109/EMBC44109.2020.9176558
Ebaid, D., & Crewther, S. G. (2020). The Contribution of Oculomotor Functions to Rates of Visual Information Processing in Younger and Older Adults. Scientific Reports, 10(1), 10129. https://doi.org/10.1038/s41598-020-66773-5
Biondi, F. N., Balasingam, B., & Ayare, P. (2020). On the Cost of Detection Response Task Performance on Cognitive Load. Human Factors, 0018720820931628. https://doi.org/10.1177/0018720820931628
Pires, L. de F. (2020). Master’s students’ post-editing perception and strategies: Exploratory study. FORUM. Revue Internationale d’interprétation et de Traduction / International Journal of Interpretation and Translation, 18(1), 26–44. https://doi.org/10.1075/forum.19014.pir
Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610. https://doi.org/10.1016/j.appet.2020.104610
Kim, S., Pollanen, M., Reynolds, M. G., & Burr, W. S. (2020). Problem Solving as a Path to Comprehension. Mathematics in Computer Science. https://doi.org/10.1007/s11786-020-00457-1
Eraslan, S., Yesilada, Y., Yaneva, V., & Ha, L. A. (2020). “Keep it simple!”: an eye-tracking study for exploring complexity and distinguishability of web pages for people with autism. Universal Access in the Information Society. https://doi.org/10.1007/s10209-020-00708-9
Sulikowski, P., & Zdziebko, T. (2020). Deep Learning-Enhanced Framework for Performance Evaluation of a Recommending Interface with Varied Recommendation Position and Intensity Based on Eye-Tracking Equipment Data Processing. Electronics, 9(2), 266. https://doi.org/10.3390/electronics9020266
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13. https://doi.org/10.3389/fnins.2019.01418
Kővári, A., Katona, J., & Pop, C. (2020). Evaluation of Eye-Movement Metrics in a Software Debugging Task using GP3 Eye Tracker. Acta Polytechnica Hungarica, 17, 57–76. https://doi.org/10.12700/APH.17.2.2020.2.4
Kővári, A., Katona, J., & Pop, C. (2020). Quantitative Analysis of Relationship Between Visual Attention and Eye-Hand Coordination. Acta Polytechnica Hungarica, 17, 77–95. https://doi.org/10.12700/APH.17.2.2020.2.5
Malhotra, A., Sankaran, A., Vatsa, M., Singh, R., Morris, K. B., & Noore, A. (2020). Understanding ACE-V Latent Fingerprint Examination Process via Eye-Gaze Analysis. IEEE Transactions on Biometrics, Behavior, and Identity Science, 1–1. https://doi.org/10.1109/TBIOM.2020.3027144
Knogler, V. (2020). Viewing Behaviour and Task Performance on Austrian Destination Websites: Comparing Generation Y and the Baby Boomers. In M. Rainoldi & M. Jooss (Eds.), Eye Tracking in Tourism (pp. 225–241). Springer International Publishing. https://doi.org/10.1007/978-3-030-49709-5_14
Gwizdka, J., & Dillon, A. (2020). Eye-Tracking as a Method for Enhancing Research on Information Search. In W. T. Fu & H. van Oostendorp (Eds.), Understanding and Improving Information Search: A Cognitive Approach (pp. 161–181). Springer International Publishing. https://doi.org/10.1007/978-3-030-38825-6_9
Acero-Mondragon, E. J., Chaustre-Nieto, L. C., Urdaneta-Paredes, D. A., Cortes-Cabrera, J. A., & Gallego-Correa, J. J. (2020). Left-right pupil diameter difference during radiographic reading of broncopulmonary carcinoma: An exploration with cognitive load among novices and experts. The FASEB Journal, 34(S1), 1–1. https://doi.org/10.1096/fasebj.2020.34.s1.09819
Bottos, S., & Balasingam, B. (2020). Tracking the Progression of Reading Using Eye-Gaze Point Measurements and Hidden Markov Models. IEEE Transactions on Instrumentation and Measurement, 1–1. https://doi.org/10.1109/TIM.2020.2983525
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Springer. https://doi.org/10.1007/978-981-15-1564-4_20
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340. https://doi.org/10.1186/s12883-019-1543-8
Villamor, M. M., & Rodrigo, Ma. M. T. (2019). Gaze collaboration patterns of successful and unsuccessful programming pairs using cross-recurrence quantification analysis. Research and Practice in Technology Enhanced Learning, 14(1), 25. https://doi.org/10.1186/s41039-019-0118-z
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019. https://doi.org/10.1063/1.5137973
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6. https://doi.org/10.7575/aiac.ijkss.v.7n.3p.6
Bottos, S., & Balasingam, B. (2019). A Novel Slip-Kalman Filter to Track the Progression of Reading Through Eye-Gaze Measurements. ArXiv:1907.07232 [Cs, Eess]. http://arxiv.org/abs/1907.07232
Wibirama, S., Santosa, P. I., Widyarani, P., Brilianto, N., & Hafidh, W. (2019). Physical discomfort and eye movements during arbitrary and optical flow-like motions in stereo 3D contents. Virtual Reality. https://doi.org/10.1007/s10055-019-00386-w
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2562–2566. https://doi.org/10.1109/ICASSP.2019.8683757
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery. https://doi.org/10.1007/s11548-019-01964-8
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158. https://doi.org/10.1016/j.appet.2018.11.015
Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement. https://doi.org/10.1016/j.measurement.2019.03.032
Hienert, D., Kern, D., Mitsui, M., Shah, C., & Belkin, N. J. (2019). Reading Protocol: Understanding what has been Read in Interactive Information Retrieval Tasks. ArXiv:1902.04262 [Cs]. http://arxiv.org/abs/1902.04262
Bottos, S., & Balasingam, B. (2019). An Approach to Track Reading Progression Using Eye-Gaze Fixation Points. ArXiv:1902.03322 [Cs]. http://arxiv.org/abs/1902.03322
Jeong, H., & Liu, Y. (2019). Effects of non-driving-related-task modality and road geometry on eye movements, lane-keeping performance, and workload while driving. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 157–171. https://doi.org/10.1016/j.trf.2018.10.015
Calado, J., Marcelino-Jesus, E., Ferreira, F., & Sarraipa, J. (2019). Eye-tracking students’ behaviour for e-learning improvement. 8978–8986. https://doi.org/10.21125/edulearn.2019.2221
Pfarr, J., Ganter, M. T., Spahn, D. R., Noethiger, C. B., & Tscholl, D. W. (2019). Avatar-Based Patient Monitoring With Peripheral Vision: A Multicenter Comparative Eye-Tracking Study. Journal of Medical Internet Research, 21(7), e13041. https://doi.org/10.2196/13041
Kannegieser, E., Atorf, D., & Meier, J. (2019). Conducting an Experiment for Validating the Combined Model of Immersion and Flow. CSEDU. https://doi.org/10.5220/0007688902520259
Neomániová, K., Berčík, J., & Pavelka, A. (2019). The Use of Eye‑Tracker and Face Reader as Useful Consumer Neuroscience Tools Within Logo Creation. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, 67(4), 1061–1070. https://doi.org/10.11118/actaun201967041061
Swift, D., & Schofield, D. (2019). The impact of color on secondary task time while driving. International Journal of Information Technology, 4(1), 19.
Russell, C. (2019). “I Consent”: An Eye-Tracking Study of IRB Informed Consent Forms.