Gazepoint Citations

We are frequently asked whether we can provide examples of research papers in which Gazepoint eye-tracking technologies are used. We are happy to share the list of publications we have found to date. If you have published research that uses the Gazepoint system, please let us know and we will add a citation to your work here.

Sibley, C., Foroughi, C., Brown, N., Drollinger, S., Phillips, H., & Coyne, J. (2021). Augmenting Traditional Performance Analyses with Eye Tracking Metrics. In H. Ayaz & U. Asgher (Eds.), Advances in Neuroergonomics and Cognitive Engineering (pp. 118–125). Springer International Publishing.
Zuo, C., Ding, L., & Meng, L. (2020). A Feasibility Study of Map-Based Dashboard for Spatiotemporal Knowledge Acquisition and Analysis. ISPRS International Journal of Geo-Information, 9(11), 636.
Pillai, P., Ayare, P., Balasingam, B., Milne, K., & Biondi, F. (2020). Response Time and Eye Tracking Datasets for Activities Demanding Varying Cognitive Load. Data in Brief, 106389.
Karargyris, A., Kashyap, S., Lourentzou, I., Wu, J., Sharma, A., Tong, M., Abedin, S., Beymer, D., Mukherjee, V., Krupinski, E. A., & Moradi, M. (2020). Creation and Validation of a Chest X-Ray Dataset with Eye-tracking and Report Dictation for AI Development. arXiv:2009.07386 [cs].
Pirruccio, M., Monaco, S., Della Libera, C., & Cattaneo, L. (2020). Gaze direction influences grasping actions towards unseen, haptically explored, objects. Scientific Reports, 10(1), 15774.
Clark, S., & Jasra, S. K. (2020). Detecting Differences Between Concealed and Unconcealed Emotions Using iMotions EMOTIENT. Journal of Emerging Forensic Sciences Research, 5(1), 1–24.
Pritalia, G. L., Wibirama, S., Adji, T. B., & Kusrohmaniah, S. (2020). Classification of Learning Styles in Multimedia Learning Using Eye-Tracking and Machine Learning. 2020 FORTEI-International Conference on Electrical Engineering (FORTEI-ICEE), 145–150.
Millán, Y. A., Chaves, M. L., & Barrero, J. C. (2020). A Review on Biometric Devices to be Applied in ASD Interventions. 2020 Congreso Internacional de Innovación y Tendencias En Ingeniería (CONIITI), 1–6.
Ngan, H. F. B., Bavik, A., Kuo, C.-F., & Yu, C.-E. (2020). Where you look depends on what you are willing to afford: Eye tracking in menus. Journal of Hospitality & Tourism Research, 1096348020951226.
Shi, M., Ming, H., Liu, Y., Mao, T., Zhu, D., Wang, Z., & Zhang, F. (2020). Saliency-dependent adaptive remeshing for cloth simulation. Textile Research Journal, 0040517520944248.
Volonte, M., Anaraky, R. G., Venkatakrishnan, R., Venkatakrishnan, R., Knijnenburg, B. P., Duchowski, A. T., & Babu, S. V. (2020). Empirical evaluation and pathway modeling of visual attention to virtual humans in an appearance fidelity continuum. Journal on Multimodal User Interfaces.
Bristol, S., Agostine, S., Dallman, A., Harrop, C., Crais, E., Baranek, G., & Watson, L. (2020). Visual Biases and Attentional Inflexibilities Differentiate Those at Elevated Likelihood of Autism: An Eye-Tracking Study. American Journal of Occupational Therapy, 74(4_Supplement_1), 7411505221p1-7411505221p1.
Kuo, C.-F., Bavik, A., Ngan, H. F. B., & Yu, C.-E. (2020). The sweet spot in the eye of the beholder? Exploring the sweet sour spots of Asian restaurant menus. Journal of Hospitality Marketing & Management, 0(0), 1–16.
Doerflinger, J. T., & Gollwitzer, P. M. (2020). Emotion emphasis effects in moral judgment are moderated by mindsets. Motivation and Emotion.
Bhowmik, S., Motin, M. A., Sarossy, M., Radcliffe, P., & Kumar, D. (2020). Sample entropy analysis of pupillary signals in glaucoma patients and control via light-induced pupillometry. 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), 280–283.
Ebaid, D., & Crewther, S. G. (2020). The Contribution of Oculomotor Functions to Rates of Visual Information Processing in Younger and Older Adults. Scientific Reports, 10(1), 10129.
Biondi, F. N., Balasingam, B., & Ayare, P. (2020). On the Cost of Detection Response Task Performance on Cognitive Load. Human Factors, 0018720820931628.
Pires, L. de F. (2020). Master’s students’ post-editing perception and strategies: Exploratory study. FORUM. Revue Internationale d’interprétation et de Traduction / International Journal of Interpretation and Translation, 18(1), 26–44.
Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610.
Kim, S., Pollanen, M., Reynolds, M. G., & Burr, W. S. (2020). Problem Solving as a Path to Comprehension. Mathematics in Computer Science.
Eraslan, S., Yesilada, Y., Yaneva, V., & Ha, L. A. (2020). “Keep it simple!”: an eye-tracking study for exploring complexity and distinguishability of web pages for people with autism. Universal Access in the Information Society.
Sulikowski, P., & Zdziebko, T. (2020). Deep Learning-Enhanced Framework for Performance Evaluation of a Recommending Interface with Varied Recommendation Position and Intensity Based on Eye-Tracking Equipment Data Processing. Electronics, 9(2), 266.
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13.
Kővári, A., Katona, J., & Pop, C. (2020). Evaluation of Eye-Movement Metrics in a Software Debugging Task using GP3 Eye Tracker. Acta Polytechnica Hungarica, 17, 57–76.
Kővári, A., Katona, J., & Pop, C. (2020). Quantitative Analysis of Relationship Between Visual Attention and Eye-Hand Coordination. Acta Polytechnica Hungarica, 17, 77–95.
Malhotra, A., Sankaran, A., Vatsa, M., Singh, R., Morris, K. B., & Noore, A. (2020). Understanding ACE-V Latent Fingerprint Examination Process via Eye-Gaze Analysis. IEEE Transactions on Biometrics, Behavior, and Identity Science, 1–1.
Knogler, V. (2020). Viewing Behaviour and Task Performance on Austrian Destination Websites: Comparing Generation Y and the Baby Boomers. In M. Rainoldi & M. Jooss (Eds.), Eye Tracking in Tourism (pp. 225–241). Springer International Publishing.
Gwizdka, J., & Dillon, A. (2020). Eye-Tracking as a Method for Enhancing Research on Information Search. In W. T. Fu & H. van Oostendorp (Eds.), Understanding and Improving Information Search: A Cognitive Approach (pp. 161–181). Springer International Publishing.
Acero-Mondragon, E. J., Chaustre-Nieto, L. C., Urdaneta-Paredes, D. A., Cortes-Cabrera, J. A., & Gallego-Correa, J. J. (2020). Left-Right Pupil Diameter Difference During Radiographic Reading of Bronchopulmonary Carcinoma: An Exploration with Cognitive Load Among Novices and Experts. The FASEB Journal, 34(S1), 1–1.
Bottos, S., & Balasingam, B. (2020). Tracking the Progression of Reading Using Eye-Gaze Point Measurements and Hidden Markov Models. IEEE Transactions on Instrumentation and Measurement, 1–1.
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Springer.
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340.
Villamor, M. M., & Rodrigo, Ma. M. T. (2019). Gaze collaboration patterns of successful and unsuccessful programming pairs using cross-recurrence quantification analysis. Research and Practice in Technology Enhanced Learning, 14(1), 25.
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019.
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6.
Bottos, S., & Balasingam, B. (2019). A Novel Slip-Kalman Filter to Track the Progression of Reading Through Eye-Gaze Measurements. arXiv:1907.07232 [cs, eess].
Wibirama, S., Santosa, P. I., Widyarani, P., Brilianto, N., & Hafidh, W. (2019). Physical discomfort and eye movements during arbitrary and optical flow-like motions in stereo 3D contents. Virtual Reality.
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2562–2566.
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery.
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158.
Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement.
Hienert, D., Kern, D., Mitsui, M., Shah, C., & Belkin, N. J. (2019). Reading Protocol: Understanding what has been Read in Interactive Information Retrieval Tasks. arXiv:1902.04262 [cs].
Bottos, S., & Balasingam, B. (2019). An Approach to Track Reading Progression Using Eye-Gaze Fixation Points. arXiv:1902.03322 [cs].
Jeong, H., & Liu, Y. (2019). Effects of non-driving-related-task modality and road geometry on eye movements, lane-keeping performance, and workload while driving. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 157–171.
Calado, J., Marcelino-Jesus, E., Ferreira, F., & Sarraipa, J. (2019). Eye-tracking student's behaviour for e-learning improvement. 8978–8986.
Pfarr, J., Ganter, M. T., Spahn, D. R., Noethiger, C. B., & Tscholl, D. W. (2019). Avatar-Based Patient Monitoring With Peripheral Vision: A Multicenter Comparative Eye-Tracking Study. Journal of Medical Internet Research, 21(7), e13041.
Kannegieser, E., Atorf, D., & Meier, J. (2019). Conducting an Experiment for Validating the Combined Model of Immersion and Flow. CSEDU.
Neomániová, K., Berčík, J., & Pavelka, A. (2019). The Use of Eye-Tracker and Face Reader as Useful Consumer Neuroscience Tools Within Logo Creation. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, 67(4), 1061–1070.
Swift, D., & Schofield, D. (2019). The impact of color on secondary task time while driving. International Journal of Information Technology, 4(1), 19.
Russell, C. (2019). "I Consent": An Eye-Tracking Study of IRB Informed Consent Forms.