Gazepoint Citations

We are frequently asked whether we can provide examples of research papers that use Gazepoint eye-tracking technologies. We are happy to share the publications we have found to date. If you would like to use our eye-tracking software for marketers in your research and don’t have it yet, shop now or contact us to get started!

If you have published research from a neuromarketing study that uses the Gazepoint system, please let us know and we will add a link to your work here! Our suggested reference for citing Gazepoint in your research is: Gazepoint (2021). GP3 Eye-Tracker. Retrieved from https://www.gazept.com

Menzel, T., Teubner, T., Adam, M. T. P., & Toreini, P. (2022). Home is where your Gaze is – Evaluating effects of embedding regional cues in user interfaces. Computers in Human Behavior, 136, 107369. https://doi.org/10.1016/j.chb.2022.107369
Veerabhadrappa, R., Hettiarachchi, I. T., Hanoun, S., Jia, D., Hosking, S. G., & Bhatti, A. (2022). Evaluating Operator Training Performance Using Recurrence Quantification Analysis of Autocorrelation Transformed Eye Gaze Data. Human Factors, 00187208221116953. https://doi.org/10.1177/00187208221116953
Spitzer, L., & Mueller, S. (2022). Using a test battery to compare three remote, video-based eye-trackers. 2022 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3517031.3529644
Destyanto, T. Y. R., & Lin, R. F. (2022). Evaluating the Effectiveness of Complexity Features of Eye Movement on Computer Activities Detection. Healthcare, 10(6), 1016. https://doi.org/10.3390/healthcare10061016
Cybulski, P. (2022). An Empirical Study on the Effects of Temporal Trends in Spatial Patterns on Animated Choropleth Maps. ISPRS International Journal of Geo-Information, 11(5), 273. https://doi.org/10.3390/ijgi11050273
Stojmenović, M., Spero, E., Stojmenović, M., & Biddle, R. (2022). What is Beautiful is Secure. ACM Transactions on Privacy and Security. https://doi.org/10.1145/3533047
Maniglia, M., Contemori, G., Marini, E., & Battaglini, L. (2022). Contrast adaptation of flankers reduces collinear facilitation and inhibition. Vision Research, 193, 107979. https://doi.org/10.1016/j.visres.2021.107979
Kävrestad, J., Hagberg, A., Nohlberg, M., Rambusch, J., Roos, R., & Furnell, S. (2022). Evaluation of Contextual and Game-Based Training for Phishing Detection. Future Internet, 14(4), 104. https://doi.org/10.3390/fi14040104
Veerabhadrappa, R., Hettiarachchi, I. T., & Bhatti, A. (2022). Using Eye-tracking To Investigate The Effect of Gaze Co-occurrence and Distribution on Collaborative Performance. 2022 IEEE International Systems Conference (SysCon), 1–8. https://doi.org/10.1109/SysCon53536.2022.9773860
Veerabhadrappa, R., Hettiarachchi, I. T., & Bhatti, A. (2022). Gaze Convergence Based Collaborative Performance Prediction in a 3-Member Joint Activity Setting. 2022 IEEE International Systems Conference (SysCon), 1–7. https://doi.org/10.1109/SysCon53536.2022.9773865
Pietras, K., & Ganczarek, J. (2022). Aesthetic Reactions to Violations in Contemporary Art: The Role of Expertise and Individual Differences. Creativity Research Journal, 0(0), 1–15. https://doi.org/10.1080/10400419.2022.2046909
Hidalgo, C., Mohamed, I., Zielinski, C., & Schön, D. (2022). The effect of speech degradation on the ability to track and predict turn structure in conversation. Cortex. https://doi.org/10.1016/j.cortex.2022.01.020
Singh, G., Maurya, A., & Goel, R. (2022). Integrating New Technologies in International Business: Opportunities and Challenges. CRC Press.
D’Anselmo, A., Pisani, A., & Brancucci, A. (2022). A tentative I/O curve with consciousness: Effects of multiple simultaneous ambiguous figures presentation on perceptual reversals and time estimation. Consciousness and Cognition, 99, 103300. https://doi.org/10.1016/j.concog.2022.103300
Srinivasan, R., Turpin, A., & McKendrick, A. M. (2022). Developing a Screening Tool for Areas of Abnormal Central Vision Using Visual Stimuli With Natural Scene Statistics. Translational Vision Science & Technology, 11(2), 34. https://doi.org/10.1167/tvst.11.2.34
Dang, A., & Nichols, B. S. (2022). Consumer response to positive nutrients on the facts up front (FUF) label: A comparison between healthy and unhealthy foods and the role of nutrition motivation. Journal of Marketing Theory and Practice, 0(0), 1–20. https://doi.org/10.1080/10696679.2021.2020662
Zhu, H., Salcudean, S., & Rohling, R. (2022). Gaze-Guided Class Activation Mapping: Leveraging Human Attention for Network Attention in Chest X-rays Classification. ArXiv:2202.07107 [Cs, Eess]. http://arxiv.org/abs/2202.07107
Ivančić Valenko, S., Keček, D., Čačić, M., & Slanec, K. (2022). The Impact of a Web Banner Position on the Webpage User Experience. Tehnički Glasnik, 16(1), 93–97. https://doi.org/10.31803/tg-20211119110843
Koutsogiorgi, C. C., & Michaelides, M. P. (2022). Response tendencies due to item wording using eye-tracking methodology accounting for individual differences and item characteristics. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01719-x
Katona, J. (2022). Measuring Cognition Load Using Eye-Tracking Parameters Based on Algorithm Description Tools. Sensors, 22(3), 912. https://doi.org/10.3390/s22030912
Porta, M., Dondi, P., Zangrandi, N., & Lombardi, L. (2022). Gaze-Based Biometrics From Free Observation of Moving Elements. IEEE Transactions on Biometrics, Behavior, and Identity Science, 4(1), 85–96. https://doi.org/10.1109/TBIOM.2021.3130798
Salaken, S. M., Hettiarachchi, I., Munia, A. A., Hasan, M. M., Khosravi, A., Mohamed, S., & Rahman, A. (2022). Predicting Cognitive Load of an Individual With Knowledge Gained From Others: Improvements in Performance Using Crowdsourcing. IEEE Systems, Man, and Cybernetics Magazine, 8(1), 4–15. https://doi.org/10.1109/MSMC.2021.3103498
Kim, M., Jeong, H., Kantharaju, P., Yoo, D., Jacobson, M., Shin, D., Han, C., & Patton, J. L. (2022). Visual guidance can help with the use of a robotic exoskeleton during human walking. Scientific Reports, 12(1), 3881. https://doi.org/10.1038/s41598-022-07736-w
Li, H. X., Mancuso, V., & McGuire, S. (2022). Integrated Sensors Platform. In D. Harris & W.-C. Li (Eds.), Engineering Psychology and Cognitive Ergonomics (pp. 64–73). Springer International Publishing. https://doi.org/10.1007/978-3-031-06086-1_5
Mariam, K., Afzal, O. M., Hussain, W., Javed, M. U., Kiyani, A., Rajpoot, N., Khurram, S. A., & Khan, H. A. (2022). On Smart Gaze based Annotation of Histopathology Images for Training of Deep Convolutional Neural Networks. IEEE Journal of Biomedical and Health Informatics, 1–1. https://doi.org/10.1109/JBHI.2022.3148944
Nakamura, G., Tatsukawa, S., Omori, K., Fukui, K., Sagara, J., & Chin, T. (2021). Evaluation and Training System of PC Operation for Elderly, Using Gazing Point and Mouse Operation. 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET), 1–5. https://doi.org/10.1109/ICECET52533.2021.9698528
Patel, A. N., Chau, G., Chang, C., Sun, A., Huang, J., Jung, T.-P., & Gilja, V. (2021). Affective response to volitional input perturbations in obstacle avoidance and target tracking games. 2021 43rd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC), 6679–6682. https://doi.org/10.1109/EMBC46164.2021.9630523
Sola, H. M. (2021). How Neuroscience-Based Research Methodologies Can Deliver New Insights to Marketers. International Journal of Social Science and Human Research, 4(10). https://doi.org/10.47191/ijsshr/v4-i10-41
Zhou, Y. (2021). Eyes Move, Drones Move: Explore the Feasibility of Various Eye Movement Control Intelligent Drones. 2021 IEEE International Conference on Data Science and Computer Application (ICDSCA), 508–513. https://doi.org/10.1109/ICDSCA53499.2021.9650336
Cuve, H. C., Stojanov, J., Roberts-Gaal, X., Catmur, C., & Bird, G. (2021). Validation of Gazepoint low-cost eye-tracking and psychophysiology bundle. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01654-x
Ranalli, J. (2021). L2 student engagement with automated feedback on writing: Potential for learning and issues of trust. Journal of Second Language Writing, 52, 100816. https://doi.org/10.1016/j.jslw.2021.100816
Moriishi, C., Shunta, M., Ogishima, H., & Shimada, H. (2021). Effects of cortisol on retrieval of extinction memory in individuals with social anxiety. Comprehensive Psychoneuroendocrinology, 100060. https://doi.org/10.1016/j.cpnec.2021.100060
Hu, X., Nakatsuru, S., Ban, Y., Fukui, R., & Warisawa, S. (2021). A Physiology-Based Approach for Estimation of Mental Fatigue Levels With Both High Time Resolution and High Level of Granularity. Informatics in Medicine Unlocked, 100594. https://doi.org/10.1016/j.imu.2021.100594
Avoyan, A., Ribeiro, M., Schotter, A., Schotter, E. R., Vaziri, M., & Zou, M. (2021). Planned vs. Actual Attention (SSRN Scholarly Paper ID 3836157). Social Science Research Network. https://doi.org/10.2139/ssrn.3836157
Karargyris, A., Kashyap, S., Lourentzou, I., Wu, J. T., Sharma, A., Tong, M., Abedin, S., Beymer, D., Mukherjee, V., Krupinski, E. A., & Moradi, M. (2021). Creation and validation of a chest X-ray dataset with eye-tracking and report dictation for AI development. Scientific Data, 8(1), 92. https://doi.org/10.1038/s41597-021-00863-5
Ghiţă, A., Hernández Serrano, O., Fernández-Ruiz, J., Moreno, M., Monras, M., Ortega, L., Mondon, S., Teixidor, L., Gual, A., Gacto-Sanchez, M., Porras Garcia, B., Ferrer-García, M., & Gutiérrez-Maldonado, J. (2021). Attentional Bias, Alcohol Craving, and Anxiety Implications of the Virtual Reality Cue-Exposure Therapy in Severe Alcohol Use Disorder: A Case Report. Frontiers in Psychology, 12, 543586. https://doi.org/10.3389/fpsyg.2021.543586
Sulikowski, P., Zdziebko, T., Coussement, K., Dyczkowski, K., Kluza, K., & Sachpazidu-Wójcicka, K. (2021). Gaze and Event Tracking for Evaluation of Recommendation-Driven Purchase. Sensors, 21, 1381. https://doi.org/10.3390/s21041381
Seha, S. N. A., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. E. (2021). Improving eye movement biometrics in low frame rate eye-tracking devices using periocular and eye blinking features. Image and Vision Computing, 104124. https://doi.org/10.1016/j.imavis.2021.104124
Bažantová, S., Štiková, E., Novák, M., & Gunina, D. (2021). Erotic appeals in advertising: visual attention and perceived appropriateness. Media Studies, 12(24), 21–39. https://hrcak.srce.hr/ojs/index.php/medijske-studije/article/view/14371
L.s, K., G.a, Y., V.i, Z., I.i, G., & B.Yu, P. (2021). Assessing the Aircraft Crew Activity Basing on Video Oculography Data. Experimental Psychology (Russia), 14(1), 204–222. https://doi.org/10.17759/exppsy.2021140110
Bhowmick, S., Arjunan, S. P., Sarossy, M., Radcliffe, P., & Kumar, D. K. (2021). Pupillometric recordings to detect glaucoma eyes. Physiological Measurement. https://doi.org/10.1088/1361-6579/abf05c
Sibley, C., Foroughi, C., Brown, N., Drollinger, S., Phillips, H., & Coyne, J. (2021). Augmenting Traditional Performance Analyses with Eye Tracking Metrics. In H. Ayaz & U. Asgher (Eds.), Advances in Neuroergonomics and Cognitive Engineering (pp. 118–125). Springer International Publishing. https://doi.org/10.1007/978-3-030-51041-1_17
Lee, T. L., Yeung, M. K., Sze, S. L., & Chan, A. S. (2020). Computerized Eye-Tracking Training Improves the Saccadic Eye Movements of Children with Attention-Deficit/Hyperactivity Disorder. Brain Sciences, 10(12). https://doi.org/10.3390/brainsci10121016
Brand, J., Diamond, S. G., Thomas, N., & Gilbert-Diamond, D. (2020). Evaluating the data quality of the Gazepoint GP3 low-cost eye tracker when used independently by study participants. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01504-2
Hong, W. C. H., Ngan, H. F. B., Yu, J., & Zhao, Y. (2020). An eye-tracking study of exoticism in intra-national destinations in the Greater Bay area of China. Tourism Recreation Research, 0(0), 1–14. https://doi.org/10.1080/02508281.2020.1846431
Destyanto, T. Y. R., & Lin, R. F. (2020). Detecting computer activities using eye-movement features. Journal of Ambient Intelligence and Humanized Computing. https://doi.org/10.1007/s12652-020-02683-8
Jovančić, K., Milić Keresteš, N., & Nedeljković, U. (2020). Influence of white space on text scanning. Proceedings – The Tenth International Symposium GRID 2020, 699–706. https://doi.org/10.24867/GRID-2020-p79
Vladić, G., Mijatović, S., Bošnjaković, G., Jurič, I., & Dimovski, V. (2020). Analysis of the loading animation performance and viewer perception. Proceedings – The Tenth International Symposium GRID 2020, 667–675. https://doi.org/10.24867/GRID-2020-p76
Zuo, C., Ding, L., & Meng, L. (2020). A Feasibility Study of Map-Based Dashboard for Spatiotemporal Knowledge Acquisition and Analysis. ISPRS International Journal of Geo-Information, 9(11), 636. https://doi.org/10.3390/ijgi9110636
Chauhan, H., Prasad, A., & Shukla, J. (2020). Engagement Analysis of ADHD Students using Visual Cues from Eye Tracker. Companion Publication of the 2020 International Conference on Multimodal Interaction, 27–31. https://doi.org/10.1145/3395035.3425256