Gazepoint Citations

We are frequently asked for examples of research papers in which Gazepoint eye-tracking technologies are used, and we are happy to provide this list of publications we have found to date. If you are interested in using our eye-tracking software for marketers in your research and don’t have the software yet, shop now or contact us to get started!

If you have published research from a neuromarketing study that uses the Gazepoint system, please let us know and we will add a link to your work here! Our suggested reference for citing Gazepoint in your research is: Gazepoint (2021). GP3 Eye-Tracker. Retrieved from https://www.gazept.com

Salminen, J., Jung, S., Nielsen, L., Şengün, S., & Jansen, B. J. (2022). How does varying the number of personas affect user perceptions and behavior? Challenging the ‘small personas’ hypothesis! International Journal of Human-Computer Studies, 168, 102915. https://doi.org/10.1016/j.ijhcs.2022.102915
Menzel, T., Teubner, T., Adam, M. T. P., & Toreini, P. (2022). Home is where your Gaze is – Evaluating effects of embedding regional cues in user interfaces. Computers in Human Behavior, 136, 107369. https://doi.org/10.1016/j.chb.2022.107369
Steffens, J., & Himmelein, H. (2022). Induced cognitive load influences unpleasantness judgments of modulated noise.
Pillai, P., Balasingam, B., & Biondi, F. (2022). Using Signal-to-Noise Ratio to Explore the Cognitive Cost of the Detection Response Task. https://doi.org/10.1177/1071181322661481
Souza, A., & Freitas, D. (2022). Towards the Improvement of the Cognitive Process of the Synthesized Speech of Mathematical Expression in MathML: An Eye-Tracking Study. 2022 International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET), 1–8. https://doi.org/10.1109/IMET54801.2022.9929541
Zyrianov, V., Peterson, C. S., Guarnera, D. T., Behler, J., Weston, P., Sharif, B., & Maletic, J. I. (2022). Deja Vu: semantics-aware recording and replay of high-speed eye tracking and interaction data to support cognitive studies of software engineering tasks—methodology and analyses. Empirical Software Engineering, 27(7), 168. https://doi.org/10.1007/s10664-022-10209-3
Lewandowska, A., Dziśko, M., & Jankowski, J. (2022). Investigation the role of contrast on habituation and sensitisation effects in peripheral areas of graphical user interfaces. Scientific Reports, 12(1), 15281. https://doi.org/10.1038/s41598-022-16284-2
Gawade, V., Bifulco, C., & (Grace) Guo, W. (2022). Lessons Learned to Effectively Teach and Evaluate Undergraduate Engineers in Work Design and Ergonomics Laboratory from a World Before, During, and After COVID-19. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 66(1), 756–760. https://doi.org/10.1177/1071181322661505
Gallant, S. N., Kennedy, B. L., Bachman, S. L., Huang, R., Cho, C., Lee, T.-H., & Mather, M. (2022). Behavioral and fMRI evidence that arousal enhances bottom-up selectivity in young but not older adults. Neurobiology of Aging. https://doi.org/10.1016/j.neurobiolaging.2022.08.006
Veerabhadrappa, R., Hettiarachchi, I. T., Hanoun, S., Jia, D., Hosking, S. G., & Bhatti, A. (2022). Evaluating Operator Training Performance Using Recurrence Quantification Analysis of Autocorrelation Transformed Eye Gaze Data. Human Factors, 00187208221116953. https://doi.org/10.1177/00187208221116953
Spitzer, L., & Mueller, S. (2022). Using a test battery to compare three remote, video-based eye-trackers. 2022 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3517031.3529644
Destyanto, T. Y. R., & Lin, R. F. (2022). Evaluating the Effectiveness of Complexity Features of Eye Movement on Computer Activities Detection. Healthcare, 10(6), 1016. https://doi.org/10.3390/healthcare10061016
Cybulski, P. (2022). An Empirical Study on the Effects of Temporal Trends in Spatial Patterns on Animated Choropleth Maps. ISPRS International Journal of Geo-Information, 11(5), 273. https://doi.org/10.3390/ijgi11050273
Stojmenović, M., Spero, E., Stojmenović, M., & Biddle, R. (2022). What is Beautiful is Secure. ACM Transactions on Privacy and Security. https://doi.org/10.1145/3533047
Maniglia, M., Contemori, G., Marini, E., & Battaglini, L. (2022). Contrast adaptation of flankers reduces collinear facilitation and inhibition. Vision Research, 193, 107979. https://doi.org/10.1016/j.visres.2021.107979
Kävrestad, J., Hagberg, A., Nohlberg, M., Rambusch, J., Roos, R., & Furnell, S. (2022). Evaluation of Contextual and Game-Based Training for Phishing Detection. Future Internet, 14(4), 104. https://doi.org/10.3390/fi14040104
Veerabhadrappa, R., Hettiarachchi, I. T., & Bhatti, A. (2022). Using Eye-tracking To Investigate The Effect of Gaze Co-occurrence and Distribution on Collaborative Performance. 2022 IEEE International Systems Conference (SysCon), 1–8. https://doi.org/10.1109/SysCon53536.2022.9773860
Veerabhadrappa, R., Hettiarachchi, I. T., & Bhatti, A. (2022). Gaze Convergence Based Collaborative Performance Prediction in a 3-Member Joint Activity Setting. 2022 IEEE International Systems Conference (SysCon), 1–7. https://doi.org/10.1109/SysCon53536.2022.9773865
Pietras, K., & Ganczarek, J. (2022). Aesthetic Reactions to Violations in Contemporary Art: The Role of Expertise and Individual Differences. Creativity Research Journal, 0(0), 1–15. https://doi.org/10.1080/10400419.2022.2046909
Hidalgo, C., Mohamed, I., Zielinski, C., & Schön, D. (2022). The effect of speech degradation on the ability to track and predict turn structure in conversation. Cortex. https://doi.org/10.1016/j.cortex.2022.01.020
Singh, G., Maurya, A., & Goel, R. (2022). Integrating New Technologies in International Business: Opportunities and Challenges. CRC Press.
D’Anselmo, A., Pisani, A., & Brancucci, A. (2022). A tentative I/O curve with consciousness: Effects of multiple simultaneous ambiguous figures presentation on perceptual reversals and time estimation. Consciousness and Cognition, 99, 103300. https://doi.org/10.1016/j.concog.2022.103300
Srinivasan, R., Turpin, A., & McKendrick, A. M. (2022). Developing a Screening Tool for Areas of Abnormal Central Vision Using Visual Stimuli With Natural Scene Statistics. Translational Vision Science & Technology, 11(2), 34. https://doi.org/10.1167/tvst.11.2.34
Dang, A., & Nichols, B. S. (2022). Consumer response to positive nutrients on the facts up front (FUF) label: A comparison between healthy and unhealthy foods and the role of nutrition motivation. Journal of Marketing Theory and Practice, 0(0), 1–20. https://doi.org/10.1080/10696679.2021.2020662
Zhu, H., Salcudean, S., & Rohling, R. (2022). Gaze-Guided Class Activation Mapping: Leveraging Human Attention for Network Attention in Chest X-rays Classification. ArXiv:2202.07107 [Cs, Eess]. http://arxiv.org/abs/2202.07107
Ivančić Valenko, S., Keček, D., Čačić, M., & Slanec, K. (2022). The Impact of a Web Banner Position on the Webpage User Experience. Tehnički Glasnik, 16(1), 93–97. https://doi.org/10.31803/tg-20211119110843
Koutsogiorgi, C. C., & Michaelides, M. P. (2022). Response tendencies due to item wording using eye-tracking methodology accounting for individual differences and item characteristics. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01719-x
Katona, J. (2022). Measuring Cognition Load Using Eye-Tracking Parameters Based on Algorithm Description Tools. Sensors, 22(3), 912. https://doi.org/10.3390/s22030912
Porta, M., Dondi, P., Zangrandi, N., & Lombardi, L. (2022). Gaze-Based Biometrics From Free Observation of Moving Elements. IEEE Transactions on Biometrics, Behavior, and Identity Science, 4(1), 85–96. https://doi.org/10.1109/TBIOM.2021.3130798
Salaken, S. M., Hettiarachchi, I., Munia, A. A., Hasan, M. M., Khosravi, A., Mohamed, S., & Rahman, A. (2022). Predicting Cognitive Load of an Individual With Knowledge Gained From Others: Improvements in Performance Using Crowdsourcing. IEEE Systems, Man, and Cybernetics Magazine, 8(1), 4–15. https://doi.org/10.1109/MSMC.2021.3103498
Kim, M., Jeong, H., Kantharaju, P., Yoo, D., Jacobson, M., Shin, D., Han, C., & Patton, J. L. (2022). Visual guidance can help with the use of a robotic exoskeleton during human walking. Scientific Reports, 12(1), 3881. https://doi.org/10.1038/s41598-022-07736-w
Yu-Wen, H., Yu-Ju, Y., & Wei, J. (2022). User Perception and Eye Movement on A Pandemic Data Visualization Dashboard. Proceedings of the Association for Information Science and Technology, 59(1), 121–131. https://doi.org/10.1002/pra2.610
Gao, H., Fan, W., Qiu, L., Yang, X., Li, Z., Zuo, X., Li, Y., Meng, M. Q.-H., & Ren, H. (2022). SAVAnet: Surgical Action-Driven Visual Attention Network for Autonomous Endoscope Control. IEEE Transactions on Automation Science and Engineering, 1–13. https://doi.org/10.1109/TASE.2022.3203631
Beşer, A., Sengewald, J., & Lackes, R. (2022). Drawing Attention on (Visually) Competitive Online Shopping Platforms – An Eye-Tracking Study Analysing the Effects of Visual Cues on the Amazon Marketplace. In Ē. Nazaruka, K. Sandkuhl, & U. Seigerroth (Eds.), Perspectives in Business Informatics Research (pp. 159–174). Springer International Publishing. https://doi.org/10.1007/978-3-031-16947-2_11
Xu, J., Guo, K., & Sun, P. Z. H. (2022). Driving Performance Under Violations of Traffic Rules: Novice Vs. Experienced Drivers. IEEE Transactions on Intelligent Vehicles, 1–10. https://doi.org/10.1109/TIV.2022.3200592
Li, H. X., Mancuso, V., & McGuire, S. (2022). Integrated Sensors Platform. In D. Harris & W.-C. Li (Eds.), Engineering Psychology and Cognitive Ergonomics (pp. 64–73). Springer International Publishing. https://doi.org/10.1007/978-3-031-06086-1_5
Mariam, K., Afzal, O. M., Hussain, W., Javed, M. U., Kiyani, A., Rajpoot, N., Khurram, S. A., & Khan, H. A. (2022). On Smart Gaze based Annotation of Histopathology Images for Training of Deep Convolutional Neural Networks. IEEE Journal of Biomedical and Health Informatics, 1–1. https://doi.org/10.1109/JBHI.2022.3148944
Nakamura, G., Tatsukawa, S., Omori, K., Fukui, K., Sagara, J., & Chin, T. (2021). Evaluation and Training System of PC Operation for Elderly, Using Gazing Point and Mouse Operation. 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET), 1–5. https://doi.org/10.1109/ICECET52533.2021.9698528
Patel, A. N., Chau, G., Chang, C., Sun, A., Huang, J., Jung, T.-P., & Gilja, V. (2021). Affective response to volitional input perturbations in obstacle avoidance and target tracking games. 2021 43rd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC), 6679–6682. https://doi.org/10.1109/EMBC46164.2021.9630523
Sola, H. M. (2021). How Neuroscience-Based Research Methodologies Can Deliver New Insights to Marketers. International Journal of Social Science and Human Research, 04(10). https://doi.org/10.47191/ijsshr/v4-i10-41
Zhou, Y. (2021). Eyes Move, Drones Move: Explore the Feasibility of Various Eye Movement Control Intelligent Drones. 2021 IEEE International Conference on Data Science and Computer Application (ICDSCA), 508–513. https://doi.org/10.1109/ICDSCA53499.2021.9650336
Cuve, H. C., Stojanov, J., Roberts-Gaal, X., Catmur, C., & Bird, G. (2021). Validation of Gazepoint low-cost eye-tracking and psychophysiology bundle. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01654-x
Ranalli, J. (2021). L2 student engagement with automated feedback on writing: Potential for learning and issues of trust. Journal of Second Language Writing, 52, 100816. https://doi.org/10.1016/j.jslw.2021.100816
Moriishi, C., Shunta, M., Ogishima, H., & Shimada, H. (2021). Effects of cortisol on retrieval of extinction memory in individuals with social anxiety. Comprehensive Psychoneuroendocrinology, 100060. https://doi.org/10.1016/j.cpnec.2021.100060
Hu, X., Nakatsuru, S., Ban, Y., Fukui, R., & Warisawa, S. (2021). A Physiology-Based Approach for Estimation of Mental Fatigue Levels With Both High Time Resolution and High Level of Granularity. Informatics in Medicine Unlocked, 100594. https://doi.org/10.1016/j.imu.2021.100594
Avoyan, A., Ribeiro, M., Schotter, A., Schotter, E. R., Vaziri, M., & Zou, M. (2021). Planned vs. Actual Attention (SSRN Scholarly Paper ID 3836157). Social Science Research Network. https://doi.org/10.2139/ssrn.3836157
Karargyris, A., Kashyap, S., Lourentzou, I., Wu, J. T., Sharma, A., Tong, M., Abedin, S., Beymer, D., Mukherjee, V., Krupinski, E. A., & Moradi, M. (2021). Creation and validation of a chest X-ray dataset with eye-tracking and report dictation for AI development. Scientific Data, 8(1), 92. https://doi.org/10.1038/s41597-021-00863-5
Ghiţă, A., Hernández Serrano, O., Fernández-Ruiz, J., Moreno, M., Monras, M., Ortega, L., Mondon, S., Teixidor, L., Gual, A., Gacto-Sanchez, M., Porras Garcia, B., Ferrer-García, M., & Gutiérrez-Maldonado, J. (2021). Attentional Bias, Alcohol Craving, and Anxiety Implications of the Virtual Reality Cue-Exposure Therapy in Severe Alcohol Use Disorder: A Case Report. Frontiers in Psychology, 12, 543586. https://doi.org/10.3389/fpsyg.2021.543586
Sulikowski, P., Zdziebko, T., Coussement, K., Dyczkowski, K., Kluza, K., & Sachpazidu-Wójcicka, K. (2021). Gaze and Event Tracking for Evaluation of Recommendation-Driven Purchase. Sensors, 21, 1381. https://doi.org/10.3390/s21041381
Seha, S. N. A., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. E. (2021). Improving eye movement biometrics in low frame rate eye-tracking devices using periocular and eye blinking features. Image and Vision Computing, 104124. https://doi.org/10.1016/j.imavis.2021.104124