Gazepoint Citations

We are frequently asked whether we can provide examples of research papers that use Gazepoint eye-tracking technologies. We are happy to provide the short list of publications we have found to date. If you have published research that uses the Gazepoint system, please let us know and we will add a citation to your work here.

Brand, J., Masterson, T. D., Emond, J. A., Lansigan, R., & Gilbert-Diamond, D. (2020). Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study. Appetite, 148, 104610. https://doi.org/10.1016/j.appet.2020.104610
Lewis, G. A., & Bidelman, G. M. (2020). Autonomic Nervous System Correlates of Speech Categorization Revealed Through Pupillometry. Frontiers in Neuroscience, 13. https://doi.org/10.3389/fnins.2019.01418
dos Santos, J. P. M., Ferreira, H., Reis, J., Prata, D., Simões, S. P., & Borges, I. D. (2020). The Use of Consumer Neuroscience Knowledge in Improving Real Promotional Media: The Case of Worten. In Á. Rocha, J. L. Reis, M. K. Peter, & Z. Bogdanović (Eds.), Marketing and Smart Technologies (pp. 202–218). Singapore: Springer. https://doi.org/10.1007/978-981-15-1564-4_20
Pavisian, B., Patel, V. P., & Feinstein, A. (2019). Cognitive mediated eye movements during the SDMT reveal the challenges with processing speed faced by people with MS. BMC Neurology, 19(1), 340. https://doi.org/10.1186/s12883-019-1543-8
Václavíková, Z. (2019). Eye-tracker technology in didactic research. AIP Conference Proceedings, 2186(1), 060019. https://doi.org/10.1063/1.5137973
Larson, D. J., Wetherbee, J. C., & Branscum, P. (2019). CrossFit Athletic Identity’s Relationship to Sponsor Recall, Recognition, and Purchase Intent. International Journal of Kinesiology and Sports Science, 7(3), 6. https://doi.org/10.7575/aiac.ijkss.v.7n.3p.6
Seha, S., Papangelakis, G., Hatzinakos, D., Zandi, A. S., & Comeau, F. J. (2019). Improving Eye Movement Biometrics Using Remote Registration of Eye Blinking Patterns. In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 2562–2566). https://doi.org/10.1109/ICASSP.2019.8683757
Zhu, H., Salcudean, S. E., & Rohling, R. N. (2019). A novel gaze-supported multimodal human–computer interaction for ultrasound machines. International Journal of Computer Assisted Radiology and Surgery. https://doi.org/10.1007/s11548-019-01964-8
Tanner, S. A., McCarthy, M. B., & O’Reilly, S. J. (2019). Exploring the roles of motivation and cognition in label-usage using a combined eye-tracking and retrospective think aloud approach. Appetite, 135, 146–158. https://doi.org/10.1016/j.appet.2018.11.015
Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental Characterisation of Eye-Tracking Sensors for Adaptive Human-Machine Systems. Measurement. https://doi.org/10.1016/j.measurement.2019.03.032
Hienert, D., Kern, D., Mitsui, M., Shah, C., & Belkin, N. J. (2019). Reading Protocol: Understanding what has been Read in Interactive Information Retrieval Tasks. ArXiv:1902.04262 [Cs]. Retrieved from http://arxiv.org/abs/1902.04262
Bottos, S., & Balasingam, B. (2019). An Approach to Track Reading Progression Using Eye-Gaze Fixation Points. ArXiv:1902.03322 [Cs]. Retrieved from http://arxiv.org/abs/1902.03322
Jeong, H., & Liu, Y. (2019). Effects of non-driving-related-task modality and road geometry on eye movements, lane-keeping performance, and workload while driving. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 157–171. https://doi.org/10.1016/j.trf.2018.10.015
Russell, C. (2019). “I Consent”: An Eye-Tracking Study of IRB Informed Consent Forms.
Ćosić, K., Popović, S., Šarlija, M., Mijić, I., Kokot, M., Kesedžić, I., … Zhang, Q. (2019). New Tools and Methods in Selection of Air Traffic Controllers Based on Multimodal Psychophysiological Measurements. IEEE Access, 7, 174873–174888. https://doi.org/10.1109/ACCESS.2019.2957357
Constantinides, A., Fidas, C., Belk, M., & Pitsillides, A. (2019). “I Recall This Picture”: Understanding Picture Password Selections Based on Users’ Sociocultural Experiences. In IEEE/WIC/ACM International Conference on Web Intelligence (pp. 408–412). New York, NY, USA: ACM. https://doi.org/10.1145/3350546.3352557
Gupta, V., Chakraborty, T., Agarwal, M., Singh, R., Arora, M., & Vatsa, M. (2019). Bag-of-Lies: A Multimodal Dataset for Deception Detection.
Yaneva, V., & Eraslan, S. (2019). Adults with High-functioning Autism Process Web Pages With Similar Accuracy but Higher Cognitive Effort Compared to Controls.
Coba, L., Rook, L., Zanker, M., & Symeonidis, P. (2019). Decision Making Strategies Differ in the Presence of Collaborative Explanations: Two Conjoint Studies. In Proceedings of the 24th International Conference on Intelligent User Interfaces (pp. 291–302). New York, NY, USA: ACM. https://doi.org/10.1145/3301275.3302304
Coba, L., Zanker, M., & Rook, L. (2019). Decision Making Based on Bimodal Rating Summary Statistics - An Eye-Tracking Study of Hotels. In J. Pesonen & J. Neidhardt (Eds.), Information and Communication Technologies in Tourism 2019 (pp. 40–51). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-05940-8_4
Volonte, M., Duchowski, A. T., & Babu, S. V. (2019). Effects of a Virtual Human Appearance Fidelity Continuum on Visual Attention in Virtual Reality. In Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents (pp. 141–147). New York, NY, USA: ACM. https://doi.org/10.1145/3308532.3329461
Ohligs, M., Pereira, C., Voigt, V., Koeny, M., Janß, A., Rossaint, R., & Czaplik, M. (2019). Evaluation of an Anesthesia Dashboard Functional Model Based on a Manufacturer-Independent Communication Standard: Comparative Feasibility Study. JMIR Human Factors, 6(2), e12553. https://doi.org/10.2196/12553
Constantinides, A., Belk, M., Fidas, C., & Pitsillides, A. (2019). On the Accuracy of Eye Gaze-driven Classifiers for Predicting Image Content Familiarity in Graphical Passwords. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 201–205). New York, NY, USA: ACM. https://doi.org/10.1145/3320435.3320474
Matthews, O., Eraslan, S., Yaneva, V., Davies, A., Yesilada, Y., Vigo, M., & Harper, S. (2019). Combining Trending Scan Paths with Arousal to Model Visual Behaviour on the Web: A Case Study of Neurotypical People vs People with Autism. In Proceedings of the 27th ACM Conference on User Modeling, Adaptation and Personalization (pp. 86–94). New York, NY, USA: ACM. https://doi.org/10.1145/3320435.3320446
Obaidellah, U., Raschke, M., & Blascheck, T. (2019). Classification of Strategies for Solving Programming Problems using AoI Sequence Analysis.
Duchowski, A., Krejtz, K., Zurawska, J., & House, D. (2019). Using Microsaccades to Estimate Task Difficulty During Visual Search of Layered Surfaces. IEEE Transactions on Visualization and Computer Graphics, 1–1. https://doi.org/10.1109/TVCG.2019.2901881
Koury, H. F., Leonard, C. J., Carry, P. M., & Lee, L. M. J. (2018). An Expert Derived Feedforward Histology Module Improves Pattern Recognition Efficiency in Novice Students. Anatomical Sciences Education. Advance online publication. https://doi.org/10.1002/ase.1854
Eraslan, S., Yaneva, V., Yesilada, Y., & Harper, S. (2018). Web users with autism: eye tracking evidence for differences. Behaviour & Information Technology. Advance online publication. https://doi.org/10.1080/0144929X.2018.1551933
Notaro, G. M., & Diamond, S. G. (2018). Simultaneous EEG, eye-tracking, behavioral, and screen-capture data during online German language learning. Data in Brief, 21, 1937–1943. https://doi.org/10.1016/j.dib.2018.11.044
Natraj, N., Alterman, B., Basunia, S., & Wheaton, L. A. (2018). The Role of Attention and Saccades on Parietofrontal Encoding of Contextual and Grasp-specific Affordances of Tools: An ERP Study. Neuroscience, 394, 243–266. https://doi.org/10.1016/j.neuroscience.2018.10.019
Beattie, K. L., & Morrison, B. W. (2018). Navigating the Online World: Gaze, Fixations, and Performance Differences between Younger and Older Users. International Journal of Human–Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2018.1541545
Meng, J., Streitz, T., Gulachek, N., Suma, D., & He, B. (2018). Three-Dimensional Brain–Computer Interface Control Through Simultaneous Overt Spatial Attentional and Motor Imagery Tasks. IEEE Transactions on Biomedical Engineering, 65(11), 2417–2427. https://doi.org/10.1109/TBME.2018.2872855
Yaman, C., Küçün, N. T., Güngör, S., & Eroğlu, S. (2018). The Contextual Effect and Measurement of Attention to Advertisements via Eye Tracking Method. Journal of Life Economics, 5(4), 221–232. https://doi.org/10.15637/jlecon.271
Raveh, E., Friedman, J., & Portnoy, S. (2018). Visuomotor behaviors and performance in a dual-task paradigm with and without vibrotactile feedback when using a myoelectric controlled hand. Assistive Technology, 30(5), 274–280. https://doi.org/10.1080/10400435.2017.1323809
Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., … Bolia, R. (2018). Avionics Human-Machine Interfaces and Interactions for Manned and Unmanned Aircraft. Progress in Aerospace Sciences, 102, 1–46. https://doi.org/10.1016/j.paerosci.2018.05.002
Iskander, J., Jia, D., Hettiarachchi, I., Hossny, M., Saleh, K., Nahavandi, S., … Hanoun, S. (2018). Age-Related Effects of Multi-screen Setup on Task Performance and Eye Movement Characteristics. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3480–3485). https://doi.org/10.1109/SMC.2018.00589
Zhou, H., Wei, L., Cao, R., Hanoun, S., Bhatti, A., Tai, Y., & Nahavandi, S. (2018). The Study of Using Eye Movements to Control the Laparoscope Under a Haptically-Enabled Laparoscopic Surgery Simulation Environment. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3022–3026). https://doi.org/10.1109/SMC.2018.00513
Saleh, K., Iskander, J., Jia, D., Hossny, M., Nahavandi, S., Best, C., … Hanoun, S. (2018). Reliable Switching Mechanism for Low Cost Multi-screen Eye Tracking Devices via Deep Recurrent Neural Networks. In 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 3492–3497). https://doi.org/10.1109/SMC.2018.00591
Nishiyama, M., Matsumoto, R., Yoshimura, H., & Iwai, Y. (2018). Extracting discriminative features using task-oriented gaze maps measured from observers for personal attribute classification. Pattern Recognition Letters, 112, 241–248. https://doi.org/10.1016/j.patrec.2018.08.001
Sibley, C., Foroughi, C., Brown, N., & Coyne, J. T. (2018). Low Cost Eye Tracking: Ready for Individual Differences Research? Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 62(1), 741–745. https://doi.org/10.1177/1541931218621168
Kar, A., & Corcoran, P. (2018). Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations. Sensors, 18(9), 3151. https://doi.org/10.3390/s18093151
Morasco Junior, M. A. (2018). Parâmetros gráfico-inclusivos para o desenvolvimento de objetos de aprendizagem digitais voltados ao público infantil [Inclusive graphic parameters for the development of digital learning objects aimed at children]. Retrieved from https://repositorio.unesp.br/handle/11449/157459
Ngan, H. F. B., & Yu, C.-E. (2018). To smile or not to smile – an eye-tracking study on service recovery. Current Issues in Tourism. Advance online publication. https://doi.org/10.1080/13683500.2018.1502260
Jaikumar, S. (2018). How Do Consumers Choose Sellers In E-Marketplaces?: The Role of Display Price And Sellers’ Review Volume. Journal of Advertising Research, JAR-2018-028. https://doi.org/10.2501/JAR-2018-028
Aliyev, F., Ürkmez, T., & Wagner, R. (2018). Luxury brands do not glitter equally for everyone. Journal of Brand Management, 25(4), 337–350. https://doi.org/10.1057/s41262-017-0085-x
Chang, C., Chen, C., & Lin, Y. (2018). A Visual Interactive Reading System Based on Eye Tracking Technology to Improve Digital Reading Performance. In 2018 7th International Congress on Advanced Applied Informatics (IIAI-AAI) (pp. 182–187). https://doi.org/10.1109/IIAI-AAI.2018.00043
Tkach, B. (2018). Neuropsychological features of personalities with deviant behavior. Fundamental and Applied Researches in Practice of Leading Scientific Schools, 27(3), 201–206. https://doi.org/10.33531/farplss.2018.3.24
Wijayanto, T., Marcilia, S. R., & Lufityanto, G. (2018). Visual Attention, Driving Behavior and Driving Performance among Young Drivers in Sleep-deprived Condition. KnE Life Sciences, 4(5), 424–434. https://doi.org/10.18502/kls.v4i5.2573
Antunes, J., & Santana, P. (2018). A Study on the Use of Eye Tracking to Adapt Gameplay and Procedural Content Generation in First-Person Shooter Games. Multimodal Technologies and Interaction, 2(2), 23. https://doi.org/10.3390/mti2020023