Gazepoint Citations

We are frequently asked for examples of research papers that use Gazepoint eye-tracking technologies, and we are happy to provide the short list of publications we have found to date. If you have published research that uses the Gazepoint system, please let us know and we will add a citation to your work here.

Bolte, T., Nolan, A., & Andujar, M. (n.d.). Eye Scan-Path Training with Voice Dialog on Chest X-Rays and Analyzing Results using Gridinger’s Classification.
Gupta, V., Chakraborty, T., Agarwal, M., Singh, R., Arora, M., & Vatsa, M. (n.d.). Bag-of-Lies: A Multimodal Dataset for Deception Detection.
Yaneva, V., & Eraslan, S. (n.d.). Adults with High-functioning Autism Process Web Pages With Similar Accuracy but Higher Cognitive Effort Compared to Controls.
Hennessey, C., Noureddin, B., & Lawrence, P. (2006). A single camera eye-gaze tracking system with free head motion. In Proceedings of the 2006 symposium on Eye tracking research & applications (pp. 87–94). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1117349
Hennessey, C., Noureddin, B., & Lawrence, P. (2008). Fixation precision in high-speed noncontact eye-gaze tracking. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 38(2), 289–298. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4410444
Hennessey, C., & Lawrence, P. (2008). 3D point-of-gaze estimation on a volumetric display. In Proceedings of the 2008 symposium on Eye tracking research & applications (p. 59). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1344486
Hennessey, C. A., & Lawrence, P. D. (2009). Improving the accuracy and reliability of remote system-calibration-free eye-gaze tracking. IEEE Transactions on Biomedical Engineering, 56(7), 1891–1900. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4797862
Hennessey, C., & Lawrence, P. (2009). Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions. IEEE Transactions on Biomedical Engineering, 56(3), 790–799. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4633699
Hennessey, C., & Duchowski, A. T. (2010). An open source eye-gaze interface: Expanding the adoption of eye-gaze in everyday applications. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (pp. 81–84). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1743686
Zugal, S., & Pinggera, J. (2014). Low-Cost Eye-Trackers: Useful for Information Systems Research? In L. Iliadis, M. Papazoglou, & K. Pohl (Eds.), Advanced Information Systems Engineering Workshops (pp. 159–170). Springer International Publishing. https://doi.org/10.1007/978-3-319-07869-4_14
Pence, T. B., Dukes, L. C., Hodges, L. F., Meehan, N. K., & Johnson, A. (2014). An Eye Tracking Evaluation of a Virtual Pediatric Patient Training System for Nurses. In T. Bickmore, S. Marsella, & C. Sidner (Eds.), Intelligent Virtual Agents (pp. 329–338). Springer International Publishing. https://doi.org/10.1007/978-3-319-09767-1_43
Barber, T., Bertrand, J., Christ, C., Melloy, B. J., & Neyens, D. M. (2014). Comparing the Use of Active versus Passive Navigational Tools In a Virtual Desktop Environment via Eye Tracking. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 58(1), 1954–1958. https://doi.org/10.1177/1541931214581408
Taylor, P., Bilgrien, N., He, Z., & Siegelmann, H. T. (2015). EyeFrame: real-time memory aid improves human multitasking via domain-general eye tracking procedures. Frontiers in ICT, 2, 17. https://doi.org/10.3389/fict.2015.00017
Burke, T., & Welter, J. (2015). Quantitative Analysis of Reading Comprehension and Reading Speed Based on Serif and Sans-serif Fonts. Retrieved from https://www.semanticscholar.org/paper/Quantitative-Analysis-of-Reading-Comprehension-and-Burke-Welter/b2a17cbaf8f1c74bb25ac6be3b03fd069a8aa96e
Folk, E. (2015). Quantitative Analysis of the Effects of Stage Lighting on Attention. Retrieved from https://www.semanticscholar.org/paper/Quantitative-Analysis-of-the-Effects-of-Stage-Folk/c9abf0946133098ae4e75df80f28aaa9c28cae05
Fong, S. S. M. (2015). Single-channel Electroencephalographic Recording in Children with Developmental Coordination Disorder: Validity and influence of Eye Blink Artifacts. Journal of Novel Physiotherapies, 5(4). https://doi.org/10.4172/2165-7025.1000270
Sarkar, A. R., Sanyal, G., & Majumder, S. (2015). Methodology for a Low-Cost Vision-Based Rehabilitation System for Stroke Patients. In S. Gupta, S. Bag, K. Ganguly, I. Sarkar, & P. Biswas (Eds.), Advancements of Medical Electronics (pp. 365–377). Springer India. https://doi.org/10.1007/978-81-322-2256-9_34
Yaneva, V., Temnikova, I., & Mitkov, R. (2015). Accessible Texts for Autism: An Eye-Tracking Study. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility (pp. 49–57). New York, NY, USA: ACM. https://doi.org/10.1145/2700648.2809852
Dünser, A., Lochner, M., Engelke, U., & Fernández, D. R. (2015). Visual and Manual Control for Human-Robot Teleoperation. IEEE Computer Graphics and Applications, 35(3), 22–32. https://doi.org/10.1109/MCG.2015.4
Radecký, M., Vykopal, J., & Smutný, P. (2015). Analysis of syntactic elements and structure of web pages using eye-tracking technology. In Carpathian Control Conference (ICCC), 2015 16th International (pp. 420–425). https://doi.org/10.1109/CarpathianCC.2015.7145116
Nelson, C. A., Zhang, X., Webb, J., & Li, S. (2015). Fuzzy Control for Gaze-Guided Personal Assistance Robots: Simulation and Experimental Application. International Journal On Advances in Intelligent Systems, 8(1–2), 77–84. Retrieved from https://www.thinkmind.org/index.php?view=article&articleid=intsys_v8_n12_2015_7
Wibirama, S., Wijayanto, T., Nugroho, H. A., Bahit, M., & Winadi, M. N. (2015). Quantifying visual attention and visually induced motion sickness during day-night driving and sleep deprivation. In 2015 International Conference on Data and Software Engineering (ICoDSE) (pp. 191–194). https://doi.org/10.1109/ICODSE.2015.7436996
El-Samahy, E., Mahfouf, M., Torres-Salomao, L. A., & Anzurez-Marin, J. (2015). A new computer control system for mental stress management using fuzzy logic. In 2015 IEEE International Conference on Evolving and Adaptive Intelligent Systems (EAIS) (pp. 1–7). https://doi.org/10.1109/EAIS.2015.7368785
Rodríguez, F., Gustavo, A., Perezchica, M., Guadalupe, M., & Alvarado Herrera, A. (2016). Desmitificando el valor de las imágenes de celebridades en sitios web de establecimientos de alojamiento turístico [Demystifying the value of celebrity images on tourist accommodation websites]. Retrieved from https://digitum.um.es/xmlui/handle/10201/51419
Prichard, C., & Atkins, A. (2016). Evaluating L2 Readers’ Previewing Strategies Using Eye Tracking. The Reading Matrix: An International Online Journal, 16(2). Retrieved from http://www.readingmatrix.com/files/15-992935s1.pdf
Havran, V., Filip, J., & Myszkowski, K. (2016). Perceptually motivated BRDF comparison using single image. In Computer Graphics Forum (Vol. 35, pp. 1–12). Retrieved from http://onlinelibrary.wiley.com/doi/10.1111/cgf.12944/full
Best, D. S., & Duchowski, A. T. (2016). A Rotary Dial for Gaze-based PIN Entry. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (pp. 69–76). New York, NY, USA: ACM. https://doi.org/10.1145/2857491.2857527
Duchowski, A. T., Jörg, S., Allen, T. N., Giannopoulos, I., & Krejtz, K. (2016). Eye Movement Synthesis. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (pp. 147–154). New York, NY, USA: ACM. https://doi.org/10.1145/2857491.2857528
Ward, N. G., Jurado, C. N., Garcia, R. A., & Ramos, F. A. (2016). On the Possibility of Predicting Gaze Aversion to Improve Video-chat Efficiency. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (pp. 267–270). New York, NY, USA: ACM. https://doi.org/10.1145/2857491.2857497
Gomes, J., Marques, F., Lourenço, A., Mendonça, R., Santana, P., & Barata, J. (2016). Gaze-Directed Telemetry in High Latency Wireless Communications: The Case of Robot Teleoperation. Retrieved from https://www.researchgate.net/profile/Andre_Lourenco4/publication/307936647_Gaze-Directed_Telemetry_in_High_Latency_Wireless_Communications_The_Case_of_Robot_Teleoperation/links/57d2a73d08ae6399a38d7a95.pdf
Kawase, S., & Obata, S. (2016). Audience gaze while appreciating a multipart musical performance. Consciousness and Cognition, 46, 15–26. Retrieved from http://www.sciencedirect.com/science/article/pii/S1053810016302975
Filip, J., Havran, V., & Myszkowski, K. (2016). Gaze Analysis of BRDF Distortions. Retrieved from http://library.utia.cas.cz/separaty/2016/RO/filip-0462548.pdf
Jaikumar, S., Sahay, A., & others. (2016). Effect of Overlapping Price Ranges on Price Perception: Revisiting the Range Theory of Price Perception. Indian Institute of Management Ahmedabad, Research and Publication Department. Retrieved from http://64.207.185.160/assets/snippets/workingpaperpdf/16266030742016-02-02.pdf
Leifman, G., Rudoy, D., Swedish, T., Bayro-Corrochano, E., & Raskar, R. (2016). Learning Gaze Transitions from Depth to Improve Video Saliency Estimation. ArXiv:1603.03669 [Cs]. Retrieved from http://arxiv.org/abs/1603.03669
Nguyen, A. N., & Sheridan, D. (2016). Eye-Tracking: A Cost-Effective Workstation for Usability Studies. Retrieved from https://researchspace.auckland.ac.nz/handle/2292/30191
Jankowski, J., Saganowski, S., & Bródka, P. (2016). Evaluation of TRANSFoRm Mobile eHealth Solution for Remote Patient Monitoring during Clinical Trials. Mobile Information Systems, 2016, e1029368. https://doi.org/10.1155/2016/1029368
Craig, T. L., Nelson, C. A., Li, S., & Zhang, X. (2016). Human gaze commands classification: A shape based approach to interfacing with robots. In 2016 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA) (pp. 1–6). https://doi.org/10.1109/MESA.2016.7587154
Torres-Salomao, L. A., Mahfouf, M., El-Samahy, E., & Ting, C. (2016). Psycho-Physiologically-Based Real Time Adaptive General Type 2 Fuzzy Modelling and Self-Organising Control of Operator’s Performance Undertaking a Cognitive Task. IEEE Transactions on Fuzzy Systems. https://doi.org/10.1109/TFUZZ.2016.2598363
Coyne, J., & Sibley, C. (2016). Investigating the Use of Two Low Cost Eye Tracking Systems for Detecting Pupillary Response to Changes in Mental Workload. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1), 37–41. https://doi.org/10.1177/1541931213601009
Naicker, P., Anoopkumar-Dukie, S., Grant, G. D., & Kavanagh, J. J. (2016). Medications influencing central cholinergic neurotransmission affect saccadic and smooth pursuit eye movements in healthy young adults. Psychopharmacology. https://doi.org/10.1007/s00213-016-4436-1
Naicker, P., Anoopkumar-Dukie, S., Grant, G. D., Neumann, D. L., & Kavanagh, J. J. (2016). Central cholinergic pathway involvement in the regulation of pupil diameter, blink rate and cognitive function. Neuroscience, 334, 180–190. https://doi.org/10.1016/j.neuroscience.2016.08.009
Andreu, L., & Sanz-Torrent, M. (2017). The Visual World Paradigm in Children with Spoken Language Disorders. In Eye-Tracking Technology Applications in Educational Research (pp. 262–282). IGI Global. Retrieved from http://books.google.com/books?hl=en&lr=&id=ca80DQAAQBAJ&oi=fnd&pg=PA262&dq=info:RrF-ChKR5H8J:scholar.google.com&ots=4HLRij621x&sig=Vcn6zwUHysDlfxXsRIcj-r2keoQ
Sibley, C., Coyne, J., & Sherwood, S. (2017). Research Considerations and Tools for Evaluating Human-Automation Interaction with Future Unmanned Systems. In Autonomy and Artificial Intelligence: A Threat or Savior? (pp. 157–178). Springer, Cham. https://doi.org/10.1007/978-3-319-59719-5_7
Barik, T., Smith, J., Lubick, K., Holmes, E., Feng, J., Murphy-Hill, E., & Parnin, C. (2017). Do Developers Read Compiler Error Messages? In Proceedings of the 39th International Conference on Software Engineering (pp. 575–585). Piscataway, NJ, USA: IEEE Press. https://doi.org/10.1109/ICSE.2017.59
Uribe-Quevedo, A., Valdivia, S., Prada, E., Navia, M., Rincon, C., Ramos, E., … Perez, B. (2017). Development of an Occupational Health Care Exergaming Prototype Suite. In Recent Advances in Technologies for Inclusive Well-Being (pp. 127–145). Springer, Cham. https://doi.org/10.1007/978-3-319-49879-9_7
Frydman, C., & Mormann, M. (2017). The Role of Salience and Attention in Choice Under Risk: An Experimental Investigation.
Knoll, M. A. (2017). Developing a bicyclist hazard perception test: Explorative research comparing adult and adolescent cyclists on a visual scanning and a key press measure.
Durkee, P. (2017). Examining the Speed and Automaticity of Formidability Assessment Mechanisms. California State University, Fullerton.
Ni, Y. (2017). A Study of Danmaku Video on Attention Allocation, Social Presence, Transportation to Narrative, Cognitive Workload and Enjoyment. Syracuse University.
Nugter, A. (2017). The effect of driving experience on hazard perception in relation to visual attention.