We have searched for 2017 academic publications that used or cited the Gazepoint eye tracking system. You can find the list on the publications page. If you have used the Gazepoint system in your published work and would like to be added to the list, please let us know.
Gazepoint will be at UXLx: User Experience Lisbon! The conference takes place at the FIL Meeting Centre in Lisbon, Portugal, May 23-26, 2017. Come visit our booth and see the GP3 HD eye tracker in action. https://www.ux-lx.com/
We are very happy to announce the release of our newest eye tracking system, the GP3 HD. The GP3 HD maintains all the great features of the GP3: it is very easy to set up, quick to calibrate, and interfaces with all existing hardware accessories (laptop & VESA screen mounts) and software (Gazepoint Analysis & Remote Viewer). And it is still the most affordable high-performance research system on the market. As the name suggests, the GP3 HD has an upgraded imaging system that uses a larger HD image sensor and a USB3 interface to allow for a larger field of view. The GP3 HD will also be offered in 60Hz and 150Hz variants. The GP3 HD is a significant upgrade with the following benefits:

Increased allowable head movement: The larger sensor provides a larger field of view, which lets users sit and move more casually and naturally in front of the system. Even when a participant slouches or shifts around in their seat, the system can continue to see and track their eyes. This means less data loss due to participant movement and a more comfortable experience for users.

Increased tracking performance (150Hz): The higher sampling rate of 150Hz reduces the chance of losing tracking due to fast head movements, since the interval between image frames drops from 16.6ms on a 60Hz system to 6.6ms on the 150Hz GP3 HD. While loss of tracking is much reduced, blinks still interrupt tracking; however, at the 150Hz update rate the GP3 HD can re-acquire tracking up to 2.5 times faster than the GP3. The increased update rate further reduces potential data loss due [...]
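The frame-interval and re-acquisition figures above follow directly from the two sampling rates; a quick sketch (the rates are the only inputs, taken from the post):

```python
def frame_interval_ms(rate_hz: float) -> float:
    """Time between consecutive image frames, in milliseconds."""
    return 1000.0 / rate_hz

gp3_interval = frame_interval_ms(60)       # ~16.7 ms between frames at 60Hz
gp3_hd_interval = frame_interval_ms(150)   # ~6.7 ms between frames at 150Hz

# A higher update rate re-acquires tracking proportionally faster:
# 150Hz / 60Hz = 2.5x, matching the figure quoted above.
speedup = gp3_interval / gp3_hd_interval
```

This is why the "2.5 times faster" figure holds: re-acquisition after a blink is bounded by how quickly new frames arrive, and frames arrive 2.5 times as often at 150Hz.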
Are you new to eye tracking and would like to strengthen your knowledge of the subject? Eye Tracking the User Experience, by Aga Bojko, is an invaluable reference book that we highly recommend. The book is a practical guide that will benefit anyone who wants to learn how to conduct eye tracking studies in order to evaluate and improve the user experience of products and interfaces. This book is for you if:

- You are considering adding eye tracking to your research but are unsure whether it will be of value to you.
- You recently purchased an eye tracker and are thinking, "Now what?"
- You've been conducting eye tracking studies for a while but would like to expand your repertoire of capabilities.

Get it today from Amazon (http://amzn.to/1qgklRr), or soon from our web store.
Version 3.2.0 of Gazepoint Analysis has been released. Calibration and tracking performance have been improved in this version, along with the ability to use a default browser for web recording. The default browser feature has been requested for some time, and we are pleased to finally deliver it. It is a much simplified approach to web recording: it does not support multiple-user data aggregation, but it launches the computer's default browser for the recording session, whether that is Chrome, Firefox, or any other browser of your choice. A handy shortcut key (CTRL + ALT + S) stops the recording. See the new web recording in action: https://www.youtube.com/watch?v=20Kk_Vvk-_8
Interested in eye tracking for gaming? Watch our demo of the GP3 in action during game play. The objective was to track where the user's attention was during play so that information could be placed optimally in the user interface. Gazepoint can be used in a similar fashion to optimize user interfaces for other applications as well. https://www.youtube.com/watch?v=oTHxb9nPRhI
We've gone through a minor branding exercise to refresh our logo and image. Here's the new Gazepoint logo. We hope you like it!
The latest update to the Gazepoint software is out. This version fixes a few minor bugs and implements cursor click tracking in screen capture mode. Previously we were able to track clicks within Gazepoint Analysis (images, video, web pages) but could not track left or right clicks in other programs (such as during screen capture). We are happy to announce that the tracking capability has been upgraded to capture both left and right clicks (both mouse down and mouse up). In the Gazepoint API, the cursor state value CS now corresponds as follows:

1 = left mouse button down
2 = right mouse button down
3 = left mouse button up
4 = right mouse button up

In Gazepoint Analysis, left mouse button clicks are visualized with a green square centered on the click location, and right mouse button clicks with a red square centered on the click location.
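A client consuming the API data can turn the CS value into a mouse event with a simple lookup. The sketch below shows only that mapping; how the CS field arrives on the wire is covered in the Open Gaze API documentation:

```python
# Mapping from the CS cursor-state value in the Gazepoint API
# to the mouse event it represents (values per the release notes above).
CURSOR_STATES = {
    1: "left button down",
    2: "right button down",
    3: "left button up",
    4: "right button up",
}

def describe_cursor_state(cs: int) -> str:
    """Return a readable name for a CS value; 'no event' for anything else."""
    return CURSOR_STATES.get(cs, "no event")
```

For example, `describe_cursor_state(2)` returns `"right button down"`, and pairing a "down" value with the next matching "up" value reconstructs a complete click.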
Blink tracking has been added to the latest release of the Gazepoint software (V3.0.0), both in Control via the Open Gaze API (all versions) and in Analysis visualizations (UX). In Control, you can access the blink tracking metrics by enabling the ENABLE_SEND_BLINK setting, which adds a blink counter (BKID), the duration of the last blink (BKDUR), and the average blink rate over the last minute (BKPMIN). In Analysis, simply click the Enable display button for Blink Rate to turn on the overlay.
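A client can pull the three blink fields out of an API data record like so. This is a sketch: the sample record string and its field values are illustrative, and the exact command and record syntax should be checked against the Open Gaze API documentation:

```python
import xml.etree.ElementTree as ET

# Illustrative command to enable blink metrics in Control over the API
# (syntax assumed from the ENABLE_SEND_BLINK setting named above).
ENABLE_BLINK_CMD = '<SET ID="ENABLE_SEND_BLINK" STATE="1" />\r\n'

def parse_blink_fields(record: str) -> dict:
    """Extract BKID, BKDUR, and BKPMIN from an XML data record."""
    rec = ET.fromstring(record)
    return {
        "blink_id": int(rec.attrib["BKID"]),          # blink counter
        "blink_duration_s": float(rec.attrib["BKDUR"]),  # last blink duration
        "blinks_per_min": float(rec.attrib["BKPMIN"]),   # rate over last minute
    }

# Example record with made-up values:
sample = '<REC BKID="12" BKDUR="0.18" BKPMIN="14.0" />'
fields = parse_blink_fields(sample)
```

BKID increments per blink, so comparing consecutive records for a changed BKID is a simple way to detect that a new blink has completed.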
The Gazepoint Remote Viewer provides the ability to transmit the images shown in the Gazepoint Analysis software over a network to a remote computer, for unobtrusive observation of an experiment underway. The core features of Remote include:

- Transmission of Analysis images to remote computers over a TCP/IP network
- High quality images with low bandwidth requirements
- Control of the experiment underway (Start / Stop / Next media item)
- Remote chat between multiple Remote clients
- Real-time event logging in the experiment by Remote clients
- Full screen viewing

We quietly introduced the Gazepoint Remote Viewer late last year and have spent the months since collecting feedback on the performance of the system. We originally provided the ability to tune the image quality (low to high) to vary the bandwidth required. The feedback was a resounding, "Give us the highest quality images at the maximum frame rate with minimum bandwidth!", so we completely redesigned the underlying video feed protocol. The latest release transmits high quality images at 10 frames per second (fps) using approximately 20 Mbps, easily within a 100 Mbps local area network. The bandwidth and frame rate will depend somewhat on the content being shown to the user (how much image compression is possible) and on the underlying network speed. While a LAN (100/1000 Mbps) connection is best, the system will still operate over a lower bandwidth connection such as WiFi or the Internet (WAN); the frame rate may just be lower. Remote provides the ability to remotely start the recording of the experiment, move the recording to the next media item to be tested, and stop the recording. This allows the experiment subject full control of their computer (keyboard and mouse), while the [...]
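The quoted figures imply a comfortable per-frame budget and plenty of LAN headroom; a quick sanity check, using only the numbers from the post:

```python
# Remote Viewer feed: ~20 Mbps at 10 frames per second.
bitrate_mbps = 20.0
frames_per_second = 10

# Per-frame budget: 20,000,000 bits / 10 frames = 2,000,000 bits,
# i.e. about 250 KB of compressed image data per frame.
bits_per_frame = bitrate_mbps * 1e6 / frames_per_second
kilobytes_per_frame = bits_per_frame / 8 / 1000

# On a 100 Mbps LAN the feed uses roughly a fifth of the link,
# which is why it fits "easily" as the post says.
lan_utilization = bitrate_mbps / 100.0
```

The same arithmetic explains the WiFi/WAN behavior: on a slower link the per-frame budget stays roughly fixed (image quality is held high), so the frame rate is what drops.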