The entire Gazepoint team is very happy to announce the release of the all-new Gazepoint Biometrics platform. The Biometrics platform builds on our existing expertise in eye tracking and adds additional biometric signals to the data capture stream. It provides three new signals: heart rate and galvanic skin response (GSR) from the finger sensor, and an analog self-report engagement dial (0-100%). The biometric signals can be captured on their own or in conjunction with eye-tracking data.

In addition, the Gazepoint eye-tracking systems (GP3 and GP3 HD) have been upgraded to provide a more accurate pupil diameter measure. The upgraded pupillometry system is available to all existing and new eye-tracking customers with access to the latest software release (V5.0 will be rolled out soon).

Some of the key features of the Gazepoint Biometrics system include:

- The Biometrics system is extremely easy to use. As with our eye-tracking systems, our goal is to make it very easy to get up and running and collecting data. Simply plug the finger sensor into the dial block, plug the dial block into a PC USB port, and click the ON button in Control to activate the biometrics sensor system.
- Biometrics integrates seamlessly with the existing Gazepoint Analysis platform and eye trackers. You can design projects as before with images, videos, screen capture, etc., and as long as the Biometrics system is enabled, all the data will be captured and synchronized along with the eye-tracking data.
- Biometrics can be run stand-alone. It is possible to use just the Biometrics system to capture GSR, heart rate, and the self-report dial if eye-tracking data is not required.
- Biometrics is integrated into the OpenGaze API, which makes developing custom applications [...]
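Because the OpenGaze API streams data as XML records, a custom client can pull the biometric values out of each record with a few lines of code. A minimal Python sketch is shown below; note that the sample record and the attribute names (GSRV, HRV, DIALV) are illustrative assumptions in the style of the API, not confirmed field names, so check the OpenGaze API specification for the actual record layout.

```python
import xml.etree.ElementTree as ET

# Hypothetical data record; GSRV/HRV/DIALV are assumed attribute names
# used here only to illustrate parsing, not confirmed API fields.
SAMPLE_REC = '<REC TIME="1.234" GSRV="0.52" HRV="72.0" DIALV="0.80" />'

def parse_biometrics(rec_xml):
    """Parse one <REC> data record into a dict of biometric values."""
    rec = ET.fromstring(rec_xml)
    return {
        "time_s": float(rec.get("TIME")),          # timestamp in seconds
        "gsr": float(rec.get("GSRV")),             # galvanic skin response
        "heart_rate_bpm": float(rec.get("HRV")),   # heart rate
        "dial": float(rec.get("DIALV")),           # self-report dial, 0.0-1.0
    }
```

The same pattern applies to any field in the record: read the attribute by name and convert it to the numeric type you need.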
The latest release of the Gazepoint software suite is now available to download. A few notable updates in the release:

Gazepoint Control

The calibration system has been updated to allow custom calibration graphics, a feature requested for holding the attention of very young participants and possibly for primate research. It was always possible to create your own calibration through the API, but doing so required some programming. With the new system you can simply copy image and audio files into the Windows Documents\Gazepoint folder (for example: C:\Users\ch\Documents\Gazepoint) and Control will automatically use them during the calibration routine. Files must be named CalibPointX.png and CalibPointX.mp3, where X is the calibration point number (1, 2, 3, 4, 5, etc.), and we suggest the images use a background color of RGB = 25,25,25 to blend with the background. The images will move, rotate, and scale while the audio plays to try to hold the participant's attention. You can use the + and - keys to increase and decrease the calibration speed.

In addition, there was a bug in the USER_DATA API update in which a missing space character led to issues with some XML parsers. This error has been fixed in this release.

Gazepoint Analysis

The Analysis update in this version has primarily focused on making the AOI system easier to use and more capable of tracking moving content. You can now copy/paste all AOIs for Text/Image/Video media by right-clicking the media item in Analyze mode. AOI keyframes can now be edited to set the exact time and X, Y, Width, Height values. Finally, AOI keyframes now allow for gaps in the time sequence to track objects that disappear and reappear in dynamic content. Keyframes may now [...]
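Before a session, it can be handy to confirm that the files in the Documents\Gazepoint folder follow the CalibPointX naming convention described above and that each point has both an image and a sound file. A minimal Python sketch (the helper function is ours, not part of the Gazepoint software):

```python
import re

# The documented naming convention: CalibPointX.png / CalibPointX.mp3,
# where X is the calibration point number (1, 2, 3, ...).
CALIB_FILE = re.compile(r"^CalibPoint(\d+)\.(png|mp3)$")

def complete_calibration_points(filenames):
    """Return the point numbers that have both a .png image and an .mp3 sound."""
    found = {}
    for name in filenames:
        m = CALIB_FILE.match(name)
        if m:
            found.setdefault(int(m.group(1)), set()).add(m.group(2))
    return sorted(n for n, exts in found.items() if exts == {"png", "mp3"})
```

In practice you would pass it the folder listing, e.g. `complete_calibration_points(os.listdir(folder))`, and compare the result against the number of calibration points you intend to use.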
As of Gazepoint release V4.2.0, the USER_DATA option in the OpenGaze API has been updated to include an optional duration tag. The DUR tag allows discrete events to be embedded in the data stream. Previously, if a USER_DATA value was set in the gaze data stream:

<SET ID="USER_DATA" VALUE="EXP1" />

the USER_DATA value would be set for all subsequent data samples, which is useful for blocking off portions of the data stream, such as Experiment 1, Experiment 2, etc. The duration option allows embedding single events (DUR="1"), which means only a single data record will be tagged with the event:

<SET ID="USER_DATA" VALUE="EVENT1" DUR="1" />

With this update it is possible to read other sensors, such as serial port devices, and then use the API to embed the results in the data stream. An illustrative example using MATLAB is shown below, where Control is running to generate the gaze data and Analysis is running to collect the data. In the example, the original use case is shown first: USER_DATA is set to EXP1 for 1 second of data, EXP2 for 1.5 seconds, and EXP3 for 2 seconds (a simple delay is used in place of the actual experiment). Following this, the new USER_DATA capability is demonstrated using fictional events 88, 99, and 101 (these could be A, B, C, or any other IDs for the events). The source code is available in the demo folder of the Gazepoint installation: C:\Program Files (x86)\Gazepoint\Gazepoint\demo\

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Gazepoint sample program for USER_DATA embedding via the Gazepoint API
% Written in 2017 by Gazepoint www.gazept.com
%
% To the extent possible under law, the author(s) have dedicated all copyright
% and related and neighboring rights to this software to the public [...]
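Because the API commands are plain XML strings, the same two USER_DATA commands shown above can be built in any client language, not just MATLAB. A small Python helper as a sketch (the CRLF terminator and the socket transport are assumptions based on typical OpenGaze client code; the actual send over TCP is omitted):

```python
def set_user_data(value, dur=None):
    """Build an OpenGaze API USER_DATA command string.

    Without DUR, the value tags all subsequent data samples (blocking off
    a portion of the stream); with DUR=1, only a single data record is
    tagged, embedding a discrete event.
    """
    if dur is None:
        return '<SET ID="USER_DATA" VALUE="%s" />\r\n' % value
    return '<SET ID="USER_DATA" VALUE="%s" DUR="%d" />\r\n' % (value, dur)
```

A client would send `set_user_data("EXP1")` at the start of a block and `set_user_data("EVENT1", 1)` to mark a single-sample event, mirroring the MATLAB demo.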
The latest release of the Gazepoint software suite is now available to download. A few notable updates in the release:

Gazepoint Control

We have added a gain-sweep mode which, when enabled, sweeps through the full range of the system gain settings until eyes are detected. In the image to the right you can see the bright pupils properly tracked. If the pupils were too bright, the lower two eye images would be red (unidentified) and the upper face image would show bright white pupils. It has been reported that for some young subjects (under 6 years old) the bright-pupil response is so strong that the system cannot identify the pupils properly to engage the automatic gain system; the pupils simply look like reflections from glasses. At this time the gain-sweep mode is optional and can be enabled by clicking on the Control window and pressing the 'P' key. The lower right of the taskbar will show that Gain Sweep mode is active. If this system proves to work well for this use case, we will enable it by default in future releases.

Gazepoint Analysis

A number of small bug fixes were made, along with an update to the way the Gazepoint Analysis display is rendered. This update reduces CPU usage when not actively rendering or processing data in Analyze mode.
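Conceptually, the gain-sweep mode is a simple search: step through the gain range and stop at the first setting where the eyes are identified. A minimal Python sketch of that behavior (the function and the eye-detection callback are illustrative models, not Gazepoint internals):

```python
def gain_sweep(detect_eyes, gain_min, gain_max, step):
    """Step through gain settings until eyes are detected.

    detect_eyes(gain) stands in for the tracker's per-frame eye
    identification check. Returns the first gain at which detection
    succeeds, or None if no setting in the range works.
    """
    gain = gain_min
    while gain <= gain_max:
        if detect_eyes(gain):
            return gain      # lock in this gain and stop sweeping
        gain += step
    return None              # no usable gain found in the range
```

This is why the mode helps with very strong bright-pupil responses: rather than relying on the automatic gain system first identifying the pupils, the sweep tries every gain level until one produces identifiable eye images.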
We have done a search for 2017 academic publications which used or cited the Gazepoint eye tracking system. You can find the list on the publications page. If you have used the Gazepoint system in your published work and wish to be added to the list, please let us know.
Gazepoint will be at UXLx: User Experience Lisbon! The conference will be at the FIL Meeting Centre in Lisbon, Portugal, May 23-26, 2017. Come visit our booth and see GP3 HD eye tracking in action. https://www.ux-lx.com/
We are very happy to announce the release of our newest eye-tracking system, the GP3 HD. The GP3 HD maintains all the great features of the GP3: it is very easy to set up, quick to calibrate, and interfaces with all existing hardware accessories (laptop & VESA screen mounts) and software (Gazepoint Analysis & Remote Viewer). And it is still the most affordable high-performance research system on the market. As the name suggests, the GP3 HD has an upgraded imaging system that uses a larger HD image sensor and a USB 3 interface to allow for a larger field of view. The GP3 HD will also be offered in 60 Hz and 150 Hz variants.

The GP3 HD is a significant upgrade with the following benefits:

Increased allowable head movement: The larger sensor allows for a larger field of view, which lets users sit and move more casually and naturally in front of the system. Even when a participant slouches or shifts around in their seat, the system can continue to see and track their eyes. This means less data loss due to participant movement and a more comfortable experience for users.

Increased tracking performance (150 Hz): The higher sampling rate of 150 Hz reduces the chance of losing tracking during fast head movements. The time between image frames is reduced from 16.7 ms on a 60 Hz system to 6.7 ms on the 150 Hz GP3 HD, which reduces how far the head can move between frames. Blinks still result in a loss of tracking, but at the 150 Hz update rate the GP3 HD can re-acquire tracking up to 2.5 times faster than the GP3. The increased update rate further reduces potential data loss due [...]
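The frame-timing numbers above follow directly from the sampling rates, as this short worked calculation shows:

```python
def frame_interval_ms(rate_hz):
    """Time between image frames, in milliseconds, at a given sampling rate."""
    return 1000.0 / rate_hz

# At 60 Hz the inter-frame interval is ~16.7 ms; at 150 Hz it is ~6.7 ms.
# The ratio 150/60 = 2.5 is both the reduction in movement possible between
# frames and the factor by which tracking can be re-acquired after a blink.
interval_60 = frame_interval_ms(60)    # ~16.7 ms
interval_150 = frame_interval_ms(150)  # ~6.7 ms
speedup = interval_60 / interval_150   # 2.5
```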
Are you new to eye tracking and would like to strengthen your knowledge of the subject? Eye Tracking the User Experience, by Aga Bojko, is an invaluable reference book that we highly recommend. The book is a practical guide that will benefit anyone who wants to learn how to conduct eye-tracking studies in order to evaluate and improve the user experience of products and interfaces. This book is for you if: you are considering adding eye tracking to your research but are unsure if it is going to be of value to you; you recently purchased an eye tracker and are thinking, "Now what?"; or you have been conducting eye-tracking studies for a while but would like to expand your repertoire of capabilities. Get it today from Amazon http://amzn.to/1qgklRr or, soon, from our web store.
Version 3.2.0 of Gazepoint Analysis has been released. Calibration and tracking performance have been improved in this version, along with the ability to use a default browser for web recording. The default browser feature has been requested for some time, and we are pleased to finally be able to deliver it. It is a much simplified approach to web recording that does not support multiple-user data aggregation, but it will start the computer's default browser for a web recording session; this can be Chrome, Firefox, or any browser of your choice. The shortcut key CTRL + ALT + S is handy for stopping recordings. See the new web recording in action: https://www.youtube.com/watch?v=20Kk_Vvk-_8
Interested in eye tracking for gaming? Watch our demo here of the GP3 in action while playing a game. The objective was to track where the user's attention was during game play so that information can be placed optimally in the user interface. Gazepoint can be used in a similar fashion for user-interface optimization in other applications as well. https://www.youtube.com/watch?v=oTHxb9nPRhI