The entire Gazepoint team is very happy to announce the release of the all-new Gazepoint Biometrics platform. The Biometrics platform builds on our existing expertise in eye tracking and adds additional biometric signals to the data capture stream. The Biometrics module provides three new signals: heart rate and galvanic skin response (GSR) from the finger-sensor block, as well as an analog self-reported engagement dial (0-100%). These biometric signals can be captured as standalone signals or in conjunction with eye-tracking data. In addition, the Gazepoint eye-tracking systems (GP3 and GP3 HD) have been upgraded to provide a more accurate pupil diameter measure. Even higher accuracy pupillometry can be achieved using a small square paper marker (about 7 mm per side) affixed to a user's glasses. The upgraded pupillometry system is available to all existing and new eye-tracking customers with access to the latest software release (V5.0 will be rolled out soon).

Some of the key features of the Gazepoint Biometrics system include:

The Biometrics system is extremely easy to use. As with the design of our eye-tracking systems, our goal is to make it very easy to get up and running and collecting data. Simply plug the finger sensor into the dial block, plug the dial block into a PC USB port, and click the ON button in Control to activate the biometrics sensor system.

Biometrics seamlessly integrates with the existing Gazepoint Analysis platform and eye trackers. You can design projects as before with images, videos, screen capture, etc., and as long as the Biometrics system is enabled, all the data will be captured and synchronized along with the eye-tracking data.

Biometrics can be run stand-alone. It is possible to use just the biometrics system to capture GSR, heart [...]
The latest release of the Gazepoint software suite is now available to download. A few notable updates in the release:

Gazepoint Control

The calibration system has been updated to easily allow for custom calibration graphics, in response to requests for holding the attention of very young participants and possibly for primate research. Through the API you have always been able to create your own calibration; however, doing so required some programming. With the new system you can simply copy image and audio files into the Windows Documents\Gazepoint folder (for example: C:\Users\ch\Documents\Gazepoint) and Control will automatically use them during the calibration routine. Files must be named CalibPointX.png and CalibPointX.mp3, where X is the calibration point number (1, 2, 3, 4, 5, etc.), and we suggest the images have a background color of RGB = 25,25,25 to blend with the background. The images will move, rotate, and scale while the music plays to try to hold the participant's attention. You can use the + and - keys to increase and decrease the calibration speed. In addition, there was a bug in the USER_DATA API update in which a missing space character led to issues with some XML parsers. This error has been fixed in this release.

Gazepoint Analysis

The Analysis update in this version has primarily focused on making the AOI system easier to use as well as more functional for tracking moving content. You can now copy/paste all AOIs for Text/Image/Video media by right-clicking the media item in Analyze mode. AOI keyframes can now be edited to set the exact time and the X, Y, Width, and Height values. Finally, AOI keyframes now allow for gaps in the time sequence to track objects that disappear and reappear later in dynamic content. Keyframes may now [...]
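The file-naming convention above can be automated. The following is a minimal sketch (not part of the Gazepoint software) of a helper that copies your own image/audio pairs into the Documents\Gazepoint folder under the required CalibPointX names; the function name and argument layout are our own invention for illustration:

```python
import shutil
from pathlib import Path

def install_calibration_media(sources, dest=None):
    """Copy image/audio pairs into the Gazepoint folder using the
    CalibPointX.png / CalibPointX.mp3 naming scheme described above.

    `sources` is a list of (image_path, audio_path) tuples; the Nth
    pair becomes CalibPointN.png and CalibPointN.mp3."""
    if dest is None:
        # Default Windows location from the release notes.
        dest = Path.home() / "Documents" / "Gazepoint"
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    installed = []
    for i, (img, audio) in enumerate(sources, start=1):
        img_dst = dest / f"CalibPoint{i}.png"
        audio_dst = dest / f"CalibPoint{i}.mp3"
        shutil.copyfile(img, img_dst)
        shutil.copyfile(audio, audio_dst)
        installed.append((img_dst, audio_dst))
    return installed
```

Remember that Control expects PNG images (ideally on an RGB 25,25,25 background) and MP3 audio; the script only renames and copies, it does not convert formats.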
As of Gazepoint release V4.2.0, the USER_DATA option in the OpenGaze API has been updated to include an optional duration tag. The DUR tag allows for embedding discrete events into the data stream. Previously, if a USER_DATA value was set in the gaze data stream:

<SET ID="USER_DATA" VALUE="EXP1" />

the USER_DATA value would then be set for all subsequent data samples, which is useful for blocking off portions of the data stream, such as Experiment 1, Experiment 2, etc. The duration option allows embedding single events (DUR=1), which means only a single data record will be tagged with the event:

<SET ID="USER_DATA" VALUE="EVENT1" DUR="1" />

With this update it is possible to read other sensors, such as serial port devices, and then use the API to embed the results in the data stream. An illustrative example using MATLAB is shown below, where Control is running to generate the gaze data and Analysis is running to collect the data. In the example, the original use case is shown first: USER_DATA is set to EXP1 for 1 second of data, EXP2 for 1.5 seconds, and EXP3 for 2 seconds (a simple delay is used instead of the actual experiment). Following this, the new USER_DATA capability is demonstrated using fictional events 88, 99, and 101 (these could be A, B, C, or any other IDs for the events). The source code is available in the demo folder of the Gazepoint installation: C:\Program Files (x86)\Gazepoint\Gazepoint\demo\

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Gazepoint sample program for USER_DATA embedding via the Gazepoint API
% Written in 2017 by Gazepoint www.gazept.com
%
% To the extent possible under law, the author(s) have dedicated all copyright
% and related and neighboring rights to this software to the public [...]
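For readers not using MATLAB, the same two command forms can be built and sent from any language with TCP sockets. Here is a minimal Python sketch; the helper names are our own, and the host/port should match your Control configuration (the OpenGaze API conventionally listens on TCP port 4242):

```python
import socket

def user_data_cmd(value, dur=None):
    """Build a USER_DATA SET command for the OpenGaze API.
    Without DUR, the value tags all subsequent data samples
    (block markers such as EXP1, EXP2); with DUR="1", exactly
    one data record is tagged (a discrete event)."""
    if dur is None:
        return f'<SET ID="USER_DATA" VALUE="{value}" />\r\n'
    return f'<SET ID="USER_DATA" VALUE="{value}" DUR="{dur}" />\r\n'

def send_user_data(sock, value, dur=None):
    # API commands are plain ASCII XML terminated with CRLF.
    sock.sendall(user_data_cmd(value, dur).encode("ascii"))

# Usage sketch (requires Gazepoint Control running locally):
# s = socket.create_connection(("127.0.0.1", 4242))
# send_user_data(s, "EXP1")         # block marker for Experiment 1
# send_user_data(s, "EVENT1", "1")  # single-record discrete event
```

The two calls in the usage sketch reproduce exactly the two XML commands shown above.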
The latest release of the Gazepoint software suite is now available to download. A few notable updates in the release:

Gazepoint Control

We have added a gain-sweep mode which, when enabled, sweeps through the full range of the system gain settings until eyes are detected. In the image to the right you can see the bright pupils properly tracked. If the pupils were too bright, the lower two eye images would be red (unidentified) and the upper face image would show bright white pupils. It has been reported that for some young subjects (under 6 years old) the pupils have such a strong bright-pupil response that the system cannot identify them properly to engage the automatic gain system; the pupils just look like glasses reflections. At this time the gain-sweep mode is optional and can be enabled by clicking on the Control window and pressing the 'P' key on the keyboard. The lower right of the taskbar will show that Gain Sweep mode is active. If this system proves to work well for this use case, we will enable it by default in future releases.

Gazepoint Analysis

A number of small bugs were fixed, and the way the Gazepoint Analysis display is rendered has been updated. This update reduces CPU usage when not actively rendering or processing data in Analyze mode.
We have done a search for 2017 academic publications that used or cited the Gazepoint eye-tracking system. You can find the list on the publications page. If you have used the Gazepoint system in your published work and wish to be added to the list, please let us know.
We are very happy to announce the release of our newest eye-tracking system, the GP3 HD. The GP3 HD maintains all the great features of the GP3: it's very easy to set up, quick to calibrate, and interfaces with all existing hardware accessories (laptop & VESA screen mounts) and software (Gazepoint Analysis & Remote Viewer). And it's still the most affordable high-performance research system on the market. As the name suggests, the GP3 HD has an upgraded imaging system that uses a larger HD image sensor and a USB3 interface to allow for a larger field of view. The GP3 HD will also be offered in either 60Hz or 150Hz variants. The GP3 HD is a significant upgrade with the following benefits:

Increased allowable head movement: The larger sensor allows for a larger field of view, which lets users sit and move more casually and naturally in front of the system. Even when a participant slouches or shifts around in their seat, the system will be able to continue to see and track their eyes. This means less data loss due to participant movement and a more comfortable experience for users.

Increased tracking performance (150Hz): The higher sampling rate of 150Hz reduces the chance of a loss of tracking due to fast head movements. The time between image frames, during which movement can occur, is reduced from 16.6ms on a 60Hz system to 6.6ms on the 150Hz GP3 HD. While loss of tracking is much reduced, blinks still result in a loss of tracking; however, at the 150Hz update rate, the GP3 HD can re-acquire tracking up to 2.5 times faster than the GP3. The increased update rate further reduces potential data loss due [...]
The latest update to the Gazepoint software is out. This version fixes a few minor bugs and implements cursor click tracking in screen-capture mode. To date we've been able to track clicks within Gazepoint Analysis (images, video, web pages) but were unable to track left/right clicks in other programs (such as during screen capture). We are happy to announce we've upgraded the tracking capability to capture left and right clicks (both mouse down and up). In the Gazepoint API, the cursor state value CS now corresponds as follows: 1 = left mouse button down, 2 = right mouse button down, 3 = left mouse button up, 4 = right mouse button up. In Gazepoint Analysis, left mouse button data is visualized with a green square centered on the click location, and right mouse button data with a red square centered on the click location.
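For scripts that post-process exported data, the CS mapping above can be captured in a small lookup table. This is an illustrative sketch (the dictionary and function are our own, not part of the Gazepoint API):

```python
# Cursor-state codes from the Gazepoint API, as listed above.
CURSOR_STATES = {
    1: "left mouse button down",
    2: "right mouse button down",
    3: "left mouse button up",
    4: "right mouse button up",
}

def decode_cursor_state(cs):
    """Translate a CS value from a data record into a readable label.
    Values outside 1-4 are treated as 'no click event'."""
    return CURSOR_STATES.get(cs, "no click event")
```

A click is then the down/up pair for the same button, e.g. CS=1 followed later by CS=3 for a complete left click.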
Blink tracking has been added to the latest release of the Gazepoint software (V3.0.0), both to Control via the Open Gaze API (all versions) and to Analysis visualizations (UX). In Control you can access the blink tracking metrics by enabling the ENABLE_SEND_BLINK setting, which adds a blink counter (BKID), the duration of the last blink (BKDUR), and the average blink rate over the last minute (BKPMIN). In Analysis, simply click the Enable display button for Blink Rate to turn on the overlay.
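Once ENABLE_SEND_BLINK is on, the blink fields arrive as attributes of the XML data records. The sketch below shows one way to pull them out of a record line in Python; it assumes the records are well-formed `<REC ... />` XML elements with BKID/BKDUR/BKPMIN attributes as named above, and the sample record in the test is illustrative only (real records carry many more fields):

```python
import xml.etree.ElementTree as ET

def parse_blink_fields(rec_line):
    """Extract blink metrics from a single <REC .../> line of the
    data stream. Returns (BKID, BKDUR, BKPMIN); any field not
    present (e.g. ENABLE_SEND_BLINK off) comes back as None."""
    rec = ET.fromstring(rec_line)
    bkid = rec.get("BKID")      # running blink counter
    bkdur = rec.get("BKDUR")    # duration of the last blink, seconds
    bkpmin = rec.get("BKPMIN")  # blinks per minute over the last minute
    return (
        int(bkid) if bkid is not None else None,
        float(bkdur) if bkdur is not None else None,
        int(bkpmin) if bkpmin is not None else None,
    )
```
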
The Gazepoint Remote Viewer provides the ability to transmit the images shown in the Gazepoint Analysis software, over a network, to a remote computer for unobtrusive observation of the experiment underway. The core features of Remote include:

- Transmission of Analysis images to remote computers over a TCP/IP network
- High-quality images with low bandwidth requirements
- Control of the experiment underway (Start / Stop / Next media item)
- Remote chat between multiple Remote clients
- Real-time event logging in the experiment by Remote clients
- Full-screen viewing

We quietly introduced the Gazepoint Remote Viewer late last year and over the last months collected feedback on the performance of the system. We originally provided the ability to tune the image quality (low to high) to vary the bandwidth required. The feedback was a resounding "Give us the highest quality images at the maximum frame rate with minimum bandwidth!", so we completely redesigned the underlying video feed protocol. The latest release is able to transmit high-quality images at 10 frames per second (fps) using approximately 20 Mbps, easily within a 100 Mbps local area network. The bandwidth/frame rate will depend somewhat on the content being shown to the user (how much image compression is possible) and the underlying network speed. While a LAN (100/1000 Mbps) connection is best, the system will still operate over a lower-bandwidth connection such as WiFi or the Internet (WAN); the frame rate may just be lower. Remote provides the ability to remotely start the recording of the experiment, move the recording to the next media item to be tested, as well as to stop the recording. This can allow the experiment subject full control of their computer (keyboard and mouse), while the [...]
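To put the quoted figures in perspective, the bandwidth budget and frame rate together imply an average compressed-frame size. A quick back-of-the-envelope helper (our own, purely illustrative):

```python
def avg_frame_size_kb(bandwidth_mbps, fps):
    """Average compressed-frame size (in kilobytes) implied by a
    bandwidth budget in megabits per second and a frame rate."""
    bits_per_frame = bandwidth_mbps * 1_000_000 / fps
    return bits_per_frame / 8 / 1000  # bits -> bytes -> kilobytes
```

With the figures above, 20 Mbps at 10 fps works out to roughly 250 KB per frame on average, which is why highly compressible content (static screens, text) can sustain higher frame rates than busy video.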
I recently looked back at our blog and saw that the last post was in February! It's been quiet on the website, but we have not been idle; we have a whole set of new features and products we'll be rolling out over the fall. We've just posted V3.0 of the Gazepoint software suite, and I'll take you through a few of the new features in the next set of blog posts.