As of Gazepoint release V4.2.0, the USER_DATA option in the OpenGaze API has been updated to include an optional duration tag. The DUR tag allows discrete events to be embedded in the data stream. Previously, if a USER_DATA value was set in the gaze data stream:

<SET ID="USER_DATA" VALUE="EXP1" />

the USER_DATA value would be set for all subsequent data samples, which is useful for blocking off portions of the data stream, such as Experiment 1, Experiment 2, etc. The duration option allows embedding single events (DUR="1"), which means only a single data record will be tagged with the event:

<SET ID="USER_DATA" VALUE="EVENT1" DUR="1" />

With this update it is possible to read other sensors, such as serial port devices, and then use the API to embed the results in the data stream. An illustrative example using MATLAB is shown below, where Control is running to generate the gaze data and Analysis is running to collect the data. In the example, the original use case is shown first: USER_DATA is set to EXP1 for 1 second of data, EXP2 for 1.5 seconds, and EXP3 for 2 seconds (a simple delay is used in place of the actual experiment). Following this, the new USER_DATA capability is demonstrated using fictional events 88, 99, and 101 (these could be A, B, C, or any other IDs for the events). The source code is available in the demo folder of the Gazepoint installation: C:\Program Files (x86)\Gazepoint\Gazepoint\demo\

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Gazepoint sample program for USER_DATA embedding via the Gazepoint API
% Written in 2017 by Gazepoint www.gazept.com
%
% To the extent possible under law, the author(s) have dedicated all copyright
% and related and neighboring rights to this software to the public [...]
The latest release of the Gazepoint software suite is now available to download. A few notable updates in the release:

Gazepoint Control

We have added a gain-sweep mode which, when enabled, sweeps through the full range of the system gain settings until eyes are detected. In the image to the right you can see the bright pupils properly tracked. If the pupils were too bright, the lower two eye images would be red (unidentified) and the upper face image would show bright white pupils. It has been reported that for some young subjects (under 6 years old) the bright pupil response is so strong that the system cannot identify the pupils properly to engage the automatic gain system; the pupils just look like glasses reflections. At this time the gain-sweep mode is optional and can be enabled by clicking on the Control window and pressing the 'P' key on the keyboard. The lower right of the taskbar will show that Gain Sweep mode is active. If this system proves to work well for this use case, we will enable it by default in future releases.

Gazepoint Analysis

A number of small bug fixes were added, as well as an update to the way the Gazepoint Analysis display is rendered. This update reduces CPU usage when not actively rendering or processing data in Analyze mode.
We have done a search for 2017 academic publications which used or cited the Gazepoint eye tracking system. You can find the list on the publications page. If you have used the Gazepoint system in your published work and wish to be added to the list, please let us know.
We are very happy to announce the release of our newest eye tracking system, the GP3 HD. The GP3 HD maintains all the great features of the GP3: it's very easy to set up, quick to calibrate, and interfaces with all existing hardware accessories (laptop and VESA screen mounts) and software (Gazepoint Analysis and Remote Viewer). And it's still the most affordable high-performance research system on the market. As the name suggests, the GP3 HD has an upgraded imaging system which uses a larger HD image sensor and a USB3 interface to allow for a larger field of view. The GP3 HD will also be offered in 60 Hz and 150 Hz variants.

The GP3 HD is a significant upgrade with the following benefits:

Increased allowable head movement: The larger sensor allows for a larger field of view, which lets users sit and move more casually and naturally in front of the system. Even when a participant slouches or shifts around in their seat, the system will be able to continue to see and track their eyes. This means less data loss due to movement of the participants and a more comfortable experience for the users.

Increased tracking performance (150 Hz): The higher sampling rate of 150 Hz reduces the chance of a loss of tracking due to fast head movements, as the time between image frames is reduced from 16.6 ms on a 60 Hz system to 6.6 ms on the 150 Hz GP3 HD system. While loss of tracking is much reduced, blinks still result in a loss of tracking; however, at the 150 Hz update rate the GP3 HD can re-acquire tracking up to 2.5 times faster than the GP3. The increased update rate further reduces potential data loss due [...]
The latest update to the Gazepoint software is out. This version fixes a few minor bugs and implements cursor click tracking in screen capture mode. To date we've been able to track clicks within Gazepoint Analysis (images, video, web pages) but were unable to track left and right clicks in other programs (such as in screen capture mode). We are happy to announce we've upgraded the tracking capability to capture left and right clicks (both mouse down and up). In the Gazepoint API, the cursor state value CS now corresponds as follows:

1 = left mouse button down
2 = right mouse button down
3 = left mouse button up
4 = right mouse button up

In Gazepoint Analysis, left mouse button clicks are visualized with a green square centered on the click location, and right mouse button clicks with a red square centered on the click location.
Blink tracking has been added to the latest release of the Gazepoint software (V3.0.0), both to Control via the Open Gaze API (all versions) and to the Analysis visualizations (UX). In Control you can access the blink tracking metrics by enabling the ENABLE_SEND_BLINK setting, which adds a blink counter (BKID), the duration of the last blink (BKDUR), and the average blink rate over the last minute (BKPMIN) to the data stream. In Analysis, simply click the Enable display button for Blink Rate to turn on the overlay.
The Gazepoint Remote Viewer provides the ability to transmit the images shown in the Gazepoint Analysis software over a network to a remote computer, for unobtrusive observation of the experiment underway. The core features of Remote include:

Transmission of Analysis images to remote computers over a TCP/IP network
High-quality images and low bandwidth requirements
Control of the experiment underway (Start / Stop / Next media item)
Remote chat between multiple Remote clients
Real-time event logging in the experiment by Remote clients
Full-screen viewing

We quietly introduced the Gazepoint Remote Viewer late last year and over the last months collected feedback on the performance of the system. We originally provided the ability to tune the quality of the image (low to high) to vary the level of bandwidth required. The feedback was a resounding "Give us the highest quality images at the maximum frame rate with minimum bandwidth!", so we completely redesigned the underlying video feed protocol. The latest release is able to transmit high-quality images at 10 frames per second (fps) using approximately 20 Mbps, easily within a 100 Mbps local area network. The bandwidth and frame rate will depend somewhat on the content being shown to the user (how much image compression is possible) and the underlying network speed. While a LAN (100/1000 Mbps) connection is best, the system will still operate over a lower-bandwidth connection such as WiFi or the Internet (WAN); the frame rate may just be lower.

Remote provides the ability to remotely start the recording of the experiment, move the recording to the next media item to be tested, and stop the recording. This can allow the experiment subject full control of their computer (keyboard and mouse), while the [...]
I recently looked back at our blog and the last post was in February! It's been quiet on the website, but we have not been idle; we have a whole set of new features and products we'll be rolling out over the fall. We've just posted V3.0 of the Gazepoint software suite, and I'll take you through a few of the new features in the next set of blog posts.
Gazepoint has been covered in the news a fair amount recently. After coverage in the local Vancouver Sun newspaper, Dr. Hennessey has given interviews on CKNW radio (SoundCloud link below) as well as on CTV about eye-tracking applications, and has even given a sneak peek at our head-mounted prototype system. We have also added another 3 citations of the Gazepoint GP3 in academic papers to the publications page.
Gazepoint's affordable and easy-to-use GP3 means you can outfit an entire classroom with eye trackers! Leading eye tracking expert Dr. Andrew Duchowski of Clemson University set up a classroom with a GP3 eye tracker and Gazepoint Analysis software on each desk for the Fall 2014 school term. Dr. Duchowski teaches eye tracking from low-level systems through to end-use applications, and found that sharing a limited number of devices between students really limited their exposure to, and first-hand understanding of, how eye trackers work and are best used. With the help of Gazepoint, problem solved! With an entire laboratory facility instrumented with eye trackers and eye tracking software tools, it became much faster and easier to teach core concepts and have the students gain practical hands-on experience with running eye tracking studies.

A total of 20 workstations were instrumented with GP3 desktop eye trackers attached to 22" displays with the VESA mount adaptor. The VESA mounts ensure the GP3 is solidly affixed to the workstation, while also properly positioning the eye tracker with respect to the viewer. Each workstation ran a full version of the Gazepoint Analysis UX software.

"Teaching the class with 20 eye trackers, with one per student, was liberating. Contention issues were all but eliminated. A portable eye tracker on a tripod in front of the lectern, with the instructor's display projected on the screen, allowed a hands-on demo where each student ran their own calibration and their own first mock study with an image as stimulus. Prior to the instrumented classroom, students would often procrastinate with this kind of experimentation, jeopardizing their potential for success later in the semester. The in-class hands-on demo allowed each student to quickly grasp how to calibrate the device, record data, and then immediately see how scanpaths and heatmaps correspond to what they had [...]