Integrating Gazepoint Eye Tracking with Lab Streaming Layer (LSL)
At Gazepoint, we strive to make our eye tracking systems as flexible and interoperable as possible to support researchers across a wide range of disciplines. One of the most powerful tools for synchronizing data streams from multiple devices is the Lab Streaming Layer (LSL), an open-source framework widely used in neuroscience, psychology, and human-computer interaction research. We have developed a tool that streams Gazepoint API data over LSL, and with Gazepoint’s most recent software release it has been updated so that every data field in the Gazepoint API can now be streamed. In this post, we’ll introduce LSL, explain its benefits, and show you how to stream data from Gazepoint eye trackers using LSL.
What is Lab Streaming Layer (LSL)?
Lab Streaming Layer (LSL) is an open-source software system developed by the Swartz Center for Computational Neuroscience. It provides a standardized method for transmitting and synchronizing time-series data across devices and applications in real time.
Key features include:
- Sub-millisecond synchronization between multiple streams (e.g., eye tracking, EEG, GSR).
- Cross-platform support for Windows, macOS, and Linux.
- Language bindings for Python, C++, MATLAB, LabVIEW, and more.
- Automatic time correction and consistent timestamps across devices.
This makes LSL ideal for experiments that require precise coordination between multiple biosignal sources.
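To give a feel for how lightweight the API is, here is a minimal pylsl round trip that publishes a single-channel stream and reads it back on the same machine (the stream name and sample value are purely illustrative):

```python
# Minimal pylsl round trip: publish a one-channel stream and read it back.
# The stream name and the sample value are illustrative.
from pylsl import StreamInfo, StreamOutlet, StreamInlet, resolve_byprop

# Producer: advertise a single-channel float stream at an irregular rate.
outlet = StreamOutlet(StreamInfo("DemoStream", "Example", 1, 0, "float32"))

# Consumer: discover the stream by name and connect to it.
inlet = StreamInlet(resolve_byprop("name", "DemoStream")[0])
inlet.open_stream()  # make sure the connection is up before we push

outlet.push_sample([42.0])               # send one timestamped sample
sample, timestamp = inlet.pull_sample()  # receive it with its LSL timestamp
print(sample, timestamp)
```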
Why Use LSL with Gazepoint?
Gazepoint eye trackers are already used in diverse research domains—psychology, UX, education, and neuromarketing, to name a few. By integrating with LSL, you can:
- Synchronize eye tracking data with EEG, ECG, EMG, motion tracking, and other biosignals.
- Record all data streams into a single file.
- Minimize timing discrepancies that can affect the accuracy of multimodal experiments (see the short example after this list).
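For example, once your devices are streaming, a few lines of pylsl are enough to enumerate every stream visible on the network and inspect its estimated clock offset relative to your machine, which is the mechanism LSL uses to keep timestamps consistent across devices:

```python
# Enumerate every LSL stream visible on the network and report each one's
# estimated clock offset relative to this machine's LSL clock.
from pylsl import StreamInlet, resolve_streams

for info in resolve_streams(wait_time=2.0):
    inlet = StreamInlet(info)
    offset = inlet.time_correction()  # estimated offset in seconds
    print(f"{info.name()} ({info.type()}): offset = {offset * 1e3:.3f} ms")
```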
Sample Applications
Here are a few ways researchers use Gazepoint with LSL:
- Cognitive load experiments combining EEG and gaze data.
- Pupillometry studies synchronized with audio/video stimuli.
- Reaction time analysis with synchronized button presses and fixations (a marker-stream sketch follows this list).
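In the last case, events such as button presses are typically injected as an LSL marker stream, which is then recorded on the same clock as the gaze data. A minimal sketch, assuming illustrative stream and marker names:

```python
# Push experiment events as an LSL marker stream so they are recorded on the
# same clock as gaze data (stream and marker names are illustrative).
from pylsl import StreamInfo, StreamOutlet

info = StreamInfo("ExperimentMarkers", "Markers", 1, 0, "string",
                  "marker-demo")
markers = StreamOutlet(info)

markers.push_sample(["stimulus_onset"])  # when the stimulus appears
markers.push_sample(["button_press"])    # when the participant responds
```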
How to Stream Gazepoint Data via LSL
We provide a lightweight Python application that reads data from the Gazepoint Control API and streams it into LSL in real time.
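For those curious about what happens under the hood, below is a simplified sketch of such a bridge, not the shipped LSLGazepoint.py. It assumes Gazepoint Control is serving the Open Gaze API on its default localhost port 4242, and it forwards only a few illustrative fields (FPOGX, FPOGY, LPD):

```python
# Minimal, simplified Gazepoint-to-LSL bridge (illustrative only; not the
# shipped LSLGazepoint.py). Assumes Gazepoint Control is running and serving
# the Open Gaze API on localhost:4242.
import re
import socket

from pylsl import StreamInfo, StreamOutlet

HOST, PORT = "127.0.0.1", 4242  # default Gazepoint Control API address

# Advertise a 3-channel gaze stream (x, y, left pupil diameter) at the
# GP3's 60 Hz sampling rate.
info = StreamInfo("GazepointGazeDemo", "Gaze", 3, 60, "float32",
                  "gazepoint-bridge-demo")
outlet = StreamOutlet(info)

with socket.create_connection((HOST, PORT)) as sock:
    # Ask the API to send point-of-gaze and left-pupil data, then start it.
    sock.sendall(b'<SET ID="ENABLE_SEND_POG_FIX" STATE="1" />\r\n')
    sock.sendall(b'<SET ID="ENABLE_SEND_PUPIL_LEFT" STATE="1" />\r\n')
    sock.sendall(b'<SET ID="ENABLE_SEND_DATA" STATE="1" />\r\n')

    buffer = ""
    while True:
        buffer += sock.recv(4096).decode("ascii", errors="ignore")
        # Records arrive as XML fragments terminated by \r\n.
        *records, buffer = buffer.split("\r\n")
        for rec in records:
            fields = dict(re.findall(r'(\w+)="([^"]*)"', rec))
            if "FPOGX" in fields:  # a gaze data record
                outlet.push_sample([float(fields["FPOGX"]),
                                    float(fields["FPOGY"]),
                                    float(fields.get("LPD", 0.0))])
```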
To get started using a Gazepoint eye tracker with LSL, follow the steps below.

Prerequisites:
- Python
- PyLSL (Installation instructions here: https://pypi.org/project/pylsl/ )
- Lab Recorder or other LSL recording tool (Lab Recorder can be found here: https://github.com/labstreaminglayer/App-LabRecorder/releases )
1. Run Gazepoint Control: Launch the Gazepoint Control software and ensure the Gazepoint GP3 eye gaze tracker is calibrated.
2. Run the Gazepoint LSL bridge application: Using Python, run LSLGazepoint.py (found in the demo folder of your Gazepoint installation).
3. Stream and record: Open Lab Recorder. Once it is running, you’ll see three streams appear: GazepointEyeTracker, GazepointBiometrics, and GazepointStringData. You can now record EEG, ECG, and other LSL-compatible devices simultaneously into one synchronized XDF file for analysis.
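After recording, you can quickly confirm what was captured by loading the XDF file in Python with the pyxdf package (the filename below is just an example):

```python
# Load a recorded XDF file and summarize its streams
# (assumes `pip install pyxdf`; the filename is just an example).
import pyxdf

streams, header = pyxdf.load_xdf("my_experiment.xdf")
for stream in streams:
    name = stream["info"]["name"][0]
    stype = stream["info"]["type"][0]
    n_samples = len(stream["time_stamps"])
    print(f"{name} ({stype}): {n_samples} samples")
```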
Conclusion
Combining Gazepoint eye tracking with Lab Streaming Layer gives you the power and precision to conduct truly multimodal research. Whether you’re working in neuroscience, human factors, or UX testing, the LSL framework helps ensure that your data streams are synchronized and accurate.
Interested in adding LSL support to your Gazepoint setup? Contact us to learn more!