Saccades

Early visual signatures and benefits of intra-saccadic motion streaks
To efficiently explore our visual environment, we humans incessantly make brief and rapid eye movements. These so-called saccades inevitably shift the entire visual image across the retina, thereby inducing, much like a moving camera with a long exposure duration, a significant amount of motion blur that transforms single objects into elongated, smeared motion streaks. While simultaneously recording electroencephalography and eye tracking, we asked human observers to make saccades to a target stimulus which then rapidly changed location while their eyes were in mid-flight. Critically, we compared smooth target motion to a simple jump, thus isolating neural responses and behavioral benefits specific to motion streaks. For continuous motion (i.e., when streaks were available), the post-saccadic target location could be decoded earlier from the electrophysiological data, and secondary saccades were directed more quickly to the new target location. Indeed, decoding of target location succeeded immediately after the end of the saccade and was most efficient on occipital sensors, suggesting that saccade-induced motion streaks are represented in visual cortex. Computational modeling of motion streaks as a consequence of early visual processing suggests that fast motion could be efficiently coded in orientation-selective channels, providing a parsimonious mechanism by which the brain exploits motion streaks for goal-directed behavior.
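To give a flavor of the orientation-channel idea, here is a minimal, purely illustrative sketch (my own assumptions, not the analysis code from the paper): a dot that moves fast within one temporal integration window leaves an elongated streak in the integrated image, and an orientation-selective filter whose bars are parallel to the motion direction responds much more strongly to that streak than an orthogonally tuned one. Image size, speed, and Gabor settings are arbitrary choices for illustration.

```python
# Minimal illustration (assumed parameters, not the paper's code):
# temporal integration of a fast-moving dot yields an oriented streak
# that is picked up by an orientation-selective (Gabor) channel
# aligned with the motion direction.
import numpy as np
from scipy.signal import fftconvolve

size, n_frames, speed = 128, 30, 2                    # px, frames, px per frame
frames = np.zeros((n_frames, size, size))
for t in range(n_frames):                             # dot sweeps horizontally
    x = size // 2 - (n_frames // 2 - t) * speed
    frames[t, size // 2, x] = 1.0
streak = frames.sum(axis=0)                           # crude temporal integration

def gabor(theta, sigma=6.0, freq=0.15, sz=31):
    """2-D Gabor; theta is the direction of carrier modulation (radians)."""
    r = np.arange(sz) - sz // 2
    X, Y = np.meshgrid(r, r)
    Xr = X * np.cos(theta) + Y * np.sin(theta)
    return np.exp(-(X**2 + Y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * Xr)

# Channel with bars parallel to the (horizontal) motion vs. orthogonal to it
parallel = np.abs(fftconvolve(streak, gabor(np.pi / 2), mode="same")).sum()
orthogonal = np.abs(fftconvolve(streak, gabor(0.0), mode="same")).sum()
print(f"parallel / orthogonal channel energy: {parallel / orthogonal:.1f}")
```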
Saccadic omission revisited: What saccade-induced smear looks like
We rarely become aware of the immediate sensory consequences of our own saccades, that is, the massive amount of motion blur induced as the entire visual scene shifts across the retina. In this paper, we applied a novel tachistoscopic presentation technique to flash natural scenes in total darkness while observers made saccades. That way, the motion smear induced by rapid image motion (otherwise omitted from perception) became readily observable. With this setup we could not only study the time course of motion smear generation and reduction, but also determine what visual features are encoded in smeared images: low spatial frequencies and, most prominently, orientations parallel to the direction of the ongoing saccade. Using some cool computational modeling, we show that these results can be explained assuming no more than saccadic velocity and human contrast sensitivity profiles. To demonstrate that motion smear is directly linked to saccade dynamics, we show that the time course of perceived smear across observers can be predicted by a parsimonious motion-filter model that takes only the eyes’ trajectories as input. And the best thing is that this works even if no saccades are made and the visual consequences of saccades are merely replayed to the fixating eye. In the name of open science, all modeling code, as well as data and data analysis code, is again publicly available.
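As a rough illustration of what such a motion-filter model could look like, here is a hedged sketch under my own assumptions (not the published model or its code): the eye-position trace within a temporal integration window is collapsed into a 2-D smear kernel, weighted by a simple exponential visual-persistence term, and convolved with the scene; function names and parameter values are purely illustrative.

```python
# Sketch of a motion-filter model of intra-saccadic smear (assumptions, not
# the paper's code): the eye trajectory defines a smear kernel that is
# convolved with the scene, producing blur parallel to the saccade direction.
import numpy as np
from scipy.signal import fftconvolve

def saccade_trajectory(amplitude_px=40, duration_ms=40, fs=1000):
    """Toy horizontal saccade: smooth (raised-cosine) position profile."""
    t = np.arange(0, duration_ms / 1000, 1 / fs)
    x = amplitude_px * 0.5 * (1 - np.cos(np.pi * t / t[-1]))
    return x, np.zeros_like(x)

def smear_kernel(x, y, size=81, tau_ms=30, fs=1000):
    """Accumulate eye positions into a 2-D kernel, weighted by an exponential
    temporal response (visual persistence with time constant tau_ms)."""
    k = np.zeros((size, size))
    w = np.exp(-np.arange(len(x))[::-1] / (tau_ms * fs / 1000))
    for xi, yi, wi in zip(x, y, w):
        r, c = int(round(size // 2 + yi)), int(round(size // 2 + xi - x.mean()))
        if 0 <= r < size and 0 <= c < size:
            k[r, c] += wi
    return k / k.sum()

rng = np.random.default_rng(0)
scene = rng.random((256, 256))                 # stand-in for a natural scene
x, y = saccade_trajectory()
smeared = fftconvolve(scene, smear_kernel(x, y), mode="same")
```

Because the kernel is elongated along the eye's path, high spatial frequencies orthogonal to the streak are attenuated while orientations parallel to the saccade survive, which is the qualitative pattern described above.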
Definition, Modeling, and Detection of Saccades in the Face of Post-saccadic Oscillations
When looking at data recorded by video-based eye tracking systems, one may notice brief periods of instability around saccade offset. These so-called post-saccadic oscillations are caused by inertial forces acting on the elastic components of the eye, such as the iris or the lens, and can greatly distort estimates of saccade duration and peak velocity. In this paper, we describe and evaluate biophysically plausible models (for a demonstration, see the Shiny app) that can not only approximate saccade trajectories observed in video-based eye tracking, but also extract the underlying, and otherwise unobservable, rotation of the eyeball. We further present detection algorithms for post-saccadic oscillations, which are made publicly available. Finally, we demonstrate how accurate models of saccade trajectories can be used to generate data and mathematically tractable ground-truth labels for training ML-based algorithms capable of accurately detecting post-saccadic oscillations.
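To make the last point concrete, here is a small, purely illustrative sketch (my own assumptions, not the biophysical models from the paper): a saccade trace is simulated with a damped-sinusoid post-saccadic oscillation appended at saccade offset, together with per-sample ground-truth labels of the kind one could use to train an ML-based PSO detector. All parameter values are made up for illustration.

```python
# Illustrative simulation (assumed parameters, not the paper's models):
# a saccade followed by a damped-sinusoid post-saccadic oscillation,
# with per-sample ground-truth labels for training a detector.
import numpy as np

def simulate_trial(fs=1000, amplitude=8.0, sacc_dur=0.045,
                   pso_freq=20.0, pso_amp=0.6, pso_decay=30.0, seed=0):
    rng = np.random.default_rng(seed)
    t = np.arange(0, 0.4, 1 / fs)                 # 400 ms trial
    onset, offset = 0.15, 0.15 + sacc_dur         # saccade window (s)
    pos = np.zeros_like(t)

    in_sacc = (t >= onset) & (t < offset)
    phase = (t[in_sacc] - onset) / sacc_dur
    pos[in_sacc] = amplitude * 0.5 * (1 - np.cos(np.pi * phase))  # smooth ramp
    pos[t >= offset] = amplitude

    in_pso = (t >= offset) & (t < offset + 0.08)  # ~80 ms of oscillation
    tp = t[in_pso] - offset
    pos[in_pso] += pso_amp * np.exp(-pso_decay * tp) * np.sin(2 * np.pi * pso_freq * tp)

    pos += rng.normal(0, 0.02, size=t.shape)      # measurement noise

    labels = np.zeros_like(t, dtype=int)          # 0 = fixation
    labels[in_sacc] = 1                           # 1 = saccade
    labels[in_pso] = 2                            # 2 = post-saccadic oscillation
    return t, pos, labels

t, pos, labels = simulate_trial()
```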