Summary

We describe here PyOKR, a semi-automated quantitative analysis method that directly measures eye movements resulting from visual responses to two-dimensional image motion. A Python-based user interface and analysis algorithm allow for higher-throughput and more accurate quantitative measurement of eye-tracking parameters than previous methods.

Abstract

The study of behavioral responses to visual stimuli is a key component of understanding visual system function. One notable response is the optokinetic reflex (OKR), a highly conserved innate behavior necessary for image stabilization on the retina. The OKR provides a robust readout of image tracking ability and has been extensively studied to understand visual system circuitry and function in animals from different genetic backgrounds. The OKR consists of two phases: a slow tracking phase as the eye follows a stimulus to the edge of the visual plane and a compensatory fast phase saccade that resets the position of the eye in the orbit. Previous methods of tracking gain quantification, although reliable, are labor intensive and can be subjective or arbitrarily derived. To obtain more rapid and reproducible quantification of eye tracking ability, we have developed a novel semi-automated analysis program, PyOKR, that allows for quantification of two-dimensional eye tracking motion in response to any directional stimulus, in addition to being adaptable to any type of video-oculography equipment. This method provides automated filtering, selection of slow tracking phases, modeling of vertical and horizontal eye vectors, quantification of eye movement gains relative to stimulus speed, and organization of resultant data into a usable spreadsheet for statistical and graphical comparisons. This quantitative and streamlined analysis pipeline, readily accessible via PyPI import, provides a fast and direct measurement of OKR responses, thereby facilitating the study of visual behavioral responses.

Introduction

Image stabilization relies on precise oculomotor responses to compensate for global optic flow that occurs during self-motion. This stabilization is driven primarily by two motor responses: the optokinetic reflex (OKR) and the vestibulo-ocular reflex (VOR)1,2,3. Slow global motion across the retina induces the OKR, which elicits reflexive eye rotation in the corresponding direction to stabilize the image1,2. This movement, known as the slow phase, is interrupted by compensatory saccades, known as the fast phase, in which the eye rapidly resets in the opposite direction to allow for a new slow phase. Here, we define these fast-phase saccades as eye-tracking movements (ETMs). Whereas the VOR relies on the vestibular system to elicit eye movements that compensate for head movements3, the OKR is initiated in the retina by the firing of ON direction-selective ganglion cells (ON DSGCs) and subsequent signaling to the Accessory Optic System (AOS) in the midbrain4,5. Due to its direct reliance on retinal circuits, the OKR has been frequently used to determine visual tracking ability in both research and clinical settings6,7.

The OKR has been studied extensively as a tool for assessing basic visual ability2,6,8, DSGC development9,10,11,12, oculomotor responses13, and physiological differences among genetic backgrounds7. The OKR is evaluated in head-fixed animals presented with a moving stimulus14. Oculomotor responses are typically captured using a variety of video tools, and eye-tracking motions are captured as OKR waveforms in the horizontal and vertical directions9. To quantify tracking ability, two primary metrics have been described: tracking gain (the velocity of the eye relative to the velocity of the stimulus) and ETM frequency (the number of fast phase saccades over a given time frame). Calculation of gain has been used historically to directly measure angular velocity of the eye to estimate tracking ability; however, these calculations are labor intensive and can be arbitrarily derived based on video-oculography collection methods and subsequent quantification. For more rapid OKR assessment, counting of ETM frequency has been used as an alternate method for measuring tracking acuity7. Although this provides a fairly accurate estimation of tracking ability, this method relies on an indirect metric to quantify the slow phase response and introduces a number of biases. These include an observer bias in saccade determination, a reliance on temporally consistent saccadic responses across a set epoch, and an inability to assess the magnitude of the slow phase response.
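
As a numerical illustration of the two metrics defined above, tracking gain and ETM frequency reduce to simple arithmetic; all values below are hypothetical and chosen only for illustration:

```python
# Hypothetical illustration of the two OKR metrics described above.
# Tracking gain = eye velocity relative to stimulus velocity (slow phase);
# ETM frequency = fast-phase saccades per unit time.

stimulus_speed = 10.0                          # deg/s (hypothetical)
slow_phase_velocities = [8.2, 7.9, 8.5, 8.0]   # deg/s per slow phase (hypothetical)

# Gain averaged across slow phases
gain = sum(slow_phase_velocities) / len(slow_phase_velocities) / stimulus_speed

# ETM frequency over a recording epoch
n_saccades = 12          # counted fast-phase resets (hypothetical)
epoch_length_s = 30.0    # epoch duration in seconds
etm_frequency = n_saccades / epoch_length_s

print(round(gain, 3), round(etm_frequency, 2))   # 0.815 0.4
```

Note that the frequency metric never sees the slow-phase trajectory itself, which is why it cannot capture the magnitude of the slow phase response.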

To address these concerns with current OKR assessment approaches and to enable high-throughput, in-depth quantification of OKR parameters, we have developed a new analysis method to quantify OKR waveforms. Our approach uses an accessible Python-based software platform named "PyOKR." Using this software, modeling and quantification of OKR slow phase responses can be studied in greater depth and with increased parameterization. The software provides accessible and reproducible quantitative assessments of responses to a wide range of visual stimuli, including two-dimensional visual tracking in response to horizontal and vertical motion.

Protocol

All animal experiments performed at The Johns Hopkins University School of Medicine (JHUSOM) were approved by the Institutional Animal Care and Use Committee (IACUC) at the JHUSOM. All experiments performed at the University of California, San Francisco (UCSF) were performed in accordance with protocols approved by the UCSF Institutional Animal Care and Use Program.

1. Behavioral data collection

  1. Record OKR eye movements using the video-oculography method of choice to generate wave data (i.e., a time series of the eye's gaze angle in spherical coordinates).
    NOTE: Representative data collected at JHUSOM were obtained using headpost implantation surgery and video-oculography, as previously described9,13 (Figure 1). Representative data collected at UCSF were obtained using the headpost implantation surgery and video-oculography method previously described10 (Figure 7).
    1. Make note of stimulus and recording parameters: recording frame rate, stimulus speed and direction, and lengths of time between and after stimulus epochs. For sinusoidal stimuli, note the amplitude and frequency of the stimulus wave as well.
  2. Export collected wave data as a .CSV file containing horizontal and vertical (azimuth and elevation) wave data.
    1. Organize wave data as a tab-delimited .CSV file with two columns containing horizontal data (epxWave) and vertical data (epyWave).
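
As a sketch of the expected file layout (using only the Python standard library; the values are hypothetical, while the epxWave/epyWave column names and tab delimiter follow steps 1.2-1.2.1):

```python
import csv

# Hypothetical azimuth (epxWave) and elevation (epyWave) angles per frame, in degrees
rows = [
    ("epxWave", "epyWave"),   # two-column header expected by the protocol
    (0.0, 0.0),
    (0.5, 0.1),
    (1.1, 0.1),
    (1.6, 0.2),
]

# Write a tab-delimited file carrying the .CSV extension, as described above
with open("example_wave.csv", "w", newline="") as f:
    csv.writer(f, delimiter="\t").writerows(rows)

# Reading it back recovers the two-column layout
with open("example_wave.csv", newline="") as f:
    header = next(csv.reader(f, delimiter="\t"))
print(header)   # ['epxWave', 'epyWave']
```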

2. Installation of analysis software

  1. Download and install Python.
    1. For graph supervision, install Spyder via Anaconda.
    2. To ensure graphs function correctly in Spyder, go to Tools > Preferences > IPython Console > Graphics > Graphics Backend and change the backend from Inline to Automatic.
  2. Create a new Anaconda environment with Python.
  3. Install PyOKR via PyPI with pip install PyOKR to install the newest version along with its package dependencies (Supplementary Coding File 1 and Supplementary Coding File 2).
  4. If a Windows computer is being used, run from PyOKR import OKR_win as o and then o.run().
  5. If a Mac computer is being used, run from PyOKR import OKR_osx as o and then o.run().

3. Analysis of wave data

  1. Initialization of analysis and file imports
    1. Run o.run() in a .py script to open the user interface.
    2. Under File, use the function Open or the command Ctrl+O (or the equivalent macOS shortcut) to open a file browser that will allow the user to select the desired wave file.
    3. Under File, use the button Export Folder or the command Ctrl+E to open a folder browser that will allow the selection of an output folder to which final analyses will be exported.
    4. Input the final analysis file name under Output file, using a recommended format such as AnimalGenotype_AnimalNumber_Analysis.
    5. Set the program for an individual animal using the command Set Subject under File or the command Ctrl+S to initialize the dataset for an individual animal.
  2. Definition of wave file parameters
    1. To begin setting stimulus parameters, define directionality under Select stimulus direction by selecting one of the four cardinal directions. For sinusoidal stimuli, select one that contains (Horizontal) or (Vertical) accordingly, with the cardinal direction defining the initial direction of the sinusoidal wave.
    2. Set stimulus type under Select stimulus type as either Unidirectional, Oscillatory, or Oblique.
    3. After setting the directionality, import either one's own stimulus position dataset (Import own stimulus vector data) or automatically generate a vector based on parameters (Generate stimulus vector from parameters). If importing a stimulus vector, proceed with 3.2.3.1 and then skip to step 3.3. If generating a stimulus vector, proceed with the next steps.
      1. If importing one's own vector data, import the distance values of the stimulus (i.e., a time series describing how far the stimulus moves between each adjacent acquisition frame) in the same format described in step 3.2.1. Additionally, analyze the entire dataset as one epoch rather than splitting it into individual epochs, as functionality for subsetting imported stimulus values has not been added as of PyOKR v1.1.2.
    4. Under Stimulus parameters, set the parameters of the stimulus used for data collection.
      1. Set the length of time of no stimulus at the beginning (head) and end (tail) of a given trial with Head and Tail.
      2. Set the amount of time a stimulus is shown, the amount of time of no-stimulus after, and the number of total epochs within a given trial with Length of epoch, Length of post-stimulus, and Number of Epochs, respectively.
      3. For unidirectional and oblique stimuli, set stimulus speed in degrees per second with Horizontal Speed and Vertical Speed.
      4. Set the capture rate of the collection camera with Capture frame rate.
      5. For sinusoidal stimuli, generate the sinusoidal wave for modeling oscillatory stimuli with Frequency and Amplitude.
    5. After parameterization, generate the stimulus model from the information entered above with Generate stimulus vector from parameters.
    6. Select a given epoch for the inputted stimulus using Select epoch to scan through the total wave file.
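
The parameter names below mirror the fields in step 3.2.4; the construction itself is only a sketch of how a stimulus position vector can be generated from them (PyOKR's internal implementation may differ):

```python
import math

# Sketch of stimulus-vector generation from the parameters in step 3.2.4.
frame_rate = 60.0                        # Capture frame rate (Hz)
epoch_s = 5.0                            # Length of epoch (s)
n_frames = int(frame_rate * epoch_s)

# Unidirectional stimulus: position increases linearly at Horizontal Speed (deg/s)
h_speed = 10.0
unidirectional = [h_speed * i / frame_rate for i in range(n_frames + 1)]

# Sinusoidal stimulus: position oscillates with Amplitude (deg) and Frequency (Hz)
amplitude, frequency = 5.0, 0.4
sinusoidal = [amplitude * math.sin(2 * math.pi * frequency * i / frame_rate)
              for i in range(n_frames + 1)]

# Per-frame distance values, the format expected for an imported vector (step 3.2.3.1)
distances = [b - a for a, b in zip(unidirectional, unidirectional[1:])]
```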
  3. Supervised selection of tracking phases
    1. To identify regions of slow tracking, automatically select fast phase saccades with Preliminary adjustment by clicking either Unfiltered Data or Filtered Data, which will label potential saccades based on maximal velocity changes.
    2. Under Unfiltered Data, confirm that saccades are accurately selected with a blue dot. If automatic selection is not accurate, manually remove points with the Left Mouse Button (LMB) or add points with the Right Mouse Button (RMB). When fast phase saccades are adequately selected, save the points with the Middle Mouse Button (MMB) and close the graph.
    3. If automatic filtering is desired, set a Z-Score Threshold and click Filtered Data to automatically filter saccades. If necessary, use the same manual supervision as described in step 3.3.2 to remove any noise.
    4. After proper saccade selection, press Point Adjustment to select the region to remove. Alter top and bottom points through a similar control scheme as described previously in step 3.3.2. Edit top (green) points with the LMB or the RMB and edit bottom (red) points with the Shift+LMB or Shift+RMB. When points are properly placed, use the MMB to save the points.
      NOTE: If using a Mac, bottom- and top-point adjustments are assigned to two separate buttons and follow the same control scheme as described in step 3.3.2.
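
The velocity-outlier logic behind steps 3.3.1-3.3.3 can be sketched as follows; this is a simplified stand-in for PyOKR's filtering, and the trace and threshold are hypothetical:

```python
# Simplified sketch of z-score-based saccade flagging (steps 3.3.1-3.3.3).
# Fast-phase saccades appear as outliers in frame-to-frame velocity.

def flag_saccades(positions, z_threshold=2.0):
    """Return velocity indices whose z-score magnitude exceeds the threshold."""
    velocities = [b - a for a, b in zip(positions, positions[1:])]
    mean = sum(velocities) / len(velocities)
    var = sum((v - mean) ** 2 for v in velocities) / len(velocities)
    std = var ** 0.5 or 1.0   # guard against a zero-variance trace
    return [i for i, v in enumerate(velocities)
            if abs(v - mean) / std > z_threshold]

# Hypothetical trace: slow tracking (+0.2 deg/frame) with one fast reset (-3 deg)
trace = [0.0, 0.2, 0.4, 0.6, 0.8, -2.2, -2.0, -1.8, -1.6]
print(flag_saccades(trace))   # [4] - the fast-phase reset stands out
```

Raising the threshold makes the detector more conservative, which is why manual supervision of the flagged points (step 3.3.2) remains useful for noisy traces.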
  4. Analysis of slow-tracking phases
    1. Set the order of the polynomial model using Set Polynomial Order to define the polynomial model that will be fitted to individual slow phases.
      NOTE: For unidirectional or oblique stimuli, the default value is 1 since linearity is necessary to calculate tracking gain. For sinusoidal stimuli, a higher order is needed to model the curve of the wave, with a default of 15.
    2. To analyze the trace, select Final Analysis to generate the slow phase models (Figure 2) for the selected slow phases (see Figure 2A-D) and calculate the distances, velocities, and tracking gains averaged across the epoch (Figure 2E).
    3. To view the two-dimensional (2D) or three-dimensional (3D) graph of the selected regions, select View 2D graph or View 3D graph, respectively.
    4. Select Add epoch to save the collected values generated in step 3.4.2. To view all added values for a given animal as well as averages for collected trials, select View current dataset.
    5. After an epoch is added, cycle through the rest of the file with Select epoch, following steps 3.3.1 to 3.4.4.
    6. Once a wave file is fully analyzed, repeat this process for all other files for a given animal by opening new files, setting appropriate parameters, and analyzing them accordingly. By repeating steps 3.2.1-3.4.5 for each file, generate a final dataset containing all wave data for a given animal.
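
For a unidirectional stimulus, the gain arithmetic in steps 3.4.1-3.4.2 amounts to fitting a line (order-1 polynomial) to each slow phase and dividing its slope by the stimulus speed; a minimal sketch with hypothetical values:

```python
# Sketch of tracking-gain calculation for a unidirectional stimulus:
# slope of an order-1 polynomial fit to a slow phase, divided by stimulus speed.

def fit_slope(t, y):
    """Least-squares slope of y over t (equivalent to an order-1 polynomial fit)."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    return (sum((a - mt) * (b - my) for a, b in zip(t, y))
            / sum((a - mt) ** 2 for a in t))

frame_rate = 60.0
stimulus_speed = 10.0                       # deg/s (hypothetical)
t = [i / frame_rate for i in range(30)]     # half a second of one slow phase
eye = [8.0 * ti for ti in t]                # hypothetical eye moving at 8 deg/s

gain = fit_slope(t, eye) / stimulus_speed
print(round(gain, 3))   # 0.8
```

For sinusoidal stimuli the same idea applies, except that a higher-order polynomial is fitted (step 3.4.1) and velocities are compared point-wise along the curve.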
  5. Final export of data
    1. After data analysis is complete for a given animal, with all directions or stimuli analyzed, export the dataset via Export data.
      NOTE: The raw dataset will be exported based on the Output file name and saved along the path set by Output Folder as a CSV containing individual epoch data with the total mean for each stimulus parameter.
    2. After exporting an individual animal, re-initialize the dataset with Ctrl+S and then repeat all previous steps to analyze a new animal.
    3. If needed, re-organize all the output data collected for multiple animals for easier analysis using the command Sort Data under the Analysis tab.
      NOTE: This function will compile and sort all average values for all the analyzed animal files stored within the output folder to allow for easier generation of graphs and statistical comparisons. Sorting is reliant on the naming of the files as of v1.1.2. Use the recommended naming scheme as described in step 3.1.4 for each file (e.g., WT_123_Analysis).
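
The naming scheme from step 3.1.4 makes the per-animal outputs straightforward to group programmatically; the sketch below uses hypothetical filenames, and PyOKR's own Sort Data function may organize its output differently:

```python
from collections import defaultdict

# Hypothetical output files following the Genotype_Number_Analysis scheme
filenames = ["WT_123_Analysis.csv", "WT_124_Analysis.csv", "KO_201_Analysis.csv"]

# Group animal numbers by genotype, as a basis for graphs and statistics
by_genotype = defaultdict(list)
for name in filenames:
    genotype, animal, _ = name.removesuffix(".csv").split("_")
    by_genotype[genotype].append(animal)

print(dict(by_genotype))   # {'WT': ['123', '124'], 'KO': ['201']}
```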

Results

To validate the analysis method described above, we quantified OKR tracking gain on wave traces collected from wild-type mice and a conditional knockout mutant with a known tracking deficit. In addition, to test the broader applicability of our analysis method, we analyzed traces derived from a separate cohort of wild-type mice acquired using a different video-oculography collection method. The automatic filtering of saccades facilitates OKR data processing and analysis (Figure 3). Using rec...

Discussion

PyOKR provides several advantages for studying visual responses reflected in eye movements. These include accuracy, accessibility, and data collection options, in addition to the ability to incorporate parameterization and variable stimulus speeds.

Direct eye tracking gain assessment provides an accurate characterization of eye movement that is a more direct quantitative metric than traditional manual counting of fast phase saccades (ETMs). Although useful, saccade counting provides an indirec...

Disclosures

The authors have no conflicts of interest.

Acknowledgements

This work was supported by R01 EY032095 (ALK), VSTP pre-doctoral fellowship 5T32 EY7143-27 (JK), F31 EY-033225 (SCH), R01 EY035028 (FAD and ALK) and R01 EY-029772 (FAD).

Materials

Name | Company | Catalog Number | Comments
C57BL/6J mice | Jackson Labs | 664 |
Igor Pro | WaveMetrics | RRID: SCR_000325 |
MATLAB | MathWorks | RRID: SCR_001622 |
Optokinetic reflex recording chamber - JHUSOM | Custom-built | N/A | As described in Al-Khindi et al. (2022)9 and Kodama et al. (2016)13
Optokinetic reflex recording chamber - UCSF | Custom-built | N/A | As described in Harris and Dunn (2023)10
Python | Python Software Foundation | RRID: SCR_008394 |
Tbx5 flox/+ mice | Gift from B. Bruneau | N/A | As described in Al-Khindi et al. (2022)9
Tg(Pcdh9-cre)NP276Gsat/Mmucd | MMRRC | MMRRC Stock # 036084-UCD; RRID: MMRRC_036084-UCD |

References

  1. Stahl, J. S. Using eye movements to assess brain function in mice. Vision Res. 44 (28), 3401-3410 (2004).
  2. Kretschmer, F., Tariq, M., Chatila, W., Wu, B., Badea, T. C. Comparison of optomotor and optokinetic reflexes in mice. J Neurophysiol. 118, 300-316 (2017).
  3. Bronstein, A. M., Patel, M., Arshad, Q. A brief review of the clinical anatomy of the vestibular-ocular connections - How much do we know. Eye. 29 (2), 163-170 (2015).
  4. Simpson, J. I. The accessory optic system. Ann Rev Neurosci. 7, 13-41 (1984).
  5. Hamilton, N. R., Scasny, A. J., Kolodkin, A. L. Development of the vertebrate retinal direction-selective circuit. Dev Biol. 477, 273-283 (2021).
  6. Dobson, V., Teller, D. Y. Visual acuity in human infants: a review and comparison of behavioral and electrophysiological studies. Vision Res. 18 (11), 1469-1483 (1978).
  7. Cahill, H., Nathans, J. The optokinetic reflex as a tool for quantitative analyses of nervous system function in mice: Application to genetic and drug-induced variation. PLoS One. 3 (4), e2055 (2008).
  8. Cameron, D. J., et al. The optokinetic response as a quantitative measure of visual acuity in zebrafish. J Vis Exp. (80), e50832 (2013).
  9. Al-Khindi, T., et al. The transcription factor Tbx5 regulates direction-selective retinal ganglion cell development and image stabilization. Curr Biol. 32 (19), 4286-4298 (2022).
  10. Harris, S. C., Dunn, F. A. Asymmetric retinal direction tuning predicts optokinetic eye movements across stimulus conditions. eLife. 12, e81780 (2023).
  11. Sun, L. O., et al. Functional assembly of accessory optic system circuitry critical for compensatory eye movements. Neuron. 86 (4), 971-984 (2015).
  12. Yonehara, K., et al. Congenital Nystagmus gene FRMD7 is necessary for establishing a neuronal circuit asymmetry for direction selectivity. Neuron. 89 (1), 177-193 (2016).
  13. Kodama, T., Du Lac, S. Adaptive acceleration of visually evoked smooth eye movements in mice. J Neurosci. 36 (25), 6836-6849 (2016).
  14. Stahl, J. S., Van Alphen, A. M., De Zeeuw, C. I. A comparison of video and magnetic search coil recordings of mouse eye movements. J Neurosci Methods. 99 (1-2), 101-110 (2000).

Copyright © 2025 MyJoVE Corporation. All rights reserved