Data Acquisition

Tungsten microelectrodes, often arranged in arrays of up to eight, are advanced through the dura mater toward recording sites of interest with a hydraulic microdrive. Single units and local field potentials (LFPs) are amplified, filtered, and recorded using Plexon’s Multichannel Acquisition Processor (MAP) system. We classify extracellular spikes as belonging to one neuron or another through online and offline sorting based on spike shape (Plexon). While neural signals are being collected, eye position is monitored simultaneously with either a scleral search coil or an infrared eye tracker.

In some cases, electrodes embedded in the surface of the skull are used to sample EEG signals, which are also filtered and sampled by the MAP system at the same time as the spike and LFP signals. Because human brain activity is typically recorded only from scalp EEG electrodes, these surface electrodes enable us to begin bridging the literature between human and nonhuman primate neurophysiology research, and to begin relating EEG signals to the underlying neural activity.

The times at which experimental events, such as the appearance of a visual target, occur are recorded by TEMPO software. This information allows us to analyze the timing of our neural and behavioral data on a precise, common experimental timescale.

Data Analysis

While data are collected on a Windows or DOS platform, researchers are not limited to those platforms for analysis. In the Schall Lab, researchers are encouraged to work in the computing environment where they are most comfortable; lab members use Windows, Linux, and Mac, depending on their computing needs and preferences.


MATLAB is a high-level language and interactive environment that enables researchers to perform computationally intensive tasks faster than with traditional programming languages such as C, C++, and Fortran. MATLAB is exceptional in its ability to work with vectors and matrices, and its platform independence enables researchers to choose their own computing platform (e.g., Windows, Linux, or Mac) for post-recording data analysis. Sample analysis and mathematical modeling code is described below.

Interactive Race Model

The countermanding, or stop signal, task has been used to study normal cognitive control and clinical dysfunction (reviewed by Logan, 1994, Inhibitory processes in attention, memory, and language, pp. 189-239, San Diego: Academic Press). Its utility derives from an independent race model that accounts for performance and provides an estimate of the time it takes to stop a movement (Logan & Cowan, 1984, Psych Rev 91: 295-327). This model posits a race between GO and STOP processes with stochastically independent finish times. However, neurophysiological studies demonstrate that the neural correlates of the GO and STOP processes produce movements through a network of interacting neurons (Hanes, Patterson, & Schall, 1998, J Neurophysiol 79: 817-834; Paré & Hanes, 2003, J Neurosci 23: 6480-6489; see also Schall, 2004, Ann Rev Psych 55: 23-50). The juxtaposition of the computational model with the neural data exposes a paradox: how can a network of interacting units produce behavior that appears to be the outcome of an independent race? We reported how a simple, competitive network can resolve this paradox and provide an account of what is measured by stop signal reaction time (Boucher, Palmeri, Logan, & Schall, 2007, Psych Rev 114: 376-397). The code that implements this interactive race model can be found here.
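The lab's published simulations are in MATLAB; the following Python sketch conveys the core idea of an interactive race between mutually inhibitory GO and STOP accumulators. All function names, parameter values, and the trial counts below are illustrative assumptions, not the fitted values of Boucher et al. (2007).

```python
import numpy as np

def interactive_race_trial(ssd_ms, rng, threshold=1.0, mu_go=0.003,
                           mu_stop=0.008, sigma=0.006, beta_stop=0.01,
                           beta_go=0.002, go_onset=50, stop_delay=50,
                           max_t=800):
    """Simulate one stop-signal trial as a race between two mutually
    inhibitory stochastic accumulators (1-ms time steps).  The STOP unit
    begins accumulating only after the stop-signal delay (ssd_ms) plus an
    afferent delay.  Returns the reaction time in ms if GO reaches
    threshold first (a failed stop), or None if the movement is inhibited.
    Parameter values are illustrative, not fitted."""
    go = stop = 0.0
    for t in range(max_t):
        d_go = d_stop = 0.0
        if t >= go_onset:
            d_go = mu_go - beta_stop * stop + sigma * rng.standard_normal()
        if t >= ssd_ms + stop_delay:
            d_stop = mu_stop - beta_go * go + sigma * rng.standard_normal()
        go = max(0.0, go + d_go)      # unit activations cannot be negative
        stop = max(0.0, stop + d_stop)
        if go >= threshold:
            return t                  # saccade initiated despite the stop signal
        if stop >= threshold:
            return None               # movement successfully cancelled
    return None

rng = np.random.default_rng(0)
# The probability of responding despite the stop signal grows with the
# stop-signal delay (SSD), tracing out the classic inhibition function.
p_respond = {ssd: np.mean([interactive_race_trial(ssd, rng) is not None
                           for _ in range(200)])
             for ssd in (50, 300)}
```

Although the two units interact, the STOP unit's late, potent inhibition lets the model reproduce behavior that looks like an independent race, which is the resolution of the paradox described above.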


Joint Peri-Stimulus Time Histogram

To investigate spike timing relationships between simultaneously recorded neurons, we wrote and optimized functions to perform joint peri-stimulus time histogram (JPSTH) analysis. This technique was originally developed by Aertsen et al. (1989, J Neurophysiol 61: 900-917) and refined by Brody (1999, Neural Computation 11: 1527-1535; 1999, Neural Computation 11: 1537-1551). Our code can be found here.
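Our optimized functions are written in MATLAB; the following is a minimal Python sketch of the basic computation, using the PSTH outer-product correction and bin-wise normalization of Aertsen et al. (1989). Function and variable names are our own.

```python
import numpy as np

def jpsth(x, y):
    """Normalized joint peri-stimulus time histogram.

    x, y : (n_trials, n_bins) arrays of binned spike counts from two
    simultaneously recorded neurons.  Returns the (n_bins, n_bins) matrix
    of trial-wise correlations after subtracting the PSTH outer-product
    predictor (Aertsen et al., 1989).  Minimal sketch."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    raw = np.einsum('tu,tv->uv', x, y) / x.shape[0]  # <x_u * y_v> over trials
    predictor = np.outer(x.mean(0), y.mean(0))       # PSTH outer product
    denom = np.outer(x.std(0), y.std(0))             # bin-wise std product
    return np.where(denom > 0, (raw - predictor) / denom, 0.0)
```

The main diagonal of the result is the bin-by-bin coincidence histogram in correlation units, and sums along the para-diagonals give a trial-corrected cross-correlogram.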

Poisson spike train analysis

To determine when a neuron modulates its firing rate within a single trial, we have implemented a Poisson spike train analysis. To derive a continuous function from a discrete spike train, we calculate a spike density function, replacing the standard Gaussian kernel with a kernel shaped like a postsynaptic potential (PSP, also referred to as an alpha function). A problem with the Gaussian formulation is that the choice of standard deviation is somewhat arbitrary. We instead use a kernel with rapid growth and slower decay, just like a postsynaptic potential; in other words, it acts like a leaky integrator. The time constants of growth and decay can be set to arbitrary values; we have used values measured for excitatory postsynaptic potentials (τ_growth = 1 msec, τ_decay = 20 msec). Here is the code.
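The lab's implementation is in MATLAB; the following is a Python sketch of the PSP-shaped kernel and the convolution at 1-ms resolution (the function name and the array layout are illustrative assumptions):

```python
import numpy as np

def spike_density(spike_times_ms, t_max_ms, tau_g=1.0, tau_d=20.0):
    """Convolve a spike train with a postsynaptic-potential-shaped kernel.

    The kernel R(t) = (1 - exp(-t/tau_g)) * exp(-t/tau_d) for t >= 0 rises
    with time constant tau_g (1 ms) and decays with tau_d (20 ms), like an
    excitatory postsynaptic potential.  It is normalized to unit area so the
    returned spike density function is in spikes/s, sampled every 1 ms."""
    t = np.arange(t_max_ms)
    kernel = (1.0 - np.exp(-t / tau_g)) * np.exp(-t / tau_d)
    kernel /= kernel.sum() / 1000.0          # unit area -> output in spikes/s
    train = np.zeros(t_max_ms)
    idx = np.asarray(spike_times_ms, int)
    np.add.at(train, idx[(idx >= 0) & (idx < t_max_ms)], 1.0)
    # Causal convolution: each spike's influence follows, never precedes, it.
    return np.convolve(train, kernel)[:t_max_ms]
```

Because the kernel is causal, the density function peaks a few milliseconds after each spike (at τ_g·ln(1 + τ_d/τ_g) ≈ 3 ms for these time constants), rather than being centered on it as with a Gaussian.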

We implemented this algorithm, originally developed by Legendy and Salcman (1985, J Neurophysiol 53: 926), for use with data from awake behaving monkeys. Here is an example of its performance. Complete descriptions of its use can be found in Hanes, Thompson, and Schall (1995, Exp Brain Res 103: 85-96) and Thompson, Hanes, Bichot, and Schall (1996, J Neurophysiol 76: 4040-4055). Here is a readme file and the code.
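The algorithm rates each candidate burst by its Poisson "surprise". A minimal Python sketch of that statistic follows; our released MATLAB code adds the windowing and burst-boundary search described in the papers above.

```python
import math

def poisson_surprise(n_spikes, duration_s, mean_rate_hz):
    """Poisson 'surprise' of Legendy and Salcman (1985): the negative log of
    the probability that a Poisson process at the neuron's mean rate would
    produce n_spikes or more within the given interval.  Large values flag
    epochs too dense to be chance fluctuations.  Minimal sketch."""
    lam = mean_rate_hz * duration_s                 # expected count in window
    # P(N >= n) = 1 - P(N <= n-1), accumulated term by term
    p_below = sum(math.exp(-lam) * lam**k / math.factorial(k)
                  for k in range(n_spikes))
    p_at_least = max(1.0 - p_below, 1e-300)         # guard against underflow
    return -math.log(p_at_least)

# Example: 10 spikes in 50 ms from a neuron averaging 20 spikes/s is a
# highly surprising (i.e., significant) burst; 1 spike in 50 ms is not.
```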

Burst Onset Detector

Presaccadic movement cells and preparatory set cells in the supplementary eye field fire a burst of action potentials prior to saccade initiation. To test whether this bursting activity is related to saccade preparation, it is necessary to detect the onset of each burst over many trials. Hanes, Thompson, and Schall (1995, Exp Brain Res 103: 85-96) developed this code. Type help p_burst at the MATLAB command prompt for instructions on how to run it.


In the cooperative environment of the Schall Lab, members are encouraged to share, reuse, and improve code. Subversion is the tool we use to maintain control over revisions and to distribute them to other users: it manages files and directories, and the changes made to them, over time. This allows older versions of code to be recovered and the history of changes to be examined. The ability for several people to modify and manage the same code base from their respective locations fosters collaboration.


The Advanced Computing Center for Research and Education is a Vanderbilt organization that provides us with both incremental tape-backup services and a high-performance Linux computing cluster for executing code remotely and rapidly. All data are incrementally backed up on both central and departmental servers daily.
