We focus on the visual and somatosensory pathways at the junction between the sensory periphery and sensory cortex. Our experimental approaches include multi-site, multi-electrode recording, optical imaging, behavior, and patterned stimulation. Our computational approaches include linear and nonlinear model estimation, information theory, observer analysis, and signal detection and discrimination. Our long-term goal is to provide surrogate control of circuits involved in sensory signaling, for pathways injured by trauma or disease.

Reading the Neural Code

Decoding, Computational Modeling, Perception and Behavior

One clear litmus test of whether we truly understand the neural code is whether we can tap into the activity of neurons and make clear predictions about what is going on in the outside world, or what is about to go on through the actions of the organism. We refer to this broadly as our attempt to read the neural code. Upon observing a pattern of activity in the brain, we seek a dictionary of sorts, so that we may, for example, interpret a pattern in the visual pathway as representing a tree or a dog, or a pattern of activation in the motor pathway as representing a motor output such as an eye or hand movement.
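The "dictionary" idea above can be illustrated with a toy decoding sketch. This is a minimal, invented example (not a method from our lab): it simulates noisy spike counts from a hypothetical population responding to two stimuli, then reads out the stimulus with a nearest-template decoder. All names, rates, and sizes are assumptions made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 50 neurons whose mean firing rates differ
# between two stimuli ("tree" vs. "dog"); trial-to-trial variability
# is modeled as Poisson spike-count noise.
n_neurons = 50
rates = {"tree": rng.uniform(2, 10, n_neurons),
         "dog": rng.uniform(2, 10, n_neurons)}

def simulate(stimulus, n_trials):
    # Each row is one trial's spike-count vector across the population.
    return rng.poisson(rates[stimulus], size=(n_trials, n_neurons))

# "Dictionary": the mean population response (template) for each stimulus,
# estimated from simulated training trials.
templates = {s: simulate(s, 200).mean(axis=0) for s in rates}

def decode(response):
    # Read the code: report the stimulus whose template is closest
    # (Euclidean distance) to the observed activity pattern.
    return min(templates, key=lambda s: np.linalg.norm(response - templates[s]))

# Evaluate on held-out simulated trials.
test_trials = [(s, trial) for s in rates for trial in simulate(s, 100)]
accuracy = np.mean([decode(trial) == s for s, trial in test_trials])
print(f"decoding accuracy: {accuracy:.2f}")
```

Real decoding analyses typically use richer models (e.g., regularized linear or nonlinear classifiers fit to recorded data), but the logic is the same: map an observed activity pattern back to the state of the outside world.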

Writing the Neural Code

Artificial Stimulation, Closed-loop Control

Beyond reading the neural code, an even stronger litmus test of whether we understand the principles of neural coding is whether we can imprint the circuit with any code we desire and induce measurable effects on neural activity, perception, or behavior: to write the neural code. Can we artificially introduce patterns of activity in the brain that shape the way information is propagated along its various pathways? Can we be made to see, hear, or feel something that is not experienced naturally through our peripheral sensory organs?
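The closed-loop flavor of "writing" can be sketched numerically. In this invented toy example (again, not a specific method from our lab), the circuit is a stand-in linear system with unknown neuron-specific gains, and a simple proportional controller repeatedly reads the evoked activity and adjusts the stimulation vector to drive the population toward a desired target pattern. All quantities here are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical circuit: 10 neurons whose response to stimulation is
# gain * stim plus noise; the gains are unknown to the controller.
n_neurons = 10
target = rng.uniform(5, 20, n_neurons)   # desired firing-rate pattern (Hz)
gain = rng.uniform(0.5, 1.5, n_neurons)  # hidden neuron-specific gains

def observe(stim):
    # Stand-in for recording the evoked response to a stimulation pattern.
    return gain * stim + rng.normal(0, 0.5, n_neurons)

# Closed loop: read the evoked activity, compare to the target pattern,
# and nudge the stimulation in proportion to the error.
stim = np.zeros(n_neurons)
for _ in range(50):
    error = target - observe(stim)  # read
    stim = stim + 0.3 * error       # write (proportional update)

# How close did the imposed pattern get to the target (noise-free check)?
final_error = np.abs(target - gain * stim).max()
print(f"worst-case residual error: {final_error:.2f} Hz")
```

The point of the sketch is the loop structure, not the controller: real closed-loop stimulation must contend with nonlinear, state-dependent circuit dynamics, but the read-compare-write cycle is the core of writing a desired code into the network.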