Associate Professor, Department of Neuroscience
Ph.D. Brown University
Neural basis of vision and visually guided behavior
Our laboratory studies the neural mechanisms of visual motion and form processing in the cerebral cortex, their interactions, and how representations of visual information guide eye movements. We take an integrative approach combining neurophysiology, psychophysics, and computational modeling. Visual information is represented and processed by large numbers of neurons distributed across dozens of brain areas. Each of these neurons is sensitive to certain features of the visual image and has a spatially constrained “view” of the world. Moreover, because many visual neurons are broadly tuned to stimulus features, any given visual feature is represented by the discharge of a large population of neurons.

How are spatially localized representations synthesized to form perception? How are attributes of visual stimuli decoded from distributed population activity to make perceptual decisions and guide action? The research in our laboratory addresses these questions. One line of research focuses on the integration and segmentation of multiple visual features in the visual system. Another investigates the impact of eye movements on visual processing and perceptual stability.