Keywords: Biophysics, General; Biology, Neuroscience
Full text PDF: http://nrs.harvard.edu/urn-3:HUL.InstRepos:14226061
The retina sends many parallel channels of visual information to the brain through the axons of >20 retinal ganglion cell (RGC) populations. The purpose of these distinct circuits for vision remains an open question. Recent results suggest that each cell type responds selectively to a specific feature of the visual scene. These conclusions are derived primarily from experiments with artificial visual stimuli, and it is unknown whether the insights gathered under such conditions extend to the natural environment in which the retina evolved. One can address this question by building a mathematical model of RGC responses to artificial stimuli and then testing how well that same model performs with natural visual input. For several RGC types this exercise has failed dramatically, indicating an imperfect understanding of their neural code. Here we focus on the mouse alpha RGCs, which possess large cell bodies, stout axons, and wide receptive fields. Three subtypes had previously been defined based on their responses to light steps: On-Sustained, Off-Sustained, and Off-Transient. We targeted these RGCs for recording using a transgenic mouse line in which GFP is expressed in all alpha subtypes. Quantitative analysis of the recorded light responses revealed four distinct physiological cell types: an On-Transient alpha RGC in addition to the three types previously identified. Using both artificial stimuli and natural movies, we measured the visual responses of the mouse alpha cells and constructed a simple cascade-style model to link the stimulus to the firing rate. Based on electrophysiological recordings and modeling, we found that the visual messages the four alpha subtypes send to the brain are similar: each is a minimally processed version of the visual scene. Spatial averaging had little influence on the alpha RGCs' responses to the natural movies.
Additionally, a simple linear-nonlinear model accounted very well for the visual responses of all four alpha RGC subtypes, correctly predicting at least 70% of the variance in firing. The same model worked for both artificial stimuli (e.g., random flicker) and natural stimuli (mouse-cam and simulated-mouse movies). This successful account of alpha cell function will be valuable as a retina model for understanding cortical vision in the behaving mouse.
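The linear-nonlinear (LN) cascade mentioned above can be sketched in a few lines: a linear stage convolves the stimulus with a temporal filter, and a static nonlinearity maps the filter output to a firing rate. The filter shape, nonlinearity, and all parameter values below are purely illustrative assumptions for exposition, not the fitted components from the study.

```python
import numpy as np

def ln_model_response(stimulus, temporal_filter, nonlinearity):
    """Predict a firing rate with a linear-nonlinear (LN) cascade.

    1. Linear stage: convolve the stimulus with a causal temporal filter.
    2. Nonlinear stage: pass the result through a static nonlinearity.
    """
    # 'full' convolution trimmed to the stimulus length keeps the filter causal
    generator = np.convolve(stimulus, temporal_filter, mode="full")[: len(stimulus)]
    return nonlinearity(generator)

# Illustrative components (assumed, not fitted to any data):
# a biphasic temporal filter and a half-wave rectifying nonlinearity.
t = np.arange(0.0, 0.3, 0.01)                        # 10 ms bins, 300 ms filter
filt = np.exp(-t / 0.05) - 0.5 * np.exp(-t / 0.1)    # biphasic impulse response
rectify = lambda g: 50.0 * np.maximum(g, 0.0)        # spikes/s per unit drive

rng = np.random.default_rng(0)
stim = rng.standard_normal(1000)                     # random-flicker stimulus
rate = ln_model_response(stim, filt, rectify)        # predicted firing rate
```

In practice the filter would be estimated (e.g., by reverse correlation with the flicker stimulus) and the nonlinearity fit to the measured rates; the sketch only shows the forward prediction step.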