Distinct Encoding of Spatial and Nonspatial Visual Information in Parietal Cortex

Abstract
It is well established that the primate parietal cortex plays an important role in visuospatial processing. Parietal cortex damage in both humans and monkeys can lead to behavioral deficits in spatial processing, and many parietal neurons, such as in the macaque lateral intraparietal area (LIP), are strongly influenced by visual–spatial factors. Several recent studies have shown that LIP neurons can also convey robust signals related to nonspatial factors, such as color, shape, and the behavioral context or rule that is relevant for solving the task at hand. But what is the relationship between the encoding of spatial factors and more abstract, nonspatial influences in LIP? To examine this, we trained monkeys to group visual motion patterns into two arbitrary categories, and recorded the activity of LIP neurons while monkeys performed a categorization task in which stimuli were presented either inside each neuron's receptive field (RF) or at a location in the opposite visual field. While the activity of nearly all LIP neurons showed strong spatial dependence (i.e., greater responses when stimuli were presented within neurons' RFs), we also found that many LIP neurons showed reliable encoding of the category membership of stimuli even when the stimuli were presented away from neurons' RFs. This suggests that both spatial and nonspatial information can be encoded by individual LIP neurons, and that parietal cortex may be a nexus for the integration of visuospatial signals and more abstract task-dependent information during complex visually based behaviors.