Our experience of the world depends on the integration of cues from multiple senses to form unified percepts. How the brain merges information across sensory modalities has long been a matter of debate. To measure how rats bring together information across sensory modalities, we devised an orientation categorization task that combines vision and touch. Rats encounter an object, composed of alternating black and white raised bars, that looks and feels like a grating and can be explored by vision (V), touch (T), or both (VT). On each trial, the grating is rotated to assume one orientation drawn from a 180° range. Rats learn to lick one spout for orientations of 0° ± 45° (“horizontal”) and the opposite spout for orientations of 90° ± 45° (“vertical”). Though training was in the VT condition, rats could recognize the object and apply the rules of the task on first exposure to the V and T conditions, suggesting that the multimodal percept corresponds to that of the single modalities. Quantifying their performance, we found that rats have good orientation acuity using their whiskers and snout (T condition); however, under our default conditions, performance was typically superior by vision (V condition). Illumination could be adjusted to render V and T performance equivalent. Whether or not V and T performance was equated, performance was always highest in the VT condition, indicating multisensory enhancement. Is the enhancement optimal with respect to the best linear combination? To answer this, we computed the performance expected under optimal integration in the framework of Bayesian decision theory and found that most rats combine visual and tactile information better than predicted by the standard ideal-observer model. To confirm these results, we interpreted the data in two additional frameworks: summation of the mutual information carried by each sensory channel, and combination of the probabilities of independent events. All three analyses agree that rats combine vision and touch better than could be accounted for by a linear interaction.
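For concreteness, the ideal-observer benchmark referenced above is commonly stated as follows: if performance in each unimodal condition is summarized by a Gaussian noise parameter (σ_V for vision, σ_T for touch), optimal maximum-likelihood combination of independent cues predicts 1/σ_VT² = 1/σ_V² + 1/σ_T², with each cue weighted by its reliability. The sketch below is a minimal illustration of that prediction only; the function names and example σ values are hypothetical, and it does not reproduce the paper's actual fitting procedure.

```python
import numpy as np

def predicted_sigma_vt(sigma_v, sigma_t):
    """Ideal-observer prediction for combining two independent Gaussian cues:
    1/sigma_VT^2 = 1/sigma_V^2 + 1/sigma_T^2."""
    return np.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))

def cue_weights(sigma_v, sigma_t):
    """Optimal weight on each cue is proportional to its reliability (1/sigma^2)."""
    w_v = sigma_t**2 / (sigma_v**2 + sigma_t**2)
    return w_v, 1.0 - w_v

# Hypothetical unimodal acuities, in degrees of orientation:
sigma_v, sigma_t = 15.0, 20.0
print(predicted_sigma_vt(sigma_v, sigma_t))  # 12.0: predicted bimodal acuity beats either cue alone
print(cue_weights(sigma_v, sigma_t))         # (0.64, 0.36): the more reliable cue (vision) gets more weight
```

Bimodal performance exceeding this benchmark, as reported here for most rats, is what the final sentence of the paragraph above describes as combination better than any linear interaction can account for.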
Electrophysiological recordings in the posterior parietal cortex (PPC) of behaving rats revealed that neuronal activity is modulated by the rats' choices as well as by categorical or graded, modality-shared representations of stimulus orientation. Because the population of PPC neurons expresses activity ranging from strongly stimulus-related (e.g. graded in relation to stimulus orientation) to strongly choice-related (e.g. modulated by stimulus category but not by orientation within a category), we suggest that this region is involved in the percept-to-choice transformation.