The brain combines information from multiple sensory modalities to build a consistent representation of the world. The principles by which multimodal stimuli are integrated in cortical hierarchies are well studied, but it is less clear whether and how unimodal inputs shape the processing of signals carried by a different modality. In rodents, for instance, direct connections from primary auditory cortex reach visual cortex, but studies disagree on the impact of these projections on visual cortical processing: both enhancement and suppression of visually evoked responses by auditory inputs have been reported, as well as sharpening of orientation tuning and improved coding of visual information. Little is known, however, about the functional impact of auditory signals on rodent visual perception. Here we trained a group of rats in a visual temporal frequency (TF) classification task in which the visual stimuli to categorize were paired with simultaneous but task-irrelevant auditory stimuli, so as to prevent high-level multisensory integration and instead probe the spontaneous, direct impact of auditory signals on the perception of visual stimuli. The rats' classification of visual TF was strongly and systematically altered by the presence of sounds, in a way that depended on sound intensity but not on its temporal modulation. To investigate the mechanisms underlying this phenomenon, we developed a Bayesian ideal observer model combined with a neural coding scheme in which neurons linearly encode visual TF but are inhibited by concomitant sounds to a degree that depends on sound intensity. This model captured with high precision the full spectrum of perceptual choices displayed by the rats, supporting the hypothesis that auditory inputs induce an effective compression of the visual perceptual space. This points to inhibition as a key mediator of auditory-visual interactions and provides clear, mechanistic hypotheses to be tested by future work on visual cortical codes.
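
The abstract's mechanistic claim (a linear population code for TF whose gain is suppressed by sound, read out by a Bayesian ideal observer) can be illustrated with a compact simulation. The Python sketch below is not the authors' model: the divisive form of the suppression, the parameter values (e.g. SUPPRESSION_K, the 6 Hz category boundary), and the Gaussian noise model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- hypothetical parameters; all names and values are illustrative ---
N_NEURONS = 50
GAIN = 1.0            # linear encoding slope (spikes per Hz of visual TF)
BASELINE = 2.0        # baseline firing rate
NOISE_SD = 1.0        # SD of Gaussian response noise
SUPPRESSION_K = 0.02  # strength of assumed sound-driven divisive inhibition

def population_response(tf, sound_db, rng):
    """Linear TF code, divisively suppressed by sound intensity."""
    g = 1.0 / (1.0 + SUPPRESSION_K * sound_db)  # gain drops with intensity
    mean = g * (BASELINE + GAIN * tf)
    return mean + NOISE_SD * rng.normal(size=N_NEURONS)

def ideal_observer_estimate(r, tf_grid):
    """MAP estimate of TF under a flat prior, decoding with the
    silent-trial tuning curves: suppressed responses are therefore
    read out as lower TFs (perceptual compression)."""
    mean = BASELINE + GAIN * tf_grid[:, None]            # (grid, 1)
    loglik = -0.5 * ((r[None, :] - mean) / NOISE_SD) ** 2
    return tf_grid[np.argmax(loglik.sum(axis=1))]

tf_grid = np.linspace(0.0, 12.0, 241)
boundary = 6.0  # assumed category boundary (Hz) between "low" and "high" TF

for sound_db in (0, 30, 60):
    choices = []
    for _ in range(500):
        tf = rng.uniform(2.0, 10.0)
        r = population_response(tf, sound_db, rng)
        choices.append(ideal_observer_estimate(r, tf_grid) > boundary)
    print(f"sound {sound_db:>2} dB: P('high') = {np.mean(choices):.2f}")
```

Because the simulated observer decodes with the silent-trial tuning curves, sound-suppressed responses are read out as lower TFs, so the proportion of "high" classifications falls as sound intensity rises: a toy instance of the intensity-dependent compression of the visual perceptual space described above.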