This proposal examines the role of the ventrolateral prefrontal cortex (vPFC) in the processing of multimodal communication signals. In many cases, auditory and visual stimuli provide complementary information, and this is especially true of communication signals. Rhesus macaques are sensitive to the correlations between the production of vocalizations and the facial expressions that accompany them. Fundamental to understanding multimodal stimuli is the combination of both modalities into a unified representation. In Aim #1, we test the role of the vPFC in processing multimodal communication signals. We hypothesize that vPFC neurons respond preferentially to auditory and visual stimuli that convey complementary information. In Aim #2, we test the role of the vPFC in the categorization of multimodal communication signals. Rhesus monkeys discriminate between certain classes of species-specific vocalizations on the basis of relatively abstract categories rather than raw perceptual features, and this categorization is reflected in the firing rates of vPFC neurons. We hypothesize that vPFC neurons encode multimodal stimuli into categories based on the information they convey rather than on their morphological properties.