Humans are highly social animals, making auditory communication and perception vitally important. However, research on cortical involvement in auditory perception has lagged behind comparable work in visual neuroscience, leaving significant gaps in our understanding of the functional organization of human auditory cortex. The work proposed here will answer important questions regarding the representation of auditory objects (i.e., complex sounds of behavioral significance) in nonprimary auditory cortex and will examine the effects of expertise on the neural specificity and organization of these object representations. Normal and expert (musician and birder) participants will listen to speech, musical instruments, and bird calls while blood oxygen level dependent (BOLD) changes in the functional magnetic resonance imaging (fMRI) signal are recorded. We will employ the innovative technique of repetition adaptation (RA) to assess the specificity of neural tuning within auditory cortex. While all stimuli should engage similar cortical resources early in the auditory cortical pathways, we expect higher areas to contain narrowly tuned representations of objects (i.e., to exhibit RA) in category-specific subareas within anterior superior temporal cortex. Furthermore, we expect the specificity of neural representations in these higher auditory subareas to increase with the level of experience with a particular category. Thus, neural representations of expert objects (i.e., speech sounds for all participants, musical instruments for musicians, and bird calls for birders) should be more narrowly tuned (i.e., exhibit greater RA) than those of nonexpert objects. Visualizing the neural representations of multiple object categories within the same individuals, and exploring experience-dependent modulation of neural organization and specificity, will be crucial and novel contributions to our limited understanding of the functional organization of human auditory cortex.
Moreover, knowledge gained from this research will deepen our understanding of auditory agnosias, aphasia, developmental perceptual language disorders such as dyslexia, and social disorders such as autism. Despite recent progress in our understanding of brain function, how the human brain discriminates between speech, music, and other complex sounds remains poorly understood. We will use brain imaging (functional magnetic resonance imaging, fMRI) to monitor brain activity while normal listeners, musicians, and birders listen to speech, music, and bird calls. This research will not only help determine whether auditory perception of language differs from that of other sounds, but will also further our understanding of auditory deficits resulting from brain injury (aphasia, agnosia) and abnormal development (dyslexia, autism).