Integrating sensory information from a variety of sources to produce motor commands is fundamental to human behavior. Impairments in multisensory and sensorimotor integration affect numerous aspects of human health, ranging from social interaction and communication to movement in a complex environment. For example, directing gaze to the location of a sound is a demanding information-processing task requiring the conversion of auditory input signals into motor commands that move the eyes. This process is impaired in human neglect patients. Here, we propose a joint computational and experimental approach to illuminate this problem. Specifically, we will investigate how information about sound location is encoded in the spike trains of neurons in the primate inferior colliculus, auditory cortex, lateral intraparietal cortex, and superior colliculus while monkeys perform saccades to sounds. We will characterize the response patterns as a function of sound location (Aim 1) and eye position (Aim 2). Of particular interest will be the shape of spatial sensitivity and how it is modulated by eye position. In addition to considering conventional types of spatial sensitivity, we will use novel computational tools to explore what information can be extracted from these response patterns and to provide compact descriptions of the high-dimensional data (Aim 3). We will also develop new computational techniques to improve the experimental methods, including adaptive sampling and interactive visualization (Aim 4). These aims will enhance our understanding of neural processing from sensory input to motor output. The issues of multisensory and sensorimotor integration investigated here bear on a variety of neurological disorders, such as those arising from stroke and other types of brain lesions.
A better understanding of the transformation from sensory input to motor response will aid in identifying the pathophysiological substrates in neurological disorders with impaired sensorimotor integration.
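One family of techniques for producing compact descriptions of high-dimensional response patterns, of the kind alluded to in Aim 3, is linear dimensionality reduction. The sketch below is purely illustrative and is not drawn from the proposal: it applies principal component analysis (PCA, via the singular value decomposition) to simulated spike counts in which one latent dimension tracks sound location. All variable names and the simulated data are hypothetical stand-ins for real recordings.

```python
import numpy as np

# Illustrative sketch (not the proposal's method): summarizing simulated
# high-dimensional spike-count data with a few principal components.
rng = np.random.default_rng(0)

n_trials, n_neurons = 200, 50
sound_locations = rng.uniform(-90, 90, size=n_trials)  # azimuth, degrees

# Simulate spike counts whose dominant latent dimension tracks sound location.
tuning = rng.normal(size=n_neurons)  # hypothetical per-neuron tuning weights
spike_counts = (
    np.outer(sound_locations / 90.0, tuning)        # location-driven signal
    + 0.5 * rng.normal(size=(n_trials, n_neurons))  # trial-to-trial noise
)

# PCA via SVD of the mean-centered trials-by-neurons response matrix.
centered = spike_counts - spike_counts.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
k = 3
low_dim = centered @ Vt[:k].T  # each trial summarized by k components

# Fraction of response variance captured by the k-dimensional description.
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(low_dim.shape)  # (200, 3)
print(round(float(explained), 2))
```

Because the simulated signal is rank-one across neurons, the first principal component recovers a quantity correlated with sound location; with real recordings, the number of components needed and their relationship to stimulus variables would themselves be empirical questions.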