The basic "building blocks" of visual perception are becoming reasonably well understood, and we can give a fairly good account of how simple discriminations are performed. What we understand far less well is how the visual system solves more realistic, everyday challenges. Visually guided navigation is a particularly good "model system" for studying real-world visual processing in the laboratory. The perception of self-motion from the pattern of motion on the retina has been studied extensively, yet we still know very little about where in the brain the critical processing steps occur, or how the complex pattern of retinal motion is converted into effective movement. The present proposal seeks to answer these open questions.

First, we will seek direct evidence for the involvement of multiple cortical areas in the perception of self-motion by recording simultaneously from several areas while our experimental animals perform a discrimination of self-motion direction. Second, we will ask whether a parietal cortical area (the ventral intraparietal area, or VIP) is both necessary for self-motion perception and actually used for it. We will do this by perturbing the pattern of activity in VIP during the self-motion task, using both reversible inactivation and electrical stimulation. These complementary methods should greatly extend our understanding of how the parietal cortex participates in self-motion perception.

To deepen our knowledge of self-motion perception, however, we must move the inquiry into a more active context. Human-factors studies have shown that guidance of self-motion ("steering") is a highly active process, with the direction of gaze being a critical component. Yet next to nothing is known about the central nervous system mechanisms underlying this active task.
We therefore propose to establish, characterize, and exploit an animal model of active locomotion to study the involvement of brain structures in this task. We will train our subjects to direct their "virtual" trajectories with a joystick, and characterize how their normal behavior is influenced by cues including target direction, gaze direction, gaze velocity, and visual motion. We will then record activity in multiple cortical areas while the animals are engaged in this task, exploring the signals in visual and parietal cortex to better understand the brain mechanisms of visually guided navigation. In the long term, this information may prove useful in helping the disabled to navigate and in the development of visual prostheses for the blind.