Most research on the smooth pursuit system has been conducted using a single spot target moving in isolation. This approach has been productive, and has resulted in elegant models that are consistent with the physiology of the motor limb of this system. However, in a natural scene many objects can move, each of which can have spatial structure and features that stimulate multiple motion detectors. Prevailing models do not describe how spatially distributed motion signals are processed to provide the driving signal for smooth pursuit. The long-term goal of this project is to determine how visual motion information is used to guide voluntary smooth eye movements in natural scenes, and to understand how eye movement systems cooperate when tracking complex moving targets with many articulated components. Our proposal addresses these issues using methodology and insight borrowed from the fields of visual psychophysics and oculomotor physiology to provide a complementary, collaborative approach to this problem. The specific aims are to: (1) determine the spatial organization of units that process visual motion for smooth pursuit; (2) determine whether common motion processing limits smooth pursuit and perception of two objects; and (3) determine the signals affecting pursuit and saccadic performance while following a moving object and inspecting its features. The results should be applicable to the characterization of parameters of normal vision, and to the diagnosis and treatment of disorders of vision, strabismus, and dynamic eye movement control.