DESCRIPTION (adapted from applicant's abstract): The purpose of this grant is to develop a neural prosthesis to help paralyzed patients. The prosthesis will be developed in non-human primates as a precursor to applying a similar approach in humans. The rationale of the prosthesis is to record from an area of the cerebral cortex that plans reach movements. If these plans can be read out in real time, then patients who are paralyzed by spinal cord transection, ALS, or peripheral neuropathies could still think about making movements, and these thoughts could be used to operate external devices. In the experiments, activity will be recorded with arrays of electrodes from the parietal reach region (PRR), an area responsible for the initial planning of reach movements. Decoding algorithms will be developed that allow these plans to be read out in real time. The output device will be a robot limb whose controller is designed to be instructed by high-level signals and to compute many of the lower-level aspects of the movement trajectory that are normally computed at levels of the brain closer to the motor output. This hybrid control system represents a new area of robotics research. Additionally, local field potentials will be used to convey similar information to the robotic controller, which should provide a breakthrough for long-term recordings. The specific aims will proceed from the simplest experiment, demonstrating that a monkey can control an animated limb on a computer screen, to more complex experiments in which the experimenters will determine how PRR codes information about reaches in more natural situations. These later experiments will include neurophysiological studies of the coding of sequential movements, curved trajectories, combined hand-eye movements, and visuomotor plasticity.
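The hybrid control idea above (high-level commands in, low-level trajectory computation in the controller) can be illustrated with a minimal sketch. This is not the proposal's controller; it is a hypothetical example assuming the supervisory layer receives only a reach goal and must itself generate a smooth path, here using a standard minimum-jerk profile.

```python
import numpy as np

def minimum_jerk(start, goal, duration, n_steps):
    """Expand a high-level command ("reach to `goal`") into a full
    point-to-point trajectory, the low-level computation that the
    abstract notes is normally done closer to the motor output.
    Uses the classic minimum-jerk position profile."""
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    t = np.linspace(0.0, duration, n_steps) / duration  # normalized time in [0, 1]
    s = 10 * t**3 - 15 * t**4 + 6 * t**5                # smooth 0 -> 1 blend
    return start + s[:, None] * (goal - start)

# The controller needs only the decoded endpoint, not a full trajectory:
path = minimum_jerk(start=(0.0, 0.0), goal=(10.0, 5.0), duration=1.0, n_steps=50)
print(path[0], path[-1])  # starts at the start point, ends exactly at the goal
```

The design point this illustrates is the division of labor: the neural decode supplies only the movement goal, and the robot controller fills in the kinematic detail.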
The concurrent engineering studies will develop algorithms that can reconstruct the intended movements from the neural recordings, and supervisory control architectures that can move the robotic limb appropriately, all in real time.
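One simple instance of the kind of decoding algorithm described, offered only as a hedged sketch and not as the authors' method, is a linear least-squares map from recorded firing rates to the intended reach endpoint. The unit count, trial count, and tuning model below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 40 recorded PRR units over 500 planning epochs.
# Each trial has an intended 2-D reach target; each unit's firing rate
# is modeled as a noisy linear function of that target.
n_units, n_trials = 40, 500
targets = rng.uniform(-10, 10, size=(n_trials, 2))   # intended endpoints (cm)
tuning = rng.normal(size=(2, n_units))               # per-unit tuning weights
baseline = rng.uniform(5, 20, size=n_units)          # baseline rates (Hz)
rates = baseline + targets @ tuning + rng.normal(0, 1.0, size=(n_trials, n_units))

# Fit a linear decoder (ordinary least squares with an intercept column)
# that reconstructs the intended target from the population firing rates.
X = np.column_stack([np.ones(n_trials), rates])
W, *_ = np.linalg.lstsq(X, targets, rcond=None)

# Reconstruct the intended movements and measure decoding error.
pred = X @ W
rmse = float(np.sqrt(np.mean((pred - targets) ** 2)))
print(f"decoding RMSE: {rmse:.3f} cm")
```

In a real-time setting the fitted weights `W` would be applied to each new bin of firing rates as it arrives, so the per-trial decode is a single matrix-vector product.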