A great deal of evidence supports the notion that visual perception and visually guided behaviors are shaped by our predictions about the non-random behavior and interdependencies of objects in the visual world. Previous studies have shown that posterior parietal neurons in area LIP of the monkey can convey predictive signals about the upcoming direction of simple (one-dimensional) motion when the animal expects it. However, critical issues remain unanswered concerning the mechanisms by which predictive signals develop during learning, the nature of the predicted information (i.e., spatial vs. temporal), and how that information is used during visually guided behavior. We propose to address these issues with a novel behavioral paradigm in which monkeys learn to predict the upcoming movements of a target that can follow complex two-dimensional trajectories. By recording from neurons in LIP while monkeys learn to predict complex motion paths, we expect to gain substantial insight into the neuronal mechanisms that underlie visual prediction and learning and, further, a more detailed understanding of the nature of the spatial information encoded in LIP.