Abstract

Vision is an active sense that we use to explore the world around us. However, studies of visual coding are generally performed in head-fixed animals, which constrains the range of visual functions and behaviors that are amenable to study, thereby excluding many ethologically relevant natural behaviors as well as the interaction of visual processing and movement. A fundamental challenge in performing visual physiology in freely moving animals is that the experimenter no longer controls the visual input impinging on the retina, which depends on the animal's position relative to the stimulus as well as head and eye position. Here we will address this challenge by developing a system that directly determines the visual input the animal receives, using two head-mounted miniature cameras: one to image the visual scene from the animal's perspective, and one to measure pupil position in order to correct this visual scene for eye movements. In the first aim, we will implement the hardware and data analysis needed to acquire the visual input along with neural activity from an implanted silicon probe. In the second aim, we will apply this system to measure tuning curves and receptive fields in the same neurons across head-fixed and freely moving conditions. In addition to providing a proof of principle that the system works, the data from this aim will test the hypothesized role of a non-canonical cortical cell type, suppressed-by-contrast neurons, which have been proposed to signal the rapid change in visual input during self-motion. This project will remove a fundamental obstacle in visual physiology and will provide the necessary foundation for future proposals to study natural behavior and contextual signals in visual processing.