Traditional models of perception rest on a foundational principle: sensory processing is hierarchical. Sensory filters decompose the physical world into primitive features that are recombined through a series of stages into a coherent supramodal representation. These models are typically conceptualized as processing time-invariant (i.e., constant) signals or representations, and are evaluated in experiments whose structure (baseline -> stimulus -> response) parallels the discrete and staged nature of the presumed underlying processing dynamics. Real environments, however, contain continuous, time-varying signals and demand continuous perceptual and behavioral solutions, and while it is often assumed that the operations of time-invariant models will trivially generalize to the continuous-time domain, there are reasons to suspect that they will not. For example, in a continuous, dynamic environment, fluctuations in input traces may reflect either noise contamination or genuine changes in the source signals. How the brain achieves stable perceptual solutions in such circumstances is of critical interest. The present application describes a series of short-term experiments that evaluate this issue in the context of multisensory integration. Merging information across the senses has been shown to improve perceptual and behavioral judgments and to speed responses to external events, particularly those whose unisensory representations are significantly contaminated or obscured by noise. To realize these benefits, the brain must coordinate activity across senses that have different operational parameters; in particular, different reliabilities that depend on the immediate environmental circumstances. To integrate optimally, the brain must weight the signals derived from each sense according to these reliabilities when considering them jointly.
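The reliability-weighted combination described above is commonly formalized as inverse-variance weighting of unisensory estimates (maximum-likelihood cue combination). A minimal sketch, with illustrative function and variable names not drawn from the proposal itself:

```python
def combine_cues(x_vis, var_vis, x_aud, var_aud):
    """Fuse visual and auditory estimates by inverse-variance weighting.

    Each cue's weight is its reliability (1/variance) normalized across
    cues, so the less noisy sense dominates the combined estimate.
    """
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_aud)
    w_aud = 1 - w_vis
    x_hat = w_vis * x_vis + w_aud * x_aud
    # The fused variance is never larger than either unisensory variance,
    # which is the statistical benefit of integration.
    var_hat = 1 / (1 / var_vis + 1 / var_aud)
    return x_hat, var_hat
```

With equal variances the fused estimate is the simple average; as one sense becomes noisier, the estimate shifts toward the more reliable sense.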
The framework proposed here seeks to understand this phenomenon by evaluating whether subjects use Kalman filter dynamics to achieve optimal performance on a continuous-time multisensory task. Subjects track dynamic, noise-corrupted patterns of visual and auditory stimuli presented alone or in concert. Their ability to adapt to changes in the signal and in the relative reliabilities of the senses is interpreted within the proposed model framework. Subject expectations are shaped by prior experience and by forewarning of changes in the signals and signal statistics, and the timing and impact of the adjustments subjects make in their responses in light of this information are evaluated. The experimental approach applied in the proposed studies will provide important insights into the principles by which sensory channels are combined and help to establish the boundary conditions under which real-time integration is achieved.
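The Kalman filter dynamics invoked above can be illustrated in their simplest scalar form. The sketch below is not the proposed model itself; it assumes a random-walk source signal and Gaussian observation noise, with parameter names chosen for illustration:

```python
def kalman_track(observations, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter tracking a random-walk signal.

    q  : process-noise variance (how quickly the source can change)
    r  : observation-noise variance (sensory reliability is 1/r)
    x0 : initial state estimate; p0 : initial estimate variance
    """
    x, p = x0, p0
    estimates = []
    for z in observations:
        # Predict: the random-walk model adds process noise each step.
        p = p + q
        # Update: the Kalman gain weights the new sample against the
        # running estimate according to their relative reliabilities.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

The gain k plays the same role as the cue weights in static integration: a fluctuation in the input moves the estimate a lot when the observation is reliable (small r) or the source is expected to change (large q), and very little otherwise, which is exactly the noise-versus-signal-change ambiguity raised in the first paragraph.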