The long-term goal of the proposed research is to increase our understanding of the perception of spoken language. Previous studies have demonstrated that speech perception is not solely an auditory process. Visual information from a talker's mouth and face can also play an important role in spoken language processing, for both hearing-impaired and normal-hearing adults. Research in this laboratory has shown that visual information can influence speech processing in a manner similar to the corresponding auditory information. The proposed research will extend these findings by examining the processes involved in the integration of auditory and visual information during phonetic perception. Four specific issues will be addressed: (1) whether visual information is equivalent to the corresponding auditory information in its effect on phonetic processing; (2) whether the integration of phonetic information from auditory and visual sources is similar to the integration of two sources of auditory information; (3) whether information in the visual modality is processed independently of, or jointly with, information from the auditory modality; and (4) whether the original auditory information remains accessible for further processing after it has been integrated with the visual information. The results of these experiments should have important implications for models of speech perception and development. In addition, they should be relevant to clinical concerns in the aural rehabilitation of the hearing-impaired.