ABSTRACT More than one-third of adults in the United States seek medical attention for vestibular disorders and hearing loss, disorders that can triple one's fall risk and profoundly affect one's participation in activities of daily living. Hearing loss has been shown to reduce balance performance and could be one modifiable risk factor for falls. Patients with vestibular hypofunction tend to avoid busy, hectic, visually complex, and loud environments because these environments provoke dizziness and imbalance. While the impact of vision on balance is well established, less is known about the importance of sounds for balance. In search of a possible mechanism to explain a relationship between what we hear and balance control, some studies have suggested that sounds may serve as an auditory anchor, providing spatial cues for balance much as vision does. However, the majority of these studies tested healthy adults' responses to sounds with vision occluded. It is also possible that the relationship between hearing loss and balance problems is mediated by an undetected vestibular deficit. By understanding the role of auditory input in balance control, we will be better equipped to help the large portion of the population who seek medical attention for vestibular disorders and hearing loss. Therefore, there is a critical need for a systematic investigation of balance performance in response to simultaneous visual and auditory perturbations, similar to real-life situations. To answer this need, our team used recent advances in virtual reality technology to develop a head-mounted display (HMD) protocol of immersive environments combining specific manipulations of visuals and sounds, including generated sounds (i.e., white noise) and real-world recorded sounds (e.g., a train approaching a station). This research will answer the following questions: (1) Are sounds used for balance and, if so, via what mechanism?
(2) Do individuals with single-sided hearing loss have a balance problem even in the absence of any vestibular issues? (3) Are those with vestibular loss destabilized by sounds? To address these questions, the following specific aims will be investigated in individuals with unilateral peripheral vestibular hypofunction (n=45), individuals with single-sided deafness (n=45), and age-matched controls (n=45): Aim 1: Establish the role of generated and natural sounds in postural control in different visual environments; Aim 2: Determine the extent to which static white noise can improve balance within a dynamic visual environment. We expect to clarify the role of sounds in the control of balance. This contribution will be significant because the mechanism underlying the possible link between hearing loss and falls needs to be better understood. Our work will inform the development of improved balance assessments, specifically whether balance tests need to include postural responses to sounds. Hearing status may need to be considered as a potential indicator of increased fall risk. Our work will also inform the development of new rehabilitation approaches combining sounds and visuals in different contexts. The affordability and portability of HMDs and headphones will allow future translation to a clinical setting.