Description
Multisensory Processing brings a unique, audio-centric view to studies of multisensory behavior and perception and their neural correlates. Contributors to this volume are leading experts in multisensory processing with widely differing backgrounds and perspectives: some consider themselves auditory neuroscientists, while others are interested in the multisensory aspects of perception without a special affinity for any single sensory system. The volume is organized into several broad themes aligned with the major current areas of interest and emphasis in auditory-oriented multisensory research: psychophysics (basic audiovisual influences on perception, speech processing, and multisensory function outside of canonical audiovisual pairings), anatomical substrates, neurophysiological bases, and clinical perspectives. The goal of the volume is to generate fruitful discussion that illuminates for readers how auditory function is seen in the broader context of multisensory processing, and how understanding the interactions between the senses can shape our view of auditory perception as well as its clinical and applied implications for those suffering from communication disorders.

Multisensory Processing will appeal to graduate students, postdoctoral fellows, and established psychologists and neuroscientists who are interested in auditory function and multisensory processing. It will also be of interest to hearing scientists who want to know how the other senses shape auditory function.

Dr. Adrian K.C. Lee is Director of the Center for Auditory Neuroimaging and Assistant Professor in the Department of Speech & Hearing Sciences and the Institute for Learning & Brain Sciences at the University of Washington. The Lee lab is interested in mapping the cortical dynamics associated with auditory attention, as well as how the oculomotor and visual attentional networks interact with auditory attention. Dr. Mark T. Wallace is Director of the Vanderbilt Brain Institute and Professor of Hearing and Speech Sciences, Psychology, and Psychiatry at Vanderbilt University. The Wallace lab is interested in better understanding how the brain synthesizes information from multiple sensory systems, and it employs an array of approaches ranging from neurophysiology in animal models to neuroimaging in “typical” and clinical populations.

Contents
Preface
Visual Influence on Auditory Perception
Cue Combination Within a Bayesian Framework
Toward a Model of Auditory-Visual Speech Intelligibility
An Object-Based Interpretation of Audiovisual Processing
Hearing in a “Moving” Visual World: Coordinate Transformations Along the Auditory Pathway
Multisensory Processing in the Auditory Cortex
Audiovisual Integration in the Primate Prefrontal Cortex
Using Multisensory Integration to Understand the Human Auditory Cortex
Combining Voice and Face Content in the Primate Temporal Lobe
Neural Network Dynamics and Audiovisual Integration
Cross-Modal Learning in the Auditory System
Multisensory Processing Differences in Individuals with Autism Spectrum Disorder




