Jeffery A. Jones
Brain activation when acoustic information is delayed during an audiovisual speech task.
Proceedings of the 15th International Congress of Phonetic Sciences, 2209-2212.
Jones, J. A. & Callan, D. E.
published: 2003 | Research publication | Jones Lab
Although the impact that visual information can have on
speech perception is well known, we do not yet have an
adequate description of the neural mechanisms involved.
We asked subjects to identify consonants produced by a
speaker they both saw and heard while we acquired
functional magnetic resonance imaging (fMRI) scans of
their brains. During one experimental condition the acoustics
were synchronous with the visual image of the speaker's
face movements; in another they were delayed by 250 ms.
With respect to unimodal control conditions, we found
more extensive enhanced activity in the superior temporal
gyrus and sulcus (STG/STS), bilaterally, when the
audiovisual stimuli were synchronous than when the
sound was delayed. When we directly compared these two
experimental conditions, we found more activity in the
right premotor cortex and inferior parietal lobule (IPL)
when the acoustics were delayed. The results indicate that
polymodal regions of the STS and IPL play important but
different roles in audiovisual speech perception.
Download: PDF (856k) JonesCallanICPhS2003.pdf
revised Feb 12/08