Abstract: Articulatory gestures and orthography are connected with speech through a natural and an artificial association, respectively. This EEG study investigated whether the integration of speech with these two visual inputs relies on the same mechanism, despite their different characteristics. A comparison of skilled readers’ brain responses elicited by spoken words presented alone versus synchronously with visemes or graphemes showed that while neither visual input induced audiovisual integration on the N1 acoustic component, both led to a supra-additive integration on P2, with a stronger integration between speech and graphemes on left-anterior electrodes. This pattern persisted on the P350 component and generalized to all electrodes. The finding suggests a strong impact of reading acquisition on phonetic processing and lexical access. It also indirectly indicates that the dynamic and predictive cues present in natural lip movements, but absent from static visemes, are critical to the contribution of visual articulatory gestures to speech processing.