Journal of Experimental Psychology: Learning, Memory, and Cognition. Advance online publication.
Auditory speech appears to be linked to visual articulatory gestures and orthography through different mechanisms. Yet, both types of visual information have a strong influence on speech processing. The present study directly compared their contributions to speech processing using a novel word learning paradigm. Native speakers of French, who were familiar with English, learned minimal pairs of novel English words containing the English /θ/-/f/ phonemic contrast under one of three exposure conditions: (a) the auditory forms of the novel words alone, (b) the auditory forms paired with articulatory gestures, or (c) the auditory forms paired with orthography. The benefits of the three methods were compared during training and at two posttraining time points at which the visual cues were no longer available. We also assessed participants’ auditory-only discrimination of the /θ/-/f/ contrast before and after training. During training, the visual cues facilitated novel word learning beyond the benefit of the auditory input alone. However, these additional benefits did not persist when participants’ discrimination and novel word learning performance were assessed immediately after training. Most interestingly, after a night’s sleep, participants who had been exposed to orthography during training showed significant improvement in both discrimination and novel word learning relative to the previous day. The findings are discussed in terms of online versus residual impacts of articulatory gestures and orthography on speech processing. While both visual cues are beneficial when presented simultaneously with speech, only orthography shows residual impacts, leading to a sleep-dependent enhancement of lexical knowledge through memory consolidation and retuning of the second language /θ/-/f/ contrast.