New PhD and Postdoctoral projects

We had the pleasure of welcoming two new doctoral students and three postdoctoral researchers.

Bridging communication in behavioural and neural dynamics

Isaïh Mohamed
Supervisors: Daniele Schön, Institut de Neurosciences des Systèmes & Leonardo Lancia, Laboratoire de Phonétique et Phonologie

The aim of this project is to bridge interpersonal verbal coordination and neural dynamics. In practice, we will collect neurophysiological data from individuals (mostly patients with intracranial recordings) performing different interactive language tasks. We will use natural language processing methods to estimate objective features of verbal coordination from the speech/language signals. We will then use machine learning and information-theoretic approaches to relate the dynamics of coordinative verbal behaviour to spatio-temporal neural dynamics.
More precisely, we plan to use several tasks that have proven effective in the study of verbal interactions. Some tasks are rather constrained and controlled (allowing us to manipulate the coordinative dynamics), while others assess conversation under more natural conditions. Speech recordings make it possible to quantify coordination at different linguistic levels in a time-resolved manner. These metrics can then be used to interpret changes in neural dynamics as a function of verbal coordination. We plan to use different approaches: a machine learning approach (decoding the speech signal of the speaker from the neural signal of the listener) as well as an information-theoretic approach (modelling the extent to which the relation between neural signals and upcoming speech is influenced by the current level of coordination, estimated by convergence, for instance).
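As a rough illustration of the decoding idea, the sketch below fits a lagged linear (ridge) model that predicts the speaker's speech envelope from the listener's neural signal. The data, sampling rate and lag window are placeholder assumptions, not the project's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# Illustrative sketch only: decode the speaker's speech envelope from the
# listener's neural activity with a lagged linear (backward) model.
# All signals are synthetic placeholders standing in for real intracranial
# recordings and speech envelopes; sampling rate and lags are assumptions.

rng = np.random.default_rng(0)
sfreq = 100                                    # Hz, assumed common sampling rate
n_samples, n_channels = 6000, 32               # 60 s of data, 32 neural channels
neural = rng.standard_normal((n_samples, n_channels))   # listener's neural signal
envelope = rng.standard_normal(n_samples)               # speaker's speech envelope

# Build lagged neural features: listener activity from 0 to 300 ms before each
# speech sample, concatenated across channels.
lags = np.arange(0, int(0.3 * sfreq))
X = np.hstack([np.roll(neural, lag, axis=0) for lag in lags])
X, y = X[lags[-1]:], envelope[lags[-1]:]       # drop samples wrapped by np.roll

# Ridge regression with cross-validated decoding accuracy (R^2); with real data,
# above-chance scores would indicate that the listener's brain tracks the speaker.
model = RidgeCV(alphas=np.logspace(-2, 4, 13))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"decoding R^2 per fold: {scores.round(3)}")
```

With real recordings, the same scheme could be fitted separately for high versus low levels of verbal coordination to test whether decoding accuracy varies with the coordinative state.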
Overall, this project will provide a better understanding of the link between behavioural coordinative dynamics and neural dynamics. For instance, compared to simple coordinative dynamics, more difficult coordinative behaviour will probably require a change in the ratio between top-down and bottom-up connections between frontal and temporal regions in specific frequency bands (an increase in top-down beta and a decrease in bottom-up gamma).
The strength of this project lies in merging sophisticated coordination designs, advanced analyses of verbal coordination dynamics and cutting-edge neuroscience tools with unique neural data in humans.


Opening a window onto the reader's mind: determining with TMS and EEG the cortical network involved in oculomotor behaviour during reading

Régis Mancini
Supervisors: Françoise Vitu, Laboratoire de Psychologie Cognitive & Boris Burle, Laboratoire de Neurosciences Cognitives

Eye movements during reading have been studied for more than a century, revealing a highly stereotyped behaviour, despite substantial variability in the amplitude of saccades and the positions of fixations on the lines of text. Most models proposed to account for this behaviour are based on cognitive guidance of the gaze, and therefore presuppose essentially top-down control. These top-down models are nevertheless contradicted by the recently reported finding that an "illiterate" model of saccade programming in the superior colliculus, a multi-integrative subcortical structure, fairly accurately predicts the oculomotor behaviour of readers from early visual processing (luminance contrast) alone. This result suggests, on the contrary, that the neocortex plays only a secondary role in oculomotor control during reading.
The thesis aims, on the one hand, to characterize the cortical network involved in oculomotor control during reading and, on the other hand, to determine the temporal dynamics of activation of these different cortical areas. The research relies primarily on transcranial magnetic stimulation (TMS), which temporarily inactivates a given cortical area in healthy participants, combined with the recording of eye movements during a sentence-reading task. An effect of inactivating a given cortical area on the classically observed oculomotor behaviours would therefore indicate its involvement in reading. In a second step, the TMS studies will be complemented by an approach based on electroencephalographic (EEG) recordings.
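To give a concrete, purely illustrative sense of the oculomotor measures involved, the sketch below detects saccades in a synthetic gaze trace with a simple velocity threshold and reports their amplitudes. The sampling rate, threshold and data are assumptions, not the thesis's actual analysis pipeline.

```python
import numpy as np

# Purely illustrative sketch (not the thesis's pipeline): detect saccades in a
# gaze-position trace with a simple velocity threshold and report amplitudes.
# The staircase-like trace below is synthetic; real data would come from an
# eye tracker recorded during sentence reading.

rng = np.random.default_rng(1)
sfreq = 1000                                   # Hz, assumed sampling rate
x = np.repeat([1.0, 3.5, 6.0, 9.0], 500)       # gaze position (deg): fixations separated by saccades
x = x + 0.005 * rng.standard_normal(len(x))    # small measurement noise

velocity = np.gradient(x) * sfreq              # deg/s
is_saccade = np.abs(velocity) > 30             # common ~30 deg/s velocity criterion

# Onsets/offsets from threshold crossings, then saccade amplitudes
crossings = np.diff(is_saccade.astype(int))
onsets, offsets = np.flatnonzero(crossings == 1), np.flatnonzero(crossings == -1)
for on, off in zip(onsets, offsets):
    print(f"saccade at {on / sfreq:.3f} s, amplitude {x[off + 1] - x[on]:+.2f} deg")
```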


The neural correlates of embodied L2 learning

Ana Zappa
Supervisor: Cheryl Frenck-Mestre, Laboratoire Parole et Langage

Within the framework of embodied semantics, the overlap between sensorimotor and language processes could have important implications for second language (L2) acquisition. Neurolinguistic studies have established motor-language interactions, and behavioural studies have shown that coupling word encoding with corresponding, relevant gestures enhances language learning. The present study will explore these aspects of language processing and acquisition using virtual reality (VR), to investigate how performing naturalistic actions during the learning of L2 action verbs may enhance the mapping between word form and meaning. We will examine whether “embodied learning”, i.e. learning through specific physical movements coherent with the meaning of new words, creates linguistic representations that produce greater motor resonance (activity in the motor cortex), possibly due to stronger and more specific motor traces, compared to a control condition. In addition, we will investigate whether embodied learning leads to better retention; indeed, the involvement of both linguistic and motor areas should lead to a more complete semantic representation and increased learning.

Training will take place over two days using auditory verbs, an Oculus VR headset and a hand controller. Native French speakers will learn the same set of L2 verbs in Serbian, in either the “embodied learning” or the control condition. The “embodied learning” group will learn the action verbs by performing a different specific movement to manipulate objects for each verb; the control group will simply point to the virtual objects. Both before and after training, time-frequency analyses will be carried out on the EEG data to measure mu and beta suppression, associated with motor activation, while participants passively listen to the new verbs. EEG and behavioural accuracy will also be used to assess learning in an audio-visual match-mismatch task.

We expect the “embodied learning” group to show greater motor activation during verb processing post-training, and that this will correlate with improved learning, as indexed by a greater N400 effect and better behavioural results compared to the control condition. If so, this would suggest that embodied learning adds a motor trace to lexical items, which would support theories of embodied semantics.
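As a rough sketch of how mu and beta suppression can be quantified in such a time-frequency analysis, the example below computes baseline-corrected power in the 8-13 Hz and 15-25 Hz ranges with MNE-Python on synthetic EEG epochs. The band limits, channel counts and data are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np
import mne

# Minimal sketch, not the study's actual pipeline: quantify mu (8-13 Hz) and
# beta (15-25 Hz) power change during passive listening, relative to a
# pre-stimulus baseline. Data are synthetic stand-ins for real EEG epochs.

sfreq, n_epochs, n_channels, n_times = 250, 40, 32, 500   # 2 s epochs
rng = np.random.default_rng(0)
data = rng.standard_normal((n_epochs, n_channels, n_times)) * 1e-6  # volts

info = mne.create_info([f"EEG{i:02d}" for i in range(n_channels)], sfreq, "eeg")
epochs = mne.EpochsArray(data, info, tmin=-0.5)   # 0 s = verb onset

freqs = np.arange(8, 26)                          # covers mu and beta ranges
power = mne.time_frequency.tfr_morlet(
    epochs, freqs=freqs, n_cycles=freqs / 2, return_itc=False)
power.apply_baseline(baseline=(-0.5, 0.0), mode="percent")

# Average power change per band over the post-onset window; negative values
# would indicate suppression (desynchronization) relative to baseline.
mu = power.copy().crop(tmin=0.2, tmax=1.0, fmin=8, fmax=13).data.mean()
beta = power.copy().crop(tmin=0.2, tmax=1.0, fmin=15, fmax=25).data.mean()
print(f"mu change: {mu:+.2%}, beta change: {beta:+.2%}")
```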

An investigation into the inhibitory mechanisms underlying inner verbal actions

Ladislas Nalborczyk
Supervisors: F.-Xavier Alario, Laboratoire de Psychologie Cognitive & Marieke Longcamp, Laboratoire de Neurosciences Cognitives.

The main goal of this project is to tackle the problem of motor inhibition during covert speech and imagined typing, where covert speech is considered the mental imagery of overt speech. Put simply, how can we imagine raising our arm without actually raising it? How can we imagine a conversation without actually producing it overtly? What are the cognitive and neural mechanisms that operate to prevent motor execution? How (where and when) are these mechanisms neurally implemented? Can we enhance or degrade these inhibitory mechanisms online? These questions, and the problem of motor inhibition more generally, emerge from the use of concepts such as simulation or emulation to explain the phenomenon of motor imagery. These views suggest that motor imagery, defined as the mental representation of an action without overt execution, results from the simulation or emulation of actual execution. However, this raises the question of how imagining an action can fail to lead to actual execution. We will tackle these questions using novel behavioural paradigms and transcranial magnetic stimulation in a series of five experiments.

Functional role of oscillatory dynamics in motor cortex during speech perception

Noémie te Rietmolen
Supervisors: Kristof Strijkers, Laboratoire Parole et Langage & Benjamin Morillon, Institut de Neurosciences des Systèmes.

While our knowledge of the brain structures underpinning speech perception has greatly advanced over the last decades, the neurophysiological mechanisms that explain how humans process speech remain largely unknown.
In particular, influential theories of speech perception disagree on the role of the motor system (Skipper et al., 2017): dual-stream theories suggest that the motor system is not crucial (Hickok & Poeppel, 2007; Hickok, 2014), whereas opposing theories ascribe it a fundamental role in speech perception (Barnaud et al., 2018; Pulvermüller & Fadiga, 2010). A fruitful approach to understanding the neural mechanisms underlying speech perception is to investigate cortical oscillations. Cortical oscillations refer to synchronized rhythmic brain activity, which is hypothesized to be important for structuring, binding and consolidating complex information in the cerebral cortex (e.g. Buzsáki & Draguhn, 2004). Given the intriguing possibility that cortical oscillations may offer a link between brain and behaviour, in particular for higher-order cognitive processes such as language perception (e.g. Buzsáki, 2010), the current project sets out to investigate how brain oscillations in the motor cortex impact speech comprehension.
The objectives and hypotheses of this project are guided by current observations and proposals regarding the role of cortical oscillations in (1) the extraction of speech sounds and (2) the extraction of meaning when perceiving language. With regard to speech sound processing (1), it has been suggested that cortical oscillations "provide the [temporal] infrastructure to parse and decode connected speech" (Giraud & Poeppel, 2012). At the level of the auditory cortex, low-frequency neural oscillations entrain to the (quasi-)rhythmic structure of the speech signal and causally contribute to speech comprehension (Peelle, 2018; Riecke et al., 2018; Zoefel et al., 2018). Moreover, neural entrainment to speech is also observed in regions beyond the auditory cortex, in particular in the motor cortex, at the phrasal (0.6-1.3 Hz), lexical (1.3-3 Hz) and syllabic (3.5-4.5 Hz) rates (e.g. Keitel et al., 2018; Assaneo & Poeppel, 2018). One hypothesis is that entrainment in these frequency ranges reflects temporal predictions derived from the temporal regularities present in speech (e.g. Morillon & Baillet, 2017). In a similar vein, cortical oscillations related to the lexical meaning of spoken words (2) have been observed over auditory and motor cortices in the high-frequency range (beta and gamma; e.g. Pulvermüller et al., 1996; Canolty et al., 2007).
These large-scale synchronizations between fronto-central and superior temporal brain regions are hypothesized to reflect the binding of sensorimotor experiences into lexical categories (e.g. Strijkers, 2016; Garagnani et al., 2017). However, at present the functional role (if any) of such motor oscillatory activity in speech perception remains debated. In the current project, we set out to investigate the exact nature of oscillatory dynamics in the motor cortex for key components of speech perception (i.e. sound and meaning extraction) in two complementary studies, each containing a behavioural and a neurophysiological (magnetoencephalography, MEG) part. The behavioural experiments will assess whether activation of the motor cortex improves speech perception (and, if so, under which conditions), and the MEG experiments will assess whether these potential behavioural improvements are indeed driven by frequency-specific cortical oscillations and enhanced functional coupling between motor and auditory cortical regions. In this manner, the results of this project may provide valuable insights for the theoretical development of sensorimotor integration during language processing, and may even show that specific oscillatory patterns (in different frequency ranges) drive different processes involved in the perception of speech.
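As a purely illustrative sketch of the kind of speech-brain entrainment measure mentioned above, the example below computes coherence between a synthetic speech envelope and a synthetic sensor signal, averaged within the phrasal, lexical and syllabic bands. The signals, sampling rate and window length are assumptions, not the project's actual MEG pipeline.

```python
import numpy as np
from scipy.signal import coherence

# Purely illustrative sketch of a speech-brain entrainment measure: coherence
# between a synthetic speech envelope and a synthetic sensor signal, averaged
# within the phrasal, lexical and syllabic bands mentioned above.

rng = np.random.default_rng(0)
sfreq = 100                                    # Hz, assumed sampling rate
t = np.arange(0, 300, 1 / sfreq)               # 5 minutes of signal
envelope = 0.5 * (1 + np.sin(2 * np.pi * 4 * t)) + 0.2 * rng.standard_normal(len(t))
sensor = 0.5 * envelope + rng.standard_normal(len(t))   # sensor weakly tracking the envelope

# Magnitude-squared coherence with 10 s windows (0.1 Hz frequency resolution)
f, coh = coherence(envelope, sensor, fs=sfreq, nperseg=10 * sfreq)

bands = {"phrasal": (0.6, 1.3), "lexical": (1.3, 3.0), "syllabic": (3.5, 4.5)}
for name, (lo, hi) in bands.items():
    mask = (f >= lo) & (f <= hi)
    print(f"{name:8s} ({lo}-{hi} Hz): mean coherence {coh[mask].mean():.3f}")
```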
