Corentin Bernard, Jocelyn Monnoyer, Michaël Wiertlewski, & Sølvi Ystad
A surface texture is perceived through both the sound and the vibrations produced while it is explored by our fingers. Because of their common origin, the two modalities strongly influence each other, particularly at frequencies above 60 Hz, for which vibrotactile perception and pitch perception share common neural processes. However, whether the sensation of rhythm is shared between audio and haptic perception remains an open question. In this study, we show striking similarities between the audio and haptic perception of rhythmic changes, and we demonstrate the interaction of the two modalities below 60 Hz. Using a new surface-haptic device to synthesize arbitrary audio-haptic textures, we conducted psychophysical experiments showing that the perception threshold curves of audio and haptic rhythmic gradients are identical. Moreover, multimodal integration occurs when audio and haptic rhythmic gradients are congruent. We propose a multimodal model of rhythm perception to explain these observations. These findings suggest that audio and haptic signals are likely processed by common neural mechanisms for the perception of rhythm as well. They provide a framework for audio-haptic stimulus generation that can benefit nonverbal communication and modern human-machine interfaces.