BBC - When Hands Speak as Loud as Words: Insights from Gesture and Sign Language Concerning the Cognitive and Neural Mechanisms of Learning
Gestures (meaningful hand movements) are often produced with speech, conveying complementary information visually. Similarly, signs can convey information about the shape and motion of their referents via visual iconicity. In this talk, I will describe my investigations into the neurocognitive mechanisms of gesture-speech integration and sign processing. First, I will demonstrate that motor encoding via enactment facilitates the acquisition of signs by hearing adults. Second, I will show how gestures and comparable non-embodied movements affect the learning and perception of lexical tones in an unknown second language, as well as of non-speech tones. Third, I will discuss key neurophysiological mechanisms of gesture-speech integration. I will conclude by considering how these findings can inform interventions to enhance learning.