Sound symbolism scaffolds language development in preverbal infants
Research output: Contribution to journal › Article › peer-review
In: Cortex, Vol. 63, 16.09.2014, pp. 196-205.
RIS
TY - JOUR
T1 - Sound symbolism scaffolds language development in preverbal infants
AU - Asano, M.
AU - Imai, M.
AU - Kita, S.
AU - Kitajo, K.
AU - Okada, H.
AU - Thierry, G.
PY - 2014/9/16
Y1 - 2014/9/16
N2 - A fundamental question in language development is how infants start to assign meaning to words. Here, using three Electroencephalogram (EEG)-based measures of brain activity, we establish that preverbal 11-month-old infants are sensitive to the non-arbitrary correspondences between language sounds and concepts, that is, to sound symbolism. In each trial, infant participants were presented with a visual stimulus (e.g., a round shape) followed by a novel spoken word that either sound-symbolically matched (“moma”) or mismatched (“kipi”) the shape. Amplitude increase in the gamma band showed perceptual integration of visual and auditory stimuli in the match condition within 300 msec of word onset. Furthermore, phase synchronization between electrodes at around 400 msec revealed intensified large-scale, left-hemispheric communication between brain regions in the mismatch condition as compared to the match condition, indicating heightened processing effort when integration was more demanding. Finally, event-related brain potentials showed an increased adult-like N400 response – an index of semantic integration difficulty – in the mismatch as compared to the match condition. Together, these findings suggest that 11-month-old infants spontaneously map auditory language onto visual experience by recruiting a cross-modal perceptual processing system and a nascent semantic network within the first year of life.
U2 - 10.1016/j.cortex.2014.08.025
DO - 10.1016/j.cortex.2014.08.025
M3 - Article
VL - 63
SP - 196
EP - 205
JO - Cortex
JF - Cortex
SN - 0010-9452
ER -
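
The abstract above refers to three EEG-derived measures: gamma-band amplitude, phase synchronization between electrodes, and the N400 event-related potential. Purely as an illustration of the first two kinds of measure, and not the authors' actual analysis pipeline, the minimal Python sketch below computes gamma-band amplitude and an inter-electrode phase-locking value (PLV) from band-pass-filtered, Hilbert-transformed signals. The sampling rate, the 30-60 Hz gamma range, the simulated two-channel data, and all names in the code are assumptions made for the example.

# Illustrative sketch only (assumed parameters, simulated data; not the
# authors' pipeline): gamma-band amplitude and inter-electrode phase
# synchronization (phase-locking value, PLV) via band-pass filtering and
# the Hilbert transform.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                       # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)  # one 1-second epoch

# Simulated signals from two electrodes: a shared 40 Hz (gamma) component
# plus independent noise, standing in for real epoched EEG.
rng = np.random.default_rng(0)
shared = np.sin(2 * np.pi * 40 * t)
x = shared + 0.5 * rng.standard_normal(t.size)
y = shared + 0.5 * rng.standard_normal(t.size)

def band_pass(sig, low, high, fs, order=4):
    # Zero-phase Butterworth band-pass filter.
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, sig)

# Analytic signals in an assumed 30-60 Hz gamma band.
gx = hilbert(band_pass(x, 30, 60, fs))
gy = hilbert(band_pass(y, 30, 60, fs))

# Amplitude envelope: the kind of quantity behind "amplitude increase in
# the gamma band".
gamma_amplitude = np.abs(gx).mean()

# Phase-locking value between the two electrodes: the kind of quantity
# behind "phase synchronization between electrodes".
phase_diff = np.angle(gx) - np.angle(gy)
plv = np.abs(np.exp(1j * phase_diff).mean())

print(f"mean gamma amplitude: {gamma_amplitude:.3f}")
print(f"phase-locking value:  {plv:.3f}")

In a real infant EEG study these quantities would be computed per condition (match vs. mismatch) across epochs and electrodes and then compared statistically; the sketch only shows the core signal-processing step for a single simulated epoch.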