The infant brain may be predisposed to identify perceptually salient cues that are common to both signed and spoken languages. Recent theory based on spoken languages has advanced sonority as one of these potential language acquisition cues. Using a preferential looking paradigm with an infrared eye tracker, we explored the visual attention of hearing 6- and 12-month-olds with no sign language experience as they watched fingerspelling stimuli that conformed to either high sonority (well-formed) or low sonority (ill-formed) values, which are relevant to syllabic structure in signed language.
Younger babies showed highly significant looking preferences for well-formed, high-sonority fingerspelling, while older babies showed no preference for either fingerspelling variant, despite showing a strong preference in a control condition. These findings suggest that babies possess a sensitivity to specific sonority-based contrastive cues at the core of human language structure, and that this sensitivity is subject to perceptual narrowing irrespective of language modality (visual or auditory), shedding new light on universals of early language learning. Read more about this topic in Adam Stone, Laura-Ann Petitto, and Rain Bosworth’s research paper.