Individual Submission Summary
Visual Attention to Faces of Unfamiliar versus Familiar Language Speakers Varies by Infant’s Language Background

Thu, March 21, 4:00 to 5:15pm, Baltimore Convention Center, Floor: Level 1, Exhibit Hall B

Integrative Statement

Introduction: Early social-linguistic experiences shape infants’ face processing. For instance, relative to monolinguals, 4- to 12-month-old bilingual infants look longer at the mouths of talking faces, whether the face is speaking a familiar or an unfamiliar language (Pons et al., 2015). This behavior is thought to facilitate the uptake of redundant audiovisual speech cues for language learning. However, it is unknown how attention to familiar versus unfamiliar linguistic stimuli varies during the period of rapid language growth between 15 and 24 months. Here, we examine how linguistically diverse infants’ visual attention to speakers is modulated by familiarity with the language being spoken.

Participants: 15- to 24-month-old monolinguals (n=47; Mage=19.07 months, SDage=2.84) and bilinguals (n=42; Mage=18.53 months, SDage=3.08) participated. Bilingual infants were exposed to English for 20-80% of the day (M=56%, SD=19.53%); monolingual infants were exposed to English for 85-100% of the day (M=92.60%, SD=18.17%).

Method: We recorded infants’ eye movements as they viewed six 20- to 25-second videos of two female English-Armenian bilingual speakers conversing with each other and with the infant in infant-directed speech. The dialogue in three videos was in an unfamiliar language (Armenian); in the other three, it was in a familiar language (English). The women did not mix or switch languages within any video. The woman on the left was coded as Speaker1; the woman on the right as Speaker2. Video order was pseudorandomized for each infant, with no more than two consecutive videos in the same language. Our primary measure was latency to look at each speaker’s face at the onset of speech, measured in video frames.

Results: We conducted two 2 (language background: monolingual vs. bilingual) x 2 (video language: English vs. Armenian) ANOVAs with latency to first look at the speaker’s face as the dependent variable (Figure 1); a separate ANOVA was conducted for looking to each speaker’s face at the onset of speech. Monolingual and bilingual infants differed in their latencies to first look at the speakers’ faces. When Speaker1 was talking, monolinguals looked to that speaker significantly faster than bilinguals did (p=.008). A marginally significant language background by video language interaction (p=.07) indicated that monolinguals and bilinguals did not differ in their latency to look at Speaker1 when the video language was English, but bilinguals were slower than monolinguals when the video language was Armenian. A similar marginally significant interaction (p=.08) indicated that both groups were slower to look to Speaker2 when the video language was Armenian than when it was English, but the English-Armenian difference in latency was larger for bilinguals than for monolinguals (~12 vs. ~6 frames).

Discussion: Infants’ early language environments impact attention to bilingual speakers. Infants may modulate their looking patterns based on the speakers’ language backgrounds and identify when they can or cannot understand a speaker. Infants’ attention to social-linguistic contexts may be affected by the initial linguistic information provided by a new speaker.
