Individual Submission Summary

Poster #17 - Top-down Sensory Prediction in Neonates: fNIRS Evidence

Thu, March 21, 12:30 to 1:45pm, Baltimore Convention Center, Floor: Level 1, Exhibit Hall B

Integrative Statement

Real-world experience is characterized by the presence of statistical information (e.g., thunder comes with lightning). Exposure to this statistical information plays a fundamental role in shaping the development of learning, perception, and cognitive abilities. However, little is known about the neural mechanisms that translate statistical information into adaptive changes in the developing brain. One possibility is that the developing brain uses statistical information to form predictions of upcoming sensory input and modulates neural activity in sensory systems through top-down connections.
Recent studies using functional near-infrared spectroscopy (fNIRS) provide evidence that the developing brain uses statistical information to form predictions in perceptual cortex. After learning an audio-visual association, 6-month-old infants activated their occipital lobe during predictive auditory stimulation without visual input. That is, the brain can make predictions about upcoming visual information based on associated audio signals (Emberson et al., 2015, 2017). In the current study, we extended this work to investigate the developmental origins of this ability, and specifically whether it is available to infants in the first days of postnatal life (< 48 hrs).
Sixteen full-term healthy neonates (27.93 hrs) were first familiarized with an audio-visual association in which an auditory stimulus (the vocal sound /ba/, 800 ms) preceded a visual stimulus (an LED light, 800 ms). After 108 such audio-visual events (6 blocks of 18 events), we examined whether neonates could use the predictive sound to modulate neural activity in the visual system, using a predictive-audio-only (pA+V-) condition. In each pA+V- block, 6 predictive sounds (/ba/) were presented without visual input. In addition to the pA+V- blocks, we presented audio-visual (pA+V+) blocks, each of which included 18 audio-visual events. The number of predictive sound-only events was 25% of the number of audio-visual events. A GowerLabs NIRS system with a neonatal EASYCAP was used. Neonates were asleep during the experiment.
Neural activity in the visual system was measured as the increase in oxy-hemoglobin from baseline (0) in the occipital channels within a 30 s window after stimulus onset. We found a significant increase in oxy-hemoglobin when visual stimulation was presented in the pA+V+ condition, p = .046, indicating neural responses to the visual stimulation in neonates’ visual cortex. Moreover, a significant oxy-hemoglobin increase was also found in the pA+V- condition (p = .001, Fig. 1), indicating that neonates could use the predictive sound to activate visual cortex without any visual input. No deoxy-hemoglobin change was found in either condition. This finding suggests that top-down sensory prediction already exists shortly after birth.
A control experiment (n = 6) corroborated this conclusion by showing no oxy-hemoglobin change in visual cortex when neonates heard a non-predictive sound (uA+V- condition). Our results show that the ability to form predictions is present in the first days of postnatal life, suggesting that it is a fundamental ability supporting early development.