Blog Archives

Crossmodal interaction of facial and vocal person identity information: An event-related potential study

Hearing a voice and seeing a face are essential parts of person identification and social interaction. It has been suggested that the two types of information interact not only at late processing stages but already at the level of perceptual encoding (< 200 ms). The present study analysed when visual and auditory representations of person identity modulate the processing of voices. In unimodal trials, two successive voices (S1-S2) of the same or of two different speakers were presented. In crossmodal trials, the S1 was the face of either the same or a different person with respect to the following voice stimulus. Participants had to decide whether the voice probe (S2) came from an elderly or a young person. Reaction times to the S2 were shorter when these stimuli were person-congruent, in both the unimodal and the crossmodal condition. ERPs to person-incongruent as compared to person-congruent trials (S2) were enhanced at early (100–140 ms) and later processing stages (270–530 ms) in the crossmodal condition; a similar later negative ERP effect (270–530 ms) was also found in the unimodal condition. These results suggest that identity information conveyed by a face can modulate the sensory processing of voice stimuli.

from Brain Research
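
A minimal sketch of the windowed congruency contrast described above, assuming S2-locked epochs stored as NumPy arrays (trials × channels × samples); the array names, sampling rate, and epoch onset are illustrative assumptions, not values taken from the paper:

```python
# Hedged sketch: mean-amplitude congruency effects in the two latency
# windows the abstract reports. All data and parameters are hypothetical.
import numpy as np

FS = 500            # sampling rate in Hz (assumed)
EPOCH_START = -0.1  # epoch onset relative to S2 onset, in seconds (assumed)

def window_mean(epochs, t_start, t_stop):
    """Mean amplitude per trial and channel within a latency window (seconds)."""
    i0 = int((t_start - EPOCH_START) * FS)
    i1 = int((t_stop - EPOCH_START) * FS)
    return epochs[:, :, i0:i1].mean(axis=2)

# Hypothetical S2-locked epochs: 80 trials, 32 channels, 800 ms at 500 Hz.
congruent = np.random.randn(80, 32, 400)
incongruent = np.random.randn(80, 32, 400)

for name, (t0, t1) in {"early": (0.100, 0.140), "late": (0.270, 0.530)}.items():
    diff = window_mean(incongruent, t0, t1).mean(0) - window_mean(congruent, t0, t1).mean(0)
    print(f"{name} window incongruent-minus-congruent effect per channel:", diff.round(2))
```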

Cross-modal interactions between human faces and voices involved in person recognition

Faces and voices are key features of person recognition, but the way the brain links them together is still unknown. In this study, we measured brain activity using functional magnetic resonance imaging (fMRI) while participants recognized previously learned static faces, voices, and voice–static face associations. Using a subtraction method between bimodal and unimodal conditions, we observed that voice–face associations activated both unimodal visual and auditory areas and specific multimodal regions located in the left angular gyrus and the right hippocampus. Moreover, a functional connectivity analysis confirmed the connectivity of the right hippocampus with the unimodal areas. These findings demonstrate that binding faces and voices relies on a cerebral network sustaining different aspects of integration, such as sensory input processing, attention, and memory.

from Cortex
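
The bimodal-versus-unimodal subtraction logic can be illustrated with a toy voxelwise example; the beta maps below are simulated, and the max-criterion contrast is a common multisensory-integration heuristic rather than necessarily the authors' exact pipeline:

```python
# Toy sketch of a subtraction-style multisensory contrast.
# All beta values are simulated; nothing here reproduces the study's data.
import numpy as np

n_voxels = 10_000
beta_face = np.random.randn(n_voxels)       # unimodal visual condition
beta_voice = np.random.randn(n_voxels)      # unimodal auditory condition
beta_facevoice = np.random.randn(n_voxels)  # bimodal (voice-face) condition

# One common multisensory criterion: the bimodal response exceeds the
# stronger of the two unimodal responses.
multimodal_candidates = beta_facevoice > np.maximum(beta_face, beta_voice)
print("candidate multimodal voxels:", multimodal_candidates.sum())
```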

Cerebral Lateralization of Face-Selective and Body-Selective Visual Areas Depends on Handedness

The left-hemisphere dominance for language is a core example of the functional specialization of the cerebral hemispheres. The degree of left-hemisphere dominance for language depends on hand preference: whereas the majority of right-handers show left-hemispheric language lateralization, this proportion is reduced in left-handers. Here, we assessed whether handedness analogously influences lateralization in the visual system. Using functional magnetic resonance imaging, we localized four extrastriate areas of varying category selectivity in left- and right-handers: the fusiform face area (FFA), extrastriate body area (EBA), fusiform body area (FBA), and human motion area (human middle temporal area, hMT). We found that the lateralization of FFA and EBA depends on handedness: these areas were right lateralized in right-handers but not in left-handers. A similar tendency was observed in FBA but not in hMT. We conclude that the relationship between handedness and hemispheric lateralization extends to functionally lateralized parts of visual cortex, indicating a general coupling between cerebral lateralization and handedness. Our findings indicate that hemispheric specialization is not fixed but can vary considerably across individuals, even in areas engaged relatively early in the visual system.

from Cerebral Cortex
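
Lateralization in such studies is typically quantified with an index of the form LI = (L − R) / (L + R). A hedged sketch with hypothetical FFA activation extents follows; the paper's exact metric may differ:

```python
# Conventional lateralization index: positive values indicate left
# lateralization, negative values right lateralization.
# The voxel counts below are hypothetical, purely for illustration.
def lateralization_index(left: float, right: float) -> float:
    """LI = (L - R) / (L + R), bounded in [-1, 1]."""
    return (left - right) / (left + right)

# Hypothetical left/right FFA voxel counts for one right-handed subject:
print(lateralization_index(left=120, right=310))  # about -0.44, i.e. right lateralized
```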

The Language and Literacy Development of Head Start Children: A Study Using the Family and Child Experiences Survey Database

Conclusion: The findings demonstrate the unique contributions that the home literacy environment and the presence of speech-language impairment during preschool make to children's early reading outcomes.

from Language, Speech and Hearing Services in Schools

Older Adults’ Recognition of Bodily and Auditory Expressions of Emotion

This study compared young and older adults' ability to recognize bodily and auditory expressions of emotion and to match bodily and facial expressions to vocal expressions. Using emotion discrimination and matching techniques, participants assessed emotion in voices (Experiment 1), point-light displays (Experiment 2), and still photos of bodies with faces digitally erased (Experiment 3). Older adults were worse, at least some of the time, at recognizing anger, sadness, fear, and happiness in bodily expressions and anger in vocal expressions. Compared with young adults, older adults also found it more difficult to match auditory expressions to facial expressions (5 of 6 emotions) and to bodily expressions (3 of 6 emotions).

from Psychology and Aging

IQ, fetal testosterone and individual variability in children’s functional lateralization

Previous event-related potential (ERP) studies have revealed that faces and words show a robust difference in the lateralization of their N170. The present study investigated the development of this differential lateralization in school-age boys. We assessed the potential role of fetal testosterone (FT) level as a factor biasing the prenatal development of lateralization, and the roles of reading skill and Verbal IQ as factors predicting left lateralization for words in childhood. The adult pattern of differential N170 lateralization for faces and words was not present in a group of 26 school-age boys, suggesting that N170 lateralization emerges only with years of experience with these stimulus categories or with late-childhood maturation. FT level measured by amniocentesis did not account for a significant part of the individual variability in lateralization. Verbal IQ correlated with the degree of left lateralization of the N170 to words, but this effect was not specific to language abilities and language lateralization. A strong correlation was observed between the degree of left lateralization for words and the degree of left lateralization for faces, and both lateralization scores correlated with Verbal and Performance IQ. Possible explanations for these results are discussed, along with ERP correlates of words and faces in school-age boys.

from Neuropsychologia
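
The reported link between Verbal IQ and left lateralization of the word N170 amounts to a Pearson correlation across subjects. A sketch with simulated values follows (n = 26 matches the study's sample size, but every number here is illustrative, not the paper's data):

```python
# Hedged sketch of a lateralization-IQ correlation analysis.
# All values are simulated for illustration only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
verbal_iq = rng.normal(105, 12, size=26)  # hypothetical Verbal IQ scores, 26 boys
# Hypothetical N170 lateralization index for words, weakly tied to Verbal IQ:
n170_li_words = 0.02 * (verbal_iq - 105) + rng.normal(0, 0.2, size=26)

r, p = pearsonr(verbal_iq, n170_li_words)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```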