Coherent motion perception was tested in nine adolescents with dyslexia and 10 control participants matched for age and IQ, using low-contrast stimuli at three levels of coherence (10%, 25% and 40%). Event-related potentials (ERPs) and behavioural performance data were obtained. No significant between-group differences were found in performance accuracy, in response latencies of correct responses, or in the early (P1, N1, and P2) or late (P3) ERP peaks. However, attenuated early ERPs in the 10% coherence condition correlated significantly with lower performance accuracy (r = −.66) and with the magnitude of literacy deficit (r = −.46). Copyright © 2011 John Wiley & Sons, Ltd.
Emotional meaning impacts word processing. However, it is unclear at which functional locus this influence occurs, and whether and how it depends on word class. These questions were addressed by recording event-related potentials (ERPs) in a lexical decision task with written adjectives, verbs, and nouns of positive, negative, and neutral emotional valence. In addition, word frequency (high vs. low) was manipulated. The early posterior negativity (EPN) in ERPs started earlier for emotional nouns and adjectives than for verbs. Depending on word class, EPN onsets coincided with or followed the lexicality effects. Main ERP effects of emotion overlapped with effects of word frequency between 300 and 550 ms but interacted with them only after 500 ms. These results indicate that in all three word classes examined, emotional evaluation as represented by the EPN has a post-lexical locus, starting only after a minimum of lexical access has taken place.
Objective: To evaluate the impact of multi-talker babble on cortical event-related potentials (ERPs), specifically the N400, in a spoken semantic priming paradigm. Design: Participants listened in quiet and with background babble to word triplets, evaluating whether the third word was related to the preceding words. A temporo-spatial principal component analysis was conducted on ERPs to the first and second words (S1 and S2), processed without an overt behavioral response. One factor corresponded to the N400 and revealed greater processing negativity for unrelated as compared to related S2s in quiet and in babble. Study sample: Twelve young adults with normal hearing. Results: Background babble had no significant impact on the N400 in the posterior region but increased neural processing negativity at anterior and central regions during the same timeframe. This differential processing negativity in babble occurred in response to S2 but not S1. Furthermore, background babble impacted processing negativity for related S2s more than unrelated S2s. Conclusions: Results suggest that speech processing in a modestly degraded listening environment alters neural activity associated with auditory working memory, attention, and semantic processing in anterior and central scalp regions.
from the International Journal of Audiology
CONCLUSION: Except for the N2-P3 amplitude, the parameters of the ABR, MLR and P300 remained stable in normal adults over a period of three months.
We investigated whether listeners are sensitive to (mis)matching accentuation patterns with respect to contrasts in the linguistic and visual context, using Event-Related Potentials. We presented participants with displays of two pictures followed by a spoken reference to one of these pictures (e.g., “the red ball”). The referent was contrastive with respect to the linguistic context (utterance in the previous trial: e.g., “the blue ball”) or with respect to the visual context (other picture in the display; e.g., a display with a red ball and a blue ball). The spoken reference carried a pitch accent on the noun (“the red BALL”) or on the adjective (“the RED ball”), or an intermediate (‘neutral’) accentuation. For the linguistic context, we found evidence for the Missing Accent Hypothesis: Listeners showed processing difficulties, in the form of increased negativities in the ERPs, for missing accents, but not for superfluous accents. ‘Neutral’ or intermediate accents were interpreted as ‘missing’ accents when they occurred late in the referential utterance, but not when they occurred early. For the visual context, we found evidence for the Missing Accent Hypothesis for a missing accent on the adjective (an increase in negativity in the ERPs) and a superfluous accent on the noun (no effect). However, a redundant color adjective (e.g., in the case of a display with a red ball and a red hat) led to fewer processing problems when the adjective carried a pitch accent.
Crossmodal interaction of facial and vocal person identity information: An event-related potential study
Hearing a voice and seeing a face are essential parts of person identification and social interaction. It has been suggested that the two types of information interact not only at late processing stages but already at the level of perceptual encoding (< 200 ms). The present study analysed when visual and auditory representations of person identity modulate the processing of voices. In unimodal trials, two successive voices (S1-S2) of the same or of two different speakers were presented. In the crossmodal condition, the S1 consisted of the face of the same or of a different person with respect to the following voice stimulus. Participants had to decide whether the voice probe (S2) was from an elderly or a young person. Reaction times to the S2 were shorter when these stimuli were person-congruent, in both the uni- and crossmodal conditions. ERPs recorded to person-incongruent as compared to person-congruent trials (S2) were enhanced at early (100–140 ms) and later processing stages (270–530 ms) in the crossmodal condition. A similar later negative ERP effect (270–530 ms) was found in the unimodal condition as well. These results suggest that identity information conveyed by a face is capable of modulating the sensory processing of voice stimuli.
from Brain Research
Age and amount of exposure to a foreign language during childhood: behavioral and ERP data on the semantic comprehension of spoken English by Japanese children
Children’s foreign-language (FL) learning is a matter of much social as well as scientific debate. Previous behavioral research indicates that starting language learning late in life can lead to problems in phonological processing. Inadequate phonological capacity may impede lexical learning and semantic processing (phonological bottleneck hypothesis). Using both behavioral and neuroimaging data, here we examine the effects of age of first exposure (AOFE) and total hours of exposure (HOE) to English, on 350 Japanese primary-school children’s semantic processing of spoken English. Children’s English proficiency scores and N400 event-related brain potentials (ERPs) were analyzed in multiple regression analyses. The results showed (1) that later, rather than earlier, AOFE led to higher English proficiency and larger N400 amplitudes, when HOE was controlled for; and (2) that longer HOE led to higher English proficiency and larger N400 amplitudes, whether AOFE was controlled for or not. These data highlight the important role of amount of exposure in FL learning, and cast doubt on the view that starting FL learning earlier always produces better results.
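For readers less familiar with the analysis, "when HOE was controlled for" means both exposure measures enter one multiple regression simultaneously, so each coefficient reflects the effect of one predictor holding the other constant. The sketch below is purely illustrative — the numbers and variable names are invented, not the study's data — and fits an ordinary least squares model with two predictors in plain Python:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Ordinary least squares via the normal equations X'X b = X'y."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][c] for i in range(n)) for c in range(p)] for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    return solve(XtX, Xty)

# Hypothetical toy data: amplitude = 2.0 - 0.3*AOFE + 0.5*HOE (noise-free)
aofe = [3, 5, 7, 9, 4, 6, 8, 10]      # age of first exposure (years)
hoe = [1, 4, 2, 8, 6, 3, 7, 5]        # hours of exposure (hundreds)
amp = [2.0 - 0.3 * a + 0.5 * h for a, h in zip(aofe, hoe)]
X = [[1.0, a, h] for a, h in zip(aofe, hoe)]  # intercept column first
b0, b_aofe, b_hoe = ols(X, amp)
# With noise-free data the fit recovers the generating coefficients:
# b0 ≈ 2.0, b_aofe ≈ -0.3, b_hoe ≈ 0.5
```

Because the toy outcome is generated without noise, the fitted coefficients recover the generating values exactly, which makes the "each predictor with the other held constant" interpretation easy to check.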
Using event-related potentials (ERPs), we investigated the N400 (an ERP component that occurs in response to meaningful stimuli) in children aged 8–10 years old and examined relationships between the N400 and individual differences in listening comprehension, word recognition and non-word decoding. Moreover, we tested the claim that the N400 effect provides a valuable indicator of behavioural vocabulary knowledge. Eighteen children were presented with picture-word pairs that were either ‘congruent’ (the picture depicted the spoken word) or ‘incongruent’ (they were unrelated). Three peaks were observed in the ERP waveform triggered to the onset of the picture-word stimuli: an N100 in fronto-central channels, an N200 in central–parietal channels and an N400 in frontal, central and parietal channels. In contrast to the N100 peak, the N200 and N400 peaks were sensitive to semantic incongruency with greater peak amplitudes for incongruent than congruent conditions. The incongruency effects for each peak correlated positively with listening comprehension but when the peak amplitudes were averaged across congruent/incongruent conditions they correlated positively with non-word decoding. These findings provide neurophysiological support for the position that sensitivity to semantic context (reflected in the N400 effect) is crucial for comprehension whereas phonological decoding skill relates to more general processing differences reflected in the ERP waveform. There were no correlations between ERP and behavioural measures of expressive or receptive vocabulary knowledge for the same items, suggesting that the N400 effect may not be a reliable estimate of vocabulary knowledge in children aged 8–10 years.
from Brain and Language
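The incongruency effect described above is typically quantified as the difference in mean amplitude between incongruent and congruent trials within a peak's time window, which can then be correlated with a behavioural score across participants. A minimal, purely illustrative sketch in plain Python — the amplitudes and comprehension scores below are invented, not the study's data:

```python
def mean(xs):
    return sum(xs) / len(xs)

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-child mean N400-window amplitudes (µV) per condition
congruent = [-4.1, -3.8, -5.0, -4.4, -3.9, -4.7]
incongruent = [-7.9, -5.1, -8.6, -6.9, -5.3, -8.2]
# Incongruency effect as congruent minus incongruent, so a larger positive
# value indexes a stronger extra negativity for incongruent trials
effect = [c - i for c, i in zip(congruent, incongruent)]
# Hypothetical listening-comprehension scores for the same children
comprehension = [31, 18, 35, 27, 20, 33]
r = pearson_r(effect, comprehension)  # positive r: larger effect, better score
```

The sign convention matters: computing the effect as congruent minus incongruent makes a bigger N400 effect a larger positive number, so a positive correlation with comprehension matches the pattern reported above.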
Attentional bias towards emotional linguistic material has been examined extensively with the emotion-word Stroop task. Although findings in clinical groups show an interference effect of emotional words that relate to the specific concern of the group, findings concerning healthy groups are less clear. In the present study, we investigated whether emotional Stroop interference in healthy individuals is affected by exposure of the words prior to the task. We used event-related potentials (ERPs) to examine the temporal aspects of Stroop interference. Participants took longer to indicate the colour of negative than of neutral words. Exposure of words prior to the Stroop task increased response latencies, but this effect was equal for neutral and negative words. At the neurophysiological level, we found more positive-going ERPs at later latencies (P290, N400 and LPP) in response to negative than in response to neutral Stroop words. The N400 was less negative for exposed than for new words, but this effect did not interact with the emotional valence of the words. For new (i.e., unexposed) words, the behavioural Stroop interference correlated with the P290, N400 and LPP emotion effects (negative minus neutral words). The successive ERP components suggest better prelexical, semantic, and sustained attentional processing of emotion words, even when the emotional content of the words is task-irrelevant.
Conflicts in language processing often correlate with late positive event-related brain potentials (ERPs), particularly when they are induced by inconsistencies between different information types (e.g. syntactic and thematic/plausibility information). However, under certain circumstances, similar sentence-level interpretation conflicts (inanimate subjects) engender negativity effects (N400s) instead. The present ERP study was designed to shed light on this inconsistency. In previous studies showing monophasic positivities (P600s), the conflict was irresolvable and induced via a verb, whereas N400s were elicited by resolvable, argument-induced conflicts. Here, we therefore examined irresolvable argument-induced conflicts (pronoun case violations) in simple English sentences. Conflict strength was manipulated via the animacy of the first argument and the agreement status of the verb. Processing conflicts engendered a biphasic N400-late positivity pattern, with only the N400 sensitive to conflict strength (animacy). These results suggest that argument-induced conflicts engender N400 effects (which we interpret in terms of increased competition for the Actor role), whereas irresolvable conflicts elicit late positivities (which we interpret as reflecting well-formedness categorisation).
The processing of phonological, orthographical and lexical information of Chinese characters in sentence contexts: An ERP study
In the current work, we aimed to study the processing of phonological, orthographical and lexical information of Chinese characters in sentence contexts, and to provide further evidence bearing on psychological models. In the experiment, we designed sentences ending in expected, homophonic, orthographically similar, synonymous or control characters. The results indicated that the P200 might be related to the early extraction of phonological information; it might also reflect immediate semantic and orthographic lexical access. This suggests a dual route in cognitive processing, in which a direct access route and a phonologically mediated access route both exist and interact with each other. The increased N400 under the control condition suggested that both phonological and orthographical information influence semantic integration in Chinese sentence comprehension. The two positive peaks of the late positive shift might reflect semantic monitoring and orthographical retrieval and reanalysis processing, respectively. Under the orthographically similar condition, orthographical retrieval and reanalysis were more difficult than under the other conditions, which suggests that there may be direct access from orthography to semantic representation. In conclusion, the direct access hypothesis, or more generally the dual-route hypothesis, better explains cognitive processing in the brain.
from Brain Research
In this study, we explored cerebral mechanisms during the computation of subject-verb agreement by measuring event-related potentials to French verb and pseudoverb targets preceded by various contexts. In auditory grammatical priming, the targets followed either a congruent predictive pronoun prime (nous prêtons, "we lend"), an incongruent predictive pronoun prime (vous prêtons, "you lend"), or a non-predictive prime (zous prêtons, "zous lend"). Whereas an early left anterior negativity (LAN) and a parietal positivity were modulated by the preceding context for verb targets, only the early negativity was sensitive to the context for pseudoverb targets. Interestingly, for verbs, the LAN response was larger at left frontocentral sites around 100 msec after the onset of the recognition point of the verbal inflection in the incongruent predictive condition relative to the two other conditions. This finding was in line with the behavioral results, suggesting that top-down processes contribute to the computation of subject-verb agreement. Moreover, at 160-210 msec after the onset of the recognition point of the verbal inflection, the parietal positivity was smaller in amplitude at left centroparietal sites for the incongruent predictive and non-predictive conditions. This was interpreted as reflecting bottom-up processes in the computation of subject-verb agreement. All findings thus suggest that top-down and bottom-up processes of subject-verb agreement computation occur with distinct temporal properties.
from Brain Research
Although hippocampal volume remains unaffected, IFS seems to induce functional changes in the MTL memory network, characterized by compensation of recollection by familiarity-based remembering.
This study aimed at investigating the effects of acoustic distance and of speaker variability on the pre-attentive and attentive perception of French vowels by French adult speakers. The electroencephalogram (EEG) was recorded while participants watched a silent movie (Passive condition) and discriminated deviant vowels (Active condition). The auditory sequence included 4 French vowels, /u/ (standard) and /o/, /y/ and /ø/ as deviants, produced by 3 different speakers. As the vowel /o/ is closer to /u/ in acoustic distance than the other deviants, we predicted a smaller mismatch negativity (MMN) and a smaller N1 component, as well as a higher error rate and longer reaction times. Results were in line with these predictions. Moreover, the MMN was elicited by all deviant vowels independently of speaker variability. By contrast, the Vowel by Speaker interaction was significant in the Active listening condition, showing that subtle within-category differences are processed at the attentive level. These results suggest that while vowels are categorized pre-attentively according to phonemic representations and independently of speaker variability, participants are sensitive to between-speaker differences when they focus attention on vowel processing.
from Brain Research
Reading emotional words within sentences: The impact of arousal and valence on event-related potentials
Effects of emotional word meaning have been studied exclusively for words in isolation, not in the context of sentences. We addressed this gap within the framework of two-dimensional models of affect, which conceive of emotion as a function of valence and arousal. Negative and neutral target verbs, embedded within sentences, were presented while event-related brain potentials (ERPs) and the activity of the Corrugator muscle were recorded. Twenty-one participants performed a semantic decision task on the target verbs. In contrast to single-word studies, no early posterior negativity was present. However, emotion effects in ERPs were evident in a late positive complex (LPC) for negative, high-arousal words in comparison to neutral words. Interestingly, the LPC was unaffected by pure arousal variation when valence was controlled for, indicating the importance of valence for this emotion-related ERP effect.
from the International Journal of Psychophysiology