Blog Archives

The influence of emotional associations on the neural correlates of semantic priming

Emotions influence our everyday life in several ways. With the present study, we wanted to examine the impact of emotional information on neural correlates of semantic priming, a well-established technique to investigate semantic processing. Stimuli were presented with a short stimulus onset asynchrony (SOA) of 200 ms as subjects performed a lexical decision task during fMRI measurement. Seven experimental conditions were compared: positive/negative/neutral related, positive/negative/neutral unrelated, and nonwords (all words were nouns). Behavioral data revealed a valence-specific semantic priming effect (i.e., unrelated > related) only for neutral and positive related word pairs. On the neural level, the comparison of emotional over neutral relations showed activation in the left anterior medial frontal cortex, superior frontal gyrus, and posterior cingulate. Interactions for the different relations were located in the left anterior part of the medial frontal cortex, cingulate regions, and right hippocampus (positive > neutral + negative) and in the left posterior part of the medial frontal cortex (negative > neutral + positive). The results showed that emotional information has an influence on semantic association processes. While positive and neutral information seem to share a semantic network, negative relations might induce compensatory mechanisms that inhibit the spread of activation between related concepts. The neural correlates highlighted a distributed neural network, primarily involving attention, memory, and emotion-related processing areas in medial fronto-parietal cortices. The differentiation between the anterior (positive) and posterior (negative) parts of the medial frontal cortex was linked to the type of affective manipulation, with more cognitive demands being involved in the automatic processing of negative information.

from Human Brain Mapping
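
The behavioural priming effect reported above is simply the difference between mean lexical decision times for unrelated and related word pairs, computed separately for each valence. A minimal sketch of that computation, using fabricated per-subject reaction times rather than the study's data (condition names and values are illustrative assumptions):

```python
# Illustrative only: valence-specific priming effects (unrelated - related RT)
# computed from hypothetical per-subject mean RTs, not the study's data.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_subjects = 20

# Hypothetical mean RTs (ms) per subject and condition.
rts = {
    ("positive", "related"): rng.normal(600, 40, n_subjects),
    ("positive", "unrelated"): rng.normal(630, 40, n_subjects),
    ("neutral", "related"): rng.normal(610, 40, n_subjects),
    ("neutral", "unrelated"): rng.normal(640, 40, n_subjects),
    ("negative", "related"): rng.normal(625, 40, n_subjects),
    ("negative", "unrelated"): rng.normal(628, 40, n_subjects),
}

for valence in ("positive", "neutral", "negative"):
    related = rts[(valence, "related")]
    unrelated = rts[(valence, "unrelated")]
    priming = unrelated - related            # per-subject priming effect
    t, p = ttest_rel(unrelated, related)     # paired t-test across subjects
    print(f"{valence:>8}: priming = {priming.mean():5.1f} ms, t = {t:4.2f}, p = {p:.3f}")
```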

Is developmental dyslexia modality specific? A visual-auditory comparison of Italian dyslexics

Although developmental dyslexia is often referred to as a cross-modal disturbance, tests of different modalities using the same stimuli are lacking. We compared the performance of 23 children with dyslexia and 42 age-matched control readers on reading versus repetition tasks and on visual versus auditory lexical decision using the same stimuli. Relative to control readers, children with dyslexia were impaired only on stimuli in the visual modality; they showed no deficit on the repetition and auditory lexical decision tasks. By applying the rate-amount model (Faust et al., 1999), we showed that the performance of children with dyslexia on visual (but not auditory) tasks was related to that of control readers by a linear function (slope = 1.78), suggesting that a global factor accounts for visual (but not auditory) task performance.

from Neuropsychologia
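
The rate-amount comparison described above amounts to regressing one group's condition means on the other group's condition means and asking whether a single linear function (here with a 1.78 slope) captures the visual conditions. A hedged sketch of that kind of fit, with invented condition means standing in for the study's data:

```python
# Illustrative only: fits a linear function relating dyslexic to control
# condition means, in the spirit of the rate-amount model (Faust et al., 1999).
from scipy.stats import linregress

# Hypothetical mean RTs (ms) per visual condition: control vs. dyslexic group.
control_means = [550, 600, 680, 720, 800]
dyslexic_means = [700, 790, 930, 1000, 1140]

fit = linregress(control_means, dyslexic_means)
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.0f} ms, r^2 = {fit.rvalue**2:.3f}")
# A good linear fit across conditions is taken as evidence that a single
# global factor scales the impaired group's performance in that modality.
```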

Error Detection Mechanism for Words and Sentences: A comparison between readers with dyslexia and skilled readers

The activity level of the error monitoring system during the processing of isolated versus contextual words in Hebrew was studied in adults with dyslexia and in skilled readers as they committed reading errors. Behavioural and event-related potential measures were recorded during a lexical decision task using words presented in lists and in sentences. Error-related negativity (ERN/Ne) potentials following reading errors and correct-related negativity following correct responses were detected in all conditions and participants. However, ERN/Ne amplitudes were smaller for readers with dyslexia than for skilled readers, and smaller for reading sentences than for words in a list. These results support previous findings of lower activation of the error detection mechanism among dyslexics and point to different activity levels for words and sentences. A theory on the underlying factors of dyslexia is proposed.

from the International Journal of Disability, Development and Education

Impaired word recognition in Alzheimer’s disease: the role of age of acquisition

Studies of word production in patients with Alzheimer’s disease have identified the age of acquisition of words as an important predictor of retention or loss, with early acquired words remaining accessible for longer than later acquired words. If, as proposed by current theories, effects of age of acquisition reflect the involvement of semantic representations in task performance, then some aspects of word recognition in patients with Alzheimer’s disease should also be better for early than for later acquired words. We employed a version of the lexical decision task which we term the lexical selection task. This required participants to indicate which of four items on a page was a real word (the three ‘foils’ being orthographically plausible nonwords). Twenty-two patients with probable Alzheimer’s disease were compared with an equal number of matched controls. The controls made few errors on the test, demonstrating that they were cognitively intact and that the words were familiar to participants of their age and level of education. The Alzheimer patients were impaired overall and correctly recognized fewer late than early acquired words. Performance of the Alzheimer patients on the lexical selection task correlated significantly with their scores on the Mini Mental State Examination. Word recognition becomes impaired as Alzheimer’s disease progresses, at which point effects of age of acquisition can be observed in the accuracy of performance.

from Neuropsychologia
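
The reported link between lexical selection performance and dementia severity is a simple correlation between per-patient accuracy and Mini Mental State Examination score. A toy sketch with fabricated values (no patient data; the numbers are mine):

```python
# Illustrative only: correlating lexical selection accuracy with MMSE scores
# across 22 hypothetical patients (fabricated values, not the study's data).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
mmse = rng.integers(14, 28, 22)                                   # hypothetical MMSE scores
accuracy = np.clip(0.5 + 0.015 * mmse + rng.normal(0, 0.05, 22), 0, 1)  # proportion correct

r, p = pearsonr(mmse, accuracy)
print(f"r = {r:.2f}, p = {p:.3f}")
```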

Verb impairment in aphasia: A priming study of body-part overlap

A group of verb-impaired aphasic individuals was able to automatically (and rapidly) activate somatotopic features of verbs, showing little evidence of impaired lexical-semantic representations. Hence verb processing and verb naming were found to dissociate. In addition, this study extends our understanding of language processing by showing that actions are simulated by the human brain even when verbs are encountered as de-contextualised single words. Further, somatotopic information is necessary, but not sufficient, for action simulation.

from Aphasiology

Derivational morphology and base morpheme frequency

Morpheme frequency effects for derived words (e.g. an influence of the frequency of the base “dark” on responses to “darkness”) have been interpreted as evidence of morphemic representation. However, it has been suggested that most derived words would not show these effects if family size (a type frequency count claimed to reflect semantic relationships between whole forms) were controlled. This study used visual lexical decision experiments with correlational designs to compare the influences of base morpheme frequency and family size on response times to derived words in English and to test for interactions of these variables with suffix productivity. Multiple regression showed that base morpheme frequency and family size were independent predictors of response times to derived words. Base morpheme frequency facilitated responses, but only to productively suffixed derived words, whereas family size facilitated responses irrespective of productivity. This suggests that base morpheme frequency effects are independent of morpheme family size and depend on suffix productivity, and indicates that productively suffixed words are represented as morphemes.

from the Journal of Memory and Language
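
The key analysis above is a multiple regression with base morpheme frequency and family size as simultaneous predictors of lexical decision time, plus an interaction with suffix productivity. A minimal sketch with statsmodels on fabricated item-level data (the variable names and the generating pattern are my assumptions, not the paper's materials):

```python
# Illustrative only: multiple regression of lexical decision RT on base
# frequency, family size, and suffix productivity (synthetic item data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_items = 400

df = pd.DataFrame({
    "log_base_freq": rng.normal(2.0, 0.8, n_items),
    "log_family_size": rng.normal(1.2, 0.5, n_items),
    "productive": rng.integers(0, 2, n_items),   # 1 = productive suffix
})
# Fabricate RTs in which base frequency helps only for productive suffixes,
# while family size helps regardless (mirroring the reported pattern).
df["rt"] = (700
            - 25 * df["log_base_freq"] * df["productive"]
            - 15 * df["log_family_size"]
            + rng.normal(0, 30, n_items))

model = smf.ols("rt ~ log_base_freq * productive + log_family_size", data=df).fit()
print(model.summary().tables[1])
```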

Aging Influences the Neural Correlates of Lexical Decision but Not Automatic Semantic Priming

Human behavioral data indicate that older adults are slower to perform lexical decisions (LDs) than young adults but show similar reaction time gains when these decisions are primed semantically. The present study explored the functional neuroanatomic bases of these frequently observed behavioral findings. Young and older groups completed unprimed and primed LD tasks while functional magnetic resonance imaging (fMRI) data were acquired, using a fully randomized trial design paralleling those used in behavioral research. Results from the unprimed task showed that age-related slowing of LD was associated with decreased activation in perceptual extrastriate regions and increased activation in regions associated with higher-level linguistic processes, including prefrontal cortex. In contrast to these age-related changes in brain activation, the older group showed a preserved pattern of fMRI decreases in inferior temporal cortex when LD was primed semantically. These findings provide evidence that older adults’ LD abilities benefit from contexts that reduce the need for frontally mediated strategic processes and capitalize on the continued sensitivity of inferior temporal cortex to automatic semantic processes in aging.

from Cerebral Cortex

“Pre-semantic” cognition revisited: Critical differences between semantic aphasia and semantic dementia

Patients with semantic dementia show a specific pattern of impairment on both verbal and non-verbal “pre-semantic” tasks: e.g., reading aloud, past tense generation, spelling to dictation, lexical decision, object decision, colour decision and delayed picture copying. All seven tasks are characterised by poorer performance for items that are atypical of the domain and “regularisation errors” (irregular/atypical items are produced as if they were domain-typical). The emergence of this pattern across diverse tasks in the same patients indicates that semantic memory plays a key role in all of these types of “pre-semantic” processing. However, this claim remains controversial because semantically-impaired patients sometimes fail to show an influence of regularity. This study demonstrates that (a) the location of brain damage and (b) the underlying nature of the semantic deficit affect the likelihood of observing the expected relationship between poor comprehension and regularity effects. We compared the effect of multimodal semantic impairment in the context of semantic dementia and stroke aphasia on the seven “pre-semantic” tasks listed above. In all of these tasks, the semantic aphasia patients were less sensitive to typicality than the semantic dementia patients, even though the two groups obtained comparable scores on semantic tests. The semantic aphasia group also made fewer regularisation errors and many more unrelated and perseverative responses. We propose that these group differences reflect the different locus for the semantic impairment in the two conditions: patients with semantic dementia have degraded semantic representations, whereas semantic aphasia patients show deregulated semantic cognition with concomitant executive deficits. These findings suggest a reinterpretation of single case studies of comprehension-impaired aphasic patients who fail to show the expected effect of regularity on “pre-semantic” tasks. Consequently, such cases do not demonstrate the independence of these tasks from semantic memory.

from Neuropsychologia

Individual differences in the joint effects of semantic priming and word frequency revealed by RT distributional analyses: The role of lexical integrity

Word frequency and semantic priming effects are among the most robust effects in visual word recognition, and it has been generally assumed that these two variables produce interactive effects in lexical decision performance, with larger priming effects for low-frequency targets. The results from four lexical decision experiments indicate that the joint effects of semantic priming and word frequency are critically dependent upon differences in the vocabulary knowledge of the participants. Specifically, across two universities, additive effects of the two variables were observed in means, and in RT distributional analyses, in participants with more vocabulary knowledge, while interactive effects were observed in participants with less vocabulary knowledge. These results are discussed with reference to the multistage activation model of Borowsky and Besner (1993, Journal of Experimental Psychology: Learning, Memory, and Cognition, 19, 813–840) and the single-mechanism account of Plaut and Booth (2000, Psychological Review, 107, 786–823). In general, the findings are also consistent with a flexible lexical processing system that optimizes performance based on processing fluency and task demands.

from the Journal of Memory and Language
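
An RT distributional analysis of the kind referred to above typically bins each condition's reaction times into quantiles (vincentiles) and asks whether the priming effect is constant across the distribution for both frequency bands (additivity) or grows in the slower quantiles for low-frequency targets (an interaction). A toy sketch, with right-skewed RTs that I fabricate here rather than the experiments' data:

```python
# Illustrative only: quantile (vincentile) analysis of priming x frequency
# on fabricated lexical decision RTs.
import numpy as np

rng = np.random.default_rng(2)
quantiles = [0.1, 0.3, 0.5, 0.7, 0.9]

def sample_rts(mu, n=200):
    """Fabricate right-skewed RTs (ms), roughly like lexical decision data."""
    return mu + rng.gamma(shape=2.0, scale=60.0, size=n)

conditions = {
    ("high_freq", "related"): sample_rts(520),
    ("high_freq", "unrelated"): sample_rts(550),
    ("low_freq", "related"): sample_rts(600),
    ("low_freq", "unrelated"): sample_rts(640),
}

for freq in ("high_freq", "low_freq"):
    rel = np.quantile(conditions[(freq, "related")], quantiles)
    unrel = np.quantile(conditions[(freq, "unrelated")], quantiles)
    priming = unrel - rel   # priming effect at each quantile
    print(freq, np.round(priming, 1))
# Roughly constant priming across quantiles in both frequency bands suggests
# additive effects; priming that grows in the slow quantiles for low-frequency
# targets suggests an interaction.
```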

Impaired semantic inhibition during lexical ambiguity repetition in Parkinson’s disease

Impairments of semantic processing and inhibition have been observed in Parkinson’s disease (PD); however, the consequences of faulty meaning selection and suppression have not been considered in terms of subsequent lexical processing. The present study employed a lexical ambiguity repetition paradigm in which the first presentation of an ambiguous word paired with a target biasing its dominant or subordinate meaning (e.g., bank – money or bank – river) was followed, after several intervening trials, by a presentation of the same ambiguous word paired with a different target biasing either the same (congruent) or a different (incongruent) meaning to the one biased on the first presentation. Meaning dominance (dominant vs. the weaker, subordinate meaning) and interstimulus interval (ISI) were manipulated. Analyses conducted on the second presentation indicated priming of congruent meanings and no priming of incongruent meanings at both short and long ISIs in the healthy controls, consistent with suppression of meanings competing with the representation biased on the first presentation. In contrast, the PD group failed to dampen activation for the incongruent meaning at the long ISI when the first presentation was subordinate. This pattern is consistent with an impairment of meaning suppression that is observed under controlled processing conditions and varies as a function of the meaning dominance of the first presentation. These findings further refine our understanding of lexical-semantic impairments in PD and suggest a mechanism that may contribute to discourse comprehension impairments in this population.

from Cortex

Visual word recognition of multisyllabic words

The visual word recognition literature has been dominated by the study of monosyllabic words in factorial experiments, computational models, and megastudies. However, it is not yet clear whether the behavioral effects reported for monosyllabic words generalize reliably to multisyllabic words. Hierarchical regression techniques were used to examine the effects of standard variables (phonological onsets, stress pattern, length, orthographic N, phonological N, word frequency) and additional variables (number of syllables, feedforward and feedback phonological consistency, novel orthographic and phonological similarity measures, semantics) on the pronunciation and lexical decision latencies of 6115 monomorphemic multisyllabic words. These predictors accounted for 61.2% and 61.6% of the variance in pronunciation and lexical decision latencies, respectively, higher than the estimates reported by previous monosyllabic studies. The findings we report represent a well-specified set of benchmark phenomena for constraining nascent multisyllabic models of English word recognition.

from the Journal of Memory and Language
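
Hierarchical regression in this context means entering predictor blocks in a fixed order and reporting how much additional variance in the latencies each block explains. A small sketch of the bookkeeping, with fabricated item-level predictors standing in for the study's variables (names and coefficients are my assumptions):

```python
# Illustrative only: hierarchical (blockwise) regression on fabricated
# item-level lexical decision latencies.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "length": rng.integers(4, 12, n),
    "log_freq": rng.normal(2.0, 1.0, n),
    "n_syllables": rng.integers(2, 5, n),
    "consistency": rng.uniform(0, 1, n),
})
df["rt"] = (650 + 8 * df["length"] - 30 * df["log_freq"]
            + 10 * df["n_syllables"] - 20 * df["consistency"]
            + rng.normal(0, 40, n))

blocks = [
    "rt ~ length + log_freq",                              # standard variables
    "rt ~ length + log_freq + n_syllables",                 # + syllable count
    "rt ~ length + log_freq + n_syllables + consistency",   # + consistency
]
prev_r2 = 0.0
for formula in blocks:
    r2 = smf.ols(formula, data=df).fit().rsquared
    print(f"{formula:<55} R2 = {r2:.3f}  (+{r2 - prev_r2:.3f})")
    prev_r2 = r2
```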

Regional and Foreign Accent Processing in English: Can Listeners Adapt?

Recent data suggest that the first presentation of a foreign accent triggers a delay in word identification, followed by a subsequent adaptation. This study examines under what conditions the delay returns to baseline level. The delay was experimentally induced by presenting listeners with sentences spoken in a foreign or a regional accent as part of a lexical decision task on words placed at the end of the sentences. Using a blocked design of accent presentation, Experiment 1 shows that accent changes cause a temporary perturbation in reaction times, followed by a smaller but long-lasting delay. Experiment 2 shows that the initial perturbation is dependent on participants’ expectations about the task. Experiment 3 confirms that the subsequent long-lasting delay in word identification does not habituate after repeated exposure to the same accent. The results suggest that the comprehensibility of accented speech, as measured by reaction times, does not benefit from accent exposure, contrary to intelligibility.

from the Journal of Psycholinguistic Research

Language Conflict in the Bilingual Brain

The large majority of humankind is more or less fluent in two or even more languages. This raises the fundamental question of how the language network in the brain is organized such that the correct target language is selected on a particular occasion. Here we present behavioral and functional magnetic resonance imaging data showing that bilingual processing leads to language conflict in the bilingual brain even when the bilinguals’ task only required target language knowledge. This finding demonstrates that the bilingual brain cannot avoid language conflict, because words from the target and nontarget languages become automatically activated during reading. Importantly, stimulus-based language conflict was found in regions of the left inferior prefrontal cortex (LIPC) associated with phonological and semantic processing, whereas response-based language conflict was found only in the pre-supplementary motor area/anterior cingulate cortex, when language conflict led to response conflict.

from Cerebral Cortex
