Blog Archives

Modulation of the motor system during visual and auditory language processing

Studies of embodied cognition have demonstrated the engagement of the motor system when people process action-related words and concepts. However, research using transcranial magnetic stimulation (TMS) to examine linguistic modulation in primary motor cortex has produced inconsistent results. Some studies report that action words produce an increase in corticospinal excitability; others, a decrease. Given the differences in methodology and modality, we re-examined this issue, comparing conditions in which participants either read or listened to the same set of action words. In separate blocks of trials, participants were presented with lists of words in either the visual or the auditory modality, and a TMS pulse was applied over left motor cortex either 150 or 300 ms after word onset. The motor evoked potentials (MEPs) elicited were larger following the presentation of action words than following control words. However, this effect was only observed when the words were presented visually; no changes in MEPs were found when the words were presented auditorily. A review of the TMS literature on action word processing reveals a similar modality effect on corticospinal excitability. We discuss different hypotheses that might account for this differential modulation of action semantics by vision and audition.

from Experimental Brain Research

The Auditory Dorsal Pathway: Orienting Vision

Over the last decade, a particularly prominent model of auditory cortical function has proposed that a dorsal brain pathway, emanating from the posterior auditory cortex, is primarily concerned with processing the spatial features of sounds. In the present paper, we outline some difficulties with a strict functional interpretation of this pathway and highlight the recent trend to understand it instead as a pathway that uses acoustic information to guide motor output towards objects of interest. In this spirit, we consider the possibility that some of the auditory spatial processing activity observed in the dorsal pathway may actually be understood as a form of action processing in which the visual system is guided to a particular location of interest. In this regard, attentional orientation may be considered a low-level form of action planning. Incorporating an auditory-guided motor aspect into the dorsal pathway not only offers a more holistic account of auditory processing, but also provides a more ecologically valid perspective on auditory processing in dorsal brain regions.

from Neuroscience and Biobehavioral Reviews

Re-examining the gesture engram hypothesis. New perspectives on apraxia of tool use

In everyday life, we repeatedly use the same tools (e.g., fork, hammer, coffee-maker), raising the question of whether we have to systematically recreate the idea of the manipulation associated with these tools. The gesture engram hypothesis offers a straightforward answer to this issue, suggesting that activation of gesture engrams provides a processing advantage by avoiding the need to reconstruct portions of the process de novo with each experience. At first glance, the gesture engram hypothesis appears very plausible. But behind this beguiling simplicity lies a set of unresolved difficulties: (1) What is the evidence in favour of the idea that the mere observation of a tool is sufficient to activate the corresponding gesture engram? (2) If tool use can be supported by a direct route between a structural description system and gesture engrams, what is the role of knowledge about tool function? (3) And, more importantly, what does it mean to store knowledge about how to manipulate tools? We begin by outlining some of the main formulations of the gesture engram hypothesis. Then, we address each of these issues in more detail. To anticipate our discussion, the gesture engram hypothesis proves clearly unsatisfactory, notably because of its incapacity to offer convincing answers to these different issues. We conclude by arguing that neuropsychology may greatly benefit from adopting the hypothesis that the idea of how to manipulate a tool is recreated de novo with each experience, thus opening interesting perspectives for future research on apraxia.

from Neuropsychologia

Critical brain regions for action recognition: lesion symptom mapping in left hemisphere stroke

A number of conflicting claims have been advanced regarding the role of the left inferior frontal gyrus, inferior parietal lobe and posterior middle temporal gyrus in action recognition, driven in part by an ongoing debate about the capacities of putative mirror systems that match observed and planned actions. We report data from 43 left hemisphere stroke patients in two action recognition tasks in which they heard and saw an action word (‘hammering’) and selected from two videoclips the one corresponding to the word. In the spatial recognition task, foils contained errors of body posture or movement amplitude/timing. In the semantic recognition task, foils were semantically related (e.g., sawing). Participants also performed a comprehension control task requiring matching of the same verbs to objects (hammer). Using regression analyses controlling for both the comprehension control task and lesion volume, we demonstrated that performance in the semantic gesture recognition task was predicted by per cent damage to the posterior temporal lobe, whereas performance in the spatial gesture recognition task was predicted by per cent damage to the inferior parietal lobule. A whole-brain voxel-based lesion-symptom mapping analysis suggested that the semantic and spatial gesture recognition tasks were associated with lesioned voxels in the posterior middle temporal gyrus and inferior parietal lobule, respectively. The posterior middle temporal gyrus appears to serve as a central node in the association of actions and meanings. The inferior parietal lobule, held to be a homologue of the monkey parietal mirror neuron system, is critical for encoding object-related postures and movements, a relatively circumscribed aspect of gesture recognition. The inferior frontal gyrus, on the other hand, was not predictive of performance in any task, suggesting that previous claims regarding its role in action recognition may require refinement.

from Brain

Unintended imitation in nonword repetition

Verbal repetition is conventionally considered to require motor reproduction of only the phonologically relevant content of a perceived linguistic stimulus, while imitation of incidental acoustic properties of the stimulus is not an explicit part of this task. Exemplar-based theories of speech processing, however, would predict that imitation beyond linguistic reproduction may occur in word repetition. Five experiments were conducted in which verbal audio-motor translations had to be performed under different conditions. Nonwords varying in phonemic content, in vocal pitch (F0), and in speaking style (schwa-syllable expression) were presented. We experimentally varied three factors: response delay (repetition vs. shadowing), intention-to-repeat (repetition vs. pseudo-naming), and phonological load (repetition vs. transformation). The responses of ten healthy participants were examined for phonemic accuracy and for traces of para-phonological imitation. Two aphasic patients with phonological impairments were also included, to determine whether lesions to left anterior or posterior perisylvian cortex interfere with imitation.

In the healthy participants, significant imitation of both F0 and phonetic style was observed, with markedly stronger effects for the latter. Strong imitation was also found in an aphasic patient with a lesion to left anterior perisylvian cortex, whereas almost no imitation occurred in a patient with a lesion to the posterior language area. The degree of unintended imitation was modulated by each of the three independent factors introduced here. The results are discussed against the background of cognitive and neurolinguistic theories of imitation.

from Brain and Language

Body Schemantics: On the role of the body schema in embodied lexical-semantic representations

Words denoting manipulable objects activate sensorimotor brain areas, likely reflecting action experience with the denoted objects. In particular, these sensorimotor lexical representations have been found to reflect the way in which an object is used. In the current paper we present data from two experiments (one behavioral and one neuroimaging) in which we investigate whether body schema information, putatively necessary for interacting with functional objects, is also recruited during lexical processing. To this end, we presented participants with words denoting objects that are typically brought towards or away from the body (e.g., cup or key, respectively). We hypothesized that objects typically brought to a location on the body (e.g., cup) are relatively more reliant on body schema representations, since the final goal location of the cup (i.e., the mouth) is represented primarily through posture and body coordinates. In contrast, objects typically brought to a location away from the body (e.g., key) are relatively more dependent on visuo-spatial representations, since the final goal location of the key (i.e., a keyhole) is perceived visually. The behavioral study showed that prior planning of a movement along an axis towards or away from the body facilitates processing of words with a congruent action-semantic feature (i.e., preparation of movement towards the body facilitates processing of cup). In an fMRI study we showed that words denoting objects brought towards the body engage brain areas involved in processing information about human bodies (i.e., the extrastriate body area, middle occipital gyrus and inferior parietal lobe) relatively more than words denoting objects typically brought away from the body. The results provide converging evidence that the body schema is implicitly activated in the processing of lexical information.

from Neuropsychologia

Co-speech gestures in a naming task: Developmental data

Few studies have explored the development of the gesture-speech system after the two-word stage. The aim of the present study is to examine developmental changes in speech and gesture use in the context of a simple naming task. Fifty-one children (age range: 2;3–7;6) were divided into five age groups and asked to name pictures representing objects, actions, or characteristics. Even in the context of a naming task that requires only the production of a single word, children produced pointing and representational gestures together with their spoken responses. Pointing was the most frequent gesture produced by all groups of children. Among representational gestures, action gestures were more frequent than size and shape gestures. In addition, gesture production declined as a function of increasing age and spoken lexical competence. Results are discussed in terms of the links between action, gesture, and language, and the ways in which these may change developmentally.

from Language and Cognitive Processes

Actions, Words, and Numbers: A Motor Contribution to Semantic Processing?

Recent findings in neuroscience challenge the view that the motor system is exclusively dedicated to the control of actions, and it has been suggested that it may contribute critically to conceptual processes such as those involved in language and number representation. The aim of this review is to address this issue by illustrating some interactions between the motor system and the processing of words and numbers. First, we detail functional brain imaging studies suggesting that motor circuits may be recruited to represent the meaning of action-related words. Second, we summarize a series of experiments demonstrating some interference between the size of grip used to grasp objects and the magnitude processing of words or numbers. Third, we report data suggestive of a common representation of numbers and finger movements in the adult brain, a possible trace of the finger-counting strategies used in childhood. Altogether, these studies indicate that the motor system interacts with several aspects of word and number representations. Future research should determine whether these findings reflect a causal role of the motor system in the organization of semantic knowledge.

from Current Directions in Psychological Science

Hands in the air: Using ungrounded iconic gestures to teach children conservation of quantity

Including gesture in instruction facilitates learning. Why? One possibility is that gesture points out objects in the immediate context and thus helps ground the words learners hear in the world they see. Previous work on gesture’s role in instruction has used gestures that either point to or trace paths on objects, thus providing support for this hypothesis. The experiments described here investigated the possibility that gesture helps children learn even when it is not produced in relation to an object but is instead produced “in the air.” Children were given instruction in Piagetian conservation problems with or without gesture and with or without concrete objects. The results indicate that children given instruction with speech and gesture learned more about conservation than children given instruction with speech alone, whether or not objects were present during instruction. Gesture in instruction can thus help learners learn even when those gestures do not direct attention to visible objects, suggesting that gesture can do more for learners than simply ground arbitrary, symbolic language in the physical, observable world.

from Developmental Psychology