
Superior temporal sulcus - It's my area: or is it?

Description: 

The superior temporal sulcus (STS) is the chameleon of the human brain. Several research areas claim the STS as the host brain region for their particular behavior of interest. Some see it as one of the core structures for theory of mind. For others, it is the main region for audiovisual integration. It plays an important role in biological motion perception, but is also claimed to be essential for speech processing and the processing of faces. We review the foci of activations in the STS from multiple functional magnetic resonance imaging studies, focusing on theory of mind, audiovisual integration, motion processing, speech processing, and face processing. The results indicate a differentiation of the STS region into an anterior portion, mainly involved in speech processing, and a posterior portion recruited by the cognitive demands of all these different research areas. The latter finding argues against a strict functional subdivision of the STS. In line with anatomical evidence from tracer studies, we propose that the function of the STS varies depending on the nature of network coactivations with different regions in the frontal cortex and medial temporal lobe. This view is more in keeping with the notion that the same brain region can support different cognitive operations depending on task-dependent network connections, emphasizing the role of network connectivity analysis in neuroimaging.
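
As a minimal sketch of the coordinate-based logic behind such a review, the following Python snippet sorts reported STS activation peaks into an anterior and a posterior portion by their MNI y-coordinate. The peak list and the y = -30 mm boundary are illustrative assumptions, not values from the paper.

```python
from collections import defaultdict

# (research domain, MNI x/y/z of a reported STS peak) -- hypothetical entries
peaks = [
    ("speech",            (-54,  -8, -4)),
    ("theory_of_mind",    (-50, -54, 18)),
    ("audiovisual",       ( 52, -40, 10)),
    ("biological_motion", ( 50, -48,  8)),
]

foci_by_portion = defaultdict(set)
for domain, (x, y, z) in peaks:
    # assumed boundary: peaks anterior to y = -30 mm count as anterior STS
    portion = "anterior" if y > -30 else "posterior"
    foci_by_portion[portion].add(domain)

for portion, domains in sorted(foci_by_portion.items()):
    print(f"{portion} STS: {sorted(domains)}")
```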

Effects of unexpected chords and of performer's expression on brain responses and electrodermal activity

Description: 

BACKGROUND: There is a lack of neuroscientific studies investigating music processing with naturalistic stimuli, and brain responses to real music are thus largely unknown.
METHODOLOGY/PRINCIPAL FINDINGS: This study investigates event-related brain potentials (ERPs), skin conductance responses (SCRs), and heart rate (HR) elicited by unexpected chords of piano sonatas, both as originally arranged by the composers and as played by professional pianists. From the musical excerpts played by the pianists (with emotional expression), we also created versions without variations in tempo and loudness (without musical expression) to investigate the effects of musical expression on ERPs and SCRs. Compared to expected chords, unexpected chords elicited an early right anterior negativity (ERAN, reflecting music-syntactic processing) and an N5 (reflecting the processing of meaning information) in the ERPs, as well as clear changes in the SCRs (reflecting that unexpected chords also elicited emotional responses). The ERAN was not influenced by emotional expression, whereas N5 potentials elicited by chords in general (regardless of their chord function) differed between the expressive and the non-expressive condition.
CONCLUSIONS/SIGNIFICANCE: These results show that the neural mechanisms of music-syntactic processing operate independently of the emotional qualities of a stimulus, justifying the use of stimuli without emotional expression to investigate the cognitive processing of musical structure. Moreover, the data indicate that musical expression affects the neural mechanisms underlying the processing of musical meaning. Our data are the first to reveal influences of musical performance on ERPs and SCRs, and to show physiological responses to unexpected chords in naturalistic music.
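
A minimal sketch of the ERP logic with MNE-Python, assuming a hypothetical recording file and trigger codes: epoch the EEG around chord onsets, then subtract the expected-chord average from the unexpected-chord average to expose ERAN- and N5-like difference waves.

```python
import mne

raw = mne.io.read_raw_fif("piano_sonatas_eeg.fif", preload=True)  # hypothetical file
events = mne.find_events(raw)
event_id = {"expected": 1, "unexpected": 2}  # assumed trigger codes

epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=1.0,
                    baseline=(None, 0), preload=True)
# Difference wave: unexpected minus expected chords
evoked_diff = mne.combine_evoked(
    [epochs["unexpected"].average(), epochs["expected"].average()],
    weights=[1, -1],
)
evoked_diff.plot()  # ERAN typically ~150-250 ms, N5 ~500-550 ms after onset
```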

Comparing the processing of music and language meaning using EEG and fMRI provides evidence for similar and distinct neural representations

Description: 

Recent demonstrations that music is capable of conveying semantically meaningful information have raised several questions as to what the underlying mechanisms of establishing meaning in music are, and whether the meaning of music is represented in a fashion comparable to language meaning. This paper presents evidence showing that expressed affect is a primary pathway to musical meaning and that meaning in music is represented in a very similar fashion to language meaning. In two experiments using EEG and fMRI, it was shown that single chords varying in harmonic roughness (consonance/dissonance), and thus in perceived affect, could prime the processing of subsequently presented affective target words, as indicated by an increased N400 and activation of the right middle temporal gyrus (MTG). Most importantly, however, when primed by affective words, single chords incongruous with the preceding affect also elicited an N400 and activated the right posterior STS, an area implicated in processing the meaning of a variety of signals (e.g. prosody, voices, motion). This provides an important piece of evidence that musical meaning is represented in a fashion very similar to, yet distinct from, language meaning: both elicit an N400, but they activate different portions of the right temporal lobe.
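
A minimal sketch of how the N400 priming effect is typically quantified, assuming baseline-corrected epoch arrays and centro-parietal channel indices (all hypothetical): average the voltage in a 300-500 ms post-target window and compare congruous with incongruous trials.

```python
import numpy as np
from scipy import stats

sfreq, epoch_start = 500.0, -0.2  # Hz; epochs assumed to start 200 ms pre-target
# epoch arrays: (n_trials, n_channels, n_samples), baseline-corrected microvolts
congruous = np.load("congruous_epochs.npy")      # hypothetical files
incongruous = np.load("incongruous_epochs.npy")

def n400_amplitude(epochs, t_min=0.3, t_max=0.5, channels=(10, 11, 12)):
    """Mean voltage in the N400 window over selected channels, per trial."""
    i0 = int((t_min - epoch_start) * sfreq)
    i1 = int((t_max - epoch_start) * sfreq)
    return epochs[:, list(channels), i0:i1].mean(axis=(1, 2))

t, p = stats.ttest_ind(n400_amplitude(incongruous), n400_amplitude(congruous))
print(f"incongruous vs. congruous N400: t={t:.2f}, p={p:.4f}")
```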

Shared neural resources between music and language indicate semantic processing of musical tension-resolution patterns

Description: 

Harmonic tension-resolution patterns have long been hypothesized to be meaningful to listeners familiar with Western music. Even though it has been shown that specifically chosen musical pieces can prime meaningful concepts, empirical evidence in favor of such a highly specific semantic pathway has been lacking. Here we show that 2 event-related potentials in response to harmonic expectancy violations, the early right anterior negativity (ERAN) and the N500, could be systematically modulated by simultaneously presented language material containing either a syntactic or a semantic violation. Whereas the ERAN was reduced only when presented concurrently with a syntactic language violation and not with a semantic one, this pattern was reversed for the N500. This is the first piece of evidence that tension-resolution patterns represent a route to meaning in music.

Is the extrastriate body area (EBA) sensitive to the perception of pain in others?

Description: 

Recent neuroimaging findings suggest a role of the extrastriate body area (EBA) in self/other distinction and in the perception of pain and emotions in others. The present functional magnetic resonance imaging study investigated whether EBA is modulated by the perception of pain in others. Participants were scanned during 2 consecutive sessions: 1) a localizer task precisely identifying EBA in each individual and 2) event-related trials in which participants watched pictures of pain (needle injections into human hands) inflicted on others or control stimuli showing hands not in pain. The perception of pain recruited large parts of the so-called pain matrix, documenting shared neural representations between the perception of pain in self and other. Both the needle injections and the control stimuli consistently activated bilateral EBA, replicating the involvement of this area in the perception of body parts. However, activation during the perception of painful stimuli did not differ from signal changes during perception of the control stimuli. This suggests that EBA is not specifically involved in empathy for pain.
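
A minimal sketch of the contrast logic with nilearn, fitting a first-level GLM to the event-related session and testing pain against no-pain pictures; file names, TR, onsets, and condition labels are illustrative assumptions, not the study's actual pipeline.

```python
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

events = pd.DataFrame({
    "onset":      [12.0, 30.0, 48.0, 66.0],  # seconds, assumed
    "duration":   [2.0] * 4,
    "trial_type": ["pain", "no_pain", "pain", "no_pain"],
})

model = FirstLevelModel(t_r=2.0, hrf_model="spm")
model = model.fit("run1_bold.nii.gz", events=events)        # hypothetical file
pain_vs_control = model.compute_contrast("pain - no_pain")  # z-map image
pain_vs_control.to_filename("pain_vs_no_pain_zmap.nii.gz")
```

The resulting z-map could then be interrogated within each participant's individually localized EBA.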

Time-resolved analysis of fMRI signal changes using Brain Activation Movies

Description: 

Conventional fMRI analyses summarize temporal information in terms of the coefficients of temporal basis functions. Based on established finite impulse response (FIR) analysis methodology, we show how spatiotemporal statistical parametric maps can be concatenated to form Brain Activation Movies (BAMs): dynamic activation maps representing the temporal evolution of brain activation throughout task performance. These BAMs enable comprehensive assessment of the dynamics of functional topology without restriction to predefined regions and without detailed information about the stimulus paradigm. We apply BAM visualization to two fMRI studies, demonstrating the additional spatiotemporal information available compared with standard presentation of fMRI results. We show that BAMs allow for unbiased data visualization, providing dynamic activation maps without assumptions about the neural activity other than reproducibility across trials. BAMs may thus be useful in the transition from static to dynamic brain mapping, widening the range of fMRI applications in neuroscience. In addition, BAMs might be helpful tools for visualizing the temporal evolution of activation in "real time", supporting a better and more intuitive understanding of temporal processes in the human brain.
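
A minimal sketch of the BAM idea with nilearn, under assumed file names, TR, and delays: fit an FIR model so that each post-stimulus delay gets its own beta map, then concatenate the per-delay effect maps into a 4D image whose frames can be played as a Brain Activation Movie.

```python
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel
from nilearn.image import concat_imgs

events = pd.DataFrame({"onset": [10.0, 40.0, 70.0],  # seconds, assumed
                       "duration": [0.0] * 3,
                       "trial_type": ["task"] * 3})

fir_delays = list(range(8))  # 0..7 TRs after stimulus onset
model = FirstLevelModel(t_r=2.0, hrf_model="fir", fir_delays=fir_delays)
model = model.fit("run1_bold.nii.gz", events=events)  # hypothetical file

# One statistical map per FIR delay; stacking them yields the movie frames.
frames = [model.compute_contrast(f"task_delay_{d}") for d in fir_delays]
concat_imgs(frames).to_filename("brain_activation_movie.nii.gz")
```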

Perspective taking is associated with specific facial responses during empathy for pain

Description: 

Witnessing the distress of others can result in both empathy and personal distress. Perspective-taking has been assigned a major role in the elicitation and modulation of these vicarious responses. However, little is known about how perspective-taking affects the psychophysiological correlates of empathy vs. personal distress. We recorded facial electromyographic and electrocardiographic activity while participants watched videos of patients undergoing painful sonar treatment. These videos were watched from two distinct perspectives: a) imagining the patient's feelings ('imagine other'), or b) imagining being in the patient's place ('imagine self'). The results revealed an unspecific frowning response as well as activity over the M. orbicularis oculi region that was specific to the 'imagine self' perspective. This indicates that the pain-related tightening of the patients' orbits was matched by participants when adopting this perspective. Our findings provide a physiological explanation for the more direct personal involvement and higher levels of personal distress associated with explicitly putting oneself into someone else's shoes. They provide further evidence that empathy not only relies on automatic processes, but is also strongly influenced by top-down control and cognitive processes.
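
A minimal sketch of standard facial-EMG preprocessing, assuming a common sampling rate and filter cutoffs rather than the paper's actual parameters: band-pass filter the raw signal, full-wave rectify it, and low-pass smooth the result into an amplitude envelope that can be compared between the 'imagine self' and 'imagine other' conditions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

sfreq = 1000.0  # Hz, assumed

def emg_envelope(raw, band=(20.0, 400.0), smooth_hz=10.0):
    """Band-pass, rectify, and low-pass an EMG trace (1-D array)."""
    b, a = butter(4, [band[0] / (sfreq / 2), band[1] / (sfreq / 2)], "bandpass")
    rectified = np.abs(filtfilt(b, a, raw))
    b, a = butter(4, smooth_hz / (sfreq / 2), "lowpass")
    return filtfilt(b, a, rectified)

# e.g. orbicularis oculi activity recorded under the two perspectives
imagine_self = emg_envelope(np.load("oo_imagine_self.npy"))    # hypothetical
imagine_other = emg_envelope(np.load("oo_imagine_other.npy"))  # files
print("mean envelope, self vs. other:",
      imagine_self.mean(), imagine_other.mean())
```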

Anatomy of the episodic buffer: a voxel-based morphometry study in patients with dementia

Description: 

In 2000, Baddeley proposed the existence of a new component of working memory, the episodic buffer, which should contribute to the on-line maintenance of integrated memory traces. The author assumed that this component should be critical for immediate recall of a short story that exceeds the capacity of the phonological store. Accordingly, patients with Alzheimer's dementia (AD) should suffer from a deficit of the episodic buffer when immediate recall (IR) of a short story is impossible. On the other hand, the episodic buffer should be somewhat preserved in such patients when some IR can occur (Baddeley and Wilson, 2002).

We adopted this logic for a voxel-based morphometry study, comparing the distributions of grey-matter density of two such groups of AD patients (with and without preserved immediate prose recall) and of a group of age-matched controls. We found that both AD groups had significant atrophy of the left mid-hippocampus; on the other hand, the anterior part of the hippocampus was significantly more atrophic in patients who were also impaired on the immediate prose recall task. Six out of ten patients with no immediate recall were nevertheless unimpaired on "central executive" tasks. Taken together, our findings suggest that the left anterior hippocampus contributes to the episodic buffer of the revised working memory model. We also suggest that the episodic buffer is somewhat independent of the central executive component of working memory.
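
A minimal sketch of the VBM group comparison, assuming preprocessed (normalized and smoothed) grey-matter density maps flattened to voxel vectors: run a voxel-wise two-sample t-test between the two AD groups; a real analysis would add multiple-comparison correction.

```python
import numpy as np
from scipy import stats

# grey-matter density maps: (n_subjects, n_voxels) -- hypothetical files
group_no_recall = np.load("ad_no_immediate_recall_gm.npy")
group_recall = np.load("ad_preserved_recall_gm.npy")

t_map, p_map = stats.ttest_ind(group_no_recall, group_recall, axis=0)
# Voxels where the no-recall group shows lower density (more atrophy)
atrophic = (t_map < 0) & (p_map < 0.001)  # uncorrected threshold, assumed
print(f"{atrophic.sum()} voxels more atrophic without immediate recall")
```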

Recognition profile of emotions in natural and virtual faces

Description: 

BACKGROUND: Computer-generated virtual faces are becoming increasingly realistic, including the simulation of emotional expressions. These faces can be used as well-controlled, realistic, and dynamic stimuli in emotion research. However, the validity of virtual facial expressions, compared with natural emotion displays, still needs to be demonstrated for different emotions and different age groups.
METHODOLOGY/PRINCIPAL FINDINGS: Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. The results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition.
CONCLUSIONS/SIGNIFICANCE: Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may lead to results even better than those achieved with trained actors. Given the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

Temporal characteristics of audiovisual information processing

Description: 

In complex natural environments, auditory and visual information often have to be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed for each voxel the latency at which the blood oxygenation level-dependent signal had the highest information content about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in the right Heschl's gyrus, the left primary visual cortex, and the posterior portion of the superior temporal gyrus, a region known to be involved in object-related AV integration. Informative activity in the anterior portion of the superior temporal gyrus, the middle temporal gyrus, the right occipital cortex, and the inferior frontal cortex was found at later latencies. Moreover, AV presentation resulted in shorter latencies in multiple cortical areas compared with isolated auditory or visual presentation. The results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
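
A minimal sketch of the information-theoretic latency measure for a single voxel, with shapes, bin count, and lag range as assumptions: compute the mutual information between the binned BOLD response at each post-stimulus lag and the stimulus label, and take the lag with maximal MI as that voxel's informative latency.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def informative_latency(bold, labels, n_bins=8):
    """bold: (n_trials, n_lags) responses of one voxel; labels: (n_trials,)."""
    mi_per_lag = []
    for lag in range(bold.shape[1]):
        binned = np.digitize(bold[:, lag],
                             np.histogram_bin_edges(bold[:, lag], n_bins))
        mi_per_lag.append(mutual_info_score(labels, binned))
    return int(np.argmax(mi_per_lag)), mi_per_lag

# hypothetical data: 120 trials (auditory, visual, audiovisual), 10 lags (TRs)
bold = np.random.randn(120, 10)
labels = np.repeat(["A", "V", "AV"], 40)
lag, mi = informative_latency(bold, labels)
print(f"highest-information lag: {lag} TRs (MI={mi[lag]:.3f})")
```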
