Two groups of subjects classified as high vs. low in the need for power (n power) were assessed for augmenting versus reducing in the event-related potential (ERP) elicited by neutral and power-related words. Words at four different intensity levels in each of these two classes were randomly presented, and ERPs in response to each word class at each of the four intensity levels were computed from EEG recorded at Fz. The results indicated that the two groups responded differentially to the power-related vs. neutral words. High n power subjects showed reduction in response to both power-related and neutral words, while low n power subjects showed augmentation in response to the power-related words.
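
As a rough illustration of the ERP computation mentioned above, a minimal sketch with synthetic placeholder data (not the authors' recordings or analysis pipeline): an ERP for each word class and intensity level is the average of the time-locked EEG epochs for that condition, so stimulus-locked activity survives the averaging while unrelated EEG tends to cancel out.

```python
import numpy as np

# Hypothetical epoched EEG from the Fz electrode:
# epochs[i] is one trial, sampled for 500 samples after word onset.
rng = np.random.default_rng(0)
n_trials, n_samples = 320, 500
epochs = rng.normal(size=(n_trials, n_samples))          # placeholder EEG data
word_class = rng.choice(["neutral", "power"], n_trials)  # word category per trial
intensity = rng.integers(1, 5, n_trials)                 # intensity level 1-4 per trial

# Average all epochs belonging to one condition to obtain its ERP.
erps = {}
for cls in ("neutral", "power"):
    for level in range(1, 5):
        mask = (word_class == cls) & (intensity == level)
        erps[(cls, level)] = epochs[mask].mean(axis=0)

print({k: v.shape for k, v in erps.items()})  # 8 ERPs: 2 classes x 4 intensities
```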

Studies of emotion signaling inform claims about the taxonomic structure, evolutionary origins, and physiological correlates of emotions. Emotion vocalization research has tended to focus on a limited set of emotions: anger, disgust, fear, sadness, surprise, happiness, and, for the voice, tenderness. Here, we examine how well brief vocal bursts can communicate 22 different emotions: 9 negative (Study 1) and 13 positive (Study 2), and whether prototypical vocal bursts convey emotions more reliably than heterogeneous vocal bursts (Study 3). Results show that vocal bursts communicate emotions such as anger, fear, and sadness, as well as seldom-studied states such as awe, compassion, interest, and embarrassment. Ancillary analyses reveal family-wise patterns of vocal burst expression: errors in classification were more common within emotion families (e.g., 'self-conscious,' 'pro-social') than between emotion families. The three studies reported highlight the voice as a rich modality for emotion display that can inform fundamental constructs about emotion.
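
A minimal sketch of the family-wise error tally described above, using made-up emotion labels, an illustrative family mapping, and a synthetic classifier output rather than the study's data: misclassifications are split into those that stay within the true emotion's family and those that cross into another family.

```python
import numpy as np

# Purely illustrative family mapping; not the study's taxonomy.
family = {"shame": "self-conscious", "embarrassment": "self-conscious",
          "compassion": "pro-social", "gratitude": "pro-social",
          "anger": "negative", "fear": "negative"}

rng = np.random.default_rng(2)
emotions = list(family)
true = rng.choice(emotions, 200)                               # intended emotion per burst
guess = np.where(rng.random(200) < 0.6, true, rng.choice(emotions, 200))  # simulated judgments

# Count errors that remain inside the true emotion's family vs. cross families.
errors = true != guess
within = sum(family[t] == family[g] for t, g in zip(true[errors], guess[errors]))
between = errors.sum() - within
print(f"within-family errors: {within}, between-family errors: {between}")
```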

BACKGROUND: Recent studies have highlighted the role of right-sided anterior temporal and prefrontal activation during anxiety, yet no study of social phobics has assessed regional brain and autonomic function. This study compared electroencephalograms (EEGs) and autonomic activity in social phobics and controls while they anticipated making a public speech. METHODS: Electroencephalograms from 14 scalp locations, heart rate, and blood pressure were recorded while 18 DSM-IV social phobics and 10 controls anticipated making a public speech, as well as immediately after the speech was made. Self-reports of anxiety and affect were also obtained. RESULTS: Phobics showed a significantly greater increase in anxiety and negative affect during the anticipation condition compared with controls. Heart rate was elevated in the phobics relative to the controls in most conditions. Phobics showed a marked increase in right-sided activation in the anterior temporal and lateral prefrontal scalp regions. These heart rate and EEG changes together accounted for more than 48% of the variance in the increase in negative affect during the anticipation phase. CONCLUSIONS: These findings support the hypothesis of right-sided anterior cortical activation during anxiety and indicate that the combination of EEG and heart rate changes during anticipation accounts for substantial variance in reported negative affect.
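
The "variance accounted for" figure reported above is the R² of a multiple regression of the change in negative affect on heart-rate and EEG predictors. A minimal sketch of that computation with synthetic numbers (hypothetical predictors and coefficients, not the authors' data or exact model):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 28                                               # 18 phobics + 10 controls
heart_rate_change = rng.normal(8, 4, n)              # placeholder predictor 1
eeg_asymmetry_change = rng.normal(0.3, 0.2, n)       # placeholder predictor 2
negative_affect_change = (0.5 * heart_rate_change
                          + 6.0 * eeg_asymmetry_change
                          + rng.normal(0, 3, n))     # placeholder outcome

# Ordinary least squares with an intercept, then R^2 = 1 - SS_res / SS_tot.
X = np.column_stack([np.ones(n), heart_rate_change, eeg_asymmetry_change])
beta, *_ = np.linalg.lstsq(X, negative_affect_change, rcond=None)
predicted = X @ beta
ss_res = np.sum((negative_affect_change - predicted) ** 2)
ss_tot = np.sum((negative_affect_change - negative_affect_change.mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.2f}")            # proportion of variance explained
```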