Any emotion, if it is sincere, is involuntary. ~ Mark Twain

When dealing with people, remember you are not dealing with creatures of logic, but creatures of emotion. ~ Dale Carnegie
Chapter 10
Emotion
OUTLINE
What Is an Emotion?
Neural Systems Involved in Emotion Processing
Categorizing Emotions
Theories of Emotion Generation
The Amygdala
Interactions Between Emotion and Other Cognitive Processes
Get a Grip! Cognitive Control of Emotion
Other Areas, Other Emotions
Unique Systems, Common Components
AT AGE 42, the last time that S.M. remembered actually being scared was when she was 10. This was not because she had not been in frightening circumstances; in fact, she had been in plenty. She had been held at both knife- and gun-point, physically accosted by a woman twice her size, and nearly killed in a domestic violence attack, among other experiences (Feinstein et al., 2011).
Oddly enough, S.M. didn't really notice that things don't frighten her. What she did notice, beginning at age 20, were seizures. A CT scan and an MRI revealed that both of S.M.'s amygdalae were severely atrophied (Figure 10.1). Further tests revealed that she had a rare autosomal recessive genetic disorder, Urbach–Wiethe disease, which leads to degeneration of the amygdalae (Adolphs et al., 1994, 1995; Tranel & Hyman, 1990), typically with an onset around 10 years of age. The deterioration of her amygdalae was highly specific; surrounding white matter showed minimal damage. On standard neuropsychological tests, her intelligence scores were in the normal range, and she had no perceptual or motor problems. Something curious popped up, however, when her emotional processing was tested. S.M. was shown a large set of photographs and asked to judge the emotion being expressed by the individuals in the pictures. She accurately identified expressions conveying sadness, anger, disgust, happiness, and surprise. But one facial expression stumped her: fear (see Figure 10.5 for a similar example of a fearful expression). S.M. seemed to know that some emotion was being expressed, and she was capable of recognizing facial identities (Adolphs et al., 1994), but she was not able to recognize fear in facial expressions. She had another baffling deficit as well. When asked to draw pictures depicting different emotions, she was able to provide reasonable cartoons of a range of states, except when asked to depict fear. When prodded to try, she scribbled for a few minutes, only to reveal a picture of a baby crawling, though she couldn't say why she had produced this image (Figure 10.2).
FIGURE 10.1 Bilateral amygdala damage in patient S.M.
The white arrows indicate where the amygdalae are located in the right and left hemispheres. Patient S.M. has severe atrophy of the amygdalae, and the lost brain tissue has been replaced by cerebrospinal fluid (black).
One tantalizing possibility was that S.M. was unable to process the concept of fear. This idea was rejected, however, because she was able to describe situations that would elicit fear, used words describing fear properly (Adolphs et al., 1995), and had no trouble labeling fearful tones of voice (Adolphs & Tranel, 1999). She even stated that she “hated” snakes and spiders and tried to avoid them (Feinstein et al., 2011). Although S.M. was able to describe fear and had indicated she was afraid of snakes, when researchers objectively investigated her fear reactions, they found that her experience of fear was greatly reduced. She was taken to an exotic pet store that had a large collection of snakes and spiders. Contrary to her declarations, she spontaneously went to the snake terrariums and was very curious and captivated. She readily held one, rubbed its scales, touched its tongue, and commented, “This is so cool!” (Feinstein et al., 2011). What’s more, she repeatedly asked if she could touch the larger snakes (some of them venomous). While handling the snake, she reported her fear rating was never more than 2 on a 0–10 scale. Other attempts to elicit fear in S.M., such as going to a haunted house or watching a scary film, received a rating of zero, though she knew that other people would consider the experiences scary. Thus, her inability to experience fear was not the result of misunderstanding the concept of fear or not recognizing it. She did exhibit appropriate behavior when viewing film clips meant to induce all the other emotions, so it wasn’t that she had no emotional experience. Nor was it because she had never experienced fear. She described being cornered by a growling Doberman pinscher when she was a child (before her disease manifested itself), screaming for her mother and crying, along with all the accompanying visceral fear reactions. Perhaps this is why she drew a toddler when asked to depict fear. It was not for lack of real-life fear-inducing episodes, either. S.M. had experienced those events we described earlier. In fact, her difficulty in detecting and avoiding threatening situations had probably resulted in her being in them more often than most people. These observations appeared to rule out a generalized conceptual deficit: She understood the notion; she just didn’t experience it.
Another interesting facet of S.M.’s behavior is that after being extensively studied for over 20 years, she continues to have no insight into her deficit and is unaware that she still becomes involved in precarious situations. It seems that because she cannot experience fear, she does not avoid such situations. (It sounds like the interpreter system is not getting any input about feeling fear; see Chapter 4.) What can be surmised about the amygdala and emotional processing from S.M.’s case?
FIGURE 10.2 S.M.’s deficit in comprehending fear is also observed on a production task.
It is difficult to understand who we are or how we interact with the world without considering our emotional lives. Under the umbrella of cognitive neuroscience, the study of emotion was slow to emerge because, for a number of reasons, emotion is difficult to study systematically. For a long time, emotion was considered subjective to the individual and thus not amenable to empirical analysis. Researchers eventually realized that conscious emotions arise from unconscious processes that can be studied using the tools of psychology and cognitive neuroscience (see a review of the problem in LeDoux, 2000). It has become apparent that emotion is involved in much of cognitive processing, influencing what we remember (Chapter 9), where we direct our attention (Chapter 7), and the decisions that we make (Chapter 12). Our emotions modulate and bias our behavior and actions. Underlying all emotion research is a question: Is there a neural system dedicated to emotions, or are emotions just another form of cognition that is only phenomenologically different (S. Duncan & Barrett, 2007)? The study of emotion is emerging as a critical and exciting research topic.
We begin this chapter with some attempts to define emotion. Next, we review the areas of the brain that are thought to mediate emotion processing. We also survey theories about emotions and how they are generated. Much of the research on emotion has concentrated on the workings of the amygdala, so we examine this part of the brain in some detail. We also look at the progress made in answering the questions that face emotion researchers.
We close the chapter with a look at several (especially) complex emotions, including happiness and love.
What Is an Emotion?
People have been struggling with this question for at least several thousand years. Even today, the answer remains unsettled. In the Handbook of Emotions (3rd ed.), the late philosopher Robert Solomon (2008) devotes an entire chapter to discussing the lack of a good definition of emotion and looking at why it is so difficult to define. How would you define emotion?
Maybe your definition starts with “An emotion is a feeling you get when....” And we already have a problem, because many researchers claim that a feeling is the subjective experience of the emotion, but not the emotion itself. These two events are dissociable and, as we see later in this chapter, they use separate neural systems. Perhaps evolutionary principles can help us with a general definition. Emotions are evolved neurological processes that guide behavior in ways that increase survival and reproduction. How’s that for vague? Here is a definition from Kevin Ochsner and James Gross (2005), two researchers whose work we look at in this chapter:
Current models posit that emotions are valenced responses to external stimuli and/or internal mental representations that
- involve changes across multiple response systems (e.g., experiential, behavioral, peripheral physiological),
- are distinct from moods, in that they often have identifiable objects or triggers,
- can be either unlearned responses to stimuli with intrinsic affective properties (e.g., pulling your hand away when you burn it) or learned responses to stimuli with acquired emotional value (e.g., fear when you see a dog that previously bit you),
- can involve multiple types of appraisal processes that assess the significance of stimuli to current goals, and
- depend upon different neural systems.
Most psychologists agree that emotion consists of three components: a physiological reaction to a stimulus, a behavioral response, and a subjective feeling.
Neural Systems Involved in Emotion Processing
Many parts of the nervous system are involved in our emotions. When emotions are triggered by an external event or stimulus (as they often are), our sensory systems play a major role. Sometimes emotions are triggered by an episodic memory, in which case our memory systems are involved (see Chapter 9). The physiological components of emotion (that shiver up the spine, or the racing heart and dry mouth people experience with fear) involve the autonomic nervous system (ANS), a division of the peripheral nervous system. Recall from Chapter 2 that the ANS is made up of the sympathetic and the parasympathetic nervous systems (see Figure 2.17), and its motor and sensory neurons extend to the heart, lungs, gut, bladder, and sexual organs. The two systems work in combination to achieve homeostasis. As a rule of thumb, the sympathetic system promotes “fight or flight” arousal, and the parasympathetic promotes “rest and digest.” The ANS is regulated by the hypothalamus. The hypothalamus also controls the release of hormones from the pituitary gland. Of course, the fight-or-flight response uses the motor system. Arousal is a critical part of many theories of emotion. The arousal system is regulated by the reticular activating system, which is composed of sets of neurons running from the brainstem to the cortex via the rostral intralaminar thalamic nuclei.
All of the neural systems mentioned so far are important in triggering an emotion or in generating physiological and behavioral responses. Yet where do emotions reside? We turn to that question next.
Early Concepts: The Limbic System as the Emotional Brain
The notion that emotion is separate from cognition and has its own network of brain structures underlying emotional behavior is not new. As we mentioned in Chapter 2, James Papez (pronounced “payps”) proposed a circuit theory of the brain and emotion in 1937, suggesting that emotional responses involve a network of brain regions made up of the hypothalamus, anterior thalamus, cingulate gyrus, and hippocampus. Paul MacLean (1949, 1952) later named these structures the Papez circuit. He then extended this emotional network to include what he called the visceral brain, adding Broca’s limbic lobe and some subcortical nuclei and portions of the basal ganglia. Later, MacLean included the amygdala and the orbitofrontal cortex. He called this extended neural circuit of emotion the limbic system, from the Latin limbus, meaning “rim.” The structures of the limbic system roughly form a rim around the corpus callosum (Anatomical Orientation figure; also see Figure 2.26).
ANATOMICAL ORIENTATION
The anatomy of emotion
The limbic system.
FIGURE 10.3 Specific brain regions are hypothesized to be associated with specific emotions.
The rust-colored orbitofrontal cortex is associated with anger, the purple anterior cingulate gyrus with sadness, the blue insula with disgust, and the green amygdala with fear.
MacLean’s early work identifying the limbic system as the “emotional” brain was influential. To this day, studies on the neural basis of emotion include references to the “limbic system” or “limbic” structures. The continued popularity of the term limbic system in more recent work is due primarily to the inclusion of the orbitofrontal cortex and amygdala in that system. As we shall see, these two areas have been the focus of investigation into the neural basis of emotion (Figure 10.3; Damasio, 1994; LeDoux, 1992). The limbic system concept as strictly outlined by MacLean, however, has not been supported over the years (Brodal, 1982; Kotter & Meyer, 1992; LeDoux, 1991; Swanson, 1983). We now know, for example, that many brainstem nuclei that are connected to the hypothalamus and that are involved in the autonomic reactions central to MacLean’s idea of a visceral brain are not part of the limbic system. Although several limbic structures are known to play a role in emotion, it has been impossible to establish criteria for defining which structures and pathways should be included in the limbic system. At the same time, classic limbic areas such as the hippocampus have been shown to be more important for other, nonemotional processes, such as memory (see Chapter 9). With no clear understanding as to why some brain regions and not others are part of the limbic system, MacLean’s concept has proven to be more descriptive and historical than functional in our current understanding of the neural basis of emotion.
Early attempts to identify neural circuits of emotion viewed emotion as a unitary concept that could be localized to one specific circuit, such as the limbic system. Viewing the “emotional brain” as separate from the rest of the brain spawned a locationist view of emotions. The locationist account hypothesizes that all mental states belonging to the same emotion category are produced by activity that is recurrently associated with a specific region in the brain (Figure 10.3). Also, this association is an inherited trait, and homologies are seen in other mammalian species (Panksepp, 1998; for a contrary view, see Lindquist et al., 2012).
Emerging Concepts of Emotional Networks
Over the last several decades, scientific investigations of emotion have become more detailed and complex. By measuring brain responses to emotionally salient stimuli, researchers have revealed a complex interconnected network involved in the analysis of emotional stimuli. This network includes the thalamus, the somatosensory cortex, higher order sensory cortices, the amygdala, the insular cortex (also called the insula), the ventral striatum, and the medial prefrontal cortex, including the orbitofrontal cortex and the anterior cingulate cortex (ACC).
Those who study emotion now acknowledge that it is a multifaceted behavior that may vary along a spectrum from basic to more complex: It isn’t captured by one definition or contained within a single neural circuit. Indeed, S.M.’s isolated emotional deficit in fear recognition following bilateral amygdala damage supports the idea that there is no single emotional circuit. Emotion research now focuses on specific types of emotional tasks and on identifying the neural systems underlying specific emotional behaviors. Depending on the emotional task or situation, we can expect different neural systems to be involved. The question remains, however, whether discrete neural mechanisms and circuits underlie the different emotion categories; whether emotions emerge out of basic operations that are not specific to emotion (the psychological constructionist approach); or whether a combination exists, in which brain systems common to all emotions operate alongside separable regions dedicated to processing individual emotions such as fear, anger, and disgust. According to the constructionist approach, the brain does not necessarily function within emotion categories (L. F. Barrett, 2009; S. Duncan & Barrett, 2007; Lindquist et al., 2012; Pessoa, 2008). Instead, the psychological function mediated by an individual brain region is determined, in part, by the network of brain regions it is firing with (A. R. McIntosh, 2004). In this view, each brain network might involve some brain regions that are more or less specialized for emotional processing, along with others that serve many functions, depending on what role a particular emotion plays. For instance, the dorsomedial prefrontal areas that represent self and others are active across all emotions (Northoff et al., 2005), while brain regions that support attentional vigilance are recruited to detect threat signals, and brain regions that represent the consequences a stimulus will have for the body are activated for disgust, but not only for disgust. So, just as the definition of emotion is in flux, so too are the anatomical correlates of emotional processing.
TAKE-HOME MESSAGES
Categorizing Emotions
At the core of emotion research is the issue of whether emotions are “psychic entities” that are specific, biologically fundamental, and hardwired with dedicated brain mechanisms (as Darwin supposed), or whether they are states of mind assembled from more basic, general causes, as William James suggested.
The trouble with the emotions in psychology is that they are regarded too much as absolutely individual things. So long as they are set down as so many eternal and sacred psychic entities, like the old immutable species in natural history, all that can be done with them is reverently to catalogue their separate characters, points, and effects. But if we regard them as products of more general causes (as “species” are now regarded as products of heredity and variation), the mere distinguishing and cataloguing becomes of subsidiary importance. Having the goose which lays the golden eggs, the description of each egg already laid is a minor matter. (James, 1890, p. 449)
James was of the opinion that emotions were not basic, nor were they found in dedicated neural structures, but were the melding of a mélange of psychological ingredients honed by evolution.
As we noted earlier in this chapter, most emotion researchers agree that the response to emotional stimuli is adaptive and comprises three components: a peripheral physiological response (e.g., a racing heart), a behavioral response, and the subjective experience (i.e., the feeling). What they don’t agree on are the underlying mechanisms. The crux of the disagreement among the different theories of emotion generation involves the timing of these three components and whether cognition plays a role. An emotional stimulus is a stimulus that is highly relevant for the well-being and survival of the observer. Some stimuli, such as predators or dangerous situations, may be threats; others, such as food or potential mates, may offer opportunities for betterment. How the status of a stimulus is determined is another issue, as is whether the perception of the emotional stimulus leads to quick, automatic processing and stereotyped emotional responses or whether the response is modified by cognition. Next, we discuss the basic versus dimensional categorization of emotion and then look at representatives of the various theories of emotion generation.
Fearful, sad, anxious, elated, disappointed, angry, shameful, disgusted, happy, pleased, excited, and infatuated are some of the terms we use to describe our emotional lives. Unfortunately, our rich language of emotion is difficult to translate into discrete states and variables that can be studied in the laboratory. In an effort to apply some order and uniformity to our definition of emotion, researchers have focused on three primary categories of emotion: basic emotions, complex emotions, and dimensional descriptions of emotion.
Basic Emotions
We may use delighted, joyful, and gleeful to describe how we feel, but most people would agree that all of these words represent a variation of feeling happy. Central to the hypothesis that basic emotions exist is the idea that emotions reflect an inborn instinct. If a relevant stimulus is present, it will trigger an evolved brain mechanism in the same way, every time. Thus, we often describe basic emotions as being innate and similar in all humans and many animals. As such, basic emotions exist as entities independent of our perception of them. In this view, each emotion produces predictable changes in sensory, perceptual, motor, and physiological functions that can be measured and thus provide evidence that the emotion exists.
Facial Expressions and Basic Emotions For the past 150 years, many investigators have considered facial expressions to be one of those predictable changes. Accordingly, it is believed that research on facial expressions opens an extraordinary window into these basic emotions. This belief is based on the assumption that facial expressions are observable, automatic manifestations that correspond to a person’s inner feelings. Duchenne de Boulogne carried out some of the earliest research on facial expressions. One of his patients was an elderly man who suffered from near-total facial anesthesia. Duchenne developed a technique to electrically stimulate the man’s facial muscles and methodically trigger muscle contractions, and he recorded the results with the newly invented camera (Figure 10.4). He published his findings in The Mechanism of Human Facial Expression (1862). Duchenne believed that facial expressions revealed underlying emotions. Duchenne’s studies influenced Darwin’s work on the evolutionary basis of human emotional behavior, outlined in The Expression of the Emotions in Man and Animals (1873). Darwin had questioned people familiar with different cultures about the emotional lives of these varied cultures. From these discussions, Darwin determined that humans have evolved to have a finite set of basic emotional states, and each state is unique in its adaptive significance and physiological expression. The idea that humans have a finite set of universal, basic emotions was born, and this was the idea that William James protested.
FIGURE 10.4 Duchenne triggering muscle contractions in his patient, who had facial anesthesia.
The study of facial expressions was not taken up again until the 1960s, when Paul Ekman sought evidence for his hypothesis that (a) emotions varied only along a pleasant to unpleasant scale; (b) the relationship between a facial expression and what it signified was learned socially; and (c) the meaning of a particular facial expression varied among cultures. He studied cultures from around the world and discovered that, counter to his early hypothesis, the facial expressions humans use to convey emotion do not vary much from culture to culture. Whether people are from the Bronx, Beijing, or Papua New Guinea, the facial expressions we use to show that we are happy, sad, fearful, disgusted, angry, or surprised are pretty much the same (Ekman & Friesen, 1971; Figure 10.5). From this work, Ekman and others suggested that anger, fear, disgust, sadness, happiness, and surprise are the six basic human facial expressions and that each expression represents a basic emotional state (Table 10.1). Since then, other emotions have been added as potential candidate basic emotions.
Jessica Tracy and David Matsumoto (2008) have provided evidence that might elevate pride and shame to the rank of true basic emotions. They looked at the nonverbal expressions of pride or shame displayed by contestants from 37 nations in reaction to winning or losing a judo match at the 2004 Olympic and Paralympic Games. Some of the contestants were congenitally blind, so the researchers could assume that in these participants the body language of the behavioral response had not been learned culturally. All of the contestants displayed prototypical expressions of pride upon winning (Figure 10.6). Athletes from most cultures displayed behaviors associated with shame upon losing, though the response was less pronounced in athletes from highly individualistic cultures. This finding suggested to the researchers that the behaviors associated with pride and shame are innate and that these two emotions are basic.
FIGURE 10.5 The universal emotional expressions.
The meaning of these facial expressions is similar across all cultures. Can you match the faces to the emotional states of anger, disgust, fear, happiness, sadness, and surprise?
Although considerable debate continues as to whether any single list is adequate to capture the full range of emotional experiences, most scientists accept the idea that all basic emotions share three main characteristics: They are innate, universal, and short-lasting. Table 10.2 lists a set of criteria that some emotion researchers, such as Ekman, believe are common to all basic emotions.
Some basic emotions such as fear and anger have been confirmed in animals, which show dedicated subcortical circuitry for such emotions. Ekman also found that humans have specific physiological reactions for anger, fear, and disgust (see Ekman, 1992, for a review). Consequently, many researchers start with the assumption that everyone, including animals, has a set of basic emotions.
table 10.1 The Well-Established and Possible Basic Emotions According to Ekman (1999)

| Well-established basic emotions | Candidate basic emotions |
| Anger | Contempt |
| Fear | Shame |
| Sadness | Guilt |
| Enjoyment | Embarrassment |
| Disgust | Awe |
| Surprise | Amusement |
| | Excitement |
| | Pride in achievement |
| | Relief |
| | Satisfaction |
| | Sensory pleasure |
| | Enjoyment |
Complex Emotions
Even if we accept that basic emotions exist, we are still faced with identifying which emotions are basic and which are complex (Ekman, 1992; Ortigue et al., 2010a). Some commonly recognized emotions, such as jealousy and parental love, are absent from Ekman’s list (see Table 10.1; Ortigue et al., 2010a; Ortigue & Bianchi-Demicheli, 2011). Ekman did not deny that these intense feelings are emotional, but he called them “emotion complexes” (see Darwin et al., 1998). He differentiated them from basic emotions as follows: “Parental love, romantic love, envy, or jealousy last for much longer periods—months, years, a lifetime for love and at least hours or days for envy or jealousy” (Darwin et al., 1998, p. 83). Jealousy is one of the most interesting of the complex emotions (Ortigue & Bianchi-Demicheli, 2011). A review of the clinical literature on patients who experienced delusional jealousy following a brain infarct or a traumatic brain injury revealed that delusional jealousy is mediated by more than just the limbic system. A broad network of brain regions is involved, including higher order cortical areas engaged in social cognition (Chapter 13), theory of mind (Chapter 13), and the interpretation of actions performed by others (Chapter 8) (Ortigue & Bianchi-Demicheli, 2011). Clearly, jealousy is a complex emotion.
Similarly, romantic love is far more complicated than researchers initially thought (Ortigue et al., 2010a). (We do have to wonder who ever thought love was not complicated.) Ekman differentiates love from the basic emotions because no universal facial expressions exist for romantic love (see Table 10.1; Sabini & Silver, 2005). As Charles Darwin mentioned, “Although the emotion of love, for instance that of a mother for her infant, is one of the strongest of which the mind is capable, it can hardly be said to have any proper or peculiar means of expression” (Darwin, 1873, p. 215). Indeed, with love we can feel intense feelings and inner thoughts that facial expressions cannot reflect. Love may be described as invisible—though some signs of love, such as kissing and hand-holding, are explicit and obvious (Bianchi-Demicheli et al., 2006, 2010b). The visible manifestations of love, however, are not love per se (Ortigue et al., 2008, 2010b). The recent localization of love in the human brain—within subcortical reward, motivation, and emotion systems as well as higher order cortical brain networks involved in complex cognitive functions and social cognition—reinforces the assumption that love is a complex, goal-directed emotion rather than a basic one (Ortigue et al., 2010a; Bianchi-Demicheli et al., 2006). Complex emotions, such as love and jealousy, are considered to be refined, long-lasting cognitive versions of basic emotions that are culturally specific or individual.
FIGURE 10.6 Athletes from 37 countries exhibit spontaneous pride and shame behaviors.
The graphs compare the mean levels of nonverbal behaviors spontaneously displayed in response to wins and losses by sighted athletes on the top and congenitally blind athletes on the bottom.
table 10.2 Criteria of the Basic Emotions According to Ekman (1994)
NOTE: In 1999, Ekman developed three additional criteria: (1) distinctive appearance developmentally; (2) distinctive thoughts, memories, and images; and (3) distinctive subjective experience.
Dimensions of Emotion
Another way of categorizing emotions is to describe them as reactions that vary along a continuum of events in the world, rather than as discrete states. That is, some people hypothesize that emotions are better understood by how arousing or pleasant they may be or by how motivated they make a person feel about approaching or withdrawing from an emotional stimulus.
Valence and Arousal Most researchers agree that emotional reactions to stimuli and events can be characterized by two factors: valence (pleasant–unpleasant or good–bad) and arousal (the intensity of the internal emotional response, high–low; Osgood et al., 1957; Russell, 1979). For instance, most of us would agree that being happy is a pleasant feeling (positive valence) and being angry is an unpleasant feeling (negative valence). If we find a quarter on the sidewalk, however, we would be happy but not really all that aroused. If we were to win $10 million in a lottery, we would be intensely happy (ecstatic) and intensely aroused. Although in both situations we experience something that is pleasant, the intensity of that feeling is certainly different. By using this dimensional approach—tracking valence and arousal—researchers can more concretely assess the emotional reactions elicited by stimuli. Instead of looking for neural correlates of specific emotions, these researchers look for the neural correlates of the dimensions—arousal and valence.
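To make the dimensional approach concrete, here is a minimal sketch in Python of how reactions might be coded on the two dimensions; the stimulus names and numeric ratings are hypothetical, chosen only to echo the examples above.

```python
from dataclasses import dataclass

@dataclass
class EmotionalReaction:
    """A single reaction coded on the two dimensions used in dimensional models."""
    stimulus: str
    valence: float  # -1.0 (very unpleasant) to +1.0 (very pleasant)
    arousal: float  #  0.0 (calm) to 1.0 (intensely aroused)

def describe(r: EmotionalReaction) -> str:
    """Summarize a reaction by where it falls in valence-arousal space."""
    tone = "pleasant" if r.valence >= 0 else "unpleasant"
    intensity = "high arousal" if r.arousal >= 0.5 else "low arousal"
    return f"{r.stimulus}: {tone}, {intensity}"

# Hypothetical ratings echoing the quarter-versus-lottery example above.
reactions = [
    EmotionalReaction("finding a quarter", valence=0.3, arousal=0.1),
    EmotionalReaction("winning the lottery", valence=0.9, arousal=0.9),
    EmotionalReaction("being cut off in traffic", valence=-0.6, arousal=0.7),
]

for r in reactions:
    print(describe(r))
```

The point of the sketch is simply that, in the dimensional approach, two continuous numbers take the place of a discrete emotion label as the unit of analysis.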
Approach or Withdraw A second dimensional approach characterizes emotions by the actions and goals that they motivate. Richard Davidson and colleagues (1990) at the University of Wisconsin–Madison suggested that different emotional reactions or states can motivate us to either approach or withdraw from a situation. For example, the positive emotion of happiness may excite a tendency to approach or engage in the eliciting situations, whereas the negative emotions of fear and disgust may motivate us to withdraw from the eliciting situations. Motivation, however, involves more than just valence. Anger, a negative emotion, can motivate approach. Sometimes the motivating stimuli can excite both approach and withdrawal: It is 110 degrees Fahrenheit, and for hours you have been traveling across the Australian outback on a bus with no air conditioning. You are hot, sweaty, and dirty, and your only desire is to jump into the river you’ve been slowly approaching all day. You are finally dropped off at your campground by the Katherine River, where you see a rope swing dangling invitingly next to the water. You drop your pack and trot to the river, which is stimulating you to approach. As you get closer, you catch a glimpse of a typically Australian sign next to the river’s edge: “Watch out for crocs.” Hmm... the river is no longer as approachable. You want to go in, and yet....
Categorizing emotions as basic, complex, and dimensional does not adequately capture all of our emotional experiences. Think of these categories instead as a framework that we can use in our scientific investigations of emotion. No single approach is correct all of the time, so we must not get drawn into an either-or debate. It is essential, though, to understand how emotion is defined, so that as we analyze specific examples of emotion research, meaningful consensus can emerge from a range of results. Next we examine some of the many theories of how emotions are generated.
TAKE-HOME MESSAGES
Theories of Emotion Generation
As we outlined near the beginning of this chapter, every emotion, following the perception of an emotion-provoking stimulus, has three components. There is a physiological response, a behavioral response, and a feeling. The crux of every theory of emotion generation involves the timing of the physiological reaction (for instance, the racing heart), the behavior reaction (such as the fight-or-flight response), and the experiential feeling (I’m scared!).
James–Lange Theory
William James proposed that the emotions were the perceptual results of somatovisceral feedback from bodily responses to an emotion-provoking stimulus. He used the example of fear associated with spotting a bear.
Our natural way of thinking about these standard emotions is that the mental perception of some fact excites the mental affection called the emotion, and that this latter state of mind gives rise to the bodily expression. My thesis on the contrary is that the bodily changes follow directly the PERCEPTION of the exciting fact, and that our feeling of the same changes as they occur IS the emotion. Common sense says,... we meet a bear, are frightened and run;... The hypothesis here to be defended says that this order of sequence is incorrect, that the one mental state is not immediately induced by the other, that the bodily manifestations must first be interposed between, and that the more rational statement is that we feel... afraid because we tremble, and not that we... tremble, because we are... fearful, as the case may be. Without the bodily states following on the perception, the latter would be purely cognitive in form, pale, colourless, destitute of emotional warmth. We might then see the bear, and judge it best to run... but we could not actually feel afraid. (James, 1884, p.189)
Thus, in James’s view, you don’t run because you are afraid, you are afraid because you become aware of your bodily change when you run. A similar proposition was suggested by a contemporary of James, Carl Lange, and the theory was dubbed the James–Lange theory.
So Lange and James theorize that
The bear (perception of stimulus) → physiologic reaction (adrenaline released causing increased heart and respiratory rates, sweating, and fight-or-flight response) → automatic, nonconscious interpretation of the physiological response (my heart is beating fast, I am running; I must be afraid) = subjective emotional feeling (scared!).
Thus James and Lange believed that with emotion there is a specific physiological reaction and that people could not feel an emotion without first having a bodily reaction.
Cannon–Bard Theory
James’s proposal caused quite an uproar. A counterproposal was offered several years later by a pair of physiologists from Harvard, Walter Cannon and Philip Bard. They thought that physiological responses were not distinct enough to distinguish among fear, anger, and sexual attraction, for example. Cannon and Bard also believed that the neuronal and hormonal feedback processes are too slow to precede and account for the emotions. Cannon (who was the first person to describe the fight-or-flight response) thought that the sympathetic nervous system coordinated the reaction while the cortex simultaneously generated the emotional feeling. Cannon found that when he severed the cortex from the brainstem above the hypothalamus and thalamus, cats still had an emotional reaction when provoked. They would growl, bare their teeth, and their hair would stand on end. They had the emotional reaction without cognition. These researchers proposed that an emotional stimulus was processed by the thalamus and sent simultaneously to the neocortex and to the hypothalamus that produced the peripheral response. Thus the neocortex generated the emotional feeling while the periphery carried out the slower emotional reaction. Returning to the bear-in-the-woods scenario, the Cannon–Bard theory is
The bear (perception of stimulus) → thalamus, which relays the signal along two routes at once:
fast: → cortex (interpretation: dangerous situation) → scared!
slower: → hypothalamus (sympathetic nervous system) → emotional reaction (fight or flight)
Subsequent research, however, refuted some of Cannon’s and Bard’s ideas. For instance, Paul Ekman showed that at least some emotional responses (anger, fear, and disgust) can be differentiated by autonomic activity. The Cannon–Bard theory remains important, however, because it introduced into emotion research the model of parallel processing.
Appraisal Theory
Appraisal theory is a group of theories in which emotional processing is dependent on an interaction between the stimulus properties and their interpretation. The theories differ about what is appraised and the criteria used for this appraisal. Since appraisal is a subjective step, it can account for the differences in how people react. Richard Lazarus proposed a version of appraisal theory in which emotions are a response to the reckoning of the ratio of harm versus benefit in a person’s encounter with something. In this appraisal step, each of us considers personal and environmental variables when deciding the significance of the stimulus for our well-being. Thus, the cause of the emotion is both the stimulus and its significance. The cognitive appraisal comes before the emotional response or feeling. This appraisal step may be automatic and unconscious.
He sees the bear → cognition (A quick risk–benefit appraisal is made: A dangerous wild animal is lumbering toward me, and he is showing his teeth → risk/benefit = high risk/no foreseeable benefit → I am in danger!) → Feels the emotion (he’s scared!) → response (fight or flight).
Singer–Schachter Theory: Cognitive Interpretation of Arousal
You may have read about the experiment in which investigators gave two different groups of participants an injection of adrenaline (Schachter & Singer, 1962). The control group was told that they would experience the symptoms associated with adrenaline, such as a racing heart. The other group was told they had been injected with vitamins and should not experience any side effects. Each of the participants was then placed with a confederate, who was acting in either a euphoric or an angry manner. When later asked how they felt and why, the participants who knowingly received an adrenaline injection attributed their physiological responses to the drug, whereas those who did not know they had been given adrenaline attributed their symptoms to the environment (the happy or angry confederate) and interpreted their emotion accordingly. The Singer–Schachter theory of emotion generation is based on these findings. The theory is a blend of the James–Lange and appraisal theories. Singer and Schachter proposed that both emotional arousal and a subsequent reasoned appraisal of the stimulus are required before the emotion can be identified.
So they see the bear → physiological reaction (arousal: heart races, ready to run) → cognition (What’s going on? Yikes! We are between a mother and her cub!) = feel the emotion (they’re scared!).
Constructivist Theories
Constructivist theories suggest that emotion emerges from cognition as molded by our culture and language. A recent and influential constructivist theory is the conceptual act model, proposed by Lisa Barrett. In this theory, emotions are human-made concepts that emerge as we make meaning out of sensory input from the body and from the world. First we form a mental representation of the bodily changes, a representation that has been called core affect (Russell, 2003). This representation is then classified according to language-based emotion categories. Barrett suggests that these categories vary with a person’s experience and culture, so there are no empirical criteria for judging an emotion (Barrett, 2006b).
Sensory input (she sees the bear) → physiologic response (her heart races, she feels aroused in a negative way) → her brain calculates all previous bear encounters, episodes of racing heart, degree of arousal, valence, and you name it → categorizes the current reaction in reference to all the past ones and the ones suggested by her culture and language → ah, this is an emotion, and she calls it fear.
Evolutionary Psychology Approach
Evolutionary psychologists Leda Cosmides and John Tooby proposed that emotions are conductors of an orchestra of cognitive programs that need to be coordinated to produce successful behavior (Cosmides & Tooby, 2000). They suggest that the emotions are an overarching program that directs the cognitive subprograms and their interactions.
From this viewpoint, an emotion is not reducible to any one category of effects, such as effects on physiology, behavioral inclinations, cognitive appraisals, or feeling states, because it involves coordinated, evolved instructions for all of them together. An emotion also involves instructions for other mechanisms distributed throughout the human mental and physical architecture.
They see the bear → possible stalking and ambush situation is detected (a common scenario of evolutionary significance) and automatically activates a hardwired program (that has evolved thanks to being successful in these types of situations) that directs all of the subprograms.
Response: Perception and attention shift automatically; goals and motivations change from a picnic in the woods to stayin’ alive; information-gathering mechanisms are redirected, and a change in concepts takes place: looking for a tree as shade for a picnic becomes looking for a tall tree for escape; memory comes on board; communication changes; interpretive systems are activated (did the bear see us? If the answer is no, the people automatically adopt freeze behavior; if it is yes, they scamper); learning systems are engaged (they may develop a conditioned response to this trail in the future); physiology changes; and behavioral decision rules are activated (which may be automatic or involuntary) → they run for the tree (whew).
LeDoux’s High Road and Low Road
Joseph LeDoux of New York University has proposed that humans have two emotion systems operating in parallel. One is a neural system for our emotional responses that is separate from a system that generates the conscious feeling of emotion. This emotion-response system is hardwired by evolution to produce fast responses that increase our chances of survival and reproduction. Conscious feelings are irrelevant to these responses and are not hardwired, but learned by experience.
LeDoux sees the bear:
fast: hardwired fight-or-flight response
slow: cognition (whoa, that looks suspiciously like an Ursus arctos horribilis; good thing I’ve been keeping in shape) → emotion (feels scared)
LeDoux was one of the first cognitive neuroscientists to study emotions. His research on the role of the amygdala in fear has shown that the amygdala plays a major role in emotional processing in general, not just fear. Researchers know more about the role of the amygdala in emotion than they do about the role of other regions of the brain in emotion.
TAKE-HOME MESSAGES
The Amygdala
The amygdalae (singular: amygdala) are small, almond-shaped structures in the medial temporal lobe adjacent to the anterior portion of the hippocampus (Figure 10.7a). Each amygdala is an intriguing and complex structure that in primates is a collection of 13 nuclei. There has been some controversy about the concept of “the amygdala” as a single entity, and some neurobiologists consider the amygdala to be neither a structural nor a functional unit (Swanson & Petrovich, 1998). The nuclei can be grouped into three main amygdaloid complexes (Figure 10.7b).
FIGURE 10.7 Location and circuitry of the amygdala.
Structures in the medial temporal lobe were first proposed to be important for emotion in the early 20th century, when Heinrich Klüver and Paul Bucy at the University of Chicago (1939) documented unusual emotional responses in monkeys following damage to this region. One of the prominent characteristics of what later came to be known as Klüver–Bucy syndrome (Weiskrantz, 1956) was a lack of fear, manifested by a tendency to approach objects that would normally elicit a fear response. The observed deficit was called psychic blindness because of an inability to recognize the emotional importance of events or objects. In the 1950s, the amygdala was identified as the primary structure underlying these fear-related deficits. When the amygdala of monkeys was lesioned more selectively, the monkeys showed a disproportionate impairment in cautiousness and distrust: They approached novel or frightening objects or potential predators, such as snakes or human strangers. And not just once; they did it again and again, even if they had a bad experience. Once bitten, they were not twice shy. Although humans with amygdala damage do not show all of the classic signs of Klüver–Bucy syndrome, they do exhibit deficits in fear processing, as S.M. demonstrated. She exhibited a lack of cautiousness and distrust (Feinstein et al., 2011), and she too did not learn to avoid what others would term fearful experiences.
While studying the amygdala’s role in fear processing, investigators came to realize that it was important for emotional processing in general, because of its vast connections to many other brain regions. In fact, the amygdala is the veritable Godfather of the forebrain and is its most connected structure. The extensive connections to and from the amygdala reflect its critical roles in learning, memory, and attention in response to emotionally significant stimuli. The amygdala contains receptors for the neurotransmitters glutamate, dopamine, norepinephrine, serotonin, and acetylcholine. It also contains hormone receptors for glucocorticoids and estrogen, and peptide receptors for opioids, oxytocin, vasopressin, corticotropin-releasing factor, and neuropeptide Y. There are many ideas concerning what role the amygdala plays. Luiz Pessoa (2011) boils down the amygdala’s job description by suggesting that it is involved in determining what a stimulus is and what is to be done about it; thus, it is involved in attention, perception, value representation, and decision making. In this vein, Kristen Lindquist and colleagues (2012) have proposed that the amygdala is active when the rest of the brain cannot easily predict what sensations mean, what to do about them, or what value they hold in a given context. The amygdala signals other parts of the brain to keep working until these issues have been figured out (Whalen, 2007). Lindquist’s proposal has been questioned, however, by researchers who have extensively studied patient S.M. (Feinstein et al., 2011), the woman we met at the beginning of this chapter. S.M. appears to have no deficit in any emotion other than fear. Even without her amygdalae, she correctly understands the salience of emotional stimuli, but she has a specific impairment in the induction and experience of fear across a wide range of situations. Researchers who have studied S.M. suggest that the amygdala is a critical brain region for triggering a state of fear in response to encounters with threatening stimuli in the external environment. They hypothesize that the amygdala furnishes the connections between the sensory and association cortices that are required to represent external stimuli and the brainstem and hypothalamic circuitry that is necessary for orchestrating the action program of fear. As we’ll see later in this chapter, damage to the lateral amygdala prevents fear conditioning. Without the amygdala, the evolutionary value of fear is lost. For much of the remainder of this chapter, we look at the interplay of emotions and cognitive processes, such as learning, attention, and perception. Although we cannot yet settle the debate on the amygdala’s precise role, we will get a feel for how emotion is involved in various cognitive domains as we learn about the amygdala’s role in emotion processing.
TAKE-HOME MESSAGES
Interactions Between Emotion and Other Cognitive Processes
In previous chapters, we have not addressed how emotion affects the various cognitive processes that have been discussed. We all know from personal experience, however, that this happens. For instance, if we are angry about something, we may find it hard to concentrate on reading a homework assignment. If we are really enjoying what we are doing, we may not notice we are tired or hungry. When we are sad, we may find it difficult to make decisions or carry out any physical activities. In this section, we look at how emotions modulate the information processing involved in cognitive functions such as learning, attention, and decision making.
The Influence of Emotion on Learning
One day, early in the 20th century, Swiss neurologist and psychologist Édouard Claparède greeted his patient and introduced himself. She introduced herself and shook his hand. Not such a great story, until you know that he had done the same thing every day for the previous five years and his patient never remembered him. She had Korsakoff’s syndrome (Chapter 9), characterized by an inability to form new long-term memories. One day Claparède concealed a pin in his palm that pricked his patient when they shook hands. The next day, once again, she did not remember him; but when he extended his hand to greet her, she hesitated for the first time. Claparède was the first to provide evidence that two types of learning, implicit and explicit, apparently are associated with two different pathways (Kihlstrom, 1995).
Implicit Emotional Learning
As first demonstrated by Claparède, one form of implicit learning is Pavlovian learning, in which a neutral stimulus (the handshake) acquires aversive properties when paired with an aversive event (the pinprick). This process is a classic example of fear conditioning, a form of classical conditioning in which the unconditioned stimulus is aversive, and it is a primary paradigm used to investigate the amygdala’s role in emotional learning. One advantage of using the fear-conditioning paradigm to investigate emotional learning is that it works in essentially the same way across a wide range of species, from fruit flies to humans. One laboratory version of fear conditioning is illustrated in Figure 10.8.
FIGURE 10.8 Fear conditioning.
(a) Before training, three different stimuli—light (CS), foot shock (US1), and loud noise (US2)—are presented alone, and both the foot shock and the noise elicit a normal startle response in rats. (b) During training, light (CS) and foot shock (US1) are paired to elicit a normal startle response (UR). (c) In tests following training, presentation of light alone now elicits a response (CR), and presentation of the light together with a loud noise but no foot shock elicits a potentiated startle (potentiated CR) because the rat is startled by the loud noise and has associated the light (CS) with the startling foot shock (US).
The light is the conditioned stimulus (CS). In this example, we are going to condition the rat to associate this neutral stimulus with an aversive stimulus. Before training (Figure 10.8a), however, the light is solely a neutral stimulus and does not evoke a response from the rat. In this pretraining stage, the rat will respond with a normal startle response to any innately aversive unconditioned stimulus (US)—for example, a foot shock or a loud noise—that invokes an innate fear response. During training (Figure 10.8b), the light is paired with a shock that is delivered immediately before the light is turned off. The rat has a natural fear response to the shock (usually startle or jump), called the unconditioned response (UR). This stage is referred to as acquisition. After a few pairings of the light (CS) and the shock (US), the rat learns that the light predicts the shock, and eventually the rat exhibits a fear response to the light alone (Figure 10.8c). This anticipatory fear response is the conditioned response (CR).
The CR can be enhanced in the presence of another fearful stimulus or an anxious state, as is illustrated by the potentiated startle reflex exhibited by a rat when it sees the light (the CS) at the same time that it experiences a loud noise (a different US). The CS and resulting CR can become unpaired again if the light (CS) is presented alone, without the shock, for many trials. This phenomenon is called extinction because at this point the CR is considered extinguished (and the rat will again display the same response to light as in Figure 10.8a).
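The rise of the CR during acquisition and its decline during extinction can be captured by a standard associative-learning rule. The sketch below uses the Rescorla–Wagner update, a classic model of Pavlovian conditioning that is not specified in this chapter; the learning rate, asymptote, and trial counts are arbitrary values chosen for illustration.

```python
def conditioning_curve(trials, learning_rate=0.3, v_max=1.0):
    """Track the associative strength V of the CS (light) across trials.

    Each entry in `trials` is True if the US (shock) accompanies the CS,
    False if the CS is presented alone. V is updated by the Rescorla-Wagner
    prediction-error rule: dV = learning_rate * (target - V).
    """
    v = 0.0
    history = []
    for us_present in trials:
        target = v_max if us_present else 0.0  # asymptote for this trial type
        v += learning_rate * (target - v)      # prediction-error update
        history.append(v)
    return history

# Ten acquisition trials (light + shock) followed by ten extinction trials (light alone).
strengths = conditioning_curve([True] * 10 + [False] * 10)
for trial, v in enumerate(strengths, start=1):
    phase = "acquisition" if trial <= 10 else "extinction"
    print(f"trial {trial:2d} ({phase}): CR strength = {v:.2f}")
```

Running the sketch shows CR strength climbing toward the asymptote during the paired trials and decaying back toward zero during the CS-alone trials, mirroring the acquisition and extinction phases described above; the potentiated startle and the return of fear under stress are not modeled here.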
Many responses can be assessed as the CR in this type of fear-learning paradigm, but regardless of the stimulus used or the response evoked, one consistent finding has emerged in rats (and we will soon see that this also holds true in humans): Damage to the amygdala impairs conditioned fear responses. Amygdala lesions block the ability to acquire and express a CR to the neutral CS that is paired with the aversive US.
Two Pathways: The High and Low Roads Using the fear-conditioning paradigm, researchers such as Joseph LeDoux (1996), Mike Davis (1992) of Emory University, and Bruce Kapp and his colleagues (1984) of the University of Vermont have mapped out the neural circuits of fear learning, from stimulus perception to emotional response. As Figure 10.9 shows, the lateral nucleus of the amygdala serves as a region of convergence for information from multiple brain regions, allowing for the formation of associations that underlie fear conditioning. Based on results from single-unit recording studies, it is widely accepted that cells in the superior dorsal lateral amygdala have the ability to rapidly undergo changes that pair the CS to the US. After several trials, however, these cells reset to their starting point; but by then, cells in the inferior dorsal lateral region have undergone a change that maintains the adverse association. This result may be why fear that has seemingly been eliminated can return under stress—because it is retained in the memory of these cells (LeDoux, 2007). The lateral nucleus is connected to the central nucleus of the amygdala. These projections to the central nucleus initiate an emotional response if a stimulus, after being analyzed and placed in the appropriate context, is determined to represent something threatening or potentially dangerous.
FIGURE 10.9 Amygdala pathways and fear conditioning. |
FIGURE 10.10 The amygdala receives sensory input along two pathways.
When a hiker chances upon a bear, the sensory input activates affective memories through the cortical “high road” and subcortical “low road” projections to the amygdala. Even before these memories reach consciousness, however, they produce autonomic changes, such as increased heart rate and blood pressure, and a startle response such as jumping back. These memories also can influence subsequent actions through the projections to the frontal cortex. The hiker will use this emotion-laden information in choosing his next action: Turn and run, slowly back up, or shout at the bear?
An important aspect of this fear-conditioning circuitry is that information about the fear-inducing stimulus reaches the amygdala through two separate but simultaneous pathways (Figure 10.10; LeDoux, 1996). One goes directly from the thalamus to the amygdala without being filtered by conscious control. Signals sent by this pathway, sometimes called the low road, reach the amygdala rapidly (15 ms in a rat), although the information this pathway sends is crude. At the same time, sensory information about the stimulus is being projected to the amygdala via another cortical pathway, sometimes referred to as the high road. The high road is slower, taking 300 ms in a rat, but the analysis of the stimulus is more thorough and complete. In this pathway, the sensory information projects to the thalamus; then the thalamus sends this information to the sensory cortex for a finer analysis. The sensory cortex projects the results of this analysis to the amygdala. The low road allows for the amygdala to receive information quickly in order to prime, or ready, the amygdala for a rapid response if the information from the high road confirms that the sensory stimulus is the CS. Although it may seem redundant to have two pathways to send information to the amygdala, when it comes to responding to a threatening stimulus, it is adaptive to be both fast and sure. Now we see the basis of LeDoux’s theory of emotion generation (see p. 436). After seeing the bear, the person’s faster low road sets in motion the fight-or-flight response, while the slower high road through the cortex provides the learned account of the bear and his foibles.
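The logic of the dual pathways can be sketched as two parallel processes with different latencies and fidelities: a fast, coarse analysis that primes a response, and a slower, more complete analysis that confirms or cancels it. The code below is only a schematic of that idea; the 15 ms and 300 ms figures are the rat latencies quoted above, while the feature checks and the stimulus dictionary are invented for illustration.

```python
def low_road(stimulus):
    """Fast, crude route: thalamus -> amygdala (~15 ms in the rat)."""
    # Only coarse features are available this early.
    possible_threat = stimulus["large"] and stimulus["moving"]
    return {"latency_ms": 15, "possible_threat": possible_threat}

def high_road(stimulus):
    """Slower, detailed route: thalamus -> sensory cortex -> amygdala (~300 ms)."""
    confirmed_threat = stimulus["identity"] in {"bear", "snake"}
    return {"latency_ms": 300, "confirmed_threat": confirmed_threat}

def amygdala_response(stimulus):
    fast = low_road(stimulus)
    if fast["possible_threat"]:
        print(f"{fast['latency_ms']:>3} ms: prime fight-or-flight (heart rate up, startle ready)")
    slow = high_road(stimulus)
    if slow["confirmed_threat"]:
        print(f"{slow['latency_ms']:>3} ms: threat confirmed -> full defensive response")
    else:
        print(f"{slow['latency_ms']:>3} ms: false alarm -> response stands down")

# A large, moving stimulus that the slower route identifies as a bear.
amygdala_response({"large": True, "moving": True, "identity": "bear"})
```

The design point is the one made in the text: when a stimulus might be a threat, it pays to be fast and crude first, then slow and sure.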
Is the amygdala particularly sensitive to certain categories of stimuli such as animals? Two lines of evidence suggest that it is. The first has to do with what is called biological motion. The visual system extracts subtle movement information from a stimulus that it uses to categorize the stimulus as either animate (having motion characteristic of a biological entity) or inanimate. This ability to recognize biological motion is innate. It has been demonstrated in newborn babies, who will attend to biological motion within the first few days of life (Simion et al., 2008), and it has been identified in other mammals (Blake, 1993). This preferential attention to biological motion is adaptive, alerting us to the presence of other living things. Interestingly, PET studies have shown that the right amygdala is activated when an individual perceives a stimulus exhibiting biological motion (Bonda et al., 1996).
The second line of evidence comes from single-cell recordings from the right amygdala. Neurons in this region have been found to respond preferentially to images of animals. This effect was shown by a group of researchers who did single-cell recordings from the amygdala, hippocampus, and entorhinal cortex in patients who had had electrodes surgically implanted to monitor their epilepsy. The recordings were made as patients looked at images of persons, animals, landmarks, or objects. Neurons in the right amygdala, but not the left, responded preferentially to pictures of animals rather than to pictures of other stimulus categories. There was no difference in the amygdala’s response to threatening or cute animals. This categorical selectivity provides evidence of a domain-specific mechanism for processing this biologically important class of stimuli that includes predators or prey (Mormann et al., 2011).
FIGURE 10.11 Bilateral amygdala lesions in patient S.P.
During a surgical procedure to reduce epileptic seizures, the right amygdala and a large section of the right temporal lobe, including the hippocampus, were removed (circled regions). Pathology in the left amygdala is visible in the white band, indicating regions where cells were damaged by neural disease.
Amygdala’s Effect on Implicit Learning The role of the amygdala in learning to respond to stimuli that have come to represent aversive events through fear conditioning is said to be implicit. This term is used because the learning is expressed indirectly through a behavioral or physiological response, such as autonomic nervous system arousal or potentiated startle. When studying nonhuman animals, we can assess the CR only through indirect, or implicit, means of expression. The rat is startled when the light goes on. In humans, however, we can also assess the response directly, by asking the participants to report if they know that the CS represents a potential aversive consequence (the US). Patients with amygdala damage fail to demonstrate an indirect CR—for instance, they would not shirk Claparède’s handshake. When asked to report the parameters of fear conditioning explicitly or consciously, however, these patients demonstrate no deficit, and might respond with “Oh, the handshake, sure, it will hurt a bit.” Thus, we know that they learned that the stimulus is associated with an aversive event. Damage to the amygdala appears to leave this latter ability intact (A. K. Anderson & Phelps, 2001; Phelps et al., 1998; Bechara et al., 1995; LaBar et al., 1995).
This concept is illustrated by the study of a patient very much like S.M. Patient S.P. also has bilateral amygdala damage (Figure 10.11). To relieve epilepsy, at age 48 S.P. underwent a lobectomy that removed her right amygdala. MRI at that time revealed that her left amygdala was already damaged, most likely from mesial temporal sclerosis, a syndrome that causes neuronal loss in the medial temporal regions of the brain (A. K. Anderson & Phelps, 2001; Phelps et al., 1998). Like S.M., S.P. is unable to recognize fear in the faces of others (Adolphs et al., 1999).
In a study on the role of the amygdala in human fear conditioning, S.P. was shown a picture of a blue square (the CS), which the experimenters periodically presented for 10 s. During the acquisition phase, S.P. was given a mild electrical shock to the wrist (the US) at the end of the 10-s presentation of the blue square (the CS). In measures of skin conductance response (Figure 10.12), S.P.’s performance was as predicted: She showed a normal fear response to the shock (the UR), but no change in response when the blue square (the CS) was presented, even after several acquisition trials. This lack of change in the skin conductance response to the blue square demonstrates that she failed to acquire a CR.
FIGURE 10.12 S.P. showed no skin conductance response to conditioned stimuli.
Following the experiment, S.P. was shown her data and that of a control participant, as illustrated in Figure 10.12, and she was asked what she thought. She was somewhat surprised that she showed no change in skin conductance response (the CR) to the blue square (the CS). She reported that she knew after the very first acquisition trial that she was going to get a shock to the wrist when the blue square was presented. She claimed to have figured this out early on and expected the shock whenever she saw the blue square. She was not sure what to make of the fact that her skin conductance response did not reflect what she consciously knew to be true. This dissociation between intact explicit knowledge of the events that occurred during fear conditioning and impaired conditioned responses has been observed in other patients with amygdala damage (Bechara et al., 1995; LaBar et al., 1995).
As discussed in Chapter 9, explicit or declarative memory for events depends on another medial temporal lobe structure: the hippocampus, which, when damaged, impairs the ability to explicitly report memory for an event. When the conditioning paradigm that we described for S.P. was conducted with patients who had bilateral damage to the hippocampus but an intact amygdala, the opposite pattern of performance emerged. These patients showed a normal skin conductance response to the blue square (the CS), indicating acquisition of the conditioned response. When asked what had occurred during conditioning, however, they were unable to report that the presentations of the blue square were paired with the shock, or even that a blue square was presented at all—just like Claparède’s patient.
This double dissociation between patients who have amygdala lesions and patients with hippocampal lesions is evidence that the amygdala is necessary for the implicit expression of emotional learning, but not for all forms of emotional learning and memory. The hippocampus is necessary for the acquisition of explicit or declarative knowledge of the emotional properties of a stimulus, whereas the amygdala is critical for the acquisition and expression of an implicitly conditioned fear response.
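The logic of this double dissociation can be captured in a small, purely illustrative simulation: one system stands in for the amygdala-dependent association that drives the skin conductance CR, the other for the hippocampus-dependent explicit report, and "lesioning" either system removes only its own contribution. The delta-rule update, learning rate, and trial count are assumptions made for the sketch, not parameters from the studies cited above.

```python
# Minimal sketch of the double dissociation described above. Two independent
# "systems" observe the same CS-US pairings: an amygdala-dependent associative
# system that drives the skin conductance CR (updated by a simple delta rule),
# and a hippocampus-dependent system that stores an explicit, reportable fact.

def run_conditioning(amygdala_intact: bool, hippocampus_intact: bool,
                     n_trials: int = 8, learning_rate: float = 0.4):
    cr_strength = 0.0          # implicit association driving the SCR
    explicit_report = None     # declarative knowledge of the contingency
    for _ in range(n_trials):
        shock = 1.0            # the US follows the blue square on every trial
        if amygdala_intact:
            cr_strength += learning_rate * (shock - cr_strength)
        if hippocampus_intact:
            explicit_report = "the blue square predicts a shock"
    return round(cr_strength, 2), explicit_report

print("control:            ", run_conditioning(True, True))    # CR acquired, can report
print("amygdala lesion:    ", run_conditioning(False, True))   # no CR, can report (like S.P.)
print("hippocampal lesion: ", run_conditioning(True, False))   # CR acquired, cannot report
```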
Explicit Emotional Learning
The double dissociation just described clearly indicates that the amygdala is necessary for implicit emotional learning, but not for explicit emotional learning. This does not mean that the amygdala is uninvolved with explicit learning and memory. How do we know? Let’s look at an example of explicit emotional learning.
Liz is walking down the street in her neighborhood and sees a neighbor’s dog, Fang, on the sidewalk. Even though she is a dog owner herself and likes dogs in general, Fang scares her. When she encounters him, she becomes nervous and fearful, so she decides to walk on the other side of the street. Why might Liz, who likes dogs, be afraid of this particular dog? There are a few possible reasons: For example, perhaps Fang bit her once. In this case, her fear response to Fang was acquired through fear conditioning. Fang (the CS) was paired with the dog bite (the US), resulting in pain and fear (the UR) and an acquired fear response to Fang in particular (the CR).
Liz may fear Fang for another reason, however. She has heard from her neighbor that this is a mean dog that might bite her. In this case she has no aversive experience linked to this particular dog. Instead, she learned about the aversive properties of the dog explicitly. Her ability to learn and remember this type of information depends on her hippocampal memory system. She likely did not experience a fear response when she learned this information during a conversation with her neighbor. She did not experience a fear response until she actually encountered Fang. Thus, her reaction is not based on actual experience with the dog, but rather is anticipatory and based on her explicit knowledge of the potential aversive properties of this dog. This type of learning, in which we learn to fear or avoid a stimulus because of what we are told (as opposed to actually having the experience), is a common example of emotional learning in humans.
The Amygdala's Effect on Explicit Learning The question is this: Does the amygdala play a role in the indirect expression of the fear response in instructed fear? From what we know about patient S.M., what would you guess? Elizabeth Phelps of New York University and her colleagues (Funayama et al., 2001; Phelps et al., 2001) addressed this question using an instructed-fear paradigm, in which the participant was told that a blue square may be paired with a shock. They found that, even though explicit learning of the emotional properties of the blue square depends on the hippocampal memory system, the amygdala is critical for the expression of some fear responses to the blue square (Figure 10.13a). During the instructed-fear paradigm, patients with amygdala damage were able to learn and explicitly report that some presentations of the blue square might be paired with a shock to the wrist. (In truth, none of the participants ever received a shock.) Unlike normal control participants, however, patients with amygdala damage did not show a potentiated startle response when the blue square was presented. They knew consciously that they might receive a shock, but they had no emotional response. Normal control participants showed an increase in skin conductance response to the blue square that was correlated with amygdala activity (Figure 10.13b). These results suggest that, in humans, the amygdala is sometimes critical for the indirect expression of a fear response even when the emotional learning occurs explicitly. Similar deficits have been observed when patients with amygdala lesions respond to emotional scenes (Angrilli et al., 1996; Funayama et al., 2001).
Although animal models of emotional learning highlight the role of the amygdala in fear conditioning and the indirect expression of the conditioned fear response, human emotional learning can be much more complex. We can learn that stimuli in the world are linked to potentially aversive consequences in a variety of ways, including instruction, observation, and experience. In whatever way we learn the aversive or threatening nature of stimuli—whether explicit and declarative, implicit, or both—the amygdala may play a role in the indirect expression of the fear response to those stimuli.
Amygdala, Arousal, and Modulation of Memory The instructed-fear studies indicate that when an individual is taught that a stimulus is dangerous, amygdala activity can be influenced by a hippocampal-dependent declarative representation about the emotional properties of stimuli (in short, the memory that someone told you the dog was mean). The amygdala activity subsequently modulates some indirect emotional responses. But is it possible for the reverse to occur? Can the amygdala modulate the activity of the hippocampus? Put another way, can the amygdala influence what you learn and remember about an emotional event?
FIGURE 10.13 Responses to instructed fear.
(a) While performing a task in the instructed-fear protocol, participants showed an arousal response (measured by skin conductance response) consistent with fear to the blue square, which they were told might be linked to a shock. The presentation of the blue square also led to amygdala activation. (b) There is a correlation between the strength of the skin conductance response indicating arousal and the activation of the amygdala.
The types of things we recollect every day are things like where we left the keys, what we said to a friend the night before, or whether we turned the iron off before leaving the house. When we look back on our lives, however, we do not remember these mundane events. We remember a first kiss, being teased by a friend in school, opening our college acceptance letter, or hearing about a horrible accident. The memories that last over time are those of emotional (not just fearful) or important (i.e., arousing) events. These memories seem to have a persistent vividness that other memories lack.
James McGaugh and his colleagues (1992, 1996; Ferry & McGaugh, 2000) at the University of California, Irvine, investigated whether this persistence of emotional memories is related to the action of the amygdala during emotional arousal. An arousal response can influence people’s ability to store declarative or explicit memories. For example, investigators frequently use the Morris water maze task (see Chapter 9) to test a rat’s spatial abilities and memory. McGaugh found that a lesion to the amygdala does not impair the rats’ ability to learn this task under ordinary circumstances. If a rat with a normal amygdala is aroused immediately after training, by either a physical stressor or the administration of drugs that mimic an arousal response, then the rat will show improved retention of this task. The memory is enhanced by arousal. In rats with a lesion to the amygdala, however, this arousal-induced enhancement of memory, rather than memory acquisition itself, is blocked (McGaugh et al., 1996). Using pharmacological lesions to temporarily disable the amygdala immediately after learning also eliminates any arousal-enhanced memory effect (Teather et al., 1998).
Two important aspects of this work help us understand the mechanism underlying the role of the amygdala in enhancing declarative memory that has been observed with arousal. The first is that the amygdala’s role is modulatory. The tasks used in these studies depend on the hippocampus for acquisition. In other words, the amygdala is not necessary for learning this hippocampal-dependent task, but it is necessary for the arousal-dependent modulation of memory for this task.
The second important facet of this work is that this effect of modulation with arousal can occur after initial encoding of the task, during the retention interval. All of these studies point to the conclusion that the amygdala modulates hippocampal, declarative memory by enhancing retention, rather than by altering the initial encoding of the stimulus. Because this effect occurs during retention, the amygdala is thought to enhance hippocampal consolidation. As described in Chapter 9, consolidation occurs over time, after initial encoding, and leads to memories becoming more or less stable. Thus, when there is an arousal response, the amygdala alters hippocampal processing by strengthening the consolidation of memories. McGaugh and colleagues (1996) showed that the basolateral nucleus of the amygdala is important for this effect. Additional evidence, however, also suggests that the amygdala can interact directly with the hippocampus during the initial encoding phase (not just the consolidation phase) of an experience, which in turn also positively affects the long-term consolidation (Dolcos et al., 2004). Thus, the amygdala can modulate hippocampal-dependent declarative memory at multiple stages, leading to a net effect of enhanced retention.
This role for the amygdala in enhancing emotional, declarative memory has also been demonstrated in humans. Various studies over the years have indicated that a mild arousal response can enhance declarative memory for emotional events (e.g., see Christianson, 1992). This effect of arousal on declarative memory is blocked in patients with bilateral amygdala damage (Cahill et al., 1995). Interestingly, studies on patients with unilateral amygdala damage reveal that the right, and not the left, amygdala is most important for the retrieval of autobiographical emotional memories relating to negative valence and high arousal (Buchanan et al., 2006). In addition, functional neuroimaging studies have shown that activity observed in the human amygdala during the presentation of emotional stimuli is correlated with the arousal-enhanced recollection of these stimuli (Cahill et al., 1996; Hamann et al., 1999). The more active the amygdala, the stronger the memory. There is also increased bidirectional effective connectivity between the amygdala and hippocampus during recall of emotional information that is relevant to current behavior (A.P.R. Smith et al., 2006). These studies indicate that normal amygdala function plays a role in the enhanced declarative memory observed with arousal in humans.
The mechanism for this effect of arousal appears to be related to the amygdala’s role in modifying the rate of forgetting for arousing stimuli. In other words, arousal may alter how quickly we forget. This is consistent with the notion of a post-encoding effect on memory, such as enhancing hippocampal storage or consolidation. Although the ability to recollect arousing and nonarousing events may be similar immediately after they occur, arousing events are not forgotten as quickly as nonarousing events are (Kleinsmith & Kaplan, 1963). Unlike normal control participants, who show less forgetting over time for arousing compared to nonarousing stimuli, patients with amygdala lesions forget arousing and nonarousing stimuli at the same rate (LaBar & Phelps, 1998).
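One way to picture this forgetting-rate account is with two decay curves: arousing and nonarousing items start out roughly equally memorable, but the arousing items decay more slowly unless the amygdala's modulation is removed. The exponential form and decay constants below are illustrative assumptions, not values fitted to the studies cited above.

```python
# Illustration of the forgetting-rate idea: recall of arousing and nonarousing
# items starts at a similar level but decays at different rates in intact
# participants; an amygdala lesion is modeled here simply as removing the
# arousal advantage. All numbers are arbitrary and purely illustrative.
import math

def retention(hours: float, decay_per_hour: float, initial: float = 0.9) -> float:
    return initial * math.exp(-decay_per_hour * hours)

DECAY = {"nonarousing": 0.10, "arousing": 0.04}           # intact amygdala
DECAY_LESION = {"nonarousing": 0.10, "arousing": 0.10}    # lesion: no modulation

for delay in (0, 24):  # immediate test vs. one day later
    intact = {k: round(retention(delay, d), 2) for k, d in DECAY.items()}
    lesion = {k: round(retention(delay, d), 2) for k, d in DECAY_LESION.items()}
    print(f"{delay:>2} h  intact: {intact}   amygdala lesion: {lesion}")
```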
Studies on both animal models and human populations converge on the conclusion that the amygdala acts to modulate hippocampal consolidation for arousing events. This mechanism, however, does not underlie all the effects of emotion on human declarative memory. Emotional events are more distinctive and unusual than are everyday life events. They also form a specific class of events. These and other factors may enhance declarative or explicit memory for emotional events in ways that do not depend on the amygdala (Phelps et al., 1998).
Stress and Memory It appears that acute stress can facilitate memory. Kevin LaBar and his colleagues at Duke University (Zorawski et al., 2006) have found that the amount of endogenous stress hormone (cortisol) released during the acquisition of a conditioned fear accurately predicts how well fear memories are retained one day later in humans. Robert Sapolsky of Stanford University (1992) and his colleagues demonstrated, however, that extreme arousal or chronic stress may actually impair performance of the hippocampal memory system. This memory impairment is due to the effect of excessive stress hormones, such as glucocorticoids, on the hippocampus. The precise role of the amygdala in this impairment of hippocampal memory during chronic or excessive stress is not fully understood.
The amygdala’s interactions with the hippocampal memory system and explicit memory are specific and complex. The amygdala acts to modulate the storage of arousing events, thus ensuring that they will not be forgotten over time. And luckily, we can learn explicitly that stimuli in the environment are linked to potential aversive consequences, without having to experience these consequences ourselves (Listen to Mom!). This explicit, hippocampal-dependent representation of the emotional properties of events can affect amygdala activity and certain indirect fear responses. The interactions of the amygdala and hippocampus help ensure that we remember important and emotionally charged information and events for a long time. These memories ultimately ensure that our bodily response to threatening events is appropriate and adaptive.
The Influence of Emotion on Perception and Attention
No doubt you have had the experience of being in the midst of a conversation and hearing your name mentioned behind you, and immediately turning to see who said it. We show heightened awareness of, and attention to, emotionally salient stimuli. Attention researchers often use the attentional blink paradigm, in which stimuli are presented so quickly in succession that an individual stimulus is difficult to identify. When participants are told to ignore most of the stimuli (say, all the words printed in green) and attend only to the few targets printed in blue, they are able to identify the targets. This ability, however, is limited by the amount of time between the target (blue) stimuli. If a second target stimulus is presented immediately after the first, in what is known as the early lag period, participants will often miss this second target. This impaired perceptual report reflects the temporal limitations of attention and is known as the attentional blink. If, however, that second word is emotionally significant, people notice it (Anderson, 2005). An emotionally significant word is distinctive, arousing (energizing), and has either a positive or negative valence. In this experiment, it was arousal value (how reactive the participant is to a stimulus), not the valence of the word or its distinctiveness, that overcame the attentional blink. Studies have shown that patients with damage to the left amygdala fail to detect the second target even when it is an arousing word (Anderson & Phelps, 2001). So it appears that when attentional resources are limited, it is the arousing emotional stimuli that reach awareness, and the amygdala again plays a critical role in enhancing our attention when emotional stimuli are present.
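To make the structure of the paradigm concrete, here is a minimal sketch of one rapid serial visual presentation (RSVP) trial. The presentation rate, item positions, and word lists are hypothetical, chosen only to show how the lag between the two targets is manipulated.

```python
# Sketch of an attentional blink trial: a rapid stream of to-be-ignored items
# (here "green" words) with two targets ("blue" words) separated by a variable
# lag. The rate and stimuli are invented for illustration; the point is the
# timing structure, not the particular words.
import random

SOA_MS = 100  # stimulus onset asynchrony: one item every 100 ms (illustrative)

def build_rsvp_trial(lag: int, stream_length: int = 18, t1_position: int = 6):
    distractors = ["table", "chair", "cloud", "pencil", "window", "carpet"]
    targets = ["hotel", "danger"]          # second target could be neutral or arousing
    stream = [("green", random.choice(distractors)) for _ in range(stream_length)]
    stream[t1_position] = ("blue", targets[0])            # T1
    stream[t1_position + lag] = ("blue", targets[1])      # T2, `lag` items later
    return [(i * SOA_MS, color, word) for i, (color, word) in enumerate(stream)]

for onset, color, word in build_rsvp_trial(lag=2):  # lag 2 = 200 ms: inside the blink
    print(f"{onset:>4} ms  {color:<5} {word}")
```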
There are two theories about how emotion enhances perception and attention in this way. One is that emotional learning involves an enduring change in sensory cortical tuning, and the other is that it produces a more transient change.
The first theory arose out of fear conditioning studies done on rats (Weinberger, 1995). It was found that the auditory cortex became especially sensitive to the stimulus used as the conditioned stimulus. Classical conditioning and fear conditioning (Bakin et al., 1996) shift the tuning frequency of cortical neurons toward the frequency of the conditioned stimulus. This cortical receptive-field plasticity is associative and highly specific. It happens quickly and is retained indefinitely. The idea is that the changes that occur in perceptual processing for stimuli with emotional properties (acquired through learning) are long lasting. Although this mechanism has not been explicitly demonstrated in humans, hints of it have been observed. In imaging studies in which fear conditioning occurred using subliminally exposed face stimuli as the CS, with an aversive loud noise as the US, an increasing responsiveness to the CS was seen in both the amygdala and the visual cortex over a series of trials (J. S. Morris et al., 2001). The presence of a learning response occurring in parallel in the amygdala and the visual cortex supports the idea that feedback efferents from the amygdala to the visual cortex act to modulate visual processing of emotionally salient stimuli.
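The retuning result can be illustrated with a toy Gaussian tuning curve whose preferred frequency moves toward the CS frequency after conditioning. The frequencies, tuning width, and size of the shift below are invented for the sketch; only the qualitative pattern (a larger post-conditioning response near the CS frequency) reflects the findings described above.

```python
# Sketch of the receptive-field retuning idea: an auditory cortical neuron's
# Gaussian tuning curve shifts its preferred frequency toward the frequency
# used as the CS after fear conditioning. All parameter values are arbitrary
# illustration values, not data from the cited studies.
import math

def tuning_curve(freq_khz: float, preferred_khz: float, width_khz: float = 2.0) -> float:
    return math.exp(-((freq_khz - preferred_khz) ** 2) / (2 * width_khz ** 2))

preferred_before = 10.0   # kHz, the neuron's best frequency before training
cs_frequency = 6.0        # kHz, the tone paired with shock
preferred_after = preferred_before + 0.6 * (cs_frequency - preferred_before)  # shift toward CS

for f in (4.0, 6.0, 8.0, 10.0, 12.0):
    before = tuning_curve(f, preferred_before)
    after = tuning_curve(f, preferred_after)
    print(f"{f:>5.1f} kHz   before: {before:.2f}   after: {after:.2f}")
```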
The second theory proposes a mechanism that produces a more transient change in attentional thresholds. Recall that the amygdala has reciprocal connections with the sensory cortical processing regions and that it receives inputs of emotional significance before awareness takes place. Studies have indicated that attention and awareness don’t have much impact on the amygdala’s response to fearful stimuli (A. K. Anderson et al., 2003; Vuilleumier et al., 2001), which is consistent with the finding that the emotional qualities of stimuli are processed automatically (Zajonc, 1984). Thus, although you may be thinking about your lunch while hiking up the trail, you will still be startled at movement in the grass. You have just experienced a rapid and automatic transient change in attention spurred by emotional stimuli. The proposed mechanism for this attentional change is that early in the perceptual processing of the stimulus, the amygdala receives input about its emotional significance and, through projections to sensory cortical regions, modulates the attentional and perceptual processes (A.K. Anderson & Phelps, 2001; Vuilleumier et al., 2004). This idea is based first on the finding that there is enhanced activation of visual cortical regions to novel emotional stimuli (Kosslyn et al., 1996), combined with imaging studies that showed a correlation between visual cortex activation and amygdala activation in response to these same stimuli (Morris et al., 1998). Some evidence suggests that novelty is a characteristic of a stimulus that engages the amygdala independently of other affective properties such as valence and arousal. A recent fMRI study that examined valence, arousal, and novelty of emotional photo images found that the amygdala had higher peak responses and was activated longer for novel stimuli versus familiar stimuli, and the effect was independent of both valence and arousal (Weierich et al., 2010). The investigators also observed increased activity in early visual areas V1 and V2 when participants viewed novel emotional stimuli. This activation was different from the activation seen in later visual areas that occurred for valence and arousal.
What's more, fMRI studies show that patients with damage to the amygdala do not show significant activation in the visual cortex for fearful versus neutral faces, whereas controls and patients with hippocampal damage do. Taken together, these findings suggest that when emotional stimuli are present, the amygdala has a leading role in mediating the transient changes in visual cortical processing.
Clearly, the amygdala is critical in getting an unattended but emotional stimulus into the realm of conscious awareness by providing some feedback to the primary sensory cortices, thus affecting perceptual processing. This function was demonstrated by Phelps and her colleagues (2006). They examined the effect of fearful face cues on contrast sensitivity—an aspect of visual processing that occurs early in the primary visual cortex and is enhanced by covert attention. They found that when a face cue directed covert attention, contrast sensitivity was enhanced. This was an expected result. The interesting finding was that a fearful face enhanced contrast sensitivity, whether covert attention was directed to the face or not. So the emotion-laden stimulus enhanced perception without the aid of attention. The team also found that if the fearful face did cue attention, contrast sensitivity was enhanced even more than would have been predicted for the independent effects of a fearful face and covert attention. Thus emotion-laden stimuli receive greater attention and priority perceptual processing.
Emotion and Decision Making
Let’s say you have a big decision to make, and it has an uncertain outcome. You are considering elective knee surgery. You don’t need the surgery to survive; you get around OK, and you have no trouble boogie boarding. The problem is, you can’t do your favorite sport, snowboarding. You anticipate that you will be able to snowboard again if you have surgery. There is a drawback to this plan, however. What if you have the surgery and it doesn’t go so well? You could end up worse off than you are now (it happened to a friend of yours), and you would regret having had it done. What will you decide, and exactly what is going on in your brain as you go through this decision-making process?
Many decision models are based on mathematical and economic principles, and we will talk more about decision making in Chapters 12 and 13. Although these models are built on the logical principles of cost–benefit analysis, they fail to describe how people actually act. In constructing these models, it became obvious that some factor in decision making was not being taken into account. In the early 1990s, Antonio Damasio and his colleagues at the University of Iowa made a surprising discovery while working with patient E.V.R., who had orbitofrontal cortex (OFC) damage. When faced with social reasoning tasks, E.V.R. could generate solutions to problems, but he could not prioritize his solutions based on their ability to solve the problem. In the real world, he made poor decisions about his professional and social life (Saver & Damasio, 1991). The researchers, studying a group of patients with similar lesions, found that the patients had difficulty anticipating the consequences of their actions and did not learn from their mistakes (Bechara et al., 1994). This discovery was surprising because at that time, researchers believed the orbitofrontal cortex handled emotional functions. Their belief was based on the many connections of the OFC to the insular cortex and the cingulate cortex, the amygdala, and the hypothalamus—all areas involved with emotion processing. Because emotion was considered a disruptive force in decision making, it was surprising that impairing a region involved in emotion would result in impaired decision making. Seemingly, an individual's decision-making ability should have improved with such a lesion. Damasio wondered whether damage to the orbitofrontal cortex impaired decision making because emotion was actually needed to optimize it. At the time, this was a shocking suggestion. To test this idea, Damasio and his colleagues devised the Iowa Gambling Task. In the Iowa Gambling Task, skin conductance response (SCR) is measured while participants continually draw cards from their choice of four decks. The cards indicate monetary amounts resulting in either gain or loss. What participants don't know is that two of the decks are associated with net winnings; although they have low payoffs, they have even lower losses. The other two decks are associated with net losses because, although they have high payoffs, they have even larger losses. Participants must figure out that they can earn the most money by choosing the decks associated with net winnings, despite their low payoffs.
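The structure of the task is easy to see in numbers. The sketch below uses payoff values that approximate the commonly described version of the Iowa Gambling Task (treat them as illustrative rather than the exact published schedule) and computes the expected net outcome of sticking with each deck.

```python
# Sketch of the Iowa Gambling Task payoff structure: two "bad" decks with large
# wins but even larger average losses, and two "good" decks with small wins and
# smaller losses. Amounts approximate the commonly cited version of the task
# and are used here only for illustration.

decks = {
    "A (bad)":  {"win_per_card": 100, "avg_loss_per_card": 125},
    "B (bad)":  {"win_per_card": 100, "avg_loss_per_card": 125},
    "C (good)": {"win_per_card": 50,  "avg_loss_per_card": 25},
    "D (good)": {"win_per_card": 50,  "avg_loss_per_card": 25},
}

for name, d in decks.items():
    net_per_10_cards = 10 * (d["win_per_card"] - d["avg_loss_per_card"])
    print(f"Deck {name}: expected net over 10 draws = {net_per_10_cards:+d}")
# Maximizing winnings means learning to prefer C and D despite their smaller payoffs.
```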
Healthy adults and patients with damage outside the orbitofrontal cortex gamble in a manner that maximizes winnings. In contrast, patients with orbitofrontal damage fail to favor the decks that result in net winnings. Based on these results, Damasio proposed the somatic marker hypothesis, which states that emotional information, in the form of physiological arousal, is needed to guide decision making. When presented with a situation that requires us to make a decision, we may react emotionally to the situation around us. This emotional reaction is manifest in our bodies as somatic markers—changes in physiological arousal. It is theorized that orbitofrontal structures support learning the associations between a complex situation and the somatic changes (i.e., emotional state) usually associated with that particular situation. The orbitofrontal cortex and other brain regions together consider previous situations that elicited similar patterns of somatic change. Once these situations have been identified, the orbitofrontal cortex can use these experiences to rapidly evaluate possible behavioral responses and their likelihood for reward. Decision making can then selectively focus on option–outcome pairings that are potentially rewarding.
Based on our current understanding, three types of emotions influence decision making: emotional reactions to the rewarding or punishing properties of the options themselves, anticipated emotions such as regret, and the incidental emotional state we happen to be in when the decision is made.
Although acquisition of fear conditioning requires the amygdala, normal extinction of a conditioned response (that is, learning that there has been a change and the stimulus is no longer associated with a punishment) involves interactions of the amygdala and the prefrontal cortex (Morgan & LeDoux, 1999). It has been suggested that the Iowa Gambling Task may be challenging for patients with orbitofrontal damage because it requires them to change their initial perceptions of the potential for rewards in the risky decks (Fellows & Farah, 2005). The decks with the net losses are very appealing at the beginning of the task because the rewards are so large. As participants continue to draw cards from those decks, however, the monumental losses begin to appear. These researchers found that if the task is modified so that the card order in the decks makes it clear earlier in the task that there are large wins but even larger losses, then patients with orbitofrontal damage perform this task as well as do healthy control participants. Thus, it appears that OFC damage results in the inability to respond to changing patterns of reward and punishment. Reversal learning does not take place, and these patients don’t learn from experience. This finding is consistent with research in monkeys, where investigators found that orbitofrontal damage makes it difficult to reverse an association once it has been learned (Jones & Mishkin, 1972). Single-cell recordings in monkeys have identified specific neurons in the OFC that respond only when reinforcement contingencies change (i.e., how closely an action or stimulus is linked to a reward or punishment; Rolls et al., 1996).
Edmund Rolls and his colleagues believe that emotion is the motivator for seeking reward and avoiding punishment. They investigated whether the OFC was activated by abstract rewards and punishments, such as winning or losing money. If so, they wondered whether the neural representations were distinct or overlapping, and whether activation correlated with the amount of reward or punishment. Using an event-related fMRI study, they determined that the OFC has distinct regions for reward and punishment: The lateral OFC is activated following a punishing outcome, and the medial OFC following a rewarding one. The amount of activation correlated positively with the magnitude of the reward or punishment (O'Doherty et al., 2001). The medial region that showed increased activation to reward also exhibited a decreased BOLD signal when punishment was meted out. Similarly, the lateral orbitofrontal region that was activated when the outcome was punishment showed a decreased BOLD signal when the outcome was a reward. An inability to represent the magnitude of rewards and punishments (i.e., the cost–benefit ratio) would thus obviously lead to poor decision making. We see that the OFC is selectively active for the magnitude of reward and punishment and for their changing patterns.
Regret is the feeling you get when you compare the voluntary choice you made with rejected alternatives that might have turned out better. You feel regret because you are able to think counterfactually. You can say, "If I had done this instead of that, then things would have been better." We dislike feeling regret, so we learn from our experience and make choices that minimize it. In contrast, disappointment is an emotion related to an unexpected negative outcome without the sense of personal responsibility: "I won teacher of the year, but because I was the last one hired, I was the first one fired." People with OFC lesions have normal emotional reactions to their wins and losses, but they do not feel regret. They also do not learn from regret-inducing decisions or anticipate the negative consequences of their choices (Camille et al., 2004).
To study the brain activity associated with regret, Giorgio Coricelli and his colleagues (2005) induced regret in healthy participants by having them make a gambling choice and then telling them the better outcome of the unchosen gamble. Using fMRI, the researchers found that enhanced activity in the medial OFC, the anterior cingulate cortex, and the anterior hippocampus correlated with increasing regret. The more the choice was regretted, the greater the activity of the medial OFC. They also found that after multiple trials, their participants became risk averse, a behavior reflected in enhanced activity within the medial OFC and the amygdala. This same pattern of activation was also exhibited just before making a choice. This intriguing result suggests that the same circuit mediates both the experience and the anticipation of regret. The team also observed different patterns of neural activation when participants were experiencing regret (medial OFC), when they were simply evaluating results (processed in the ventral striatum), and when they were experiencing disappointment over an outcome that was less than expected (middle temporal gyrus and brainstem). The researchers found that the feeling of regret strongly influences decision choice, leading to more risk-averse choices over time.
The emotion that you are feeling can influence your decision. For example, say you are leafing through the paper that your professor just returned. You spent your entire three-day weekend working very hard on that paper. As you flip it over to see your grade, a friend comes up and asks you to head up the fund drive for your soccer club. What you see on that paper will produce an emotion in you—elation, frustration, satisfaction—that may affect the response you give to your friend.
What function is served by having emotions play such a role in decision making? Ellen Peters (2006) and her colleagues suggest that experienced feelings about a stimulus and feelings that are independent of the stimulus, such as mood states, have four roles in decision making.
Hans-Rüdiger Pfister and Gisela Böhm (2008) suggest four different categories in which emotions have a role in decision making: to provide information about pleasure and pain to build preferences, to enable rapid choices under time pressure, to focus attention on relevant aspects of a decision problem, and to generate commitment concerning morally and socially significant decisions.
Emotion and Social Stimuli
Chapter 13 covers the topic of social cognition, which involves how we recognize emotions in others. Here in Chapter 10, we introduce some aspects of social cognition as they relate to emotional processing.
Facial Expressions Studies have shown that there is a dissociation between identifying an individual’s face and identifying the emotional expression on that face. Our patient S.M. had no trouble identifying faces; she just couldn’t recognize the expression of fear on a face. People with amygdalar damage do not have a problem recognizing nonemotional facial features. In addition, they are able to recognize the similarity between facial expressions whose emotional content they label incorrectly. What’s more, their deficit appears to be restricted to the recognition of facial expressions. Some of them are able to generate and communicate a full range of facial expressions themselves (A. K. Anderson & Phelps, 2000). Depending on the specific facial expression, it appears that different neural mechanisms and regions of the brain are at work, not for processing specific facial expressions per se, but more generally for processing different emotions.
Evidence for this idea comes from studies in which investigators presented different facial expressions to participants while they were undergoing PET scans. The scans were then analyzed to identify areas of the brain that were uniquely activated for the emotions they saw (Figure 10.14). James Blair and colleagues (1999) applied this strategy in a landmark study of the neural basis of anger. They used a computer program to manipulate a neutral facial expression into one that looked increasingly angry (Figure 10.14a) and searched for brain activation associated with the gradient of expression intensity. They found that the right orbitofrontal cortex (OFC; see Figure 10.3b) was increasingly active when participants viewed increasingly expressive angry faces (Figure 10.15). This region was not active when participants viewed sad faces. These results suggest a role for the OFC in explicit emotional labeling of angry faces.
FIGURE 10.14 Examples of morphed facial expressions.
Neuroimaging experiments in normal participants and patients with anxiety disorders have reported increased amygdala activation in response to brief presentations of faces with fearful expressions compared to faces with neutral expressions (Breiter et al., 1996; Cahill et al., 1996; Irwin et al., 1996; Morris et al., 1998). Although the amygdala is activated in response to other emotional expressions, such as happy or angry, the activation response to fear is significantly greater. One interesting aspect of the amygdala’s response to fearful facial expressions is that the participant does not have to be aware of seeing the fearful face for the amygdala to respond. When fearful facial expressions are presented subliminally and then masked with neutral expressions, the amygdala is activated as strongly as when the participant is aware of seeing the faces (Whalen et al., 1998).
This critical role for the amygdala in explicitly evaluating fearful faces also extends to other social judgments about faces, such as indicating from a picture of a face whether the person appears trustworthy or approachable (Adolphs et al., 2000; Said et al., 2010). Once again, this observation is consistent with the behavior of patients with amygdala damage, who rated pictures of individuals whose faces were deemed untrustworthy by normal controls as both more trustworthy and more approachable (Adolphs et al., 1998).
After nearly a decade of testing S.M., Ralph Adolphs and his colleagues (Adolphs et al., 2005; Kennedy and Adolphs, 2010) discovered an explanation for her inability to recognize fearful faces. Using computer software that exposed only parts of either a fearful or happy facial expression, the researchers were able to figure out what regions of the face participants relied on to discriminate between expressions. They found that control participants consistently relied on eyes to make decisions about expression. S.M., on the other hand, did not derive information from the eyes. Indeed, a subsequent experiment using eye-tracking technology confirmed that she did not even look at the eyes of any face, regardless of the emotion it conveyed (Figure 10.16a). So if, unlike controls, S.M. did not automatically use eyes to derive information from faces, why did she only have trouble with identifying fear? Most expressions contain other cues that can be used for identification. For instance, an expression of happiness reliably contains a smile, and disgust a snarl of sorts. The identifying feature of a fearful expression, however, is the increase in size of the white region (sclera) of the eyes (Figure 10.17). This prominent characteristic is captured by the frequently used phrase, “I could see the fear in his eyes.” More empirically, one study found that viewing sclera from a fearful face without any other accompanying facial information is sufficient to increase amygdala activity in normal participants (relative to sclera from facial expressions of happiness; Whalen et al., 2004).
FIGURE 10.15 Neural correlates of the perception of anger.
FIGURE 10.16 Abnormal eye movement patterns during face perception following amygdala lesions.
(a) Unlike control participants, S.M.’s eye movements do not target the eyes of other faces. (b) When instructed to focus on the eyes, however, S.M. is able to identify fearful expressions as well as controls can. The top panel shows that, when instructed, S.M. is able to look at the eyes. Red lines indicate eye movements and white circles indicate points of fixation.
In another study, investigators masked expressions of happiness or sadness in order to find brain areas associated with automatic, implicit analysis of emotion (Killgore & Yurgelun-Todd, 2004). These investigators found amygdala activity associated with analysis of happy but not sad faces. Although a smile is part of a happy expression, a smile can be faked. The telling part of a truly happy facial expression, known as the Duchenne smile after the physiologist who first described it, is the contraction of the orbicularis oculi muscle, which most people cannot contract voluntarily (Ekman, 2003). This contraction causes the lateral eye margins to crinkle, the cheeks to be pulled up, and the lateral portion of the brow to drop. Perhaps amygdala activation when looking at happy faces is due to our attention being drawn to the eyes and identifying this aspect of the happy facial expression.
FIGURE 10.17 Size of eye whites alone is sufficient to induce differential amygdala response to fearful expressions.
Stunningly, the investigators could induce S.M. to overcome her deficit by providing her with a simple instruction: "Focus on the eyes." If told to do so, she no longer had any difficulty identifying fearful faces (see Figure 10.16b). She would focus on the eyes only when reminded, however. Consequently, the amygdala appears to be an integral part of a system that automatically directs visual attention to the eyes when we encounter facial expressions. Impaired eye gaze is also a main characteristic of several psychiatric illnesses and social disorders in which the amygdala may be dysfunctional (e.g., autism spectrum disorder). Adolphs and colleagues' finding that looking at the eyes is important for recognizing facial expressions, together with experimental manipulations that promote eye gaze, may hold promise for interventions in such populations (D. P. Kennedy & Adolphs, 2010; Gamer & Buchel, 2009). This novel function of the amygdala is still not fully understood and is just one example of the diverse topics on the frontier of emotion research. Recent studies performed at the California Institute for Telecommunications and Information Technology extend these findings by identifying the physical characteristics (e.g., eyebrow angle, pupil dilation) that make facial expressions of fear and the other basic emotions unique. The researchers have developed a robot, Einstein, that can identify and then imitate the facial expressions of others. You can watch Einstein at http://www.youtube.com/watch?v=pkpWCu1k0ZI.
Beyond the Face You may be familiar with the study in which participants were shown a film of various geometric shapes moving around a box. The movement was such that the participants described the shapes as if they were animate, with personalities and motives, moving about in a complex social situation—that is, participants anthropomorphized the shapes (Heider & Simmel, 1944). Patients with either amygdala damage or autism do not do this. They describe the shapes as geometric figures, and their description of the movement is devoid of social or emotional attributions (Heberlein & Adolphs, 2004). Thus, the amygdala seems to have a role in perceiving and interpreting emotion and sociability in a wide range of stimuli, even inanimate objects. It may play a role in our ability to anthropomorphize.
The amygdala, however, does not appear to be critical for all types of social communication. Unlike patients with damage to the orbitofrontal cortex, patients with amygdala lesions, as we saw with S.M., do not show gross impairment in their ability to respond to social stimuli. They can interpret descriptions of emotional situations correctly, and they can give normal ratings to emotional prosody (the speech sounds that indicate emotion), even when a person is speaking in a fearful tone of voice (Adolphs et al., 1999; A. K. Anderson & Phelps, 1998; S. K. Scott et al., 1997).
Social Group Evaluation The amygdala also appears to be activated during the categorization of people into groups. Although such implicit behavior might sometimes be helpful (separating people within a social group from people outside of the group or identifying the trustworthiness of a person), it can also lead to behaviors such as racial stereotyping. A variety of research has looked at racial stereotyping from both a behavioral and a functional imaging perspective.
Behavioral research has gone beyond simple, explicit measures of racial bias, as obtained through self-reporting, to implicit measures that examine indirect behavioral responses demonstrating a preference for one group over another. One common indirect measure for examining bias is the Implicit Association Test (IAT). Devised by Greenwald and colleagues (1998), the IAT measures the degree to which social groups (black versus white, old versus young, etc.) are automatically associated with positive and negative evaluations (see https://implicit.harvard.edu/implicit to take the test yourself). Participants are asked to categorize faces from each group while simultaneously categorizing words as either good or bad. For example, for one set of trials the participant responds to "good" words and black faces with one hand, and to "bad" words and white faces with the other hand. In another set of trials, the pairings are switched. The measure of bias is computed from the difference in response latency between the black-and-good/white-and-bad trials and the black-and-bad/white-and-good trials.
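The logic of this latency-based measure can be sketched in a few lines. The snippet below computes a simplified difference score from hypothetical reaction times; the published scoring procedure (e.g., the D score of Greenwald and colleagues) involves additional steps such as error penalties and trial filtering, so treat this only as the bare idea.

```python
# Simplified sketch of deriving an IAT-style bias score from response latencies:
# mean reaction time in one pairing block minus mean reaction time in the
# reversed pairing block, scaled by a pooled standard deviation. The latencies
# below are invented, and this is not the full published scoring algorithm.
from statistics import mean, stdev

black_good_white_bad_ms = [612, 655, 640, 598, 671]   # one pairing block
black_bad_white_good_ms = [566, 571, 549, 580, 559]   # the reversed pairing block

raw_difference = mean(black_good_white_bad_ms) - mean(black_bad_white_good_ms)
pooled_sd = stdev(black_good_white_bad_ms + black_bad_white_good_ms)
d_like_score = raw_difference / pooled_sd   # rough analogue of the IAT D score

print(f"latency difference: {raw_difference:.0f} ms, D-like score: {d_like_score:.2f}")
```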
To study the neural basis of this racial bias, Elizabeth Phelps and her colleagues (2000) used functional MRI to examine amygdala activation in white participants viewing black and white faces. They found that the amygdala was activated when white Americans viewed unfamiliar black faces (but not faces of familiar, positively regarded blacks like Michael Jordan, Will Smith, and Martin Luther King Jr.). More important, the magnitude of the amygdala activation was significantly correlated with indirect measures of racial bias as determined by the IAT. Participants who showed more racial bias as measured by the IAT showed greater amygdala activity during the presentation of black faces. The researchers concluded that the amygdala responses and behavioral responses to black versus white faces in white participants reflected cultural evaluations of social groups as modified by experience. But is this really what was happening?
Although the amygdala does appear to be activated during these tasks, is it necessary for such evaluation? Phelps and colleagues (2003) compared the performance of the patient S.P., who had bilateral amygdala damage, to the performance of control participants on explicit and implicit measures of racial bias. They found no significant differences between the patient and controls on either measure and were forced to conclude that the amygdala is not a critical structure for the indirect evaluation of race, suggesting instead that it might be important for differences in the perceptual processing of “same” versus “other” race faces.
More recent studies have expanded our understanding of the role that the amygdala plays in social group evaluations. William Cunningham and colleagues (2004) compared areas of brain activation in white participants using fMRI for brief and more prolonged presentation of faces of black males and white males. Their findings led them to propose two separate systems for social evaluation processing (Figure 10.18). For brief presentations, where the evaluation must be made quickly and automatically, the amygdala is activated, and the activation is greater for black faces than for white faces. With longer presentations, when controlled processing can take place, amygdala activation is not significantly different between races. Instead, significantly more activity occurred in the right ventrolateral prefrontal cortex during viewing of black faces than of white faces. Cunningham’s team proposed that there are distinct neural differences between automatic and more controlled processing of social groups and that the controlled processing may modulate the automatic evaluation.
We must be careful in drawing sweeping conclusions about racial stereotypes from this data. It may appear that certain processes in the brain make it likely that people will categorize others on the basis of race, but is that what they actually do? This suggestion does not make sense to evolutionary psychologists. They point out that our human ancestors did not travel over very great distances. It would have been highly unusual for them to come across humans of other races, so it makes no sense that humans should have evolved a neural process to categorize race. It would make sense, however, to be able to recognize whether other humans belonged to one’s own social or family group or not, and hence whether they could be trusted or not. Guided by this evolutionary perspective, Kurzban and colleagues (2001) found that when categorization cues stronger than race are present (e.g., one’s group is a team wearing green shirts and the opposing group wears red shirts), the categorization based on race nearly disappears.
FIGURE 10.18 Differential neural response in white participants to masked and unmasked black and white faces.
Black and white faces were presented for either 30 ms (masked) or for 525 ms (unmasked). (a) The right amygdala is more active for black versus white faces when faces are presented for 30 ms. (b) The pattern of amygdala activity is similar at 525 ms, though the effect is attenuated. Also during the longer stimulus presentation, activity in the (c) dorsolateral prefrontal cortex, (d) anterior cingulate cortex, and (e) ventrolateral prefrontal cortex was greater for black faces relative to white faces. Activity in one or more of these areas may be responsible for the attenuation of amygdala activity at 525 ms.
A recent study may help explain what is going on here. Researchers compared the amygdala response to a set of faces that varied along two dimensions centered on an average face (Figure 10.19). The faces differed in social content along one dimension (trustworthiness) and were socially neutral along the other dimension. In both the amygdala and much of the posterior face network, a similar response to both dimensions was seen, and responses were stronger the farther the face was along a dimension from the average face. These findings suggest that what may be activating these regions is the degree of difference from a categorically average face (Said et al., 2010). If you are from an Asian culture, your average face would be Asian, and your amygdala would respond more strongly to faces that deviate from that average, such as non-Asian faces. This response is not the same thing as determining whether someone is a racist. The ability to use such a categorization strategy may lead to racism, but it does not do so necessarily.
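A toy version of this "distance from the norm" account is sketched below: the response simply grows with how far a face sits from the observer's average face in a feature space, regardless of the direction of the deviation. The two-dimensional feature space, the face coordinates, and the linear response rule are all assumptions made for illustration.

```python
# Toy model of the "distance from the average face" account: the response grows
# with the distance of a face from the observer's norm in a feature space,
# whatever the direction of the deviation. Features and the response rule are
# invented for illustration only.
import math

def response_to_face(face_features, average_face, gain: float = 1.0) -> float:
    distance = math.dist(face_features, average_face)   # Euclidean distance from the norm
    return gain * distance

average_face = [0.0, 0.0]   # the observer's average face in a 2-D feature space
faces = {"average": [0.0, 0.0], "mildly atypical": [1.0, 0.5], "highly atypical": [3.0, 2.0]}

for label, features in faces.items():
    print(f"{label:<16} response = {response_to_face(features, average_face):.2f}")
```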
FIGURE 10.19 Faces used in the fMRI experiment.
The faces in the top row varied along the valence dimension at −3, −1, 1, and 3 standard deviations from the average face. Trustworthiness judgments are highly correlated with valence. The socially neutral faces used in the control condition are on the bottom row; their shape varies at −5, −1.67, 1.67, and 5 standard deviations from the average face.
Get a Grip! Cognitive Control of Emotion
The offensive lineman who yells at the referee for a holding penalty may be considered a “bad sport.” But what is really happening? The player is not controlling his negative emotional response to having his goal—blocking the tackle—thwarted. In contrast, the wife who smiles at her husband as he goes off on a dangerous endeavor “so that he will remember me with a smile on my face and not crying” is consciously controlling her sad emotional response. Emotion regulation refers to the processes that influence the type of emotions we have, when we have them, and how we express and experience them. Recall that emotions arise from brain systems that appraise the significance of a stimulus with respect to our goals and needs. That appraisal involves attention processes, evaluation processes, and response processes. Strategies to regulate emotion can affect any of these in different ways (Figure 10.20). Thus, emotion regulation processes can intervene at multiple points in the emotion generation process, some early on and some after the fact. Some are conscious and controlled, like our wife forcing a smile, and some are unconscious and automatic (Gross, 1998a).
Typically, research in how we regulate emotion is carried out by changing the input (the emotional stimulus) or the output (the emotional response). The former can be done by avoiding the stimulus altogether, changing the attention paid to it (for instance, by being distracted), or altering the emotional impact of the stimulus by reappraisal. Changing the output can be accomplished by intensifying, diminishing, prolonging, or curtailing the emotional experience, expression, or physiologic response (Gross, 1998b).
We are all well aware that people's emotional reactions and their ability to control them are notoriously variable. Sometimes this variation is due to an increased ability to consciously control emotion, and sometimes to an increased ability to automatically control emotion. Characteristic patterns of neural activity in the prefrontal and emotional appraisal systems have been found, both at rest and during emotional stimulation, that correlate with regulatory ability and with gender, personality, and negative affect (see The Cognitive Neuroscientist's Toolkit: Dimensions of Emotional Style).
For instance, one difference in emotion regulation appears to be related to differences in the resting activity of the right and left frontal lobes. Daren Jackson and his colleagues found that people who showed more left-sided than right-sided frontal activation at rest (measured with EEG) were better able to voluntarily suppress negative emotion (Jackson et al., 2000). Based on this finding, the team predicted that these same people would also suppress an emotion more readily automatically. They demonstrated this by showing that startle responses to unpleasant pictures were reduced following, but not during, picture presentation: A participant's negative emotion was generated by the stimulus, but it was suppressed more quickly in those with greater left-sided activation at rest (Jackson et al., 2003).

After obtaining a resting EEG, the researchers showed participants pleasant, unpleasant, or neutral images on a computer screen. Participants were instructed to watch each picture the entire time it was on the screen and not to look away or close their eyes. Meanwhile, their eyeblink startle magnitude was measured (with EMG) at intervals both during and after the presentation. The eyeblink startle reflex indexes the duration of the emotional response following emotional provocation: the smaller its magnitude, the smaller the emotional response (Davidson, 1998). Participants with greater left anterior EEG activation at rest showed attenuated startle magnitude following the negative stimuli. In contrast, these EEG asymmetries did not predict negative reactivity during picture presentation. The results suggest that the initial reaction to an emotional picture and the response that persists after the picture are mediated by dissociable mechanisms. The relation between resting frontal activation and emotional recovery following an aversive event supports the idea of a frontally mediated mechanism underlying one form of automatic emotion regulation. Relatively fast recovery after a negative-affect elicitor is one index of individual differences in automatic emotion regulation, that is, regulation that occurs in the absence of specific intentions to suppress negative emotion.
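The text does not spell out how the asymmetry measure is computed, but in this literature it is conventionally derived from alpha-band power at homologous left and right frontal electrodes (for example, F3 and F4); because alpha power varies inversely with cortical activation, ln(right alpha) − ln(left alpha) increases with relatively greater left frontal activation. The sketch below illustrates that convention with simulated data; the electrode names, band limits, and Welch spectral estimate are standard choices assumed here, not details taken from the studies above.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(eeg, fs, band=(8.0, 13.0)):
    """Mean power spectral density in the alpha band for one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # 2-second windows
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[in_band].mean()

def frontal_asymmetry(left_frontal, right_frontal, fs):
    """ln(right alpha) - ln(left alpha).

    Alpha power varies inversely with cortical activation, so larger
    (more positive) values indicate relatively greater LEFT frontal
    activation -- the resting pattern Jackson et al. linked to better
    automatic suppression of negative emotion.
    """
    return np.log(alpha_power(right_frontal, fs)) - np.log(alpha_power(left_frontal, fs))

# Simulated resting EEG standing in for left (e.g., F3) and right (e.g., F4)
# frontal channels: the right channel carries more 10-Hz alpha.
fs = 256
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
left = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 10 * t)
right = rng.normal(size=t.size) + 1.5 * np.sin(2 * np.pi * 10 * t)
print(f"asymmetry index = {frontal_asymmetry(left, right, fs):.2f} (> 0: greater left activation)")
```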
FIGURE 10.20 Diagram of the processing steps proposed by Ochsner and his colleagues for generating an emotion and how the emotional outcome might be regulated by cognitive control processes (blue box).
THE COGNITIVE NEUROSCIENTIST’S TOOLKIT
Dimensions of Emotional Style
For several decades, Richard Davidson (2012) has studied the different ways in which people respond to emotional events. He has formulated what he calls the six dimensions of emotional style, each grounded in a particular pattern of brain activity. Every person lands somewhere on each dimension, and taken together the six dimensions describe our emotional style. Davidson sees these styles as partly genetic but also, to some degree, plastic. The six dimensions are resilience, outlook, social intuition, self-awareness, sensitivity to context, and attention.
The capacity to control emotions is important for functioning in the world, especially the social world. We are so adept at controlling our emotions that we tend to notice only when someone does not: the angry customer yelling at the cashier, the giggler during the wedding ceremony, or a depressed friend overwhelmed with sadness. Indeed, disruptions in emotion regulation are thought to underlie mood and anxiety disorders.
Research into emotion regulation over the past couple of decades has concentrated on how and when regulation takes place. In 1998, James Gross at Stanford University proposed the model in Figure 10.21 to reconcile seemingly divergent conclusions in the psychological and physical health literatures on emotion regulation. The psychological literature indicated that it was healthier to control and regulate your emotions, whereas the physical health literature advanced the idea that chronically suppressing emotions such as anger resulted in hypertension and other physical ailments. Gross hypothesized that "shutting down" an emotion at different points in the emotion generation process would have different consequences, which could explain the divergent conclusions. To test his theory, he compared reappraisal, a form of antecedent-focused emotion regulation, with suppression, a response-focused form. Reappraisal is a cognitive-linguistic strategy that reinterprets an emotion-laden stimulus in nonemotional terms. For instance, a woman wiping the tears from her eyes could be crying because she is sad; or, on reappraisal, she may simply have something in her eye that she is trying to remove. Suppression is a strategy in which we inhibit an emotion-expressive behavior during an emotionally arousing situation (for instance, smiling when you are upset).

In the experiment, Gross showed participants a disgust-eliciting film under one of three conditions. In the reappraisal condition, they were to adopt a detached and unemotional attitude; in the suppression condition, they were to behave so that an observer could not tell they were feeling disgusted; in the third condition, they were simply asked to watch the film. While watching the film, participants were videotaped and their physiological responses were monitored. Afterward, they completed an emotion rating form. Both reappraisal and suppression reduced emotion-expressive behavior, but only reappraisal reduced the experience of disgust. Suppression, in contrast, increased sympathetic activation, leaving participants more aroused, and this heightened sympathetic activity persisted for a while after the film ended (Gross, 1998b). Continued research on emotion regulation has provided fMRI data that support Gross's hypothesis about the timing of reappraisal and suppression strategies (Goldin et al., 2008).
How does this behavior apply in the real world? Suppose you come home to find that your friend has dropped in and cleaned your house, and you start thinking, “How dare she! She should have asked,” and you feel yourself getting madder than a hornet. You now have three choices. You could wallow in your anger; you could suppress it by putting on a false front; or you could reappraise the situation. In the latter case you think, “Yeah, well, I hate cleaning. Now it looks spotless! This is great.” You start to feel good, and a smile lights up your face. You have just done a little cognitive reappraising and reduced your physiological arousal. This approach is good for your overall health.
FIGURE 10.21 James Gross’s proposed model of emotion.
Gross proposed a model in which emotions may be regulated either by manipulating the input to the system (antecedent-focused emotion regulation) or by manipulating its output (response-focused emotion regulation).
Conscious reappraisal reduces the emotional experience; this finding supports the idea that emotions are, to some extent, subject to conscious cognitive control. In an initial fMRI study of the cognitive control of emotion, Kevin Ochsner and his colleagues (2002) found that using reappraisal to decrease a negative emotion increased activity in the prefrontal cortex (PFC; implicated in cognitive control; see Chapter 12) and decreased activity in the amygdala, suggesting that the PFC modulates emotional activity in subcortical structures such as the amygdala. Reappraisal can make a bad situation seem better, but it can also make a bad situation seem worse (or a good situation seem bad). Would the same neural system be at work if a person enhanced an emotion, or would a different strategy engage a different regulatory system? Ochsner and his colleagues (2004) hypothesized that the cognitive control regions mediating reappraisal (the PFC) would modulate regions involved in appraising the emotional qualities of a stimulus (the amygdala). Thus, cognitive upregulation should be associated with greater amygdala activation and downregulation with less. They conducted an fMRI study of reappraisal that examined both making a bad situation better (downregulating negative emotion) and making a bad situation worse (upregulating negative emotion).
Participants in this study viewed negative images and were divided into two groups: a self-focused group and a situation-focused group. In the self-focused group, participants were instructed to imagine themselves or a loved one in the negative scene (increasing negative emotion), to view the pictures in a detached way (decreasing negative emotion), or, in the control condition, simply to look at the image. In the situation-focused group, they were told to increase emotion by imagining that the situation was becoming worse, to decrease emotion by imagining that it was getting better, or, again, simply to look at the image. Each participant then reported how effective and how effortful the reappraisal was. All participants reported success in both increasing and decreasing their emotions, but indicated that downregulation took more effort.
Which brain regions support the cognitive control of emotion brought about by reappraisal? The team found that whether negative emotions were enhanced or reduced, regions of the left lateral PFC involved in working memory and cognitive control (Chapter 12), along with the dorsal anterior cingulate cortex (dACC), implicated in the online monitoring of performance, were activated, suggesting that these regions evaluate and "decide on" the cognitive strategy (Figure 10.22). Other PFC regions were uniquely active. The dorsal medial PFC, implicated in self-monitoring and self-evaluation (Chapter 13), was active in both cases of self-focused reappraisal, but when downregulation was externally focused on the situation, it was the lateral PFC that turned on. During upregulation, the left rostromedial PFC and the posterior cingulate cortex (PCC), implicated in the retrieval of emotion knowledge, were active; downregulation instead activated a region associated with behavioral inhibition, the right lateral and orbital PFC. It appears, then, that different cognitive reappraisal goals and strategies activate some of the same PFC regions as well as some that differ.
FIGURE 10.22 Unique regions activate when increasing or decreasing emotion.
What about the amygdala? Amygdala activation was modulated up or down depending on the regulatory goal: Activity increased when the goal was to enhance negative emotion and decreased when the goal was to reduce it. This apparent modulation of the amygdala by prefrontal activity suggests that amygdala activity increases when the current processing goal fits the evaluative aspects of the stimulus (in this case, making a negative stimulus more negative), rather than tracking the valence (positive or negative) of the emotion itself.
Does cognitive control via reappraisal depend on interactions between the PFC regions that support cognitive control processes and subcortical networks that generate emotional responses, as we have been assuming? Today, a decade after this idea was presented, over 50 imaging studies support this hypothesis (Ochsner et al., 2012).
Although early research suggested that the amygdala was involved exclusively in the automatic processing of negative information, Ochsner's study and more recent research suggest otherwise. The amygdala appears to play a more flexible role, processing the relevance of stimuli in light of a person's current goals and motivation (Cunningham et al., 2005, 2008), a property known as affective flexibility. For instance, if you go to Las Vegas determined not to lose any money, your amygdala will be more active when you are losing money; if you go intent on winning money, it will be more active when you are winning. Amygdala processing, however, appears to be constrained by a negativity bias (Cunningham et al., 2008): Goal-dependent modulation is more pronounced for positive than for negative information, so negative information is processed less flexibly. PFC modulation cannot completely override responses to negative stimuli, but for survival (and your wallet) this is a good thing.
Emotion regulation research is in its adolescence. While much remains to be understood, the use of functional imaging coupled with behavioral studies has been fruitful. Much of the research so far has centered on the two cognitive strategies discussed here, reappraisal and suppression. Areas that still need to be addressed include the deployment of attention (such as ignoring a stimulus or being distracted from it) and alternative forms of regulation, such as situation selection (avoiding or seeking certain types of stimuli) and situation modification. Research is also needed to understand the processes behind the wide differences in people's emotional responses to situations and in their ability to regulate their emotions. A better understanding of emotion regulation will aid clinical interventions for impaired emotion regulation, which has been implicated in many psychiatric conditions, including depression, borderline personality disorder, social anxiety disorder, and substance abuse disorders (e.g., Denny et al., 2009).
TAKE-HOME MESSAGES
Other Areas, Other Emotions
We have seen that the amygdala is involved in a variety of emotional tasks, ranging from fear conditioning to social responses. But the amygdala is not the only area of the brain necessary for emotions. We consider these other areas next.
The Insular Cortex
FIGURE 10.23 The insula.
The insular cortex (or insula) is tucked between the frontal and temporal lobes in the Sylvian fissure (Figure 10.23). The insula has extensive reciprocal connections with limbic forebrain areas, such as the amygdala, medial prefrontal cortex, and anterior cingulate gyrus (Augustine, 1996; Craig, 2009). It also has reciprocal connections with frontal, parietal, and temporal cortical areas involved in attention, memory, and cognition (Augustine, 1996).
There is a significant correlation between insular activity and the perception of internal bodily states (Critchley, 2009; Pollatos et al., 2007), a function known as interoception. Interoceptive stimuli that activate the anterior insula include thirst, sensual touch, itch, distention of the bladder and intestinal tract, exercise, and heartbeat. The connections and activation profile of the insula suggest that it integrates visceral and somatic input to form a representation of the state of the body (Craig, 2009; Saper, 2002). Interestingly, people with a larger right insula are better at detecting their own heartbeats than are people with a smaller right insula (Critchley et al., 2004), and such people are also more aware of their emotions (L. Barrett et al., 2004).
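The text does not say which task Critchley and colleagues used to assess heartbeat detection, but one widely used laboratory index of interoception is the heartbeat-counting task, in which participants silently count their heartbeats over several intervals while the beats are recorded (e.g., by ECG). The sketch below shows the standard accuracy score for that task; the specific trial numbers are hypothetical and purely illustrative.

```python
def interoceptive_accuracy(recorded_beats, counted_beats):
    """Schandry-style accuracy for one interval: 1.0 for a perfect count,
    falling toward 0 as the silent count diverges from the number of
    heartbeats actually recorded."""
    return 1.0 - abs(recorded_beats - counted_beats) / recorded_beats

# Hypothetical data: (recorded beats, participant's silent count) for
# three counting intervals of different lengths.
trials = [(32, 29), (47, 40), (61, 58)]
mean_accuracy = sum(interoceptive_accuracy(r, c) for r, c in trials) / len(trials)
print(f"mean interoceptive accuracy = {mean_accuracy:.2f}")  # closer to 1 = better detection
```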
Several models of emotion propose that direct access to bodily states is necessary to experience emotion, and the insula may play a key role in this process. Suggestively, fMRI studies show that the anterior insula and anterior cingulate cortex are jointly active in participants experiencing emotional feelings, including maternal and romantic love, anger, fear, sadness, happiness, disgust, and trust. It appears, then, that the insula is active with all feelings, both physical (body states) and emotional, suggesting that it may be the junction where cognitive and emotional information are integrated. The insula's role as "body information central" is also indicated by its connections to networks across the cortex and to the amygdala, with its role in evaluating emotional stimuli (Craig, 2009; Critchley, 2009).
Insular activity has also been reported to be associated with evaluative processing, for instance, when people make risk-averse decisions: The riskier the decision, the more active the insula (Xue et al., 2010). Its activity is also associated with the perception of positive emotions in other people (Jabbi et al., 2007). Gary Berntson and his colleagues (2011) investigated the role of the insula in evaluative processing by examining valence and arousal ratings in response to picture stimuli. They compared the behavioral performance of three groups of participants: patients with lesions of the insula, a control-lesion group, and an amygdala-lesion group. All patients were asked to rate the positivity or negativity (valence) of each presented picture (from very unpleasant to very pleasant) and how emotionally arousing they found it to be.
The results showed that patients with insular lesions, compared with the control-lesion group, reported both reduced arousal (to unpleasant and pleasant stimuli alike) and reduced valence ratings. In contrast, the arousal ratings of patients with amygdala lesions were selectively attenuated for unpleasant stimuli, while their positive and negative valence ratings matched those of the control-lesion group. These findings are in line with an earlier study (Berntson et al., 2007), which found that patients with amygdala damage showed no arousal gradient across negative stimuli even though they displayed a typical arousal gradient to positive stimuli. These results were not attributable to an inability of the amygdala patients to process the hostile nature of the stimuli, because they accurately recognized and categorized both positive and negative features of the stimuli. Taken together, these results support the view that the insula plays a broad role in integrating affective and cognitive processes, whereas the amygdala has a more selective role in affective arousal, especially for negative stimuli (Berntson et al., 2011).
Casting the amygdala as a vigilant watchdog looking out for motivationally relevant stimuli (A. K. Anderson & Phelps, 2001; Whalen, 1998) may prove true, but just what is it watching out for? The answer to that question still eludes investigators. Another fMRI study found that the amygdala is more sensitive to valence than to arousal (Anders et al., 2008). A study mentioned previously reported that novel stimuli generated higher peak responses in the amygdala and activated it for longer than did familiar stimuli (Weierich et al., 2010). Obviously, the amygdala remains enigmatic.
Disgust
Disgust is one emotion that has been linked directly to the insula. This finding should be no surprise, given the insula’s role as the great perceiver of bodily states. Based on imaging studies, many cognitive neuroscientists agree that the anterior insula is essential for detecting and experiencing disgust (Phillips et al., 1997, 1998). This conclusion is consistent with a report of a patient who had insula damage and was unable to detect disgust conveyed in various modalities (Calder et al., 2000).
A study by Giacomo Rizzolatti (see mirror neurons in Chapter 8) and colleagues (Wicker et al., 2003) confirmed these findings and went a step further. These investigators compared the neural response when participants observed others experiencing disgust with the response when the participants experienced disgust firsthand. The same portion of the anterior insula was activated both when participants viewed expressions of disgust in others and when they smelled unpleasant odors (a firsthand experience of disgust). These results are significant for two reasons. First, they suggest that understanding the emotions of others may require simulating, and thus mildly experiencing, those emotions ourselves (Craig, 2009), implying a role for emotion in empathy and theory of mind (discussed in Chapter 13). Second, the results provide additional evidence that the insula is a neural correlate both of identifying disgust in others and of experiencing disgust directly.
Some have taken all of this evidence to mean that the anterior insula is the region of the brain that is essential for disgust. A large meta-analysis of fMRI studies done by Katherine Vytal and Stephan Hamann (2010) found that disgust consistently activated the inferior frontal gyrus and the anterior insula, and these regions reliably differentiated disgust from all other emotion states. In fact, these researchers’ analysis found locationist evidence for anger, fear, sadness, and happiness. In contrast, Kristen Lindquist and her colleagues (2012), in another large meta-analysis of multiple fMRI studies analyzed by a different method, did not find the insula to be consistently and specifically activated for the emotion of disgust. They found that although the anterior insula is more active during instances of disgust perception, anterior insula activation is observed in a number of tasks that involve awareness of body states, such as gastric distention, body movement, and orgasm. They also found that activation of the left anterior insula was more likely during incidents of anger than of any other emotion. Lindquist and colleagues suggest that the anterior insula plays a key but more general role in representing core affective feelings in awareness. They also found no evidence for a locationist view for the other brain regions. The debate continues.
Happiness
Over the last several years, a small but growing body of research has reported on the neural bases of happiness. It’s not easy to define what makes us happy, so it is a challenging emotion to study. Experimental methods used to study happiness have participants view happy faces, watch films, or try to induce a happy mood by various methods, but they have not been consistently reliable, valid, or comparable across studies. Because of these difficulties, only a few neuroimaging studies have focused on happiness (Habel et al., 2005). One group contrasted participants’ brain activity in response to smiling faces versus sad faces (Lane et al., 1997). In a separate fMRI study, 26 healthy male participants were scanned during sad and happy mood induction as well as while performing a cognitive task that functioned as the experimental control (Habel et al., 2005). Sad and happy moods produced similar activations in the amygdala–hippocampal area extending into the parahippocampal gyrus, prefrontal and temporal cortex, anterior cingulate, and the precuneus. Happiness produced stronger activations in the dorsolateral prefrontal cortex, the cingulate gyrus, the inferior temporal gyrus, and the cerebellum (Figure 10.24). These results reinforce the role of the limbic system and its connections in the processing and expression of positive emotions. Nonetheless, the study of happiness remains extremely challenging. For example, happiness is not necessarily the opposite of sadness. What’s more, happiness is not automatically induced by looking at smiling faces.
Freud equated happiness with pleasure, but others have suggested that it also requires achievement, whether cognitive, aesthetic, or moral. Psychologist Mihaly Csikszentmihalyi suggests that people are really happy when totally immersed in a challenging task that closely matches their abilities (Csikszentmihalyi, 1990). Csikszentmihalyi came to this conclusion after an experiment in which he had participants carry beepers that sounded at random several times a day. At the signal, they would pull out a notebook and jot down what they were doing and how much they were enjoying it. He found that there were two types of pleasure: bodily pleasures such as eating and sex, and, even more enjoyable, the state of being "in the zone," which Csikszentmihalyi calls flow. Flow is the process of having an optimal experience: It occurs when you are so absorbed in what you are doing that you forget about everything else. It could be riding the top of a wave, working out a theorem, or doing a tango across the dance floor. Flow involves a challenge that you are equal to, that fully engages your attention, and that offers immediate feedback at each step that you are on the right track and pulling it off. When both challenge and skill are high, the person is not only enjoying the moment but also stretching his or her capabilities, which improves the likelihood of learning new skills and of increasing both self-esteem and personal complexity (Csikszentmihalyi & LeFevre, 1989). The concept of flow suggests that the circuits involved in pleasure, reward, and motivation are essential to the emotion of happiness.
Love
Unlike studies of happiness, love experiments cannot use facial expressions as either a stimulus or a variable of interest. Indeed, as we noted previously, love is not characterized by any specific facial expression. Thus the "facial feedback hypothesis," which holds that facial expressions produce emotional experience as well as reflect it (Darwin, 1873; Ekman, 1992), cannot be applied to the study of love. Love scientists instead use stimuli that evoke the concept of the emotion rather than its visual expression, such as the names of loved ones. The subjective feelings of love that participants have for their beloved are usually evaluated with standard self-report questionnaires, such as the Passionate Love Scale (Hatfield & Rapson, 1987).
FIGURE 10.24 Common and different brain regions are activated with sadness and happiness.
Sad and happy moods produced similar activations, but differences emerged. In the sadness condition, there was greater activation in the left transverse temporal gyrus, bilaterally in the ventrolateral PFC, in the left ACC, and in the superior temporal gyrus. In the happiness condition, higher activation was seen in the right DLPFC, the left medial and posterior cingulate gyrus, and the right inferior temporal gyrus. It appears that negative and positive moods have distinct activations within a common network.
Stephanie Cacioppo (née Ortigue) and her colleagues (2010a) recently reviewed the fMRI studies of love to identify which brain networks are commonly activated when participants view love-related stimuli, regardless of whether the love felt is maternal, passionate, or unconditional (Figure 10.25).
Overall, love recruits a distributed subcortico-cortical reward, motivational, emotional, and cognitive system that includes dopamine-rich brain areas such as the insula, the caudate nucleus and putamen, the ventral tegmental area, the anterior cingulate cortex, the bilateral posterior hippocampus, the left inferior frontal gyrus, the left middle temporal gyrus, and the parietal lobe. This finding reinforces the assumption that love is more complex than a basic emotion. Notably, no activation of the amygdala has been reported in fMRI studies of love.
Interestingly, each type of love recruits a specific brain network. For instance, passionate love is mediated by a network localized within the limbic system and within higher-order brain areas sustaining cognitive functions such as self-representation, attention, and social cognition (Figure 10.26). Moreover, the reported length of time in love correlates with cerebral activation in particular regions: the right insular cortex, the right anterior cingulate cortex, the bilateral posterior cingulate cortices, the left inferior frontal gyrus, the left ventral putamen/pallidum, the left middle temporal gyrus, and the right parietal lobe (Aron et al., 2005).
FIGURE 10.25 Love activations encompass multiple brain regions.
On the other hand, while the maternal love circuit involves cortical and subcortical structures that overlap those observed with passionate love, one activation is not shared with passionate love: the subcortical periaqueductal (central) gray matter (PAG). Among love studies, activations in this region were observed mostly for maternal love, suggesting that PAG activation might be specific to it. This conclusion makes sense, because the PAG receives direct connections from the limbic emotional system and contains a high density of vasopressin receptors, which are important in maternal bonding (Ortigue et al., 2010a). Love is a complicated business, and it appears to light up much of the brain, but you didn't need an fMRI study to tell you that.
TAKE-HOME MESSAGES
FIGURE 10.26 Passionate love network.
Superimposed on lateral views of an average human cortical surface model are cortical networks specifically related to passionate love. Brain areas recruited are known to mediate emotion, motivation, reward, social cognition, attention, and self-representation.
Unique Systems, Common Components
It may be an oversimplification to associate each of the emotions we have addressed with a single brain structure. By surveying the various brain locations with which different emotions are associated, however, we have made it clear that no single brain area is responsible for all emotions (Figure 10.27). In a recent meta-analysis including 105 fMRI studies and 1,785 brain coordinates, yielding an overall sample of 1,600 healthy participants, Paulo Fusar-Poli and his colleagues (2009) demonstrated that the processing of emotional faces was associated with increased activation in a variety of visual, limbic, temporoparietal, and prefrontal brain areas. For instance, happy, fearful, and sad faces specifically activated the amygdala, whereas angry or disgusted faces had no clear effect on this region. Furthermore, in line with the clinical literature, amygdala sensitivity was greater for fearful than for happy or sad faces.
These results have been reinforced by the previously mentioned fMRI meta-analysis performed by Lindquist and her colleagues (2012), who delineated a so-called neural reference space for emotion (see Figure 10.28). A neural reference space is a region made up of sets of neurons that are probabilistically involved in realizing a class of mental events, in this case emotion. In these researchers' view, a set of neurons in this space is not specific to any emotion category but is somewhat like an ingredient that may or may not be used in a recipe. Lindquist and colleagues concluded that their results do not support a locationist hypothesis of amygdala function. They suggested instead that the amygdala is part of the distributed network that helps realize core affect, because it is involved in signaling salient stimuli (Adolphs, 2008, 2009; Whalen, 1998, 2007). This conclusion is consistent with a large body of evidence showing that the amygdala is consistently implicated in orienting responses to motivationally relevant stimuli (Holland & Gallagher, 1999), novel stimuli (e.g., Blackford et al., 2010; Breiter et al., 1996; Moriguchi et al., 2010; Wright et al., 2008), and unusual stimuli (e.g., Blackford et al., 2010). Similarly, compared with participants with intact amygdalae, individuals with amygdala lesions do not automatically allocate attention to aversive stimuli (A. K. Anderson & Phelps, 2001) or to socially relevant stimuli (D. P. Kennedy & Adolphs, 2010).
Although our earlier discussion of the amygdala focused on how this structure operates in isolation, this growing body of evidence suggests that much of the exciting research in the cognitive neuroscience of emotion is outlining how the amygdala works with other brain areas to produce normal emotional responses. For example, as we mentioned earlier, although acquisition of fear conditioning requires the amygdala, normal extinction of a conditioned response involves interactions between the amygdala and the prefrontal cortex (Morgan & LeDoux, 1999). These two structures may also be the key players in studies examining the ability to associate a reward with a stimulus (Baxter et al., 2000; Hampton et al., 2007). A neuroanatomical model of depression suggests that a circuit composed of the amygdala, orbitofrontal cortex, and thalamus is overactive in depressed patients, and that the structures in this circuit, working in concert, produce some of the symptoms of depression (Drevets, 1998). Finally, Damasio's somatic marker hypothesis proposes that the amygdala and orbitofrontal cortex interact and make unique contributions to emotional decision making.
FIGURE 10.27 Brain areas associated with various emotions.
FIGURE 10.28 The neural reference space for discrete emotions.
These are regions seen to be active consistently across studies of emotion experience or perception.
Together these findings clearly suggest that emotion research has shifted from identifying areas that specialize in a specific emotion to characterizing how these areas interact and determining if there are any interactions common to different types of emotional experience. To this end, some promising evidence suggests that the anterior cingulate cortex (ACC) could be essential for generalized emotional processing. One study found that emotional arousal while watching films and recalling various emotional experiences was associated with increased activity in the ACC (Lane et al., 1997). Furthermore, the ACC is known to receive projections from the amygdala, the OFC, and the anterior insula (Devinsky et al., 1995), thus making it plausible that the ACC is an essential component of common emotional circuitry. The ACC was also activated during recognition of facial expressions of disgust (Wicker et al., 2003), anger (Blair et al., 1999), happiness (Lane et al., 1997; Habel et al., 2005), and love (Ortigue et al., 2010a).
These observations are suggestive but are far from conclusive. As the study of emotion progresses, it will be essential to develop an understanding of how distant areas of the brain interact to facilitate the detection and experience of emotion.
TAKE-HOME MESSAGES
Summary
Scientists have attributed emotional states to brain processing for almost a century. Recently, however, case studies of intriguing impairments following bilateral amygdala damage, together with functional imaging studies that indicate how and where emotions are processed in the brain, have produced great strides toward characterizing the functional neuroanatomy of emotion.
Scientists face many challenges in studying emotion, a behavior that is often difficult to define and therefore difficult to manipulate and study scientifically. One challenge has been establishing a proper place for studies of emotion in cognitive neuroscience. Earlier research and theories tended to view emotion as separate from cognition, implying that they could be studied and understood separately. As research in the neuroscience of emotion proceeded, however, it became clear that emotion could not be considered independently from other, more “cognitive” abilities, or vice versa. The neural systems of emotion and other cognitive functions are interdependent. Although emotion, like all other behaviors, has unique and defining characteristics, current research strongly argues against a concrete emotion–cognition dichotomy.
Studies in the cognitive neuroscience of emotion have tended to emphasize the importance of the amygdala. Our understanding of the role of the amygdala in emotion has been influenced significantly by research with nonhuman animals. In both humans and other species, the amygdala plays a critical role in implicit emotional learning, as demonstrated by fear conditioning. In addition, through interactions with the hippocampus, the amygdala is involved in explicit emotional learning and memory. We have seen that the amygdala is also involved with decision making, attention, and perception. It is also prominently involved in social interactions, enabling us to automatically derive information from the eyes of other people when assessing facial expressions and facilitating categorization of other individuals.
The amygdala is no longer the sole focus of research seeking to characterize the neural correlates of emotion. Different emotions are associated with other neural structures, including the orbitofrontal cortex (anger), the angular gyrus (passionate love), and the insula (disgust). Despite the success of relating these structures to various emotions, an emerging shift in our approach to studying the cognitive neuroscience of emotion is transferring the emphasis from the study of isolated neural structures to the investigation of neural systems. Certainly the amygdala, orbitofrontal cortex, and insula are critical for different forms of emotional processing. But it is now clear that to understand how the brain produces normal and adaptive emotional responses, we need to understand how these structures interact with each other and with other brain regions.
Key Terms
affective flexibility
amygdala
attentional blink
basic emotion
complex emotion
core affect
dimensions of emotion
emotion
emotion generation
emotion regulation
emotional stimulus
facial expression
fear conditioning
feeling
flow
insula
interoception
orbitofrontal cortex (OFC)
reappraisal
somatic marker
suppression
Thought Questions
Suggested Reading
Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433, 68–72.
Blair, R. J. R., Morris, J. S., Frith, C. D., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883–893.
Damasio, A. R. (1994). Descartes’ error: Emotion, reason, and the human brain. New York: Putnam.
Davidson, R. J. (2012). The emotional life of your brain. New York: Hudson Street Press.
LeDoux, J. E. (2012). Rethinking the emotional brain. Neuron, 73(4), 653–676.
Ochsner, K., Silvers, J., & Buhle, J. T. (2012). Functional imaging studies of emotion regulation: A synthetic review and evolving model of the cognitive control of emotion. Annals of the New York Academy of Sciences, 1251, E1–E24.
Ortigue, S., Bianchi-Demicheli, F., Patel, N., Frum, C., & Lewis, J. (2010). Neuroimaging of love: fMRI meta-analysis evidence towards new perspectives in sexual medicine. Journal of Sexual Medicine, 7, 3541–3552.
Rolls, E. T. (1999). The brain and emotion. Oxford, England: Oxford University Press.
Sapolsky, R. M. (1992). Stress, the aging brain, and the mechanisms of neuron death. Cambridge, MA: MIT Press.
Whalen, P. J. (1998). Fear, vigilance, and ambiguity: Initial neuroimaging studies of the human amygdala. Current Directions in Psychological Science, 7, 177–188.