
Journal of Personality and Social Psychology
1998, Vol. 74, No. 1, 272-279
Copyright 1998 by the American Psychological Association, Inc. 0022-3514/98/$3.00

Effects of Self-Generated Facial Expressions on Mood


Chris L. Kleinke, Thomas R. Peterson, and Thomas R. Rutledge
University of Alaska Anchorage

Two experiments were conducted in which participants looked at photographs (Experiment 1, n = 129) or slides (Experiment 2, n = 90) of people engaging in positive or negative facial expressions. Participants attempted to communicate these facial expressions as accurately as they could to a video camera while viewing themselves in a mirror or without viewing themselves in a mirror. Participants in a control group maintained neutral facial expressions. Participants experienced increased positive moods when they engaged in positive facial expressions and decreased positive moods when they engaged in negative facial expressions. These effects were enhanced when participants viewed themselves in a mirror. The effects of facial expressions on positive affect were stronger for participants with high private self-consciousness. Results were integrated with research identifying individuals who are responsive to self-produced versus situational cues and with theory and research on self-awareness.

Author note: Chris L. Kleinke, Thomas R. Peterson, and Thomas R. Rutledge, Department of Psychology, University of Alaska Anchorage. Thomas R. Peterson is now a doctoral student at the University of New Mexico, and Thomas R. Rutledge is now a doctoral student at the University of British Columbia, British Columbia, Canada. We express appreciation to Anne Lazenby and Rae Thompson for their assistance in conducting the studies. Correspondence concerning this article should be addressed to Chris L. Kleinke, Department of Psychology, University of Alaska Anchorage, Anchorage, Alaska 99508-8224.

Recognition that facial expressions are intimately related to emotional experience has a long and interesting history (Adelmann & Zajonc, 1989; Izard, 1990). References to the process of intensifying one's emotions through amplification of expressions can be found in the writings of Homer and Shakespeare (Izard, 1990). Darwin (1872/1965) and William James (1890/1950) both included facial expressions as an important component in their theories of emotion. More recent theories focusing on the role of facial feedback in regulating emotions were developed by Tomkins (1962, 1963), Gellhorn (1964), Izard (1971, 1977), and Zajonc, Murphy, and Inglehart (1989).

During the past 20 years, researchers have attempted to experimentally demonstrate that facial expressions influence emotional experience. One approach has been to show that research participants' emotional experiences in various situations are decreased when they minimize their facial expressions. Another approach has been to show that research participants' emotional experiences are enhanced when they accentuate their facial expressions. Because of the necessity to maintain strict experimental controls, studies using these two approaches have had various degrees of success in capturing the spontaneity of naturally occurring emotions that was postulated in the original facial feedback theories. However, taken as a whole, these studies have supported facial feedback theories of emotion by confirming that research participants' emotional experiences are modified when their facial expressions are altered (for reviews, see Adelmann & Zajonc, 1989; Izard, 1990; Manstead, 1988).

Our experiments were designed to contribute to research testing the facial feedback hypothesis by using a different method of manipulating facial expressions than has been used in the past. Research participants were instructed to look at photographs or slides of people engaging in various facial expressions and to communicate these expressions as accurately as possible to a video camera. This methodology differs from that of McCaul, Holmes, and Solomon (1982), who instructed participants to portray specific emotions with their faces, because it minimizes experimental demands (see Laird, 1974, p. 477). It was predicted that participants who emulated the facial expressions would experience the emotions associated with these facial expressions to a greater degree than would participants in a control group who were instructed to look at the photographs or slides while maintaining a neutral facial expression. To place the experiments reported here in perspective, it will be helpful to provide a brief review of studies in which research participants' facial expressions were experimentally manipulated.

Laird and his colleagues induced research participants to engage in positive and negative facial expressions by attaching electrodes to their faces and asking them to tense or relax particular facial muscles as part of a study of "the activity of facial muscles" (Duclos et al., 1989; Duncan & Laird, 1977; Laird, 1974; Laird & Crosby, 1974; Laird, Wagener, Halal, & Szegda, 1982). In general, research participants reacted with more positive moods when they engaged in positive facial expressions and with more negative moods when they engaged in negative facial expressions. Similar results using this methodology were reported by McArthur, Solomon, and Jaffe (1980), Rhodewalt and Comer (1979), and Rutledge and Hupka (1985).

A somewhat less intrusive method of manipulating facial expressions was designed by instructing research participants to hold a pen with their teeth (smiling expression) or with their lips (frowning expression; Martin, Harlow, & Strack, 1992; Strack, Martin, & Stepper, 1988). Research participants reacted more positively to stimulus materials when they were induced to smile than when they were induced to frown. Larsen, Kasimatis, and Frey (1992) used a similar method in which research participants were instructed to pull together (frown) or keep apart (neutral) two golf tees that had been attached to their forehead. Participants reacted with more negative moods when they had been induced to frown.

Three studies used even more naturalistic methods of manipulating facial expressions. Kleinke and Walton (1982) provided positive reinforcement to a group of research participants whenever they smiled during an interview. Smile-reinforced participants reacted with more positive moods than did participants who were given noncontingent reinforcement. Research participants in a study by Kraut (1982) were instructed to engage in positive or negative facial expressions while smelling odors that ranged from pleasant to unpleasant. Participants' ratings of the odors were influenced mainly by the odor but were moderated by their facial expressions. Zajonc et al. (1989) manipulated facial expressions (and resulting facial temperature) by having research participants read stories containing phonemes whose pronunciation requires the movement of different facial muscles. Participants reacted with more positive moods when the phonemes they read resulted in facial expressions that facilitated a cooling of the face.

In a critique of research manipulating facial expressions, Winton (1986) pointed out that most studies tested a dimensional model of the facial feedback hypothesis in which negative facial expressions resulted in generally negative moods and positive facial expressions resulted in generally positive moods. Since Winton's critique, Duclos et al. (1989) performed a categorical study in which specific facial expressions were shown to elicit specific emotions. Matsumoto (1987) argued that although the results of facial manipulation studies supporting the facial feedback hypothesis were statistically significant, their effect sizes were generally low. Izard (1990) reanalyzed Matsumoto's data and found that the effect sizes for studies using naturalistic manipulations of facial expressions were fairly high (r = .457), whereas the effect sizes for studies using experimenter-manipulated facial expressions were on the lower side (r = .275). Izard concluded that the effects of experimenter-manipulated facial expressions on moods, although generally reliable, are weak for the following reasons: (a) the innervation of spontaneous and voluntary (manipulated) facial expressions involves different neural pathways, (b) connections between voluntary (manipulated) facial expressions and emotions are moderated by learning, (c) manipulated facial expressions may not be congruent with the situation, and (d) the experimental manipulation may be perceived by research participants as untenable and intrusive.

In addition to using a different method of modifying facial expressions than has been used in the past, our experiments were designed to make the following specific contributions to the research examining the facial feedback hypothesis:

1. Laird and his colleagues (Duclos et al., 1989; Duncan & Laird, 1977; Laird & Crosby, 1974; Laird et al., 1982) concluded that people differ in their propensity to respond to self-produced emotional cues. In Experiment 1, individual differences were assessed with the Revised Self-Consciousness Scale (Scheier & Carver, 1985). People with high private self-consciousness are very cognizant of their thoughts, feelings, and moods (Carver & Scheier, 1978; Scheier & Carver, 1977). It was therefore predicted that participants with high scores on private self-consciousness would be more influenced by their facial expressions than would participants with low scores on private self-consciousness.

2. Laird (1974) hypothesized that an important mechanism mediating the effects of facial expressions on emotions is self-perception (Bem, 1972). An attempt was made to look specifically at the effects of self-perception by including a treatment group in which participants observed themselves in a mirror. Self-perception theory would predict that facial expressions would have a greater effect on moods when the mirror was present because of participants' heightened awareness of their behaviors. Research and theory on self-focused attention (Duval & Wicklund, 1972; Wicklund, 1975) would also predict a stronger effect of facial expressions on mood when participants viewed themselves in a mirror because participants would be more aware of their moods (Carver & Scheier, 1978; Scheier & Carver, 1977).

Experiment 1

Method

Participants

Participants were 73 women and 58 men who were recruited as volunteers from various undergraduate courses. All participants were Caucasian. Participants ranged in age from 17 to 58 years (M = 27.2 years, Mdn = 25 years, SD = 9.13 years).

Experimental Design

The experiment was described as a study of how accurately people can communicate facial expressions. Participants in the two experimental groups were asked to look at a series of photographs of men and women with posed facial expressions and to communicate the emotional expression of the person in the photograph as accurately as they could with their own facial expressions. It was explained to participants that their facial expressions would be videotaped and then shown to students in a subsequent study with the purpose of finding out how accurately these students could judge which emotional expressions the present participants were communicating.

Participants in the expression-mirror group were instructed to emulate the facial expressions of the people in the photographs as accurately as possible. They were provided with a mirror so they could match their facial expressions with those of the people in the photographs. Participants in the expression group were instructed to emulate the facial expressions of the people in the photographs as accurately as possible. They did not have a mirror for observing their own facial expressions. Participants in the control group were instructed to maintain a neutral facial expression throughout the study. Their task was to make it impossible for people observing their videotaped facial expressions to guess which photographs they were observing. Half of the participants were given photographs of people engaging in positive facial expressions and half of the participants were given photographs of people engaging in negative facial expressions.

In addition to being randomly assigned to one of three experimental groups and one of two types of facial expression, participants were divided into groups of high versus low private self-consciousness. This resulted in a 2 (participant sex) × 3 (treatment group) × 2 (positive vs. negative facial expression) × 2 (high vs. low private self-consciousness) factorial design.
Instruments

MAACL-R. The Multiple Affect Adjective Check List Revised (MAACL-R; Zuckerman & Lubin, 1985) was used to assess participants' affective states. Participants were instructed to complete the MAACL-R according to "how you are feeling at this moment." The two MAACL-R scales chosen for analysis in this study were Positive Affect and Dysphoria. Participants completed the MAACL-R at the beginning and at the conclusion of the experiment.

Brief Symptom Inventory. The Brief Symptom Inventory (BSI) is a brief form of the SCL-90-R (Derogatis, 1975, 1977), measuring nine symptom dimensions and providing three global indexes of distress. The General Severity Index was used in our study because it is the most sensitive of the global indexes (Derogatis & Spencer, 1982). Participants completed the BSI at the conclusion of the experiment. The BSI was included as a measure in Experiment 1 because research studies have found that people's moods have an effect on their self-reported health and well-being (Croyle & Uretsky, 1987; Salovey & Birnbaum, 1989). It was of interest to determine whether changes in participants' moods resulting from their facial expressions would influence their reports of psychological distress.

Self-Consciousness Scale. The Revised Self-Consciousness Scale (Scheier & Carver, 1985) was used to measure participants' private self-consciousness. On the basis of the nine items measuring private self-consciousness (e.g., "I'm always trying to figure myself out"; "I think about myself a lot"; "I'm constantly thinking about my reasons for doing things"), participants were divided at the median (Mdn = 15) into two groups: those high and those low in private self-consciousness.

Photographs of Positive and Negative Facial Expressions

The photographs were 3 × 5 in. color prints showing shoulder-to-head views of men and women with intentionally posed facial expressions reflecting either positive (12 photographs) or negative (12 photographs) emotions. The positive facial expressions were characterized by smiling and an attempt to look pleased or happy. The negative facial expressions were characterized by frowning and an attempt to look angry, displeased, or disgusted. The positive versus negative emotions expressed in the photographs were unambiguous. Twenty judges sorted the photographs into groups reflecting positive, neutral, or negative emotions. All judges placed the 12 positive photographs in the positive group and the 12 negative photographs in the negative group. No photographs were placed in the neutral group.

Procedure

Participants were seated in a comfortably decorated 10 × 10 ft room at a small table holding experimental materials and a cassette audiotape player. A video camera on a tripod in full view was oriented to record participants' facial expressions. The experimenter observed participants on a video monitor in an adjacent room to ensure that they followed experimental instructions. In the expression-mirror condition, a 9 × 12 in. mirror was set up on the table so participants could look directly at themselves.

Participants read and signed an informed consent and completed the MAACL-R (pretest) and the Revised Self-Consciousness Scale. After this, they listened to tape-recorded instructions describing the experiment and instructing them about what they were expected to do. The experimenter then answered questions and left the participants alone in the room until the experiment was completed. At this time, participants turned on the tape recorder, which instructed them to pick up the first photograph and to match their facial expression to the expression of the person in the photograph for 15 s (expression group), to match their expression for 15 s with the aid of the mirror (expression-mirror group), or to maintain a neutral facial expression for 15 s (control group). After 15 s, participants were instructed to put Photograph 1 down and to pick up Photograph 2, and so on. There was a 5-s period between each photograph. The tape-recorded instructions guided participants through all 12 photographs and then told participants to complete the remaining paperwork, which consisted of the MAACL-R (posttest) and the BSI.

When participants had completed the experiment, the experimenter returned, answered questions, and conducted a brief interview to determine whether participants might have been influenced by experimental demands (Orne, 1962). Participants were first asked to describe their experiences during the study. They were then questioned about whether they noticed a connection between their facial expressions and their feelings. Finally, participants were prompted to report whether they were aware that the study was intended to influence their moods. After the poststudy interview, participants were debriefed and told how they could obtain results of the study.

Demand Characteristics and Participant Compliance

Although some participants expressed awareness of a connection between their facial expressions and feelings, no participant reported knowledge of the purpose of the study. Two participants (one man and one woman) apparently found humor in some of the negative facial expressions in the photographs and they smiled. Data from these participants were deleted, resulting in data from 72 women and 57 men.

Results

Positive Affect

An analysis of variance (ANOVA) on residual change scores for MAACL-R Positive Affect identified a significant Treatment Group × Facial Expression interaction, F(2, 105) = 6.24, p < .003, r = .24, and a significant Treatment Group × Facial Expression × Private Self-Consciousness interaction, F(2, 105) = 3.94, p < .023, r = .19.

Data in Figure 1 indicate that participants reported increased positive affect when they engaged in positive facial expressions and decreased positive affect when they engaged in negative facial expressions. This effect was particularly strong when participants viewed themselves in the mirror. Contrasts between residual change scores of participants who communicated positive versus negative facial expressions were as follows: F(1, 105) = 4.01, p < .05, r = .19, for the expression condition, and F(1, 105) = 22.7, p < .001, r = .41, for the expression-mirror condition.

The Treatment Group × Facial Expression × Private Self-Consciousness interaction is outlined in Table 1. Note that the influence of posed facial expressions (particularly negative expressions) on positive affect was more pronounced for participants with high private self-consciousness than it was for participants with low private self-consciousness.

Dysphoria

An ANOVA on residual change scores for MAACL-R Dysphoria did not find any significant main effects or interactions.

Brief Symptom Inventory

An ANOVA on the General Severity Index of the BSI identified only a significant main effect for private self-consciousness, F(1, 107) = 6.08, p < .02, r = .24. Participants with high private self-consciousness claimed to experience more symptoms of distress than did participants with low private self-consciousness (Ms = 0.82 and 0.55, respectively).
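For readers who want to reproduce this style of analysis, the sketch below shows, under stated assumptions, how residual change scores and the 2 × 3 × 2 × 2 between-subjects ANOVA described above could be computed. The article does not report which software it used and its raw data are not available, so the data frame, column names, and synthetic values here are purely illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Synthetic stand-in data (hypothetical; the article's raw data are not published).
rng = np.random.default_rng(0)
n = 128
df = pd.DataFrame({
    "sex": rng.choice(["female", "male"], n),
    "group": rng.choice(["control", "expression", "expression_mirror"], n),
    "expression": rng.choice(["positive", "negative"], n),
    "psc": rng.choice(["low", "high"], n),          # private self-consciousness split
    "pre": rng.normal(10, 3, n),                    # pretest MAACL-R Positive Affect
})
df["post"] = df["pre"] + rng.normal(0, 2, n)        # posttest score

# Residual change scores: regress posttest on pretest and keep the residuals,
# so each score reflects change not linearly predictable from baseline mood.
df["resid_change"] = smf.ols("post ~ pre", data=df).fit().resid

# 2 (sex) x 3 (treatment group) x 2 (expression valence) x 2 (private
# self-consciousness) between-subjects ANOVA on the residual change scores.
model = smf.ols(
    "resid_change ~ C(sex) * C(group) * C(expression) * C(psc)", data=df
).fit()
print(anova_lm(model, typ=2))
```

Residual change scores, computed this way, are a common alternative to simple posttest-minus-pretest difference scores because they remove the portion of the posttest that is predictable from the baseline measurement.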
[Figure 1. Experiment 1: Mean residual change scores on Multiple Affect Adjective Check List Revised (MAACL-R) Positive Affect, plotted separately for participants communicating positive versus negative expressions in the control, expression, and expression-mirror groups (ns = 20-22 per group).]

Table 1
Experiment 1: Mean Residual Change Scores for MAACL-R Positive Affect

                                   Control      Expression    Expression-mirror
Low private self-consciousness
  Positive expressions             .30 (14)      .80 (12)        .99 (12)
  Negative expressions             .33 (10)     -.37 (10)       -.77 (11)
High private self-consciousness
  Positive expressions             .04 (10)      .96 (10)       2.67 (10)
  Negative expressions             .20 (11)    -1.23 (10)      -5.20 (9)

Note. Cell sample sizes are in parentheses. MAACL-R = Multiple Affect Adjective Check List Revised.

Discussion

Experiment 1 supported the theory that facial expressions can influence emotions by showing that participants experienced increased positive moods when they engaged in positive facial expressions and decreased positive moods when they engaged in negative facial expressions. The effect of facial expressions on mood was even stronger when participants observed themselves in a mirror. This finding is in line with predictions from self-perception theory (Laird, 1974, 1984) and from research and theory on self-awareness (Carver & Scheier, 1978; Scheier & Carver, 1977).

Participants with high scores on private self-consciousness were more influenced by their facial expressions than were participants with low scores on private self-consciousness. This finding is consistent with research identifying individual differences in people's propensity to use self-produced versus situational cues for inferring their attitudes and emotions (Duncan & Laird, 1977; Laird & Crosby, 1974; Laird et al., 1982). It also supports studies showing that people with high private self-consciousness are more responsive to mood-inducing experiences than are people with low private self-consciousness (Carver & Scheier, 1978; Scheier, 1976; Scheier & Carver, 1977).
There were no main effects or interactions involving participant sex, indicating that men and women responded similarly to the facial expression manipulations. Although self-generated facial expressions influenced participants' positive moods, they did not affect participants' negative moods or participants' reports of psychological distress on the BSI. One problem with the MAACL-R Dysphoria scale was that its range was very limited. The modal score was zero, and few participants had prescores or postscores greater than three. The range of scores on the General Severity Index of the BSI was also limited. The modal score was .07, and 90% of the participants had scores less than one.

Experiment 2

Experiment 2 was designed to replicate Experiment 1 with the following modifications. First, the Positive and Negative Affect Schedule (PANAS; Watson, Clark, & Tellegen, 1988) was substituted for the MAACL-R for measuring self-reported mood. The main reason for using the PANAS scale was to replicate Experiment 1 with a different mood measure. The PANAS scale is briefer and more focused on positive versus negative affectivity than the MAACL-R. A second modification was the use of slides of people making facial expressions that had been standardized by experts in the field (Ekman & Friesen, 1976). Participants in the positive expression condition looked at slides of people expressing happiness. Participants in the negative expression condition looked at slides of people expressing sadness, fear, and anger. Because of its limited range of scores with our participants, the BSI was not used in Experiment 2.

Method

Participants

Participants were 50 women and 40 men who were recruited as volunteers from various undergraduate courses. All participants were Caucasian. Participants ranged in age from 18 to 65 years (M = 29.5 years, Mdn = 25.0 years, SD = 10.9 years).

Experimental Design

The same instructions and experimental groups were used as in Experiment 1, with the exception that participants did not complete the Revised Self-Consciousness Scale. This resulted in a 2 (participant sex) × 3 (treatment group) × 2 (positive vs. negative facial expression) factorial design.

Instruments

Participants completed the PANAS scale (Watson et al., 1988) before and after the experiment. The PANAS scale includes 10 positive and 10 negative moods, which are rated by respondents according to how they feel at this moment on a 5-point scale from 1 (very slightly or not at all) to 5 (extremely). Scores for positive affect and negative affect are computed by averaging the respective ratings for positive and negative moods.
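As a concrete illustration of that scoring rule, the short sketch below averages the ten positive and ten negative PANAS item ratings into the two scale scores. The item labels are those listed by Watson et al. (1988); the data-frame layout and example values are assumptions made for this sketch, not anything specified in the article.

```python
import pandas as pd

# PANAS item labels as listed by Watson, Clark, & Tellegen (1988).
POSITIVE_ITEMS = ["interested", "excited", "strong", "enthusiastic", "proud",
                  "alert", "inspired", "determined", "attentive", "active"]
NEGATIVE_ITEMS = ["distressed", "upset", "guilty", "scared", "hostile",
                  "irritable", "ashamed", "nervous", "jittery", "afraid"]

def score_panas(ratings: pd.DataFrame) -> pd.DataFrame:
    """Average the ten positive and ten negative item ratings (1-5 scale),
    mirroring the averaging rule described in the Instruments section."""
    return pd.DataFrame({
        "positive_affect": ratings[POSITIVE_ITEMS].mean(axis=1),
        "negative_affect": ratings[NEGATIVE_ITEMS].mean(axis=1),
    })

# Example: one administration with ratings of 3 on every positive item and
# 1 on every negative item yields positive affect = 3.0, negative affect = 1.0.
example = pd.DataFrame([{**{item: 3 for item in POSITIVE_ITEMS},
                         **{item: 1 for item in NEGATIVE_ITEMS}}])
print(score_panas(example))
```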
Slides of Positive and Negative Facial Expressions

The slides used were the Pictures of Facial Affect developed by Ekman and Friesen (1976). These slides represent men and women posing specific emotions with their facial expressions. Twelve slides were selected for positive emotions and twelve slides were selected for negative emotions. The slides for positive facial expressions were standardized by Ekman and Friesen as communicating happiness. The slides for negative facial expressions were standardized by Ekman and Friesen as communicating sadness, fear, or anger.

Procedure

The procedure was modeled after the procedure used in Experiment 1. The only differences were as follows: (a) Participants pressed the button on a slide projector to view the pictures of people making positive versus negative facial expressions, and (b) participants in the expression-mirror and expression groups were instructed to emulate the facial expression of the person in the slide for 10 s. There was a 10-s pause between slides. One other difference from Experiment 1 was that instead of having a video camera set up on a tripod to record participants' facial expressions, participants were told that there was a video camera photographing them from behind a one-way mirror. Participants could not see themselves in the reflection of the one-way mirror. The experimenter sat behind the one-way mirror to ensure that participants followed instructions.

Demand Characteristics and Participant Compliance

Although some participants expressed awareness of a connection between their facial expressions and feelings, no participant reported knowledge of the purpose of the study. All participants followed instructions and engaged in the appropriate facial expressions.

Results

Positive Affect

An ANOVA on residual change scores for PANAS Positive Affect identified a significant Treatment Group × Facial Expression interaction, F(2, 78) = 3.12, p < .05, r = .20. Data in Figure 2 indicate that participants reported increased positive affect when they engaged in positive facial expressions and decreased positive affect when they engaged in negative facial expressions. This effect was particularly strong when participants viewed themselves in the mirror. Contrasts between residual change scores of participants who communicated positive versus negative facial expressions were as follows: F(1, 78) = 3.11, p < .10, r = .20, for the expression condition, and F(1, 78) = 5.65, p < .05, r = .26, for the expression-mirror condition.
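The r values reported alongside the F tests in both experiments appear consistent with the common conversion r = sqrt(F / (F + df_error)); the article never states which formula it used, so the snippet below is only an assumption-labeled spot check against the reported numbers rather than a description of the authors' actual computation.

```python
from math import sqrt

def r_from_f(f_value: float, df_error: int) -> float:
    """Effect-size correlation implied by an F test: r = sqrt(F / (F + df_error))."""
    return sqrt(f_value / (f_value + df_error))

# Spot checks against values in the Results sections (reported r in comments):
print(round(r_from_f(6.24, 105), 2))  # 0.24 -> reported r = .24 (Exp. 1 interaction)
print(round(r_from_f(4.01, 105), 2))  # 0.19 -> reported r = .19 (Exp. 1, expression contrast)
print(round(r_from_f(22.7, 105), 2))  # 0.42 -> reported r = .41 (Exp. 1, expression-mirror contrast)
print(round(r_from_f(3.12, 78), 2))   # 0.20 -> reported r = .20 (Exp. 2 interaction)
print(round(r_from_f(5.65, 78), 2))   # 0.26 -> reported r = .26 (Exp. 2, expression-mirror contrast)
```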
Negative Affect

An ANOVA on residual change scores for PANAS Negative Affect did not find any significant main effects or interactions. The range of scores on PANAS Negative Affect was very limited. The modal score was one, and over 90% of the participants had scores less than two.

General Discussion

Experiments 1 and 2 both supported the theory that facial expressions can influence emotions by showing that participants experienced increased positive moods when they engaged in positive facial expressions and decreased positive moods when they engaged in negative facial expressions. No effects were found for facial expressions on negative moods, largely because the range of scores on negative moods was very limited.
[Figure 2. Experiment 2: Mean residual change scores on Positive and Negative Affect Schedule (PANAS) Positive Affect, plotted separately for participants communicating positive versus negative expressions in the control, expression, and expression-mirror groups. There were 15 participants in each group.]

In both experiments, participants in the control group were not influenced by the facial expressions of people in the photographs and slides. It appears that mimicry of the facial expressions was inhibited by the instructions given to control participants to maintain neutral facial expressions throughout the experiment (Hatfield, Cacioppo, & Rapson, 1992, 1993).

Our experiments were not intended to test the specific mechanisms underlying the facial feedback hypothesis (Adelmann & Zajonc, 1989; Izard, 1990; Zajonc et al., 1989). They do demonstrate, however, that the effects of facial expressions on self-reported mood are enhanced when people observe themselves making these expressions in a mirror. The finding that facial expressions had stronger effects on self-reported mood in the mirror condition is compatible with theories of self-perception (Bem, 1972; Laird, 1974, 1984) and self-awareness (Duval & Wicklund, 1972; Wicklund, 1975). Application of self-awareness theory to Experiments 1 and 2 is cross-validated by the fact that the effects of facial expressions on mood were greater for participants with high private self-consciousness as well as for participants whose self-consciousness was experimentally enhanced with a mirror (Carver & Scheier, 1978, p. 326; Scheier & Carver, 1980, p. 398).

The finding that participants in the expression-mirror group were most influenced by their facial expressions could be due to the fact that the mirror enabled participants to emulate the facial expressions more successfully. We rated participants' facial expressions (0 = neutral, 1 = close to stimulus, 2 = very close to stimulus) and found no difference between participants in the expression and expression-mirror groups (scores for participants in the control group were all zero). However, because of the cursory nature of our ratings, it remains for future research to determine whether a mirror enhances the ability to emulate facial expressions.

Izard (1990) explained that the facial feedback hypothesis has been investigated by two types of studies: those that use experimenter-manipulated expressions and those that use spontaneous, self-initiated expressions. The manipulation of participants' facial expressions in our experiments was not as artificial as those used in many studies, but it was also not fully spontaneous. Our experiments were limited in demonstrating a dimensional form of the facial feedback hypothesis (Winton, 1986), and the effect sizes were modest (Matsumoto, 1987). The experiments contribute to the literature on facial expressions and emotions by demonstrating a different method for showing that facial expressions can influence moods and by demonstrating that the influence of facial expressions on moods is stronger for people with high (measured or manipulated) self-awareness.

In future research, it will be useful to learn more about individual differences in people's propensities to be affected by their facial expressions. Laird and his colleagues were able to distinguish between people who are responsive to self-produced versus situational cues on the basis of their responses to experimenter-posed facial expressions (Duclos et al., 1989; Duncan & Laird, 1977; Laird & Crosby, 1974; Laird et al., 1982). Experiment 1 indicated that participants with high scores on private self-consciousness were more influenced by their facial expressions than were participants with low scores on private self-consciousness. Rutledge and Hupka (1985) found no relation between participants' responses to their facial expressions and their scores on scales measuring affective communication, self-monitoring, mental imagery, and self- versus situational orientation. It is likely that private self-consciousness is a better predictor of sensitivity to one's facial expressions than the measures used by Rutledge and Hupka because private self-consciousness focuses specifically on awareness of personal thoughts and feelings. It will be of interest to include measures and manipulations of self-awareness in future facial feedback research.
References

Adelmann, P. K., & Zajonc, R. B. (1989). Facial efference and the experience of emotion. Annual Review of Psychology, 40, 249-280.
Bem, D. J. (1972). Self-perception theory. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 6, pp. 1-62). New York: Academic Press.
Carver, C. S., & Scheier, M. F. (1978). Self-focusing effects of dispositional self-consciousness, mirror presence, and audience presence. Journal of Personality and Social Psychology, 36, 324-332.
Croyle, R. T., & Uretsky, M. B. (1987). Effects of mood on self-appraisal of health status. Health Psychology, 6, 239-253.
Darwin, C. R. (1965). The expression of the emotions in man and animals. Chicago: University of Chicago Press. (Original work published 1872)
Derogatis, L. R. (1975). Brief Symptom Inventory. Baltimore: Clinical Psychometric Research.
Derogatis, L. R. (1977). The SCL-90 manual I: Scoring, administration and procedures for the SCL-90. Baltimore: Clinical Psychometric Research.
Derogatis, L. R., & Spencer, P. M. (1982). The Brief Symptom Inventory (BSI): Administration, scoring, and procedures manual-I. Baltimore: Johns Hopkins University.
Duclos, S. E., Laird, J. D., Schneider, E., Sexter, M., Stern, L., & Van Lighten, O. (1989). Emotion-specific effects of facial expressions and postures on emotional experience. Journal of Personality and Social Psychology, 57, 100-108.
Duncan, J. W., & Laird, J. D. (1977). Cross-modality consistencies in individual differences in self-attribution. Journal of Personality, 45, 191-206.
Duval, S., & Wicklund, R. A. (1972). A theory of objective self-awareness. New York: Academic Press.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Gellhorn, E. (1964). Motion and emotion: The role of proprioception in the physiology and pathology of the emotions. Psychological Review, 71, 457-472.
Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1992). Primitive emotional contagion. In M. S. Clark (Ed.), Review of personality and social psychology (Vol. 2, pp. 151-177). Newbury Park, CA: Sage.
Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1993). Emotional contagion. New York: Cambridge University Press.
Izard, C. E. (1971). The face of emotion. New York: Appleton-Century-Crofts.
Izard, C. E. (1977). Human emotions. New York: Plenum Press.
Izard, C. E. (1990). Facial expressions and the regulation of emotions. Journal of Personality and Social Psychology, 58, 487-498.
James, W. (1950). The principles of psychology (Vol. 2). New York: Dover Publications. (Original work published 1890)
Kleinke, C. L., & Walton, J. H. (1982). Influence of reinforced smiling on affective responses in an interview. Journal of Personality and Social Psychology, 42, 557-565.
Kraut, R. E. (1982). Social presence, facial feedback, and emotion. Journal of Personality and Social Psychology, 42, 853-863.
Laird, J. D. (1974). Self-attribution of emotion: The effects of expressive behavior on the quality of emotional experience. Journal of Personality and Social Psychology, 29, 475-486.
Laird, J. D. (1984). The real role of facial response in the experience of emotion: A reply to Tourangeau and Ellsworth, and others. Journal of Personality and Social Psychology, 47, 909-917.
Laird, J. D., & Crosby, M. (1974). Individual differences in self-attribution of emotion. In H. London & R. Nisbett (Eds.), The cognitive alteration of feeling states (pp. 44-59). Chicago: Aldine-Atherton.
Laird, J. D., Wagener, J., Halal, M., & Szegda, M. (1982). Remembering what you feel: Effects of emotion on memory. Journal of Personality and Social Psychology, 42, 646-657.
Larsen, R. J., Kasimatis, M., & Frey, K. (1992). Facilitating the furrowed brow: An unobtrusive test of the facial feedback hypothesis applied to unpleasant affect. Cognition and Emotion, 6, 321-338.
Manstead, A. S. R. (1988). The role of facial movement in emotion. In H. L. Wagner (Ed.), Social psychophysiology: Theory and clinical applications (pp. 105-129). London: Wiley.
Martin, L. L., Harlow, T. F., & Strack, F. (1992). The role of bodily sensations in the evaluation of social events. Personality and Social Psychology Bulletin, 18, 412-419.
Matsumoto, D. (1987). The role of facial response in the experience of emotion: More methodological problems and a meta-analysis. Journal of Personality and Social Psychology, 52, 769-774.
McArthur, L. Z., Solomon, M. R., & Jaffe, R. H. (1980). Weight differences in emotional responsiveness to proprioceptive and pictorial stimuli. Journal of Personality and Social Psychology, 39, 308-319.
McCaul, K. D., Holmes, D. S., & Solomon, S. (1982). Voluntary expressive changes and emotion. Journal of Personality and Social Psychology, 42, 145-152.
Orne, M. T. (1962). On the social psychology of the psychological experiment: With particular reference to demand characteristics and their implications. American Psychologist, 17, 776-783.
Rhodewalt, F., & Comer, R. (1979). Induced-compliance attitude change: Once more with feeling. Journal of Experimental Social Psychology, 15, 35-47.
Rutledge, L. L., & Hupka, R. B. (1985). The facial feedback hypothesis: Methodological concerns and new supporting evidence. Motivation and Emotion, 9, 219-240.
Salovey, P., & Birnbaum, D. (1989). Influence of mood on health-relevant cognitions. Journal of Personality and Social Psychology, 57, 539-551.
Scheier, M. F. (1976). Self-awareness, self-consciousness, and angry aggression. Journal of Personality, 44, 627-644.
Scheier, M. F., & Carver, C. S. (1977). Self-focused attention and the experience of emotion: Attraction, repulsion, elation, and depression. Journal of Personality and Social Psychology, 35, 625-636.
Scheier, M. F., & Carver, C. S. (1980). Private and public self-attention, resistance to change, and dissonance reduction. Journal of Personality and Social Psychology, 39, 390-405.
Scheier, M. F., & Carver, C. S. (1985). The Self-Consciousness Scale: A revised version for use with general populations. Journal of Applied Social Psychology, 15, 687-699.
Strack, F., Martin, L. L., & Stepper, S. (1988). Inhibiting and facilitating conditions of the human smile: A nonobtrusive test of the facial feedback hypothesis. Journal of Personality and Social Psychology, 54, 768-777.
Tomkins, S. S. (1962). Affect, imagery, consciousness: Vol. 1. The positive affects. New York: Springer.
Tomkins, S. S. (1963). Affect, imagery, consciousness: Vol. 2. The negative affects. New York: Springer.
Watson, D., Clark, L. A., & Tellegen, A. (1988). Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology, 54, 1063-1070.
Wicklund, R. A. (1975). Objective self-awareness. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 8, pp. 233-275). New York: Academic Press.
Winton, W. M. (1986). The role of facial response in self-reports of emotion: A critique of Laird. Journal of Personality and Social Psychology, 50, 808-812.
Zajonc, R. B., Murphy, S. T., & Inglehart, M. (1989). Feeling and facial efference: Implications of the vascular theory of emotion. Psychological Review, 96, 395-416.
Zuckerman, M., & Lubin, B. (1985). Manual for the MAACL-R: The Multiple Affect Adjective Checklist Revised. San Diego, CA: Educational and Industrial Testing Service.

Received August 3, 1995
Revision received February 18, 1997
Accepted March 5, 1997



Facial expressions affect mood and emotions

Blink - the power of thinking without thinking - Malcolm Gladwell


p.75
What Ekman is saying is that the face is an enormously rich source of information about emotion. In fact,
he makes an even bolder claim - one central to understanding how mind reading works - and that is that
the information on our face is not just a signal of what is going on inside our mind. In a certain sense, it is
what is going on inside our mind.
The beginnings of this insight came when Ekman and Friesen were first sitting across from each other,
working on expressions of anger and distress. “It was weeks before one of us finally admitted feeling
terrible after a session where we’d been making one of those faces all day,” Friesen says. “Then the other
realized that he’d been feeling poorly, too, so we began to keep track.” They then went back and began
monitoring their bodies during particular facial movements. “Say you do A.U. one, raising the inner
eyebrows, and six, raising the cheeks, and fifteen, the lowering of the corner of the lips,” Ekman said, and
then did all three. “What we discovered is that that expression alone is sufficient to create marked changes in the autonomic nervous system. When this first occurred, we were stunned. We weren’t expecting this at all. And it happened to both of us. We felt terrible. What we were generating were sadness, anguish. And when I lower my brows, which is four, and raise the upper eyelid, which is five, and narrow the eyelids, which is seven, and press the lips together, which is twenty-four, I’m generating anger. My heartbeat will go up ten to twelve beats. My hands will get hot. As I do it, I can’t disconnect from the system. It’s very unpleasant, very unpleasant.”
Ekman, Friesen, and another colleague, Robert Levenson (who has also collaborated for years with John
Gottman; psychology is a small world) decided to try to document this effect. They gathered a group of
volunteers and hooked them up to monitors measuring their heart rate and body temperature - the
physiological signals of such emotions as anger, sadness, and fear. Half of the volunteers were told to try
to remember and relive a particularly stressful experience. The other half were simply shown how to
create, on their faces, the expressions that corresponded to stressful emotions, such as anger, sadness,
and fear. The second group, the people who were acting, showed the same physiological responses, the
same heightened heart rate and body temperature, as the first group.
A few years later, a German team of psychologists conducted a similar study. They had a group of subjects
look at cartoons, either while holding a pen between their lips - an action that made it impossible to
contract either of the two major smiling muscles, the risorius and the zygomatic major - or while holding
a pen clenched between their teeth, which had the opposite effect and forced them to smile. The people
with the pen between their teeth found the cartoons much funnier. These findings may be hard to believe,
because we take it as a given that first we experience an emotion, and then we may - or may not - express
that emotion on our face.
We think of the face as the residue of emotion. What this research showed, though, is that the process
works in the opposite direction as well. Emotion can also start on the face. The face is not a secondary
billboard for our internal feelings. It is an equal partner in the emotional process.

Whenever we experience a basic emotion, that emotion is automatically expressed by the muscles of the
face. That response may linger on the face for just a fraction of a second or be detectable only if electrical
sensors are attached to the face. But it’s always there. Silvan Tomkins once began a lecture by bellowing,
“The face is like the penis!” What he meant was that the face has, to a large extent, a mind of its own. This
doesn’t mean we have no control over our faces. We can use our voluntary muscular system to try to
suppress those involuntary responses. But, often, some little part of that suppressed emotion - such as
the sense that I’m really unhappy even if I deny it - leaks out. That’s what happened to Mary. Our voluntary
expressive system is the way we intentionally signal our emotions. But our involuntary expressive system
is in many ways even more important: it is the way we have been equipped by evolution to signal our
authentic feelings.
Facial expressions and emotions
2knowmyself.com/facial_expressions_and_emotions

By M. Farouk Radwan, MSc.


Recent research has shown something very surprising: people who used Botox, an injection that helps older people get rid of wrinkles, became less likely to get depressed!

Moreover, people who used Botox reported experiencing less sadness and being less affected when watching drama on television.

What's happening here? How can a face injection affect a person's emotions to that extent? The answer isn't in the composition of the chemical itself but rather in the way it alters facial expressions, which are one of the primary factors that affect emotions.

The connection between facial expressions and emotions

It was found that people have to make certain facial expressions before they can experience certain emotions. Sometimes I find my mom looking very sad, and as soon as I enter the room to check on her I find that she is watching her favorite drama series and that she is absolutely fine :)

You can't experience the emotions you are supposed to experience in certain situations before your facial expressions change.

Now back to Botox: because it prevents a person from frowning, it also prevents them from experiencing the emotions that are related to frowning (to a certain extent, of course).

How emotions are communicated between people

Do you know why you feel scared at horror movies? It's because the facial expressions of the actor who feels afraid have been transferred to you.

That's also the reason why we feel comfortable around confident people. In the Solid Self Confidence program, I explained how confident people transfer their relaxed state to us through their facial expressions, and so we feel more comfortable around them.

So how can you alter your facial expressions, and so your emotions?

So how can you make use of this information? How can you do something with your facial expressions so that you alter your mood?

It's as simple as keeping a smile on your face.

When you smile, you will force your mind to recall the emotions that are associated with this facial expression, and your mood will change for the better.

I am not saying, like some other people, that a smile is the solution to all problems, because obviously that's against my teachings, but I can confidently say that smiling more often will elevate your overall mood.

Why Faking a Smile Is a Good Thing
forbes.com/sites/rogerdooley/2013/02/26/fake-smile

February 26, 2013



We think of our face as reflecting our internal
emotions, but that linkage works both ways - we
can change our emotional state by altering our
facial expression! Pasting a smile on your face,
even if you are consciously faking it, can
improve your mood and reduce stress.

Some of the earliest work in the area was done by


psychologist and "facial coding" expert Paul
Ekman. While experimenting with negative facial
expressions like frowns, Ekman found that his
mood seemed to be altered. In 1990, Ekman's
Even fake smiles reduce stress
research on other subjects showed that adopting
a "Duchenne smile" - a full smile that involves
facial muscles around the eyes - produced a change in brain activity that corresponded
with a happier mood.

Chopsticks, Anyone?

A few months ago, new research was published in Psychological Science by Kansas
researchers Tara Kraft and Sarah Pressman. They used a rather unusual way of getting
their subjects to simulate different smiles: the subjects held chopsticks in their mouth in
different configurations to form smiles and neutral expressions. While this seems
awkward, it's a good way to demonstrate if facial muscle activity can affect mood.

Some subjects assumed a Duchenne smile as described above; others, a "social smile"
that involved only the mouth. "Smilers" exhibited lower heart rate levels after
completing a stressful task compared to subjects who assumed a neutral expression.
Some of the forced smilers received an instruction to smile along with the chopsticks;
they showed even less stress than those who got no instruction. The Duchenne smilers
had lower stress numbers than the social smilers, though the data was insufficient to
draw a conclusion.



Botox Levels More Than Wrinkles

As further evidence of the reverse linkage between facial muscles and emotions, Botox
injections have been shown to dampen emotional responses. These injections paralyze
small, wrinkle-causing muscles around the eyes. That makes the face look smoother, but
it also smooths out emotions to a small extent. Scientists report both a reduction in
depression symptoms as well as weaker reactions to "happy" video clips. This effect
doesn't appear to be enormous - Botox won't turn people into walking automatons. But,
a variety of research shows a measurable effect on one's emotional experience.

Fake It!

Decades of research bear out the basic truth: your mood is elevated and your stress is
reduced if you plaster a big smile on your face, even for a short period of time.
(Frowns have been shown to have the opposite effect.) The smile doesn't have to be
based on real emotion - faking it works. And while the research details vary, I'd
recommend going with a full, true smile that involves your eyes as well as your mouth.
That's almost certainly a more potent mood changer.

Mood Contagion

And, there's another benefit to that Duchenne smile: if you do it in public, those around
you will be lifted as well. As the WSJ's Sumathi Reddy reminds us, UCLA scientist Marco
Iacoboni notes that our brains are wired for sociability. In particular, if one person
observes another person smile, mirror neurons in that person's brain will light up as if he
were smiling himself. So, smile in private if you must, it will still boost YOUR mood... but
why not extend that benefit to those around you by smiling in public?

Roger Dooley is the author of Brainfluence: 100 Ways to Persuade and Convince Consumers
with Neuromarketing (Wiley, 2011). Find Roger on Twitter as @rogerdooley and at his website,
Neuromarketing.

Mood induction with facial expressions of emotion in
patients with generalized anxiety disorder.
ncbi.nlm.nih.gov/pubmed/14625879


Depress Anxiety. 2003;18(3):144-8.

Author information: Banaras Hindu University, Varanasi, India.

Abstract
Patients with generalized anxiety disorder (GAD), anxiety-prone subjects, and normal
controls (n=30, N=90) were subjected to happy and sad mood induction conditions using
facial expressions of emotion of varied intensity. Following mood induction, subjects
were required to judge their mood state on two scales: the Positive and Negative Affect
Scale and the Emotional Self Rating Scale. In general, the anxiety groups showed more
sensitivity to the sad mood induction condition. However, the anxiety groups had a
higher subjective rating for positive than negative emotions during the happy mood
induction condition. These findings suggest the efficacy of the mood induction
procedures in anxiety disorders.

Copyright 2003 Wiley-Liss, Inc.

The impact of facial emotional expressions on behavioral
tendencies in females and males
ncbi.nlm.nih.gov/pmc/articles/PMC2852199

PMCID: PMC2852199

NIHMSID: NIHMS171913

PMID: 20364933
The publisher's final edited version of this article is available at J Exp Psychol Hum
Percept Perform
Emotional faces communicate both the emotional state and behavioral intentions of an
individual. They also activate behavioral tendencies in the perceiver, namely approach or
avoidance. Here, we compared more automatic motor to more conscious rating
responses to happy, sad, angry and disgusted faces in a healthy student sample.
Happiness was associated with approach and anger with avoidance. However,
behavioral tendencies in response to sadness and disgust were more complex. Sadness
produced automatic approach but conscious withdrawal, probably influenced by
interpersonal relations or personality. Disgust elicited withdrawal in the rating task
whereas no significant tendency emerged in the joystick task, probably driven by
expression style. Based on our results it is highly relevant to further explore actual
reactions to emotional expressions and to differentiate between automatic and
controlled processes since emotional faces are used in various kinds of studies.
Moreover, our results highlight the importance of gender of poser effects when applying
emotional expressions as stimuli.

Keywords: Behavioral Tendencies, Approach, Avoidance, Emotional Expression, Gender

Facial emotional expressions are salient social cues in everyday interaction. Behavioral
data suggest that human facial expressions communicate both the emotional state of
the poser and behavioral intentions or action demands to the perceiver (Horstmann,
2003). Facial emotional expressions are used in various fields of research, e.g., social
cognition, psychology, and neuroscience. A great deal of work has been done on how we process
these facial signals and how we recognize their emotional meaning (e.g., Adolphs, 2006).
However, in this regard it is also highly important to objectify how the perceiver of an
emotional display responds behaviorally. How do we actually react to the displays of
various emotional states of an opponent? What kind of behavior do these social signals
trigger in the receiver? A better understanding of the effects on the perceiver might add
important insights on the interpretation of any findings related to these kinds of social
stimuli. Moreover, some studies divided the different emotional expressions in negative
and positive cues based on theoretical speculations without directly testing their impact
on behavior, thus lacking a justification of this categorization on the valence dimension.

Regarding behavioral tendencies, two opposite poles of human behavior and motivation,
approach and avoidance, are most pertinent. Gray's theory (Gray, 1982) of a behavioral
approach system (BAS) and a behavioral inhibition system (BIS) has been examined most
extensively. This model posits two antipodal motivational systems, one appetitive
(approach) and one aversive (avoidance), both forming the basis of human behavior
(Puca, Rinkenauer, & Breidenstein, 2006).

In their review of historical conceptualizations of approach and avoidance motivation,
Elliot and Covington (2001) concluded that automaticity in evaluating incoming stimuli
and generating an adequate behavioral response poses an evolutionary advantage. In
humans, however, this link might at least partly be influenced by other, more conscious,
motivational mechanisms, such as relationship and mood state. In accordance with the
emotion-motivation model of Lang, Bradley, and Cuthbert (1990), Bargh (1997)
hypothesized that these systems build an interface between perception and action and
are directly activated by perceived stimuli. Following this assumption, responsiveness of
the BIS and BAS triggers behavioral tendencies of approach or avoidance and associated
emotions.

Studies investigating behavioral tendencies have mostly followed Cacioppo, Priester, and
Berntson (1993), who demonstrated that the affective evaluation of neutral stimuli can be
influenced by motor manipulations such as isometric arm flexion and extension: neutral
Chinese ideographs were rated more positively during arm flexion and vice versa.
It has therefore been deduced that pushing a lever (extension) should be faster than
pulling (flexion) in response to aversive stimuli, and pulling faster than pushing in
response to appetitive stimuli. Chen and Bargh (1999) showed that both explicit
categorization and implicit processing of positive and negative words directly elicit
behavioral tendencies, such that positive evaluations result in approach whereas
negative evaluations produce avoidance behavior. Presenting novel graphic images,
Duckworth, Bargh, Garcia, and Chaiken (2002) also observed such direct behavioral
consequences of an automatic evaluation. Using two different paradigms (proprioceptive
and visual cues) to induce approach and avoidance illusions, Neumann and Strack (2000)
showed that perceived approach or avoidance movements correspondingly facilitate
valence categorization of positive and negative words. They also observed influences of
exteroceptive illusions on non-affective judgments. This effect was not mediated by the
extremity of the presented words, indicating that the approach and avoidance systems are
activated in an all-or-nothing fashion.

Marsh, Ambady and Kleck (2005) investigated how facial expressions (fear and anger)
affect the perceiver's behavior in an explicit categorization task. While anger expressions
were associated with avoidance, fear expressions elicited approach tendencies. The
authors hypothesized that expressing fear indicates submission rather than provoking
threat, and serves to pacify the opponent (cf. Hess, Blairy, & Kleck, 2000). However, a
preceding study questioned the assumed automatic association between evaluation and
behavior (Rotteveel & Phaf, 2004). These authors failed to find any influence of the
valence of facial expressions (happy and angry) on action tendencies when attention was
drawn to non-affective features of the targets (gender discrimination). Utilizing an
indirect joystick task (without classification of the expressions) in a sample of socially
anxious individuals, Heuer, Rinck, and Becker (2007) observed pronounced avoidance
tendencies in response to angry and happy faces in anxious patients but not in the
control group. According to the authors, it seems that the non-anxious controls were
able to ignore the task-irrelevant dimension of emotional content and responded only to
the relevant dimension (puzzle versus face), while response behavior in the anxious
group seemed to be significantly influenced by the emotional expression.

Several studies indicated that gender influences emotional processing (e.g., Hall &
Matsumoto, 2004; Thayer & Johnson, 2000). Regarding effects of gender of poser, Marsh
et al. (2005) observed that their participants responded faster to female than to male
faces and most quickly to women expressing fear. On the other hand, Rotteveel and Phaf
(2004) reported that their female sample reacted faster to male than to female faces,
particularly to angry male faces.

Due to the heterogeneity of previous results, one major aim of the present study was to
clarify behavioral reactions in response to four basic emotions (happiness, sadness,
anger, disgust), thereby extending the findings of Marsh et al. (2005), who examined
posed expressions of anger and fear. However, in contrast to previous studies, we applied
evoked facial emotional expressions because of their higher ecological validity and
naturalness. Given the conflicting findings of Marsh et al. (2005), Rotteveel and Phaf
(2004), and Heuer et al. (2007), it remains to be clarified whether emotional
expressions automatically elicit approach or avoidance behavior and to what extent this
is influenced by conscious processes (e.g., task instruction). Therefore, another major
aim of this study was to elucidate possible discrepancies between more automatic and
more controlled processes in behavioral tendencies toward emotional expressions. To
examine this, we added a rating task of the emotional expressions to compare the
consciously reported tendencies (explicit) with the tendencies derived from reaction time
differences (implicit), which reflect a more automatic evaluation.

Based on rating data from a large sample of adults, Horstmann (2003) reported that sad
faces signal a request for help or comfort (approach), whereas disgust and anger faces
communicate a request to go away (avoidance). The expression of happiness was
perceived as an invitation to reciprocate smiles and to cooperate. Therefore, we
hypothesize that facial expressions of sadness and happiness elicit approach behavior,
whereas angry and disgusted faces initiate avoidance. However, by comparing more
automatic (joystick task) to more controlled (rating task) processes, we expect to
elucidate differential processes in reactions to these diverging facial emotional
expressions.

Finally, due to the inconsistencies in published reports on the effects of gender of poser,
we also aimed at elucidating the influence of gender of poser and rater on behavioral
tendencies by applying a gender-balanced task in a representative gender-balanced
sample.

Methods

Participants
Fifty-five female and 49 male Caucasian Vienna University students participated in our
study. We investigated Caucasian students to obtain a sample homogeneous in
cultural background, age (males: M = 25.08, SD = 2.605; females: M = 24.21, SD = 2.562) and
years of education (males: M = 17.18, SD = 2.21; females: M = 16.59, SD = 2.00), as these
variables have been shown to influence facial emotional processing (e.g., Hess, Blairy, &
Kleck, 2000; Calder et al., 2003). All participants had normal or corrected-to-normal
vision. Male and female participants did not differ in age, t(99) = 1.70, ns, or years of
education, t(99) = 1.42, ns.

Materials and Procedure


We utilized the same joystick task (using the ST290 PRO Joystick, Saitek, Munich,
Germany) and reaction time software (DirectRT; Jarvis, 2004) as implemented and
described in detail by Marsh et al. (2005). The joystick was placed on a rubber pad on the
desk in front of the participants, who were instructed not to change the general position
of the joystick. Furthermore, participants were instructed to move the lever of the
joystick during the experiment only directly forward or backward as far and as quickly as
possible. DirectRT software recorded the time (with a precision of 10 ms) after each
stimulus presentation at which the lever reached its maximal point from the baseline
position.

Participants viewed 24 colored photographs of evoked facial expressions displayed by
Caucasian actors (balanced for gender) with the following four basic emotions:
happiness, sadness, anger and disgust. All expressions were taken from a stimulus set
that has been standardized and used repeatedly as neurobehavioral probes in
neuropsychological studies (see Gur et al., 2002a for development of stimuli; Derntl,
Kryspin-Exner, Fernbach, Moser, & Habel, 2008a; Derntl et al., 2008b; Fitzgerald,
Angstadt, Jelsone, Nathan, & Phan, 2006; Habel et al., 2007).

We composed two different sets of pictures, one containing happiness and disgust and
the other containing sadness and anger, based on our hypothesis that happiness and
sadness would be associated with approach and anger and disgust with avoidance. Thus,
we paired two possibly antipodal emotions in each stimulus set. Participants were
shown the same 12 emotional expressions (e.g., six happy vs. six disgusted, or six sad vs.
six angry) twice: in one run the instruction was to pull the lever in response to the approach-
inducing emotion (e.g., happy) and to push it in response to the avoidance-inducing
expression (e.g., disgust), and in the second run the instruction was reversed.
This yielded four different conditions, which were presented in randomized order.
To avoid the confusion, and the errors, that could arise if the reversed instruction for the
same emotion pair followed immediately, we randomized only the order of the emotional
pairs across participants. For example, pair A (happiness and disgust) might be presented
with the “congruent” instruction (happy-pull and disgust-push), followed by pair B
(sadness and anger) with the “incongruent” instruction, then pair A again with the
“incongruent” instruction and pair B with the “congruent” instruction.
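
As a concrete illustration of the counterbalancing just described, the block order could be generated as in the following sketch. This is a hypothetical reconstruction, not the study's own scripts (the experiment ran on DirectRT), and the names and the fixed congruent/incongruent sequence simply follow the example given above.

```python
# Hypothetical sketch of the block ordering described in the text; only the
# order of the emotion pairs is randomized, so a reversed instruction never
# directly follows the same pair.
import random

PAIRS = {
    "A": ("happiness", "disgust"),  # hypothesized approach vs. avoidance emotion
    "B": ("sadness", "anger"),
}

def build_block_order(rng: random.Random) -> list:
    """Return four (pair, instruction) blocks with the pair order randomized
    across participants, following the example sequence given in the text."""
    first, second = rng.sample(sorted(PAIRS), k=2)
    return [
        (first, "congruent"),      # e.g. happy-pull and disgust-push
        (second, "incongruent"),
        (first, "incongruent"),
        (second, "congruent"),
    ]

if __name__ == "__main__":
    for pair, instruction in build_block_order(random.Random(1)):
        print(pair, PAIRS[pair], instruction)
```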

Every stimulus was presented until the first response occurred but maximally for two
seconds, followed by a black screen for two seconds. Before and between every
emotional condition a joystick practice run was conducted instructing the participants to
alternately push or pull the lever in response to an asterisk stimulus.

Additionally, we implemented a nine-point rating scale (ranging from +4 to -4), on which
participants were asked to imagine standing face to face with the person and to explicitly rate
their tendency to approach or avoid the person showing the particular emotional
expression. The behavioral anchors for each scale point were the number of steps
participants would take towards (+) or away (-) from the person expressing the
particular emotion observed on the screen. Participants were told not to base their
rating on the attractiveness or trustworthiness of the person but only on the emotional
expression. This rating scale measured the explicit, conscious behavioral tendency, in
contrast to the implicit, more automatic motor reaction in the joystick task.

The order of the explicit rating and the implicit joystick task was randomized across
participants.

See the figure below for the experimental setup of the two tasks.

Illustration of the experimental setup for the joystick task (a) and the explicit rating paradigm
(b).

Statistical Analysis
Statistical analyses were performed using SPSS (Statistical Package for the Social
Sciences, Version 14.0, SPSS Inc., USA).

Data from the joystick task were analyzed with a 2 (gender of participant) × 2 (gender of
poser) × 4 (emotional expression) × 2 (lever direction) repeated-measures ANOVA on
the averages of the log-transformed reaction times for correct responses to each type of
emotional expression. We also computed Wilcoxon signed-rank tests to analyze whether
the error frequencies differed as a function of lever direction for every type of emotional
expression.
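
For readers who want to follow the aggregation step, the sketch below shows one way the per-condition means and error counts entering these tests might be computed. It is not the authors' code (the ANOVA itself was run in SPSS, as stated above), and it assumes a hypothetical long-format trial table with the columns subject, emotion, direction ("push"/"pull"), rt_ms, and correct.

```python
# Minimal sketch of the RT aggregation and the error-count comparison; the
# data layout (columns: subject, emotion, direction, rt_ms, correct) is an
# assumption for illustration, not the study's actual file format.
import numpy as np
import pandas as pd
from scipy import stats

def mean_log_rts(trials: pd.DataFrame) -> pd.DataFrame:
    """Average log-transformed reaction times of correct responses per
    subject, emotion, and lever direction (the cell means for the ANOVA)."""
    correct = trials[trials["correct"]].copy()
    correct["log_rt"] = np.log(correct["rt_ms"])
    return (correct
            .groupby(["subject", "emotion", "direction"])["log_rt"]
            .mean()
            .unstack("direction"))  # one column per lever direction

def wilcoxon_errors(trials: pd.DataFrame, emotion: str):
    """Wilcoxon signed-rank test on per-subject error counts, push vs. pull,
    for one emotional expression (raises if all differences are zero)."""
    sub = trials[trials["emotion"] == emotion]
    errors = sub.assign(error=~sub["correct"]).pivot_table(
        index="subject", columns="direction", values="error",
        aggfunc="sum", fill_value=0)
    return stats.wilcoxon(errors["push"], errors["pull"])
```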

Data from the explicit rating task were analyzed with a 2 (gender of participant) × 2
(gender of poser) × 4 (emotional expression) repeated-measures ANOVA, with the
emotion-specific averages of the ratings serving as dependent measures. For post-hoc
multiple comparisons, p-values were corrected using the Bonferroni method; multiple
post-hoc t-tests were Bonferroni-Holm corrected. For significant effects, estimates of
effect size are listed: partial eta-squared for the ANOVAs and Cohen's d for the post-hoc
t-tests.
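
As a generic illustration of the two procedures named here, and not code taken from the study, the Bonferroni-Holm adjustment and a paired-samples Cohen's d can be computed as follows; the d shown (mean difference divided by the standard deviation of the differences) is only one of several conventions for paired designs.

```python
# Textbook implementations of the Holm correction and a paired Cohen's d;
# illustrative only, not the analysis scripts used in the paper.
import numpy as np

def holm_adjust(pvals):
    """Bonferroni-Holm step-down adjusted p-values (family-wise error control)."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)                  # indices of p-values, smallest first
    adjusted = np.empty(m)
    running_max = 0.0
    for rank, idx in enumerate(order):
        candidate = (m - rank) * p[idx]    # smallest p times m, next times m-1, ...
        running_max = max(running_max, candidate)  # keep adjusted values monotone
        adjusted[idx] = min(running_max, 1.0)
    return adjusted

def cohens_d_paired(x, y):
    """Paired-samples Cohen's d: mean of the differences divided by their SD."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return diff.mean() / diff.std(ddof=1)

# Example with made-up p-values: holm_adjust([0.001, 0.02, 0.046, 0.124])
```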

Results

Joystick task
Means and standard errors of the reaction times for pushing and pulling are illustrated in
panel (a) of the figure below.

Mean reaction times (ms) of pushing and pulling a lever in response to emotional
expressions (a) and mean reaction times of both pushing and pulling in response to male
and female faces depicting emotional expressions (b).

The ANOVA showed a significant main effect of emotional expression, F(3,276) = 118.28, p
< .001, η2 = 0.56, with fastest reactions to happiness, followed by disgust, anger, and
sadness. A main effect of gender of poser, F(1,92) = 70.11, p < .001, η2 = 0.43, indicated
that all participants responded more quickly to male than to female faces. There was no
main effect of participant gender, F(1,92) = 1.306, ns, nor of lever direction, F(1,92) = 0.78,
ns; the latter confirms balanced responding in the lever paradigm.

Notably, the hypothesized interaction between lever direction and emotional expression
was significant, F(3,276) = 9.05, p < .001, η2 = .09, indicating differential direction for the
polar emotions. The interaction between gender of poser and emotional expression
reached significance, F(3,276) = 17.32, p < .001, η2 = 0.16, showing that response times
differed for male and female posers depending on the specific emotion. Moreover, a
trend for a gender of poser by lever direction interaction occurred, F(1,92) = 3.56, p = .062,
η2 = .037, such that in response to female faces pulling was faster than pushing and in
response to male faces pushing was faster than pulling.

Post-hoc Bonferroni-Holm-corrected paired-sample t-tests further elucidating the significant
lever direction by emotional expression interaction revealed significant direction differences
for sadness, t(99) = 4.03, p < .001, d = 0.36, and happiness, t(98) = 2.52, p = .039, d = 0.25,
both indicating approach tendencies. Anger expressions tended to elicit avoidance behavior,
t(99) = 2.02, p = .046 (corrected p = .092). For the expression of disgust the post-hoc
comparison was not significant, t(98) = 1.55, p = .124.

Post-hoc Bonferroni-Holm-adjusted paired-sample t-tests to clarify the significant gender
of poser by emotional expression interaction demonstrated that participants responded
(irrespective of lever direction) more quickly to angry, t(96) = 8.72, p < .001, d = 0.73, and
disgusted, t(97) = 6.61, p < .001, d = 0.44, male compared to female faces.

No other main effects or interactions of this ANOVA reached significance.

Means and standard errors of response times (pushing and pulling combined) to male and
female faces are presented in panel (b) of the figure above.

The Wilcoxon signed-rank tests on the error frequencies for each lever direction in
response to every type of emotional expression did not reach significance, all ps > .40.

Rating of emotional expressions


Across all emotional expressions, 13.3% of the participants had a standard deviation
greater than 2.5 in their explicit responses, 53.5% greater than 2, 75.8% greater than 1.8,
and 92.9% greater than 1.5. This indicates that the whole scale length was used to rate
the explicit approach/avoidance reaction (see the scatterplot below for an illustration of
the responses).

Scatterplot of explicit ratings (mean of ratings and a line indicating mean group ratings)
illustrating that the whole scale length was used for the responses. Note that equal
ratings of participants are covered by only one data point.

The 2 (gender of participant) × 2 (gender of poser) × 4 (emotional expression)
repeated-measures ANOVA on the rating data of the emotional expressions revealed a
significant main effect of emotional expression, F(3,297) = 405.13, p < .001, η2 = 0.804,
with happy faces eliciting approaching behavior, whereas sadness, disgust and anger
prompted avoidance. Neither a main effect of gender of poser, F(1,99) = 2.26, ns, nor of
gender of participant, F(1,99) = 0.19, ns, emerged.

Furthermore, the analysis yielded a significant interaction between emotional expression
and gender of poser, F(3,297) = 36.82, p < .001, η2 = 0.27, and a significant interaction
between gender of poser and gender of participant, F(1,99) = 6.70, p = .011, η2 = 0.06.

Disentangling the significant gender of poser by gender of participant interaction
revealed that women rated female faces more positively than male faces, t(51) = 2.78,
p = .016, d = 0.32.
Post-hoc Bonferroni-Holm-adjusted t-tests of the emotional expression by gender of
poser interaction indicated that happy, t(100) = -4.74, p < .001, d = 0.37, and sad male
faces, t(100) = -3.97, p < .001, d = 0.33, were rated more positively than the corresponding
female faces, whereas angry, t(100) = 6.16, p < .001, d = 0.63, and disgusted female faces,
t(100) = 5.54, p < .001, d = 0.56, were rated more positively than the corresponding male
faces. Means and standard errors of the rating data for female and male posers are
illustrated in the figure below.

Results of the explicit rating task for expressions displayed by male and female actors,
visualizing the significant emotional expression by gender of poser interaction.

Discussion
The aim of the present study was to explore behavioral approach and avoidance
tendencies in response to evoked expressions of four basic emotions displayed by male
and female actors. Moreover, in contrast to previous studies (e.g., Marsh et al., 2005;
Rotteveel & Phaf, 2004), we directly compared rather automatic with more conscious
behavioral reactions to these salient emotional cues.

Emotional expressions
Data from the joystick task, measuring rather implicit behavioral tendencies, revealed that
happiness activates the hypothesized behavioral approach system. This finding is in
accordance with previous reports of happy faces communicating an invitation to
cooperate (Horstmann, 2003).

Angry expressions elicited pronounced avoidance according to our rating data, but this
tendency did not reach significance in the joystick data. However, Marsh et al. (2005)
showed that angry faces facilitated avoidance behavior, and Horstmann (2003)
concluded from his rating data that anger communicates a request to go away. This
idea also has intuitive appeal, since in a social interaction anger is mostly elicited by a
negative action of the opponent. Our explicit rating data are congruent with the notion
that angry expressions activate the behavioral avoidance system, and the implicit joystick
data point in the same direction.

However, comparison of the results for sadness in the joystick task (more automatic
evaluation) and the rating task (conscious processing) revealed diverging tendencies. In
the rating paradigm, subjects indicated that their conscious preference was to keep their
distance from the sad face, indicating avoidance. In contrast, the joystick task revealed
approach behavior. This inconsistency is striking and has several important implications
for studies using sad expressions as negative cues. Based on the joystick data, we assume
that the automatic response to sadness is to approach the sad person, who is
communicating a request for help (Horstmann, 2003). However, this does not imply that
this first approach response is triggered by sadness being positively valenced for the
perceiver. Sadness remains a distress cue, as supported by the fact that with enhanced
cognitive processing our participants seemed to evaluate the conveyed emotional message
more negatively and reacted with avoidance. We suppose an underlying conflict between
the more automatic tendency to support others and the tendency to avoid distress cues,
which might be shaped by social learning processes. Moreover, previous experiences may
reinforce a tendency not to burden oneself with others' distress.

Generally, our findings seem somewhat contrary to the model of Lang et al. (1990), who
proposed that the mere presence of an emotional stimulus triggers the corresponding
motivational system and thereby elicits an automatic behavioral response. However,
Graziano, Habashi, Sheese, and Tobin (2007) reported that perceived relationship is an
important factor in helping behavior: in their study, kin received more help than did
nonkin. Therefore, we suppose that the unfamiliarity of our stimulus faces influenced the
avoidance tendencies in the conscious rating task. Notably, in the implicit joystick task
unfamiliarity did not seem to affect the automatic approach tendencies when facing a
person in need. Moreover, analyzing the impact of different personality traits on helping
behavior, e.g., altruism (Wang & Wang, 2008) and agreeableness (Graziano et al., 2007),
might further elucidate these inconsistencies.

According to our data, approach might be the automatic reaction to expressions of
sadness, but this tendency is interrupted and re-evaluated when higher conscious
processes are involved, such as familiarity with the sad person or personality traits.

Hence, our data strongly support the importance of differentiating more controlled and
more automatic processes of interpersonal behavior to gain new insights into basic
mechanisms of social interaction.

Behavioral reactions to expressions of disgust also seem rather complex. While the
joystick data were not significant, the rating data showed a significant association with
the avoidance system. From an evolutionary point of view, disgust signals a request to
avoid, e.g., the food just consumed. However, the perceiver may also be the cause of the
disgust, for example through his or her appearance; in this case, the expression of disgust
may signal a request to go away. Rozin, Lowery, and Ebert (1994) showed differences
between these two forms of disgust in facial expressions, such that food-offense disgust
is associated with tongue extrusion whereas a raised upper lip signals individual-related
disgust. Our unclear finding concerning behavioral tendencies may be due to the
inclusion of both of these disgust expressions in our stimulus material. Food-offense
disgust might lead to approach, whereas expressions of individual-related disgust may
prompt avoidance by signaling that the person is not tolerated and should keep some
distance.

Reaction times
Happy faces generally elicited the fastest responses, followed by disgust, anger, and
sadness. Happiness appeared to be the emotion recognized best, which might have
influenced the response times in the joystick task. Hence, our data support Duckworth et
al. (2002), who also observed the fastest responses to positive words, probably because of
their simple and unambiguous meaning; in addition, happy expressions are clearly
characterized by the smiling mouth and were the only positive emotion presented.

Despite its low recognition accuracy in previous studies (e.g., Derntl et al., 2008a, b),
disgust elicited rather quick categorization and behavioral responses. This might have
been due to its pairing with happiness, enabling a quick categorization of disgust, which
is otherwise hard to recognize. Moreover, from an evolutionary point of view, disgust is
highly relevant for survival because it communicates information about inedible food. A
distinction between the subtypes of disgust expression (food-offense and individual-
related disgust) may help differentiate its communicative role and the diverging
behavioral responses it elicits in the perceiver.

Gender of poser
Although female and male participants did not differ in their behavioral tendencies,
gender of poser seems to be an important factor influencing both conscious and
more automatic behavioral tendencies. In the explicit rating task, sad and happy male
expressions were rated more positively than the corresponding female faces. Conversely,
angry and disgusted male faces elicited faster responses in the joystick task and were
rated more negatively than the corresponding female faces. This effect might be linked to
the observed interaction of gender of poser and gender of rater, whereby women tended to
evaluate female faces more positively than male faces in the rating task. Generally, it is
assumed that men learn to hide helplessness and to express aggression-related emotions
such as anger more clearly, whereas women are reinforced for showing helplessness
and suppressing anger (Fischer, 1993). Social learning may therefore have turned male
faces expressing negative emotions (e.g., anger and disgust) into more salient social cues
that communicate an explicit message to go away.

Limitations
There were several methodological limitations of our study that should be discussed.
Our participants saw the same pictures twice, which might have had an impact on both
joystick and rating data. In particular, performing the rating task before the joystick task
might have influenced reactions in the joystick task, but we did not observe any
difference between the two groups (with different task sequences). The fixed
combinations (anger with sadness, disgust with happiness) in the presentation of the
four emotional expressions might also have affected our results. One could argue
that combining possibly antipodal emotions intensifies the effect by enhancing the
valence difference in the direct comparison. Future studies should clarify this by
combining emotional expressions in a less systematic way.

Moreover, we used static emotional expressions despite their unrealistic character. In
order to generate more ecologically valid results, the use of dynamic facial expressions
or whole-body stimuli should be considered in future studies. Additionally, the neutral
experimental context might have influenced the behavioral tendencies. Therefore, the
implementation of a joystick task in a more realistic experimental situation (e.g., virtual
reality) may help clarify approach and avoidance behavior.

Future prospects
Responding to a dimension other than emotional valence (e.g., gender, or puzzles versus
faces; Rotteveel & Phaf, 2004; Heuer et al., 2007) measures more automatic behavioral
tendencies than the explicit emotion categorization paradigm used in this study and by
Marsh et al. (2005). Rotteveel and Phaf (2004) could not find any reaction-time differences
in response to the different emotional expressions in their approach-avoidance task.
However, Heuer et al. (2007) reported a significant influence of emotional expressions on
motor reactions, but only in the socially anxious group. Our paradigm aimed at eliciting
actual motor responses to emotional expressions recognized as such, possibly increasing
the impact of emotional expressions on behavioral tendencies. Further examination of
indirect measures of behavioral tendencies in the joystick task is highly relevant to resolve
these issues and to clarify whether the mere presence of an emotional expression is
sufficient to elicit approach or avoidance responses.

In a very recent comprehensive theoretical and experimental framework, Eder and
Rothermund (2008) introduced the so-called evaluative coding theory, which
conceptualizes approach and avoidance reactions as positively and negatively coded
motor behaviors. They hypothesize that evaluative implications of semantic action labels
and goals assign affective codes to motor responses on a representational level, and that
response facilitation is predicted when there is a match in the evaluative stimulus-response
(S-R) combination. The evaluative coding theory assumes that evaluative response
codes should have a greater influence on evaluative discriminations than on gender
decisions about facial expressions, thereby offering a possible explanation for the
conflicting findings of Rotteveel and Phaf (2004). Similarly, more subtle evaluative
connotations of response labels should influence the response selection process only if
they serve to discriminate the response alternatives (Ansorge & Wühr, 2004; Lavender &
Hommel, 2007). However, this theory questions the assumed association of approach
and avoidance reactions with motivational systems (e.g., Chen & Bargh, 1999) by linking
the affective response activations to any motor behavior that relies on evaluative response
codes. Nevertheless, it underlines the importance of an interdisciplinary discussion of the
theoretical definitions and implications of the different research directions examining
S-R compatibility and behavioral tendencies.

Apart from theoretical speculations (e.g., Gray, 1982; Pickering & Gray, 1999), little is
known about the neuronal correlates of the behavioral approach and avoidance
systems. One functional imaging study investigating social-motivational behavior
emphasized, for example, the role of the orbitofrontal cortex in the voluntary control of
approach and avoidance reactions (Roelofs, Minelli, Mars, van Peer, & Toni, 2008). Thus,
further application of neuroimaging techniques would be valuable, especially for
elucidating the neural substrates of the observed discrepancies between more automatic
and more controlled processes.

Additionally, it would be informative to investigate implicit and explicit behavioral
tendencies in clinical populations such as depression, which is hypothesized to be
characterized by an increased BIS and a decreased BAS (e.g., Gray, 1982; Kasch,
Rottenberg, Arnow, & Gotlib, 2002). Heuer et al. (2007) reported interesting findings in a
sample of socially anxious individuals, who responded highly avoidantly to both angry and
happy facial expressions, but only in the indirect joystick task. In a rating task of the
valence of the emotional expressions, the socially anxious individuals did not differ
significantly from non-anxious controls, further indicating the need to explore
automatic and conscious behavioral tendencies separately.

Conclusions
In the context of earlier studies, our data indicate that happiness is strongly associated
with approach and anger with avoidance, whereas behavioral tendencies in response to
sadness and disgust are more complex. Sadness induces automatic approach but
conscious avoidance. From an evolutionary point of view, it seems reasonable that
sadness communicates a request for help and elicits approach towards the sender, but
prior social experiences may lead to restraint. Disgust elicited clear avoidance behavior
in the rating task, but reactions in the joystick task were rather ambiguous, probably
influenced by expression style. The expression of disgust serves an evolutionary function
as an important signal for avoiding noxious stimuli, but it can also be directed
interpersonally at the opponent, thereby eliciting avoidance.

Men and women seem to react similarly to emotional expressions, but there are
differences in reactions to male and female faces, which may be due to differing
socialization processes. This study shows the importance of differentiating between
more automatic and more controlled processes in social interaction, since we observed
that perceivers can rate an expression as avoidance-eliciting while still showing
pronounced automatic approach tendencies towards it. Based on these results, it is
highly relevant to further explore actual reactions to emotional expressions, and to
emotional stimuli in general, since they are widely used in various kinds of studies
without a full understanding of the reactions they actually release in the perceiver. This
may also help clarify findings on the neural correlates of these different emotions, for
which approach and avoidance conflicts may explain divergent results. Moreover,
integrating theoretical and experimental results from cognitive psychology research
might further clarify existing inconsistencies. Further investigation of perceivers'
reactions to emotional facial expressions could also focus on gender-of-poser effects and
enhance our understanding of gender-related social interaction.

Acknowledgments
We are grateful to two anonymous reviewers and Gernot Horstmann for their thoughtful
comments on the first version of this manuscript. E.M.S. was financially supported by the
Faculty of Medicine, RWTH Aachen University (START 690811 to B.D.), B.D. and U.H. were
supported by the German Research Foundation (DFG, IRTG 1328, KFO 112) and R.C.G.
was supported by the NIMH grant MH 60722.


References
Adolphs R. Perception and Emotion. How We Recognize Facial Expressions. Current
Directions in Psychological Science. 2006;15(5):222–226.
Bargh JA. The automaticity of everyday life. In: Wyer RS, editor. Advances in social
cognition. Vol. 10. Mahwah, NJ: Erlbaum; 1997. pp. 1–62.
Chen M, Bargh JA. Consequences of Automatic Evaluation: Immediate Behavioral
Predispositions to Approach or Avoid the Stimulus. Personality and Social
Psychology Bulletin. 1999;25(2):215–224.
Elliot AJ, Covington MV. Approach and Avoidance Motivation. Educational
Psychology Review. 2001;13(2):73–92.
Fischer A. Sex differences in emotionality: Fact or stereotype? Feminism &
Psychology. 1993;3:303–318.
Gray JA. The neuropsychology of anxiety: an enquiry into the functions of the
septo-hippocampal system. Oxford: Oxford University Press; 1982.
Hess U, Blairy S, Kleck RE. The influence of facial emotion displays, gender, and
ethnicity on judgments of dominance and affiliation. Journal of Nonverbal
Behavior. 2000;24(4):265–283.
Jarvis BG. DirectRT research software (Version 2004) New York: Empirisoft; 2004.
Computer software.
Lavender T, Hommel B. Affect and action: Towards an event-coding account.
Cognition and Emotion. 2007;21:1270–1296.
Pickering AD, Gray JA. The neuroscience of personality. In: Pervin LA, John OP,
editors. Handbook of Personality Theory and Research. New York: Guilford Press;
1999. pp. 277–299.

Facial Expressions Control Emotions
psychcentral.com/news/2018/02/01/facial-expressions-control-emotions/11082.html

Rick Nauert PhD, 1 February 2018

Obviously displaying a sad face or a happy face can inform others of what you are
thinking or feeling. New research suggests facial expression may also play a role in
understanding written language.

Specifically, researchers believe facial expressions can affect your ability to understand
written language related to emotions.

The findings were presented to the Society for Personality and Social Psychology in Las
Vegas, and will be published in the journal Psychological Science.

The new study reported on 40 people who were treated with botulinum toxin, or Botox.
Tiny applications of this powerful nerve poison were used to deactivate muscles in the
forehead that cause frowning.

The interaction of facial expressions, thoughts, and emotions has intrigued scientists for
more than a century, says the study’s first author, University of Wisconsin-Madison
psychology Ph.D. candidate David Havas.

Scientists have found that blocking the ability to move the body causes changes in
cognition and emotion, but there were always questions. (One of the test treatments
caused widespread, if temporary, paralysis.)

In contrast, Havas was studying people after a pinpoint treatment to paralyze a single
pair of “corrugator” muscles, which cause brow-wrinkling frowns.

To test how blocking a frown might affect comprehension of language related to
emotions, Havas asked the patients to read written statements, before and then two
weeks after the Botox treatment.

The statements were angry (“The pushy telemarketer won’t let you return to your
dinner”); sad (“You open your email in-box on your birthday to find no new emails”); or
happy (“The water park is refreshing on the hot summer day.”)

Havas gauged the ability to understand these sentences according to how quickly the
subject pressed a button to indicate they had finished reading it. “We periodically
checked that the readers were understanding the sentences, not just pressing the
button,” says Havas.

The results showed no change in the time needed to understand the happy sentences.
But after Botox treatment, the subjects took more time to read the angry and sad
sentences. Although the time difference was small, it was significant, he adds.

Moreover, the changes in reading time couldn't be attributed to changes in participants'
mood.

The use of Botox to test how making facial expressions affects emotional centers in the
brain was pioneered by Andreas Hennenlotter of the Max Planck Institute in Leipzig,
Germany.

“There is a long-standing idea in psychology, called the facial feedback hypothesis,” says
Havas.

“Essentially, it says, when you’re smiling, the whole world smiles with you. It’s an old
song, but it’s right. Actually, this study suggests the opposite: When you’re not frowning,
the world seems less angry and less sad.”

The Havas study broke new ground by linking the expression of emotion to the ability to
understand language, says Havas’s advisor, UW-Madison professor emeritus of
psychology Arthur Glenberg.

“Normally, the brain would be sending signals to the periphery to frown, and the extent
of the frown would be sent back to the brain. But here, that loop is disrupted, and the
intensity of the emotion, and of our ability to understand it when embodied in language,
is disrupted.”

Practically, the study “may have profound implications for cosmetic surgery,” says
Glenberg.

“Even though it’s a small effect, in conversation, people respond to fast, subtle cues
about each other’s understanding, intention and empathy. If you are slightly slower
reacting as I tell you about something that made me really angry, that could signal to me
that you did not pick up my message.”

Such an effect could snowball, Havas says, but the outcome could also be positive:
“Maybe if I am not picking up sad, angry cues in the environment, that will make me
happier.”

In theoretical terms, the finding supports a psychological hypothesis called “embodied
cognition,” says Glenberg, now a professor of psychology at Arizona State University.

“The idea of embodied cognition is that all our cognitive processes, even those that have
been thought of as very abstract, are actually rooted in basic bodily processes of
perception, action and emotion.”

With some roots in evolutionary theory, the embodied cognition hypothesis suggests
that our thought processes, like our emotions, are refined through evolution to support
survival and reproduction.
Embodied cognition links two seemingly separate mental functions, Glenberg says.

“It’s been speculated at least since Darwin that the peripheral expression of emotion is a
part of the emotion. An important role of emotion is social: it communicates ‘I love you’ or ‘I
hate you,’ and it makes sense that there would be this very tight connection between
peripheral expression and brain mechanism.”

“Language has traditionally been seen as a very high level, abstract process that is
divorced from more primitive processes like action, perception and emotion,” Havas
says.

“This study shows that far from being divorced from emotion, language understanding
can be hindered when those peripheral bodily mechanisms are interrupted.”

Source: University of Wisconsin-Madison

Effect of Facial Expression on Emotional State Not
Replicated in Multilab Study
psychologicalscience.org/observer/effect-of-facial-expression-on-emotional-state-not-replicated-in-multilab-study

A coordinated replication effort conducted across 17 labs found no evidence that
surreptitiously inducing people to smile or frown affects their emotional state. The
findings of the replication project have been published as part of a Registered
Replication Report (RRR) in Perspectives on Psychological Science.

The RRR project, proposed by University of Amsterdam psychology researchers Eric-Jan
Wagenmakers, Titia Beek, Laura Dijkhoff, and Quentin Gronau, aimed to replicate a 1988
study conducted by psychological scientists APS Fellow Fritz Strack, APS Fellow Leonard
L. Martin, and Sabine Stepper.

In the 1988 paper, Strack, Martin, and Stepper reported two studies in which they
surreptitiously changed participants’ facial expressions. Their goal was to test the idea
that our facial expressions can trigger emotional reactions — the so-called “facial
feedback hypothesis” — even when people are unaware that they are making that
expression. Participants who held a pen between their teeth, inducing a smile, rated
cartoons as funnier than did those who held a pen between their lips, inducing a frown.

The study is cited frequently in the scientific literature and in introductory psychology
courses and textbooks. Although other studies have tested the facial feedback
hypothesis using different methods, this influential study had not been directly
replicated with the same design and outcome measure. The RRR paper describes a
rigorous, multilab replication of that study, with each lab following a vetted protocol that
was registered online prior to data collection.

The aim was to replicate the original study as closely as possible, but the RRR differed in
several ways from the original. Strack provided the materials from the original study,
including the original Far Side cartoons. The RRR study also used a set of Far Side
cartoons after first conducting a study to ensure that they were moderately funny by
today’s standards. The RRR protocol also standardized the instructions to participants
and stipulated that they be delivered via computer in order to minimize interactions with
the experimenter. Based on guidance from an expert reviewer during the protocol
vetting process, participants were recorded on video during the experiment to ensure
that they were holding the pen correctly on each trial.

All of the materials, the protocol, the data, and the analysis scripts are publicly available
on the Open Science Framework.

As in the original study, participants were told they would be completing different tasks
with parts of the body not normally used for those tasks. Per the instructions provided,
they held the pen in their mouth (between their teeth or between their lips) and
completed the tasks presented in a booklet, which included drawing lines between
various numbers, underlining vowels, and indicating how amused they were by cartoons.

The combined results from 1,894 participants were inconsistent with the findings
reported in the original study. The data provided no evidence that inducing participants
to have particular facial expressions led them to rate the cartoons differently.

“This RRR did not replicate the [Strack, Martin, Stepper] results and failed to do so in a
statistically compelling fashion,” the contributing researchers write in their report.

“Nevertheless, it should be stressed that the RRR results do not invalidate the more
general facial feedback hypothesis,” they conclude.

In a commentary accompanying the RRR report, Strack commends the efforts of those
involved in the RRR. He notes his surprise that the original finding was not replicated,
especially given that his and colleagues’ labs have confirmed the results in “numerous
operational and conceptual replications.” Strack speculates about some possible reasons
for the different outcomes, including that the presence of a camera in the RRR
experiments might have affected how participants reacted to the cartoons.

APS Fellow Daniel J. Simons, the acting editor for this RRR project, commended the care
taken by the proposing authors: “This team’s exceptional rigor and care in developing
the study protocol, teaching other researchers how to follow it, and fully documenting
every step of the process set a standard that I hope future large-scale studies like this
one will emulate.”

Eric-Jan Wagenmakers will speak at the 2017 APS Annual Convention, May 25–28, 2017, in
Boston, Massachusetts.

Archival Report: Automatic Mood-Congruent Amygdala Responses to Masked Facial
Expressions in Major Depression
sciencedirect.com/science/article/abs/pii/S0006322309008981

Background
Cognitive theories of depression predict mood-congruent negative biases already at
automatic stages of processing, although several behavioral studies seem to contradict
this notion. That is, depression should potentiate emotional reactivity to negative
emotional cues, whereas it should reduce reactivity in response to positive emotional
stimuli. Assessing neurobiological substrates of automatic emotion processing might be
a more sensitive challenge for automatic negative bias in depression than behavioral
measures.

Methods
In 30 acutely depressed inpatients and 26 healthy control subjects, automatic amygdala
responses to happy and sad facial expressions were assessed by means of functional
magnetic resonance imaging (fMRI) at 3 Tesla. To examine automatic responses, a
presentation paradigm using subliminal, backward-masked stimuli was employed. A
detection task was administered to assess participants' awareness of the masked
emotional faces presented in the fMRI experiment.

Results
Detection performance was at chance level for both patients and healthy control
subjects, suggesting that the neurobiological reactions took place in the absence of
conscious awareness of the emotional stimuli.
was observed in the right amygdala. Whereas healthy control subjects demonstrated
stronger responses to happy faces, depressed patients showed the opposite.
Furthermore, amygdala responsiveness to happy facial expression was negatively
correlated with current depression severity.

Conclusions
Depressed patients exhibit potentiated amygdala reactivity to masked negative stimuli
along with a reduced responsiveness to masked positive stimuli compared with healthy
individuals. Thus, depression is characterized by mood-congruent processing of
emotional stimuli in the amygdala already at an automatic level of processing.

Copyright © 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights
reserved.

Research paper: Induction of depressed and elated mood by music influences the
perception of facial emotional expressions in healthy subjects
sciencedirect.com/science/article/abs/pii/016503279400092N

https://doi.org/10.1016/0165-0327(94)00092-N

Abstract
The judgement of healthy subjects rating the emotional expressions of a set of schematically
drawn faces was validated (Study 1) in order to examine the relationship between mood
(depressed/elated) and the judgement of the emotional expressions of these faces (Study 2).

Study 1: 30 healthy subjects judged 12 faces with respect to the emotions they express
(fear, happiness, anger, sadness, disgust, surprise, rejection and invitation). It was found
that a particular face could reflect various emotions. All eight emotions were reflected in
the set of faces and the emotions were consensually judged. Moreover, gender
differences in judgement could be established.

Study 2: In a cross-over design, 24 healthy subjects judged the faces after listening to
depressing or elating music. The faces were subdivided into six ‘ambiguous’ faces (i.e.,
expressing similar amounts of positive and negative emotions) and six ‘clear’ faces (i.e.,
faces showing a preponderance of positive or negative emotions). In addition, these two
types of faces were distinguished with respect to the intensity of the emotions they express.

Eleven subjects who showed substantial differences in experienced depression after
listening to the music were selected for further analysis. It was found that, when feeling
more depressed, the subjects perceived more rejection/sadness in ambiguous faces
(displaying less intensive emotions) and less invitation/happiness in clear faces. In
addition, subjects saw more fear in clear faces that express less intensive emotions.
Hence, the results show a depression-related negative bias in the perception of facial
displays.


Keywords: Perception, Facial expression, Musical mood induction, Depression, Ambiguity

Copyright © 1995 Published by Elsevier B.V.
Why our facial expressions don’t reflect our feelings
bbc.com/future/story/20180510-why-our-facial-expressions-dont-reflect-our-feelings

By Talya Rachel Meyers

10 May 2018
While conducting research on emotions and facial expressions in
Papua New Guinea in 2015, psychologist Carlos Crivelli discovered
something startling.

He showed Trobriand Islanders photographs of the standard Western face of fear –
wide-eyed, mouth agape – and asked them to identify what they saw. The Trobrianders
didn’t see a frightened face. Instead, they saw an indication of threat and aggression.

In other words, what we think of as a universal expression of fear isn’t universal at all.
But if Trobrianders have a different interpretation of facial expressions, what does that
mean?


One emerging – and increasingly supported – theory is that facial expressions don’t
reflect our feelings. Instead of reliable readouts of our emotional states, they show our
intentions and social goals.

The face acts “like a road sign to affect the traffic that’s going past it,” says Alan Fridlund,
a psychology professor at University of California Santa Barbara who co-authored a
recent study with De Montfort University's Crivelli arguing for a more utilitarian view of
facial expressions. “Our faces are ways we direct the trajectory of a social interaction.”

That’s not to say that we actively try to manipulate others with our facial expressions
(though every once in a while, we might). Our smiles and frowns may well be instinctive.

But our expressions are less a mirror of what’s going on inside than a signal we’re
sending about what we want to happen next. Your best ‘disgusted’ face, for example,
might show that you’re not happy with the way the conversation is going – and that you
want it to take a different tack.

“It’s the only reason that makes sense for facial expression to have evolved,” says Bridget
Waller, an evolutionary psychology professor at the University of Portsmouth. Faces, she
says, are always “giving some sort of important and useful information both to the
sender… and to the receiver.”

While it may seem sensible, this theory has been a long time coming.

The idea that emotions are fundamental, instinctive and expressed in our faces is deeply
ingrained in Western culture. Ancient Greeks placed the ‘passions’ in opposition to
reason; in the 17th Century, philosopher René Descartes laid out six basic passions
which could interfere with rational thought. Artist Charles Le Brun then connected them
to the face, laying out “the anatomically correct and appropriately nuanced facial
configuration for each Cartesian passion”, write Crivelli and Fridlund.
In the 1960s and ’70s, scientific research also began to back up the idea that a few basic
emotions could be universally understood through facial expressions.

In different countries around the world, researcher Paul Ekman asked subjects to match
photos of facial expressions with emotions or emotional scenarios. His studies seemed
to indicate that some expressions, and their corresponding feelings, were recognised by
people of all cultures. (These “basic emotions” were happiness, surprise, disgust, fear,
sadness, and anger.) Today, the legacy of Ekman’s theories is everywhere: from the
“Feelings” posters you see in preschools with their cartoons of smiles and frowns to a US
government programme designed to identify potential terrorists.

But this viewpoint has always had detractors. Margaret Mead, who believed that our
expressions were learned behaviours, was among them. So was Fridlund, who early in
his
career collaborated on two articles with Ekman before becoming disillusioned with
Ekman’s ideas.

Face-off

New research is challenging two of the main pillars of basic emotion theory. First is the
idea that some emotions are universally shared and recognised. Second is the belief that
facial expressions are reliable reflectors of those emotions. “They are two different
points which have really been confounded by scholars,” says Maria Gendron, a
psychology researcher at Northeastern University soon joining the Yale University
faculty.

That new research includes recent work by Crivelli. He has spent months immersed with
the Trobrianders of Papua New Guinea as well as the Mwani of Mozambique. With both
indigenous groups, he found that study participants did not attribute emotions to faces
in the same way Westerners do.

It was not just the face of fear, either. Shown a smiling face, only a small percentage of
Trobrianders declared that the face was happy. About half of those who were asked to
describe it in their own words called it “laughing”: a word that deals with action, not
feeling. And several described the smiling face as displaying the “magic of attraction”, a
uniquely Trobriand-identified emotion that Crivelli describes as “a raptured
enchantment”, or a feeling of being positively impacted by magic.

Gendron found similar reactions while studying other indigenous groups – the Himba
people in Namibia and the Hadza in Tanzania. Both groups, when asked to describe a
facial expression in their own words, tended not to describe an expression as “happy” or
“sad”. Instead, they would focus on the actions of the people in the photographs
(describing them as laughing or crying) or extrapolate reasons for the expressions
(“Someone has died”).

In other words, neither researcher found evidence that what is behind a facial expression
– including whether an expression reflects an innermost emotion at all – is innately or
universally understood.


Making matters more complicated, even when our facial expressions are interpreted by
others as exhibiting a certain feeling, they might pinpoint an emotion we’re not actually
experiencing.

In a 2017 analysis of about 50 studies, researchers found that only a minority of people’s
faces reflected their actual feelings. According to co-author Rainer Reisenzein, there was
one strong exception: amusement, which almost always resulted in smiling or laughter.

Reisenzein hesitates to interpret what those findings mean. “I’m one of these old-
fashioned scientists who just do research,” he jokes. However, he does feel that there
are good evolutionary reasons for us not to reveal our inner states to other people: “It
puts us at a disadvantage.”

If our expressions don’t actually reflect our feelings, there are enormous consequences.

One is in the field of artificial intelligence (AI), specifically robotics. “A good number of
people are training their artificial intelligence and their social robots using these classic
‘poster’ faces,” says Fridlund. But if someone who frowns at a robot is signalling
something other than simple unhappiness, the AI may respond to them incorrectly.

“There’s no way to predict how the robot should react when it sees a smiley face or a
pouty face or a growling face,” Fridlund points out. “You have to have some kind of
knowledge of the person’s role with respect to you, and also your history together,
before knowing what that face means.” Fridlund, who consults with companies that
develop AI, feels that AI taught to draw from contextual cues will be more effective.

For most of us, though, the new research may have the most effect on how we
interpret social interactions. It turns out that we might communicate better if we saw
faces not as mirroring hidden emotions – but rather as actively trying to speak to us.

People should read faces “kind of like a road sign,” says Fridlund. “It’s like a switch on a
railroad track: do we go here or do we go there in the conversation?” That scowl on your
friend’s face may not be actual anger; maybe she just wants you to agree with her point
of view. Your son’s pout doesn’t necessarily reflect sadness; he may just want you to
empathise or to protect him from an uncomfortable situation.


Take laughter, says Waller: “when you laugh and how you laugh within a social
interaction is absolutely crucial.” An inappropriately-timed laugh might not reveal your
inner joy at what’s going on – but it might show that you’re not paying close attention to
the conversation, or may even signal hostility.

For Crivelli, our faces may even be more calculating than that. He compares us to
puppeteers, with our expressions like “invisible wires or ropes that you are trying to use
to manipulate the other.”

And, of course, that other person is manipulating us right back. We’re social creatures,
after all.

