
A new tool to support diagnosis of neurological disorders by means of facial expressions

Vitoantonio Bevilacqua 1,2, Dario D'Ambruoso 1, Giovanni Mandolino 1


1 Dipartimento di Elettrotecnica ed Elettronica, Politecnico di Bari, Via Orabona 4, 70125 Bari, Italy
2 e.B.I.S. s.r.l. (electronic Business in Security), Spin-Off of Polytechnic of Bari, Via Pavoncelli 139, Bari, Italy
Corresponding author: bevilacqua@poliba.it
Abstract: Recognizing facial emotions is an important aspect of interpersonal communication that may be impaired in various neurological disorders: Asperger's syndrome, Autism, Schizoid Personality, Parkinsonism, Urbach-Wiethe disease, Amyotrophic Lateral Sclerosis, Bipolar Disorder, Depression, and Alzheimer's disease. Although it is not possible to define emotions uniquely, we can say that they are mental states, physiological and psychophysiological changes associated with internal or external stimuli, both natural and learned. Replicating the studies of Charles Darwin, the Canadian psychologist Paul Ekman confirmed that an important feature of basic emotions is that they are expressed universally. The face, as is known, is the space where, in a communication process, most of the sensory information is concentrated, whether exhibited (by the issuer) or read (by the receiver). According to Ekman and Friesen, "the face is a multisignal, multimessage response system capable of tremendous flexibility and specificity." These signals combine to produce different messages: the movements of the facial muscles pull the skin, temporarily distorting the shape of the eyes, eyebrows and lips, and changing the appearance of folds, wrinkles and bulges in different parts of the face. This paper highlights certain requirements that the specification approach would need to meet for the production of such tools to be achievable. In particular, we present an innovative and still experimental tool to support the diagnosis of neurological disorders by means of facial expressions. At the same time, we propose a new study to measure several impairments of patients' emotion recognition ability, and to improve the reliability of using them in computer-aided diagnosis strategies.

Keywords: Emotion Recognition; Action Unit Recognition; Neurological Disorders

I. INTRODUCTION

The ability to produce and recognize facial expressions of emotion is an important component of interpersonal communication in humans and primates. Cultures may differ in social rules and customs, yet certain facial expressions of emotion are universally recognized. These universal emotions are happiness, sadness, anger, fear, disgust and surprise. The word "emotion" comes from the Latin "ex", which means out, and "motio", which means moving, so "emotion" indicates a movement, in particular a body movement. Ekman's theory is based on experimental analysis and cross-cultural comparison (Ekman et al., 1972). It has been observed that an emotional expression produced within a specific population is interpreted correctly and uniformly by other populations, and vice versa. If there are inborn expressions shared across the whole of humanity, then there must be common emotions that generate them, and these can be defined as primary. Ekman thus identified six primary emotions: 1. Happy; 2. Surprise; 3. Disgust; 4. Anger; 5. Fear; 6. Sadness.

FIGURE 1 Examples of emotional expressions (from left to right: neutral, happy, sad, angry, fear, disgust); mild (row 1) and extreme (row 2) expressions.

While universal emotions are recognized across different cultural and ethnic groups, social emotions, such as guilt, shame, arrogance, admiration and flirtatiousness, are particular to culture-specific interactions. Processing of affect is supported by distributed neural systems and, as lesion studies show, preferentially by the right hemisphere. On the basis of human and primate studies, cortical areas of the inferior occipital gyrus, fusiform gyrus and inferior temporal gyrus have been reported as essential for face processing. The amygdala receives input from these cortical areas and from the visual pathways of the midbrain and thalamus [1]. The amygdala is mainly involved in the processing of fear, but is activated upon passive and implicit presentation of facial expressions of emotion in general. Orbitofrontal areas, in particular on the right, are involved in the explicit identification of facial emotions. Both the amygdala and frontal areas can modulate sensory areas via feedback mechanisms, engage other cortical areas, such as the hippocampus and neocortex, and, via connections to the motor cortex, hypothalamus and brainstem, generate a physical response. There are two main theories regarding hemispheric specialization in processing emotions. The right-hemisphere hypothesis posits that the right hemisphere is specialized to process all emotions, whereas the valence hypothesis regards the right hemisphere as superior for processing negative emotions and the left hemisphere as superior for positive ones. Common negative emotions are sadness, fear and anger; common positive emotions are happiness and, possibly, surprise. While cognitive dysfunction has been thoroughly evaluated across neuropsychiatric disorders, impairments in emotion recognition have received increasing attention within the past 15 years. Early investigations on emotion recognition were limited to groups of persons with schizophrenia, depression and brain injury; however, within the past decade, emotion recognition deficits have been explored in a wide range of brain-related disorders. In the following sections we analyze the neurological disorders linked to emotion recognition deficits and to marked impairment in the use of facial expression.

FIGURE 2 Sectional image of the anatomical limbic lobe.

II. EMOTIONAL RECOGNITION DEFICIT

In Schizophrenia the emotion processing deficits may relate to dysfunction of mesial temporal regions. Schizophrenia patients are impaired in overall emotion recognition, particularly of fear and disgust, and neutral cues are frequently misidentified as unpleasant or threatening. Subjects affected by Bipolar Disorder show impairment in emotion recognition, in particular for fear and disgust. Compared with matched controls on a task that included happy, neutral and sad faces, a group of Depression patients was more impaired on the recognition of happiness and considered it a negative emotion [4]. In Urbach-Wiethe disease (lipoid proteinosis), a disorder that produces a selective and severe bilateral destruction of the amygdala while sparing hippocampal and neocortical structures, Adolphs and colleagues [5] reported a selective impairment of fear recognition when comparing subjects with brain damage and healthy ones. In patients with Alzheimer's disease there may be a deficit in processing some or all facial expressions of emotion [6]. Emotion recognition deficits also occur in bulbar Amyotrophic Lateral Sclerosis, particularly with emotional facial expressions, and can arise independently of depressive and dementia symptoms or of co-morbidity with depression and dementia. These findings expand the scope of the cognitive dysfunction detected in ALS and bolster the view of ALS as a multisystem disorder involving cognitive and motor deficits [7].

III. IMPAIRMENT IN THE USE OF FACIAL EXPRESSION

In Asperger's Disorder there is a marked impairment in the use of multiple nonverbal behaviors, such as eye-to-eye gaze, facial expression, body postures, and gestures to regulate social interaction, and a lack of spontaneous seeking to share enjoyment [2]. In Parkinsonism a decrease of spontaneous facial expressions, gestures, speech, or body movements is known. Individuals with Schizoid Personality Disorder usually display a "bland" exterior without visible emotional reactivity and rarely reciprocate gestures or facial expressions, such as smiles or nods; they rarely experience strong emotions such as anger and joy. In Major Depressive Disorder the presence of a depressed mood can often be inferred from the person's facial expression and demeanor; some individuals emphasize somatic complaints (e.g., bodily aches and pains) rather than reporting feelings of sadness, and many report or exhibit increased irritability (e.g., persistent anger, a tendency to respond to events with angry outbursts). In Autistic Disorder a marked impairment in the use of multiple nonverbal behaviors, such as eye-to-eye gaze and facial expression, is known [3].

IV. ACTION UNITS

Before describing the tool, we explain the meaning of Action Units (AUs) [8]. AUs are the basic elements for the construction of an expression: they represent minimal facial actions that cannot be further separated into simpler actions. Muscle actions and AUs do not coincide; an AU may correspond to the action of one or more muscles, and a muscle can be associated with several AUs. An AU is, in short, a basic change in the appearance of the face, caused by the activation of one or more facial muscles. The AUs are divided into groups according to the position and/or the type of action involved. The first AUs belong to the UPPER FACE, and affect the eyebrows, forehead, and eyelids. The LOWER FACE AUs are then presented in five groups: Up/Down, Horizontal, Oblique, Orbital, and Miscellaneous.
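As a concrete reading of this grouping, the following minimal C++ sketch (our illustration, not part of FACS or of the tool) encodes the three AUs used in the rest of the paper; the region and group assignments reflect our reading of the FACS manual [8].

```cpp
#include <string>
#include <vector>

// Face region and action group of an AU, following the FACS grouping
// described above (upper-face AUs; lower-face AUs in five groups).
enum class Region { UpperFace, LowerFace };
enum class Group  { Brows, UpDown, Horizontal, Oblique, Orbital, Misc };

struct ActionUnit {
    int         id;     // FACS number, e.g. 4, 9, 12
    std::string name;   // FACS name, e.g. "Brow Lowerer"
    Region      region;
    Group       group;
};

// The three AUs this paper later uses to separate positive from
// negative attitudes (group assignments: our reading of the manual).
const std::vector<ActionUnit> kSelectedAUs = {
    {4,  "Brow Lowerer",      Region::UpperFace, Group::Brows},
    {9,  "Nose Wrinkler",     Region::LowerFace, Group::UpDown},
    {12, "Lip Corner Puller", Region::LowerFace, Group::Oblique},
};
```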

V. METHODS

The basic idea is to track the face after it has been detected, using a C++ software library for finding features in faces: Stasm [9] or the automatic facial feature points detector (AFFPD) [10]. Stasm extends the Active Shape Model (ASM) of Tim Cootes and his colleagues with other techniques. Before the search for facial features begins, Stasm uses the Rowley or the Viola-Jones face detector to locate the overall position of the face. Stasm then fits a standard shape to the frontal face image with the ASM algorithm. The shape points are fitted by moving them along a line orthogonal to the boundary, called the whisker; the movement is computed by weighing the changes in gray levels along the whisker. Each point has an associated profile vector, and the image is sampled along the one-dimensional whisker. After the shape is fitted, Stasm creates a log file and the image with the overlapped shape, and the algorithm proceeds by acquiring the shape point coordinates from the log file. In short, given an image of a face, Stasm tracks the fiducial points and returns the positions of the facial features; it also allows you to build your own Active Shape Models (Figure 3).

FIGURE 3 The "mean face" (yellow) positioned over a search face at the start of a search.

AFFPD performs a face segmentation that localizes the various face components (eyebrows, eyes, mouth, nose and chin). Within these components it then detects 18 feature points: the two pupils, the four eye corners, the four eyebrow corners, the two nostrils, the nose tip, the two mouth corners, the upper and lower lip extremities, and the tip of the chin. It uses the competitive learning algorithm developed by Mavrinac to divide the image into three clusters (the darkest cluster contains face features such as eyes, mouth, nostrils and eyebrows), then finds the regions that probably contain the eyes by means of template matching, and verifies the real presence of the eyes with a Support Vector Machine classifier. Once the eyes are detected, it proceeds to the localization of the other facial components using simple anthropometrical considerations. Both steps are sketched below.
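To give an idea of how the Stasm step can be driven programmatically, here is a minimal sketch written against the C interface of Stasm version 4 (stasm_search_single and stasm_NLANDMARKS from stasm_lib.h); earlier Stasm releases expose a different interface, so the call below should be taken as indicative rather than definitive.

```cpp
#include <cstdio>
#include <opencv2/opencv.hpp>
#include "stasm_lib.h"  // Stasm 4 C interface (assumed version)

int main(int argc, char** argv) {
    const char* path = (argc > 1) ? argv[1] : "face.jpg";
    // Stasm expects a single-channel gray image; it runs its own
    // Rowley / Viola-Jones face detector internally before the ASM search.
    cv::Mat_<unsigned char> img = cv::imread(path, cv::IMREAD_GRAYSCALE);
    if (!img.data) { std::printf("cannot open %s\n", path); return 1; }

    int foundface = 0;
    float landmarks[2 * stasm_NLANDMARKS];  // x0, y0, x1, y1, ...
    if (!stasm_search_single(&foundface, landmarks,
                             reinterpret_cast<const char*>(img.data),
                             img.cols, img.rows, path,
                             "data" /* Stasm model directory */)) {
        std::printf("stasm error\n");
        return 1;
    }
    if (!foundface) { std::printf("no face found\n"); return 1; }

    // The fitted shape: one (x, y) pair per model point, the input
    // to the AU analysis described in the following sections.
    for (int i = 0; i < stasm_NLANDMARKS; i++)
        std::printf("point %2d: (%.1f, %.1f)\n",
                    i, landmarks[2 * i], landmarks[2 * i + 1]);
    return 0;
}
```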

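The clustering stage of AFFPD can be illustrated with OpenCV's k-means used as a stand-in for Mavrinac's competitive learning algorithm (an assumption made purely for this sketch; the detector of [10] does not use cv::kmeans): the gray levels are partitioned into three clusters and the darkest cluster is kept as a mask of candidate feature pixels.

```cpp
#include <opencv2/opencv.hpp>

// Sketch: split a gray face image into three intensity clusters and return
// a binary mask of the darkest one, which should contain the eyes, eyebrows,
// nostrils and mouth. cv::kmeans stands in for the competitive learning
// algorithm actually used by AFFPD [10].
cv::Mat darkestClusterMask(const cv::Mat& gray) {  // gray: CV_8UC1
    cv::Mat samples;
    gray.convertTo(samples, CV_32F);
    samples = samples.reshape(1, (int)gray.total());  // one pixel per row

    cv::Mat labels, centers;
    cv::kmeans(samples, 3, labels,
               cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT,
                                10, 1.0),
               3, cv::KMEANS_PP_CENTERS, centers);

    // Pick the cluster whose center has the lowest mean gray level.
    int darkest = 0;
    for (int k = 1; k < 3; k++)
        if (centers.at<float>(k, 0) < centers.at<float>(darkest, 0))
            darkest = k;

    // Mark the pixels assigned to that cluster.
    cv::Mat mask(gray.size(), CV_8U);
    for (int i = 0; i < (int)gray.total(); i++)
        mask.at<unsigned char>(i / gray.cols, i % gray.cols) =
            (labels.at<int>(i, 0) == darkest) ? 255 : 0;
    return mask;
}
```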
FIGURE 4 Identification of the feature points on the face.

VI. FROM ACTION UNIT TO EMOTION

The numerous state-of-the-art works on the encoding of emotions according to FACS allowed us to compile a matrix in which each AU is associated with the expressions (and emotions) in which it occurs. Below we list only three of all the AUs, precisely the ones we selected as indicative of the difference between a positive and a negative attitude.

AU 4 (Brow Lowerer): Anger, Fear, Sadness [11]; Anger, Fear, Sadness [12]; Anger, Fear, Sadness [14]; Anger, Fear, Sadness [15]; Sadness 100%, Disgust 87.5%, Anger 85.5%, Fear 75%, Embarrassment 12.5%, Surprise 37.5% [16].

AU 9 (Nose Wrinkler): Disgust [11]; Disgust [12]; Disgust [13]; Disgust [14]; Disgust [15]; Disgust 75% [16].

AU 12 (Lip Corner Puller): Happy [11]; Happy [12]; Happy [13]; Happy [14]; Happy [15]; Joy 100%, Happy 50% [16].
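The matrix translates directly into a lookup table. The sketch below is our condensation of it: for each of the three AUs we keep the emotions reported by the majority of [11]-[15] and drop the percentage scores of [16].

```cpp
#include <map>
#include <set>
#include <string>
#include <vector>

// AU -> emotions in which it occurs, condensed from the matrix above
// (majority reading of [11]-[15]; the scores of [16] are dropped).
const std::map<int, std::set<std::string>> kAuToEmotions = {
    {4,  {"anger", "fear", "sadness"}},  // AU 4, Brow Lowerer
    {9,  {"disgust"}},                   // AU 9, Nose Wrinkler
    {12, {"happiness"}},                 // AU 12, Lip Corner Puller
};

// Emotions compatible with every detected AU (empty if none agree).
std::set<std::string> candidateEmotions(const std::vector<int>& detectedAUs) {
    std::set<std::string> result;
    bool first = true;
    for (int au : detectedAUs) {
        auto it = kAuToEmotions.find(au);
        if (it == kAuToEmotions.end()) continue;  // AU outside our matrix
        if (first) { result = it->second; first = false; continue; }
        std::set<std::string> kept;
        for (const std::string& e : result)
            if (it->second.count(e)) kept.insert(e);
        result = kept;
    }
    return result;
}
```

Intersecting the candidate sets of all detected AUs narrows the hypothesis down to the emotions compatible with the whole expression.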

It should be noted that [16] uses a different nomenclature for the emotions, namely the theory of Robert Plutchik, according to which the basic emotions are not six but eight, grouped two by two in pairs of opposites.

CONCLUSIONS

Analyzing the photos at our disposal, as well as the matrix shown above, we can conclude that to distinguish between positive and negative emotions it is not necessary to evaluate all the AUs: we will consider negative those expressions presenting AU 4 (Figure 6) or AU 9 (Figure 7), and positive those presenting only AU 12 (Figure 8).
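The rule just stated fits in a few lines; in the following sketch the presence of each AU is assumed to have been decided upstream by the methods of Section V.

```cpp
enum class Attitude { Positive, Negative, Undecided };

// The paper's rule: AU 4 or AU 9 present -> negative attitude;
// only AU 12 present -> positive; anything else stays undecided.
Attitude classifyAttitude(bool au4, bool au9, bool au12) {
    if (au4 || au9) return Attitude::Negative;
    if (au12)       return Attitude::Positive;
    return Attitude::Undecided;
}
```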

FIGURE 6, FIGURE 7, FIGURE 8 These images represent particular AUs, in order: Brow Lowerer, Nose Wrinkler, Lip Corner Puller.

The question would be different if we had to identify the specific emotion rather than make the simple distinction between positive and negative attitudes. A still different question arises if we wanted to distinguish a false attitude from a real one: a smile, for example, should be considered genuine only if it presents the so-called crow's feet at the corners of the eyes; otherwise we can be sure it is forced. Finally, if higher recognition accuracy is required, we should consider not only the displacements of the red points in the pictures above, but also the formation of wrinkles in particular areas of the face, such as the forehead and the nasolabial area, which could be detected with certain filters of the OpenCV library. In conclusion, to make the simple distinction between positive and negative emotions, and thus recognize AU 4, AU 9 and AU 12, we must evaluate the following (a geometric sketch follows the list):

- the approach and lowering of the inner corners of the eyebrows (AU 4);
- the rise of the nostrils and the upper lip (AU 9);
- the moving apart and raising of the corners of the mouth (AU 12).
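In terms of the tracked points, the three items above reduce to simple geometric tests against a neutral reference frame of the same face. In the sketch below the KeyPoints struct and the thresholds are ours and purely illustrative, and we assume the relevant points have already been selected from the fitted shape (the indices depend on the shape model used):

```cpp
#include <cmath>
#include <opencv2/core.hpp>

// Hypothetical landmark subset, assumed to be picked out of the fitted
// shape beforehand; the actual point indices depend on the shape model.
struct KeyPoints {
    cv::Point2f innerBrowL, innerBrowR;        // inner eyebrow corners (AU 4)
    cv::Point2f nostrilL, nostrilR, upperLip;  // nose / upper lip (AU 9)
    cv::Point2f mouthL, mouthR;                // mouth corners (AU 12)
};

static double dist(const cv::Point2f& a, const cv::Point2f& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// AU 4: inner brow corners move down (larger y in image coords) and closer.
bool browsLoweredAndCloser(const KeyPoints& now, const KeyPoints& neu,
                           double dropThresh, double gapThresh) {
    double lowered = (now.innerBrowL.y + now.innerBrowR.y) / 2.0 -
                     (neu.innerBrowL.y + neu.innerBrowR.y) / 2.0;
    double closer = dist(neu.innerBrowL, neu.innerBrowR) -
                    dist(now.innerBrowL, now.innerBrowR);
    return lowered > dropThresh && closer > gapThresh;
}

// AU 9: nostrils and upper lip move up (smaller y).
bool noseAndLipRaised(const KeyPoints& now, const KeyPoints& neu,
                      double riseThresh) {
    double raised = (neu.nostrilL.y + neu.nostrilR.y + neu.upperLip.y) / 3.0 -
                    (now.nostrilL.y + now.nostrilR.y + now.upperLip.y) / 3.0;
    return raised > riseThresh;
}

// AU 12: mouth corners move apart and up.
bool mouthCornersApartAndUp(const KeyPoints& now, const KeyPoints& neu,
                            double widenThresh, double riseThresh) {
    double apart = dist(now.mouthL, now.mouthR) - dist(neu.mouthL, neu.mouthR);
    double raised = (neu.mouthL.y + neu.mouthR.y) / 2.0 -
                    (now.mouthL.y + now.mouthR.y) / 2.0;
    return apart > widenThresh && raised > riseThresh;
}
```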

To extract the points discussed above, we use the Stasm software or the automatic facial feature points detector. The tool ultimately aims to recognize the expressions of a face and to translate them into positive and negative attitudes. The final goals are: 1. to help diagnose neurological disorders that involve a loss of expressiveness in the face: a photo or video is shown to the patient, the patient responds by changing his or her facial expression, and the tool decodes the expression intensity or its absence; 2. to help patients who are unable to decode expressions to recognize them.

REFERENCES
[1] C. G. Kohler, T. H. Turner, R. E. Gur, et al., "Recognition of Facial Emotions in Neuropsychiatric Disorders," 2004.
[2] American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, 1994.
[3] A. J. Lerner, Diagnostic Criteria in Neurology, 2006.
[4] R. C. Gur, R. J. Erwin, R. E. Gur, et al., "Facial emotion discrimination: II. Behavioral findings in depression," Psychiatry Res. 42:241-251, 1992.
[5] R. Adolphs, D. Tranel, H. Damasio, et al., "Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala," Nature 372:669-672, 1994.
[6] C. G. Kohler, G. Anselmo-Gallagher, et al., "Emotion-discrimination deficits in mild Alzheimer disease," Am J Geriatr Psychiatry 13:926-933, 2005.
[7] E. K. Zimmerman, P. J. Eslinger, et al., "Emotional perception deficits in amyotrophic lateral sclerosis," Dept. of Neurology, College of Medicine, Penn State/Hershey Medical Center, Hershey, PA, 2007.
[8] P. Ekman, et al., FACS: Facial Action Coding System, 2002.
[9] S. Milborrow and F. Nicolls, "Locating Facial Features with an Extended Active Shape Model," 2008.
[10] V. Bevilacqua et al., "Automatic Facial Feature Points Detection," in LNAI 5227, D.-S. Huang et al., Eds., pp. 1142-1149, 2008.
[11] I. Kotsia and I. Pitas, "Facial Expression Recognition in Image Sequences using Geometric Deformation Features and Support Vector Machines," 2007.
[12] B. M. Waller et al., "Selection for Universal Facial Emotion," 2008.
[13] H. Seyedarabi, W.-S. Lee, et al., "Classification of Upper and Lower Face Action Units and Facial Expressions using Hybrid Tracking System and Probabilistic Neural Networks," 2006.
[14] W. V. Friesen and P. Ekman, EMFACS, 1984.
[15] P. Ekman, Investigator's Guide (Facial Action Coding System).
[16] S. T. Hawk, G. A. van Kleef, A. H. Fischer, and J. van der Schalk, "Worth a Thousand Words: Absolute and Relative Decoding of Nonlinguistic Affect Vocalizations (supplement)," 2009.

