Title: Individualizing the New Interfaces: Extraction of User's Emotions from Facial Data
Authors: Hupont, Isabelle; Cerezo, Eva
Editors: Pere Brunet, Nuno Correia, Gladimir Baranoski
Year: 2006
Record date: 2014-01-31
ISBN: 3-905673-60-6
DOI: https://doi.org/10.2312/LocalChapterEvents/siacg/siacg06/179-185

Abstract: When developing new multimodal user interfaces, emotional user information may be of great interest. In this paper we present a simple and computationally feasible method to perform automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 feature points) to extract relevant emotional information: essentially five distances, the presence of wrinkles, and the mouth shape. The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned with a database of 399 images. We analyze the effect of different facial parameters, as well as factors such as gender and ethnicity, on the classification results. At present, the method is applied only to static images.

Categories and Subject Descriptors (according to ACM CCS): I.3.6 [Computer Graphics]: Interaction Techniques; I.4.8 [Image Processing and Computer Vision]: Scene Analysis
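The pipeline summarized in the abstract (feature points → characteristic distances → emotion label) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the point indices, the five distance pairings, and the per-emotion prototype vectors are all hypothetical placeholders, since the record does not specify them.

```python
import math

# Seven target classes: the six basic emotions plus neutral, as in the abstract.
EMOTIONS = ["joy", "sadness", "anger", "fear", "disgust", "surprise", "neutral"]


def distance(p, q):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])


def extract_features(points):
    """Reduce 10 (x, y) facial feature points to five characteristic distances.

    The pairings below (e.g. eyebrow-to-eye, eye opening, mouth width) are
    hypothetical stand-ins for whatever distances the method actually uses.
    """
    pairs = [(0, 2), (2, 4), (6, 7), (8, 9), (1, 8)]
    return [distance(points[i], points[j]) for i, j in pairs]


def classify(features, prototypes):
    """Return the emotion whose prototype distance vector best matches
    the observed features (smallest sum of squared differences)."""
    def error(emotion):
        return sum((f - p) ** 2 for f, p in zip(features, prototypes[emotion]))
    return min(prototypes, key=error)
```

A caller would supply prototype distance vectors (e.g. tuned on a labeled image database, as the paper tunes its rules on 399 images) and pass the 10 detected points through `extract_features` before calling `classify`.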