Academic Journals Database
Disseminating quality controlled scientific knowledge

Facial Emotion Recognition Using Context Based Multimodal Approach

Author(s): Priya Metri | Jayshree Ghorpade | Ayesha Butalia

Journal: International Journal of Interactive Multimedia and Artificial Intelligence
ISSN 1989-1660

Volume: 1; Issue: 4; Start page: 12; Date: 2011

Keywords: Body posture recognition system | Emotion recognition | Face Detection | Facial Action Units | Facial expression recognition system | Multimodal approach

ABSTRACT
Emotions play a crucial role in person-to-person interaction. In recent years, there has been growing interest in improving all aspects of interaction between humans and computers. The ability to understand human emotions, especially by observing facial expressions, is desirable for the computer in several applications. This paper explores a way of human-computer interaction that enables the computer to be more aware of the user's emotional expressions. We present an approach for emotion recognition from facial expression and from hand and body posture. Our model is a multimodal emotion recognition system in which two separate classifiers, one for facial expression recognition and one for hand and body posture recognition, are combined by a third classifier that produces the resulting emotion. The multimodal system gives more accurate results than a single-modality or bimodal system.
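
The abstract describes a decision-level (late) fusion architecture: two modality-specific classifiers whose outputs are combined by a third classifier. The sketch below illustrates that idea only; the choice of SVMs for the modality classifiers, logistic regression for the fusion classifier, and the randomly generated placeholder features are assumptions, since the abstract does not specify models or feature extraction.

    # Minimal sketch of the two-classifier + fusion-classifier setup described
    # in the abstract. Data and model choices are placeholders/assumptions.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder features standing in for facial-expression descriptors
    # (e.g. Facial Action Unit activations) and hand/body-posture descriptors.
    n_samples, n_face_feats, n_body_feats, n_emotions = 600, 20, 12, 6
    X_face = rng.normal(size=(n_samples, n_face_feats))
    X_body = rng.normal(size=(n_samples, n_body_feats))
    y = rng.integers(0, n_emotions, size=n_samples)

    Xf_tr, Xf_te, Xb_tr, Xb_te, y_tr, y_te = train_test_split(
        X_face, X_body, y, test_size=0.25, random_state=0)

    # Modality-specific classifiers (assumed to be SVMs here).
    face_clf = SVC(probability=True).fit(Xf_tr, y_tr)
    body_clf = SVC(probability=True).fit(Xb_tr, y_tr)

    # Third classifier fuses the per-modality class probabilities.
    fusion_train = np.hstack([face_clf.predict_proba(Xf_tr),
                              body_clf.predict_proba(Xb_tr)])
    fusion_clf = LogisticRegression(max_iter=1000).fit(fusion_train, y_tr)

    fusion_test = np.hstack([face_clf.predict_proba(Xf_te),
                             body_clf.predict_proba(Xb_te)])
    print("fused accuracy:", fusion_clf.score(fusion_test, y_te))

With real facial and posture features, the fusion classifier can learn how much to trust each modality per emotion class, which is the usual motivation for combining classifier outputs rather than raw features.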