
Emotion Recognition for Mentally Retarded Person Using Viola Jones Algorithm

Anuja A. Bagade, Nagnath B. Hulle

Abstract


Automatic recognition of facial expressions for mentally retarded persons is a very challenging task in today's electronic age. In human social interaction, facial expression is a means of conveying one's emotional and mental state, so human-machine interface systems play a vital role in non-verbal communication. The proposed system draws on human-centered design research on families with a mentally retarded member. It adopts the magic mirror paradigm, an augmented reality (AR) system in which a camera and a display device act as a mirror so that users can see their own reflection. Through the camera, the patient's emotional/mental state is recognized and compared with a database. We have developed an embedded module containing an information package unit and several input/output units, with external hardware that includes a GSM module. This human-machine interface system, specially designed for mentally retarded persons, allows the patient's parents or doctor to keep an eye on the patient's minute-by-minute activity. Aspects of design, technology, and results are presented. The purpose of this project is to implement and thereby recreate the face detection algorithm presented by Viola and Jones. The ideal goal of any face detection algorithm is to perform on par with a human inspecting the same image, but this project constrains itself to matching the figures reported by Viola and Jones. This paper describes a human-machine interface approach to face detection that is capable of detecting faces and facial expressions.
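
A minimal sketch of the pipeline described above, assuming OpenCV's bundled Haar cascade (the library's Viola-Jones style detector) for the face detection step and a serial-attached GSM modem driven by standard AT commands for the SMS step. Names such as EMOTION_DATABASE, detect_faces, and notify_caregiver are hypothetical placeholders introduced for illustration, not part of the authors' implementation:

```python
import cv2
import serial  # pyserial, used here only to illustrate the GSM/SMS step

# Haar cascade shipped with OpenCV; this is a Viola-Jones style detector.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Hypothetical reference database of expression labels that should trigger an alert.
EMOTION_DATABASE = {"distress", "anger", "fear"}


def detect_faces(frame):
    """Return bounding boxes of faces found by the Viola-Jones cascade."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)


def notify_caregiver(port, phone_number, message):
    """Send an SMS through a GSM modem using plain AT commands (illustrative only)."""
    with serial.Serial(port, 9600, timeout=5) as modem:
        modem.write(b"AT+CMGF=1\r")              # switch modem to text mode
        modem.read(64)
        modem.write(f'AT+CMGS="{phone_number}"\r'.encode())
        modem.read(64)
        modem.write(message.encode() + b"\x1a")  # Ctrl-Z terminates the SMS body
        modem.read(64)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                    # camera behind the "magic mirror"
    ok, frame = cap.read()
    cap.release()
    if ok:
        for (x, y, w, h) in detect_faces(frame):
            # A real system would classify the expression of the detected face here;
            # this sketch only shows where that label is compared with the database.
            label = "neutral"                    # placeholder for the recognized emotion
            if label in EMOTION_DATABASE:
                notify_caregiver("/dev/ttyUSB0", "+10000000000",
                                 f"Patient emotion detected: {label}")
```

The detection and notification steps are kept in separate functions so that the expression classifier (not shown here) can be swapped in between them without touching the camera or GSM code.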


Keywords


Emotion Recognition, Mentally Retarded Person, Augmented Reality (AR) System, Viola-Jones Algorithm, SMS Reminder, GSM Module.


References


Yuan-Chih Yu, Shingchern D. You, Dwen-Ren Tsai, “Magic Mirror Table for Social-Emotion Alleviation in the Smart Home”, IEEE Transactions on Consumer Electronics, Vol. 58, No. 1, February 2012.

Hiroko Sukeda, Youichi Horry, Yukinobu Maruyama, and Takeshi Hoshino, “Information-Accessing Furniture to Make Our Everyday Lives More Comfortable”, IEEE Transactions on Consumer Electronics, Vol. 52, No. 1, February 2006.

Min Kyung Lee, Scott Davidoff, John Zimmerman, and Anind Dey, “Smart Homes, Families, and Control”, Carnegie Mellon University Research Showcase.

Scott Davidoff, Min Kyung Lee, John Zimmerman and Anind Dey, “Socially-Aware Requirements for a Smart Home”, Carnegie Mellon University.

Maja Pantic and Ioannis Patras, “Detecting Facial Actions and their Temporal Segments in Nearly Frontal-View Face Image Sequences”, 2005 IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, Hawaii, October 10–12, 2005.

Maja Pantic and Ioannis Patras, “Dynamics of Facial Expression: Recognition of Facial Actions and Their Temporal Segments From Face Profile Image Sequences”, IEEE Transactions on Systems, Man, and Cybernetics—Part B: Cybernetics, Vol. 36, No. 2, April 2006.

Hans van Kuilenburg, Marco Wiering, and Marten den Uyl, “A Model Based Method for Automatic Facial Expression Recognition”, in Proceedings of ECML 2005.

A. Vinciarelli, H. Salamin, and M. Pantic, “Social Signal Processing: Understanding Social Interactions through Nonverbal Behaviour Analysis”, IEEE, 2009.

Ying-li Tian, Takeo Kanade, and Jeffrey F. Cohn, “Recognizing Action Units for Facial Expression Analysis”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 2, February 2001.

Alessandro Vinciarelli, M. Pantic, and Hervé Bourlard, “Social signal processing: Survey of an emerging domain”, Image and Vision Computing, Vol. 27, 2009, pp. 1743–1759.

C. P. Sumathi, T. Santhanam, and M. Mahadevi, “Automatic Facial Expression Analysis: A Survey”, International Journal of Computer Science & Engineering Survey (IJCSES), Vol. 3, No. 6, December 2012.

Stelios Krinidis, Ioannis Pitas, “Statistical Analysis of Human Facial Expressions”, Journal of Information Hiding and Multimedia Signal Processing, Ubiquitous International, Vol. 1, No. 3, July 2010.

T. Balomenos, A. Raouzaiou, S. Ioannou, A. Drosopoulos, K. Karpouzis, and S. Kollias, “Emotion Analysis in Man-Machine Interaction Systems”, in S. Bengio and H. Bourlard (Eds.), MLMI 2004, LNCS 3361, pp. 318–328, Springer-Verlag Berlin Heidelberg, 2005.

Jeffrey F. Cohn, “Foundations of Human Computing: Facial Expression and Emotion”, Department of Psychology, University of Pittsburgh, Pittsburgh, PA 15260, USA.

Maja Pantic, Alex Pentland, Anton Nijholt, and Thomas Huang, “Human Computing and Machine Understanding of Human Behavior: A Survey”, ICMI ’06, November 2–4, 2006, Banff, Alberta, Canada. ACM, 2006.

Yubo Wang, Haizhou Ai, Bo Wu, Chang Huang, “Real Time Facial Expression Recognition with Adaboost”, Dept. of Computer Science and Technology, Tsinghua University, State Key Laboratory of Intelligent Technology and Systems, Beijing 100084, PR China.

Ole Helvig Jensen, “Implementing the Viola-Jones Face Detection Algorithm”, IMM M.Sc. Thesis, ISBN 87-643-0008-0, ISSN 1601-233X.

Theo Ephraim and Tristan Himmelman, “Optimizing Viola-Jones Face Detection for Use in Webcams”.




Creative Commons License
This work is licensed under a Creative Commons Attribution 3.0 License.