EMOTION AND MOOD RECOGNITION IN RESPONSE TO VIDEO

Authors

  • DINI HANDAYANI, Computer Science Department, Faculty of Information and Communication Technology, International Islamic University Malaysia, Malaysia
  • ABDUL WAHAB, Computer Science Department, Faculty of Information and Communication Technology, International Islamic University Malaysia, Malaysia
  • HAMWIRA YAACOB, Computer Science Department, Faculty of Information and Communication Technology, International Islamic University Malaysia, Malaysia

Keywords:

Affective computing, mood, emotion recognition, video clip, EEG

Abstract

This paper presents a subject-dependent, homogeneous emotion recognition method that uses electroencephalogram (EEG) signals recorded in response to video content, and examines the correlation between subjects' emotions and their moods in the resting state. In recent years there has been growing interest in recognizing emotions evoked by watching videos. In this study, two video clips with explicit emotional content, taken from movies and online resources, were shown to four subjects while their EEG signals were recorded. The best accuracies, 60.71% for valence and 63.73% for arousal, were obtained using Mel-frequency cepstral coefficients (MFCC) as features and a multilayer perceptron (MLP) as the classifier, showing that the MFCC and MLP techniques are applicable to emotion recognition. The results also show that a subject's mood can be recognized from an eyes-open or eyes-closed resting-state recording. Furthermore, positive video content was able to shift a subject into a positive emotional state even when that subject was in a bad mood, and the emotional states recognized while watching a video correlated with the subjects' Self-Assessment Manikin (SAM) ratings.
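The paper itself does not publish source code, but the pipeline the abstract describes (MFCC features extracted per EEG channel, classified with an MLP) can be sketched as below. This is a minimal illustration, not the authors' implementation: the 128 Hz sampling rate matches the Emotiv EPOC headset, while the coefficient count, FFT window, and network size are assumed values chosen for the example.

    # Minimal sketch of the MFCC + MLP pipeline described in the abstract.
    # Not the authors' code: sampling rate, window sizes, and the network
    # architecture are illustrative assumptions.
    import numpy as np
    import librosa
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier

    FS = 128       # Hz; Emotiv EPOC sampling rate (assumed)
    N_MFCC = 13    # cepstral coefficients per channel (assumed)

    def epoch_features(epoch):
        """Turn one (n_channels, n_samples) EEG epoch into a flat feature vector.

        MFCCs come from speech processing, but they can be computed on any
        1-D signal; each channel's (N_MFCC, n_frames) matrix is reduced to
        its per-coefficient mean over time.
        """
        per_channel = []
        for channel in epoch:
            mfcc = librosa.feature.mfcc(
                y=np.asarray(channel, dtype=float), sr=FS,
                n_mfcc=N_MFCC, n_fft=256, hop_length=128, n_mels=26,
            )
            per_channel.append(mfcc.mean(axis=1))
        return np.concatenate(per_channel)

    def evaluate(epochs, labels):
        """epochs: (n_epochs, n_channels, n_samples); labels: e.g. high/low valence."""
        X = np.array([epoch_features(e) for e in epochs])
        mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
        return cross_val_score(mlp, X, labels, cv=5).mean()

Because the method is subject-dependent, evaluate() would be run separately per subject, on that subject's epochs alone, once for valence labels and once for arousal labels.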



Published

2015-10-25

How to Cite

HANDAYANI, D., WAHAB, A., & YAACOB, H. (2015). EMOTION AND MOOD RECOGNITION IN RESPONSE TO VIDEO. Journal of Mobile Multimedia, 11(3-4), 296–312. Retrieved from https://journals.riverpublishers.com/index.php/JMM/article/view/4515
