Filmlerde Otomatik Duygu Analizi için Sentetik Veri Üretimi ve Çokkipli Birleştirme

Translated title of the contribution: Movie emotion estimation with multimodal fusion and synthetic data generation

Nihan Karslioglu, Heysem Kaya, Albert Ali Salah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

In this work, we propose a method for automatic emotion recognition from movie clips. This problem has applications in the indexing and retrieval of large movie and video collections, summarization of visual content, selection of emotion-invoking material, and so on. Our approach estimates valence and arousal values automatically. We extract audio and visual features and summarize them via functionals, PCA, and Fisher vector encoding. Feature selection is based on canonical correlation analysis, and for classification we use extreme learning machines and support vector machines. We evaluate our approach on the LIRIS-ACCEDE database with ground-truth annotations, and address the class imbalance problem by generating synthetic data. By fusing the best features at the score and feature levels, we obtain good results on this problem, especially for valence prediction.
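The abstract describes the pipeline only at a high level; the sketch below illustrates one plausible reading of it with scikit-learn stand-ins. It is not the authors' code: the feature matrices are random placeholders, the oversample helper is an assumed SMOTE-style interpolation, and all dimensionalities, kernels, and parameter values are illustrative assumptions (the paper itself uses functionals, Fisher vectors, and extreme learning machines on LIRIS-ACCEDE features).

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder per-clip audio and visual summaries (in the paper these come
# from functionals, PCA, and Fisher vector encoding of low-level features).
n_clips = 400
X_audio = rng.normal(size=(n_clips, 64))
X_visual = rng.normal(size=(n_clips, 128))
y_valence = rng.integers(0, 2, size=n_clips)   # assumed binary valence labels
y_arousal = rng.integers(0, 2, size=n_clips)   # assumed binary arousal labels

# Feature-level fusion: concatenate the modality-specific representations.
X = np.hstack([X_audio, X_visual])
X_tr, X_te, v_tr, v_te, a_tr, a_te = train_test_split(
    X, y_valence, y_arousal, test_size=0.3, random_state=0)

# Dimensionality reduction of the fused features.
pca = PCA(n_components=32).fit(X_tr)
X_tr, X_te = pca.transform(X_tr), pca.transform(X_te)

# CCA-based selection stand-in: keep the projections most correlated with
# the (valence, arousal) targets.
cca = CCA(n_components=2).fit(X_tr, np.column_stack([v_tr, a_tr]))
X_tr, X_te = cca.transform(X_tr), cca.transform(X_te)

# Naive synthetic oversampling of the minority class by interpolating
# between random minority-class pairs (an assumed SMOTE-like stand-in).
def oversample(X, y):
    classes, counts = np.unique(y, return_counts=True)
    minority = classes[np.argmin(counts)]
    need = int(counts.max() - counts.min())
    Xm = X[y == minority]
    i, j = rng.integers(0, len(Xm), size=(2, need))
    lam = rng.random((need, 1))
    X_syn = Xm[i] + lam * (Xm[j] - Xm[i])
    return np.vstack([X, X_syn]), np.concatenate([y, np.full(need, minority)])

X_tr_bal, v_tr_bal = oversample(X_tr, v_tr)

# SVM classifier for valence (the paper also reports extreme learning machines).
clf = SVC(kernel="rbf").fit(X_tr_bal, v_tr_bal)
print("valence accuracy:", clf.score(X_te, v_te))

Only feature-level fusion is shown here; the paper additionally evaluates score-level fusion of the per-modality predictions.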

Translated title of the contribution: Movie emotion estimation with multimodal fusion and synthetic data generation
Original language: Turkish
Title of host publication: 27th Signal Processing and Communications Applications Conference, SIU 2019
Publisher: IEEE
ISBN (Electronic): 9781728119045
DOIs
Publication status: Published - 1 Apr 2019
Event: 27th Signal Processing and Communications Applications Conference, SIU 2019 - Sivas, Turkey
Duration: 24 Apr 2019 - 26 Apr 2019

Conference

Conference: 27th Signal Processing and Communications Applications Conference, SIU 2019
Country/Territory: Turkey
City: Sivas
Period: 24/04/19 - 26/04/19

Keywords

  • Affective computing
  • Movie analysis
  • Multimodal fusion
