IIIT Hyderabad Publications
Evaluating Content-centric vs User-centric Ad Affect Recognition

Authors: Abhinav Shukla, Shruti Shriya Gullapuram, Harish Katti, Karthik Yadati, Mohan Kankanhalli, Ramanathan Subramanian
Conference: 19th ACM International Conference on Multimodal Interaction (ICMI 2017)
Location: Glasgow, Scotland
Date: 2017-11-13
Report no: IIIT/TR/2017/50

Abstract: Although advertisements (ads) often include strongly emotional content, very little work has been devoted to affect recognition (AR) from ads. This work explicitly compares content-centric and user-centric ad AR methodologies, and evaluates the impact of enhanced AR on computational advertising via a user study. Specifically, we (1) compile an affective ad dataset capable of evoking coherent emotions across users; (2) explore the efficacy of content-centric convolutional neural network (CNN) features for encoding emotions, and show that CNN features outperform low-level emotion descriptors; (3) examine user-centric ad AR by analyzing electroencephalogram (EEG) responses acquired from eleven viewers, and find that EEG signals encode emotional information better than content descriptors; (4) investigate the relationship between objective AR and subjective viewer experience while watching an ad-embedded online video stream, based on a study involving 12 users. To our knowledge, this is the first work to (a) expressly compare user-centric vs content-centric AR for ads, and (b) study the relationship between modeling of ad emotions and its impact on a real-life advertising application.

Centre for Visual Information Technology
Copyright © 2009 IIIT Hyderabad. All Rights Reserved.