IIIT Hyderabad Publications
My music and I: Decoding Individual Differences via Musical Behaviour

Author: Yudhik Agrawal
Date: 2021-06-09
Report no: IIIT/TH/2021/54
Advisor: Vinoo Alluri

Abstract

Music is omnipresent and has existed since time immemorial. In general, individuals' responses to music, whether in terms of movement or listening strategies, tend to be dictated by features intrinsic to the music, such as rhythmic structure and genre, in addition to extrinsic factors such as mood states and socio-cultural norms, amongst others. At the same time, human differences in physical, psychological, and behavioural traits affect the decisions we make and the experiences we pursue in daily life. Individual differences can refer to gender, cognitive styles of thinking, and personality, amongst others. In this thesis, we seek to expand our understanding of how individual characteristics modulate our musical behaviour. The studies and analyses present various angles of approaching musical behaviour in an interdisciplinary manner, particularly in the areas of embodied music cognition and music consumption. We investigate musical behaviour in two broad sub-domains: Active Music Engagement, which entails corporeal involvement, and Passive Music Engagement, which involves listening to music.

In the first part, we look at decoding individual differences via active music engagement. Given the paradoxical balance between universality and individuality in human motoric responsiveness to music, it is noteworthy that individual traits are encoded in free dance movement. The study addresses this by identifying individual differences, specifically gender, cognitive styles, Big Five personality traits, and music preferences, from free dance movements via our proposed machine learning model. We further demonstrate the robustness of the proposed model by testing its efficacy on a second dataset, providing evidence of the learned model's generalizability.
The results of this study support theories of embodied music cognition and the role of bodily movement in musical experiences by demonstrating the influence of gender, personality, and music preferences on embodied responses to heard music.

In the second study, which deals with passive engagement, we investigate how individual differences, specifically personality, are associated with the kind of emotional experiences one seeks from lyrics. We chose to investigate lyrics because they remain under-explored in music information retrieval despite their crucial role in eliciting emotions. We first sought to identify the emotional connotations of music based on lyrics, and then to associate these with individual differences represented by personality traits. To this end, we propose a novel deep learning architecture to identify emotional connotations of music from lyrics, which we compare with existing deep learning and traditional methods on relevant datasets, showing state-of-the-art performance. Subsequently, we use this model to extract the emotional preferences of individuals mined via online music streaming platforms and associate them with inherent personality traits. Our findings support our hypothesis that the various types of emotions conveyed by songs have distinct relationships with individual traits. This study contributes to a better understanding of the relationship between broad personality dimensions and the emotional experiences one naturally seeks on online music streaming platforms. Both of our studies corroborate previous research in the field while providing novel findings on both aspects of musical behaviour modulated by individual differences.

Full thesis: pdf

Centre for Cognitive Science
Copyright © 2009 - IIIT Hyderabad. All Rights Reserved.