University of Twente Student Theses


How is emotion change reflected in manual and automatic annotations of different modalities?

Xiang, Y. (2017) How is emotion change reflected in manual and automatic annotations of different modalities?

Abstract: The SEMAINE database consists of recordings of people talking to different virtual characters. Each virtual character speaks and behaves according to a repertoire of utterances; the repertoire defines the character's verbal and nonverbal behaviour, which is intended to influence the emotional state of the person through the conversational interaction. There are four characters: Obadiah, Poppy, Spike, and Prudence. Obadiah is gloomy and tries to make the other person feel depressed, whereas Poppy is cheerful and is designed to make the other person happy as well. Spike is angry and is intended to provoke the other person. Prudence is sensible and tries to make the other person sensible, too. The recordings have been annotated manually in several dimensions that indicate the emotional state of the person as it changes over the interaction. The first goal of the thesis is to find out whether the emotions of the participants align with the character they interact with. Since there are four different characters, it is possible to explore how these characters differ in changing the emotions of participants. In the manual annotations, the emotional states indeed differed among characters: the changes in the Happiness, Sadness, and Anger dimensions, and in all rating dimensions except Intensity, were statistically significant. The Poppy and Spike characters had distinct impacts in the Happiness dimension. Compared to the Poppy and Prudence characters, the Obadiah character had a different impact in the Sadness dimension. The Spike character was more effective than the Obadiah and Poppy characters in influencing the anger of participants. The second goal of the thesis is to investigate whether automatic emotion recognition tools assign the same emotions to the persons as the manual annotations.
The selected tools are FaceReader, a software program that recognizes a person's emotion from detected facial expressions, and LIWC (Linguistic Inquiry and Word Count), a tool based on the words a person uses. The emotions recognized by FaceReader match the annotated emotions more closely than those recognized by LIWC. In the FaceReader results, the happy, sad, and angry dimensions correlated with the manual annotations for most characters, and FaceReader achieved almost 42% accuracy. In the LIWC results, only the Posemo (positive emotion) dimension correlated well with the SEMAINE annotations, and LIWC achieved only 33% accuracy. The thesis proceeds in four steps. In the first step, the samples are preprocessed to ensure the data is well organized. The second step, Phase I, selects the facial-expression modality and the pair of virtual characters (Obadiah and Poppy) with the strongest contrast in emotional impact. The third step, Phase II, adds the remaining virtual characters (Spike and Prudence). The last step, Phase III, adds the text modality to the analysis.
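The comparison between the automatic tools and the manual annotations is essentially a correlation analysis between two time-aligned score series. As a minimal illustration only (this is not the thesis's actual code, and the traces below are made up), one could compute the Pearson correlation between an automatic happiness score and the manually annotated Happiness dimension like this:

```python
# Pearson correlation between an automatic emotion score and a manual
# annotation trace, both sampled at the same time points.
# The example data is hypothetical; the thesis compares FaceReader/LIWC
# output against the SEMAINE manual annotations.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up traces, one value per time window of the recording:
auto_happy   = [0.1, 0.3, 0.5, 0.7, 0.6, 0.8]  # e.g. FaceReader "happy"
manual_happy = [0.2, 0.4, 0.4, 0.8, 0.5, 0.9]  # e.g. annotated Happiness

r = pearson(auto_happy, manual_happy)
print(f"Pearson r = {r:.2f}")
```

A value of r near 1 would indicate that the automatic tool tracks the manual annotation closely; values near 0 would indicate no linear relationship.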
Item Type: Essay (Master)
Faculty: EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject: 54 computer science
Programme: Interaction Technology MSc (60030)
Link to this item: https://purl.utwente.nl/essays/72174