Title: Automatic emotion capture when viewing web-based media on a smartphone
Description:
Every day, we view many types of web-based media, such as tweets, online videos, and articles. Each of these can produce an emotional response such as anger, sadness, or surprise. Automatic classification of these responses has received considerable attention in the image and video analysis communities, as well as in social computing. Following on from a CUROP project that produced a prototype face tracker and emotion classifier, this project aims to develop a more robust face tracker and emotion classifier and then apply them to real-world scenarios. The resulting data will be analysed to further understand how people respond to different types of media.
Deliverables: Initial plan; Final report
Student: Matthew Rhys Jones
Supervisor: Dave Marshall
Moderator: Yukun Lai
Report: Archive