Explore a novel approach to predicting music preferences from facial expression analysis in this 23-minute conference talk from the 24th International Conference on Intelligent User Interfaces. Delve into a method for automatically extracting pairwise music preferences by analyzing users' facial expressions while they listen to tracks. Learn how this low-effort preference elicitation technique outperforms traditional baselines, and how its prediction accuracy varies with users' personality traits. Discover the implications for recommender systems and the role of emotional responses in music preference. Gain insights into the experimental design, the user study results, and the facial features found to be most predictive. Consider the broader implications for user interface design, privacy, and the future of personalized music recommendations.
Prediction of Music Pairwise Preferences from Facial Expressions
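To make the pairwise-preference idea concrete, below is a minimal, hypothetical sketch of the general setup: a classifier that, given facial-expression features recorded while a user listens to two tracks, predicts which of the two the user prefers. It is not the authors' pipeline; the feature names, synthetic data, and logistic-regression model are illustrative assumptions only.

```python
# Illustrative sketch only (not the talk's implementation): pairwise music
# preference prediction from facial-expression features. All features and
# data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical per-track facial-expression features, e.g. average activation
# of smile, brow-raise, and head-movement detectors while a track plays.
N_PAIRS, N_FEATURES = 500, 3
track_a = rng.normal(size=(N_PAIRS, N_FEATURES))  # features while hearing track A
track_b = rng.normal(size=(N_PAIRS, N_FEATURES))  # features while hearing track B

# Pairwise encoding: the model sees the feature difference (A - B) and
# predicts 1 if track A is preferred over track B, else 0.
X = track_a - track_b
true_w = np.array([1.5, 0.8, -0.5])               # synthetic "ground-truth" weights
y = (X @ true_w + rng.normal(scale=0.5, size=N_PAIRS) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print("pairwise preference accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The difference encoding is one common way to turn a pairwise comparison into a binary classification problem; in practice, the choice of facial features and model, and how accuracy interacts with personality traits, are exactly the questions the talk examines.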