Using Emotions for Watching TV

MixedEmotions – an EU-funded research project – brings emotion analysis to various businesses. We are developing an open source platform for the automatic detection of emotions in speech, audio, video, text and social media data – in multiple languages and with preservation of the semantic context of expressed emotions. The envisioned platform comprises modules for multilingual, multimodal emotion and sentiment detection, entity linking, topic detection and more!

The following article illustrates our starting point in our Social TV pilot and shows the achievements that we have made one year into the two-year project.

Our starting point

Ever since radio and television came into existence, they have been accompanied by programme guides that let audiences know which programme will be broadcast when. These programme guides became more and more important as the number of channels and programmes increased. So-called EPGs (Electronic Programme Guides) helped customers find their way through hundreds of different options. The need for an intelligent programme selection system has increased even more since linear programming has been supplemented by catch-up television and video-on-demand services.

From a viewer's point of view, these services have established a completely new content consumption landscape. In principle, viewers are now free to consume any content at any time. But this freedom comes with the challenge of having to surf through an overwhelming amount of different formats and content items. For instance, Germany's international broadcaster Deutsche Welle, one of the project partners, produces a weekly audio and video on-demand output of 100 hours in 30 languages, 17 hours in English alone.

From a content provider's point of view, the new on-demand services have made it harder to stand out and make their content visible to potential customers. And even when they have caught the audience's attention, content providers face the additional challenge of creating an 'audience flow' that keeps customers within their sphere and makes them watch as much of their content as possible. In order to achieve this and to support consumers, a number of recommendation engines have been developed that aim to select and recommend content that is relevant to the individual viewer in his or her very specific situation and state of mind.

The MixedEmotions consortium has decided to add emotion analysis to recommendation services. Since media consumption is very much driven by emotions, we will develop a solution that analyses emotions in order to improve the recommendation of on-demand and live content to individual users. We intend to showcase this approach via an Apple TV application.

Emotion-driven video discovery

The idea is pretty straightforward. Just after a viewer has watched a particular video, the Apple TV application will suggest a number of videos that the viewer might also be interested in.

Image: Social TV UI draft (DW Innovation)

In the course of the MixedEmotions project, we plan to explore the possibility of refining the list of recommendations further by suggesting particularly joyful content on the one hand and more ambitious, intriguing content on the other. This additional feature depends on the quality of the emotion analysis and on our ability to map specific emotions to the categories "joyful" and "intriguing".

Behind the scenes

Let's have a look at the underlying data processing pipeline. We will use the Deutsche Welle media repository to develop and showcase our solution. The platform would be capable of identifying emotional patterns in the content (the video) itself, but we are unsure how useful such an analysis would be, as actual news content is usually presented in a rather neutral way. As a consequence, we intend to analyse the buzz in social networks around the topics covered by the video. This approach enables us to identify the dominant emotions at the moment of media usage rather than at the moment of media production.

The results of the emotion analysis will be fed into the recommendation engine, which therefore makes use of the following signals (a sketch of how they might be combined follows the list):

  • Media usage tracking: which videos are being watched, in what sequence, and for how long
  • Emotions collected on Social Media with respect to the topics that are dealt with in the videos
  • Emotional feedback provided by the user after watching a video
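To make the role of these signals more concrete, here is a minimal sketch of how they might be folded into a single relevance score. The data structure, all names and the linear weighting are illustrative assumptions; the article does not describe the actual model used by the recommendation engine.

```python
from dataclasses import dataclass

@dataclass
class VideoSignals:
    """Hypothetical container for the three per-video signals named above."""
    watch_completion: float   # usage tracking: fraction of the video watched
    social_emotions: dict     # e.g. {"joy": 0.6, "sadness": 0.1, ...} from social media
    user_feedback: dict       # explicit emotional ratings given after viewing

def recommendation_score(s: VideoSignals, weights=(0.5, 0.3, 0.2)) -> float:
    """Fold the three signals into one relevance score (purely illustrative)."""
    w_usage, w_social, w_feedback = weights
    return (w_usage * s.watch_completion
            + w_social * s.social_emotions.get("joy", 0.0)
            + w_feedback * s.user_feedback.get("joy", 0.0))
```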

The following figure illustrates the procedure described above.

Image: Social TV architecture (DW Innovation)

The process starts with video metadata, which is extracted from Deutsche Welle's media center API. The video descriptions are analysed for semantic entities, which in turn are used to form queries for searches on Twitter. The emotional content of the returned tweets is analysed and the results are fed into the recommendation engine.
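As a rough illustration of that pipeline, the following Python sketch walks through the same steps for a single video. The API endpoint, the video ID and all helper functions are hypothetical placeholders, not the actual MixedEmotions modules or the real Deutsche Welle API.

```python
import requests  # assumption: the media centre API is reachable over HTTP

DW_API = "https://api.example.org/mediacenter"  # placeholder, not the real endpoint

def fetch_video_metadata(video_id: str) -> dict:
    """Pull title and description for one video from the media centre API (illustrative)."""
    resp = requests.get(f"{DW_API}/videos/{video_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()

def extract_entities(description: str) -> list:
    """Stand-in for the platform's entity-linking module.

    The real module links phrases to knowledge-base entities; here we
    simply keep capitalised tokens as a crude placeholder.
    """
    return [tok.strip(".,") for tok in description.split() if tok[:1].isupper()]

def build_twitter_query(entities: list) -> str:
    """Combine the linked entities into a single Twitter search query."""
    return " OR ".join(f'"{e}"' for e in entities)

def analyse_tweet_emotions(tweets: list) -> dict:
    """Stand-in for the text emotion module: scores for the five detected emotions."""
    return {"joy": 0.0, "fear": 0.0, "anger": 0.0, "disgust": 0.0, "sadness": 0.0}

# End-to-end sketch for one video (all identifiers hypothetical,
# Twitter API access not shown):
# meta = fetch_video_metadata("dw-12345")
# query = build_twitter_query(extract_entities(meta["description"]))
# emotions = analyse_tweet_emotions(search_twitter(query))
# ... the emotion scores are then handed to the recommendation engine
```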

In parallel, the emotions contained in the videos themselves are extracted, both from the video and audio tracks. The results are also fed to the recommendation engine.
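The article does not specify which features the audio emotion extraction relies on. As one plausible illustration, acoustic features such as MFCCs could be computed from a video's audio track and handed to a trained emotion classifier; the sketch below uses the librosa library purely as an assumption, not as the project's actual toolchain.

```python
import numpy as np
import librosa  # assumption: any audio feature library could be used here

def audio_emotion_features(audio_path: str) -> np.ndarray:
    """Turn a video's audio track into a fixed-length feature vector.

    A trained classifier (not shown) would map such features onto the
    five emotion classes used in the project.
    """
    y, sr = librosa.load(audio_path, sr=16000, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    # Summarise the MFCC time series with its mean and spread per coefficient.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])
```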

Of course, the recommendation engine also uses traditional signals, such as media usage on any given Apple TV set-top box. On the return path, the engine provides recommendations, which are displayed on the TV interface.

Current development status and planned improvements

During the first year of the project, we have developed the technology to identify entities in Deutsche Welle video content and to map these entities to related tweets and posts in social networks. In addition, social media content is being analysed for emotional signals, with the result that five distinct emotions can be detected: joy, fear, anger, disgust, and sadness. We have also developed the first version of the Apple TV application, which is currently undergoing internal testing.

The focus for the second year of the project is to work on the following areas:

  • Feed detected Social Media emotions into the video recommendation engine
  • Investigate the possibility to extract emotions from the soundtrack and image content of Deutsche Welle videos, the results of which could be fed into the recommendation engines
  • Ask TV viewers to rate videos on an emotional level (similar to the recently introduced Facebook ratings via emoticons) and feed the ratings into the recommendation engine
  • Investigate the mapping of emotional analysis results to the "joyful" and "intriguing" categories (one possible mapping is sketched after this list)
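As an illustration of the last point, one conceivable (and entirely hypothetical) mapping would treat joy as "joyful" and route the remaining four emotions to "intriguing"; the project has not defined such a mapping yet.

```python
# One conceivable mapping of the five detected emotions onto the two
# planned recommendation categories; this is an assumption, not the
# project's actual mapping.
EMOTION_TO_CATEGORY = {
    "joy": "joyful",
    "fear": "intriguing",
    "anger": "intriguing",
    "disgust": "intriguing",
    "sadness": "intriguing",
}

def categorise(emotion_scores: dict) -> str:
    """Label a video 'joyful' or 'intriguing' from its aggregated emotion scores."""
    dominant = max(emotion_scores, key=emotion_scores.get)
    return EMOTION_TO_CATEGORY[dominant]

# Example: a video whose related tweets are mostly joyful
print(categorise({"joy": 0.7, "fear": 0.1, "anger": 0.05, "disgust": 0.05, "sadness": 0.1}))
# -> "joyful"
```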

The Apple TV application is intended to be released as a beta version.

We are looking for beta testers

Let us know if you own an Apple TV 4 and which e-mail address we can reach you at. We'll send you an invitation as soon as the beta version is released for external testers.

Testing requests as well as any other questions about this research can be directed to Kay Macquarrie.

Author
Kay Macquarrie