The MixedEmotions platform is a Big Data Toolbox for multilingual and multimodal emotion analysis. The toolbox was introduced in a series of three webinars; you can find material from the series here, including a full video recording.

The MixedEmotions platform is built around stand-alone Docker modules, with an orchestrator that links the modules into analysis workflows and uses Apache Mesos for scalable cloud deployment.
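As a rough illustration of this architecture (the module names and interfaces below are hypothetical stand-ins, not the platform's actual API), an orchestrator can be thought of as passing a payload through a chain of independent modules:

```python
# Minimal sketch of an orchestrator chaining analysis modules into a workflow.
# In the real platform each module is a stand-alone Docker container reached
# over the network; here plain functions stand in for those services.

def text_emotion_module(payload):
    # Hypothetical module: annotate the text with a coarse emotion label.
    payload["emotion"] = "joy" if "great" in payload["text"].lower() else "neutral"
    return payload

def sentiment_module(payload):
    # Hypothetical module: attach a sentiment polarity based on the emotion.
    payload["sentiment"] = "positive" if payload.get("emotion") == "joy" else "unknown"
    return payload

def run_workflow(payload, modules):
    """Pass the payload through each module in order, pipeline-style."""
    for module in modules:
        payload = module(payload)
    return payload

result = run_workflow({"text": "This webinar was great!"},
                      [text_emotion_module, sentiment_module])
print(result)
```

The design point is that each module only needs to accept and return the shared payload format, so workflows can be recomposed without changing the modules themselves.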

Core capabilities include emotion extraction from text, audio, and video, alongside many other features such as sentiment analysis, social network analysis, entity detection and linking, and sophisticated data visualisation.

Want more information? Download the handouts:


Take me to the MixedEmotions platform.
Take me to the MixedEmotions GitHub and Docker accounts, and read our DIY – Emotion Extraction Platform Made Easy article.



Today, emotion detection services based on speech technologies are making their way to market. Speech carries valuable information about the speaker, and the speaker's emotions are part of it. Just imagine the potential power of a mining tool for emotions.

Read More

MixedEmotions – the Open Source platform* for emotion extraction

The MixedEmotions platform is a Big Data Toolbox for multilingual and multimodal emotion extraction and analysis (*note that the platform is in beta and still under development). It can extract emotions from text, audio, and video. It also offers many other capabilities, such as sentiment analysis, social network analysis, and knowledge-graph visualisation, among others. Take me directly to the MixedEmotions platform.

Read More

Gestures, facial expressions, and speech are the key ways we communicate and convey thoughts to others. However, the verbal content of speech (i.e., the words) accounts for only around seven percent of communication. The rest comes from HOW we talk (38%) and how we move our bodies and faces (55%) [1].
Read More

The project MixedEmotions proposes to follow a linked data approach for publishing emotions. What does this mean?
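In short, it means representing each detected emotion as machine-readable data with shared vocabularies and URIs, so that results from different tools can be interlinked and reused. As a rough sketch (the property names follow the Onyx emotion ontology, but the exact namespace URIs and identifiers here are illustrative assumptions, not the project's definitive schema), an emotion annotation could be serialised as JSON-LD:

```python
import json

# Sketch: publish an emotion annotation as linked data (JSON-LD), tying the
# analysed item and the detected emotion category together via URIs.
annotation = {
    "@context": {
        # Namespace URIs are illustrative; check the published ontology.
        "onyx": "http://www.gsi.upm.es/ontologies/onyx/ns#",
        "prov": "http://www.w3.org/ns/prov#",
    },
    "@id": "http://example.org/analysis/42",                 # hypothetical ID
    "prov:wasGeneratedBy": "http://example.org/tools/text-emotion",
    "onyx:hasEmotionSet": {
        "onyx:hasEmotion": {
            "onyx:hasEmotionCategory": "http://example.org/emotions/joy",
            "onyx:hasEmotionIntensity": 0.8,
        }
    },
}

serialised = json.dumps(annotation, indent=2)
print(serialised)
```

Because every category and tool is named by a URI, a consumer can follow those links to shared definitions instead of guessing what "joy" means in each system.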

Read More

Emotions are an integral aspect of human mental processes and everyday experience. They drive much of our behaviour and play an important part in communication. Emotions are often intertwined with mood, temperament, personality, disposition, and motivation.

To understand human emotions, to react to them, and to intentionally induce them has been a long-standing dream of researchers in human-computer interaction. How much better could our lives be if computers, search engines, or smart personal assistants could sense when we start getting annoyed or frustrated with them and adapt accordingly?

Read More

Can you imagine your alarm clock knowing you slept badly and choosing your favourite song to raise your spirits? Or your TV picking the movie you need today to make you smile? Or your car suggesting you stop to grab a cup of coffee?

These scenarios, until now the province of science-fiction cinema from classics like Blade Runner to more recent hits like Her, may be closer to reality thanks to the convergence of emotion analysis technologies, Big Data, and the Internet of Things (IoT), all of which are being researched in the European project MixedEmotions.
Read More

What's the challenge? One problem is that traditional approaches to data investigation have lacked the ability to handle mixed data effectively. Enterprise systems let you search across structured and unstructured data, but only to reveal "results" in the form of, for example, blue links. Business Intelligence systems, on the other hand, are great for viewing aggregates but lack the ability to search and drill down to the last single record.

Within the MixedEmotions project we are interested in investigating the emotions in texts. At the core are algorithms that extract measurements of emotional states from text. The harvested emotional data can then be measured, averaged, and acted upon.
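As a toy illustration of that idea (the lexicon and scores below are invented for the example and are not the project's actual models), a simple lexicon-based extractor can turn a batch of texts into measurable, averageable emotion scores:

```python
from collections import Counter

# Toy emotion lexicon mapping words to emotion labels (illustrative only).
LEXICON = {
    "love": "joy", "happy": "joy", "great": "joy",
    "angry": "anger", "hate": "anger",
    "afraid": "fear", "worried": "fear",
}

def emotional_state(text):
    """Count lexicon hits per emotion label in a single text."""
    words = text.lower().split()
    return Counter(LEXICON[w] for w in words if w in LEXICON)

def average_emotions(texts):
    """Harvest emotional states from many texts and average the counts."""
    totals = Counter()
    for text in texts:
        totals.update(emotional_state(text))
    return {emotion: count / len(texts) for emotion, count in totals.items()}

scores = average_emotions([
    "I love this product it is great",
    "I hate waiting and I am worried",
])
print(scores)  # per-text average count of each detected emotion
```

Real systems replace the word lookup with trained models, but the shape of the output, a numeric score per emotion that can be aggregated over a corpus, is the same.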

Read More