Gestures, facial expressions, and speech are the key ways we communicate and convey thoughts to others. However, the content of speech (i.e., the words themselves) accounts for only around seven percent of communication. The rest comes from HOW we talk (38%) and how we move our bodies and use facial expressions (55%) [1].
The MixedEmotions project proposes a linked data approach to publishing emotions. What does this mean?
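In practice, publishing emotions as linked data means representing each detected emotion as RDF triples that other systems can query, aggregate, and link to. Below is a minimal sketch using Python's rdflib library; the emo: vocabulary, the property names, and the annotation URI are hypothetical placeholders invented for illustration, not the project's actual schema.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Hypothetical namespace standing in for an emotion vocabulary;
# a real deployment would reuse an established emotion ontology.
EMO = Namespace("http://example.org/emotion-ns#")

g = Graph()
g.bind("emo", EMO)

# One emotion annotation, published as linked data: the text it
# annotates, the emotion category detected, and its intensity.
annotation = URIRef("http://example.org/annotations/42")
g.add((annotation, RDF.type, EMO.EmotionAnnotation))
g.add((annotation, EMO.annotatesText, Literal("I love this movie!")))
g.add((annotation, EMO.hasEmotionCategory, EMO.Joy))
g.add((annotation, EMO.intensity, Literal(0.87, datatype=XSD.float)))

# Serialize to Turtle so any RDF-aware consumer can read it.
print(g.serialize(format="turtle"))
```

Because the result is plain RDF, the same emotion data can be merged with other linked datasets or queried with SPARQL, which is the point of choosing a linked data approach over an ad hoc format.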
Can you imagine if your alarm clock knew you had slept badly and chose your favorite song to lift your spirits? Or if your TV picked the movie you needed today to make you smile? Or if your car suggested stopping for a cup of coffee?
These scenarios, until now the stuff of science fiction cinema, from classics like Blade Runner to more recent hits like Her, may be closer to reality thanks to the convergence of emotion analysis, Big Data, and Internet of Things (IoT) technologies, all of which are being researched in the European project MixedEmotions.
What’s the challenge? Traditional approaches to data investigation have lacked the ability to handle mixed data effectively. Enterprise search systems let users search across structured and unstructured data, but only to reveal “results” in the form of, for example, blue links. Business Intelligence systems, on the other hand, are great for viewing aggregates but lack the ability to search and drill down to the last individual record.
Within the context of the MixedEmotions project we are interested in investigating the “emotions” in texts. At the core are algorithms that extract measurements of “emotional states” from text. The harvested emotional data can then be measured, averaged, and acted upon.
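To make the extract-then-aggregate idea concrete, here is a minimal lexicon-based sketch in Python. The toy lexicon, the function name, and the scores are invented for illustration; the project's actual extraction algorithms are considerably more sophisticated, but the downstream averaging works the same way.

```python
from statistics import mean

# Hypothetical toy lexicon mapping words to emotion scores in [0, 1];
# real systems use trained models or large curated lexicons.
EMOTION_LEXICON = {
    "love": {"joy": 0.9},
    "great": {"joy": 0.7},
    "hate": {"anger": 0.8},
    "terrible": {"sadness": 0.6, "anger": 0.4},
}

def extract_emotions(text: str) -> dict[str, float]:
    """Return the average score per emotion found in one text."""
    scores: dict[str, list[float]] = {}
    for word in text.lower().split():
        for emotion, score in EMOTION_LEXICON.get(word.strip(".,!?"), {}).items():
            scores.setdefault(emotion, []).append(score)
    return {emotion: mean(values) for emotion, values in scores.items()}

# Harvest emotional states from a collection of texts, then average
# each emotion across the whole collection to get an aggregate view.
texts = ["I love this, it is great!", "I hate waiting.", "Terrible service."]
totals: dict[str, list[float]] = {}
for result in (extract_emotions(t) for t in texts):
    for emotion, score in result.items():
        totals.setdefault(emotion, []).append(score)

averages = {emotion: mean(values) for emotion, values in totals.items()}
print(averages)  # e.g. {'joy': 0.8, 'anger': 0.6, 'sadness': 0.6}
```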