Gestures, facial expressions, and speech are the key ways we communicate and convey thoughts to others. However, the content of speech (the words themselves) accounts for only around seven percent of communication; the rest lies in how we talk (38%) and how we move our bodies and faces (55%) [1].
Can you imagine an alarm clock that knew you had slept badly and chose your favourite song to raise your spirits? Or a TV that picked the movie you needed today to make you smile? Or a car that suggested stopping to grab a cup of coffee?
These premises, which until now belonged to science-fiction cinema, from classics like Blade Runner to more recent hits like Her, may be closer to reality thanks to the convergence of emotion analysis, Big Data, and the Internet of Things (IoT), technologies being researched together in the European project MixedEmotions.
What’s the challenge? Traditional approaches to data investigation have lacked the ability to handle mixed data effectively. Enterprise search systems let you query across structured and unstructured data, but only to reveal “results” in the form of, for example, blue links. Business Intelligence systems, on the other hand, are great for viewing aggregates but lack the ability to search and drill down to the last single record.
Within the context of the MixedEmotions project we’re interested in investigating the emotions expressed in texts. At the core are algorithms that extract measurements of emotional states from text. The harvested emotional data can then be measured, averaged, and acted upon.
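The extract-measure-average idea can be sketched in a few lines. The `extract_emotion` function below is a toy stand-in with a three-word lexicon, not the project's actual algorithm; it only illustrates how per-text scores could be harvested and then aggregated.

```python
# Toy sketch: extract an emotion score per text, then average the
# harvested scores so they can be acted upon in aggregate.
# The word lists here are illustrative stand-ins, not a real lexicon.

def extract_emotion(text: str) -> float:
    """Return a crude valence score in [-1, 1] for one text."""
    positive = {"great", "happy", "love"}
    negative = {"bad", "sad", "hate"}
    words = text.lower().split()
    score = sum((w in positive) - (w in negative) for w in words)
    return max(-1.0, min(1.0, score / max(len(words), 1)))

def average_emotion(texts: list[str]) -> float:
    """Average the per-text scores across a collection of texts."""
    scores = [extract_emotion(t) for t in texts]
    return sum(scores) / len(scores)
```

A real pipeline would replace the lexicon lookup with a trained classifier, but the aggregation step stays the same.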
MixedEmotions finds and identifies emotions in Big Data. How are we doing this? The first step is to select an emotion classification scheme. Research into emotion has proposed several approaches to classifying and characterising emotion. So, which one to choose?
Early work on the linguistic characterisation of emotion found that emotion could largely be captured by just three dimensions: primarily affective valence (ranging from positive to negative) and arousal (ranging from calm to excited), with a third dimension labelled “dominance” or “control” having less significance. Much work in emotion analysis has used this VAD (Valence, Arousal, Dominance) model, including the widely used dictionary of the emotional significance of words, “Affective Norms for English Words” (ANEW).
Facebook expands Like button by introducing emoji
Facebook has expanded its Like button. The six emoji-alternatives, called “Reactions,” give Facebook users an extended palette of emotions, most of which amount to various shades of positivity.
Read more: http://www.wired.com/2015/10/facebook-reactions-design/