Suppose you felt disgusted by something you saw Donald Trump say on TV.
You might wonder: Hey, am I the only one?
That question is the motivation behind Emotit for President, a new and free mobile app released today to “take the temperature of voters.”
Its maker is Krush, the Dayton, Ohio-based company behind ooVoo, a multi-user video chat and messaging platform that claims more than 150 million users worldwide. Founded last year, the company has also released a free-rotation VR simulation pod called Moveo.
In the Emotit app, a user indicates gender, ZIP code and whether he/she is a Democrat, Republican or “other.” The app — now on Android, soon on iOS — then displays short, curated 15- to 30-second video clips from a daily-updated library relating to the US presidential race.
With user consent, the app employs facial recognition technology from M.I.T. Media Lab spinoff Affectiva. As the user watches, the camera on the same smartphone or tablet showing the clip captures 15 micro-expressions from different parts of the face. Affectiva then assesses to what degree the user is expressing one or more of seven emotions: anger, sadness, disgust, fear, joy, surprise or contempt.
After the clip is shown, the app displays in real time a score that indicates whether there was a generally positive or negative response by all viewers — high is positive — as well as the user’s score. A midpoint arrow shows the moment in the clip where the user displayed the strongest response, or where the average of all users did, and the user can immediately jump to that spot. There’s also a map showing which parts of the US mostly “liked” or “disliked” the clip.
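Neither Krush nor Affectiva publishes how the per-emotion readings are folded into that single positive/negative score, or how the arrow's moment is chosen. Purely as an illustration of the kind of aggregation involved, here is a minimal sketch; the valence mapping (joy positive, surprise neutral, the rest negative) and the "most intense frame" rule are assumptions, not the companies' actual method:

```python
# Hypothetical sketch only: Affectiva reports per-emotion intensities
# (roughly a 0-100 scale); the valence weights below are an assumption,
# not Krush's or Affectiva's published methodology.

EMOTIONS = ["anger", "sadness", "disgust", "fear", "joy", "surprise", "contempt"]
VALENCE = {"joy": 1, "surprise": 0, "anger": -1, "sadness": -1,
           "disgust": -1, "fear": -1, "contempt": -1}  # assumed mapping

def clip_score(frames):
    """frames: list of dicts mapping emotion name -> intensity (0-100).
    Returns a valence-weighted average across the whole clip:
    positive means the assumed-positive emotions dominated."""
    total = 0.0
    for frame in frames:
        total += sum(VALENCE[e] * frame.get(e, 0.0) for e in EMOTIONS)
    return total / max(len(frames), 1)

def strongest_moment(frames):
    """Index of the frame with the largest total emotional intensity,
    one plausible reading of what the app's arrow marks."""
    return max(range(len(frames)),
               key=lambda i: sum(frames[i].get(e, 0.0) for e in EMOTIONS))
```

Even this toy version makes the article's later point concrete: the output depends entirely on which emotions you decide count as "positive."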
At least that’s the theory.
In the remote demo I saw, the first clip shown was from a newscast, so the reporter’s commentary, B-roll shots and other content were mixed in with shots from a political gathering. It’s difficult to know whether an emotional response to such a mixed clip was to the reporter’s take, the political content, the other shots or some or all of the above. Krush Chairman JP Nauseef told me the clip library includes news clips and “popular political parodies,” but “80 to 90 percent” are just of the candidate.
Then the demo showed a clip just of former Secretary of State Hillary Clinton. The response score showed an 8.8 positive response — but among self-declared Republicans! And Democrats showed a negative score of 1.0.
Infinity of conditions
This kind of information could dramatically change forecasts of the race, but Nauseef said the sample group of users in the testing phase was just too small for the score to represent a valid measurement of general trends.
Then there’s the issue of mobile capture in an infinity of conditions. Affectiva says on its website that it accounts for differences due to gender, eyeglasses, head pose and lighting conditions, but those variations obviously impact capture in ways that are not clearly defined.
Affectiva also notes that some emotional expressions — joy, disgust, contempt and surprise — are the most accurately detected, while anger, sadness and fear are “more nuanced and subtle and therefore harder to detect resulting in scores at the lower end of the [accuracy] range.” So, if you’re angered by something Trump or Clinton said, it might not be accurately registered.
Finally, there’s the matter of exactly what the score and the arrow show. The demo and the announcement, for instance, have described the score as representing “positive or negative emotions.”
But which of the seven analyzed emotions — anger, sadness, disgust, fear, joy, surprise and contempt — are positive besides joy? Is surprise negative or positive? Are anger, sadness, disgust, fear and contempt all negative? Krush directed me to the Affectiva site, which offers no explanation.
Similarly, it’s not clear what the midpoint arrow is supposed to indicate. I was variously told it was where “the most meaningful emotional connection occurred,” or that it was “the most engaging,” “the most interesting,” “the most liked,” “the top moment” or “the most intense” point in the clip.
In my book, those are all very different things.
Nauseef told me that his company has no plans to make money from Emotit before the election rolls around, although he did mention that it is in discussions with several unnamed political groups and campaigns. The business model, he added, would most likely center on selling anonymized emotional data indicating responses to specific content, which he compared to a focus group. He said that no facial imagery is recorded or kept.
Facial recognition for marketers
Certainly, emotional detection through facial recognition is catching on as a common tool for marketers. Affectiva claims more than 1,400 brands as clients. In January, London-based major media agency MediaCom announced that it would use emotional measurement from Realeyes — a key Affectiva competitor — as a regular part of its content testing and media planning.
Also in January, Apple announced it had purchased emotion-detection startup Emotient. Late last year, visual engagement analytics platform Sticky added emotional tracking to its eye-tracking services. And video ad platform Virool released a platform in 2015 for measuring emotional responses to its ads.
While much of the focus for emotional tracking has been on ads, it was only a matter of time before content was similarly put to the test, as Krush is doing. But until the terminology is more consistently used, the capture conditions more standardized and the accuracy better validated, one wonders if the US’s already white-hot political scene — where one wild poll throws a news cycle into a tizzy — benefits from such temperature-taking.
Nauseef is an investor in Affectiva and is on its board, and early last year, Krush released an SDK called “Intelligent Video” so that apps could utilize the facial recognition tech. Another Krush app, Flinch, offers an Affectiva-powered staring contest, and a match-making game also built around emotional reactions is in the works.