Instead of asking people "what they think" about Super Bowl ads, we decided to measure "how they feel" about them. In February 2016, Annalect organized a study that captured viewers' true emotions while they watched the Super Bowl commercials. We did this by applying facial detection and emotion-recognition analysis to viewers' actual reactions to the ads.
Ad Popularity Contest
YouTube Views
We monitored the actual ads and ad teasers on YouTube for a week. On Game Day, ads by Hyundai, Pokemon and Mini dominated the view counts. Axe climbed to the top of the chart very quickly, though its ad had been published on January 25th, giving it a head start. And if it were purely a popularity contest, Hyundai would have won hands down with two brilliant Elantra ads.
Unfortunately, YouTube popularity does not guarantee ad efficacy. To find out what people really thought of the ads, we decided to use facial recognition technology to measure how people “feel” while watching them.
The week before the Super Bowl, we set up a small lab at the New York co-working space LMHQ. Participants were asked to watch five commercials selected at random. During each commercial, a camera photographed the participant's facial expression every three seconds, leaving us with an average of 10 photos per commercial. After each session ended, we applied feature extraction to the photographs, which gave us age, gender, and mood estimates for each participant, each paired with a confidence level.
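The sampling arithmetic behind that average is simple: one photo every three seconds over a roughly 30-second spot yields about 10 photos. A minimal sketch (the function name and the three-second default are ours, not from the study's software):

```python
def expected_photo_count(ad_seconds: float, interval_seconds: float = 3.0) -> int:
    """Number of photos captured for one commercial at a fixed sampling interval.

    Illustrative helper only: assumes one photo per full interval elapsed.
    """
    if interval_seconds <= 0:
        raise ValueError("interval must be positive")
    return int(ad_seconds // interval_seconds)

# A typical 30-second Super Bowl spot yields about 10 photos.
print(expected_photo_count(30))  # -> 10
```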
“Happiest” Ads
As the eyelids tighten, the cheeks rise, and the outside corners of the brows pull down – some ads just leave people smiling. They are cute, adorable, and resonate across different age groups. The ads that made us happy this year were the adorable Kung Fu Panda, the witty Shock Top, and Snickers’ original take.
“Smiley” Ads
Mountain Dew topped our smiling index: every single participant who was exposed to the ad was smiling. Marmot's Snow Angel and Amazon Echo also ranked near the top of this index.
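A smiling index like this can be computed as the fraction of exposed participants whose smile confidence clears a threshold. The sketch below is only illustrative: the function name and the 0.5 cutoff are our assumptions, not the study's actual definitions.

```python
from typing import Iterable

def smiling_index(smile_confidences: Iterable[float], threshold: float = 0.5) -> float:
    """Fraction of exposed participants whose smile confidence clears the threshold.

    `smile_confidences` holds one score per participant (0.0-1.0). The 0.5
    threshold is an illustrative choice, not the study's real cutoff.
    """
    scores = list(smile_confidences)
    if not scores:
        return 0.0
    return sum(s >= threshold for s in scores) / len(scores)

# An ad where every exposed participant was smiling scores 1.0.
print(smiling_index([0.9, 0.8, 0.95, 0.7]))  # -> 1.0
```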
“Off-putting” Ads
There were some ads that provoked slightly narrowed brows, wrinkled noses, and visible protrusions of the tongue – expressions that suggest discomfort. One of those ads was Colgate's: despite addressing an important issue, it made people uncomfortable to watch. Surprisingly, Hyundai's Ryanville also fell into this category. While enjoying incredible popularity on YouTube, it made some viewers uncomfortable.
Final Remarks
In the past, the only way for marketers to figure out how people react to advertising was to ask them. With new developments in technology, we don't need to rely on dubious primary research anymore; instead, we can see how people feel.
During our experiment, we were able to see which ads generated a strong facial response and which ones left people completely neutral. The most amazing part is that we didn't need complicated cables; we simply used a web camera and did it in real time, as data-crunching and photo parsing are currently possible at fairly high speeds. As facial feature extraction becomes more accessible and much faster, we might want to start rethinking the way we do primary research and ad-testing.
Finally, our moodometer showed that most people became slightly happier after watching the commercials and had an overall positive experience. The data also revealed that about 38% of participants shifted from an angry or sad mood to a neutral or happy one. It turns out we don't hate ads as much as we claim.
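As an illustration of how such a shift could be tallied from the pre- and post-experiment mood questions, here is a small sketch. The function name and mood labels are ours, and reading the 38% as a share of all participants is our assumption, not a detail from the study.

```python
def mood_shift_rate(pre_moods, post_moods,
                    negative=("angry", "sad"),
                    positive=("neutral", "happy")):
    """Share of all participants who moved from a negative pre-experiment mood
    to a neutral or happy post-experiment mood.

    Illustrative sketch only: labels and the denominator choice are assumptions.
    """
    if len(pre_moods) != len(post_moods):
        raise ValueError("pre and post lists must be the same length")
    shifted = sum(1 for pre, post in zip(pre_moods, post_moods)
                  if pre in negative and post in positive)
    return shifted / len(pre_moods)

# Two of four participants shift from a negative mood to a positive one.
print(mood_shift_rate(["sad", "happy", "angry", "neutral"],
                      ["happy", "happy", "neutral", "happy"]))  # -> 0.5
```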
Methodology
134 people participated in our moodometer lab in Downtown Manhattan during the week before the Super Bowl. Both genders were equally represented. Before the experiment, we asked participants a few questions about their age, gender, and current mood. After the experiment, we asked a few more questions about their post-experiment mood and buying intent.
The Technology
The software used in this experiment was a prototype: after each participant answered a few questions, it selected five pre-released commercials at random for them to watch. We then used the Sky Biometry API to analyze the photographs and extract each participant's mood.
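As a rough illustration of the parsing step, the sketch below pulls a mood-with-confidence pair out of a face-analysis response. The JSON shape, field names, and the `extract_mood` helper are our own invention, modeled loosely on typical face-detection APIs; they are not Sky Biometry's documented schema, so consult the vendor's API reference for the real field names.

```python
import json

# Hypothetical response shape for illustration only -- NOT Sky Biometry's
# actual schema. Each detected face carries attribute guesses with confidences.
sample_response = json.loads("""
{
  "photos": [{
    "tags": [{
      "attributes": {
        "gender": {"value": "female", "confidence": 88},
        "age_est": {"value": 29, "confidence": 61},
        "mood":   {"value": "happy", "confidence": 76}
      }
    }]
  }]
}
""")

def extract_mood(response):
    """Return the (mood, confidence) pair for the first detected face."""
    attrs = response["photos"][0]["tags"][0]["attributes"]
    mood = attrs["mood"]
    return mood["value"], mood["confidence"]

print(extract_mood(sample_response))  # -> ('happy', 76)
```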