Did you know that you’re more likely to abandon your online shopping cart if you feel overwhelmed by choice? Or that 20% of online customers validate their choice by seeking out expert advice or reviews from other people?
As much as we like to think that we make purchasing decisions from a rational place, in reality we’re more likely to be influenced by our emotional connection with a brand. To this end, companies are now using emotion detection and recognition software to measure consumers’ reactions to their brand or products. It’s estimated that the industry could be worth $65 billion by 2023.
In theory, the technology could be used to customise online experiences or influence decision making, although there are also concerns that it could be used to manipulate consumers.
How does it work?
The technology works by using artificial intelligence and machine learning to recognise complex facial expressions. The software uses a method called “factorised variational autoencoders” to model facial behaviour over time. After watching a viewer for just a few minutes, the software can predict how they will react to the rest of an event.
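To give a flavour of how a few minutes of observation can predict later reactions: the factorised model learns shared “per-moment” reaction patterns across the whole audience, plus a small set of per-viewer sensitivities. The sketch below is a deliberately simplified stand-in, using plain matrix factorisation in NumPy rather than a real variational autoencoder, and all the data is simulated. It shows the core idea: fit shared time factors from other viewers, estimate a new viewer’s personal factors from the first half of a screening, then predict their reactions for the unseen second half.

```python
import numpy as np

# Toy "audience reaction" matrix: rows = viewers, columns = timesteps,
# values = reaction intensity. In the real system these would be
# facial-expression features extracted from camera footage.
rng = np.random.default_rng(0)
n_viewers, n_steps, rank = 20, 40, 2

# Simulated ground truth with low-rank structure, echoing the
# factorised model: a shared per-moment signal times a per-viewer
# sensitivity. (Entirely made-up data for illustration.)
time_factors = rng.random((rank, n_steps))       # shared "funny moment" signal
viewer_factors = rng.random((n_viewers, rank))   # individual sensitivities
reactions = viewer_factors @ time_factors

# Learn shared time factors from every viewer except the last,
# using SVD as a stand-in for the trained factorisation.
_, _, Vt = np.linalg.svd(reactions[:-1], full_matrices=False)
Vt = Vt[:rank]

# Observe only the first half of the screening for the held-out viewer,
# estimate their personal factors, then predict the unseen second half.
observed_steps = n_steps // 2
obs = reactions[-1, :observed_steps]
coef, *_ = np.linalg.lstsq(Vt[:, :observed_steps].T, obs, rcond=None)
pred = coef @ Vt

err = np.max(np.abs(pred[observed_steps:] - reactions[-1, observed_steps:]))
print(f"max prediction error on the unseen half: {err:.2e}")
```

Because the simulated data exactly matches the low-rank assumption, the prediction error here is near machine precision; real facial data is far noisier, which is why the production systems use a learned autoencoder rather than plain factorisation.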
Disney has already put this technology to work to test audience reactions to The Jungle Book and Star Wars: The Force Awakens. Infrared cameras tracked audience reactions during movie screenings, and after a few minutes the algorithm was able to predict when in the movies people would laugh or smile.
How are brands using this technology?
This technology could transform the way brands carry out market research. Affectiva is one of the biggest companies currently operating in the emotion detection and recognition software industry. Kellogg’s was one of the first brands to test drive their software.
Kellogg’s used it to test audience reactions to multiple versions of an advert for their cereal. They were able to determine that, although one version elicited a strong positive reaction on first viewing, subsequent viewings weren’t as positive. This allowed them to settle on a version of the advert that sustained steadier levels of engagement.
Isn’t that pretty invasive?
It’s important to remember that this technology can only ever be used with consent. You needn’t worry about your webcam spying on your reactions to things you find online. It can only be used in market research situations where the person being observed has given their consent. Rather than filling out a form at the end, their facial expressions can do all of the talking.
There are some scenarios where the technology can be used in real time. The video game Nevermind uses emotion detection software to change the game based on the player’s emotional reactions. However, it’s important to note that this information is only used to adapt the game for that individual, and none of it is stored.
How can this be used in marketing?
The most obvious use is in market research and testing consumer reactions to a brand. Consumers make decisions based on emotional connections rather than information, so it’s easy to see why brands will want to know what their customer really thinks about them.
The technology could also be put to use in testing user experience and conversion rate optimisation. E-commerce navigation is notoriously difficult to get right. Founded in 2015, Space Between is bridging the gap between user experience testing and emotion detection, using an EEG cap to measure brain activity during user testing. When user testing subjects are asked to narrate the experience out loud, researchers have found that how they articulate their feelings doesn’t always match up to their facial expressions, which is why many traditional user testing methods fall short.
How close are we to this being widespread?
At the moment, it’s only the companies with big budgets that are able to make the most of this technology. However, Affectiva currently has a section of their site where you can try the technology for yourself: watch a YouTube video and see your emotional responses charted below it.