The US TV show Lie to Me examines the art of ‘reading’ people’s faces to detect their emotions or whether they are lying. Most of us can read emotions to some degree, and talented sales and service personnel at dealerships and car manufacturers are extremely adept at gauging customers’ emotions and using that skill to great advantage.
Can technology do better? Well, no. At least, not yet. But technological innovation is making it possible to read customer emotions on a grand scale. Affective computing, as the field is known, is already being used in advertising and marketing, and in some call centres to gauge whether people’s insurance claims are genuine or bogus.
One of the key innovators in this area, which sprang from the Massachusetts Institute of Technology (MIT) Media Lab, is Affectiva and its product Affdex, a cloud-based technology that can recognise facial expressions. Using a webcam, it picks up the raised eyebrows, frowns and smirks we are often unaware we’re making (despite our best attempts at a poker face). From that data, it can work out whether a person is confused, astonished, annoyed or feeling a range of other emotions. It can then aggregate that data from multiple participants and present it in dashboard format to provide a timely view of customer feelings, typically about an advert they’ve seen.
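The aggregation step described above can be sketched in a few lines of Python. Everything here is illustrative: the participant IDs, emotion labels, 0–1 confidence scores and the `timeline` function are assumptions for the sake of the example, not Affdex’s actual data format or API.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-frame readings from several webcams:
# (participant_id, second_into_advert, emotion, confidence 0-1).
readings = [
    ("p1", 0, "surprise", 0.8), ("p2", 0, "surprise", 0.6),
    ("p1", 1, "confusion", 0.7), ("p2", 1, "confusion", 0.9),
    ("p1", 2, "joy", 0.5), ("p2", 2, "annoyance", 0.4),
]

def timeline(readings):
    """Average each emotion's confidence across participants, per second,
    producing the kind of time series a dashboard could chart."""
    buckets = defaultdict(list)
    for _participant, second, emotion, score in readings:
        buckets[(second, emotion)].append(score)
    return {key: mean(scores) for key, scores in sorted(buckets.items())}

for (second, emotion), avg in timeline(readings).items():
    print(f"t={second}s {emotion}: {avg:.2f}")
```

A real system would of course feed this from a computer-vision model running on video frames; the point is simply that once per-viewer emotion scores exist, rolling them up into a second-by-second audience view is straightforward.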
Facing the truth
The original idea was to help those on the autistic spectrum to recognise emotions, but its commercial implications were quickly recognised. Unilever and The Coca-Cola Company have already used the technology to test customer responses to advertising.
Earlier this year, AOL entertainment company Be On introduced a platform from UK firm Realeyes that reads people’s facial expressions while they watch video content, so that it can begin to understand their unconscious feelings about the brand messages they are watching. Be On chief executive René Rechtman said it was considering ways to track the emotions of general users, but only those who chose to opt in. The long-term aim is to improve the experience of customers exposed to any AOL or brand content.
But can affective computing be applied to the wider customer experience and to the automotive industry?
Emotion measurement technology opens up a host of new ways of getting under the skin of customers and giving them more of what they really want. In focus groups, it could give you a more accurate understanding of what customers really think of your latest car features, rather than the responses they believe you want to hear.
Using technology also allows faster responses on a much larger scale than face-to-face focus groups. A better understanding of customers’ true likes and dislikes is powerful knowledge, enabling car manufacturers to create a more personalised customer experience and products more closely in tune with what buyers want.
This field of research, which has emerged over the last few years, has potential beyond the obvious advertising and marketing applications: gauging what people are feeling when they look at products in-store, or perhaps at cars in a dealership.
And it’s not just visual triggers. Beyond Verbal is creating software that can work out whether you’re arrogant or annoyed just from the tone of your voice. In the car itself, emotion sensors could pick up when a driver is angry, one of the chief causes of accidents.
The opportunities for such technology are immense, though for some it is a little too close to Big Brother for comfort. But emotion measurement technology will soon be ubiquitous, according to the MIT researchers and Affectiva. Clearly, they have a vested interest in saying so, but the potential for using technology to measure emotional responses, and using that information to improve the customer experience, is huge.
It may not be hitting the dealerships yet, but for car manufacturers it could become another weapon in their customer-intelligence arsenal.