Understanding Emotions: How Depth Psychology and Algorithms Enable Accurate Predictions

Depth psychology and algorithms

The rheingold institute combines depth psychological methods with TAWNY's Emotional Recognition Technology, which integrates various approaches to measuring emotions. The software is specially optimized for remote studies with typical stimuli and recognizes even very subtle reactions.

This article appeared on marktforschung.de on June 23, 2022.

These are the moments when market researchers want to look into the minds of consumers. Behind the glass during a group discussion, you hear that no one in the group wants to see the 72-hour claim in the commercial for a new men's deodorant: "Why does this have to work for 72 hours? I shower every day, don't I?" Or consumers describe the successful, pretty career woman in the commercial as "dumb, stupid and clichéd."

A good depth psychological analysis can work out whether this is genuinely demotivating, or whether it is a rational defense against the advertising message. This is where implicit procedures come into play, which subjects cannot influence because their unconscious and automatic reactions are measured.

For a long time, there have been methods based on measurements of bodily functions such as heartbeat, skin resistance, or EEG, which seem to offer direct access to the subject's mind.

Laboratory situations lead to misjudgements

We have always refrained from these procedures, primarily because they involve a laboratory situation in which no consumer reacts as they would at home. Scientifically, it also remains controversial whether specific emotions can be validly inferred from bodily functions. Bodily functions indicate the intensity of an emotion, but not its direction. An example: heart rate rises when one is happy or sexually aroused, but also when one feels fear or anger. This basic problem remains even if the number of parameters is increased and combined.

There is a great deal of scientific credulity involved here; it is for good reason that the polygraph, for example, is still not considered reliable.

Therefore, we have long been looking for an implicit measurement of emotional reactions that is more direct, more natural, and ultimately more evident. That facial expressions can be understood as a mirror of the soul has long been known and well researched, for example by Paul Ekman. Microexpressions, in particular, are involuntary and reveal, even if one does not intend to, what one is really feeling. In addition, facial expressions, shaped by over 20 muscles, are multifaceted, with almost infinite possibilities for expression. This can be seen impressively in media, art, or emoticons. One does not have to rely blindly on a measurement; one can see it for oneself, and the trained eye of the psychologist can read a great deal in facial expressions.

In contrast to body function measurements, whose workings remain hidden, measurements of facial expression are far more visible and immediately evident.

Better algorithms for face reading

Face reading programs for analyzing facial expressions have long been available, reading emotions from the camera image of a face. In the beginning they were rather inaccurate and the measurements were not very detailed, but with the great advances in AI and deep learning, the algorithms have become better and better. rheingold has been experimenting with different software for a long time, for example with FaceReader from Noldus. We have had good results with it, as our own validation studies with follow-up in-depth interviews showed.

Measurement without additional devices

For some time now, we have been working with the software of the young Munich-based company TAWNY, which in our view currently fits our requirements best. The software runs on the test persons' own devices at home, in their natural environment. They do not feel like test subjects at all; they sit in front of the screen with a webcam just as they do when consuming advertising in everyday life. TAWNY also combines webcam-based "automatic facial expression recognition" with eye tracking that requires no additional devices, can learn movement patterns, and additionally measures heart rate relatively reliably.

Emotions are the fuel of action.

The TAWNY software recognizes the presence of different emotion categories such as surprise, joy, sadness or anger. The algorithms are specially optimized for remote studies with typical stimuli and also detect very subtle reactions.

Recognize positive emotions and readiness for action

Based on the deep learning algorithms, a two-dimensional matrix of valence and arousal is then formed. Simplified, this can be thought of as the quality or direction of the emotions (valence) and their strength in terms of a "willingness to act" (arousal). Valence is a good proxy for a positive emotional effect. For example, it can be deduced whether viewers develop positive emotions towards the brand or would like to try the product themselves.
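To make the valence/arousal idea concrete, here is a minimal sketch of how categorical emotion scores for one video frame can be collapsed into a single point in valence/arousal space. The emotion categories and their circumplex coordinates below are illustrative assumptions for this sketch, not TAWNY's actual model or parameters.

```python
# Sketch: mapping per-frame emotion scores to a (valence, arousal) point.
# Coordinates are assumed positions on a valence/arousal circumplex
# (valence: -1 negative .. +1 positive; arousal: 0 calm .. 1 activated).
CIRCUMPLEX = {
    "joy":      ( 0.8, 0.6),
    "surprise": ( 0.2, 0.8),
    "sadness":  (-0.7, 0.2),
    "anger":    (-0.8, 0.8),
    "neutral":  ( 0.0, 0.1),
}

def valence_arousal(scores: dict[str, float]) -> tuple[float, float]:
    """Collapse categorical emotion scores into one (valence, arousal)
    point as a probability-weighted average over the circumplex."""
    total = sum(scores.values()) or 1.0
    valence = sum(p * CIRCUMPLEX[e][0] for e, p in scores.items()) / total
    arousal = sum(p * CIRCUMPLEX[e][1] for e, p in scores.items()) / total
    return valence, arousal

# One hypothetical frame: mostly joy with a hint of surprise.
frame = {"joy": 0.7, "surprise": 0.2, "neutral": 0.1}
v, a = valence_arousal(frame)
print(f"valence={v:.2f} arousal={a:.2f}")
```

Applied per frame over a whole spot, this yields the valence and arousal curves that the analysis below works with.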

High arousal across the entire presentation is also a reliable indicator of cut-through: the spot penetrates the advertising clutter and reaches consumers, who react emotionally and literally do not tune out.

Particularly in the case of moving images, it is possible to answer further questions by specifically analyzing the course of emotions:

  • Does the brand come through, that is, does the spot have sufficient branding?
  • Do the product and its benefits get enough space?
  • Do the emotion parameters show that it is perceived as something interestingly new, or is it just old news?

Genuine annoyance at certain vignettes of a spot also shows up in the parameters, as do difficulties in understanding. The spot can then be optimized in a targeted manner.
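As a rough illustration of this kind of curve analysis, the sketch below flags vignettes whose mean valence dips below a threshold, marking them as candidates for annoyance or comprehension problems. The scene boundaries, sample values, and threshold are invented for the example; a real study would align the curve with the spot's actual storyboard timecodes.

```python
# Sketch: locating problematic vignettes in a valence curve.
def flag_scenes(valence, scenes, threshold=-0.2):
    """Return scenes whose mean valence falls below `threshold`,
    i.e. candidate vignettes for annoyance or misunderstanding.

    valence: per-second valence samples (-1 .. +1)
    scenes:  list of (name, start_s, end_s) tuples
    """
    flagged = []
    for name, start, end in scenes:
        window = valence[start:end]
        mean = sum(window) / len(window)
        if mean < threshold:
            flagged.append((name, round(mean, 2)))
    return flagged

# Hypothetical 9-second spot with three vignettes.
valence = [0.3, 0.4, 0.2, -0.5, -0.6, -0.4, 0.1, 0.5, 0.6]
scenes = [("opening", 0, 3), ("claim", 3, 6), ("pack shot", 6, 9)]
print(flag_scenes(valence, scenes))  # the "claim" vignette stands out
```

In practice the flagged vignettes are exactly the scenes one would probe further in the in-depth interviews described below.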

Emotional Recognition combined with depth psychological methods

There are numerous offers on the market that rely exclusively on face reading tools. Face readers are often also used as DIY tools, where serious misjudgements can occur if one is not familiar enough with the significance of the parameters. Such stand-alone applications are, of course, highly scalable, cheap, and fast.

We achieve better results when we combine Emotional Recognition with a depth psychological analysis using in-depth interviews.

In this process, the psychologists and the face reading specialists work closely together. We consider this imperative in order to clarify the content of the advertising impact: Are the motives of the product area addressed well? Does the spot fit the brand positioning, and are the intended goals from the agency briefing achieved? If you rely on measurement alone, you may end up with a spot that is well received emotionally but whose message is counterproductive to the advertising goals.

Advertising impact reliably predicted

For example, in a storyboard test for a young face cream brand, we found that there was a supporting, very likeable emotional core scene, but it did not pay into the brand or the product. In the end, apart from this scene, little of the advertising impact remained, which would be a disaster for a launch campaign.

Where can such a combined approach be used successfully? It works very well with moving images: the most emotional medium, with which people go along and identify, and which produces very meaningful emotion curves. The emotions evoked by storyboards or animatics can also be measured very well, which is important for the approach's suitability as a pretest. When linking the methods, we start with the measurement and analysis of facial expressions. The results provide initial indications of the spot's strengths and weaknesses as well as important vignettes, and are then explored and analyzed in detail in in-depth interviews. The insights gained can then feed back into the detailed analysis of the face reading parameters, which is carried out together with TAWNY.

As a result, you get a quantitative advertising impact measurement, for which benchmarks exist, along with a comprehensive understanding of the advertising content and dedicated optimization recommendations.

Print motifs or packaging can also be examined well, especially if eye tracking is integrated. With verbal concepts, whether positioning or product concepts, one may have to reckon with rather unclear or ambiguous results in the emotion measurement, because they have less emotional impact. In that case, the results of the in-depth interviews still remain as a basis for evaluation.

The approach can also make an important contribution in the area of UX, whether in testing and optimizing an app, an online customer journey, a homepage, or an online store.

Even with our approach, we still cannot look inside consumers' heads. But we can measure and understand what they really want and how we can win them over as customers.
