Emotional analysis – making customers feel good

An interview with Martin Bäumler

Anger? Annoyance? Ambivalence? A customer’s mood can be decisive for the outcome of a dialogue, so it helps if the person on the other end can sense it. Humans mostly do this intuitively. But how do you teach a machine to recognise emotions?

This is Martin Bäumler’s area of responsibility within the Deutsche Telekom eLIZA project. The aim is to read and recognise emotions from a user’s various inputs and use them to achieve a better outcome for the user – regardless of whether it relates to a search request or contact with customer service. At present this is limited to written text; however, Bäumler’s team will soon start filtering out emotions from the spoken word as well.


What exactly is emotion recognition?

Information is conveyed through what is said, but also in the way it is said. Emotions are an essential element of this complex transfer. To date, automated speech recognition systems have only been able to understand the explicit content. We are working on a way of evaluating every piece of information conveyed via the spoken or written word. Our objective is to offer our customers an even better experience, and even better service.


What is the relationship between speech recognition and emotion recognition, with respect to better understanding the needs of customers?

Speech recognition can live without emotion recognition; however, emotion recognition cannot exist without speech recognition. Speech recognition drives the whole thing. To understand the customer’s problem as well as possible, I first need speech recognition – simply recognising emotions isn’t enough. If the customer just cries when they’re on the phone, this is a clear display of emotion; however, it doesn’t help in determining the problem. This is why speech recognition is a priority. The recognition of emotions only hints at the direction of the solution.
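To make that ordering concrete, here is a minimal schematic in Python. Both functions are placeholder stubs invented for illustration; they stand in for whatever speech recognition and emotion engines a real system would use.

```python
# Schematic of the dependency described above: emotion recognition sits
# on top of speech recognition, not the other way round. Both functions
# are placeholder stubs, not a real ASR or emotion engine.

def transcribe(audio: bytes) -> str:
    """Placeholder ASR step: audio in, text out."""
    return "my internet is down again"  # stubbed transcript

def estimate_emotion(transcript: str) -> str:
    """Placeholder emotion step: operates on the recognised text."""
    return "frustrated"  # stubbed label

def handle_call(audio: bytes) -> tuple[str, str]:
    transcript = transcribe(audio)          # first understand WHAT was said
    emotion = estimate_emotion(transcript)  # then gauge HOW it was said
    return transcript, emotion

print(handle_call(b""))
```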


Apart from Tinka, how else could this recognition be applied?

When applied to customer interactions with human agents, we can provide agents with an indicator of the customer’s emotional state. The call centre agent could then be shown alternative conversational prompts, or the discussion could be passed on to a colleague specialised in challenging cases, who has a talent for de-escalation.
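A hedged sketch of that routing idea: if an estimated emotional state crosses some threshold, the conversation is handed to a de-escalation specialist. The score fields, the threshold value and the queue names here are all assumptions for illustration, not Telekom’s actual system.

```python
from dataclasses import dataclass

@dataclass
class EmotionScore:
    anger: float        # 0.0 (calm) .. 1.0 (very angry) -- assumed scale
    frustration: float  # same scale

ESCALATION_THRESHOLD = 0.8  # assumed cut-off for this sketch

def route_call(score: EmotionScore) -> str:
    """Decide which queue a conversation should go to."""
    if max(score.anger, score.frustration) >= ESCALATION_THRESHOLD:
        return "de-escalation-specialist"
    return "standard-agent"

print(route_call(EmotionScore(anger=0.9, frustration=0.4)))
# -> de-escalation-specialist
```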


How does emotional analysis work? For example, how might you tell that I am in a good mood today?

We can analyse a number of factors within the voice: volume, modulation, pitch, as well as other tiny, almost imperceptible characteristics. For text, things become a little more difficult. Here, we must examine the content very carefully. One indicator might be the emoticons that a customer uses. Sentence construction, capitalisation, word choice, grammar and structure are also key factors in providing information on a user’s emotional state. The interesting thing is that the longer the dialogue, the more precise the analysis can be.
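A few of these text signals can be illustrated with a toy feature extractor. Everything here – the emoticon lists, the feature names, the example message – is a simplified assumption for illustration; a production system would feed far richer features into a trained model.

```python
import re

POSITIVE_EMOTICONS = {":)", ":-)", ":D", ";)"}   # assumed toy lists
NEGATIVE_EMOTICONS = {":(", ":-(", ":'(", ">:("}

def text_emotion_features(message: str) -> dict:
    """Extract crude, surface-level emotion signals from one chat message."""
    tokens = message.split()
    return {
        "positive_emoticons": sum(t in POSITIVE_EMOTICONS for t in tokens),
        "negative_emoticons": sum(t in NEGATIVE_EMOTICONS for t in tokens),
        # Long runs of capitals often read as shouting.
        "all_caps_words": sum(t.isupper() and len(t) > 2 for t in tokens),
        # Repeated exclamation marks can signal agitation.
        "exclamation_runs": len(re.findall(r"!{2,}", message)),
        "message_length": len(tokens),
    }

print(text_emotion_features("WHY is my bill SO HIGH again?!! :("))
```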


Do you also use information about the user in the analysis, such as whether they live in what might be considered a good residential area?

It is not our role to determine whether a particular residential area is better or worse than another. We would never, under any circumstances, set up a user profile that included this type of personal information. Data protection is our highest priority. When the customer is logged in, we can clearly identify them and we also know their address. However, our priority is solving the customer's problem as quickly and efficiently as possible – regardless of where they are from. All customers are equal to us. And every customer is entitled to the best possible service, irrespective of whether he or she is a prepaid or premium customer.


How do you evaluate intercultural differences in this context? You mentioned volume as one of the factors. Some might venture to suggest that a Southern European, for instance, would generally speak louder – or they might do so if they become agitated – whereas Northern Europeans would be more likely to use a quieter tone.

Intercultural differences play a far smaller role than you might think. Volume is just one of the factors that we use in combination with others to analyse the customer’s emotions. And the more customer contact the system has, the more it improves over time.


How do you impart this understanding to an artificial intelligence? How does the process work?

First of all, the system analyses existing chats and existing information. The more interactions there are, the better it becomes. Likewise, the people who actually chat with customers every day play a significant role: they contribute their personal specialist knowledge and, in doing so, enhance our algorithm.
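In outline, that learning loop could look like the following sketch: historical chats, labelled by the agents who handled them, train a text classifier. The example chats, labels and model choice are invented placeholders, not the eLIZA pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Agent-labelled historical chats (placeholder examples).
chats = [
    "My internet has been down for three days, this is unacceptable!",
    "Thanks, that fixed it right away.",
    "I have been waiting in the hotline forever.",
    "Great service, problem solved.",
]
labels = ["angry", "satisfied", "angry", "satisfied"]

# Text features in, emotion label out; retraining on more interactions
# is what makes the system improve over time.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(chats, labels)

print(model.predict(["Why does nobody answer my emails?!"]))
```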


How many algorithms do emotions need? Is there a formula for feelings?

We are in the process of developing an emotional model, and of clarifying questions like these. Which emotions are in any way relevant to us? How do I differentiate between them? Do I need to distinguish, for example, between wrath and anger, or joy and happiness? At the end of the day, it boils down to a very small emotional map: one that is highly specialised and doesn’t include many emotions at all.
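One way to picture such a small emotional map is to collapse near-synonyms into a handful of working labels, as in this sketch; the label set itself is an assumption chosen for illustration.

```python
from enum import Enum

class Emotion(Enum):
    ANGER = "anger"              # also covers wrath, rage, annoyance
    JOY = "joy"                  # also covers happiness, delight
    FRUSTRATION = "frustration"
    NEUTRAL = "neutral"

# Fine-grained labels mapped onto the coarse working set.
SYNONYMS = {
    "wrath": Emotion.ANGER,
    "rage": Emotion.ANGER,
    "happiness": Emotion.JOY,
    "delight": Emotion.JOY,
}

print(SYNONYMS["wrath"].value)  # -> anger
```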


Is it ethically and morally right to recognise emotions?

Emotions are the heart of what it is to be human. If we can recognise, evaluate, potentially simulate and respond to them – and if, at the same time, we build artificial intelligence that becomes increasingly human – then we must increasingly consider the ethical and moral questions.


What consequences arise from making AI emotional, and with it more human?

It gives us better access to a broader audience. Whether people use a technology at all often has a lot to do with reservations about engaging with it. Look at older people – they are cautious when it comes to using new technologies. The way in which a machine responds to the needs of the individual will ultimately break down the barriers between artificial intelligence and humans.
