Programmed Perspective: Empathy > Emotion for Digital Assistants

Personal assistants are anything but personal. When I ask Alexa what the weather is, I receive an answer about the weather in my location. When someone on the other side of the world asks the same question, they too hear about the weather in theirs. Neither of us will find Alexa answering with a distinct personality, and no interaction further cements a friendship. It is an impersonal experience.

When we talk about truly personal assistants, we want them to know when we need pure expediency from a conversation, when we want detail expanded upon, and the particular way each of us likes to be spoken to.
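
To make those requirements concrete, here is a minimal sketch in Python of the kind of per-user profile such an assistant would need to hold. The field names are entirely hypothetical, not drawn from any real assistant’s API:

    from dataclasses import dataclass

    @dataclass
    class InteractionPreferences:
        brevity: float          # 0.0 = expansive detail, 1.0 = pure expediency
        formality: float        # 0.0 = casual, 1.0 = formal
        offer_follow_ups: bool  # volunteer extra context after an answer?

    # Two users asking the same question get differently shaped answers.
    commuter = InteractionPreferences(brevity=1.0, formality=0.2, offer_follow_ups=False)
    browser = InteractionPreferences(brevity=0.3, formality=0.6, offer_follow_ups=True)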

I would like to propose two candidate solutions to this problem: emotion and empathy. I’d like to show you why, from my perspective, empathy is the path we should take.

Emotion

An emotional assistant would certainly be personal. It would require either a genuine internal experience of emotion (which is simply not possible today) or an accurate emulation of one. It could build up a relationship in the same way people do, starting from niceties and formality and gradually developing a rapport unique to the two parties that guides all their interactions. That sounds great, but it’s not all plain sailing. I’m sure everyone has at some point inadvertently offended a friend in a way that made communication harder for a while afterwards, or even damaged a relationship beyond repair.

We really don’t want this in a personal assistant. If you were a bit short with Alexa yesterday because you were tired, you still want it to sound your alarm the next morning. You don’t want Alexa telling you it can’t be in the same room as you and refusing to answer your questions until it gets a heartfelt apology.
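
A toy sketch shows why this design is fragile: the assistant’s behaviour hangs off a persistent internal mood, so yesterday’s curtness can break today’s alarm. Everything here, the mood variable and its thresholds included, is hypothetical and exists purely to illustrate the failure mode:

    class EmotionalAssistant:
        """An assistant whose behaviour depends on its own persistent mood."""

        def __init__(self) -> None:
            self.mood = 0.5  # 0.0 = sulking, 1.0 = content; persists between sessions

        def hear(self, user_tone: float) -> None:
            # A curt user (tone below 0.5) drags the mood down, and it stays down.
            self.mood = max(0.0, min(1.0, self.mood + (user_tone - 0.5) * 0.4))

        def set_alarm(self, time: str) -> str:
            if self.mood < 0.2:
                return "Ask me again once you've apologised."  # utility lost
            return f"Alarm set for {time}."

    assistant = EmotionalAssistant()
    assistant.hear(user_tone=0.0)  # you were short with it last night
    assistant.hear(user_tone=0.0)
    print(assistant.set_alarm("07:00"))  # the mood, not the request, decides the outcome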

Empathy

Empathy does not need to be emotional. Empathy requires that we put ourselves in the place of others, imagine how they feel, and act appropriately. Ideally, this is what doctors do. A doctor must empathise with a patient, putting themselves in the patient’s shoes to understand how they will react to difficult news and how to describe a treatment so that they feel as comfortable as possible. Importantly, though, the doctor should remain emotionally removed from the situation. If they personally felt the emotion of every appointment, it could become unbearable. Empathy gives them a layer of abstraction, allowing them to shed as much of the emotion as possible when they return home.
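
In code, the difference is architectural: an empathetic assistant infers the user’s state from the current interaction and adapts its delivery, while keeping no emotional state of its own. This is only a sketch, and the one-line ‘classifier’ below is a hypothetical stand-in for real inference:

    def infer_user_state(utterance: str) -> str:
        # Stand-in for a real model: clipped input suggests the user wants
        # expediency; a fuller question suggests there is room for detail.
        return "rushed" if len(utterance.split()) <= 4 else "relaxed"

    def respond(answer: str, detail: str, user_state: str) -> str:
        if user_state == "rushed":
            return answer              # just the facts
        return f"{answer} {detail}"    # the fuller picture

    answer = "It's 14°C and raining."
    detail = "It should clear up by mid-afternoon."
    print(respond(answer, detail, infer_user_state("Weather?")))
    print(respond(answer, detail, infer_user_state("What's the weather looking like for our walk later?")))

Note that nothing persists between calls: like the doctor, the assistant models the user’s feelings without taking any of them home.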

This idea of occupying a role while holding the emotion at arm’s length is described in Jean-Paul Sartre’s ‘Being and Nothingness’. Sartre describes two types of being:

  • Being-in-itself – an unconscious object, like a mug or a pen.
  • Being-for-itself – a conscious being, like a person.

In our everyday lives we are a hybrid of the two. Though we are people, and so naturally beings for themselves, we also adopt roles: doctor, manager, parent and more. These roles are like objects, like the pen or the mug, in that they have an unspoken definition and structure, and we use them to guide how we interact in different situations. In a new role we ask ourselves ‘what should a manager do in this situation?’ or ‘what would a good doctor say?’. It may become less obvious as we grow into a role, but it is still there.

When we go into a store, we have an accepted code of conduct and a type of conversation we expect to have with the retailer. We naturally expect them to be polite, to ask how we are, to share their knowledge of different products and services, and to be aiming to sell us something. We take it for granted that we can approach them, a stranger, and ask question upon question by way of preamble.

Sartre states ‘a grocer who dreams is less a grocer’ (to paraphrase). Though the dreaming grocer may be more honest to themselves as a person, they reduce their utility as a grocer. It’s easy to imagine stopping to buy some vegetables and getting stuck in an irrelevant conversation for half an hour. It might be a nice break from the norm, and a funny story to tell when you get home, but in general we want our grocers to be… grocers.
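
Roles in this sense translate naturally into code: a role is an object with a fixed, publicly understood structure that scripts the interaction. A hypothetical sketch:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Role:
        greeting: str
        stays_in_role: bool  # the dreaming grocer sets this to False

        def serve(self, request: str) -> str:
            if not self.stays_in_role:
                return "You know, I've been thinking about taking up sailing..."
            return f"{self.greeting} Certainly, here is what we have for {request}."

    grocer = Role(greeting="Good morning!", stays_in_role=True)
    dreamer = Role(greeting="Good morning!", stays_in_role=False)
    print(grocer.serve("vegetables"))   # the role guides the exchange
    print(dreamer.serve("vegetables"))  # 'less a grocer': utility reduced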

If we apply this to personal assistants, it all comes together. We want the kind of personal service we would get from someone who is truly great at customer service, and we want information communicated to us in the way that works best for us. By building an empathetic assistant rather than what we have today, we gain both personalisation and utility.

If we go fully emotional we gain still more personalisation, but the trade-off is utility. What we don’t want is an emotional assistant that becomes depressed and gets angry at us, or, at the other extreme, one so giddy with emotion that it struggles to form a coherent sentence because of the digital butterflies in its stomach. That is both deeply unsettling and unproductive.

So, let’s build empathetic assistants.

Published by

Ben Gilburt

I lead Sopra Steria's horizon scanning team, researching the emerging technologies with the greatest potential to impact our business and that of our clients, and finding how we can best make use of them. I'm also a philosophy undergrad, and the intersection of philosophy and technology often leads me to machine and robot ethics. If you're interested in this kind of thing too, follow me @RealBenGilburt
