Monday, June 21, 2004

Artificial Empatelligence

Jonathan Ichikawa recently noticed an interesting contrast between how different businesses want to portray the mechanical/computational aspects of their service:
[A]n amusing message from Amazon.com: "Dear Customer, We've noticed that many customers who've purchased albums by Various are also interested in music by John Williams."...

Consider the extremely similar phenomenon of Google's Gmail ad-targeting... Gmail wisely does not follow Amazon in anthropomorphizing its pattern-analyzing self... They're having enough privacy trouble as it is. Apparently there's an important difference between Amazon and Gmail: Amazon seems to thrive on selling itself as a smart person who watches us very carefully and anticipates our desires, while Gmail has to work very hard to avoid that impression.
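To make the mechanical side of this concrete, here's a toy sketch (in Python) of the sort of naive co-purchase counting that could produce a message like the one Jonathan quotes. This is emphatically not Amazon's actual algorithm - just an illustrative guess, with invented purchase histories. Note how the catch-all artist tag "Various" gets treated as if it were a real artist, which is exactly how such an amusing recommendation can arise.

    # A toy illustration (not Amazon's actual algorithm) of naive
    # co-purchase counting plus a friendly message template.
    from collections import Counter
    from itertools import permutations

    # Invented purchase histories: each is a list of album artists.
    # Compilation albums get the catch-all artist tag "Various".
    histories = [
        ["Various", "John Williams"],
        ["Various", "John Williams", "Miles Davis"],
        ["Various", "Norah Jones"],
        ["Miles Davis", "John Coltrane"],
    ]

    # Count how often artist b shows up in the same history as artist a.
    co_counts = Counter()
    for history in histories:
        for a, b in permutations(set(history), 2):
            co_counts[(a, b)] += 1

    def recommend(artist):
        """Pick the most frequent co-purchase for `artist` and dress it
        up as a first-person observation, Amazon-style."""
        candidates = {b: n for (a, b), n in co_counts.items() if a == artist}
        if not candidates:
            return "Dear Customer, we have no recommendations for you yet."
        best = max(candidates, key=candidates.get)
        return ("Dear Customer, We've noticed that many customers who've "
                f"purchased albums by {artist} are also interested in "
                f"music by {best}.")

    print(recommend("Various"))
    # -> Dear Customer, We've noticed that many customers who've purchased
    #    albums by Various are also interested in music by John Williams.

The point of the sketch is just that the "empathy" lives entirely in the message template; the underlying process is bare counting.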

Now, what I find interesting about this contrast is what it implies for the growing importance of emotional labour in the modern economy (thanks to Just Left for the link):
Two ways of measuring the demands of a job have defined industrial relations since the beginning of the Industrial Revolution - time and effort - but a third has emerged in the past few decades: emotional labour. It's not just your physical stamina and analytical capabilities that are required to do a good job, but your personality and emotional skills as well. From a customer services representative in a call centre to a teacher or manager, the emotional demands of the job have immeasurably increased.

...Empathy has become big business, according to consultancy Harding & Yorke, which claims to be able to measure every aspect of the emotional interaction between customer and company. If a company wants its employees to sound warmer or more natural, it turns to the likes of Bob Hughes at Harding & Yorke. Delight your customers and they'll be back, is his watchword: empathy makes money.

...This kind of cognitive restructuring of employees' responses is required to pamper the customer's every whim. Such self-control can be very hard work, as management theorist Irena Grugulis points out: "Expressing warmth towards and establishing rapport with customers may provide a genuine source of pleasure for workers. Yet in practice, emotions are incorporated into organisations within strict limits. Emotion work does not necessarily legitimise the expression of human feelings in a way that supports the development of healthy individuals, instead it offers these feelings for sale."

I feel that there's an important link to be made here, but I'm having trouble putting my finger on it. I guess it's to do with the future presentation of computational processes: will they increasingly be 'personalised', or 'dressed up', to simulate a real person? Or will they instead remain cold and impersonal, merely mechanical, to reassure us that there's no threat to our own humanity?

One interesting issue regards the possibility of using mechanical processes (e.g. complex computer programs) to ease these emotional workloads. As Jonathan mentioned, some people find it disconcerting when computer programs mimic a personal touch. But I find the "friendliness" of telemarketers (for example) to be no less artificial and off-putting. In effect, we're currently asking real people to pretend to be machines pretending to be real people. The Guardian article suggests that this puts an enormous strain on the workers involved, who are instructed to "Think of yourself as a trash can. Take everyone's little bits of anger all day, put it inside you, and at the end of the day, just pour it into the dumpster on your way out of the door". So why not skip the outer layer of deception, and just use machines pretending to be real people?

Of course, given current levels of technology, this simply isn't practical. Machines aren't versatile enough to respond - let alone respond appropriately - to all the different possible concerns a customer might have. But I see no reason why advances in artificial intelligence couldn't make improved simulations viable, perhaps within the next few decades.

So I guess the questions I'm raising here are:
1) How will computational processes be presented in future - as personal or impersonal? (Or perhaps both, with a growing divide between the Amazon type vs. the Gmail type?)
2) Would it be possible (in future) to use advanced computer programs for emotional labour?
3) If so, would this be desirable?

1 comment:

  1. [Copied from old comments thread]

    This is an interesting discussion. All I have to say right now, though, is that this is false:

    "Jonathan finds it disconcerting when computer programs mimic a personal touch."

    I personally have no trouble with it at all; I was merely observing that some people do and would.
    Jonathan | 22nd Jun 04 - 8:38 am

    -------------------

    I'm not sure about the three questions, but I do want to say this: I agree with what you say about the emotive sales techniques these days... they are kinda silly. Personally, I find it insulting when I go to Burger King, order a burger, and can tell that the person over the counter is forcing a smile and repeating the same cheesy line to me as they did to the last 100 customers. Interestingly, Burger King has a store policy that the staff have to have a 'genuine smile' on their face... which is kinda stupid cuz if you're not happy, how can you smile genuinely? I'd prefer a totally neutral, but polite manner over that false emotion any day! And as far as computer programs go, I would go insane if I rang up tele-banking and they used even one more word than is absolutely necessary. I don't ring up for a conversation, I ring up to do my banking.

    I get pissed off when the people over the phone ask me "if there is anything else I wanted"... if there was, I would ask, and if there isn't, they are wasting my time by asking! I'm sure that there are many people who disagree with me on this one and like polite, friendly automatons (whether human robots or simulated ones!), but I prefer functionality.
    Patrick Kerr | 22nd Jun 04 - 12:14 pm

    ---------------

    Sorry Jonathan, I'll update the post accordingly.

    Patrick - Yeah, I'd certainly agree with your preference for a polite manner over false friendliness. Regarding the robots, however, it's worth bearing in mind that personality can aid functionality - for example, if you compare "we've noticed..." to something like "Computational statistical analyses indicate that...", I think the former is actually preferable - both nicer to deal with, and easier to understand.
    Richard | 22nd Jun 04 - 3:53 pm

    ------------------

    I don't mind when computers pretend to be people, because they generally do a poor job of it and I can laugh at them: "haha, stupid thing doesn't make sense", as in the Amazon email there.

    But when people pretend to be computers pretending to be people, as you put it, it annoys me. Why can't they just pretend to be people and be honest about what they really think about some things, rather than sticking to the exacting Company Line that they've been told to? Overly happy and enthusiastic shop people piss me off.

    As for Patrick's comment about "anything else?" - this is a very useful question, because you may have had something else that you wanted to talk about but have temporarily forgotten, and this can trigger your memory and prompt you to ask, rather than having to ring up later, visit them later, or just plain forget about it altogether.
    Lanthanide | 23rd Jun 04 - 12:48 pm

