[A]n amusing message from Amazon.com: "Dear Customer, We've noticed that many customers who've purchased albums by Various are also interested in music by John Williams."...
Consider the extremely similar phenomenon of Google's Gmail ad-targeting... Gmail wisely does not follow Amazon in anthropomorphizing its pattern-analyzing self... They're having enough privacy trouble as it is. Apparently there's an important difference between Amazon and Gmail: Amazon seems to thrive on selling itself as a smart person who watches us very carefully and anticipates our desires, while Gmail has to work very hard to avoid that impression.
Now what I found interesting about this is its implications when we consider the growing importance of emotional labour in the modern economy (thanks to Just Left for the link):
Two ways of measuring the demands of a job have defined industrial relations since the beginning of the Industrial Revolution - time and effort - but a third has emerged in the past few decades: emotional labour. It's not just your physical stamina and analytical capabilities that are required to do a good job, but your personality and emotional skills as well. From a customer services representative in a call centre to a teacher or manager, the emotional demands of the job have immeasurably increased.
...Empathy has become big business, according to consultancy Harding & Yorke, which claims to be able to measure every aspect of the emotional interaction between customer and company. If a company wants its employees to sound warmer or more natural, it turns to the likes of Bob Hughes at Harding & Yorke. Delight your customers and they'll be back, is his watchword: empathy makes money.
...This kind of cognitive restructuring of employees' responses is required to pamper the customer's every whim. Such self-control can be very hard work, as management theorist Irena Grugulis points out: "Expressing warmth towards and establishing rapport with customers may provide a genuine source of pleasure for workers. Yet in practice, emotions are incorporated into organisations within strict limits. Emotion work does not necessarily legitimise the expression of human feelings in a way that supports the development of healthy individuals, instead it offers these feelings for sale."
I feel that there's an important link to be made here, but I'm having trouble putting my finger on it. I guess it's to do with the future presentation of computational processes - will they tend to get "personalised", or "dressed up", increasingly simulating a real person? Or will they instead remain cold and impersonal, merely mechanical, to assure us that there's no threat to our own humanity?
One interesting issue regards the possibility of using mechanical processes (e.g. complex computer programs) to ease these emotional workloads. As Jonathan mentioned, some people find it disconcerting when computer programs mimic a personal touch. But I find the "friendliness" of telemarketers (for example) to be no less artificial and off-putting. In effect, we're currently asking real people to pretend to be machines pretending to be real people. The Guardian article suggests that this puts an enormous strain on the workers involved, who are instructed to "Think of yourself as a trash can. Take everyone's little bits of anger all day, put it inside you, and at the end of the day, just pour it into the dumpster on your way out of the door". So why not skip the outer layer of deception, and just use machines pretending to be real people?
Of course, given current levels of technology, this simply isn't practical. Machines aren't versatile enough to respond - let alone respond appropriately - to all the different possible concerns a customer might have. But I see no reason why advances in artificial intelligence couldn't make improved simulations viable, perhaps within the next few decades.
So I guess the questions I'm raising here are:
1) How will computational processes be presented in future - as personal or impersonal? (Or perhaps both, with a growing divide between the Amazon type and the Gmail type?)
2) Would it be possible (in future) to use advanced computer programs for emotional labour?
3) If so, would this be desirable?