Had my first encounter with a (not very) new word the other day. Came up in an Innotribe session at Sibos (I would link, but the conference isn't my point here and it's copiously online). The word is "cobot". And the thinking is: your job is taken by a robot, and the robot is very good at most of it. But there's an element that the robot just doesn't get. For example: "Why are we making a film that depicts the destruction of the human race by..." terminators, Mayan prophecies (remember?), zombies, overly warm weather? Delete as applicable. Difficult to explain to a shiny silver mind that human minds enjoy disasters, horror movies, chaos, entertaining thoughts of their own destruction. So you need a human mind to do the understanding of all that.
Asimov got it right, in the sense that the motivation (sorry, spoiler) in his Robot books is protection. So did James Cameron and Gale Anne Hurd. Sort of. Or possibly not. Have I got that wrong? It was pre-emptive self-defence, maybe? Anyway - I suspect that if there's ever a real-life equivalent to any of that, there will have to be a human mind behind it all, getting a thrill out of nudging that robot finger towards the big red button. Stretching a point, I suppose the Terminator is only doing what James Cameron directs it to do.
And then, secondly, there's the real variety of AI. It's present in the world today, and it isn't half as impressive as the first, imaginary kind. Real AI does that thing with your fridge where it works out that you're low on milk and orders more from a delivery service. Or it sits on tables in the houses of people who live in television advertisements, and responds wittily to their instructions to alter the mood lighting or the heating. Real AI sometimes has names, and it just extends our capabilities - "just"! It's useful in the way that a TV remote is useful if you've just got comfortable and an ad comes on for, er, [insert name here]. It kills us slowly, like so much technology kills us eventually. [Discuss.]
When the real disaster comes, and the supermarkets are emptied by looters, I hope I can channel some kind of ancestral hunting/warming-the-house instinct. How many of us secretly believe we'd manage quite well in the proverbial zombie apocalypse - and how many of us had full fuel tanks when the tanker drivers went on strike in the UK a few years back? When the real disaster comes - if it ever does - it won't be because some machine decides all by itself to turn nasty. We'll nudge it along. "Algorithms have parents," says Clara Durodié, CEO, Cognitive Finance Group (Sibos again). We're "bringing up" the machines; we're responsible for their "adult" behaviour (but let's not get into nature versus nurture, please).
So the singularity that matters isn't the one where some robot starts making up its own mind (in however many senses of that phrase). It's the singularity where the real AI does what we all seem to want it to do, and merges with the imaginary AI - with our encouragement. The danger is not robots going crazy on their own; it's robots doing what they're told.
* In a way that I didn't quite get here.
The issue goes away with empathy. It goes away with understanding. It goes away with any sense of the self as one of many. Talking about behaviour, we can agree on what's "bad" and what's "wrong" because we come together in communities and have to/want to get along. If we're going to bring in words that echo religion, we could mention tolerance here, even forgiveness. Maybe persuasion fits in as well, and possibly even reason? The statement "We are not animals" is true as well as false. But I don't think anything goes away - not completely, not finally - with intolerance, or coercion, or any kind of legalistic, judgemental enforcement of rules.
It's a difficulty to which I doubt there's an answer. I just notice the hostility that we bring to disagreement these days. We argue for our political ideals by vilifying the believers in other ideals - left versus right, Remain versus Brexit, Trump versus ... yeah. We attack each other's rival ideas of how to achieve peace. To labour the point, we're aggressive in our pacifism.
Actually, this goes back through history, and religious belief hardly guarantees good behaviour either, does it? I suppose what I'm suggesting is ... oh ... a combination of self-confidence and respect for others? Something like that? Lots of high-sounding qualities that come from within. Or maybe what I'm doing here is thinking around the apparent disappearance of all those "goes away with" qualities from the second paragraph.
What's missing from our lives, and why? What makes us react the way we do?