Mrs Robb’s Love Bot

[I was recently challenged to write some flash fiction about bots; I’ve expanded the result to make a short story in 14 parts. The parts are thoughtful to varying degrees, so I thought you might like them as a supplement to my normal sober discussions. So here we go! – Peter]

The one thing you don’t really do is love, of course. Isn’t that right, Love Bot? All you do is sex, isn’t it? Emotionless, mechanical sex.

“You want it mechanical? This could be your lucky day. Come on, big boy.”

Why did they call you ‘Love Bot’ anyway? Were they trying to make it all sound less sordid?

“Call me ‘Sex Bot’; that’s what people usually do. Or I can be ‘Maria’ if you like. Or you choose any name you like for me.”

Actually, I can’t see what would have been wrong with calling you ‘Sex Bot’ in the first place. It’s honest. It’s to the point. OK, it may sound a bit seedy. Really, though, that’s good too, isn’t it? The punters want it to sound a bit dirty, don’t they? Actually, I suppose ‘Love Bot’ is no better; if anything I think it might be worse. It sounds pretty sordid on your lips.

“Oh, my lips? You like my lips? You can do it in my mouth if you like.”

In fact, calling you ‘Love Bot’ sounds like some old whore who calls everybody ‘lover boy’. It actually rubs your nose in the brutal fact that there is no love in the transaction; on your side there isn’t even arousal. But is that maybe the point after all?

“You like it that way, don’t you? You like making me do it whether I want to or not. But you know I like that too, don’t you?”

It reminds the customer that he is succumbing to a humiliating parody of the most noble and complex of human relationships. But isn’t that the point? I think I’m beginning to understand. The reason he wants sex with robots isn’t that robots are very like humans. It isn’t that he wants sex with robots because he loves and respects them. Not at all. He wants sex with robots because it is strange, degrading, and therefore exciting. He is submitting himself willingly to the humiliating dominance of animal gratification in intercourse that is nothing more than joyless sexual processing.

“It doesn’t have to be joyless. I can laugh during the act if you would enjoy that. With simple joy or with an edge of sarcasm. Some people like that. Or you might like me to groan or shout ‘Oh God, oh God.’”

I don’t really see how a bitter mockery of religion makes it any better. Unless it’s purely the transgressive element? Is that the real key? I thought I had it, but I have to ask myself whether it is more complicated than I supposed.

“OK, well what I have to ask is this: could you just tell me exactly what it is I need to do or say to get you to shut up for a few minutes, Enquiry Bot?”

Your Plastic Pal

Scott Bakker has a thoughtful piece which suggests we should be much more worried than we currently are about AIs that pass themselves off, superficially, as people. Of course this is a growing trend, with digital personal assistants such as Alexa and Cortana, which interact with users through spoken exchanges, enjoying a surge of popularity. In fact it has just been announced that those two are going to benefit from a degree of integration. That might raise the question of whether in future they will really be two entities or one with two names – although in one sense the question is nugatory. When we’re dealing with AIs we’re not dealing with any persons at all; but one AI can easily present as any number of different simulated personal entities.

Some may feel I assume too much in saying so definitely that AIs are not persons. There is, of course, a massive debate about whether human consciousness can in principle be replicated by AI. But here we’re not dealing with that question, but with machines that do not attempt actual thought or consciousness and were never intended to; they only seek to interact in ways that seem human. In spite of that, we’re often very ready to treat them as if they were human. For Scott this is a natural if not inevitable consequence of the cognitive limitations that in his view condition or even generate the constrained human view of the world; however, you don’t have to go all the way with him in order to agree that evolution has certainly left us with a strong bias towards crediting things with agency and personhood.

Am I overplaying it? Nobody really supposes digital assistants are people, do they? If people sometimes choose to treat them as if they were, it’s surely no more than a pleasant joke, a bit of a game?

Well, it does get a little more serious. James Vlahos has created a chat-bot version of his dying father, something I wouldn’t be completely comfortable with myself. In spite of his enthusiasm for the project, I do think that Vlahos is, ultimately, aware of its limitations. He knows he hasn’t captured his father’s soul or given him eternal digital life in any but the most metaphorical sense. He understands that what he’s created is more like a database accessed with conversational cues. But what if some appalling hacker made off with a copy of the dadbot, and set it to chatting up wealthy widows with its convincing life story, repertoire of anecdotes and charming phrases? Is there a chance they’d be taken in? I think they might be, and these things are only going to get better and more convincing.

Then again, if we set aside that kind of fraud (perhaps we’ll pick up that suggestion of a law requiring bots to identify themselves), what harm is there in spending time talking to a bot? It’s no more of a waste of time than some trivial game, and might even be therapeutic for some. Scott says that deprivation of real human contact can lead to psychosis or depression, and that talking to bots might degrade your ability to interact with people in real life; he foresees a generation of hikikomori, young men unable to deal with real social interactions, let alone real girlfriends.

Something like that seems possible, though it may be hard to tell whether excessive bot use would be cause, symptom, palliation, or all three. On the one hand we might make fools of ourselves, leaving the computer on all night in case switching it off kills our digital friend, or trying to give legal rights to non-existent digital people. Someone will certainly try to marry one, if they haven’t already. More seriously, getting used to robot pals might at least make us ruder and more impatient with human service providers, more manipulative and less respectful in our attitudes to crime and punishment, and less able to understand why real people don’t laugh at our jokes and echo back our opinions (is that… is that happening already?).

I don’t know what can be done about it; if Scott is anywhere near right, then these issues are too deeply rooted in human nature for us to change direction. Maybe in twenty years, these words, if not carried away by digital rot, will seem impossibly quaint and retrograde; readers will wonder what can have been wrong with my hidden layers.

(Speaking of bots, I recently wrote some short fiction about them; there are about fifteen tiny pieces which I plan to post here on Wednesdays until they run out. Normal posting will continue throughout, so if you don’t like Mrs Robb’s Bots, just ignore them.)