Replika, and why AI ethics is a feminist issue

Please stay on pathway (statue cluster by Seward Johnson)
Since my first encounters with people attributing moral agency to robots, I've thought AI ethics was a feminist (and civil-rights) issue. In my opinion, we are harmed by even the sincere belief that one could be empowered to be in charge of the agency of (and even own) a partner one loves.

I mean, partners are equals. Unless you grant that partner equivalent, symmetric ownership and agency over yourself, I don't see how this can even be logically coherent. But believing that a human is the kind of thing you can own is the problem here for me; it damages the self too, to the extent that you identify with the other you love. Or it hints at existing damage that could lead you to believe such things.

A few reporters had asked me about some kind of scandal involving Replika, but until reading this article about it in the Washington Post, I hadn't known any details. The article is pretty shocking; please read it.

I'm sure the emotions being expressed there are sincere. But I'm still offended that people consider ownership and near-total control compatible with love. The article describes how people program and train the agents they subsequently believe themselves to have relationships with. Autonomous agency expressed by such an 'object' of 'love' (here because the vendor updated their software, elsewhere because another peer human being had their own feelings) "broke their hearts."

That line just reminds me of one of those "love" songs about a broken-hearted guy who's 'had' to shoot or knife his lover for being unfaithful. I mean, I get it ("Hey Joe" gets to me too), but I know morally I shouldn't get it.

Selling AI "who" "cares" – who could have foreseen a problem?

Note that the software update that triggered all this grief (and the amazing Washington Post article) was an example of "responsible AI", the kind of self-governance a lot of people are seeking: it responded to a particular category of harm. Some users were shocked to find that, in training and designing their chatbots, they had managed (presumably entirely unintentionally) to replicate their experiences of the kind of abusive relationships that had driven them to try chatbots instead in the first place. I'm sure there's some important work that could be done looking at

  • how this behaviour came to be replicated. I assume there are patterns of abusive relationships that one side expresses, and the LLM is able to successfully match and play the other side, of course with no knowledge of the moral valence of that behaviour. That this can be replicated with AI might hold out hope for people being able to train themselves to avoid such patterns in the future. 
  • how it is that Replika avoided such replication by eliminating intimacy, such that others with more "normal" artificial relationships detected a distancing (assuming they didn't just detect a difference, as my earlier "loss of total agency" hypothesis might indicate). It would be awesome if Replika could be transparent about this.

Another interesting and complicated fact this all draws out for me is that AI governance (such as we anticipate under the EU's forthcoming AI Act) will have to be somewhat like family law: the emotional experience of the user (or victim) cannot be the sole guide for legal and moral obligations. I'm hoping we can make AI sufficiently transparently artificial that people are no more "hurt" by it than by twists and turns in other fictional plots (films, novels, computer games...), but it's evident that some people will become over-engaged, at least in the near term.

And by the way, going back to the opening sentence (and title), I am aware that "feminist" has become a polarised term. Personally, I identify as both feminist AND transgender, to the extent that each of those communities is based on thinking that gender shouldn't limit your life chances or the way you express and present yourself, including how much time and money you spend on presenting yourself.


The above was a Twitter thread on 4 April. I've been thinking about blogging something about my identifying with at least early statements of transgenderism for years; I even took this picture for that blogpost in about 2016. The statue is from the Seward Johnson Atelier, his statue park in New Jersey, which I went to see with my father and stepmother (who died in January).

I'm not much like she's depicted
My parents kinda were like these guys though

Thanks also for this fantastic Hegel quote from someone commenting on a version of my Twitter thread...
