Clones should NOT be slaves

More progress, I think, in my attempt to untangle the confusion behind why people think robots are moral patients. You should read "Robots are more like novels than children" first. Below I respond to a question posted on the Machine Question Facebook page about the original robots, from the Czech play Rossum's Universal Robots, asking whether we should distinguish "robots" from "androids".

There's a continuum of how much a robot is like a human; robots will always be more human than stones in that they sense, compute, and act, but they need not be anthropomorphic. However, as much as I hate elevating fiction to the same conversation as real engineering: if you are looking at RUR or Blade Runner, you are positing modified clones. I would not call these "robots". Such beings would clearly inherit the problems apes have evolved around subordinance, so owning them would be unethical and permanently damaging to them. Clones should NOT be slaves.
The last line alludes to my old talk and book chapter Robots Should Be Slaves. I originally used that title to clarify that it is obligatory not to build robots to which we would have obligations, and that robots should not be human-like because they will necessarily be owned. However, I took it for granted that humanity has agreed that humans cannot be slaves, and assumed I was transparently asserting robots' inhumanity. In fact, I realise now that you cannot use the term "slave" without invoking its human history, so I don't use that line in talks any more. But the founder of the Machine Question page, Dave Gunkel, keeps bringing that paper up, so I felt safe alluding to it here.

Update 31 July 2016, for the Westworld controversy: I don't need to say anything, just direct you to "Why do we give robots female names? Because we don't want to consider their feelings." Laurie Penny is wrong to assume that AI will also necessarily suffer from emotional neglect, but right to think AI ethics is a feminist issue.

Update 9 March 2016: I had a big discussion with someone on Twitter yesterday about this. Chris was giving a pretty standard futurist / transhumanist line: that there was no way it would be OK to call a sentient robot a "robot" or a "slave"; those terms were too demeaning. By "sentient" he meant basically "moral subject", and yes, I totally agree we shouldn't own moral subjects. But there is also a political and economic reality: if we manufacture something, we will own it. So we shouldn't manufacture moral subjects, which means we should choose to build robots that lack the critical traits of moral subjects. But simple intelligence (connecting perception to action) does not in itself make something a moral subject.