I just got a robocall from a neighbouring police department asking me to
search my "vehicles and outbuildings" for a specific, named, 16-year-old
who went missing less than 11 hours ago. I'm wondering if he's going to
wake up from his first night with a partner to be mortified, or if he's
actually known to have done something wrong or to be in trouble.
The robocall clearly had slots built in for the description, age, name, last location etc. of the missing person, and was indeed targeted to the location (the last known position of the kid was much nearer our house than the police department). You could imagine this being very useful in an emergency, but you could also imagine people insisting it be used against people practising ordinary levels of autonomy.
I in fact just had searched my yard because a tradesman came looking for a rope he might have left here, but I think I would have looked anyway in case there was a kid really in trouble. Still, I feel that it's wrong to be mobilised without knowing the reason for the mobilisation. It means I'm being used just as an instrument of the state, not as an agent constituting and contributing to the state.
This is after all the entire point of my stance on AI: that we are ethically obliged to keep artefacts just as intelligent instruments, things that sense and act for us, but that have no intrinsic motivation to compete with humans for goals such as self-worth or self-expression. This is not to say that robots could not be agents of creative expression, but that the selves they express should be human selves, human constructions.
It's perplexing to me that some vociferous members of our society are demanding that even robots not be "reduced to" (actually, maintained as, but these members say "reduced to") mere instruments of human moral subjectivity, while at the same time AI is actually facilitating the reduction of humans to mere instruments. I realise that the reason I was not fully informed about the context I was being absorbed into was probably to protect the 16-year-old's privacy (well, part of it; I already know his name, description, and last location), but also to save myself time, and possibly to keep me unbiased if there's a need for a jury. But still, I know enough about automation to be conflicted about knowingly engaging in reducing my own autonomy.
One of our outbuildings is rather transparent.