Humanities & Science (short & simple)

This can't be original, but it also seems weirdly not widely understood.  The version here came out of talking to my friend, literature prof Alison Waller, some weeks ago now – and it informed things I said at the IJCAI AI ethics panel.  (I actually wrote this post just after the panel and before I wrote the panel post, but then sat on posting it because Pinker fluffed the same issue just then.)

Science (by which I mean to include engineering) is descriptive; it's about understanding the present and past, and predicting what can happen in the future.  The Humanities are about giving meaning to the present and choosing what should happen in the future.  That is, the humanities can be normative, but science and engineering cannot.

This is not to say there's no overlap.  There has to be overlap, because science is done by humans who need meaning and are guided by norms.  Also, science makes choosing norms easier, or at least more powerful – by allowing us to recognise and understand the possibilities of our actions and their consequences.  But science in itself provides data, not motivation, so it cannot on its own say what needs to be done.  Once a goal is chosen, science can say what the most likely way to achieve it is, and that likelihood can inform the choice of goal.  But the choice itself requires more than data – it requires values.  These derive from the humanities.

Addendum (31 October 2013)

I figured out at least one source of the above, though for the general Humanities substitute Theology.  It's the Anne Foerst article "Robots and Theology", which unfortunately doesn't appear to be online, but I quote the relevant section in my commentary that was published with it, Building Persons Is a Choice.

After quoting Foerst I go on to say:
It is possible that science cannot provide meaning. This depends on some axioms that it is not essential to this commentary to debate. Regardless of these, science can and should provide an explanation for why meaning is so often an essential motivation for human adults. This is a basic question of human psychology and human behaviour. 
The rest of the essay goes on to talk about the human condition and how this relates to our issues with robot ethics.
Edited for clarity July 2014
