On killer robots
The Guardian's editorial of 14 April 2014 ("Weapons systems are becoming autonomous entities. Human beings must take responsibility") argued that killer robots should always remain under human control, because robots can never be morally responsible. The paper kindly published my reply, which pointed out that this may cease to be true if and when we create machines whose cognitive abilities match or exceed those of humans in every respect. Surveys indicate that around 50% of AI researchers think that could happen before 2050. But long before then we will face other dilemmas. If wars can be fought by robots, would that not be...