Quick question: if a Terminator traveled back in time and accidentally spilled hot coffee in your lap, who would you sue? The Terminator or Skynet?
Tricky question, and one that European lawmakers are wrestling with right now in 2018.
The issue stems from a report from the European Commission, released in early 2017, which suggests creating a “legal status for robots in the long run” so they could be held “responsible for making good any damage they may cause”.
It’s a single line in a lengthy report, but it’s been deemed important enough for 156 artificial intelligence experts to write an open letter denouncing the suggestion. According to the letter, there are a number of reasons why assigning (what the report calls) “electronic personality” to robots is a bad idea.
To begin with, it could remove liability from the companies that create robots. Secondly, according to the letter, we’d have to grant robots “the right to remuneration or the right to citizenship”, something that could conflict with the Charter of Fundamental Rights of the European Union and the Convention for the Protection of Human Rights and Fundamental Freedoms.
The open letter claims that the original European Commission report was “distorted by Science-Fiction” and “an overvaluation of the actual capabilities of even the most advanced robots”.
In short, we’re hardly at Blade Runner levels. The Terminator isn’t going to be spilling coffee in your lap any time soon.
The recommendation of the 156 AI experts putting their name to this open letter is pretty clear: Protect human beings at all costs.
“The European Union must prompt the development of the AI and Robotics industry insofar as to limit health and safety risks to human beings,” the letter said. “The protection of robots’ users and third parties must be at the heart of all EU legal provisions.”
Hard agree. We’re not quite ready for Judgment Day just yet.