The Future Air War
I think the people creating these machines may need to read up on their Isaac Asimov, in particular his three laws of robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Still, as long as nobody comes up with an idea called SkyNet, I guess we'll be fine.
Key points
Lethal autonomous robots are pre-programmed to kill
They can make their own decisions and do not need to be controlled by humans.
For the sake of argument: aren't simple landmines "pre-programmed to kill"? They sense, make a pre-programmed decision, and kill.
No man in the loop.
(I know, they only move at the speed of continental drift but speed wasn't mentioned as a requirement for this new "controversy".)
By no means an expert, but I suspect not, balsa. Once activated, a landmine can do nothing but detonate. There is no real intelligence there: for argument's sake, it could not differentiate between the footfall of a legitimate target and that of a non-legitimate target. It does not make decisions based on ROE; it cares not whether the enemy is retreating, carrying casualties, or whether a ceasefire has been declared.
Salute!
Maybe the "zeroth" law, Bald?
See Asimov's last book, Foundation and Earth, and Daneel's ultimate mission. Great read.
Interestingly, it was a robot that developed the "zeroth" law. I would hope today's computer geeks could embed that in the chips for the AI stuff coming to a theater near you.
Gums sends...