https://www.facebook.com/RTDocumentary/videos/1819508171439117/

Are you for or against military use of AI to kill humans without human oversight?
Yes -- it is okay (please say why)
Yes -- But AI must obey Asimov's 3 rules of robotics
No -- AI has no restraints
Do not necessarily attribute someone's nasty or inappropriate actions to intent when they may be explained by ignorance or stupidity.
July 14, 2018, 10:54 AM
Rustyblade
I'd appreciate it if someone could help me open the video. ------------
Thank you, cheni.
Do not necessarily attribute someone's nasty or inappropriate actions to intent when they may be explained by ignorance or stupidity.
July 14, 2018, 11:09 AM
Sig2340
Can we have a choice of No without Asimov's bleating attached?
Nice is overrated
"It's every freedom-loving individual's duty to lie to the government." Airsoftguy, June 29, 2018
July 14, 2018, 11:15 AM
Rustyblade
quote:
Originally posted by Sig2340: Can we have a choice of No without Asimov's bleating attached?
Good point. Thank you.
Do not necessarily attribute someone's nasty or inappropriate actions to intent when they may be explained by ignorance or stupidity.
July 14, 2018, 11:17 AM
konata88
With the current state of human intelligence and competence, NO to AI doing anything that might harm humans (intentionally or not) without oversight.
"Wrong does not cease to be wrong because the majority share in it." L.Tolstoy "A government is just a body of people, usually, notably, ungoverned." Shepherd Book
July 14, 2018, 11:36 AM
gearhounds
I don't have a problem with autonomous vehicles attacking and destroying military vehicles during combat, whether there are combatants inside or not. But any device targeting individuals must have direct oversight.
On another note, I am shocked that there has not yet been some kind of terror attack utilizing a drone. It would be a simple matter to load a large, weight-bearing drone with a toxic substance, biological weapon, or radioactive material, or to pack one with explosives and fly it into the flight path of a jet on approach. The variety of possible attack scenarios is endless.
“Remember to get vaccinated or a vaccinated person might get sick from a virus they got vaccinated against because you’re not vaccinated.” - author unknown
July 14, 2018, 11:39 AM
sigmonkey
"Once the rock has left your hand, the only oversight you have is as a spectator."
-Ook the caveman
"the meaning of life, is to give life meaning" ✡ Ani Yehudi אני יהודי Le'olam lo shuv לעולם לא שוב!
July 14, 2018, 11:49 AM
BBMW
I think what we might say has no bearing on what will actually be done.
July 14, 2018, 11:55 AM
cheni
----------------- Silenced on the net, Just like Trump
July 14, 2018, 01:02 PM
redleg2/9
quote:
...But AI must obey Asimov's 3 rules of robotics
In case you have never read them:
"A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law." .
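For the programmers in the thread, the three laws are really just a strict priority ordering. Here is a minimal sketch of that ordering in Python, purely illustrative -- the Action record, its fields, and the permitted() check are invented for this post, not any real robotics API:

from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool       # would carrying this out injure a human being?
    ordered_by_human: bool  # was it ordered by a human?
    risks_self: bool        # does it endanger the robot itself?

def permitted(action: Action) -> bool:
    # First Law outranks everything: never injure a human being.
    if action.harms_human:
        return False
    # Second Law: obey human orders (First Law conflicts are already excluded above).
    if action.ordered_by_human:
        return True
    # Third Law, lowest priority: preserve its own existence.
    return not action.risks_self

# An ordered strike that would injure a human is refused, because the
# First Law outranks the Second:
print(permitted(Action(harms_human=True, ordered_by_human=True, risks_self=False)))   # False
# A human order that merely endangers the robot itself is obeyed:
print(permitted(Action(harms_human=False, ordered_by_human=True, risks_self=True)))   # True

Note what a sketch like this leaves out: the "or, through inaction, allow a human being to come to harm" clause is a constraint on everything the robot is not doing, so it cannot be reduced to a per-action check at all -- and that is where most of the practical trouble starts.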
“Leave the Artillerymen alone, they are an obstinate lot. . .” – Napoleon Bonaparte
Are you kidding me? Just the fact that you linked to the right arm of the beast freaks me out!
_____________________________________________________ Sliced bread, the greatest thing since the 1911.
July 14, 2018, 01:36 PM
sigfreund
Asimov was brilliant, as were his Robot books, but although his three laws impressed the ever-lovin’ out of me as a kid, others pointed out later that even if it were possible to enforce them faithfully in AI creations, that wouldn’t guarantee things would turn out the way we humans would prefer. One of the most obvious issues is the “or, through inaction, allow a human to come to harm” clause. An AI might decide that countless things we humans do cause harm to ourselves: owning weapons, eating the wrong foods, releasing pollutants into the environment, engaging in risky activities such as riding motorcycles, etc. And because preventing those harms would require rigorous control over our behavior, that is exactly what an AI bound by the First Law would have to impose: countless restrictions on our free will.
If the three laws were followed, it would of course be impossible for an AI to serve as a weapon except, possibly, against other robots.
► 6.0/94.0
I can tell at sight a Chassepot rifle from a javelin.
July 14, 2018, 02:00 PM
arcwelder
I welcome our Robot overlords.
I'm getting sick of people and their "feelings" being so fucking important.
Arc. ______________________________ "Like a bitter weed, I'm a bad seed"- Johnny Cash "I'm a loner, Dottie. A rebel." - Pee Wee Herman Rode hard, put away wet. RIP JHM "You're a junkyard dog." - Lupe Flores. RIP
July 14, 2018, 02:09 PM
Voshterkoff
I'm sure South Korea wouldn't mind the Samsung automated gun platform gunning down anything in sight if its northern neighbor were to march across the DMZ.
July 14, 2018, 02:29 PM
mrw
Let them loose on our foes. I have worked in robotics for 30 years; with the proper rules in place, there is nothing to worry about. Don’t fall for the Hollywood Terminator BS.