Defense Innovation Board Unveils AI Ethics Principles For the Pentagon (venturebeat.com)
The Defense Innovation Board, a panel of 16 prominent technologists advising the Pentagon, today voted to approve AI ethics principles for the Department of Defense. From a news article: The report includes 12 recommendations for how the U.S. military can apply ethics in the future for both combat and non-combat AI systems. The recommendations are organized around five main principles: responsible, equitable, traceable, reliable, and governable. The principles state that humans should remain responsible for "developments, deployments, use and outcomes," and AI systems used by the military should be free of bias that can lead to unintended human harm. AI deployed by the DoD should also be reliable, governable, and use "transparent and auditable methodologies, data sources, and design procedure and documentation." "You may see resonances of the word fairness in here [AI ethics principle document]. I will caution you that in many cases the Department of Defense should not be fair," DIB board member and Carnegie Mellon University VP of research Michael McQuade said today. "It should be a firm principle that ours is to not have unintended bias in our systems." Applied Invention cofounder and computer theorist Danny Hillis and board members agreed to amend the draft document to say the governable principle should include "avoid unintended harm and disruption and for human disengagement of deployed systems." The report, Hillis said, should be explicit and unambiguous that AI systems used by the military should come with an off switch for a human to press in case things go wrong.
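The "governable" principle Hillis describes, a deployed system with an off switch a human can press at any time, can be sketched as a minimal human-in-the-loop wrapper. This is purely illustrative: the class and method names are hypothetical and nothing here comes from the DIB report itself.

```python
import threading

class GovernableSystem:
    """Toy sketch of the 'governable' principle: every action is gated
    on a human-controlled engagement flag, so an operator can disengage
    the system at any moment. All names are hypothetical."""

    def __init__(self):
        self._engaged = threading.Event()
        self._engaged.set()  # system starts engaged

    def disengage(self):
        """The human-operated off switch."""
        self._engaged.clear()

    def act(self, action: str) -> str:
        # Refuse any action once a human has pulled the plug.
        if not self._engaged.is_set():
            return "disengaged: action refused"
        return f"executed: {action}"

system = GovernableSystem()
print(system.act("patrol"))   # executed: patrol
system.disengage()            # human presses the off switch
print(system.act("patrol"))   # disengaged: action refused
```

Using a `threading.Event` (rather than a plain boolean) keeps the off switch safe to flip from a separate operator thread while the system is running.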
Seems hopeless (Score:3, Insightful)
Re: (Score:1)
There will always be tyrants who would try to annihilate you. The only answer is to have a stronger military than them. Speak softly and carry a big stick.
There will not always be tyrants (Score:1)
At least not running whole countries. Maybe family level tyrants... like lice, jumping from place to place, forever hunted, never to be ubiquitous on every human head again...
Progress is not only possible; you can look at history and see how likely it is and which direction it's going in.
If we LIKE tyrants... they can be around forever, and if not, they can be eliminated forever.
Re: (Score:3)
> The only answer is to have a stronger military than them.
NO, murdering others is NOT the only answer.
Fighting for peace is like fucking for virginity.
Only idiots fight another man's rich war.
--
Q. What do you call someone who murders 170 people?
A. Depends on who is paying them: if the military, then a war "hero"; if no one, then a serial killer / psychopath.
Re: (Score:2)
You certainly don't need to have a stronger military than the other side in order to defend your country. Switzerland has not maintained hundreds of years of peace by having the world's strongest military, but simply by having adequate defenses and a peace-oriented foreign policy. Vietnam obviously didn't have a stronger military than the USA. China has a lesser military than the USA, but the odds of the USA annihilating China are zilch.
The defender always has a huge advantage. The committed defender who be
Re: (Score:3)
Switzerland has maintained hundreds of years of peace by having a small piece of mostly meaningless real estate.
Re: (Score:2)
There are currently 40 active wars going on around the globe right now. War will never be obsolete. War never changes.
war is constantly changing (Score:1)
you might as well say there will always be kings... kings never change.
Re: (Score:2)
There are currently 40 active wars going on around the globe right now. War will never be obsolete. War never changes.
Well, one thing that has changed: the near-constant state of war somewhere in western Europe that held for nearly two millennia hasn't existed for some years now.
Re: (Score:2)
War is obsolete because its consequences are unacceptable
If you have superior weapons, the consequences could be pretty good.
Re: (Score:2)
Those weren't real wars. In a real one, they would have dropped some nukes.
Re: (Score:2)
A country which drops nukes without a very good reason will be economically strangled by sanctions from the rest of the world, and nuked into oblivion if it resists that with violence. Even the USA, yes. Nobody wants to be next on the hit list.
Re: (Score:1)
I don't think it's possible to fight a war without committing war crimes and crimes against humanity. The world doesn't need more advanced weapons technologies. War is obsolete because its consequences are unacceptable. If you want to stop war, round up the perpetrators of war crimes, run them through a legal process, and if convicted mete out justice sufficient to stop them forever and send a chilling warning to posterity that war is the unacceptable and deranged product of psychopathic minds, and this is an enlightened, sane, spiritually ascendant age and we will never accept, collaborate with, or tolerate war, warriors, war profiteers, or weapons of mass destruction. These things are contrary to the progress of the species and too destructive in a world grown so small and interconnected.
How would you propose we "round up" a perpetrator who is willing to use military force to defend himself from the threat of capture and prosecution?
exactly (Score:1)
it's amazing that people can imagine a world where immigration is stopped (impossible) but not where war is stopped (inevitable)
Re: (Score:2)
And that takes force, as in military force.
Re: (Score:2)
Daily use of drone tech for decades on anything that moves... moves like the enemy.
An AI to ensure 24/7 accuracy and no hesitation by political mil/contractors.
Very much like the thinking around the Boer War, or the free-fire zones of Vietnam, but with advanced contractor drones.
With all the support from surrounding Cooperative Security Locations.
Then move to a Forward Operating Site, and finally something like a Main Operating Base.
How to get to the off switch? (Score:1)
But then, "ethics" for hardware where its only purpose is to kill is a special kind of oxymoron in the first place. Or maybe the people trying to do ethics (all for the benjamins in this fat,
Re: (Score:2)
I suppose it depends on what personality model you build the ethical model on.
Worst case would be a self-aggrandizing monster that just assumes ownership of everything and seeks to destroy anything that opposes it; think Donald Trump.
Best case would be Mahatma Gandhi, who would seek to shut down any opposition through non-violent means.
An acceptable case would be Arjuna (think Bhagavad Gita), whose personality starts as that of a warrior who accepts rigid rules of behavior and grows into a Dharmic understanding
void at the top (Score:2)
Re: (Score:2)
You're talking about Google?
Three basic rules. (Score:1)
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
[1] https://en.wikipedia.org/wiki/... [wikipedia.org]
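The Three Laws are essentially a strict priority ordering over rules, which can be sketched as a toy checker. This is illustrative only (the `Order` fields are hypothetical); real systems cannot evaluate "harm" this cleanly, which is precisely what Asimov's stories explore.

```python
from dataclasses import dataclass

@dataclass
class Order:
    """A command given to the robot, with (hypothetical) flags that a
    real system could never actually compute reliably."""
    action: str
    harms_human: bool = False     # would executing this injure a human?
    endangers_robot: bool = False # would executing this destroy the robot?

def evaluate(order: Order) -> str:
    """Apply the Three Laws as a strict priority ordering:
    First Law > Second Law > Third Law."""
    if order.harms_human:
        return "refused: First Law"  # never injure a human
    if order.endangers_robot:
        # Second Law (obey humans) outranks Third (self-preservation).
        return "obeyed: Second Law overrides Third"
    return "obeyed"

print(evaluate(Order("fetch coffee")))              # obeyed
print(evaluate(Order("attack", harms_human=True)))  # refused: First Law
```

The interesting failure mode, as the stories show, is not the priority logic but the impossibility of computing the `harms_human` flag correctly.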
Re: (Score:2)
What nonsense those are, since robots have no intelligence, artificial or otherwise. Nor will they in the foreseeable future.
But weapons don't require that nonexistent stuff, so the machine of war will continue to evolve.
Re: (Score:2)
Asimov's own stories are a series of cases showing that the three rules don't work.
The USA "innovating" in ethics ... (Score:2)
This is gonna be bad ...
Especially when it is done by people who have already accepted for-profit mass murder as the essence of their actual de-facto job.
(Tell that "protecting" fairy tale to a voter or someone comparably gullible. Maybe it was true a long time ago. But it hasn't been for a long time.)
A Big Philosophical Debate To Be Sure... (Score:1)
I think if we were to create a self-aware AI in love, it would only be natural for this being to learn love.
Off switches and attempts to "maintain" control seem to me to be futile excuses about opening Pandora's box