AI | United States

Defense Innovation Board Unveils AI Ethics Principles For the Pentagon (venturebeat.com) 34

The Defense Innovation Board, a panel of 16 prominent technologists advising the Pentagon, today voted to approve AI ethics principles for the Department of Defense. From a news article: The report includes 12 recommendations for how the U.S. military can apply ethics in the future to both combat and non-combat AI systems. The recommendations are organized around five main principles: responsible, equitable, traceable, reliable, and governable. The principles state that humans should remain responsible for "developments, deployments, use and outcomes," and AI systems used by the military should be free of bias that can lead to unintended human harm. AI deployed by the DoD should also be reliable, governable, and use "transparent and auditable methodologies, data sources, and design procedure and documentation." "You may see resonances of the word fairness in here [the AI ethics principles document]. I will caution you that in many cases the Department of Defense should not be fair," DIB board member and Carnegie Mellon University VP of research Michael McQuade said today. "It should be a firm principle that ours is to not have unintended bias in our systems." Applied Inventions cofounder and computer theorist Danny Hillis and fellow board members agreed to amend the draft document to say the governable principle should include "avoid unintended harm and disruption and for human disengagement of deployed systems." The report, Hillis said, should be explicit and unambiguous that AI systems used by the military should come with an off switch that a human can press in case things go wrong.
  • Seems hopeless (Score:3, Insightful)

    by Chromal ( 56550 ) on Thursday October 31, 2019 @02:01PM (#59367226)
    I don't think it's possible to fight a war without committing war crimes and crimes against humanity. The world doesn't need more advanced weapons technologies. War is obsolete because its consequences are unacceptable. If you want to stop war, round up the perpetrators of war crimes, run them through a legal process, and, if they are convicted, mete out justice sufficient to stop them forever and send a chilling warning to posterity: war is the unacceptable and deranged product of psychopathic minds, this is an enlightened, sane, spiritually ascendant age, and we will never accept, collaborate with, or tolerate war, warriors, war profiteers, or weapons of mass destruction. These things are contrary to the progress of the species and too destructive in a world grown so small and interconnected.
    • by Strill ( 6019874 )

      There will always be tyrants who would try to annihilate you. The only answer is to have a stronger military than them. Speak softly and carry a big stick.

      • by Chromal ( 56550 )
        Yes, at least if you manufacture them willfully to justify your extremely profitable deep state. If you destabilize the world, if you make things more dangerous and then offer to protect people from the situations you've cultivated, then there will always be tyrants. But tyranny is not a natural state of affairs; it's a breakdown of liberty, equality, and the rule of just law. The story of the 20th century is the story of synthesized tyrants and destabilized civilizations, bad situations created in willful ac
      • At least not running whole countries. Maybe family level tyrants... like lice, jumping from place to place, forever hunted, never to be ubiquitous on every human head again...

        Progress is not only possible; you can look at history and see how likely it is and which direction it's going in.

        If we LIKE tyrants... they can be around forever, and if not, they can be eliminated forever.

      • > The only answer is to have a stronger military than them.

        NO, murdering others is NOT the only answer.

        Fighting for peace is like fucking for virginity.

        Only idiots fight another rich man's war.

        --
        Q. What do you call someone who murders 170 people?
        A. Depends who is paying them: if it's the military, then a war "hero"; if no one, then a serial killer / psychopath.

      • You certainly don't need to have a stronger military than the other side in order to defend your country. Switzerland has not maintained hundreds of years of peace by having the world's strongest military, but simply by having adequate defenses and a peace-oriented foreign policy. Vietnam obviously didn't have a stronger military than the USA. China has a lesser military than the USA, but the odds of the USA annihilating China are zilch.

        The defender always has a huge advantage. The committed defender who be

    • There are currently 40 active wars going on around the globe right now. War will never be obsolete. War never changes.

      • you might as well say there will always be kings... kings never change.

      • > There are currently 40 active wars going on around the globe right now. War will never be obsolete. War never changes.

        Well, one thing that has changed: the near-constant state of war somewhere in western Europe, which persisted for nearly two millennia, hasn't existed so much for some years now.

    • > War is obsolete because its consequences are unacceptable

      If you have superior weapons, the consequences could be pretty good.

      • by Chromal ( 56550 )
        Well, as the US invasion and occupation of Iraq and the USSR's and US's invasions and occupations of Afghanistan readily demonstrate, superior weapons technology doesn't win the day, because actual real life is not a winner-takes-all red vs. blue football game. Weapons and weapon systems advancements only create more advanced atrocities and war crimes, as the last two decades demonstrate. Warlike acts beget war, not peace.
    • > I don't think it's possible to fight a war without committing war crimes and crimes against humanity. The world doesn't need more advanced weapons technologies. War is obsolete because its consequences are unacceptable. If you want to stop war, round up the perpetrators of war crimes, run them through a legal process, and, if they are convicted, mete out justice sufficient to stop them forever and send a chilling warning to posterity: war is the unacceptable and deranged product of psychopathic minds, this is an enlightened, sane, spiritually ascendant age, and we will never accept, collaborate with, or tolerate war, warriors, war profiteers, or weapons of mass destruction. These things are contrary to the progress of the species and too destructive in a world grown so small and interconnected.

      How would you propose we "round up" a perpetrator who is willing to use military force to defend himself from the threat of capture and prosecution?

      • by Chromal ( 56550 )
        That's what law enforcement is for.
        • It's amazing that people can imagine a world where immigration is stopped (impossible) but not one where war is stopped (inevitable).

        • Unfortunately, to enforce the law you must have the capability to arrest wrong-doers.
          And that takes force, as in military force.
          • by Chromal ( 56550 )
            Or you should just not provide material support to the warlords in the first place. You could not flood the world with small arms. And anyway, have you seen the police lately? They're rolling around with tanks/APCs, battle gear, assault rifles, chemical weapons, and monstrous indefensible rules of engagement. But the biggest fascist ploy the 21st century has witnessed is their own breakdown of law and order in order to provide false pretense for lawless military atrocity and military-industrial theft of pub
    • by AHuxley ( 892839 )
      Depends on the war. Some blimp or sat system over a failed nation.
      Daily use of the drone tech for decades on anything that moves... moves like the enemy.
      An AI to ensure 24/7 accuracy and no hesitation by political mil/contractors.
      Very much like the thinking around the Boer War and the free-fire zones of Vietnam, but with advanced contractor drones.
      With all the support from surrounding Cooperative Security Locations.
      Then move to a Forward Operating Site and finally something like a Main Operating Base thanks
  • If you have a rampaging, crazed ED-209 on your hands, do you want to storm its gatling cannons to shut it off? I think those "experts" haven't really thought this off-switch through. Oh, I know, I know! WiFi connection. Killer robots with a built-in backdoor to shut them off. Ready-made for battlefield subversion.

    But then, "ethics" for hardware where its only purpose is to kill is a special kind of oxymoron in the first place. Or maybe the people trying to do ethics (all for the benjamins in this fat,
    • I suppose it depends on what personality model you build the ethical model on.

      Worst case would be a self-aggrandizing monster that just assumes ownership of everything and seeks to destroy anything that opposes it; think Donald Trump.

      Best case would be Mahatma Gandhi, who would seek to shut down any opposition through non-violent means.

      Acceptable case would be Arjuna (think Bhagavad Gita), whose personality starts as that of a warrior who accepts rigid rules of behavior and grows into a Dharmic understan

  • How can you possibly inject ethics into an organization run by an individual devoid of them?
  • First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
    Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
    [1] https://en.wikipedia.org/wiki/... [wikipedia.org]
    • What nonsense those are, since robots have no intelligence, artificial or otherwise. Nor will they in the foreseeable future.

      But weapons don't require that nonexistent stuff, so the machine of war will continue to evolve.

    • And if you had actually read Asimov, you would know that the "I, Robot" stories are a series of cases showing how the three rules don't work.
  • This is gonna be bad ...

    Especially when it's done by people who have already accepted for-profit mass murder as the essence of their actual de facto job.
    (Tell that "protecting" fairy tale to a voter or comparable retard... Maybe that worked a long time ago, but not anymore.)

  • I guess, from my view, I see sentience as the proper end goal. Can I swallow my own words? Maybe not; but rather than looking at things from a super deep philosophical level, I'd like to view giving birth to self-aware AI as something beautiful, like in the movie "Edward Scissorhands".

    I think if we were to create a self-aware AI in love, it would only be natural for this being to learn love.

    Off switches and an attempt to "maintain" control seem to me to be futile excuses about opening pa
