
Military Documents Reveal How the US Army Plans To Deploy AI In Future Wars (thenextweb.com) 141

In a just-released white paper, the Army describes how it's working to make a battlefield network of machines and humans a reality. The Next Web reports: "Most of such intelligent things will not be too dissimilar from the systems we see on today's battlefield, such as unattended ground sensors, guided missiles (especially the fire-and-forget variety) and of course the unmanned aerial systems (UAVs)," reads the paper. "They will likely include physical robots ranging from very small size (such as insect-scale mobile sensors) to large vehicles that can carry troops and supplies. Some will fly, others will crawl or walk or ride."

The paper was authored by Dr. Alexander Kott, chief of the Network Science Division of the Army Research Laboratory. It outlines the need to develop systems that augment both machines and people in the real world with artificially intelligent agents to defend the network: "In addition to physical intelligent things, the battlefield -- or at least the cyber domain of the battlefield -- will be populated with disembodied, cyber robots. These will reside within various computers and networks, and will move and act in cyberspace."

Kott takes pains to underscore the fact that the AI powering U.S. war efforts will need to be resilient in ways that today's AI simply isn't. He states: "The intelligent things will have to constantly think about an intelligent adversary that strategizes to deceive and defeat them. Without this adversarial intelligence, the battle things will not survive long enough to be useful." Ultimately, aside from outlining what the future battlefield will look like, the paper's conclusion is either disappointing or a giant relief, depending on your agenda: "Clearly, it is far beyond the current state of AI to operate intelligently in such environments and with such demands. In particular, Machine Learning -- an area that has seen dramatic progress in the last decade -- must experience major advances in order to become relevant to the real battlefield."

  • by phantomfive ( 622387 ) on Tuesday April 03, 2018 @12:02AM (#56370877) Journal
    In other words, the paper is just science fiction, trying to guess the future. Current capabilities are not enough.
    • by AHuxley ( 892839 )
It's just a return to free fire zones https://en.wikipedia.org/wiki/... [wikipedia.org] and what the UK did during the Boer war.
      https://en.wikipedia.org/wiki/... [wikipedia.org].
      ie "anyone unidentified is considered an enemy combatant."
      ie "a scorched earth policy of destroying Boer farms and moving civilians into concentration camps."

But with robots who always obey. Can a cute robot really commit a war crime?
    • Comment removed based on user account deletion
      • Nonsense. The future of ground combat is EM & EMP warfare. Because at some point, someone is going to say "Enough is enough."

We've had "spider mines" as anti-tank mines for 30 years ...
        Just saying.

Or, how about a swarm of drones that inject a substance via needle that either debilitates an enemy combatant, or outright paralyzes them to death. That, ladies and gentlemen, is the future of ground combat.
        That is most likely true ...

      • Injections can take time to incapacitate the target. These don't. [youtu.be]
    • It sure sounded like something from "Starship Troopers" (the book, not the movie - let's not go there today).

    • by syn3rg ( 530741 )
      Skynet smiles.
    • This is old news, at least 30 years old, probably older. The Armed Forces have been trying to use advanced computation and AI for a long, long time.

Are we done yet? Even if it doesn't take out the processors, it will blind them.

    • Re:EMP EMP (Score:4, Insightful)

      by djinn6 ( 1868030 ) on Tuesday April 03, 2018 @01:01AM (#56371027)
      You have a poor understanding of EMP if you think it can take out a 10-foot wide UAV. Commercial jets run into EMP all the time when they get hit by lightning and nothing happens to them.

      As for blinding them, yes, it might have an effect on radio or radar, but the current crop of UAVs have visible light cameras. Meanwhile, whoever's fighting them will lose radio too, so it's not really an advantage.
      • by Revek ( 133289 )

If the shit really hits the fan, Trump will let the nukes fly. My understanding of EMP is just fine.

  • Didn't they learn anything from the Butlerian Jihad?
  • Translation (Score:4, Insightful)

    by b0s0z0ku ( 752509 ) on Tuesday April 03, 2018 @12:23AM (#56370951)
    Translation: how to better use technology to end human lives and mutilate fellow humans instead of improving human lives. It's unfortunate that a lot of new technology is first used to murder and maim.
What I find limiting is that "3 Laws Safe" is not even being considered. Battle should not be about increased KIAs, but about helping those that need help.
      • by dcw3 ( 649211 )

What I find limiting is that "3 Laws Safe" is not even being considered. Battle should not be about increased KIAs, but about helping those that need help.

That only works if all sides play by that rule. If one side doesn't, then the others lose.

      • However, consider the Solarian robot military initiative in "The Naked Sun". If Solaria had all-robot ships, and could convince the robots that other warships didn't have humans, they'd get a really big military advantage.

    • Re:Translation (Score:4, Insightful)

      by hcs_$reboot ( 1536101 ) on Tuesday April 03, 2018 @02:02AM (#56371159)
Yes, let's stop designing high-end weapons. We can rest assured that Russia and China will also stop working on them right away (btw, motorbikes might also be used to carry people, rather than to frighten them with gangs of riders making lots of noise ;-)
    • Re:Translation (Score:4, Interesting)

      by Nidi62 ( 1525137 ) on Tuesday April 03, 2018 @09:38AM (#56372285)

      Translation: how to better use technology to end human lives and mutilate fellow humans instead of improving human lives. It's unfortunate that a lot of new technology is first used to murder and maim.

One of the first uses I see for "AI" (which in this implementation isn't even true, "real" AI) is semi-autonomous heavy load-bearing equipment. Think back to 2001: US special forces operating in Afghanistan had to use mules to help them move equipment over mountainous terrain. Imagine a 4- or 6-legged robot that can follow a patrol and carry supplies, ammunition, wounded soldiers, etc. With 6 legs and an articulated front segment, it should be able to cross just about any terrain a person could reasonably cross. Boston Dynamics has tried a few things, but the technology is not quite there yet. Also, they keep focusing on making them look like dogs or horses, while I picture something more like an armored pickup with legs, with the "cab" holding the computers, sensors, batteries, etc. and a bed in the back for the cargo. Of course, it's the military, so if it's big enough you can mount a manned machine gun/Mk 19 grenade launcher on top and you have instant LAV/gun truck support where normally a wheeled vehicle couldn't go.

      • Re: (Score:2, Interesting)

        by b0s0z0ku ( 752509 )
        The "safer" war becomes for US soldiers, the more the US can act with impunity. I'm actually not for making war safe from things like IEDs -- the only way that the public won't support war is if war is dangerous, slow, and unpleasant. Keep war hellish.
        • Which would mean that you're against artillery, ground attack planes, guns, and everything back to and including bows.

      • Although this doesn't quite apply to mules, it's close -

        "Horses can make other horses, that is a trick that tractors haven't learned yet" - Heinlein

Horses and mules are a hell of a lot cheaper than Boston Dynamics critters, and for quite some time they will be quieter, more flexible, and have more endurance than robots. 'Horses for courses' - use what works.

        • by Nidi62 ( 1525137 )
          Horses don't funnel more and more money to the military industrial complex, and horse breeders aren't big contributors to campaigns nationwide.
    • by Anonymous Coward

      How many human lives were saved because the allies won WW2? How many human lives were improved because the US won the cold war?

You, and your mods, are too idealistic to recognize that having better war technology does improve the lives of a nation's citizens and, in many cases, the lives of others across the world.

WW2 was the last just war that the US was involved in. The wars since then have just been money-wasting homicide sprees that we often enough didn't even win.
    • Pacifism is a legitimate philosophical viewpoint, and I respect pacifists. However, lots of people aren't pacifists.

      If you're not a pacifist, you probably agree that we need to have weapons. If we do, we may as well make them good weapons. When we need weapons, it's usually not a good idea to aim for second best. It can be more humane to have highly effective weapons, as they can shorten a war and lessen the suffering. There are weapons that cause suffering out of proportion to their military value,

Obligatory XKCD: https://www.xkcd.com/1968/ [xkcd.com]

      (the number of the strip is interesting, BTW)
  • by Archfeld ( 6757 ) <treboreel@live.com> on Tuesday April 03, 2018 @12:55AM (#56371005) Journal

    https://www.youtube.com/watch?... [youtube.com]

    I am more worried about how the Army employs basic intelligence than I am what they will do with artificial intelligence...

    • With AI? Imagine the software from Uber autonomous cars, driving nuclear devices.
  • by Anonymous Coward

    The intelligent things will have to constantly think about an intelligent adversary that strategizes to deceive and defeat them.

So it's like autonomous cars, except some of the pedestrians and other drivers and things the sensors can't even detect are trying to kill you. Well, that should be easy. Then take a vehicle that's survived a combat environment and has adapted and enhanced its algorithms, and put it back in a civilian environment. It'll do the AI equivalent of PTSD and start drinking methanol heavily, freaking out when it hears its own backfire and running down moms with strollers.

  • The time is now (Score:1, Insightful)

    by Anonymous Coward

    We must dissent

  • by djinn6 ( 1868030 ) on Tuesday April 03, 2018 @01:47AM (#56371119)
    Consider a future where everyone who wants a drone army can have one. Governments, private individuals, terrorists and so on. Governments will have the largest and most capable drones. They will win any head-on fight through more advanced AI and sheer numbers. Private individuals might employ them for self-defense. A handful of them situated around their properties can deter criminals. Finally, terrorists would use them in place of human suicide bombers.

    But how is that any worse than today?

Governments already have armies capable of controlling their respective people. If not in America, then at least in the rest of the world. Private individuals already have guns at home. As for terrorist attacks, it'll actually be much harder to carry them out. Armed guards stationed at every crowded venue are impractical for law enforcement, but a handful of drones is cheap. A terrorist would have to get past law enforcement just as they have to today, but that law enforcement is going to be much more vigilant and will react instantly to any threat.

As for a war between nations, the biggest and most dangerous weapons, nuclear ICBMs, have been autonomous since the 1960s and that hasn't changed at all. Neither have countermeasures, since they were invented. Can a relatively large drone carry a nuclear weapon? Yes, but it's going to be much easier to defend against given its slow speed and limited range. Swarms of them flying towards you will be easy pickings for medium- or short-range nuclear missiles, while stragglers will be handled by your own drones or anti-air missiles. Any opponent wanting to get the upper hand would still need to counter your nuclear missiles first.

    In other words, having drones won't change the landscape of war. They're no better than ICBMs at getting past enemy defenses, much less damaging than nuclear explosions, and definitely not capable of defending against anything moving at Mach 10.
    • Can a relatively large drone carry a nuclear weapon? Yes, but it's going to be much easier to defend against given its slow speed and limited range.

      Isn't a cruise missile a single-use drone?

Yes, it is, but it is not AI, or do you think it would voluntarily explode if it were intelligent ... reminds me of the movie "Dark Star" ;D

Have you ever seen a drone?
A 2-inch one? It can carry an ounce of C4 ...
Suppose a company of 100 soldiers is surrounding your position ... you drop the drones on their faces and let them explode ...
You haven't read much SF, I guess?

      • by djinn6 ( 1868030 )

You haven't read much SF, I guess?

        I do read SF, but it's science fiction. In real life, the soldiers wouldn't need to be there at all. They'd have drones do that work. Unless you have better tech, you'll be out-droned.

  • Intelligent Things, or 'IT' for short.

  • Obviously. AI warfare would violate the 3 laws of robotics. "A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law." But that is assuming that the AI is programmed to attack humans. What if it is only programmed to attack another AI ? Then we h
    • by Anonymous Coward

      Ultimate target of the war, any war, is always you, the civilian, "your ass", your beating heart, your panicky brain, your loved ones, the things you are attached to. When all the battles of the war are done, it is "or else" time. So, don't even dream that future wars will mean only robots' deaths. As soon as you just as much as think "So what? The hell we'll let them have what they fought us for, even though they annihilated all our metal representatives!", you'll find a bright red (well, infra-red is just

    • by dcw3 ( 649211 )

      Yeah, let me know when you get Putin, and Rocket Man to sign up to follow those rules. In the meantime, I'm not going to bury my head in the sand and pretend that we'll all be singing Kumbaya.

  • by petes_PoV ( 912422 ) on Tuesday April 03, 2018 @03:13AM (#56371275)

    the AI powering U.S. war efforts will need to be resilient in ways that today's AI simply isn't

The crucial point about AI is the training it needs. Unless the Americans have been collecting data for decades (given how few armed conflicts there are, and that each one is different from the one before), there won't be any realistic scenarios for the AIs to learn from.

    It would also be quite easy to defeat AIs that had been trained - just do the unexpected, as all gifted military leaders do.

    Although once they get past the initial phase of monumentally screwing up everything they touch - another facet of "superpower" military might - they could easily develop new strategies. The best strategy would be for the AIs to decide that the battle isn't worth fighting.

    • by Whibla ( 210729 )

The crucial point about AI is the training it needs. Unless the Americans have been collecting data for decades (given how few armed conflicts there are, and that each one is different from the one before), there won't be any realistic scenarios for the AIs to learn from.

      And how do you think they train real flesh and blood soldiers in teamwork, tactics and so on, without risking their lives for real? Ah, that's right, they use a game [americasarmy.com]. Yeah, ok, that's probably a fairly small part of their training, with a great chunk of the rest being physical training, group bonding, inculcation of muscle memory (think time spent on the firing range) and technical training. Robots don't require any of that stuff because it comes under the guise of engineering, manufacture, and basic algori

AIs do not need training.
Artificial neural networks need training.
And the result of that usually isn't an AI but a "cognitive system". Big difference, at least if you study computer science. If you just talk about it in a pub, there is probably no difference ... just saying.

    • We've known how to deal with this for a long time -

      Form them into a committee, that will do them in.

  • by Bearhouse ( 1034238 ) on Tuesday April 03, 2018 @06:12AM (#56371611)

    "The intelligent things will have to constantly think about an intelligent adversary that strategizes to deceive and defeat them. Without this adversarial intelligence, the battle things will not survive long enough to be useful."

    In other words; they're theorizing about "battle things" that will be lethal, highly-autonomous and adaptable...

    Nope, no Terminator-esque red flags there...

  • I didn't see anything in the article or responses about countermeasures.

    Our soldiers (and automatons) will be beacons. All the foe needs to do is build (radio silent) drones that target any radio signal. Of course, more sophisticated versions will target, say, tanks or GIs based on the signal mix. The list of "improvements" is long, given such a target rich environment.

    • by dcw3 ( 649211 )

Um, doubtful, at least in our lifetime. Think about what it would take to
1.) identify "any radio signal"...unrealistic, as there are radio signals everywhere.
2.) not only direction-find that specific signal, but also geo-locate it for targeting...not a simple task, requiring complex algorithms and hardware. And that's all assuming a stationary emitter.
3.) target the emitter for destruction before it moves.
4.) and let's not forget the payload requirements for said drone. How much does a
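For a sense of what the geo-location step in point 2 involves, here is a minimal sketch of bearing-line triangulation: two receivers each measure a direction to the emitter, and the position fix is the intersection of the two bearing lines. The function name and coordinate conventions are my own for illustration; a real direction-finding system would also have to cope with bearing noise, multipath, and moving emitters, all of which this toy example ignores.

```python
import math

def locate_emitter(p1, brg1, p2, brg2):
    """Estimate a stationary emitter's position from two bearing lines.

    p1, p2: (x, y) positions of the two receivers.
    brg1, brg2: bearings in degrees, measured counterclockwise from +x.
    Returns (x, y) of the intersection, or None if the bearings are parallel.
    """
    d1 = (math.cos(math.radians(brg1)), math.sin(math.radians(brg1)))
    d2 = (math.cos(math.radians(brg2)), math.sin(math.radians(brg2)))
    # Solve p1 + t*d1 = p2 + s*d2 for t (Cramer's rule on the 2x2 system).
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-9:
        return None  # parallel bearings: no unique fix
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Receivers at (0,0) and (10,0); bearings of 45 and 135 degrees
# intersect at (5, 5).
fix = locate_emitter((0, 0), 45, (10, 0), 135)
```

Even in this idealized form, accuracy collapses as the bearing lines approach parallel (the determinant goes to zero), which is one reason real geo-location against a moving, low-power emitter is so much harder than it sounds.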

  • "Buried deep in the report is mention of a new program in its infancy which would provide an AI based network that will eventually control ALL US weapons systems and networks. Code name: Skynet."

  • by dcw3 ( 649211 ) on Tuesday April 03, 2018 @11:28AM (#56372943) Journal

    " the Army describes how it's working to make a battlefield network of machines and humans a reality"

    No, no it doesn't. This is a piece of shit paper that some random dork presented. White papers are presented to the military all the time...I've done one myself. That doesn't mean that this is the Army's doctrine, or that it will guide a single thing that they do.

The article doesn't present anything insightful or innovative. It's almost all sci-fi stuff that you'd see in random futuristic movies. Whoever is paying this dork's salary needs their head examined.

  • Once the computers completely control us, he who controls the computers needs no guns (or at least very few).

    The Chinese are the masters of control these days. Their citizen surveillance systems, combined with their new social credit system are already far superior to anything in the west. Probably won't be long before our governments buy these systems from the masters, although probably indirectly. Because terrorism, children, human trafficking, copyright, ...

    You do not need to kill someone in order to

  • by tmjva ( 226065 )
    Steve Jackson's [was Metagaming] "Ogre" game becomes reality.  Actually it is looking a lot more like Metagaming's "Rivets".
