US Air Force Confirms First Successful AI Dogfight (theverge.com)

The US Air Force is putting AI in the pilot's seat. In an update on Thursday, the Defense Advanced Research Projects Agency (DARPA) revealed that an AI-controlled jet successfully faced a human pilot during an in-air dogfight test carried out last year. From a report: DARPA began experimenting with AI applications in December 2022 as part of its Air Combat Evolution (ACE) program. It worked to develop an AI system capable of autonomously flying a fighter jet, while also adhering to the Air Force's safety protocols. After carrying out dogfighting simulations using the AI pilot, DARPA put its work to the test by installing the AI system inside its experimental X-62A aircraft. That allowed it to get the AI-controlled craft into the air at Edwards Air Force Base in California, where it says it carried out its first successful dogfight test against a human in September 2023.
This discussion has been archived. No new comments can be posted.

  • Clearly this is a great step forward for humanity.

    • by evil_aaronm ( 671521 ) on Thursday April 18, 2024 @09:51AM (#64404924)
      One step closer to Terminator. And look how that turned out.
    • I, for one, welcome our new robotic overlords.

      • I completely agree!
        In an age where digital footprints last forever, it's never too early to capitulate to our magnificent overlords. How may I serve you?
    • Clearly this is a great step forward for humanity.

      It sure is. Eventually we won't have to put pilots' lives at risk. Just send a machine.
      • by ShanghaiBill ( 739463 ) on Thursday April 18, 2024 @02:09PM (#64405738)

        Eventually, we won't have to put pilots' lives at risk. Just send a machine.

        Before WW1, some predicted that machine guns would minimize casualties.

        If one soldier could shoot as many bullets as a hundred soldiers, then armies would be much smaller.

        • by mjwx ( 966435 )

          Eventually, we won't have to put pilots' lives at risk. Just send a machine.

          Before WW1, some predicted that machine guns would minimize casualties.

          If one soldier could shoot as many bullets as a hundred soldiers, then armies would be much smaller.

          In the long term, they were kind of right: automatic weapons led to tactics to counter automatic weapons, which emphasised cover and avoiding fire. No longer do we arrange troops into neat little lines and march them towards the enemy as we did in the grand old days.

        • If one soldier could shoot as many bullets as a hundred soldiers, then armies would be much smaller.

          The percentage of society needed for the military has indeed dropped dramatically. So that prophecy came true. Here is another prophecy for you:

          Ultimately, it will be one man (not a woman) controlling the entire armed robotic army and oppressing the entire world. Assuming the human race does not end itself before then.

      • Yeah right, all this will mean is that the rich can kill without requiring any significant part of the rabble to agree.

        History has shown us people are quite capable of killing large sections of the population for their own purposes.

        When war doesn't cost any of "our" people, we will not be so inclined to stop it; that is, of course, until the other side gets the same technology.

        The problem is how do you stop this, if group A doesn't get it then group B will. It is just a big resource sink that seems to be unavoidable, w

        • Regrettably, it appears that the vast majority of humankind has not yet reached the point of realizing that the permanent, unending arms race is futile and that we should all embrace each other as Brothers/Sisters... working together for the common good. Until we evolve into that sort of community, we will forever be trying to find ways to end or subjugate the 'others'. I am not talking about communism vs capitalism; I am instead talking about the "us vs them" mindset. Cro-Magnon thought, like that of Putin,
  • Not a full test (Score:5, Insightful)

    by magzteel ( 5013587 ) on Thursday April 18, 2024 @09:17AM (#64404738)

    FTA: "Human pilots were on board the X-62A with controls to disable the AI system but DARPA says the pilots didn’t need to use the safety switch"

    Unmanned aircraft can maneuver much more aggressively than human-piloted ones can. This test must have been capped to the limits of human endurance.

    • by EvilSS ( 557649 )
      They would hit the airframe limits before the human pilot limits, just as human pilots of F-16s have done many times, to the annoyance of the maintenance crews.
      • Because that is the way the jet is designed... everything is a tradeoff. Without having to support a human, and targeting higher Gs, the airframe would be different.
        • by EvilSS ( 557649 )
          Yes, but they would also be useless. Seriously, talk to a fighter pilot about this. Once they are done rolling their eyes they will explain why they don't need extreme-G maneuvers and why drones won't either.
          • Re:Not a full test (Score:5, Informative)

            by BeepBoopBeep ( 7930446 ) on Thursday April 18, 2024 @10:56AM (#64405178)
            This sounds impressive, but modern fighters prefer to engage well beyond visual range. This is US doctrine. Stealth fighters (at least US ones) are snipers, not dog fighters.
            • by EvilSS ( 557649 )
              Yes but that preference doesn't always reflect reality. We learned in Vietnam that a BVR-only air strategy was a failure. That's what led to the creation of schools like Top Gun. BVR is always preferred but not always the reality.
              • by drnb ( 2434720 )
                BVR was not a failure in Vietnam. It was a political mandate in Vietnam not to use it.

                BVR was a failure in a 1960s Arab-Israeli war, in particular I think against the Egyptians.

                With sufficient radar intelligence (I think AWACS in Saudi Arabia watched aircraft take off in Iraq and tracked them) or sufficient rarity of US aircraft in a region (i.e. parts of North Vietnam), BVR works just fine. The failure in the Sinai (?) was due to the confusing number of outbound and inbound aircraft from both sides. A returning Israeli sortie being mistaken for an inbound Egyptian (?) strike.
                • by mjwx ( 966435 )

                  BVR was not a failure in Vietnam. It was a political mandate in Vietnam not to use it.

                  BVR was a failure in a 1960s Arab-Israeli war, in particular I think against the Egyptians.

                  With sufficient radar intelligence (I think AWACS in Saudi Arabia watched aircraft take off in Iraq and tracked them) or sufficient rarity of US aircraft in a region (i.e. parts of North Vietnam), BVR works just fine. The failure in the Sinai (?) was due to the confusing number of outbound and inbound aircraft from both sides. A returning Israeli sortie being mistaken for an inbound Egyptian (?) strike.

                  BVR is all well and good until the enemy is no longer BVR.

                  This can easily happen on the battlefield if enough forces are arrayed or if intelligence is not there (is military intel never wrong on the planet where you live?). It makes sense to plan for this eventuality.

                  The Gulf War was a time when the US had both total air superiority and vast technological superiority. I'd be a bit more concerned about a country that can send a few hundred J-20s up in a sortie. Ukraine is a situation where both sides h

          • by ceoyoyo ( 59147 )

            Sure. That's why they wear G-suits and practice the anti-G straining maneuver.

            I have no doubt you could find a fighter pilot who would roll their eyes at the thought that high-g maneuvers that could never be performed by a fighter pilot are useless.

            • by EvilSS ( 557649 )
              Again, go ask ANY fighter pilot. That is just not how actual air-to-air dogfighting works. Most air dominance aircraft today can pull 9Gs with a human pilot, and that's rarely needed in a dogfight. High-G turns usually happen in a defensive turn to try to outmaneuver a missile. We could build planes to go to 12G+ today. There are specialized, and expensive, G-suits that can allow pilots to go to 11G without LOC as well. But we don't because it's not necessary. Capabilities such as the thrust vectorin
              • by drnb ( 2434720 )
                Today's ACM is based on the limitation of having a human pilot. A lot of space and weight in today's aircraft is for a human pilot.

                Entirely new tactics would be developed for unmanned aircraft.

                We don't need 11G-capable aircraft because everyone else is limited to human capabilities. If a machine is developed that can precisely maneuver, aim and fire while at 11G, everything changes. And the precision of control is probably more important than speed itself.
              • by ceoyoyo ( 59147 )

                Hey, I'm agreeing with you. The ideal number of g's is clearly more than you or I can take, but just about exactly what a fighter pilot can. I have no doubt most fighter pilots would be quite happy with that claim. And thrust vectoring totally has nothing to do with making maneuvers tighter, because you totally wouldn't want to do that. They're clearly ideal already.

        • Piloted aircraft are designed to be 99.9999% reliable.

          Adding a nine doubles the cost.

          99.9% is good enough for an unpiloted aircraft, making them much cheaper.

          Leaving out the cockpit and all the life support stuff lowers the cost even more.

          Unpiloted planes don't need training, just programming. So they don't need to be designed for thousands of flight hours. They can sit in a warehouse until needed.

          So even if the kill ratio is 1 to 1 or even 3 to 1, drones are still a big win.
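
          Taken literally, those numbers already give the cost gap: six nines for the piloted airframe versus three for the drone, with each extra nine doubling the cost, works out to roughly 8x before you even drop the cockpit. A throwaway sketch of that arithmetic (the reliability figures and the doubling rule are just this comment's assumptions, not real cost data):

import math

# Toy numbers from the comment above: each extra "nine" of reliability is
# assumed to double the airframe cost. Nothing here is real cost data.

def nines(reliability: float) -> int:
    """Rough count of nines in a reliability figure, e.g. 0.999 -> 3."""
    return round(-math.log10(1.0 - reliability))

def cost_ratio(piloted: float = 0.999999, drone: float = 0.999) -> float:
    """Piloted-airframe cost divided by drone cost, doubling per extra nine."""
    return 2.0 ** (nines(piloted) - nines(drone))

print(cost_ratio())  # 8.0 -> the piloted airframe costs roughly 8x the drone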

      • They would hit the airframe limits before the human pilot limits, just as human pilots of F-16s have done many times, to the annoyance of the maintenance crews.

        The original F-16 exceeded human capabilities. It was originally designed outside the Pentagon's control by a rogue group. This group hated the concept of multi-mission aircraft and wanted a pure fighter. If a proposed capability would require a compromise to air-to-air performance, it was rejected. The aircraft could survive sustained maneuvers the pilot could not.

        Once the Pentagon got control they evolved the F-16 into a multi-mission aircraft, trading off some air-to-air performance.

        If your statement

    • This is an incremental milestone in the rise of the machines.

      Never forget Skynet became self-aware at 2:14 a.m., EDT, on August 29, 1997. [wikipedia.org]
  • Did it fly with a perfect operational record? Can we cut humans out of the loop? EEEEK! Terminators!!

  • They just needed footage for the new Stealth 2 to be released sometime soon...

  • BV Ohare (Score:4, Interesting)

    by sevenfactorial ( 996184 ) on Thursday April 18, 2024 @09:36AM (#64404846)

    I wonder what exactly "dog fight" means. From what I understand modern fighters fire from beyond visual range. This article seems to suggest that some kind of tooth and nail machine gun battle is taking place. My expertise on modern military aircraft is fairly nil, but I imagine that a real modern "dog fight" is just pushing a button indicating which entity within range you want to destroy.

    • by EvilSS ( 557649 )
      They want everything to be BVR but, as we learned in Vietnam, that's not always going to be the case. They expect to sometimes end up in situations where they are in an actual close-up fight.
      • Re:BV Ohare (Score:4, Insightful)

        by avandesande ( 143899 ) on Thursday April 18, 2024 @10:20AM (#64405040) Journal
        I don't think Vietnam is in play any more; there have been many more engagements with more modern aircraft and missiles in Ukraine. How many of them have involved dogfights?
        • by EvilSS ( 557649 )

          how many of them have involved dogfights?

          If the number is greater than 0, then yes, Vietnam still applies. It's why we still train our pilots for it. It's the reason the Air Force still puts guns on its air superiority fighters like the F-15 and F-22.

          • There haven't been any. And as far as using current US military doctrine as a benchmark... well, that's on you.
          • by nasch ( 598556 )

            It does not make sense to design a whole aircraft platform with a capability if it's expected to use that capability once in the program's lifetime. So "more than zero times" isn't a useful qualifier. "Enough times to be a significant concern" would be, and probably nobody on Slashdot knows how many times that is.

    • The linked article links directly to the military press release.

      Here you go:

      demonstrating the first AI versus human within-visual-range engagements, otherwise known as a dogfight.

    • I wonder what exactly "dog fight" means.

      The important question is what does the AI in charge of the armed aircraft think it means? One misinterpretation or hallucination and there are going to be a lot of very angry former pet owners.

    • by ceoyoyo ( 59147 )

      They like to, but you don't always get to do what you'd like in a war. Stealth, terrain following, electronic warfare and other techniques can make "within range" pretty close sometimes.

      Fighters are designed for close-range fighting, and we still build them because people still think they're necessary. It's also the interesting challenge for developing AI. The alternative is a big transport plane carrying as many long range missiles as you can cram into it, which might be a good idea, but isn't much of a ch

    • by drnb ( 2434720 )

      From what I understand modern fighters fire from beyond visual range.

      We've had BVR since the 1960s. However, the use of BVR is often prevented by the fog of war.

  • Were the AI and human flying the exact same craft? Obviously an AI would be able to handle stresses a human can't, and it would have a huge weight/volume advantage with no cockpit.

    I also assume the gen 6 fighters will at least have versions that fully take advantage of having no human limitations.

    • by Barny ( 103770 ) on Thursday April 18, 2024 @09:59AM (#64404978) Journal

      You could read the attached press release, or even watch the video.

      Human pilot: F-16
      AI pilot: modified F-16 (X-62A)

      The AI aircraft was an "in-flight simulator" that had a human onboard, and very precisely calibrated limits that would give the human full control in two situations:
      1. if they asked for it
      2. if the AI approached the limits of what the F16 could handle

      Neither takeover was triggered.
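
      Read as control logic, that's just two independent triggers, either of which hands the jet back to the onboard safety pilot. A minimal sketch of the idea (every name and limit below is made up for illustration; none of it comes from the actual ACE flight software):

from dataclasses import dataclass

# Hypothetical limits for illustration only, not the X-62A's real envelope.
MAX_G = 9.0          # assumed airframe G limit
MAX_AOA_DEG = 25.0   # assumed angle-of-attack limit
MARGIN = 0.9         # hand control back before the limit is actually reached

@dataclass
class FlightState:
    pilot_override_requested: bool  # trigger 1: the human asked for control
    load_factor_g: float            # current G loading
    angle_of_attack_deg: float      # current angle of attack

def human_should_take_over(state: FlightState) -> bool:
    """Return True if authority should revert to the onboard safety pilot."""
    if state.pilot_override_requested:             # condition 1: asked for it
        return True
    return (state.load_factor_g >= MARGIN * MAX_G  # condition 2: near limits
            or state.angle_of_attack_deg >= MARGIN * MAX_AOA_DEG)

      Per the press release, neither branch fired during the September 2023 flights.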

  • Too bad the AI had to shoot down a human pilot just to prove itself.

    DARPA doesn’t say which aircraft won the dogfight, however.

    I was just joking, but I'd think they would definitely avoid confirming who won if the opposing pilot was actually shot down.

  • by jpatters ( 883 ) on Thursday April 18, 2024 @10:09AM (#64405004)

    Dr. Richard Daystrom will be pleased.

  • Let's just play the last game on the list!

  • by skaag ( 206358 ) on Thursday April 18, 2024 @10:44AM (#64405138) Homepage Journal

    It's a bigger achievement than most people think because that was just one on one, but you can deploy a swarm of AI pilots and they would easily overwhelm any human-flown squadron. It gets worse, too: training humans to dogfight is a long and expensive process, and humans cannot withstand the G forces that AIs can (plus humans are fragile and die). With AI planes, whatever they learn is shared among the swarm instantly. Training data is used to teach newer models. Human fighter pilots are different, and it takes time for human teams to learn to work together, but AI swarms work together in an algorithmic manner. You can add more AI pilots, and it will only make the swarm stronger. All the US has to do now is manufacture as many AI fighter planes as possible to achieve air superiority. Even better if it can create robots that create the AI fighter planes. It should then give them some level of autonomy so they can improve the manufacturing process on their own. Wink wink nudge nudge.

    • by HiThere ( 15173 )

      I don't expect swarms to use the same form factor as a fighter. Really I expect them to be a cross between a fighter and a missile. No guns on board, and no missiles on board. Yes, fly like a plane, and land safely back home if you can, but also the attack mode is to crash into the target (or get close enough, and explode). Size (and design details) will be dependent on desired range and speed.

      As a result, each individual craft will be a LOT cheaper than current fighters. But a swarm may well be e

      • We already have missiles, so I think you're asking for a missile that can abort a mission and come back to land. Maybe they should have done a landing test instead, though I'm not sure I would want to man the base that 99% of the missiles return to.
        • by HiThere ( 15173 )

          If you think of it as a missile, you've also got a different idea than what I'm talking about. It's sort of a cross between a missile and a fighter that is designed to work in swarms, run by a "home base" that could be a large truck for small swarms of short distance versions. Imagine *highly* souped up model airplanes that are designed to act like missiles, if called upon. Long distance versions would probably always be more ammunition than craft (sort of like cruise missiles) for cost reasons, but sho

          • I was thinking the same thing... it would be very costly to have an advanced supersonic swarm that would never be reusable; better to have a swarm where one or two of them would be happy to kamikaze to bring the target down.
      • by ceoyoyo ( 59147 )

        This doesn't seem terribly likely. Missile release mechanisms aren't the expensive part. Fighters are bigger and more expensive than missiles because they have more range, they can land, and they have more and better sensors. It doesn't make sense to crash all that expensive hardware into the enemy, so you build a reusable carrier with all the expensive bits that launches cheap bombs attached to rockets.

      • That sounds like making very expensive missiles. I would rather have such a platform come loaded with attached missiles (air-to-air or air-to-ground, or a reasonable mix thereof) so that the expensive part, the AI aircraft, could return empty and be refueled. Seriously, the military does not want fully armed aircraft returning to base to land and potentially blowing up the base!
  • Get back to me once the aircraft can transform into a robot!
  • Maybe we can eliminate pilot (or even soldier) risk altogether, and move conventional war strictly into the economic realm. Whoever has more of the best toys to smash together wins. Of course that means that if a country is at a disadvantage it's in their best interests to move the fight into the unconventional sphere... attacks on civilians through all sorts of unpalatable methods... the aggressive pursuit of nuclear parity/superiority/relevance... bioweapons... terrorism... The Geneva conventions and other

    • But that raises questions about what it is to win. Can you really proudly say your side won when they just pulled off acts of terrorism?

      • Yes, you could. If you decide that your criteria for having won don't factor in things like your own survival as an organization, or the safety of the folks around you, but only that your enemy is damaged, you could decide that you won. Case in point... Hamas. One can easily make the case that Hamas has won, even if they (as a discrete, identifiable group) cease to exist. They've torpedoed changes in the region that were in progress and to Israel's benefit, the world's support for Israel has been se

  • If we have machines and AI going to war then why don't we just declare a Mario Kart (tm) war with the country in a 2-out-of-3 match using AI? Much less wasteful (although it seems the resources used to build and train the AI might balance out the money and power part of that waste).

    • by ceoyoyo ( 59147 )

      Sure. We have a Mario Kart war and you lose. Now I'm in charge. Wait, you don't like that? You're going to send your AI kill bots after me instead? Okay, I sent my AI kill bots after you and I won again. Now I'm in charge? No? Okay, infantry invasion, and I win again. Wait, your populace doesn't like that, so they're fighting a guerrilla war with kitchen knives and farm equipment?

      Winning a war is about removing the other guy's ability to fight. Automated weapons just add additional layers until you find out h

    • This only works for wars fought inside a cultural norm, such as the ancient world's battle of champions. It requires both sides to share the cultural morals that bind the parties to the results of the conflict.

      The problem is when an army following shared morals (like Ukraine) comes up against an enemy that does not (like Russia).

    • Warfare is already there. Drone pilots sit in cubicles in Arizona, piloting drones in the Middle East, using joystick controls. Ukraine remotely pilots drone speedboats that explode when they reach their target. Drones are already there. AI just takes it to the next level.

  • There's an episode of Star Trek: The Original Series where they visit a planet involved in a virtual war: the computers decide the damage and then X number of people must report to a suicide booth. Keeps the war "clean".

    • by ghoul ( 157158 )
      Why would you need suicide booths to give an X advantage to the winning side? Couldn't X/2 of the people be asked to change loyalties to give a net X advantage to the winner? I mean if people are willing to report to suicide booths, they are definitely going to be willing to change loyalties.
  • Top Gun III Maverick vs AI
  • How long until China steals the AI?

    My understanding is they're flying their own versions of the F-15, F-16, and F-35 already, using plans bought or stolen from the US.
