
AI-Operated F-16 Jet Carries Air Force Official Into 550-MPH Aerial Combat Test (apnews.com)

The Associated Press reports that an F-16 performing aerial combat tests at 550 miles per hour was "controlled by artificial intelligence, not a human pilot."

And riding in the front seat was the U.S. Secretary of the Air Force... AI marks one of the biggest advances in military aviation since the introduction of stealth in the early 1990s, and the Air Force has aggressively leaned in. Even though the technology is not fully developed, the service is planning for an AI-enabled fleet of more than 1,000 unmanned warplanes, the first of them operating by 2028.

It was fitting that the dogfight took place at [California's] Edwards Air Force Base, a vast desert facility where Chuck Yeager broke the speed of sound and the military has incubated its most secret aerospace advances. Inside classified simulators and buildings with layers of shielding against surveillance, a new test-pilot generation is training AI agents to fly in war. [U.S. Secretary of the Air Force] Frank Kendall traveled here to see AI fly in real time and make a public statement of confidence in its future role in air combat.

"It's a security risk not to have it. At this point, we have to have it," Kendall said in an interview with The Associated Press after he landed... At the end of the hourlong flight, Kendall climbed out of the cockpit grinning. He said he'd seen enough during his flight that he'd trust this still-learning AI with the ability to decide whether or not to launch weapons in war... [T]he software first learns on millions of data points in a simulator, then tests its conclusions during actual flights. That real-world performance data is then put back into the simulator where the AI then processes it to learn more.
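The train-in-simulator, fly, fold-the-data-back loop described above can be sketched as a toy example. Everything below (the hill-climbing "learner", the `SimEnv` class, the numbers) is an illustrative assumption, not anything from the actual Air Force program:

```python
# Toy sketch of the loop: train on simulated data, fly a real sortie,
# feed the flight telemetry back, and train again. All names and numbers
# here are hypothetical.
import random

random.seed(0)  # deterministic for the example

class SimEnv:
    """Toy stand-in for a flight simulator: reward is higher the closer
    the chosen control value is to a hidden optimum."""
    OPTIMUM = 0.7

    def score(self, action):
        return -abs(action - self.OPTIMUM)

def train_in_sim(policy, env, episodes=500):
    """Random-perturbation hill climbing: keep changes that score better."""
    for _ in range(episodes):
        candidate = policy + random.gauss(0, 0.1)
        if env.score(candidate) > env.score(policy):
            policy = candidate
    return policy

def fly_real_sortie(policy, env):
    """Stand-in for a real flight: returns the telemetry the simulator
    would later be recalibrated with (here, just the action and its score)."""
    return [(policy, env.score(policy))]

policy = 0.0
sim = SimEnv()
for _ in range(3):  # train -> fly -> feed telemetry back -> train again
    policy = train_in_sim(policy, sim)
    telemetry = fly_real_sortie(policy, sim)
    # a real system would update the simulator's model from telemetry here

# after a few cycles the policy has climbed close to the simulator's optimum
```

The real program obviously involves far richer state than a single control value, but the cycle structure (simulate, fly, re-ingest) is the same one the article describes.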

"Kendall said there will always be human oversight in the system when weapons are used," the article notes.

But he also said he looked forward to the cost savings of smaller, cheaper AI-controlled unmanned jets.

Slashdot reader fjo3 shared a link to this video. (More photos at Sky.com.)


Comments Filter:
  • by zenlessyank ( 748553 ) on Sunday May 05, 2024 @01:37PM (#64449612)

    No life support systems and a thinner profile should make for some nasty fighters.

    • by ShanghaiBill ( 739463 ) on Sunday May 05, 2024 @01:50PM (#64449630)

      Piloted aircraft are designed for 99.9999% reliability.

      99.9% reliability is good enough for a UAV.

      An F-16 airframe is designed for 8000 hours. Nearly all of that is pilot training.

      A UAV doesn't need to be trained. It just needs to be programmed. Then, it can sit in a warehouse until needed. No engine wear-and-tear. No airfields are needed for training. No need for engine overhauls nor the personnel to do them.

      The result is big savings, which means less money for defense contractors. So expect a lot of resistance.

      • You're confusing a number of different concepts. A "UAV" generally refers to a remotely-operated vehicle guided from a distance by a trained human pilot. They still have to be trained on the hardware, still have to practice, and the hardware still has to be maintained. Then there are degrees of autonomy past that, which are still highly experimental and usually not impressive at the far edge.
        • The programming for "Kill everything that moves in a box 30,000 feet high, by 2000 miles long, by one mile wide" isn't that complex.

          The problem is the idiots who don't understand the basic strategy of closed borders.

        • There will be a lot more of the drones though, and they will fail often enough that replacements will always be needed. The real money is already in the armaments.

          • "Real money" for parasitic contractors. That doesn't necessarily translate to a strategic advantage for the forces depending on them.
            • "Real money" for parasitic contractors. That doesn't necessarily translate to a strategic advantage for the forces depending on them.

              Maybe. It sure seems inevitable to me that future air combat will be thousands of drones, not multi-million dollar fighters with priceless humans inside them. Maybe the drones are run by AI, maybe by remote operators, most likely it's a hybrid approach. I'm sure the defense contractors can see the writing on the wall and are rapidly spinning up projects along these lines.

              I expect more resistance from pilots and air force brass. I'm relatively sure when they see 20 drones pulling 20g turns against their lone

              • I'm no expert but it seems like if an F-22 is in a dogfight with a bunch of drones, something has already gone wrong.

                • I'm no expert but it seems like if an F-22 is in a dogfight with a bunch of drones, something has already gone wrong.

                  Well, unless the F-22 has an inexhaustible supply of small guided missiles, a la Iron Man. But yeah, like people here are fond of saying, if you're ever in a fair fight, something has gone horribly wrong.

                  Snark aside, I have no doubt there will be air battles with mixes of lousy drones, really good drones, high tech fighters with pilots, and future tech fighters with no pilot. There will be some stunning victories: either the fighters will effortlessly cruise past the hapless and impotent drones or the hundr

      • The result is big savings, which means less money for defense contractors. So expect a lot of resistance.

        I disagree. I expect the military will use any per-unit savings to build more unmanned fighters. They're not gonna let the defense budget go down for any reason whatsoever.

        So the defense contractors will be just fine.

      • Piloted aircraft are designed for 99.9999% reliability.

        Well, our first mistake was believing the moron who made that statement.

        Humans are FAR from 99.9999% reliable. And humans pilot that craft. It’s called a weakness for a reason. The human body is fallible. We call it the reality of being human.

        If you want 99.9999% reliability you better be calling that “pilot” by its software version, and hope the AI involved is smarter than fallible humans making ignorant claims about how reliable humans are.

        • Humans are FAR from 99.9999% reliable

          There are 1.3 fatal accidents per million departures for commercial aviation.

      • So how long until one of those AIs gets hacked and its associated UAV is suddenly not in its warehouse?

        How long until one of those AIs manages to hack its sister UAVs on the pad and they suddenly join the hacked one?

        It's all fun and games until someone manages to break in. Then you'll have the secretary having to answer hard questions on why there's a fleet of 1000 rogue UAVs flying over the citizenry's heads.
      • Then, it can sit in a warehouse until needed. No engine wear-and-tear. No airfields are needed for training. No need for engine overhauls nor the personnel to do them.

        The result is big savings, which means less money for defense contractors. So expect a lot of resistance.

        Yeah, until you need to get them out and no one is around with the skills anymore, so they stay in the warehouse and get bombed to shit.

    • No life support systems and a thinner profile should make for some nasty fighters.

      In addition, not having to worry about losing personnel (which are expensive and time consuming to train) allows the fighters to go into more highly contested areas.

      There is, however, a risk that not putting actual lives at risk will lower the threshold for engagement, making warfare more likely (at least by countries that have autonomous fighting machines).

      • Re: (Score:2, Interesting)

        Warfare more likely, yes.

        Warfare that escapes well defined theaters of operation, no.

        I suspect eventually, every country will give up a small amount of border land for total anti-invasion defense, and the only countries that will start wars are the ones too stupid to understand that they can't invade their neighbors anymore.

    • by quantaman ( 517394 ) on Sunday May 05, 2024 @03:38PM (#64449862)

      No life support systems and a thinner profile should make for some nasty fighters.

      It's an interesting experiment, but probably not the way things will go.

      The most valuable thing in a modern fighter jet is the pilot. That's why they're willing to build such expensive and ridiculous machines around them.

      But for an AI fighter the pilot is just a few thousand dollars of computer gear. So instead of one really expensive fancy fighter you just build a dozen much cheaper small fighters, some of which do nothing but carry a missile around until they've got a lock on a target.

      That's a much, much harder enemy to defend against or shoot down.

      • by Luckyo ( 1726890 )

        The main idea right now is not that, but a grouping of specialist cheap aircraft around a piloted "mothership" which instructs drones on what to do within the scope of their programming.

        So you'll have a payload carrying drone, a radar drone, a jammer drone, an air defense drone and so on. These can operate in a very contested space, because there's no pilot. And pilot/commander is in a very stealthy aircraft behind all of them, data linked to all of them, giving them orders what to do.

        So the idea is a netwo

    • I don't understand our government. Ten years ago or so, they said it would be a bad idea to use AI on unmanned aerial vehicles or unmanned weapons. Now here they are doing exactly that.
    • I've certainly noticed my tolerance for thrill rides drop by the decade, from dashing from the exit back to the entrance as a teen, to being queasy over one ride at a fair with my kid at 40, to . . .

      anyway, could anything that a 75 year old could handle seriously be called a "dogfight"?

      • by Luckyo ( 1726890 )

        Current dogfights are about slinging missiles from very far beyond visual range. Not many g-forces involved. It's mainly about going very high and very fast so that the missile gets as much energy as possible at launch, while the radar can see as far as possible from high up.

        This is what we're seeing in Ukraine today. And why the Ukrainian air force just doesn't do much air-to-air fighting vs manned Russian planes. Their main air-to-air job is killing missiles and drones. Anything that involves a fight near the front line,

  • All nice and well (Score:2, Interesting)

    by war4peace ( 1628283 )

    ...until AI decides to do something (we see as) stupid because the datapoints didn't teach it that specific combat situation.
    AI can't assess the combat situation and decide against an order which is based on incomplete data and would lead to a disaster.

    Random example: AI-piloted fighter engages enemy above a civilian populated area, at some point the enemy target is between AI airplane and ground, a missile could be launched, but if the missile misses the target, it would slam into that nice neighborhood. A

    • by ShanghaiBill ( 739463 ) on Sunday May 05, 2024 @01:53PM (#64449634)

      Sure, you could implement an exception, then an exception to that exception, then... you know. A whole web of intertwined decision trees.

      That is not how modern AI works. Not at all.

      • Modern AI still experiences hallucinations though. Connecting information in ways that appear connected to it but in reality are not at all. Sorta like conspiracy nuts.
        • Modern AI still experiences hallucinations though.

          Generative LLMs experience hallucinations.

          No one is suggesting LLMs fly aircraft or drive cars.

      • Sure, you could implement an exception, then an exception to that exception, then... you know. A whole web of intertwined decision trees.

        That is not how modern AI works. Not at all.

        As evidenced by the viral capitalist craze around AI right now, how AI “works” is entirely based on where the profits are or potentially will be.

        Now tell me, how do you think AI will “work” once the Military Industrial Complex figures out how to maximize revenue from it?

        There's a reason President Eisenhower, in his farewell address, took the time to WARN the American people about the dangers of a growing Military Industrial Complex. We fucked up and ignored that warning.

      • Well, AI for lawyers has quite a few recorded instances where the AI makes up cases and then cites them. So what would make AI weaponry any different?
      • It was a very high-level (as in programming) example written at 1 AM; it was not intended to be taken literally.
        "Implement an exception" could be done in any number of ways; the result is exactly that, an if-then-else (or a series thereof). I'm not discussing its complexity or exact means of implementation; those are outside the scope of the conversation.

        The point is (romanticized): take a human pilot, then excise everything that makes them human, leaving only the piloting part.
        How do you implement

    • Modern Fighter AI would learn by consuming actual (or simulated) data from real pilots flying real missions. If you feed it enough of these missions it will use that as guidance. No one needs to 'program' it explicitly. It will also be able to extrapolate/approximate the correct course of action even if it encounters a situation not explicitly present in the sample of missions consumed.
      • Modern Fighter AI would learn by consuming actual (or simulated) data from real pilots flying real missions. If you feed it enough of these missions it will use that as guidance. No one needs to 'program' it explicitly. It will also be able to extrapolate/approximate the correct course of action even if it encounters a situation not explicitly present in the sample of missions consumed.

        That is NOT how this is going to work for Fighter AI.

        You seem to have forgotten the main reason AI is needed in the cockpit. Current fighter aircraft are NOT limited by their capability. They’re limited by the meatsack in the pilot's seat that passes the fuck out beyond 10G.

        Future Fighters will need to be FAR more nimble than the previous human-limited history. Therefore, “learning” from that history will be dead wrong for AI because it will be limited well below the capacity for an AI p

        • Imagine limiting modern computers to things like blood pressure, heart rate, and age. You would be labeled a moron if you decided that should be the metric by which computers should “learn” how to function.

          No, you are the moron.

          Why? Because AI needs to exist alongside those that DO have those limitations. One of the key bits of that is understanding their limits, and oftentimes (especially for those idiots who need to learn the hard way) that means living under those same circumstances.

          Such limits also mean it's easier to control them. As the same weaknesses / limitations apply, so do the punishments that target them. An AI that needs to worry about biological infections is much more likely to think

          • I'll also add that if a human feels no need to risk themselves to destroy an enemy, then that enemy should be left alone. The ability to avoid them completely is clearly on the table, given that said human wants to send in the drones to do their dirty work for them.

            There is no need for an AI controlled UAV running around at 20G. It's desired by a bunch of fools who increasingly see themselves as untouchable Gods and everyone else as expendable tools.
        • What is the threat that will require such nimbleness?

      • And that approximation scares me - because it's based on weights.
        I've been watching Mentour Pilot's YouTube channel for years. Many of those accidents (some of them horrific) were directly or indirectly caused by software implementations in the fly-by-wire logic, and there we are talking about much simpler systems.

        If firing a guided rocket at an enemy target in certain conditions yields a 98.3% success rate, and in a combat situation the AI is put in that situation, only there's a civilian aircraft behind th

  • And these airplanes will fall from the sky like bricks thrown in the water.
    • And these airplanes will fall from the sky like bricks thrown in the water.

      If your EW capabilities are strong enough to knock out the AI in the airplane then they're also strong enough to knock out the rest of the electronics as well.

      I don't know much about F16s, but I'm guessing the pilot is ejecting and the plane is once more, falling like a brick.

      • by ceoyoyo ( 59147 )

        The F16 is unstable in pitch at transonic and supersonic speeds, so unless you're flying slow you better eject quick or you probably won't be able to.

    • by Luckyo ( 1726890 )

      The F-16 is fully fly-by-wire. The pilot doesn't actually fly it; instead, the pilot tells the computer what they want the plane to do, and the computer flies it. If you have EW that fries electronics, all F-16s would be vulnerable.

  • by battingly ( 5065477 ) on Sunday May 05, 2024 @01:57PM (#64449644)
    Sure, turn over the decision to launch weapons to AI. What could possibly go wrong?
    • Judging by the drivers I encounter daily on the roadways, I'm thinking AI would make fewer mistakes than humans do.

  • by jacks smirking reven ( 909048 ) on Sunday May 05, 2024 @02:02PM (#64449648)

    I saw that Evangelion documentary and replacing the human pilot with a machine did not have great results...

  • Since the AI fighters can be 30% the size of a manned fighter, we could make smaller/lighter/faster/cheaper/more maneuverable carriers. We may even be able to make air carriers that bring these into the battle by air as we do with missiles.
    • Since the AI fighters can be 30% the size of a manned fighter, we could make smaller/lighter/faster/cheaper/more maneuverable carriers. We may even be able to make air carriers that bring these into the battle by air as we do with missiles.

      The human cockpit doesn’t comprise 70% of the aircraft. Not sure where you’re getting that 30% figure from, but unmanned drones aren’t actually that much smaller.

      Fighter jets have traditionally been a weapon in the sky armed with thousands of pounds of missiles/bombs that happens to have a cockpit for operational sake. The A-10 Warthog wasn’t built around the pilot. It was built around the massive gun inside. I highly doubt an AI-enabled Warthog would look much

  • It was about an AI that went wrong--with a fighter. Humanity is setting itself up for some painful mistakes.
  • Now some Russian, Chinese or North Korean hackers need only to hack these planes...

  • by evanh ( 627108 ) on Sunday May 05, 2024 @03:00PM (#64449756)

    As the old saying goes, it's happiness that needs the effort.

    Killing is easy; it's the protecting of non-combatants that'll be hard. Target selection will be a dog's breakfast. Human operators will give the green light to countless false positives simply because they are so far removed from reality.

  • by geekmux ( 1040042 ) on Sunday May 05, 2024 @03:35PM (#64449854)

    ”Kendall said there will always be human oversight in the system when weapons are used," the article notes.

    That's complete bullshit.

    One of the main reasons the US Military feels they need AI solutions here, is because of the “threat” of other countries deploying AI-controlled fighter jets too. And the only way to respond and react to an AI-controlled armed threat, is to fight fire with fire.

    Those claiming otherwise know this, which is why they need to be called out on this blatant lie. At some point the liability and vulnerability will be the slow-ass human burdened with those silly morals and ethics interfering with decision making.

    • Running a war like it is Starcraft is human oversight of a sort...
    • One of the main reasons the US Military feels they need AI solutions here, is because of the "threat" of other countries deploying AI-controlled fighter jets too.

      Maybe, but the real reason is that AI will not say, "no".

  • Is it really a good idea to put the Secretary of the Air Force into the cockpit of a fighter jet being run by a beta-level AI?

    Looking here - https://en.wikipedia.org/wiki/... [wikipedia.org] - I don't see that he has piloting experience.

    Exactly how would the Administration explain it if the jet crashed (assuming he was killed or injured)?

    • by cstacy ( 534252 )

      Is it really a good idea to put the Secretary of the Air Force into the cockpit of a fighter jet being run by a beta-level AI?

      How do we know that really happened, and isn't just propaganda? Our AI is so far ahead of what the Chinese and Russians have, that you better stop making trouble or thinking we can't defend Ukraine and Taiwan (without even risking pilots)!

      SecAF seen going into hangar, someone in flight suit who kinda has the same build gets into jet, ... The whole thing is classified though, so that part of the show is just for potential leakers who were cleared to directly be part of the operation. Demo works good, SecAF d

    • by Samare ( 2779329 ) on Sunday May 05, 2024 @04:37PM (#64449982)

      I'm guessing it was an F-16B.
      "The F-16B is a two-seat version typically used for training by a student pilot with an instructor pilot in the rear cockpit".
      From the article: "And riding in the front seat was the U.S. Secretary of the Air Force...".

    • It is a 2 seat variant with a human pilot on board to take over if something goes wrong.
  • Are they testing it in recreations of known friendly-fire incidents? It's easy for AI to correctly identify targets in a standard dogfight scenario where there's a clear distinction between friendly and hostile, another matter entirely to make the right decision when you have a mix of friendly, hostile, neutral and unknown targets with overlapping signatures (some of the unknowns may be friendly/neutral but flying the same type of aircraft as hostiles), intermittent contact because of terrain and/or jammin

    • "US Killbots and UK Killbots had a 10% FF on mission while successfully eliminating the Iranian and Russian Killbots" is probably as acceptable as munitions missing their targets. In much the same way as the current conflict in Ukraine has become a bit of 'who can manufacture the most artillery shells' there'll likely be a component of maintaining 'drone superiority' before more traditional air superiority can be achieved. Water drones are why the Russian navy has largely withdrawn, larger human occupied
  • ... human oversight in the system ...

    While the military talks about the thousands of hours they spend teaching aircrews to identify and engage, their own records reveal that orders can be little more than "shoot everyone". The demand to have a robot do that and do it so much faster, is inevitable.

    ... cheaper AI-controlled unmanned jets.

    It's interesting we don't already have drones shooting drones: The USA attacks only technologically inferior enemies. The US politicians are getting hot and wet over how much of the war they can take to the enemy and forgetting an industrialized co

  • But if the Sec of the Air Force put his life on the line with no backup human pilot in the cockpit, that is saying something. Even if there was a pilot ready to take over, a major glitch in the AI could have killed the Secretary before the backup pilot could regain control, which again says it must be working very, very well.
  • There will always be human oversight in the system when weapons are used

    I'm guessing that the term "oversight" is pretty malleable. I could imagine that ballistic projectiles like bullets would need to be closely linked to the maneuvering and eventually the oversight will simply be uploading a set of terms of engagement. Either that or overseeing it doing whatever it wants

  • Shall we play a game?
  • One of the primary advantages of fighter aircraft with no pilot is that they can execute maneuvers that would kill a pilot due to g-forces. I wonder if they tried any of those in the test?

    No? Then this was just kabuki.
