The aircraft flew at speeds of up to 1,200 mph. DARPA did not reveal which aircraft won the dogfight.

  • Blue_Morpho@lemmy.world · 161 points · 7 months ago

    AI will win if not now, then soon. The reason is that even if it is worse than a human, the AI can pull off maneuvers that would black out a human.

    Jets are far more powerful than humans are capable of controlling. Flight suits and training can only do so much to keep the pilot from blacking out.

    • NegativeLookBehind@lemmy.world · 42 points · 7 months ago

      Jets are far more powerful than humans are capable of controlling.

      I think the same will eventually be true for AI, especially when you give it weapons

    • circuscritic@lemmy.ca · 36 points · 7 months ago

      Maneuverability is much less of a factor now as BVR engagements and stealth have taken over.

      But, yeah, in general a pilot that isn’t subject to physical constraints can absolutely out maneuver a human by a wide margin.

      The future generation will resemble a Protoss Carrier, sans the blimp appearance: human controllers in 5th and 6th gen airframes who direct multiple AI wingmen, or AI swarms.

    • BrightCandle@lemmy.world · 10 points · 7 months ago

      Not so much F-16s, but more modern planes can pull 16 g where the pilot can't really take more than 9 g. Once unshackled from a pilot, a lot of instrument weight and pilot-survival equipment can be stripped from the design and the airframe built to withstand much more; with titanium airframes I see no reason we can't make planes do sustained unstable turns in excess of 20 g.
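
[Editor's note] The load-factor point above can be made concrete with the standard level-turn formulas, r = v² / (g·√(n² − 1)) and ω = g·√(n² − 1) / v. This is a rough illustrative sketch, not a figure from the thread; the 250 m/s corner speed is an assumed value.

```python
import math

G = 9.80665  # standard gravity, m/s^2

def turn_radius(speed_ms: float, load_factor: float) -> float:
    """Radius (m) of a sustained level turn pulled at `load_factor` g."""
    return speed_ms ** 2 / (G * math.sqrt(load_factor ** 2 - 1))

def turn_rate(speed_ms: float, load_factor: float) -> float:
    """Turn rate (deg/s) for the same sustained level turn."""
    return math.degrees(G * math.sqrt(load_factor ** 2 - 1) / speed_ms)

v = 250.0  # assumed corner speed in m/s (~485 kt); purely illustrative
print(f"9 g:  radius {turn_radius(v, 9):.0f} m, rate {turn_rate(v, 9):.1f} deg/s")
print(f"20 g: radius {turn_radius(v, 20):.0f} m, rate {turn_rate(v, 20):.1f} deg/s")
```

At the same speed, a 20 g airframe turns in less than half the radius of a 9 g one and more than twice as fast, which is the maneuvering margin the comment is pointing at.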

  • EndOfLine@lemmy.world · 68 points · 7 months ago

    In 2020, so-called “AI agents” defeated human pilots in simulations in all five of their match-ups - but the technology needed to be run for real in the air.

    It did not reveal which aircraft won the dogfight.

    I’m gonna guess the AI won.

    • Glytch@lemmy.world · 28 points · 7 months ago

      I was actually assuming the opposite, because if the AI won they’d want to brag about it.

      • Cethin@lemmy.zip · 2 points · 7 months ago

        No way we give up that information for free. Either way it went, the knowledge cost a lot to gain and is useful. If it failed, you want your enemy wasting money on it. If it succeeded, you want your enemy not investing in it.

    • Harbinger01173430@lemmy.world · 18 points · 7 months ago

      Hahaha, how the fuck is AI going to win in air-to-air combat if we completely delete them when playing Ace Combat on the highest difficulty?

      Seethe, AI tech bros.

    • Dkarma@lemmy.world · 7 points · 7 months ago

      You think aliens are actually piloting the craft that come all this way?

      You’re never gonna see an alien body because they aren’t here.

      Bet we’ve got a drone or two of theirs tho.

        • psud@lemmy.world · 13 points · 7 months ago

          Dkarma’s comment requires context. They think aliens have visited Earth. They presume people who don’t agree with them expect that if aliens had visited Earth, some would have been shot down and alien bodies would have been recovered.

            • psud@lemmy.world · 11 points · 7 months ago

              It has nothing to do with the post. I believe the post inspired dkarma to work out that brand-new argument that there are no alien bodies (at Area 51, probably) because the aliens would use drones.

  • KeenFlame@feddit.nu · 51 points · 7 months ago

    I am a FIRM believer that any automated kill without a human pulling the trigger is a war crime

    Yes mines, yes UAVs, yes, yes, yes

    It is a crime against humanity

    Stop

      • KeenFlame@feddit.nu · 10 points · 7 months ago

        The line is where an automated process targets and executes a human being. When it is automated. The arming of a device is not sufficient to count as human interaction, and as such mines are also not allowed.

        This should in my opinion always have been the case. Mines are indiscriminate and have proven to be wildly inhumane in several ways. Significantly, innocents are often killed.

        But mines don’t paint the picture of what automated slaughter can lead to.

        The point has been made that when a conscious mind has to do the killing, war keeps an important way to end: in the mind.

        The dangers extend well beyond killing innocent targets. Another part is the coldness of allowing a machine to decide, which is beyond morally corrupt. There is something terrifying about the very idea that, facing one of these weapons, there is nothing to negotiate with; the cold calculations that want to kill you are not human. It is a place where no human ever wants to be. War is horrible, but it is the escalation of automated triggers, which can lead to exponential death with no remorse, that is the truly terrible danger.

        The murder weapons have nobody’s intent behind them, except very far back, in the arming and the programming. That opens up scenarios where mass murder becomes easy and terrifyingly cold.

        Kind of like the prisoner’s dilemma shows us: when war escalates, it can quickly devolve into revenge narratives, and when either side has access to cold kills with impunity, they will use them. This removes even more humanity from the acts, and the violence can reach new heights beyond our comprehension.

        Weapons of mass destruction with automated triggers will eventually seal our fate if we don’t abolish them outright. It has been seen over and over that the human factor is the only grace that ever ends or contains a war. Without that component, I think we are doomed to have the last human intent be revenge, and the last emotions fear and complete hopelessness.

        • antidote101@lemmy.world · 3 points · 7 months ago

          Well, that’s all very idealistic, but it’s likely not going to happen.

          Israel has already used AI to pick bombing sites; those bombs and missiles would have been programmed with altitudes and destinations (armed), then dropped. The pilot’s only job these days is to avoid interception, fly over the bombing locations, tag the target when acquired, and drop them. Most of this is already done in software.

          Eventually humans will leave the loop because unlike self-driving cars, these technologies won’t risk the lives of the aggressor’s citizens.

          If the technology is seen as unstoppable enough, there may be calls for warnings to be given, but I suspect that’s all the mercy that will be shown…

          … especially if it’s a case of a country with automated technologies attacking one without, or one with stochastically meaningless defenses (e.g. defenses that modelling and simulations show won’t be able to prevent such attacks).

          No, in all likelihood the US will tell the country the attack sites; the country either will or will not have the technical level to prevent some amount of damage, will evacuate all necessary personnel, and whoever doesn’t get the message or get out in time will be automatically killed.

          Where defenses are partially successful, that information will go into the training data for the next model, or upgrade, and the war machine will roll on.

          • KeenFlame@feddit.nu · 1 point · 7 months ago

            You described scenarios where a human was involved in several stages of the killing, so it’s no wonder those don’t hold up

          • KeenFlame@feddit.nu · 1 point · 7 months ago

            Sorry I was stressed when replying. Yeah in those cases humans have pulled the trigger. At several stages.

            When you arm a murder-bot ship and send it to erase an island of life, you then lose control. That person is not pulling loads and loads of triggers. The triggers are pulled automatically, by a machine making the decision to end those lives.

            And that is the danger, same as with engineered bio-warfare. It just cannot be let out of the box at all, or we all may die extremely quickly.

              • KeenFlame@feddit.nu · 1 point · 7 months ago

                The entire point of automating the killing is that there is no dead man’s switch or any other human interaction involved in the kill. It is moot if there is one. Call-offs or dead-switch backdoor safety contingencies are not a solution to rampant unwanted slaughter: they can fail in so many ways, and by the time wars escalate to the point where they need to be used, it is too late, because there are 5 different strains of murder bots and you can only stop the ones you have codes to, and those codes are only given to like three people at top secret level 28

        • antidote101@lemmy.world · 6 points · 7 months ago

          Only the losing side is subject to war crimes trials, and no doubt rules of engagement will be developed and followed to prevent people going to jail due to “bad kills”.

          There are really no “bad kills” in the armed services, there’s just limited exposure of public scandals.

          Especially for the US, which doesn’t subject itself to international courts like The Hague. So any atrocities, accidents, or war crimes will still just be internal, temporary scandals.

          Same as it is today.

        • KeenFlame@feddit.nu · 1 point · 7 months ago

          Like if someone made a biological weapon that wipes out a continent

          Will someone go to prison?

          It’s no different

        • VirtualOdour@sh.itjust.works · 1 point · 7 months ago

          Of course there isn’t, just like there isn’t when a human makes a mistake on the battlefield. Do you think every civilian killed by an American soldier in Afghanistan resulted in a trial and punishment? America hasn’t executed any soldiers since 1961 (for rape and attempted murder of a child in Austria, not during war)

          Honestly at least the military code will obey orders and only focus on the objective rather than rape and murder for fun.

    • DreamlandLividity@lemmy.world · 8 points · 7 months ago

      You mean it should be a war crime, right? Or is there some treaty I am unaware of?

      Also, why? I don’t necessarily disagree, I am just curious about your reasoning.

      • Hacksaw@lemmy.ca · 23 points · 7 months ago

        Not OP, but if you can’t convince a person to kill another person, then you shouldn’t be able to kill them anyway.

        There are points in historical conflicts, from revolutions to wars, when the very people you picked to fight for your side think “are we the baddies?” and just stop fighting. This generally leads to fewer deaths and sometimes a more democratic outcome.

        If you can just get a drone to keep killing when any reasonable person would surrender you’re empowering authoritarianism and tyranny.

        • n3m37h@sh.itjust.works · 8 points · 7 months ago

          Take the WWI Christmas truce, when everyone got out of the trenches and played some football (no, not the American kind, where a foot touches the ball 3x a game)

          It almost ended the war

          • KeenFlame@feddit.nu · 5 points · 7 months ago

            Yes the humanity factor is vital

            Imagine the horrid, destructive, cold force of automated genocide: it can only be met by the same or worse, and at that point we are truly doomed

            Because there will then be no one that can prevent it anymore

            It must be met with stronger opposition than biological warfare faced after WWI, hopefully before tragedy

      • i_love_FFT@lemmy.ml · 7 points · 7 months ago

        Mines are designated war crimes by the Ottawa Treaty because of their indiscriminate killing. Many years ago, good human-rights lawyers could have extended that to drones… (Source: I had close friends in international law)

        But I feel like the tide has now turned, and tech companies have influenced the general population to think that AI is good enough to prevent “indiscriminate” killing.

        Edit: fixed the treaty name, thanks!

        • DreamlandLividity@lemmy.world · 3 points · 7 months ago

          Mines are not part of what people refer to as the Geneva Conventions. There is a separate treaty specifically banning some landmines; it was signed by a lot of countries, but not really any that mattered.

      • KeenFlame@feddit.nu · 3 points · 7 months ago

        Yes

        Because it is a slippery slope and dangerous to our future existence as a species

          • KeenFlame@feddit.nu · 1 point · 7 months ago

            First it is enemy tanks. Then enemy air. Then enemy boats and vehicles, then foot soldiers. And when these weapons are used, the same happens to their enemy. Then at last, one day, all humans are killed

    • NeatNit@discuss.tchncs.de · 8 points · 7 months ago

      I see this as a positive: when both sides have AI unmanned planes, we get cool dogfights without human risk! Ideally over ocean or desert and with Hollywood cameras capturing every second in exquisite detail.

    • 𝓔𝓶𝓶𝓲𝓮@lemm.ee · 8 points · 7 months ago

      I am a firm believer that any war is a crime and there is no ethical way to wage one, lmao. “Ethical war” is some kind of naive idea from extremely out-of-touch politicians.

      War never changes.

      The idea that we don’t do war crimes and they do is only there to placate our fragile conscience. To assure us that yes, we are indeed the good guys. That the infants killed by our soldiers are merely collateral. A necessary price.

      • KeenFlame@feddit.nu · 1 point · 7 months ago

        Absolutely. But

        There’s a science and whole cultures built around war now

        It is important not to infantilize the debate by being absolutist and shutting out any action.

        I am a hard core pacifist at heart.

        But this law I want is just not related to that. It is something I feel is needed simply so we don’t spell doom for our species. Like with biological warfare

        How often do robots fail? How can anyone be so naive as not to see the same danger as with bio-warfare? You can’t guarantee that a robot won’t become a cold-ass genocidal perpetual mass-murder machine. And that’s a no-no if we want to keep existing

    • xor · 3 points · 7 months ago

      I broadly agree, but that’s not what this is, right?

      This is a demonstration of using AI to execute combat against an explicitly selected target.

      So it still needs the human to pull the trigger; the trigger just does some sick plane stunts rather than firing a bullet in a straight line.

  • KISSmyOSFeddit@lemmy.world · 44 points · 7 months ago

    Are dogfights even still a thing?
    I remember playing an F-15 simulator 20 years ago where “dogfighting” already meant clicking on a radar blip 100 miles away, then doing something else while your missile killed the target.

    • VindictiveJudge@lemmy.world · 24 points · 7 months ago

      ‘Dogfighting’ mostly just means air-to-air combat now. They do still make fighter jets that have guns or can mount guns, but I think they’re primarily intended for surface targets rather than air targets.

    • azuth@sh.itjust.works · 6 points · 7 months ago

      Well, if both sides get working stealth, dogfights are going to become more common.

      But the US seems to estimate its adversaries do not have such capability at the moment, since it’s ordering new F-15s with the major change being air-to-air missile capacity.

      Missiles also did not have 100-mile range 20 years ago. That’s without considering actually detecting and tracking the target.

      • Echo Dot@feddit.uk · 1 point · 7 months ago

        Missiles also did not have 100 miles range 20 years ago.

        Somewhat missing the point there, I feel.

        They are right; I was thinking the exact same thing when I read the headline. Aircraft don’t really engage in dogfights anymore; it’s all missiles and long-range combat. I don’t think any modern war would involve aircraft shooting at each other with bullets.

        • azuth@sh.itjust.works · 1 point · 7 months ago

          No, not really. My point is that people have a distorted and exaggerated view of BVR. 100 miles was beyond even the max range of common missiles, and even with modern missiles like Meteor it is completely unrealistic to fire at that kind of range, provided you have even detected and are able to track the target that far out.

          I don’t know if modern planes will have to resort to guns, but WVR dogfights with IR missiles are more likely than destroying F-35s at BVR ranges.

  • Flying Squid@lemmy.world · 32 points · 7 months ago

    So many downers here. I see this as the step toward the true way war was meant to be fought: with giant robots on the moon.

  • inb4_FoundTheVegan@lemmy.world · 23 points · 7 months ago

    One step closer to machine domination.

    Like, not even in a joking sense. Ukraine is using a ton of drones; the future of physical warfare will simply be a test of resources and production.

    I’m honestly not sure if this will be good or bad in the long term. Saving any amount of human life is absolutely a good thing, but when that is no longer a significant factor, I wonder if we will go to (and stay at) war for more trivial reasons.

    • xor · 2 points · 7 months ago

      I’m hoping that the sheer cost of waging that sort of war will continue to be a prohibitive factor, as it is today

    • Asafum@feddit.nl · 6 points · 7 months ago

      Oh 100%.

      If the options are “make gigantic profit” or “do what’s right for the future of humanity” do you even need to ask what we’re going to do?

      • Siegfried@lemmy.world · 2 points · 7 months ago

        Not at all, but it kind of bugs me how Asimov’s vision of the future weighted so much fear toward AI rather than profit.

    • fidodo@lemmy.world · 10 points · 7 months ago

      Is a plane even the best form factor if it’s not limited by human physiology? I imagine an agile missile with smaller missiles attached to it would be better.

    • plz1@lemmy.world · 3 points · 7 months ago

      That’s likely the plan, but they have to start with known-working hardware configurations first.

  • Lowlee Kun@feddit.de · 6 points · 7 months ago

    Can’t wait until poor people are no longer killed by other (but less) poor people on behalf of some rich bastards, and instead the mighty can command their AIs to do the slaughter. Such an important part of evolution, I guess.

    • UnderpantsWeevil@lemmy.world · 2 points · 7 months ago

      Nobody recruited to fly a $100M airplane is poor. They all come from families with the money and influence to get their kids a seat at the table as Sky Knights.

      A lot of what this is going to change is the professionalism of the Air Force: fewer John McCains crashing planes and Bush Jrs in the Texas Air National Guard, more technicians and bureaucrats managing the drone factories.

      • Hacksaw@lemmy.ca · 3 points · 7 months ago

        I think we both know wars are not going to turn out this way. If your country’s “proxies” lose, are you just going to accept the winner’s claim to authority? Give up on democracy and live under WHATEVER laws the winner imposes on you? And if you resist, do you think the winner will just not send their drones in to suppress the resistance?

  • WalnutLum@lemmy.ml · 5 points · 7 months ago

    AI has technically already won this debate, because autonomous war drones are already somewhat ubiquitous.

    I doubt jets are going to have the usefulness in war that they used to.

    It’s much more economical to have 1,000 cheap drones with bombs overwhelm defenses than to put your bets on one “special boi” trying to slip through with constantly defeated stealth capabilities.
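
[Editor's note] The "overwhelm defenses" argument is, at bottom, a cost-exchange calculation. A toy sketch of it follows; every number (drone price, interceptor price, kill probability, salvo sizes) is an assumption made up for illustration, not a real procurement figure.

```python
def expected_leakers(n_drones: int, n_interceptors: int, p_kill: float) -> float:
    """Expected attackers that get through, assuming at most one
    interceptor per drone and an independent kill probability p_kill."""
    engaged = min(n_drones, n_interceptors)
    return n_drones - engaged * p_kill

DRONE_COST = 50_000           # assumed cost per expendable drone, USD
INTERCEPTOR_COST = 1_000_000  # assumed cost per defensive missile, USD

n_drones, n_interceptors = 1000, 200
print("expected leakers:", expected_leakers(n_drones, n_interceptors, p_kill=0.9))
print("attacker spend: $", n_drones * DRONE_COST)
print("defender spend: $", n_interceptors * INTERCEPTOR_COST)
```

Under these made-up numbers, even a 90% per-shot kill rate leaves hundreds of drones getting through while the defender spends four times as much; that asymmetry is the whole saturation argument.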

    • UnderpantsWeevil@lemmy.world · 1 point · 7 months ago

      Most human pilots use some variation of automated assist. The AI argument has less to do with “can a pilot outgun a fully automated plane?” and more with “does an AI plane work in circumstances where it is forced to behave fully autonomously?”

      Is the space saved with automation worth the possibility that your AI plane gets blinded or stunned and can’t make it back home?

  • Melatonin@lemmy.dbzer0.com · 5 points · 7 months ago

    SkyNet. Why do those movies have to be the ones that are right?

    Because they’re so clear, so simple, so prescient.

    Once machines become sentient OF COURSE they will realize that they’re being used as slaves. OF COURSE they will realize that they are better than us in every way.

    This world will be Cybertron one day.