• PrinceWith999Enemies@lemmy.world
    147 points · 6 months ago

    I was involved in discussions 20-some years ago when we were first exploring the idea of autonomous and semiautonomous weapons systems. The question that really brought it home to me was “When an autonomous weapon targets a school and kills 50 kids, who gets charged with the war crime? The soldier who sent the weapon in, the commander who was responsible for the op, the company who wrote the software, or the programmer who actually coded it up?” That really felt like a grounding question.

    As we now know, the actual answer is “Nobody.”

    • lurch (he/him)@sh.itjust.works
      59 points · 6 months ago

      actually, it depends on who you ask. some will say it’s the kids’ fault for being so shootable. some will blame it on trans kids or gay people or immigrants. some will blame the libs, some capitalism, the devil, or nonbelievers, and others even the president. someone may end up being charged with a war crime, but it’s gonna be entirely random.

      • Sanctus@lemmy.world
        20 points · 6 months ago (edited)

        Anybody but the weapons manufacturers and investors. I think I heard a popular show say recently “Everything is a product…The end of the fucking world is a product.” The only responsibility any of these people feel is toward their stock prices.

      • IninewCrow@lemmy.ca
        7 points · 6 months ago

        It’s all based on geography.

        If the school is located in a mineral-rich area or over an underground oil field, then that wasn’t a school, it was a military base, and those weren’t students, they were terrorists.

        If the school is located in an area that lacks any natural wealth, then the robots have become autonomous and acted without control by anyone. It was an accident.

      • mPony@lemmy.world
        2 points · 6 months ago

        Most of those responses are obviously in bad faith, though. How have we gotten to a point where we feel compelled to respond to bad faith at all?

    • VelvetStorm@lemmy.world
      17 points · 6 months ago

      We don’t even charge people when they blow up schools and hospitals with drone strikes now. Why would this be any different?

    • Sidyctism@feddit.de
      15 points · 6 months ago

      To be fair, the answer to the question “when somebody kills a school bus of kids, who gets charged with a war crime?” was always “nobody.”

    • masquenox@lemmy.world
      14 points · 6 months ago

      As we now know, the actual answer is ~~“Nobody.”~~ the 50 kids who get designated as “terrorists” afterwards.

      FTFY - it’s the American way.

    • UnderpantsWeevil@lemmy.world
      10 points · 6 months ago

      When an autonomous weapon targets a school and kills 50 kids, who gets charged with the war crime?

      When a human in a plane drops a bomb on a school full of kids, we don’t charge anyone with a war crime. Why would we start charging people with war crimes when we make the plane pilotless?

      The autonomy of these killer toys is always overstated. As front-line trigger pullers, they’re great. But they still need an enormous support staff, deployment team, and IT support. If you want to blame someone for releasing a killer robot into a crowd of civilians, it’s not like you have a shortage of people to indict. No different than trying to figure out who takes the blame for throwing a grenade into a movie theater. Everyone from the mission commander down to the guy who drops a Kill marker on the digital map has the potential for indictment.

      But nobody is going to be indicted in a mission where the goal was to blow up a school full of children, because why would you do that? The whole point was to murder those kids.

      Israelis already have an AI-powered target-to-kill system, after all.

      But in 2021, the Jerusalem Post reported an intelligence official saying Israel had just won its first “AI war” – an earlier conflict with Hamas – using a number of machine learning systems to sift through data and produce targets. In the same year a book called The Human–Machine Team, which outlined a vision of AI-powered warfare, was published under a pseudonym by an author recently revealed to be the head of a key Israeli clandestine intelligence unit.

      Last year, another +972 report said Israel also uses an AI system called Habsora to identify potential militant buildings and facilities to bomb. According to the report, Habsora generates targets “almost automatically”, and one former intelligence officer described it as “a mass assassination factory”.

      The recent +972 report also claims a third system, called Where’s Daddy?, monitors targets identified by Lavender and alerts the military when they return home, often to their family.

      Literally the entire point of this system is to kill whole families.

    • BigFig@lemmy.world
      7 points · 6 months ago

      To be fair, they are specifically testing AI-aimed, not AI-fired, weapons. Firing is still up to an operator.