• Dra@lemmy.zip · 1 year ago

    “Semi-autonomous” doesn’t really mean anything here; it’s a deliberately sensationalist headline.

    The key technological discussion is when it’s not a human pulling the trigger.

    Even guns are semi-autonomous by this definition.

    • Kbobabob@lemmy.world · 1 year ago

      The tank-like robot has the ability to transport itself to a preset destination. It can also spot and avoid obstacles by utilizing dozens of sensors and an advanced driving system. Moreover, the platform can self-destruct if it falls into enemy hands.

      I’d have to assume it has some sort of finding/tracking tech as well to stay on target. Trying to compare this to a handgun is just silly.

      • Railcar8095@lemm.ee · 1 year ago

        The thing is, will it select a target and fire without manual intervention? I’m less worried about it moving autonomously than about it killing autonomously. Not that, in this case, I think it will make any difference if a civilian encounters it.

      • Dra@lemmy.zip · 1 year ago

        The CIWS Phalanx has been doing this since the 70s. The headline implies there is something new about this platform’s semi-autonomous nature that hasn’t been used before and is thus noteworthy. A handgun automates much of the labour involved in applying kinetic force to another human being, reducing it to a button press that anyone can do.

        Trying to suggest this is somehow newsworthy, and that they aren’t just fishing for clickbait headlines, is silly.

    • Uriel238 [all pronouns] · 1 year ago (edited)

      Australia already has area-denial sentries that autonomously shoot at any motion (with some parameters regarding size and speed).
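
      Roughly speaking, that “size and speed” gating amounts to a threshold filter. The actual parameters and code aren’t public, so the Python below is only a hypothetical sketch of the idea, with made-up names and numbers:

        # Hypothetical illustration only; the real sentry logic is not public.
        from dataclasses import dataclass

        @dataclass
        class Contact:
            size_m: float     # estimated target size in metres
            speed_mps: float  # estimated target speed in metres per second

        def should_engage(c: Contact,
                          min_size_m: float = 0.5, max_size_m: float = 5.0,
                          min_speed_mps: float = 0.1, max_speed_mps: float = 15.0) -> bool:
            """Engage only contacts inside the configured size/speed envelope."""
            return (min_size_m <= c.size_m <= max_size_m
                    and min_speed_mps <= c.speed_mps <= max_speed_mps)

        print(should_engage(Contact(size_m=1.7, speed_mps=1.4)))  # person walking -> True
        print(should_engage(Contact(size_m=0.2, speed_mps=8.0)))  # small bird -> False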

      These, or similar technology, were used for a while along the Korean DMZ until we started talking about building autonomous drones.

      One of the shot-down airliner incidents (Iran Air Flight 655) involved the misdesignation of a sensor contact by a US Aegis missile cruiser’s system. The plane was pinging with F-4 Phantom radar (which ruled out an ordinary airliner). The Aegis required a human to authorize an attack, but it reported the contact as a bogey (unknown, presumed hostile)…

      — Apparently, I posted this without finishing it. —

      So that instance might be considered the first historical case of an autonomous weapon system accidentally killing a civilian (or at least partially civilian) target, given that the human doing the authorizing had inadequate data to make an informed decision.

      (A lot of cruelty of our systems comes from authorizations based on partial data. Law enforcement in the US is renowned for massaging their warrants to make them easy on the signing magistrate, resulting in kids and dogs slain during SWAT raids in poor neighborhoods. I’m ranting.)