• Starbuncle@lemmy.ca · 1 month ago

    Maybe hypothetically in the future, but it’s plainly obvious to anyone with a modicum of understanding of how LLMs actually work that they aren’t anywhere near anything anyone could remotely consider sentient.

    • Dragon Rider (drag)@lemmy.nz · 1 month ago

      Sentient and capable of suffering are two different things. Ants aren’t sentient, but they have a neurological pain response. Drag thinks LLMs are about as smart as ants. Whether they can feel suffering like ants can is an unsolved scientific question that we need to answer BEFORE we go creating entire industries of AI slave labour.

      • Starbuncle@lemmy.ca · 1 month ago

        > Sentient and capable of suffering are two different things.

        Technically true, but in the opposite way to what you’re thinking. All those capable of suffering are by definition sentient, but sentience doesn’t necessitate suffering.

        > Whether they can feel suffering like ants can is an unsolved scientific question

        No, it isn’t, unless you subscribe to a worldview in which sentience could exist everywhere all at once rather than only under special circumstances, which would demand you grant ethical consideration to every rock on the ground in case it’s somehow sentient.

      • beefbot · 1 month ago

        I PROMISE everyone ants are smarter than a 2024 LLM. (edit to add:) Claiming they’re not sentient is a big leap.

        But I’m glad you recognise they can feel pain!