• ℕ𝕖𝕞𝕠
    2 · 1 year ago

    I’m not my body and I’m not my mind. I am the ethical soul, the decision-making process. If the replacement makes all the same decisions I would, it IS me.

    • @Derproid@lemm.ee
      2 · 1 year ago

      The thought experiment assumes a complete and perfect clone of every aspect, both those we understand and those we don’t. The reason the clone is not you is that if I do something to the clone, it does not affect you.

      Like if you take a water bottle and clone it, drinking one does not cause the other to be empty. Thus they must be two separate things.

      • ℕ𝕖𝕞𝕠
        1 · 1 year ago

        If both the original and the clone are identical, then at that moment they are both me, and neither is more valid than the other. That there’s two of me does not invalidate either version. Neither do their divergences going forward.

    • queermunist she/her
      1 · 1 year ago

      What if something like ChatGPT is trained on a dataset of your life and uses that to make the same decisions as you? It doesn’t have a mind, memories, emotions, or even a phenomenal experience of the world. It’s just a large language dataset based on your life, with algorithms to sort out decisions; it’s not even a person.

      Is that you?

      • ℕ𝕖𝕞𝕠
        1 · 1 year ago

        No, because not all my decisions are language-based. As gotchas go, this one’s particularly lazy.

        • queermunist she/her
          1 · 1 year ago

          I’m having a hard time imagining a decision that can’t be language-based.

          You come to a fork in the road and choose to go right. Obviously there was no language involved in that decision, but the decision can certainly be expressed with language, and so a large language model could make the same decision.

          • ℕ𝕖𝕞𝕠
            1 · 1 year ago

            But I don’t make all my decisions linguistically. A model that did would never act as I do.

            • queermunist she/her
              1 · edited · 1 year ago

              It doesn’t matter how it comes to make a decision as long as the outcome is the same.

              Sorry, this is beside the point. Forget ChatGPT.

              What I meant was a set of algorithms that produce the same outputs as your own choices, even though it doesn’t involve any thoughts or feelings or experiences. Not a true intelligence, just an NPC that acts exactly like you act. Imagine this thing exists. Are you saying that this is indistinguishable from you?