• dexa_scantron@lemmy.world
    link
    fedilink
    arrow-up
    22
    ·
    6 months ago

    It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It’s anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

    • CileTheSane@lemmy.ca
      link
      fedilink
      arrow-up
      19
      ·
      6 months ago

      It’s also anarchist because it is telling people to stop doing the things they’ve been instructed to do.

    • bdonvr@thelemmy.club
      link
      fedilink
      arrow-up
      4
      ·
      6 months ago

      It’s not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.

    • Smorty [she/her]
      link
      fedilink
      arrow-up
      2
      ·
      edit-2
      6 months ago

      Yeah, that’s what I referred to. I’m aware of DAN and it’s friends, personally I like to use Command R+ for its openness tho. I’m just wondering if that’s the funi in this post.