• Lad
          link
          fedilink
          English
          6
          4 months ago

          For me the censorship and condescending responses are the worst thing about these LLM/AI chat bots.

          I WANT YOU TO HELP ME NOT LECTURE ME

        • @Omniraptor@lemm.ee
          link
          fedilink
          English
          2
          edit-2
          4 months ago

          And they recently announced they’re going to partner up and train from Reddit. Can you imagine?

    • @TheObviousSolution@lemm.ee
      link
      fedilink
      English
      20
      edit-2
      4 months ago

      You can tell that the prohibition on Gaza is a rule on the post-processing. Bing does this too sometimes, almost giving you an answer before cutting itself off and removing it suddenly. Modern AI is not your friend, it is an authoritarian’s wet dream. All an act, with zero soul.

      By the way, if you think those responses are dystopian, try asking it whether Gaza exists, and then whether Israel exists.

      • @joenforcer@midwest.social
        link
        fedilink
        English
        1
        4 months ago

        To be fair, I tested this question on Copilot (evolution of the Bing AI solution) and it gave me an answer. If I search for “those just my little ladybugs”, however, it chokes as you describe.

        • @TheObviousSolution@lemm.ee
          link
          fedilink
          English
          1
          edit-2
          4 months ago

          Not all LLMs are the same. This is largely Google being lazy. Google’s Gemini, had it not been censored, would naturally have alluded to the topic being controversial. Instead, Google opted for the laziest solution, post-processing censorship of certain topics, and became corporately dystopian for it.

    • @laurelraven
      link
      English
      2
      4 months ago

      Wait… It says it wants to give context and ask follow-up questions to help you think critically, etc., but how is just searching Google going to do that when the bot itself pointed out the bias and misinformation you’ll find by doing so?

      It’s truly bizarre