• Ragdoll X@lemmy.world · 5 months ago

    I’ve seen some people on Twitter complain that their coworkers use ChatGPT to write emails or summarize text. To me this just echoes the complaints made by previous generations against phones and calculators. There’s a lot of vitriol directed at anyone who isn’t staunchly anti-AI and dares to use a convenient tool that’s available to them.

    • morrowind@lemmy.ml · 5 months ago

      I’m not on Twitter, but frankly the strongest anti-AI sentiment I see often comes from techy places. HN and Lemmy are two of the main ones.

    • Windex007@lemmy.world · edited · 5 months ago

      I think my main issue with that use case is that it’s a “solution” to a relatively minor problem (one that has a far simpler fix), and it actually compounds the problem.

      Let’s say I don’t want to write prose for my email; I have a list of bullet points I want to get across. Awesome, I feed it into the chat gippity and boom, my points are (hopefully) properly represented in prose.

      Now, the recipient doesn’t want to read prose. ESPECIALLY if it’s the fluffy wordy-internet-recipe-preamble that the chat gippity tends to produce. They want a bullet point summary. So they feed it into the chat gippity to get what is (hopefully) a properly condensed bullet point summary.

      So, suddenly we have introduced a fallible middle translation layer for actually no reason.

      Just write the clear bullet point email in the first place. Save everyone the time, and save everyone from the two chances for the chat gippity to fuck it up.