• sem · 3 days ago

    To me that’s so cringe, because I’ve tried it out for explaining concepts, and when I take that information and try to use it, it’s confidently wrong so much of the time.

    The one thing it has helped me with is system administration tasks. When traditional search engine results are old forum threads or out-of-date documentation, an LLM can suggest a way to do the task, and then I can follow those breadcrumbs and do real research into how to do what I need to do.

    • untakenusername@sh.itjust.works · 3 days ago

      I was trying to think of use cases for it. Honestly, if you just want a general overview of a topic, the hallucinations don’t affect it too much.