• xantoxis@lemmy.world · 76 points · 26 days ago (edited)

      Is it? If ChatGPT wrote your paper, why would citations of the work of Frankie Hawkes raise any red flags unless you happened to see this specific tweet? You’d just see ChatGPT filled in some research by someone you hadn’t heard of. Whatever, turn it in. Proofreading anything you turn in is obviously a good idea, but it’s not going to reveal that you fell into a trap here.

      If you went so far as to learn who Frankie Hawkes is supposed to be, you’d probably find out he’s irrelevant to this course of study and doesn’t have any citeable works on the subject. But then, if you were doing that work, you aren’t using ChatGPT in the first place. And that goes well beyond “proofreading”.

      • And009@reddthat.com · 15 points · 26 days ago

        This should be okay to do. Understanding the material and being able to process information is foundational.

    • yamanii@lemmy.world · 75 points · 26 days ago

      There are professional cheaters and there are lazy ones; this is gonna catch the lazy ones.

      • MalditoBarbudo@programming.dev · 32 points · 26 days ago

        I wouldn’t call the students who carefully proofread the output “professional cheaters”. People using ChatGPT and then proofreading the content and bibliography are using it as a tool, like any other (Wikipedia, related papers…), so they are not cheating. This hack is aimed at the real cheaters, the ones who feed ChatGPT the assignment and turn in whatever hallucination it gives them without checking anything else.

    • abbadon420@lemm.ee · 28 points · 26 days ago

      But that’s fine then. It shows that you at least know enough about the topic to realise that those references don’t belong there. Otherwise you could proofread it and see nothing wrong with the references.

    • psud@aussie.zone · 13 points · 26 days ago (edited)

      LLMs can’t cite. They don’t know what a citation is, other than a collection of text in a specific style.

      Even if you were lucky enough to get real sources out of an LLM, you’d be lucky if the number of references even equalled the number of cited items.

      If the student is clever enough to remove the trap reference, the fact that the other references won’t be in the university library should be enough to sink the paper.
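The mismatch described above is mechanically checkable. A hedged sketch: cross-reference in-text citations against the reference list and flag any citation with no backing entry. The (Author, Year) citation style, the `citation_mismatches` name, and the sample text are illustrative assumptions, not part of any real grading tool.

```python
import re

def citation_mismatches(body, references):
    """Return in-text (Author, Year) citations with no matching reference entry.

    Assumes parenthetical author-year citations and a reference list where
    each entry contains the author surname followed by a four-digit year.
    """
    # All (Surname, Year) pairs cited in the body text.
    cited = set(re.findall(r"\(([A-Z][a-z]+), (\d{4})\)", body))
    # (Surname, Year) pairs recoverable from the reference list entries.
    listed = {
        (m.group(1), m.group(2))
        for ref in references
        for m in [re.search(r"([A-Z][a-z]+)[^\d]*(\d{4})", ref)]
        if m
    }
    return cited - listed

body = "Prior work (Hawkes, 2018) disagrees with (Smith, 2020)."
refs = ["Smith, J. (2020). A real paper. Real Journal."]
print(citation_mismatches(body, refs))  # the Hawkes citation has no backing entry
```

A check this crude obviously misses hallucinated-but-well-formed entries; it only catches citations with no reference at all, which is the count mismatch the comment describes.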

      • auzy@lemmy.world · 18 points · 26 days ago (edited)

        They can. There was that court case where the cited cases were made up by ChatGPT. Upon investigation it was discovered it was all hallucinated, and the lawyer got into deep crap.

      • uis@lemm.ee · 8 points · 25 days ago (edited)

        > LLMs can’t cite. They don’t know what a citation is other than a collection of text of a specific style

        LLMs can cite. It’s called Retrieval-Augmented Generation: basically an LLM that can do information retrieval, which is just the academic term for search engines.

        > You’d be lucky if the number of references equalled the number of referenced items even if you were lucky enough to get real sources out of an LLM

        You can just print the retrieval logs into the references. Well, kinda stretching the definition of “just”.
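The "print the retrieval logs into the references" idea can be sketched minimally: retrieve documents first, then emit exactly those document IDs as the reference list, so every citation is backed by something that actually exists in the corpus. The toy corpus, the keyword-overlap scoring, and the stubbed "answer" step are all illustrative assumptions standing in for a real LLM pipeline.

```python
def retrieve(query, corpus, k=2):
    """Rank documents by naive keyword overlap with the query (toy scoring)."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return [doc_id for doc_id, _ in scored[:k]]

def answer_with_citations(query, corpus):
    """Stub 'generation' step: the retrieval log becomes the reference list."""
    hits = retrieve(query, corpus)
    body = f"Answer based on {len(hits)} retrieved source(s)."
    refs = [f"[{i + 1}] {doc_id}" for i, doc_id in enumerate(hits)]
    return body + "\n\nReferences:\n" + "\n".join(refs)

corpus = {
    "smith2020.pdf": "effects of sleep deprivation on memory",
    "jones2019.pdf": "memory consolidation during sleep",
    "baker2021.pdf": "dietary habits of arctic foxes",
}
print(answer_with_citations("sleep and memory", corpus))
```

Because the references come straight from the retrieval step, they can't name documents outside the dataset; whether the retrieved text actually supports the generated claim is a separate problem, as the replies below this comment point out.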

        • notthebees@reddthat.com · 1 point · 24 days ago

          My question is whether the thing they are citing actually exists and, if it does exist, whether it contains the information it claims.

          • uis@lemm.ee · 1 point · 24 days ago (edited)

            > the thing they are citing actually exists

            In the case of RAG, it exists in the searched dataset.

            > and if it does exist, contains the information it claims.

            Not guaranteed.

          • FutileRecipe@lemmy.world · 1 point · 24 days ago

            Depends. In my experience, it usually does exist. There are hallucinations where GPT makes things up or just misinterprets what it read, but it’s super easy to read the GPT output, look at the cited work, skim it for relevance, then tweak the wording and citations to match.

            If you just copy/paste and take GPT’s word for it without the minimal amount of checking, you’re digging your own grave.