• sorrybookbroke@sh.itjust.works · 60 points · 1 year ago

    I hate AI, but what you did there was say "it's broken, please fix" with no description of your issue. You deserve nothing.

  • WasPentalive@lemmy.one · 24 points · 1 year ago

    Every time I have asked ChatGPT to code something, it seems to lose the thread halfway through and starts giving nonsensical code. I asked it to do something simple in HP-41C calculator code and it invented functions out of whole cloth.

    • averagedrunk@lemmy.ml · 7 points · 1 year ago

      I asked it for something in PowerShell and it did the same thing. I asked how it came up with that function, and it said the function doesn't exist, but if it did, that's how it would work.

    • CloverSi@lemmy.comfysnug.space · 4 points · 1 year ago

      Quality of output depends a lot on how common the code is in its training data. I would guess it’d be best at something like Python, with its wealth of teaching materials and examples out there.

      • Cethin@lemmy.zip · 6 points · 1 year ago

        It depends on how common the language is and how novel the idea is. It cannot create something new; it isn't creative. It spits out what is predictable based on what other people have written before. It isn't intelligent. It's glorified auto-complete.
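        A toy illustration of that "glorified auto-complete" idea (not how ChatGPT actually works internally, just a minimal sketch of next-token prediction from counted examples; the corpus and names here are made up):

        from collections import Counter, defaultdict

        # Tiny made-up "training data": the model can only ever echo patterns it has seen.
        corpus = "the cat sat on the mat the cat ate the fish".split()

        # Count which word follows which (a bigram table).
        following = defaultdict(Counter)
        for current_word, next_word in zip(corpus, corpus[1:]):
            following[current_word][next_word] += 1

        def autocomplete(word, length=5):
            """Greedily pick the most common continuation seen in the corpus."""
            out = [word]
            for _ in range(length):
                options = following.get(out[-1])
                if not options:
                    break  # nothing like this in the training data, so the model is stuck
                out.append(options.most_common(1)[0][0])
            return " ".join(out)

        print(autocomplete("the"))  # e.g. "the cat sat on the cat"

        The point of the sketch: the output is always a remix of the training text, weighted by how often patterns occurred, which is why rare languages and novel ideas come out worse.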

    • Ubermeisters@discuss.online · 3 points · 1 year ago (edited)

      When it starts going off the rails like that, I also ask it to "check its work when it's done", and that seems to extend the amount of usable time before it loses the plot and suggests I use VBA or something.

  • 👍Maximum Derek👍@discuss.tchncs.de · 16 points · 1 year ago

    I like it when it gives you an answer and you tell it where it made an error. It changes the answer to something even more wrong, then when you point that out it says, "Sorry. Here's the first wrong thing I told you, again."