• seaQueue@lemmy.world (OP) · 216 points · 6 months ago

    The best I can do is an ML model running on an NPU that parses JSON in subtly wrong and impossible-to-debug ways.

    • Aceticon@lemmy.world · 58 points · 6 months ago

      Just make it an LJM (Large JSON Model) capable of predicting the next JSON token from the previous JSON tokens. You would get massive savings in file storage and network traffic from not having to store and transmit full JSON documents, all in exchange for an “acceptable” error rate.
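
      A minimal toy sketch of the idea (a character-level bigram table standing in for the LJM; the corpus, function names, and fallback character are made up for illustration):

          from collections import Counter, defaultdict

          def train_ljm(corpus: str) -> dict:
              # Count which character tends to follow each character in example JSON.
              model = defaultdict(Counter)
              for prev, nxt in zip(corpus, corpus[1:]):
                  model[prev][nxt] += 1
              return model

          def predict_next(model: dict, prev: str) -> str:
              # Most likely next character; fall back to '"' for unseen context.
              counts = model.get(prev)
              return counts.most_common(1)[0][0] if counts else '"'

          corpus = '{"id": 1, "name": "a"}{"id": 2, "name": "b"}'
          model = train_ljm(corpus)

          # "Reconstruct" a document from a single seed character and hope for the best.
          doc = "{"
          for _ in range(20):
              doc += predict_next(model, doc[-1])
          print(doc)  # subtly wrong JSON, as promised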

      • knorke3@lemm.ee · 5 points · 6 months ago

        Did you know? By indiscriminately removing every 3rd letter, you can ethically decrease input size by up to 33%!
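
        A sketch of that “compressor,” assuming letters are counted 1-indexed so every 3rd character gets dropped (the function name is made up):

            def ethical_compress(text: str) -> str:
                # Drop every 3rd character (1-indexed), shrinking the input by up to ~33%.
                return "".join(ch for i, ch in enumerate(text, start=1) if i % 3 != 0)

            print(ethical_compress('{"name": "value"}'))  # -> {"am":"vlu"}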