• rocket_dragon@lemmy.dbzer0.com · 67 points · 2 days ago

    Next step is an AI that detects the AI labyrinth.

    It gets trained on labyrinths generated by another AI.

    So you have an AI generating labyrinths to train an AI to detect labyrinths which are generated by another AI so that your original AI crawler doesn’t get lost.

    It’s gonna be AI all the way down.

    • brucethemoose@lemmy.world · 14 points · edited · 2 days ago
      LLMs tend to be really bad at detecting AI-generated content, and I can’t imagine specialized models are much better. For the crawler side it’s also far more expensive and takes more human work, and that effort has to be replicated for every crawler since they’re so freaking secretive.

      I think the hosts win here.