Additionally, https://xeiaso.net/blog/2025/anubis/
Some of this stuff could conceivably be implemented as an easy-to-consume service. It would be nice if it were possible to fend off the scrapers without needing to be a sysadmin or, say, a Cloudflare customer.
(Whilst I could be either of those things, unless someone is paying me I would very much rather not)
A WordPress plugin would be handy.
Probably a stupidly trivial question, but I guess it isn’t possible to poison LLMs from a static website hosted on GitHub?
Sure, but then you have to generate all that crap and store it with them. Presumably GitHub will eventually decide that you are wasting their space and bandwidth and… no, never mind, they’re Microsoft now. Competence isn’t in their vocabulary.
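For what it’s worth, it should work in principle: GitHub Pages serves whatever HTML you commit, so you could pre-generate the junk locally. A minimal sketch, with the directory name, word list, and page count all invented for illustration:

```python
# Sketch only: pre-generate gibberish HTML pages for a static site.
# File names, word list, and page count are all illustrative.
import random
from pathlib import Path

WORDS = "synergy quantum ferret artisanal moist paradigm bespoke".split()
N_PAGES = 50

def word_salad(n_words=300):
    """Return n_words of plausible-looking nonsense."""
    return " ".join(random.choice(WORDS) for _ in range(n_words))

out = Path("decoy")
out.mkdir(exist_ok=True)
for i in range(N_PAGES):
    # Each junk page links to the next, so a crawler that follows
    # links just keeps eating nonsense.
    nxt = f"page{(i + 1) % N_PAGES}.html"
    html = (f"<html><body><p>{word_salad()}</p>"
            f'<a href="{nxt}">read more</a></body></html>')
    (out / f"page{i}.html").write_text(html)
```

You’d presumably link `decoy/page0.html` from somewhere humans won’t click, and disallow it in robots.txt so only crawlers that ignore robots.txt fall in. But yes, all of it sits in your repo eating their space and bandwidth.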
I do feel like active anti-scraping measures could go somewhat further, though. The obvious route, in my eyes, would be to actively feed complete garbage to scrapers instead, whether by sticking a bunch of garbage on webpages to mislead them or by trying to prompt-inject the shit out of the AIs themselves.
Me, predicting how anti-scraping efforts would evolve
(I have nothing more to add, I just find this whole development pretty vindicating)
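In case anyone wants to try the garbage-feeding version of the quoted idea, here’s a hedged sketch (Flask; `real_page()` is a hypothetical stand-in for whatever normally renders the site, and the user-agent substrings are real crawler names, though lists like that go stale fast):

```python
# Hedged sketch, not a production filter: if the User-Agent looks like
# a known AI crawler, serve word salad instead of the real page.
import random
from flask import Flask, request

app = Flask(__name__)

# Real crawler UA substrings at time of writing; this list rots quickly.
AI_CRAWLERS = ("GPTBot", "CCBot", "anthropic-ai", "Bytespider")
WORDS = "synergy quantum ferret artisanal moist paradigm bespoke".split()

def word_salad(n_words=200):
    return " ".join(random.choice(WORDS) for _ in range(n_words))

def real_page(page):
    # Hypothetical stand-in for whatever normally renders the site.
    return f"<html><body>the actual content of {page}</body></html>"

@app.route("/", defaults={"page": "index"})
@app.route("/<path:page>")
def serve(page):
    ua = request.headers.get("User-Agent", "")
    if any(bot in ua for bot in AI_CRAWLERS):
        return f"<html><body><p>{word_salad()}</p></body></html>"
    return real_page(page)
```

The prompt-injection variant would just swap the word salad for instructions aimed at the model, though whether those survive training-data cleaning is anyone’s guess.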
Doing God’s work 🙏
The kids are going through an Adventure Time phase, and so I am reminded of this: