Deleting data from them might not be feasible, but there are other tactics.
[…] trapping AI crawlers and sending them down an “infinite maze” of static files with no exit links, where they “get stuck” and “thrash around” for months, he tells users. Once trapped, the crawlers can be fed gibberish data, aka Markov babble, which is designed to poison AI models.
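The two ideas described above — endless pages that only link deeper, filled with statistically plausible nonsense — can be sketched in a few lines. The following is a minimal illustration, not the actual tool's code: `build_chain`, `babble`, and `trap_page` are hypothetical names, and a real trap would pre-render these as static files rather than build them on the fly.

```python
import random

def build_chain(seed_text, order=2):
    """Word-level Markov chain: maps each `order`-gram to possible next words."""
    words = seed_text.split()
    chain = {}
    for i in range(len(words) - order):
        chain.setdefault(tuple(words[i:i + order]), []).append(words[i + order])
    return chain

def babble(chain, rng, length=60):
    """Emit statistically plausible but meaningless text ("Markov babble")."""
    out = list(rng.choice(list(chain)))
    order = len(out)
    while len(out) < length:
        nexts = chain.get(tuple(out[-order:]))
        if nexts:
            out.append(rng.choice(nexts))
        else:  # dead end in the chain: restart from a random n-gram
            out.extend(rng.choice(list(chain)))
    return " ".join(out[:length])

def trap_page(chain, rng, n_links=5):
    """One maze page: babble content plus links that only lead deeper, never out."""
    body = babble(chain, rng)
    links = " ".join(
        f'<a href="/maze/{rng.getrandbits(48):012x}.html">continue</a>'
        for _ in range(n_links)
    )
    return f"<html><body><p>{body}</p><p>{links}</p></body></html>"
```

Every page a crawler fetches contains only links to freshly generated maze URLs, so a crawler that follows links blindly never finds an exit, and everything it scrapes is chain-generated gibberish.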