Whenever I stumble on reddit I make sure to post disinformation or some kind of dumb shit to throw a wrench into the LLM training data they sell to google.
I hate to ruin this for you, but if you post nonsense, it will get downvoted by humans and excluded from any data set (or included as an example of what to avoid). If it’s not nonsensical enough to be downvoted, it still won’t do well vote-wise, and won’t realistically poison any data. And if it’s upvoted… it just might be good data. That is why Reddit’s data is valuable to Google: it basically has a built-in system for identifying ‘bad’ data.
Make sure to have some LLM generate the comment for you, since training on synthetic data may mess models up over time: AI models fed AI-generated data quickly spew nonsense
this is an ancient and noble practice known as shitposting, no need to call it something else :)