I’m new to the fediverse, and so far I’ve seen everything from Lemmy instances rejecting users to instances defederating from one another, but I ask myself: what can be done if trolls or bots come from self-hosted single-user instances?
They get defederated wherever they get consistently banned… and then they can federate with their friends, make their own community of shit, and yell at each other or something.
Would that still be practical if several instances were created programmatically?
Well, if we had a DDoS-like barrage of troll instances, one solution would probably be to institute a “pending federation request” status for any new instances.
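The “pending federation request” idea could work something like this minimal sketch. The names, the approval flow, and the in-memory sets are all hypothetical; nothing here is Lemmy’s actual federation API.

```python
# Hypothetical sketch: unknown instances land in a review queue instead
# of federating immediately. All names here are illustrative.

pending: set[str] = set()
allowed: set[str] = set()

def on_first_contact(instance: str) -> str:
    """An unapproved instance is queued for admin review, not federated."""
    if instance in allowed:
        return "federated"
    pending.add(instance)
    return "pending"

def approve(instance: str) -> None:
    """Admin action: move an instance from the review queue to allowed."""
    pending.discard(instance)
    allowed.add(instance)

print(on_first_contact("new.example"))   # pending
approve("new.example")
print(on_first_contact("new.example"))   # federated
```

The point is just that a programmatic barrage of fresh instances would pile up in `pending` and never reach anyone until a human approves them.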
Looks interesting.
Would it also be possible to require a captcha on the first interactions users from new instances make?
A reputation/trustworthiness rating for instances might be helpful too, something akin to karma but for the instance as a whole. More vulnerable communities would be able to set a minimum trust requirement for unapproved participation.
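A per-community trust threshold like that could be sketched as follows. The trust scores, the `Community` type, and the default-to-zero rule for unknown instances are all assumptions for illustration, not anything Lemmy implements.

```python
# Hypothetical sketch of the per-community minimum-trust idea above.
# Scores might be aggregated from admin reports over time; these values
# are made up for the example.

from dataclasses import dataclass

@dataclass
class Community:
    name: str
    min_trust: float  # minimum instance trust for unapproved participation

instance_trust = {
    "established.example": 0.9,
    "brand-new.example": 0.1,
}

def can_post_unapproved(community: Community, instance: str) -> bool:
    """Unknown instances default to zero trust, so they need approval."""
    return instance_trust.get(instance, 0.0) >= community.min_trust

support = Community("support", min_trust=0.5)
print(can_post_unapproved(support, "established.example"))  # True
print(can_post_unapproved(support, "brand-new.example"))    # False
```

A vulnerable community could set a high `min_trust`, while a casual one could set it to zero and behave like open federation does today.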
I like that idea. It would be good to put numbers on positive community traits like trust, helpfulness, politeness, inclusiveness, innovation, etc., so that amoral analysts have something more to go on than user counts, connections, clicks, and replies when deciding what/how to monetize things. A bit like biodiversity protection… or a freedom index or HDI.
I don’t usually make a habit of being an ass, but I am extra careful about what I say because I choose to self-host, since any instance admin who takes a dislike to me could take one look at my instance and defederate it without a second thought. If anything, self-hosting makes you more vulnerable than ordinary ban evading on an open instance.
deleted by creator
I agree with Jamie. While it is certainly possible for a bad actor to spin up burner instances for the purposes of evading defederation, that’s a disproportionate amount of effort compared to just creating a new account somewhere that already exists.
Will we see it happen? Probably. But it honestly seems easier to deal with than if those bad actors were to hide themselves in established instances.