shared via https://feddit.de/post/2805371
Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.
I don’t think just giving up and allowing porn deepfakes of people is really an acceptable answer here. The philosophical discussion of whether it’s “actually them” or not doesn’t really matter; it’s still intrusive, violating, and gross. It’s the same reason stealing someone’s identity is illegal: the fact that the identity created by the con man isn’t the real you doesn’t matter, because real damage can still be done.
Maybe there’s nothing you can do about it on the dark web, but sites absolutely can manage deepfakes, in the same way that PornHub will take down non-consensual ex-girlfriend type content.
The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.
Exactly. In another thread on here recently someone said something that basically boiled down to “your protest against AI isn’t going to stop it. There’s too much corporate power behind it. So you might as well embrace it” and I just cannot get my head around that mentality.
Also, you can absolutely see the models who were used as references in some of the images generated by apps these days, like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and in some of the pics you could clearly see the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It’s creepy as hell.
Creepy isn’t illegal. Never has been.
I never said it was. But like the person I was replying to said: we need to take a good hard look at what the hell these tools are doing and allowing and decide as a society if we’re going to tolerate it.
The real issue here is what things like deepfakes can do. It’s already starting, and it’s going to keep accelerating, generating mis- and disinformation about private citizens, celebs, and politicians alike. You might say “it’s creepy, but there’s nothing we can do about people deepfaking Nancy Pelosi’s face onto their spank material,” but it’s extremely problematic when someone decides to make a video where Joe Biden admits to running a CP ring, or some right-wing chud makes a video of Trump appearing to say something they all want to hear, and it leads to a civil war. Those are the real stakes here. How we react to what’s happening to regular folk and celebs now is just the canary in the coal mine.