I suppose the only thing I disagree with is that the law can do anything about it. Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It’s the wild west.
I think it’s best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.
You can’t target the technology, or stop people from using AI to do perverted things, but if they get caught, we should at least respond to the problem.
I don’t know what a proactive response to this issue looks like. Maybe better public education and a culture that encourages more respect for others?
I think it’s best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.
So… where do you draw the line, exactly? Does this include classic photo manipulation too? Written stories (fanfic)? Sketching or doodling a nude figure with a name pointing to it? Dirty thoughts someone has about someone else? I find this response highly questionable and authoritarian. Calling it abuse also really trivializes actual abuse, which I, as an abuse victim, find pretty objectionable. If I could swap what was done to me for someone making porn of “me” and getting their rocks off to it, I’d gladly make that exchange.
I feel I was misconstrued. My point was: 1. a law will probably happen, and 2. it will do fuck all, because the toolchain and the posting/sharing process are going to be completely anonymous.
Yeah, in specific cases where you can determine that deepfake revenge porn of Person A was posted by Person B who had an axe to grind, you might get a prosecution. I just don’t think the dudes making porn on their Nvidia GPUs of Gal Gadot f*ckin Boba Fett are ever gonna get caught, and the celebrity cat will stay forever out of the bag.
You can’t ban the tech, but you can ban the act, so it’s easier to prosecute people who upload deepfakes of their co-workers.
That’s already illegal in most countries, regardless of how it was made. It also has nothing to do with “AI”.
Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It’s the wild west.
I was referring to that part of his comment. It’s also not, in fact, illegal in most countries. In the US, for example, it’s only illegal at the state level, and not in all states either. Canada only has 8 provinces with legislation against it.
I do agree, though, that it’s not the software’s fault. Bad actors should be punished, and nothing more.
I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real or in a way that it could be construed as true/real.
Anything other than that narrow application is an infringement on the First Amendment.
I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real or in a way that it could be construed as true/real.
I would love that solution, but it definitely wouldn’t have bipartisan support.
There are certain political groups that have a vested interest in lying, deceiving, manipulating, and fabricating to get what they want. So… yeah. 😞
I feel that’s just most political groups nowadays. Not implying both sides are the same, just that everyone likes their lies.
The majority of “AI” generated / altered porn is already very much labeled as such though.
Exactly. Photoshop has been around for decades. AI is just more of the same. I find it weird how, as technology evolves, people keep fixating on the technologies themselves rather than the universal (sometimes institutional) patterns of abuse.
Just another reason why we can’t ethically introduce AI.
Anyone could run it on their own computer these days, fully local. What could the government do about that even if they wanted to?
Anyone can make CSAM in their basement. What could the government do about that even if they wanted to?
Anyone can buy a pet from a pet store, take it home, and abuse it. Why is animal abuse even illegal?
Should I keep going with more examples?
What do you want them to do, constantly monitor your computer to see what applications you open? Flag suspicious GPU or power usage and send police to knock on your door? Abusing people or animals requires real outside involvement. You are equating something a computer generates with real life, when the two have nothing to do with each other.
Who is suggesting that?
Murder is illegal, do we surveil everyone who owns a gun or knife?
CSAM is illegal, do cameras all report to the government?
Again, that’s just 2 examples. Lmk if you want more
Maybe my wording was unclear. I’m wondering how they could be expected to detect it in the first place. Murder leaves a body. Abuse leaves a victim. Generating files on a computer leaves nothing of the sort, unless they’re shared online. What would a new regulation achieve that isn’t already covered by the illegality of ‘revenge porn’? And how could they possibly detect anything beyond that without the massive privacy breaches I described before?
https://en.m.wikipedia.org/wiki/Victimless_crime
These are still crimes
The government’s job is not to prevent crime from happening; that’s dystopian-tier stuff. Its job is to determine what the law is and apply consequences to people after they’re caught breaking it.
The job of preventing crime from happening in the first place mainly belongs to lower-level community institutions, starting with parents and teachers.
Can AIs really consent, though?
The issue is not with all forms of pornographic AI, but more about deepfakes and nudifying apps that create nonconsensual pornography of real people. It is those people’s consent that is being violated.
I still don’t understand why this is suddenly an issue when decades of photo editing didn’t seem to bother anyone at all.
I mean, it did bother people; it just took more skill and time with photo-manipulation software to make it look convincing. It was rare for someone to both have the expertise and be willing to put in the time, so it didn’t come up often enough to become a point of discussion. AI just makes it quick and easy enough to be far more common.
Regular editing is much easier and quicker than installing, configuring, and using Stable Diffusion. People acting like “AI” is a one-click solution that gets you convincing-looking images have probably never used it.
It literally is a one-click solution. People are running nudifying sites that use CLIP, GroundingDINO, SegmentAnything, and Stable Diffusion to autonomously nudify people’s pictures.
These sites (which I won’t even name) just ask for a decent-quality photo of a woman, ideally wearing a crop top or bikini for best results.
The people who have the know-how to set up Stable Diffusion and all these other AI photo-manipulation tools are using those skills to monetize sexual-exploitation services. They’re making it so you don’t need to know what you’re doing to participate.
And sites like Instagram, which are filled with millions of exploitable images of women and girls, have allowed these perverted services to advertise their wares to their users.
It is now many orders of magnitude easier than it ever has been in history to sexually exploit people’s photographs. That’s a big deal.
If you wanna pay for that, then you do you, lol. But at that point you could’ve also just paid a shady artist to do the work for you.
Also, maybe don’t pose half-naked on the internet if you don’t want people to see you in a sexual way. That’s just weird, just like this whole IG attention-whoring nowadays. And no, this isn’t even just a women thing. Just look how thirsty women get under images of good-looking dudes posing topless, or just your ordinary celeb doing ordinary things (Pedro Pascal = daddy, and yes, that includes more explicit comments too).
This hypocritical fake outrage is just embarrassing.