Sexually explicit AI-generated images of Taylor Swift have been circulating on X (formerly Twitter) over the last day in the latest example of the proliferation of AI-generated fake pornography and the challenge of stopping it from spreading.
X’s policies regarding synthetic and manipulated media and nonconsensual nudity both explicitly ban this kind of content from being hosted on the platform.
Wow, this is going to be interesting on multiple fronts, especially for me.
First, I’m a huge Swiftie - and Taylor is probably not going to take this lightly. Who she’s going to target is the more interesting question. (Shameless plug for !taylorswift@poptalk.scrubbles.tech if you want to join our small community.)
Second, as a nerd who has dabbled with generated art - thank you, trolls, for ruining it for all of us. This is just going to invite regulations that will ruin the generative AI world - as if we didn’t already have enough regulation barreling toward the space over copyright issues.
Third, as someone who hates Musk - I hope everything focuses on him and the platform formerly known as Twitter.
Awesome.
Is that hatred, or fear, that I hear in this comment?
That’s “suppressing theft masquerading as art is awesome” you hear in that comment.
Ah, it was the third option, ignorance.
Oh, I’m not at all ignorant of how horrible generative “art” is, but I appreciate you checking on me.
If it’s horrible and it’s also “masquerading” as human art, what does that say about human art?
Are you mad at people who can draw or something?
No, I’m just pointing out the common contradiction I see in threads like this, where people argue that AI is both a big threat to “traditional” artists and also that AI is terrible compared to “traditional” artists. It can’t really be both.
Misunderstanding doesn’t make the comment into the type of gotcha you think it is
I just wish my printer could actually print a car. A 200 mm bed is a little small.
Break it down into chunks and assemble it like Lego.
Now you’re stealing from LEGO™! 🙀
I have an honest question and would like to hear your (and others’, of course) opinions:
I get the anger at the models that exist today. DALL-E, Midjourney, and others were trained on millions of images scraped without consent. That in itself is legally ambiguous, and it will be interesting to see how courts rule on it (who am I kidding, they’ll go with the corporations). More importantly though, some of it (and increasingly more, as the controversy reached the mainstream) was explicitly disallowed by its authors from being used as training data. While I don’t think stealing is the right term here, it is without question unethical and should not be tolerated. While I don’t feel as strongly about this as many others do, maybe because I’m not reliant on earning money from my art, I fully agree that this is scummy and should be outlawed.
What I don’t understand is why so many people condemn all of generative AI. For me the issue seems to be one of consent and compensation, and ultimately of capitalism.
Would you be okay with generative AI whose training data was vetted to be acquired consensually?
Not if it was used to undercut human artists’ livelihoods.
In a hypothetical future where everybody gets UBI and/or AI becomes sentient and able to unionize, maybe we revisit this.
I don’t think AI has a soul, but there’s no reason it couldn’t be given one.
Undercutting artists’ livelihoods is definitely a problem that needs to be addressed. I honestly don’t think UBI goes far enough, as it’s just a bandaid on the festering tumor of capitalism (but that’s a discussion for another day). But can’t the same be said about numerous other fields? AI can perform many tasks across all fields of work. At the moment it is still worse than an expert in most of these, but it’s a matter of when, not if, it surpasses that. Engineers, programmers, journalists, accountants - I can’t think of any job that is not en route to being streamlined or automated by AI, reducing the need for humans and putting people out of work.
Artists have it worse in the sense that they are often self-employed, which makes them more vulnerable to exploitation and poverty. But isn’t the problem much larger than that?
This whole debate somewhat reminds me of the Swing Riots. They were often portrayed as anti-technology or backwards, when in actuality the reason for the revolts wasn’t that machines existed, but that they were used to undercut and exploit workers.
I’m not trying to argue that any of what’s happening now is good, just to clarify again. The current “AI revolution” is rotten through and through. But AI is (for now - the consciousness question is super interesting, but not all that relevant at the moment) just a tool. It irks me that so much righteous anger is directed at AI instead of at the people using it to exploit others and maximize their profits, and at the system that gives them the power to do so. Capitalists don’t care if it’s an AI, sweatshop workers overseas, or exploited workers competing for jobs domestically. They’ll go with whatever earns them more money. We should be angry at the cause, not the symptoms.
I’m curious what you mean by soul here, if you’re using it in a metaphorical sense or the religious sense
There’s a difference?
Well I’ve never heard of a religious person claiming AI could have a soul in the religious sense, and “soul” has other meanings than the religiously literal one, so yes?
Well, how many and what different sorts of religious people have you come across?
People hear “religious” and seem to equate it with “Abrahamic malarkey it isn’t couth to call folks on.”
“Religiously literal” seems a contradiction in terms as well. There is truth, and there are ways to understand and to convey that truth.
I don’t have a problem with training on copyrighted content provided 1) a person could access that content and use it as the basis of their own art, and 2) the derived work would also not infringe on copyright. In other words, if the training data is available for a person to learn from, and if a person could make the same content an AI would and have it be allowed, then AI should be allowed to do the same. AI should not (as an example) be allowed to simply reproduce a bit-for-bit copy of its training data (provided it wasn’t something trivial that would not be protected under copyright anyway). The same is true for a person. Now, this leaves some protections in place: if a person made content and released it to a private audience who are not permitted to redistribute it, then an AI would only be allowed to train on it if it obtained that content with permission in the first place, just like a person. Obtaining it through a third party would not be allowed, as that third party did not have permission to redistribute it. This means an AI should not be allowed to use a work unless it at minimum had licence to view that work. I don’t think you should be able to restrict your work from being used as training data beyond disallowing viewing entirely, though.
I’m open to arguments against this, though. My general concern is that copyright already allows for substantial restrictions on how you use a work that seem unfair, such as Microsoft disallowing the use of Windows Home and Pro on headless machines/as servers.
With all this said, I think we need to be ready to support those who lose their jobs from this. Losing your job should never be a game-over scenario (loss of housing, medical care, home loans, potentially car loans, provided you didn’t buy something like a mansion or a luxury car).
My initial position was that AI art would be exciting when more carefully curated training data is used. … But after some talks with friends, I think we’re living in a world that has minimal respect for copyright already, except when a corporation has a problem with it and wants to bring down the hammer of the law.
It does hurt, and it’s easy to be emotional about artists’ livelihoods being threatened by AI. They aren’t the only laborers threatened by job loss to automation, but this one hurts the most.
So now it’s just up to AI and artists to make interesting art together, and for artists to adapt to this environment of automated art tools.
omg Franzia haii :3
With how easy it is to run these models by now, the technology is certainly here to stay, and people will need to adapt, for sure. It only really makes sense to discuss AI in the broader context of capitalism, imo.
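For anyone curious what “easy to run” looks like in practice, here’s a minimal sketch of generating an image locally with the Hugging Face diffusers library; the checkpoint name, prompt, and hardware assumptions are just illustrative, not a recommendation of any particular model.

```python
# Minimal local text-to-image sketch (assumes: `pip install diffusers transformers accelerate torch`
# and a CUDA GPU with enough VRAM; the checkpoint and prompt below are only examples).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint, swap for any compatible one
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # use "cpu" (and float32) if no GPU is available, it will just be slower

# Generate one image from a text prompt and save it to disk.
image = pipe("a watercolor painting of a cat playing guitar").images[0]
image.save("output.png")
```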