If you’re worried about how AI will affect your job, the world of copywriters may offer a glimpse of the future.
Writer Benjamin Miller – not his real name – was thriving in early 2023. He led a team of more than 60 writers and editors, publishing blog posts and articles to promote a tech company that packages and resells data on everything from real estate to used cars. “It was really engaging work,” Miller says, a chance to flex his creativity and collaborate with experts on a variety of subjects. But one day, Miller’s manager told him about a new project. “They wanted to use AI to cut down on costs,” he says. (Miller signed a non-disclosure agreement, and asked the BBC to withhold his and the company’s name.)
A month later, the business introduced an automated system. Miller’s manager would plug a headline for an article into an online form, an AI model would generate an outline based on that title, and Miller would get an alert on his computer. Instead of coming up with their own ideas, his writers would create articles around those outlines, and Miller would do a final edit before the stories were published. Miller only had a few months to adapt before he got news of a second layer of automation. Going forward, ChatGPT would write the articles in their entirety, and most of his team was fired. The few people remaining were left with an even less creative task: editing ChatGPT’s subpar text to make it sound more human.
By 2024, the company laid off the rest of Miller’s team, and he was alone. “All of a sudden I was just doing everyone’s job,” Miller says. Every day, he’d open the AI-written documents to fix the robot’s formulaic mistakes, churning out the work that used to employ dozens of people.
Pretty dystopian article.
But this will continue, until oligarchs like Altman, Cook, Nadella etc. start getting put into difficult situations; ones that create very strong incentives for them to show humanity (or at least emulate it).
It’s never the managers who suffer first, is it?
It’s never upper management, but then they don’t actually do anything except play landlord. Lower managers are being replaced by bots that police the bottom-rung workers.
Anyhow, even back when AI wasn’t working right at all, the ownership class was eager to replace creative workers, so we’ve known for a year or two now that they’re gunning to end creative work and replace it with menial work.
I don’t know what the Mahsa Amini moment is going to be to spark the general worker uprising, but news about the conditions being right comes in every day.
Human condition… The water flows down
That’s not water
Welcome to the new Industrial Revolution, where one person can do the work of many. Sure, mass-produced ~~goods~~ content aren’t as good as handmade artisanal ~~products~~ writing, but there’s a huge market for it.
There really isn’t though. Very, very few writers live off of writing alone.
A huge market means there’s lots of demand for the products. That doesn’t have to translate to lots of jobs for the people producing that product.
Is there demand tho? Once people catch on that the shit is AI, they seem to lose interest.
Can’t do much to resist it anymore because shit sounds like bots half the time. Can’t even tell if it is bots, tbh, but can’t shake that feeling either. Lost all interest.
You don’t think there’s demand for news articles? The comment I’m responding to said there isn’t a huge market. That’s all I’m arguing against here, that there is a huge market. Whether AI can fulfill it is a separate issue, one that we’ll see play out.
I don’t think people want to read AI articles, but I could be wrong tbh
Time will tell how general population reacts to it.
If they have an emotional reaction to the headline (positive or negative), they click. Clicks make money.
Whether they click to read the fluff or click to share the headline doesn’t matter, a click is a click.
People still click tracking links?
The demand may not be from the end user but from businesses that need to fill stuff with filler text. Say you’re working for an automotive company and have to pull off a big email campaign: you can use generative AI to help you type up a couple dozen emails. Sure, they’ll be crappy, but you finished your work by the deadline, so spending company money on a service like that seems worth it.
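For what it’s worth, the “crappy but done by the deadline” workflow really is only a few lines of code, which is exactly why it appeals to cost-cutters. A minimal sketch, assuming the OpenAI Python client; the segment names and prompt are hypothetical, and any chat-completion API would do:

```python
# Rough sketch of batch-drafting campaign emails with a generative model.
# Assumes the OpenAI Python client; segments and prompt are made up.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Hypothetical audience segments for the automotive campaign.
segments = ["new SUV leads", "lapsed service customers", "EV waitlist"]

for segment in segments:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": (
                "Draft a short marketing email for the "
                f"'{segment}' segment of an automotive campaign."
            ),
        }],
    )
    # The human's remaining job: skim each draft and touch it up.
    print(f"--- {segment} ---")
    print(response.choices[0].message.content)
```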
If most people’s jobs really are this mindless, then I don’t know what to say…
We’ve got a generation of educated people writing sloppy emails that don’t even need to be correct?
It’s not that they shouldn’t be correct; it’s that people are expected to do the job of multiple people with limited budgets, so they contract their work out to services or don’t take the time to double-check stuff.
When I worked with the marketing people, it was a team of 3 that had to cover the US, Mexico, and Canada, so they would have people on retainer to make things like art or text for a lot of the stuff, but I can see how they could use a service that used AI for those things.
Look at what happened with Wacom; I just see this kind of cost-cutting as appealing to businesses that want to save money at the expense of quality.
https://community.wacom.com/en-us/ai-art-marketing-response/
Again, I’m not saying it’s good, but there is demand for this stuff, as shitty as it is.
Like he said at the end, nobody is reading the garbage.
I think if something is written by AI, the only way to read it is to have another AI read it and summarize it. Then you can still decide whether to read the summary or not.
It’s probably replacing garbage written by humans that nobody was reading either.
So in this case, for garbage content that nobody reads, AI is probably a good idea.
There’s nothing inherently wrong with doing quality assurance work, but I think the workers are being fooled into thinking it’s less valuable work than their old job. In fact, based on QA in other industries, I’d say these workers should be getting paid more. This is why unions are important; otherwise, people just get fooled or bullied into accepting bad deals.
The human serfs will have to proofread increasingly voluminous, numerous, and complex output from AI systems. The product has become the master. Until the systems develop a sense of ‘truth’ beyond numerical statistics, generative AI is pretty much a toy.
I think retrieval augmentation and fine-tuning are the biggest tools for making the results more refined (or, better, for referencing a document as a source of truth). The other, ironically, is just regular deterministic programming.
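For anyone curious, the bare-bones version of that retrieval step is just “find the most relevant document and paste it into the prompt.” A minimal sketch, assuming the OpenAI Python client plus scikit-learn for the lookup; the documents and prompts are made up:

```python
# Minimal retrieval-augmented generation (RAG) sketch: ground the model's
# answer in a retrieved document instead of its statistical guesswork.
from openai import OpenAI
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# A tiny "source of truth"; in practice these would be your real docs.
documents = [
    "Refunds are processed within 5 business days of approval.",
    "Premium support is available 24/7 for enterprise customers only.",
]

# TF-IDF keeps the example dependency-light; real systems use embeddings.
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def answer(question: str) -> str:
    # Retrieve the single most relevant document for the question.
    q_vector = vectorizer.transform([question])
    best_idx = cosine_similarity(q_vector, doc_vectors).argmax()
    context = documents[best_idx]

    # Ask the model to answer only from that retrieved context.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context. "
                        "If the context is insufficient, say you don't know."},
            {"role": "user",
             "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How long do refunds take?"))
```

It doesn’t give the model a real sense of ‘truth’, but pinning answers to a retrieved document is a lot closer to “reference a source of truth” than raw generation.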
Until the systems develop a sense of ‘truth’ beyond numerical statistics, generative AI is pretty much a toy.
I’ll start by saying I am pro-worker, pro-99%, pro-human.
Now, I must refute your assertion for specific domains (and specific working styles), e.g. translation (or a preference for editing over drafting/coding from a blank page). If money used to hit your bank account every two weeks because you translated or provided customer service for a company, and now that money doesn’t come in anymore, it wouldn’t feel too playful or like a toy is involved.
This is today, not “until” any future milestone.
Re-sharing some screenshots I took a month or so back, below.
November 2022: ChatGPT is released
April 2024 survey: 40% of translators have lost income to generative AI - The Guardian
Also of note from the podcast Hard Fork:
There’s a client you would fire… if copywriting jobs weren’t harder to come by these days as well.
Customer service impact, last October:
And this past February - potential 700 employee impact at a single company:
If you’re technical, the tech isn’t as interesting [yet]:
Overall, costs down, capabilities up (neat demos):
Hope everyone reading this keeps up their skillsets and fights for Universal Basic Income for the rest of humanity :)
Air Canada did that too. Only, the lack of precision meant the bot made offers to customers that they weren’t prepared to honor.
Have to wonder how much Klarna invested in their tech, assuming they’re not big ole fibbers