The AI boom is screwing over Gen Z | ChatGPT is commandeering the mundane tasks that young employees have relied on to advance their careers.

ChatGPT is commandeering the tasks that young employees rely on to advance their careers. That’s going to crush Gen Z’s career path.
Bullshit. Learn how to train new hires to do useful work instead of mundane bloat.
100%. If an AI can do the job just as well (or better), then there’s no reason we should be making a person do it.
Part of the problem with AI is that it takes significant skill to understand where it goes wrong.
As a basic example, ask a language model like ChatGPT to edit a piece of writing. It can go very wrong: removing the wrong words, changing the tone, and making mistakes that a less experienced writer won’t recognise. I’ve had foreign students use AI to write letters or responses, and often the tone is all off. That’s one thing, but the student doesn’t understand that they’ve written a weird letter. The same goes for grammar checking.
This sets up a dangerous scenario where, to diagnose the output, you already need a deep understanding. That’s in contrast to non-AI language checkers, whose behaviour is much simpler to understand.
Moreover, as you can imagine, the danger is that the people making decisions about hiring and restructuring may not understand this issue.
The good news is this means many of the jobs AI is “taking” will probably come back when people realize it isn’t actually as good as the hype implied.
It’s just that I fear that realisation may not filter down.
You honestly see it a lot in industry. Companies pay $$$ for things that don’t really produce results, or what they consider to be “results” changes. There are plenty of examples of lowering standards and lowering quality in virtually every industry. The idea that people will realise the trap of AI and reverse course is not something I’m optimistic about.
In many ways AI is like pseudoscience: it’s a black box. Machine learning models don’t tell you “why” they work. ChatGPT is ultimately just statistical pattern-matching over text at massive scale.
So the claim that “good science” prevails is patently false. We live in the era of widespread scientific education, and yet everywhere we go there is distrust of science, the scientific method, critical thinking, etc.
Do people really think that the average Joe is going to “wake up” to the limitations of AI? I fear not.
Not quite. It’s more that a job that once had 5-10 people and perhaps an “expert” supervisor will just be whittled down to the expert. Similarly, factories used to employ hundreds of workers and a handful of supervisors to produce a widget. Now they can employ a couple of supervisors and a handful of robot technicians to produce more widgets.
The problem is, where do those experts come from? Expertise is earned through experience, and if all the entry-level jobs go away then eventually you’ll run out of experts.
Education. If education were free this wouldn’t be a problem: you could take a few more years at university to gain that experience instead of working in a junior role.
This is the problem with capitalism, if you take too much without giving back, eventually there’s nothing left to take.
You don’t get experts from education. You get experts from job experience (after education).
You definitely don’t get experts from unemployed people, or from people being worked to the bone doing menial labor for minimum wage.
Education is a broad term; it could include apprenticeships where you do get real work experience. And education would have to change a lot in all areas. The point is, the government could support people to gain that experience; the problem is that right now it doesn’t. It’s common to exit even a bachelor’s degree with crippling amounts of debt.
deleted by creator
This.
In accounting, 10 years ago, a huge part of the job was categorising bank transactions according to their description.
Now AI can kinda do it, but even providers with many billions of transactions to use as training data have a very high error rate.
It’s very difficult for a junior to look at the output and identify which ones are likely to be incorrect.
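To make that concrete, here is a minimal, purely hypothetical sketch of what such a pipeline often looks like; the classifier below is a stand-in stub, not any real provider’s API, and the keywords, categories, and threshold are invented for illustration. The model returns a category plus a confidence score, and low-confidence results get routed to a human, but the errors that matter are often the high-confidence ones, which is exactly what makes the output hard for a junior to review.

```python
# Hypothetical sketch of AI-assisted transaction categorisation with human review.
# Nothing here is from a real provider; categorise() is a stand-in for a model call.

from dataclasses import dataclass

@dataclass
class Categorisation:
    description: str
    category: str
    confidence: float  # model's self-reported confidence, 0.0 to 1.0

def categorise(description: str) -> Categorisation:
    """Stand-in for a real model call: guesses a category and a confidence."""
    rules = {
        "UBER": ("Travel", 0.92),
        "AWS": ("Software & Hosting", 0.88),
        "STRIPE": ("Merchant Fees", 0.61),  # ambiguous: could be income or fees
    }
    for keyword, (category, confidence) in rules.items():
        if keyword in description.upper():
            return Categorisation(description, category, confidence)
    return Categorisation(description, "Uncategorised", 0.30)

REVIEW_THRESHOLD = 0.80  # anything below this gets routed to a human

for desc in ["UBER *TRIP 4412", "STRIPE PAYOUT 88213", "JOE'S HARDWARE 0017"]:
    result = categorise(desc)
    needs_review = result.confidence < REVIEW_THRESHOLD
    print(f"{desc!r:24} -> {result.category:20} "
          f"({result.confidence:.2f}){'  [REVIEW]' if needs_review else ''}")

# The catch described above: the mistakes that matter are often the
# high-confidence ones, which never reach the review queue, and a junior
# looking at the output has no obvious way to spot them.
```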
And AI is not always the best solution. One of my tasks at my job is to respond to website reviews. There is a button I can push that will generate an AI response. I’ve tested it. It works… but it’s not personal. My responses directly address things the reviewers say, especially if they have issues. The AI’s responses are things like, “Thanks for your five-star review! We really appreciate it, blah blah blah.” A full paragraph of boilerplate bullshit that never feels like the review was actually addressed.
You would think responding to reviews properly would be a very basic function an AI could do as well as a human, but at present, no way.
deleted by creator
True, although my company emphasizes human contact with customers. We really go out of our way with tech support and such. That said, I hate responding to reviews. I kind of wish it was good enough to just press the ‘respond to review with AI’ button.
Exactly!
They don’t want to train new hires to begin with. A lot of the work that new hires relied on to get a foothold in a job is bloat and chores that nobody wants to do, because they aren’t trusted to take on more responsibility than that yet.
Arguably whole industries exist around work that isn’t strictly necessary. Does anyone feel like telemarketing is work that is truly necessary for society? But it provides employment to a lot of people. There’s much that will need to change for us to dismiss these roles entirely, but people need to eat every day.
The “not willing to train” thing is one of the biggest problems IMO. But also not a new one. It’s rampant in my field of software dev.
Most people coming out of university aren’t very qualified. Most have no understanding of how to actually program real-world software, because they’ve only ever done university classes where their environments are usually nice and easy (possibly already set up), projects are super tiny, they can actually read all the code in the project (you cannot do that in real projects; there’s far too much code), and problems are usually kept minimal, with no red herrings, unclear legacy code, etc.
Needless to say, most new grads just aren’t that good at programming in a real project. Everyone in the field knows this. As a result, many companies don’t hire new grads. Their advertised “entry level” position is actually more of a mid-level position, because they don’t want to deal with this painful training period (which takes a lot of their senior devs’ time!). But it ends up making the field painful to enter. Reddit would constantly have threads from people lamenting that the field must be dying, and every time it’s some new grad or junior. IMO it’s because they face this extra barrier. By comparison, senior devs get daily emails from recruiters asking if they want a job.
It’s very unsustainable.
Indeed: at least in knowledge-based industries, everybody starts by working at a level of responsibility where the natural mistakes a person makes while learning have limited impact.
One of my interns read the wrong voltage and it took me ten minutes to find his mistake. Ten minutes with me and multiple other senior engineers standing around.
I congratulated him, and damn it, I meant it. This was the best possible mistake for him to make. Everyone saw him do it, he gets to know he held everything up, and he has to just own it and move on.
Exactly this.
The problem is really going to be the number of jobs that are left with 40 hours of work to do.
I fully agree; however, doing some mundane work for a few weeks while you learn is useful. You can’t just jump straight into the deep work.