In discussions of jobs being replaced by AI, the term “prompt engineer” pops up frequently as one of the new jobs that will supposedly arise because of it.
Isn’t prompt engineering just typing questions? How is it difficult enough to be an in-demand job?
Prompt engineering was 100% made up by grifters to sell consultations, classes, books, etc.
-
Calling it engineering is frankly an affront to anyone who’s gone to an engineering college, or studied technical vocations where engineers are important.
-
In any competitive, expensive field, it is easier to teach a professional to prompt than a prompter the profession. The same way it is easier to teach a chemical engineer to work Excel than to teach an Excel whiz chemical engineering.
-
LLMs with chain-of-thought reasoning are already making the whole thing irrelevant by prompting themselves. The industry is rapidly replacing the so-called “prompt engineers.”
Tldr: just read #3
-
I will never regard “prompt engineering” as any kind of real engineering. Engineering requires design; it requires an understanding of science so that you can use it to invent new technology. It requires accountability for those designs: justifying them with regard to cost and function using mathematics, and your math needs to be checked by other engineers to make sure you aren’t making mistakes. Prompt “engineering” requires none of this.
The very term “prompt engineer” is just a marketing term that AI companies are spreading around in their propaganda to try to make their business seem more legit and less like a scam.
With large language models (LLMs), you are basically compressing huge quantities of data into a large, black-box statistical model. You cannot describe the nature of this black box scientifically; its nature changes completely every time more training data is fed into it. There is no expertise to be gained, no way to say “using these prompt terms the output improves by XX% for the reason that…”
- you cannot gradually learn its nature through experimentation
- the prompts can only be reused for a brief time until the LLM changes
- there is no metric by which anyone could hold you accountable for your prompts once the LLM changes
Prompt “engineering” is just an endless game of guessing and checking, and the results selected will be subject to all the biases of the person running the prompts. It is labor intensive, to be sure, but not in any way that requires design with regard to cost and function. It is NOT engineering.
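For what it’s worth, the guess-and-check loop described above boils down to something like this. Everything here is invented for illustration: the prompt variants, the stubbed model call, and the hand-picked scoring rule.

```python
# Toy "prompt engineering" loop: try prompt variants, keep whichever
# scores best on an arbitrary check. All names/behavior are made up.

def run_llm(prompt: str) -> str:
    """Stand-in for an LLM call; a real one would be nondeterministic."""
    return prompt.lower()  # placeholder behavior only

def score(output: str) -> int:
    """Arbitrary, human-chosen metric -- this is where bias creeps in."""
    return output.count("step")  # e.g. reward "step by step" answers

variants = [
    "Summarize the report.",
    "Summarize the report step by step.",
    "Summarize the report in detail.",
]

# Guess and check: the "winner" is whatever maximizes the chosen score.
best = max(variants, key=lambda p: score(run_llm(p)))
```

Note that nothing here resembles accountable design: swap out the model behind `run_llm` and the “best” prompt can silently change with it.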
Garbage in, garbage out.
A lot of technical jobs are like this: knowing what makes good input and being able to judge good output. You could say coding is just typing lines of code, or that other types of engineering are just trying different architectures, geometries, or values until something works.
It’s not necessarily “difficult,” but a lot of jobs are just applying knowledge of context and then polishing an output for use. In a future prompt engineer role you might be editing copy for a chosen audience, or finding a useful way to organize the input so that a good output takes an hour to land on instead of a week.
I guess the long answer is “we don’t know,” but the short answer is yes: the smart money says that having some knowledge of how to use AI might be useful in your career ten years from now.
Do we know of any examples of companies hiring prompt engineers today?
I don’t know how far off the boom in prompt engineering jobs is, but the negative side (layoffs supposedly due to AI, even though companies could just be lying and these are ordinary layoffs) is already happening, so I’m just trying to find the good side while the bad is happening now.
As the other person said, are you talking about engineering? Or about marketing roles expressly using AI to create content? I’m sure there are marketing roles involving AI prompt writing open right now.
As for engineering roles? Yeah, there are lots of jobs for AI researchers if you have a degree in data science or the like.
It’s not really a job, yet.
No. This was already kind of a thing with search: it used to be more of a skill to use a search engine, refine your query, and use the right operators to narrow things down. Now it’s still a bit of a skill, but it’s a lot easier, because for the most part natural language or a few keywords are enough.
My guess is the same thing will happen with these language models; it’s already pretty easy, and it will only get easier as high-quality output becomes less sensitive to the exact prompting you use.
Relevant article: https://lemmy.ml/post/12857742
Prompt engineering is a thing, but I wouldn’t say it’s much of a job title. There are people doing it: optimizing system prompts, preprocessing and postprocessing. LLMs are just one piece of a complex pipeline, and someone has to build all of that. Prompt engineering is part of the bootstrapping for making better LLMs, but this work is largely being done by data scientists who are at the forefront of understanding how AI works.
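To make that concrete, here’s a minimal sketch of where the system prompt and the pre/postprocessing steps sit in such a pipeline. Everything in it (the system prompt text, the stubbed `call_model`, the helper names) is hypothetical, not any particular product’s API:

```python
# Minimal sketch of an LLM pipeline: the model call is only one stage,
# surrounded by preprocessing and postprocessing someone has to build.
# call_model is a stub; a real system would hit an actual LLM API.

SYSTEM_PROMPT = "You are a support assistant. Answer in one short sentence."

def preprocess(user_input: str) -> str:
    """Normalize raw input before it reaches the model."""
    return " ".join(user_input.split())  # collapse stray whitespace

def call_model(system: str, user: str) -> str:
    """Stand-in for a real LLM call (hypothetical)."""
    return f"[{system}] echoing: {user}"

def postprocess(raw_output: str) -> str:
    """Clean up model output before showing it to the user."""
    return raw_output.strip().removeprefix(f"[{SYSTEM_PROMPT}] ")

def answer(user_input: str) -> str:
    cleaned = preprocess(user_input)
    raw = call_model(SYSTEM_PROMPT, cleaned)
    return postprocess(raw)
```

The point is that tweaking `SYSTEM_PROMPT` is a small slice of the work; most of the engineering is in the plumbing around the model.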
So is prompt engineering just typing questions? IDK. Who knows what those people mean when they say that, but whatever it’s called, there is a specialized field around improving AI tech, and prompt engineering is certainly a part of it.