cm0002@lemmy.world to Programmer Humor@programming.dev · 1 month ago
“Like programming in bash” (image, lemmy.ml) · 1.62K upvotes · 178 comments
Cross-posted to: wwwopera@lemm.ee, programmerhumor@lemmy.ml
perishthethought@lemm.ee · 47 points · 1 month ago
I don’t normally say this, but the AI tools I’ve used to help me write bash were pretty much spot on.
marduk@lemmy.sdf.org · 24 points · 1 month ago
Yes, with respect to the grey-bearded uncles and aunties: as someone who never “learned” bash, in 2025 I’m letting an LLM do the bashing for me.
SpaceNoodle@lemmy.world · 38 points · 1 month ago
Until the magic incantations you don’t bother to understand don’t actually do what you think they’re doing.
embed_me@programming.dev · 41 points · 1 month ago
Sounds like a problem for future me. That guy hates me lol
MBM@lemmings.world · 13 points · 1 month ago
I wonder if there’s a chance of getting `rm -rf /*` or zip bombs. Those are definitely in the training data, at least.
furikuri@programming.dev · 3 points · 1 month ago
The classic `rm -rf $ENV/home`, where `$ENV` can be empty or contain spaces, is definitely going to hit someone one day.
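The pitfall above can be sketched safely (using `echo` in place of `rm`, so nothing is actually deleted; the guard pattern shown is one common defensive idiom, not anything from the thread):

```shell
# The unquoted-expansion pitfall, demonstrated with echo instead of rm.
ENV=""                                   # simulate an unset/empty variable
echo rm -rf $ENV/home                    # expands to: rm -rf /home

# Defensive version: ${VAR:?msg} aborts the script if VAR is unset or
# empty, and quoting keeps a value containing spaces as one argument.
ENV="/tmp/build area"
echo rm -rf "${ENV:?ENV must be non-empty}/home"
```

With the quoted, guarded form, an empty `$ENV` stops the script with an error instead of silently targeting `/home`.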
arendjr@programming.dev · 11 points · edited · 1 month ago
In fairness, this also happens to me when I write the bash script myself 😂
kameecoding@lemmy.world · 4 points · 1 month ago
Yes, I had never written a piece of code that didn’t do what I thought it would before LLMs, no sir.
SpaceNoodle@lemmy.world · 14 points · edited · 1 month ago
Yeah, an LLM can quickly parrot some basic boilerplate that’s shown up in its training data a hundred times.
ewenak@jlai.lu · 1 point · 1 month ago
When the script gets too complicated, AI could also convert it to Python. I tried it once at least, and it did a pretty good job, although I had to tell it to use some dedicated libraries instead of calling programs with subprocess.
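The kind of conversion described above might look like this hypothetical before/after: a bash one-liner such as `mkdir -p out && cp config.txt out/ && grep -c error out/config.txt`, rewritten with dedicated stdlib modules (`pathlib`, `shutil`) instead of shelling out with `subprocess` (file names and the `count_errors` helper are illustrative, not from the thread):

```python
import shutil
from pathlib import Path

def count_errors(src: Path, out_dir: Path) -> int:
    out_dir.mkdir(parents=True, exist_ok=True)   # replaces: mkdir -p out
    copied = Path(shutil.copy(src, out_dir))     # replaces: cp src out/
    text = copied.read_text()
    return text.count("error")                   # like grep -c, but counts
                                                 # occurrences, not lines

if __name__ == "__main__":
    src = Path("config.txt")
    src.write_text("error: disk full\nok\nerror: again\n")
    print(count_errors(src, Path("out")))
```

No child processes are spawned, so there is no quoting or word-splitting to get wrong in the first place.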
Yeah fuck that guy