• Daedskin@lemm.ee

      I like the sentiment of the article; however this quote really rubs me the wrong way:

      I’m not suggesting we abandon AI tools—that ship has sailed.

      Why would that ship have sailed? No one is forcing you to use an LLM. If, as the article supposes, using an LLM is detrimental, and it’s possible to start having days where you don’t use an LLM, then what’s stopping you from increasing the frequency of those days until you’re not using an LLM at all?

      I personally don’t interact with any LLMs, neither at work nor at home, and I don’t have any issue getting work done. Yeah, there was a decently long ramp-up period — maybe about 6 months — when I started on my current project at work where it was more learning than doing; but now I feel like I know the codebase well enough to approach any problem I come up against. I’ve even debugged USB driver stuff, and, while it took a lot of research and reading USB specs, I was able to figure it out without any input from an LLM.

      Maybe it’s just because I’ve never bought into the hype; I just don’t see how people hold LLMs in such high regard. I’m of the opinion that using an LLM has potential only as a truly last resort — and even then it will likely not be useful.

    • Mnemnosyne@sh.itjust.works

      “Every time we use a lever to lift a stone, we’re trading long term strength for short term productivity. We’re optimizing for today’s pyramid at the cost of tomorrow’s ability.”

      • Ebber@lemmings.world

        If you don’t understand how a lever works, then it’s a problem. Should we let any person with an AI design and operate a nuclear power plant?

      • julietOscarEcho@sh.itjust.works

        Precisely. If you train by lifting stones you can still use the lever later, but you’ll be able to lift even heavier things by using both your new strength AND the lever’s mechanical advantage.

        By analogy, if you’re using LLMs to do the easy bits in order to spend more time on harder problems, fuckin’ A. But the idea that you can just replace actual coding work with copy-paste is a shitty one. Again, by analogy with rock lifting: now you have noodle arms and can’t lift shit if your lever breaks or doesn’t fit under a particular rock or whatever.

        • wizardbeard@lemmy.dbzer0.com

          Also: assuming you know what the easy bits are before you actually have experience doing them is a recipe to end up training incorrectly.

          I use plenty of tools to assist my programming work. But I learn what I’m doing and why first. Then, once I have that experience, if there’s a piece of code I find myself using or looking up frequently, I make myself a template (VSCode’s snippet features are fucking amazing when you build your own snips well, btw).

      • trashgirlfriend@lemmy.world

        “If my grandma had wheels she would be a bicycle. We are optimizing today’s grandmas at the sacrifice of tomorrow’s eco friendly transportation.”

      • AeonFelis@lemmy.world

        Actually… Yes? People’s health did deteriorate due to over-reliance on technology over the generations. At least, the health of those who have access to that technology.

      • AdamBomb@lemmy.sdf.org

        LLMs are absolutely not able to create wonders on par with the pyramids. They’re at best as capable as a junior engineer who has read all of Stack Overflow but doesn’t really understand any of it.

    • Guttural@jlai.lu

      This guy’s solution to becoming crappier over time is “I’ll drink every day, but abstain one day a week”.

      I’m not convinced that “that ship has sailed” as he puts it.

    • Hoimo@ani.social

      Not even. Every time someone lets AI run wild on a problem, they’re trading all the trust I ever had in them for complete garbage that they’re not even personally invested enough in to defend when I criticize their absolute shit code. Don’t submit it for review if you haven’t reviewed it yourself, Darren.

      • wizardbeard@lemmy.dbzer0.com

        My company doesn’t even allow AI use, and I’ve lost count of the number of times I’ve tried to help a junior diagnose an issue with a simple script they made, only to be told that they don’t actually know what their own code does well enough to even begin troubleshooting…

        “Why do you have this line here? Isn’t that redundant?”

        “Well it was in the example I found.”

        “Ok, what does the example do? What is this line for?”

        Crickets.

        I’m not trying to call them out, I’m just hoping that I won’t need to familiarize myself with their whole project and every fucking line in their script to help them, because at that point it’d be easier to just write it myself than try to guide them.

    • Agent641@lemmy.world

      Nahhh, I never would have solved that problem myself, I’d have just googled the shit out of it til I found someone else that had solved it themselves

    • merc@sh.itjust.works

      And also possibly checking in code with subtle logic flaws that won’t be discovered until it’s too late.

  • SkunkWorkz@lemmy.world

    Yeah, fake. No way you can get 90%+ using ChatGPT without understanding code. LLMs barf out so much nonsense when it comes to code. You have to correct it frequently to make it spit out working code.

    • Artyom@lemm.ee

      If we’re talking about freshman CS 101, where every assignment is the same year over year and it’s all machine graded, yes, 90% is definitely possible, because an LLM can essentially act as a database of all problems and all solutions. A grad student TA could probably see through his “explanations”, but they’re tired from their endless stack of work, so why bother?

      If we’re talking about a 400 level CS class, this kid’s screwed and even someone who’s mastered the fundamentals will struggle through advanced algorithms and reconciling math ideas with hands-on-keyboard software.

    • AeonFelis@lemmy.world
      1. Ask ChatGPT for a solution.
      2. Try to run the solution. It doesn’t work.
      3. Post the solution online as something you wrote all on your own, and ask people what’s wrong with it.
      4. Copy-paste the fixed-by-actual-human solution from the replies.
    • threeduck@aussie.zone

      Are you guys just generating insanely difficult code? I feel like 90% of all my code generation with o1 works first time? And if it doesn’t, I just let GPT know and it fixes it right then and there?

      • KillingTimeItself@lemmy.dbzer0.com

        The problem is more complex than initially thought, for a few reasons.

        One, the user is not very good at prompting, and will often fight with the prompt to get what they want.

        Two, often the user has a very specific vision in mind, which the AI obviously doesn’t know, so the user ends up fighting that.

        Three, the AI is not omniscient, and just fucks shit up, makes goofy mistakes sometimes. Version assumptions, code-compat errors, just weird implementations of shit: the kind of stuff you would expect AI to do that’s going to make it harder to manage code after the fact.

        Unless you’re using AI strictly to write isolated scripts in one particular language, AI is going to fight you at least some of the time.

        • sugar_in_your_tea@sh.itjust.works

          I asked an LLM to generate tests for a 10-line function with two arguments, no if branches, and only one library function call; it’s just a for loop and some math. Somehow it invented extra arguments, and the tests that actually ran didn’t even pass. It made like 5 test functions, spat out paragraphs explaining nonsense, and it still didn’t work.

          This was one of the smaller deepseek models, so perhaps a fancier model would do better.

          I’m still messing with it, so maybe I’ll find some tasks it’s good at.
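
For contrast, the kind of function described above, plus a hand-written test, is only a few lines; everything here (the names, the math) is invented purely for illustration:

```python
import math

# A hypothetical function of roughly the shape described:
# two arguments, no branches, a for loop, some math, one library call.
def root_sum(values, scale):
    total = 0.0
    for v in values:
        total += math.sqrt(v) * scale
    return total

# A hand-written test needs no invented arguments and actually passes.
def test_root_sum():
    assert root_sum([4, 9], 2) == 10.0  # (2 + 3) * 2

test_root_sum()
```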

          • KillingTimeItself@lemmy.dbzer0.com

            From what I understand, the “preview” models are quite handicapped; usually the benchmark is the full-fat model for that reason. The recent OpenAI one (they have stupid names, idk what is what anymore) had a similar problem.

            If it’s not a preview model, it’s possible a bigger model would help, but usually prompt engineering is going to be more useful. AI is really quick to get confused sometimes.

            • sugar_in_your_tea@sh.itjust.works

              It might be, idk; my coworker set it up. It’s definitely a distilled model, though. I did hope it would do a better job on such a small input.

              • KillingTimeItself@lemmy.dbzer0.com

                The distilled models are a little goofier; it’s possible that might influence it, since they tend to behave weirdly sometimes, but it depends on the model and the application.

                AI is still fairly goofy, unfortunately; it’ll take time for it to become omniscient.

      • nimbledaemon@lemmy.world

        I just generated an entire Angular component (table with filters, data services, using in-house software patterns and components, based off of existing work) using Copilot for work yesterday. It didn’t work at first, but I’m a good enough software engineer that I iterated on the issues, discarding bad edits and referencing specific examples from the extant codebase, and got Copilot to fix it. 3-4 days of work (if you were already familiar with the existing way of doing things) done in about 3-4 hours.

        But if you didn’t know what was going on and how to fix it, you’d end up with an unmaintainable, non-functional mess, full of bugs we have specific fixes in place to avoid but Copilot doesn’t care about, because it doesn’t have an idea of how software actually works, just what it should look like. So for anything novel or complex you have to feed it an example, then verify it didn’t skip steps, forget to include something it didn’t understand/predict, or make up a library/function call.

        So you have to know enough about the software you’re making to point that stuff out, because just feeding whatever error pops out of your compiler back into the AI may get you to working code, but it won’t ensure quality code, maintainability, or intelligibility.

      • surph_ninja@lemmy.world

        A lot of people assume their not knowing how to prompt is a failure of the AI. Or they tried it years ago, and assume it’s still as bad as it was.

      • JustAnotherKay@lemmy.world

        My first attempt at coding with ChatGPT was asking about saving information to a file in Python. I wanted to know what libraries were available and the syntax to use them.

        It gave me a three-page write-up about how to write a library myself, in Python. Only it had an error on damn near every line, so I still had to go Google the actual libraries and their syntax and slog through the documentation.
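
For what it’s worth, the standard library already covers this; a minimal sketch using `json` (the function names and file name here are made up for illustration):

```python
import json

def save_settings(path, data):
    # Serialize a dict to a JSON text file
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

def load_settings(path):
    # Read the file back into a Python dict
    with open(path) as f:
        return json.load(f)

save_settings("settings.json", {"theme": "dark", "volume": 7})
print(load_settings("settings.json"))  # {'theme': 'dark', 'volume': 7}
```

The `pickle` and `csv` modules cover binary objects and tabular data the same way, without writing any library yourself.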

    • Maggoty@lemmy.world

      Usually this joke runs with a second point of view saying: do I tell them, or let them keep thinking this is cheating?

  • kabi@lemm.ee

    If it’s the first course where they use Java, then one could easily learn it in 21 hours, with time for a full night’s sleep. Unless there’s no code completion and you have to write imports by hand. Then, you’re fucked.

    • rockerface 🇺🇦@lemm.ee

      If there’s no code completion, I can tell you even people who’ve been coding as a job for years aren’t going to write it correctly from memory. Because we’re not being paid to memorize this shit, we’re being paid to solve problems optimally.

      • spamfajitas@lemmy.world

        My undergrad program had us write Java code by hand for some beginning assignments and exams. The TAs would then type whatever we wrote into Eclipse and see if it ran. They usually graded pretty leniently, though.

        • ByteJunk@lemmy.world

          There’s nobody out there writing “commercial” code in Notepad. It’s the concepts that matter, not the spelling, so if OP got a solid grasp on those from using GPT, he’ll probably make it just fine.

    • 404@lemmy.zip

      My first programming course (in Java) had a pen and paper exam. Minus points if you missed a bracket. :/

      • ECB@feddit.org

        I got -30% for not writing comments on my pen-and-paper Java final.

        Somehow it just felt a bit silly to do, I guess

      • DragonOracleIX@lemmy.ml

        It was the same for the class I took in high school. I remember the teacher saying that it’s to make sure we actually understand the code we write, since the IDE does some of the work for you.

    • kopasz7@sh.itjust.works

      I remember having to use (a modified version of?) Quincy for C. Trying to paste anything would put random characters into your file.

      Still beats programming on paper.

  • TootSweet@lemmy.world

    generate code, memorize how it works, explain it to profs like I know my shit.

    ChatGPT was just his magic feather all along.

  • nednobbins@lemm.ee

    The bullshit is that anon wouldn’t be fsked at all.

    If anon actually used ChatGPT to generate some code, memorize it, understand it well enough to explain it to a professor, and get a 90%, congratulations, that’s called “studying”.

    • JustAnotherKay@lemmy.world

      Yeah, if you memorized the code and its functionality well enough to explain it in a way that successfully bullshits someone who can sight-read it… you know how that code works. You might need a linter, but you know how that code works and can probably at least fumble your way through a shitty v0.5 of it.

    • naught101@lemmy.world

      I don’t think that’s true. That’s like saying that watching hours of guitar YouTube is enough to learn to play. You need to practice too, and learn from mistakes.

      • RobertoOberto@sh.itjust.works

        I don’t think that’s quite accurate.

        The “understand it well enough to explain it to a professor” clause is carrying a lot of weight here - if that part is fulfilled, then yeah, you’re actually learning something.

        Unless of course, all of the professors are awful at their jobs too. Most of mine were pretty good at asking very pointed questions to figure out what you actually know, and could easily unmask a bullshit artist with a short conversation.

        • naught101@lemmy.world

          I didn’t say you’d learn nothing, but the second task was not just to explain (when you’d have the code in front of you to look at), but to actually write new code, for a new problem, from scratch.

        • Nalivai@lemmy.world

          You don’t need physical skills to program; there is nothing that needs to be honed into muscle memory by repetition. If you know how to type and what to type, you’re ready to type. If you know which strings to pluck, you still need to train your fingers to do it; it’s a different skill.

      • nednobbins@lemm.ee

        It’s more like if played a song on Guitar Hero enough to be able to pick up a guitar and convince a guitarist that you know the song.

        Code from ChatGPT (and other LLMs) doesn’t usually work on the first try. You need to go fix and add code just to get it to compile. If you actually want it to do whatever your professor is asking you for, you need to understand the code well enough to edit it.

        It’s easy to try for yourself. You can go find some simple programming challenges online and see if you can get ChatGPT to solve a bunch of them for you without having to dive in and learn the code.

        • WarlordSdocy@lemmy.world

          I mean, I feel like, depending on what kind of problems they started off with, ChatGPT probably could just solve simple first-year programming problems. But yeah, as you get to higher-level classes it will definitely not fully solve the stuff for you, and you’d have to actually go in and fix it.

      • Maggoty@lemmy.world

        No, he’s right. Before ChatGPT there was Stack Overflow. A lot of learning to code is learning to search up solutions on the Internet. The crucial thing is to learn why a solution works, though. Memorizing code like a language is impossible. You’ll obviously memorize some common stuff, but things change really fast in the programming world.

    • Gutek8134@lemmy.world

      My Java classes at uni:

      Here’s a piece of code that does nothing. Make it do nothing, but in compliance with this design pattern.

      When I say it did nothing, I mean it had literally empty function bodies.

      • boletus@sh.itjust.works

        Yeah, that’s object-oriented programming and interfaces. It’s shit to teach people without a practical example, but it’s a completely passable way to do OOP in industry: you start by writing interfaces to structure your program and fill in the implementation later.

        Now, is it good practice? Probably not; imo software design is impossible to get right without iteration. But people still use this method… good to understand why it sucks.
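
That interface-first flow looks roughly like this; a sketch in Python for brevity, with the class and method names invented for illustration:

```python
from abc import ABC, abstractmethod

# Structure first: an interface whose bodies are deliberately empty...
class Storage(ABC):
    @abstractmethod
    def save(self, key, value): ...

    @abstractmethod
    def load(self, key): ...

# ...implementation filled in later, once the design settles.
class MemoryStorage(Storage):
    def __init__(self):
        self._data = {}

    def save(self, key, value):
        self._data[key] = value

    def load(self, key):
        return self._data[key]

s = MemoryStorage()
s.save("a", 1)
print(s.load("a"))  # 1
```

The coursework version is the same idea: the empty bodies are the design, and the assignment is to arrange them to fit a pattern.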

      • e8d79@discuss.tchncs.de

        So what? You also learn math with exercises that ‘do nothing’. If it bothers you so much add some print statements to the function bodies.

        • Gutek8134@lemmy.world

          I actually did do that. My point was to present a situation where you basically do nothing in higher education, which is not to say you don’t do/learn anything at all.

      • I Cast Fist@programming.dev

        Mine were actually useful, gotta respect my uni for that. The only bits we didn’t manually program ourselves were the driver and the tomcat server, near the end of the semester we were writing our own Reflections to properly guess the object type from a database query.

    • TheSlad@sh.itjust.works

      A lot of kids fresh out of high school are pressured into going to college right away. It’s the societal norm for some fucking reason.

      Give these kids a break and let them go when they’re really ready. Personally I sat around for a year and a half before I felt like “fuck, this is boring, let’s go learn something now”. If I had gone to college straight from high school I would’ve flunked out and just wasted all that money for nothing.

      • boletus@sh.itjust.works

        Yeah, I remember in high school they were pressuring everybody to go straight to uni, and I personally thought it was kinda predatory.

        • wizardbeard@lemmy.dbzer0.com

          I wish I hadn’t gone straight in, personally. I wasted a lot of money and time before I got my shit together and went back for an associate’s a few years later.

      • boletus@sh.itjust.works

        Not a single person I’ve worked with in software has gotten a job with just a diploma/degree since like the early 2000s

        Maybe it’s different in some places.

        • FlexibleToast@lemmy.world

          Many HR departments will automatically kick out an application if it doesn’t have a degree. It’s an easy filter even if it isn’t the most accurate.

          • boletus@sh.itjust.works

            I meant any form of qualification. Sure it helps, but the way you get the job is by showing you can actually do the work. Like a folio and personal projects or past history.

            • blackbeards_bounty@lemmy.dbzer0.com

              Art? Most programming? “Hard skills” / technical jobs… GOOD jobs, sure. But there are plenty of degrees & jobs out there. Sounds like you landed where you were meant to be; a lot of folks go where opportunity and the market take them.

              • boletus@sh.itjust.works

                It’s probably a regional difference. Here in AU, you can be lucky and land a few postgrad jobs if you really stood out. Otherwise you’re entirely reliant on having a good folio and, most importantly, connections.

    • GraniteM@lemmy.world

      If you go through years of education, learn nothing, and all you get is a piece of paper, then you’ve just wasted thousands of hours and tens of thousands of dollars on a worthless document. You can go down to FedEx and print yourself a diploma on nice paper for a couple of bucks.

      If you don’t actually learn anything at college, you’re quite literally robbing yourself.

    • FlexibleToast@lemmy.world

      It’s super easy to learn how algorithms and what not work without knowing the syntax of a language. I can tell you how a binary search tree works, but I have no clue how to code it in Java because I’ve never used Java.
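
To illustrate that point, here’s a minimal binary search tree sketch (in Python rather than Java, with names invented for illustration) — the concept is the whole thing; the syntax is incidental:

```python
# Each node holds a value; smaller values go left, larger go right.
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    # Walk down until we find an empty slot on the correct side
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def contains(root, value):
    # Same walk: compare and descend left or right
    if root is None:
        return False
    if value == root.value:
        return True
    return contains(root.left if value < root.value else root.right, value)

root = None
for v in [5, 2, 8, 1]:
    root = insert(root, v)
print(contains(root, 8), contains(root, 3))  # True False
```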

    • Dr. Moose@lemmy.world

      I’m a full-stack polyglot, and tbh I couldn’t program in some languages without reference docs / an LLM, even though I ship production code in those languages all the time. Memorizing all of the function and method names and all of the syntax/design-pattern stuff is pretty hard, especially when it’s not really needed in contemporary dev.

    • SaharaMaleikuhm@feddit.org

      You’d think that, but I believe you are underestimating people’s ability to mindlessly memorize stuff without learning it.

      • SoftestSapphic@lemmy.world

        It’s what we’re trained to do throughout our education system.

        I have a hard time getting mad about it considering it’s what we told them to do from a very young age.

  • SoftestSapphic@lemmy.world

    This person is LARPing as a CS major on 4chan

    It’s not possible to write functional code without understanding it, even with ChatGPT’s help.

    • HotCoffee@lemm.ee

      U underestimate the power of the dark side, and how powerful Ctrl+C Ctrl+V is, young padawan

      • SoftestSapphic@lemmy.world

        If you copy and paste from ChatGPT, your code won’t compile.

        You need to know what the pieces of code do and how to piece them together to make it work.

        Which is kind of impossible to do without understanding it.

    • billwashere@lemmy.world

      You would think eventually some of it would sink in. I mean I use LLMs to write code all the time but it’s very rarely 100% correct, often with syntax errors or logic problems. Having to fix that stuff is an excellent way to at least learn the syntax.

  • Xanza@lemm.ee

    pay for school

    do anything to avoid actually learning

    Why tho?

      • Bronzebeard@lemm.ee

        Losing the job after a month of demonstrating you don’t know what you claimed to is not a great return on that investment…

        • L0rdMathias@sh.itjust.works

          It is, because you now have the title on your resume and can just lie about getting fired. You just need one company to not call a previous employer or do a half hearted background check. Someone will eventually fail and hire you by accident, so this strategy can be repeated ad infinitum.

          • Bronzebeard@lemm.ee

            Sorry, you’re not making it past the interview stage in CS with that level of knowledge. Even on the off chance that name on the resume helps, you’re still getting fired again. You’re never building up enough to actually last long enough searching to get to the next grift.

            • L0rdMathias@sh.itjust.works

              I am sorry that you believe that all corporations have these magical systems in place to infallibly hire skilled candidates. Unfortunately, the idealism of academia does not always transfer to the reality of industry.

          • Xanza@lemm.ee

            Any actual professional company or job of value is going to check your credentials and your work history… So sure, you may get that job at a Quality Inn as a night manager making $12 an hour because they didn’t fucking bother to check your resume…

            But you’re not getting some CS job making $120,000 a year because they didn’t check your previous employer. Lol

      • _stranger_@lemmy.world

        Yeah, Anon paid an AI to take the class he paid for. Setting his money on fire would have been more efficient.

        • Nalivai@lemmy.world

          After you finish a random course, a bunch of tech bros contact you immediately, give you a bunch of money, and take you to the land of Silicon, where you play fusball and drink beer, occasionally typing on a keyboard randomly.
          At least, that’s how those things are advertised

          • _stranger_@lemmy.world

            I’ve been on both sides of that zoom call, and yeah, the amenities are there because they expect you to live there

  • xelar@lemmy.ml

    Brainless GPT coding is becoming the new norm at uni.

    Even if I get the code via ChatGPT, I try to understand what it does. How are you gonna maintain those hundreds of lines if you don’t know how they work?

    Not to mention, you won’t cheat your way through a recruitment interview.

  • WolfLink@sh.itjust.works

    Any competent modern IDE or compiler will help you find syntax mistakes. Knowing the concepts is way more important.

    • Farid@startrek.website

      Took first semester Java test a month ago. Had to use a built-in WYSIWYG editor within the test webpage.