• Ramenator@feddit.de · 1 year ago

      I guess the biggest benefit is that you can ship it directly from there and don’t have to rewrite your application because Debian ships with an outdated version of some core library

      • vrighter@discuss.tchncs.de · 1 year ago

        So it’s not a dev environment at all. It’s a runtime.

        If your code only works on your machine, to the extent that you literally have to ship a copy of your entire machine, your code sucks.

        “it works on my machine” is an excuse. And a shitty one at that.

        Edit: and this way, after a week or two, your container will be the one using an outdated version of a library, not the system.

        • TheLinuxGuy@programming.dev · 1 year ago

          I concur. There are a few problems that can come up on various platforms, like Windows not implementing C11 standard threads; in that case you would use the TinyCThread library instead, which works like a polyfill.
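
          As a minimal sketch of how that polyfill approach works: TinyCThread deliberately mirrors the C11 <threads.h> API, so the same thread code compiles either way. The __STDC_NO_THREADS__ check is standard C11; the vendored header path is an assumption.

          ```c
          /* Fall back to TinyCThread where C11 <threads.h> is missing
           * (e.g. Windows/MSVC); the API is identical either way. */
          #if defined(__STDC_NO_THREADS__) || defined(_WIN32)
          #include "tinycthread.h"   /* assumed vendored header path */
          #else
          #include <threads.h>
          #endif
          #include <stdio.h>

          static int worker(void *arg) {
              printf("hello from thread: %s\n", (const char *)arg);
              return 0;
          }

          int main(void) {
              thrd_t t;
              /* Same thrd_* calls regardless of which header was picked. */
              if (thrd_create(&t, worker, (void *)"portable threads") != thrd_success)
                  return 1;
              int res;
              thrd_join(&t, &res);
              return res;
          }
          ```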

          All of these problems and challenges are workable. If the problem with Debian is an out-of-date library, you could set up a CI/CD release build that rebuilds your software whenever an update occurs and statically links the updated dependencies.
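
          One way to wire that up, sketched here as a hypothetical GitHub Actions workflow (the schedule, binary name, and source path are all illustrative, not prescriptive):

          ```yaml
          # Illustrative only: rebuild on a schedule so dependency updates
          # get picked up, and link statically so the artifact doesn't care
          # which library versions the target distro ships.
          name: release-build
          on:
            schedule:
              - cron: "0 4 * * 1"   # weekly rebuild
            workflow_dispatch:
          jobs:
            build:
              runs-on: ubuntu-latest
              steps:
                - uses: actions/checkout@v4
                - name: Build with static linking
                  run: cc -O2 -static -o myapp src/main.c
                - uses: actions/upload-artifact@v4
                  with:
                    name: myapp
                    path: myapp
          ```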

          Back to your point: if they didn't design their code and architecture to be multi-platform, as you can in C, they need to re-evaluate their design decisions.

      • CameronDev@programming.dev · 1 year ago

        But then you're shipping your entire dev environment as well? Including VS Code? That seems a bit antithetical to what Docker containers are meant to be. Or do you then just strip the container back down again?

        • Tempy@lemmy.temporus.me · 10 months ago

          With VS Code's "Remote Containers" extension, at least, it's clever enough to install that into the container after building the image, so the image built from the Dockerfile doesn't contain the VS Code stuff.
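
          A minimal sketch of what that looks like in practice, assuming a .devcontainer/devcontainer.json next to your Dockerfile (the name and extension list are illustrative):

          ```jsonc
          // .devcontainer/devcontainer.json (JSONC, so comments are allowed).
          // VS Code injects its own server into the running container when
          // you attach; none of this tooling ends up in the image built
          // from the Dockerfile itself.
          {
            "name": "c-dev",
            "build": { "dockerfile": "Dockerfile" },
            "customizations": {
              "vscode": {
                "extensions": ["ms-vscode.cpptools"]  // illustrative extension
              }
            }
          }
          ```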

    • suprjami@lemmy.sdf.org · 1 year ago

      The advantage is that you can have a reproducible development environment regardless of the underlying platform.

      You use Debian and a workmate uses Fedora? No problem.

      Someone joins with Mac or Windows? No problem.

      Your laptop dies and you’re using something temporary for a while? No problem.

      No more differences in system libraries or "well, it works on my laptop" bullshit. Everyone is using the same libraries and compiler, so there is no difference in any developer's experience.
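
      For example, a minimal sketch of such a pinned environment (the gcc:13.2 base image and package list are illustrative choices, not a recommendation):

      ```dockerfile
      # Pin the toolchain image so every developer, on any host OS,
      # builds with the exact same compiler and system libraries.
      FROM gcc:13.2
      RUN apt-get update && apt-get install -y --no-install-recommends \
              make gdb \
          && rm -rf /var/lib/apt/lists/*
      WORKDIR /src
      ```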

      • CameronDev@programming.dev · 1 year ago

        So it’s a different way to get a standard operating environment.

        Could you not achieve something similar by making the build and test happen in the Docker container, while keeping the IDE etc. separate? Bundling the IDE seems a bit overkill.
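
        Something like the following rough sketch (the image tag and make target are placeholders):

        ```sh
        # Edit in whatever IDE you like on the host; only build/test
        # run inside the pinned container environment.
        docker run --rm -v "$PWD":/src -w /src gcc:13.2 make test
        ```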

        Fwiw, in my experience, “it works on my laptop” is a great way to shake out bugs/API implementation quirks, so that’s a benefit for our team. Plus we have a mishmash of IDEs, so prescribing one or the other would probably cause more problems than it solved.

        Still, an interesting solution for those who have the problem.