Long story short, I have a desktop with Fedora: lovely, fast, sleek, and surprisingly reliable for a near-rolling distro (it failed me only once, back around Fedora 34 or so, when it nuked GRUB). Tried to install it on a 2012 i7 MacBook Air… what a slog!!! Surprisingly, Ubuntu runs very smoothly on it. I have been bothering all my friends for years about moving to Fedora (back then it was because I hated Unity) but now… I mean, I know we are supposed to hate it for Snaps and whatnot, but… Christ, it does run well! In fairness, all my VMs are running DietPi (a slimmed-down Debian derivative) and coming back to the APT world feels like coming home.

On the other hand, forcing myself to stay on Fedora keeps me in the DNF world, which carries over to Amazon Linux etc. (which I use for work); it has up-to-date packages, it is nice and clean… Argh, I don’t know how to decide!

Thoughts?

I am not in the mood for Debian. I like the Mint approach, but I am not a fan of its slow release cadence and I would also like to stay as close to upstream as possible; the Debian edition (LMDE) is the only one that seems reliable enough, but, again, it is Debian, so the packages are “old”. Pop!_OS and similar are two hops away from upstream, so I’d rather not.

Is Snap really that bad?

Edit: thank you all for sharing your experience!

    • hallettj@beehaw.org · 11 months ago

      Debian unstable is not really unstable, but it’s also not as stable as Ubuntu. I’m told that when bugs appear they are fixed fast.

      I ran Debian testing for years. That is a rolling release where package updates are a few weeks behind unstable. The delay gives unstable users time to hit bugs before they get into testing.

      When I wanted certain packages to be really up-to-date I would pin just those packages to unstable or to experimental. But I never tried running full unstable myself, so I can’t say whether that would be less trouble overall.
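
      For reference, a minimal sketch of that kind of pin, assuming testing is the base system, unstable has been added as an extra source, and neovim is just a placeholder package name:

        # /etc/apt/sources.list.d/unstable.list (unstable available alongside testing)
        deb http://deb.debian.org/debian unstable main

        # /etc/apt/preferences.d/pinning
        # keep testing as the default for everything:
        Package: *
        Pin: release a=testing
        Pin-Priority: 900

        # ...but take this one package (and its upgrades) from unstable:
        Package: neovim
        Pin: release a=unstable
        Pin-Priority: 990

      For a one-off you can also skip the permanent pin and just run apt install -t unstable neovim.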

    • XTL@sopuli.xyz · 11 months ago

      It’s unstable in the sense that it doesn’t stay the same for a long time. Stable is the release that will essentially stay the same until you install a different release.

      Sid is the kid next door (IIRC) from Toy Story who would melt and mutilate toys for fun. He may have been a different kind of unstable.

      Neither is unstable like an old Windows PC.

    • pruneaue · 11 months ago

      Unstable is pretty damn stable; it feels Arch-y to me, and Arch rarely has issues. If there are issues, they’re fixed fast.
      Testing is the middle ground. Tested for a bit by unstable peeps, but that’s it.

      • dan@upvote.au · 11 months ago

        Testing is the middle ground. Tested for a bit by unstable peeps, but that’s it.

        IIRC packages have to be in unstable with no major bugs for 10 days before migrating to testing. It’s a good middle ground IMO.

        Of course, you could always run unstable and be the one to report the bugs :)
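
        If you want to see where a package currently sits, here are a couple of quick checks (rmadison comes from the devscripts package; curl is just an example name):

          # versions available in each Debian suite, queried from the archive
          rmadison curl

          # what your own configured sources would give you
          apt policy curl

        The testing-migration status (“excuses”) for a package is also visible on its tracker.debian.org page.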

    • Draconic NEO@lemmy.dbzer0.com · 11 months ago

      It’s not actually unstable; more accurately, it’s tested and verified much like Debian stable, meaning it’s fine for desktop use. I wouldn’t use it for a server or critical system I plan on running 24/7 without interruption, though, both because it may develop bugs under long-term use and because it gets more frequent updates, which will be missed and render it out of date quickly if it’s running constantly.

    • rufus@discuss.tchncs.de · 11 months ago

      It’s relatively alright for something that’s called unstable. There is also testing, whose packages have usually spent at least 10 days in unstable first. And you can mix and match the two, but that’s not really recommended either.

      I wouldn’t put it on my server. And I wouldn’t recommend it to someone who isn’t okay with fixing the occasional hiccup. But I’ve been using it for years and I like it.

      However, mind that it’s not officially supported and the security team doesn’t pay attention to it.

      • dan@upvote.au · 11 months ago

        I used to run Debian testing on my servers. These days I don’t have much free time to mess with them, so they’re all running the stable release with unattended-upgrades.

        However, mind that it’s not officially supported and the security team doesn’t pay attention to it.

        To be clear, it can still get security updates, but it’s the package maintainer’s responsibility to upload them. Some maintainers are very responsive while others take a while. On the other hand, Debian stable has a security team that quickly uploads patches to all officially supported packages (just the “main” repo, not contrib, non-free, or non-free-firmware).
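
        In case it’s useful, here is roughly what the unattended-upgrades setup looks like on a stable box (bookworm is just the example codename; the stock 50unattended-upgrades config on Debian already allows the Debian-Security origin):

          # install and turn on the periodic runs
          apt install unattended-upgrades
          dpkg-reconfigure -plow unattended-upgrades

          # /etc/apt/apt.conf.d/20auto-upgrades (written by dpkg-reconfigure)
          APT::Periodic::Update-Package-Lists "1";
          APT::Periodic::Unattended-Upgrade "1";

          # sources.list needs the security suite for those patches to arrive
          deb http://security.debian.org/debian-security bookworm-security main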

        • rufus@discuss.tchncs.de · 11 months ago

          Thanks for clarifying. Yeah, I implied that but didn’t explain all the nuances. I’ve been scolded before for recommending Debian testing. I’m quite happy with it. But since I’m not running anything cutting-edge on my server and Docker etc. have become quite stable… I don’t see any need to put testing on the server. I use stable there too and embrace the security fixes and the stability / low maintenance. I do, however, run testing/unstable on my laptop.

    • lemmyvore@feddit.nl · 11 months ago

      It’s a dumping ground for new packages. Nobody makes any guarantees about it. It’s supposed to be used only as a staging area by developers.

      It may happen to work when you install it or it may crash constantly. You don’t know.