- cross-posted to:
- linux_memes@programming.dev
- linux_memes@sopuli.xyz
I’ll give the whole story if anyone wants it
Okay, so here’s the recap:
I woke up this morning and decided my main drive (just a 500GB SSD) was too full, at about 85%, so I decided to do something about that. I go through the usual: pacman -Sc, paccache -rk0, and pacman -Qqtd | pacman -Rns -
(which I’ve aliased to “orphankiller” because that’s too much typing for me). None of that did anything though, as I’m usually pretty on top of this, and I expected it, so my next step was to find other ways of deleting unnecessary files floating around, and that meant a trip to the usually very helpful Arch wiki.

On the page “pacman Tips and Tricks”, I find 1.7: Detecting More Unneeded Packages. “Perfect!” I thought, “That’s exactly what I’m looking for!” I enthusiastically type in the command pacman -Qqd | pacman -Rns -, and then quickly go check how much space I just saved. Nada. Or at least not enough to move the percentage point. “Oh well, keep looking,” I think, and I go back to Firefox to click some more links in hopes that one of them will be the space-saving ultra-script that I need. The first one I click, I get an error from my trusty browser; I don’t remember exactly what it was, but it was something about not being able to verify the page. “Weird, let’s try another one.” Nope, same thing.

Well, being that I had just deleted something, I figured I should go see what exactly it was that I did. It was a good thing I’d left the terminal window open, because after just a few scrolls I saw it:
ca-certificates, which Firefox absolutely needs. “Great, I’ll just reinstall.” Nope! I had just deleted my pacman cache, and pacman also needs those certificates to download from the Arch repo’s mirrors! “Fantastic,” I grumbled while I tried to think of how I could get this pesky package back on my machine.

Then it occurred to me: I’ve been keeping up with my btrfs snapshots (for once, lol)! I can just roll back to yesterday and forget this whole mess! So I bring up Timeshift, and we’re on our way back to a functioning system! Or so I thought. See, I don’t have a separate /home partition, but I do have a separate @home subvolume, so when Timeshift asked me if I wanted to restore that too, I clicked the check mark. Only thing is, I don’t think I actually have a separate @home subvolume, which brings us to the error in the meme. /home wouldn’t mount, and that meant I was borked.
Fortunately, our story has a happy ending! I DDG’d the error on my phone, and found a post from like seven years ago, about someone who had this same set of circumstances, and the one reply was my fix: just go into /etc/fstab and delete the “subvolid” part of whatever partition is giving you grief. Did that, rebooted, and we’re finally fixed! And now, forevermore, I shall check what I’m deleting before I hit the enter button!

The post-script is bittersweet though, because after all this trouble, and then the rest of the afternoon working on the original problem, I am down to… 81%. Oh well.
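For anyone who hits the same wall, the fix looked roughly like the sketch below (the UUID and subvolid values here are made up for illustration). Dropping the subvolid= option lets the kernel resolve the subvolume by name, instead of by an ID that no longer matches anything after a snapshot restore.

```
# /etc/fstab — before: mount pinned to a subvolume ID that no longer exists
# UUID=1234-abcd  /home  btrfs  subvol=@home,subvolid=257  0  0

# after: mount by subvolume name only
UUID=1234-abcd  /home  btrfs  subvol=@home  0  0
```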
Delete unused BTRFS snapshots. Enable compression by setting flags in /etc/fstab, and run btrfs defrag to compress existing snapshots.
Great suggestions, that will absolutely be my tomorrow project!
I use BTRFS with zstd compression at the default level basically everywhere and it’s great. I don’t notice any performance difference but I have a lot more storage.
Defrag will break the CoW sharing with the snapshots tho. It will definitely make things worse. I’d say remove snapshots (but keep at least one per subvolume), set the flags, and wait for the old uncompressed data to trickle out as snapshots rotate.
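For reference, the flags in question are btrfs mount options in /etc/fstab; something along these lines (the UUIDs and subvolume names here are illustrative, and :3 is just the default zstd level spelled out):

```
# /etc/fstab — transparent zstd compression; only data written after
# remounting gets compressed, which is why old snapshots have to
# rotate out before the savings show up
UUID=1234-abcd  /      btrfs  subvol=@,compress=zstd:3,noatime      0  0
UUID=1234-abcd  /home  btrfs  subvol=@home,compress=zstd:3,noatime  0  0
```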
Try removing any unused language packs! I’ve heard that the French one takes up a lot of space, remove it with
sudo rm -rf /
/s
You mean the Rfench one?
you messed up
sudo rm -fr /*
no /s, this actually works
/s
You joke, but I actually did remove locales with BleachBit, and then changed pacman.conf to skip the unnecessary ones. Saved me about 400MB!

BleachBit? Wipe with a cloth?
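For the curious, the pacman.conf trick mentioned above is pacman’s NoExtract option; a sketch (which locales you keep is up to you, the patterns below are just an example):

```
# /etc/pacman.conf — skip extracting locales except English
NoExtract = usr/share/locale/* !usr/share/locale/en* !usr/share/locale/locale.alias
```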
Thank you, I was making a tongue in cheek reference to this though: https://www.bleachbit.org/cloth-or-something
LMAO I was unaware of this! That’s hilarious!
Nono, -rf deletes all your files. Use sudo rm -fr /* instead to actually delete the French language pack!

also /s
edit: someone already made the joke, darn
I DDG’d the error on my phone
Found this incredibly relatable lmao.
Yeah, that’s pretty much how I solve all my problems lol
You did mention a “main drive”. I don’t know what’s taking up all that space on your SSD, but if you have a media library, you could move it to a connected HDD. While HDDs aren’t great as boot drives, they do the job well enough for most “standard” quality media. The same can be said for documents and more, obviously. You can then auto-mount the other drive inside your home directory for seamless access.
One thing that isn’t mentioned, but I’ll say it just in case: always have external backups. I’ve scared myself way too many times thinking I had lost my main drive’s data, just to find it the next day on one of my backups. Really a life saver when your setup has a problem where all you can find is that one forum post from 12y ago with a “Nvm I fixed it” marked as [FIXED].
Other than that, thanks for sharing and with the solution at that.
Yeah, my other drive is a 1TB HDD, and I do have all my media/documents/pictures/etc. there, I think what’s filling up my drive is actually plugins for Ardour lol, plus I might have too many Things I Definitely Need™. Maybe the real solution to my storage problems is to look within… (like do I seriously need No Man’s Sky installed all the time for the once every three months that I play it?)
But yeah, I wanna set up a NAS for this sort of thing, next time I have money lol
(like do I seriously need No Man’s Sky installed all the time for the once every three months that I play it?)
That sounds like the data is in semi-regular use, at least. For me it’s more like “Do I seriously need the sequel installed for that other game I haven’t even started yet, but am definitely going to start any day now, after years of having it installed?”.
Optimizing your system for space is usually wasted effort on Linux; this is not Windows. To see what’s using all the space, there are plenty of storage-analyzing tools like Baobab, qdirstat, etc.
try qdirstat maybe
That’s the second recommendation for qdirstat, so it’s definitely on the tomorrow list!
And for the CLI, ncdu is great
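A minimal invocation, assuming ncdu is installed (the -x flag keeps it from crossing into other mounted filesystems):

```
# scan / interactively, staying on one filesystem; sudo so it can read everything
sudo ncdu -x /
```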
I’d say you might have had a snapshot still holding the deleted data when you first deleted the cache. I don’t use Timeshift for my backups, but I’d assume it uses the same kind of incremental snapshot as btrbk. Which means that, until the next backup date, it will hold onto the previous state of the system, preventing the files from truly being deleted.
You may also have some balance issues, having way more metadata allocation than needed. Try running a balance and see if it changes something.
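A balance run along these lines is a reasonable first try (run as root; the usage thresholds are just a conservative starting point, not gospel):

```
# check how data vs metadata chunks are allocated
btrfs filesystem usage /

# rewrite data and metadata chunks that are less than 50% used,
# returning the freed chunks to the unallocated pool
btrfs balance start -dusage=50 -musage=50 /
```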
Please do share. What better thing to do than to take a break from a broken install to read about someone’s own hardship with the endless quest that is maintaining a rolling-release distro.
Just posted it!
this happened to me once, oh man… it was painful
Was it ~7 years ago? Maybe it was your old post I found!
Can’t be, I started using Linux just a year ago and Arch since November. I broke something, tried to rollback, broke it more. I learnt my lesson and now I read most of the docs of anything that messes with the system before even installing.
Just an update: following the very helpful suggestions in this thread has gotten my drive usage down to 16%! Super happy about that, y’all rock!
Sometimes, when it backstabs me, I call it BeTRayFS
I use qdirstat a lot to determine what files are eating all my space
I will check that out! Mostly I’ve been looking for something to determine what files are no longer in use, like old configs for programs I don’t even have anymore, etc.
I think pacreport --unowned-files might be able to help with that too; it shows you files that aren’t part of any installed package. Probably only does system files though, nothing in /home
Definitely want to hear the story! I’ve been through my own version of this and would like to hear your experience and how you resolved it.
Just posted the story!
I don’t understand why people use Arch. It takes all kinds I suppose. For me I automate everything and use preconfigured stuff when I can.
I like to tinker, plus I can be absolutely assured that every problem with my system is 100% my fault, which actually makes it easier to track down. But the main reasons people use Arch are probably the rolling release model and the AUR.
Yup, the only reason I can’t move on is the AUR and the rolling release. That said, I’m thinking of trying NixOS, but I’m not quite sure it’s for me, as it isn’t POSIX. It seems some software doesn’t really like that, although I’ve heard it’s pretty awesome as a server OS.
Yeah, I could see it being a good server OS, but otherwise NixOS seems like it’s on the “immutable” thing that’s popular right now. I’ve tried a few immutable distros, and they’re not for me, I end up layering everything anyways lol
Yeah, that’s my thought. I wanna keep up to date and quite frequently change my system. I like having the reproducibility, but I feel like the immutability might get in the way. My servers, though, stay pretty static.
It isn’t necessarily your fault as it is unstable software. It is going to break and fall apart. I feel like having a homelab is a much more productive way to tinker.
Arch isn’t unstable, I just keep breaking things in my ignorance. The only thing in this scenario I could pin on Arch is that the ca-certificates package should have been marked as a dependency for pacman, but I guess it’s not strictly a dependency, as you can use pacman to install stuff from a local repo. It definitely is one for Firefox, though, as you cannot browse the internet without the certs.
sounds like timeshift’s fault, not btrfs
Could be, seems to me that BTRFS didn’t match the subvolid between @home and what it expected @home to be in the fstab, but I won’t claim to be an expert lol
I’ve been burned by btrfs before. Never again. It’s not a good file system, especially for multi disk systems.
Idk about all that, it’s been fine for me, just a little misconfiguration here. The compression just saved me a bunch of storage space, so I’m kinda in btrfs’ corner right now lol
It was fine for me too, right up to the point that it really wasn’t.
I’ve had two production systems fail because btrfs didn’t balance metadata and file space like it says it will. It has some fancy features, but do you need them?
My moment was using the experimental repos to get an early look at Wayland; after seeing it wasn’t quite ready for my system, I just switched back. Mistakes were made, and slowly over the next few weeks as I updated, the experimental packages never got superseded and updated, until my system crashed and would not boot.
Luckily, since it is not Windows, I just used a live USB stick to mount the disk and manually reinstall all the broken system packages. Scary, but it made me feel pretty confident I could recover the system myself in the future. Also learned a pretty important lesson: don’t do that, and look at the upgrade log if you do, lol, cause the whole time as I upgraded there was red text showing me all the system packages that were not getting updated.
After fixing some Windows problems with their version of a live USB (the recovery USB) I really don’t want to do it ever again. It’s harrowing.
This is an interesting read, even if it is a few years old https://arstechnica.com/gadgets/2021/09/examining-btrfs-linuxs-perpetually-half-finished-filesystem/
I gave up on it in 2016, and it sounded all the same back then too, with too many people giving it a pass for unacceptable behavior. I don’t think anything has really changed since.
Interesting, I’ll keep that in mind for if I go for a RAID setup, but for now it’s just my one drive on BTRFS, the other one is ext4.