Dust is a rewrite of du (in Rust, obviously) that visualizes your directory tree and what percentage each file takes up. But it only prints as many files as fit in your terminal height, so you see only the largest files. It’s been a better experience than du, which isn’t always easy to navigate to find big files (or at least I’m not good at it).
Anyway, found a log file at .local/state/nvim/log that was 70gb. I deleted it. Hope it doesn’t bite me. Been pushing around 95% of disk space for a while so this was a huge win 👍
Removed by mod
don’t worry, they’ve just been using neovim for 700 years, it’ll be alright
Removed by mod
So I found out that qbittorrent generates errors in a log whenever it tries to write to a disk that is full…
Every time my disk was full I would clear out some old torrents, then all the pending log entries would write and the disk would be full again. The log was well over 50gb by the time I figured out that I’m an idiot. Hooray for having dedicated machines.
That’s not entirely your fault; that’s pathological on the part of the program.
Removed by mod
If you have ideas please let me know. I’m preparing to hop distros so I’m very tempted to ignore the problem, blame the old distro, and hope it doesn’t happen again :)
Removed by mod
ncdu is the best utility for this type of thing. I use it all the time.
I install ncdu on any machine I set up, because installing it when it’s needed may be tricky
Try dua. It’s like ncdu but uses multiple threads, so it’s a lot faster, especially on SSDs.
deleted by creator
Why does it suck that it’s written in Go?
deleted by creator
So you hate a language just because of who it’s associated with? That’s dumb. Go is an awesome language; I used it at work for 2 years.
Garbage collected languages will never not be at least slightly ick
Edit: I did not expect this to be so controversial, especially in regard to go, but I stand by my statement.
absurd take
deleted by creator
Counterpoint: I’ve never used Go myself, but I love that Go apps usually seem to be statically-linked executables. I have a few utilities (such as runitor) that I ‘deploy’ to my servers by just copying them into /usr/local/bin using Ansible.
Go is awesome, yet a slight pain in the ass in some ways, but you get used to it. I was doing DevOps stuff with it for 3 years. I like it so much more than Python.
Yes, it looks very similar. The author of ncdu is making a new, improved, and faster version in Zig.
Came in expecting a story of tragedy, congrats. 🎉
But did he even look at the log file? They don’t get that big when things are running properly, so it was probably warning him about something. Like “Warning: Whatever you do, don’t delete this file. It contains the protocol conversion you will need to interface with the alien computers to prevent their takeover.”
PTSD from the days long ago when X11 error log would fill up the disk when certain applications were used.
Try ncdu as well. No instructions needed, just run ncdu /path/to/your/directory. If you want to scan without crossing partitions, run with -x.
Removed by mod
I usually use something like du -sh * | sort -hr | less, so you don’t need to install anything on your machine.
Same, but when it’s real bad sort fails 😅 for some reason my root is always hitting 100%
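One possible reason sort dies when root is at 100%: GNU sort spills large inputs to temporary files under $TMPDIR, so a full disk can kill it mid-sort. Its -T flag points the temp files at another filesystem. A sketch — /etc and /var/tmp here are just stand-in paths, and whether this is actually the cause of the failure above is a guess:

```shell
# If sort fails on a full disk, it may be spilling temp files to a full $TMPDIR.
# GNU sort's -T (--temporary-directory) redirects them somewhere with free space.
du -sh /etc/* 2>/dev/null | sort -T /var/tmp -hr | head -n 10
```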
I usually go for du -hx | sort -h and rely on my terminal scroll back.
dust does more than what this script does; it’s a whole new tool. I find dust more human-readable by default.
Maybe, but I need it one time per year or so. It is not a task for which I want to install a separate tool.
Perfect for your use case, not as much for others. People sharing tools, and all the different ways to solve this type of problem is great for everyone.
Almost the same here. Well, du -shc * | sort -h
I admin around three hundred Linux servers and this is one of my most common tasks - although I use -shc as I like the total too, and don’t bother with less as it’s only the biggest files and dirs that I’m interested in, and they show up last, so there’s no need to scroll back.
When managing a lot of servers, the storage requirements of installing extra software are never trivial. (Although our storage does do very clever compression, and it might recognise the duplication of the file even across many VM filesystems, I’m never quite sure that works as advertised on small files.)
I admin around three hundred linux servers
What do you use for management? Ansible? Puppet? Chef? Something else entirely?
Main tool is Uyuni, but we use Ansible and AWX for building new VMs, and ad-hoc Ansible for some changes.
Interesting; I hadn’t heard of Uyuni before. Thanks for the info!
We’d use du -xh --max-depth=1 | sort -hr
Interesting. Do you often deal with dirs on different filesystems?
Yeah, I was a Linux System Admin/Engineer for MLB/Disney+ for 5 years. When I was an admin, one of our tasks was clearing out filled filesystems on hosts that alerted.
Sounds pretty similar to what I do now - but never needed the -x. Guess that might be quicker when you’re nested somewhere with a bunch of nfs/smb stuff mounted in.
We’d do it from root (/) and drill down from there, it was usually /var/lib or /var/logs that was filling up, but occasionally someone would upload a 4.5 GB file to their home folder which has a quota of 5 GB.
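That drill-down pass can be sketched with stock GNU coreutils. The demo tree below is invented so the example is self-contained; in practice you’d start at / and repeat on whichever directory dominates:

```shell
# Demo on a scratch tree; in real use the first command targets / and you
# re-run it on the biggest hit (e.g. /var, then /var/lib, ...).
mkdir -p /tmp/du_demo/big /tmp/du_demo/small
head -c 1048576 /dev/zero > /tmp/du_demo/big/blob    # 1 MiB file
head -c 1024    /dev/zero > /tmp/du_demo/small/blob  # 1 KiB file
# -x stays on one filesystem, --max-depth=1 summarizes one level,
# and sort -h understands the human-readable sizes, biggest last
du -xh --max-depth=1 /tmp/du_demo | sort -h
```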
Using ncdu would have been the best way, but that would require it being installed on about 7 thousand machines.
I’d say head -n25 instead of less since the offending files are probably near the top anyway
Or head instead of less to get the top entries
With sort -h, the biggest ones are generally at the bottom already, which is often what most people care about.
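The two orderings just pair with different ends of the output: -hr puts the biggest first (pairs with head), plain -h puts them last (pairs with tail or scrollback). A quick sketch with made-up sizes:

```shell
# Same fake du output, two orderings:
# -hr: biggest first, so head grabs the top offenders
printf '1.5G\t/var\n12K\t/etc\n300M\t/home\n' | sort -hr | head -n 2
# -h: biggest last, so tail (or just the bottom of your terminal) shows them
printf '1.5G\t/var\n12K\t/etc\n300M\t/home\n' | sort -h | tail -n 2
```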
So like filelight?
I really like ncdu
Yeah I got turned onto ncdu recently and I’ve been installing it on every vm I work on now
A 70gb log file?? Am I misunderstanding something, or wouldn’t that be hundreds of millions of lines?
I’ve definitely had to handle 30gb plain text files before so I am inclined to believe twice as much should be just as possible
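Back-of-the-envelope: assuming an average log line of around 100 bytes (a guess — real line lengths vary a lot), 70 GiB does land in the hundreds of millions of lines:

```shell
# 70 GiB at ~100 bytes per line: roughly 750 million lines
echo $(( 70 * 1024 * 1024 * 1024 / 100 ))   # 751619276
```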
Maybe other tools support this too but one thing I like about xdiskusage is that you can pipe regular du output into it. That means that I can run du on some remote host that doesn’t have anything fancy installed, scp it back to my desktop and analyze it there. I can also pre-process the du output before feeding it into xdiskusage.
I also often work with textual du output directly, just sorting it by size is very often all I need to see.
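That pre-processing step might look something like this — the file name, sizes, and the 1 MiB cutoff are all invented for illustration (du -k prints KiB in the first column):

```shell
# Stand-in for du -k output copied back from a remote host
printf '500\t/srv/a\n2048\t/srv/b\n10240\t/srv/c\n' > du-output.txt
# Pre-process before feeding a viewer: drop entries of 1 MiB or less, sort by size
awk -F'\t' '$1 > 1024' du-output.txt | sort -n
```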
You guys aren’t using du -sh ./{dir1,dir2} | sort -hr | head?
I use gdu and never had any issues like that with it
deleted by creator