Dust is a rewrite of du (in Rust, obviously) that visualizes your directory tree and what percentage each file takes up. It only prints as many files as fit in your terminal height, so you see just the largest ones. It’s been a better experience than du, which isn’t always easy to navigate when hunting for big files (or at least I’m not good at it).

Anyway, I found a log file at .local/state/nvim/log that was 70 GB. I deleted it. Hope it doesn’t bite me. I’d been pushing around 95% of disk space for a while, so this was a huge win 👍
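
dust isn’t installed everywhere, though, and the same “show me only the biggest stuff” view can be approximated with plain coreutils. A minimal sketch (the 20 is just a stand-in for dust’s terminal-height limit):

```shell
# List every file and directory under the current dir, sort the
# human-readable sizes descending, and keep only the top 20 --
# roughly the view dust gives by default.
du -ah . 2>/dev/null | sort -rh | head -n 20
```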

    • Aatube · 54 points · 1 year ago

      don’t worry, they’ve just been using neovim for 700 years, it’ll be alright

    • Nik282000 · 28 points · 1 year ago

      So I found out that qbittorrent generates errors in a log whenever it tries to write to a disk that is full…

      Every time my disk was full I would clear out some old torrents, then all the pending log entries would write and the disk would be full again. The log was well over 50 GB by the time I figured out that I’m an idiot. Hooray for having dedicated machines.

    • @[email protected] (OP) · 1 point · 1 year ago

      If you have ideas please let me know. I’m preparing to hop distros so I’m very tempted to ignore the problem, blame the old distro, and hope it doesn’t happen again :)

      • @pete_the_cat · 11 points · 1 year ago

        Why does it suck that it’s written in Go?

          • @pete_the_cat · 0 points · 1 year ago

            So you hate a language just because of who it’s associated with? That’s dumb. Go is an awesome language; I used it at work for 2 years.

        • ferret · -9 points · 1 year ago (edited)

          Garbage collected languages will never not be at least slightly ick

          Edit: I did not expect this to be so controversial, especially in regard to go, but I stand by my statement.

          • @[email protected] · 2 points · 1 year ago

            Counterpoint: I’ve never used Go myself, but I love that Go apps usually seem to be statically-linked executables. I have a few utilities (such as runitor) that I ‘deploy’ to my servers by just copying them into /usr/local/bin using Ansible.

            • @pete_the_cat · 1 point · 1 year ago

              Go is awesome, if a slight pain in the ass in some ways, but you get used to it. I was doing DevOps stuff with it for 3 years. I like it so much more than Python.

  • @KazuyaDarklight · 55 points · 1 year ago

    Came in expecting a story of tragedy, congrats. 🎉

    • @[email protected] · 2 points · 1 year ago

      But did he even look at the log file? They don’t get that big when things are running properly, so it was probably warning him about something. Like “Warning: Whatever you do, don’t delete this file. It contains the protocol conversion you will need to interface with the alien computers to prevent their takeover.”

      • @[email protected] · 8 points · 1 year ago

        PTSD from the days long ago when the X11 error log would fill up the disk when certain applications were used.

  • Yote.zip · 54 points · 1 year ago

    Try ncdu as well. No instructions needed, just run ncdu /path/to/your/directory.

    • @NorthWestWind · 11 points · 1 year ago

      If you want to scan without crossing partitions, run with -x

  • @[email protected] · 38 points · 1 year ago

    I usually use something like du -sh * | sort -hr | less, so you don’t need to install anything on your machine.

    • @mvirts · 8 points · 1 year ago

      Same, but when it’s real bad, sort fails 😅 For some reason my root is always hitting 100%.

      I usually go for du -hx | sort -h and rely on my terminal scrollback.
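
      One likely reason sort dies there: it spills temporary files to $TMPDIR when the input doesn’t fit in memory, so a 100%-full root kills it mid-pipeline. GNU sort’s -T points the spill files somewhere with room (the path below is just an example; /tmp is often a separate tmpfs):

      ```shell
      # Keep sort's spill files off the full filesystem; tail shows the
      # biggest entries last, matching the scrollback habit above.
      mkdir -p /tmp/sort-spill
      du -hx /var/log 2>/dev/null | sort -T /tmp/sort-spill -h | tail -n 20
      ```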

    • @[email protected] · 5 points · 1 year ago

      dust does more than this one-liner does; it’s a whole different tool. I find dust more human-readable by default.

      • @[email protected] · 2 points · 1 year ago

        Maybe, but I only need this once a year or so. It’s not a task for which I want to install a separate tool.

        • @[email protected] · 0 points · 1 year ago

          Perfect for your use case, just not as much for others. Having people share tools, and seeing all the different ways to solve this type of problem, is great for everyone.

    • DigitalDilemma · 3 points · 1 year ago (edited)

      Almost the same here. Well, du -shc * | sort -h

      I admin around three hundred Linux servers and this is one of my most common tasks. I use -shc as I like the total too, and don’t bother with less as it’s only the biggest files and dirs that I’m interested in, and the ascending sort puts them last, so no need to scroll back.

      When managing a lot of servers, the storage requirements when installing extra software is never trivial. (Although our storage does do very clever compression and it might recognise the duplication of the file even across many vm filesystems, I’m never quite sure that works as advertised on small files)
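
      For anyone who hasn’t used -c: it appends a grand-total line after the per-entry sizes. A quick sketch (the path is just an example):

      ```shell
      # -c adds a "total" line as the final entry of du's output,
      # handy when you want the sum as well as the breakdown.
      du -shc /var/log/* 2>/dev/null | sort -h
      ```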

      • @[email protected] · 2 points · 1 year ago

        I admin around three hundred linux servers

        What do you use for management? Ansible? Puppet? Chef? Something else entirely?

        • DigitalDilemma · 2 points · 1 year ago

          Our main tool is Uyuni, but we use Ansible and AWX for building new VMs, plus ad-hoc Ansible for some changes.

      • @pete_the_cat · 2 points · 1 year ago

        We’d use du -xh --max-depth=1 | sort -hr

        • DigitalDilemma · 1 point · 1 year ago

          du -xh --max-depth=1|sort -hr

          Interesting. Do you often deal with dirs on different filesystems?

          • @pete_the_cat · 1 point · 1 year ago

            Yeah, I was a Linux System Admin/Engineer for MLB/Disney+ for 5 years. When I was an admin, one of our tasks was clearing out filled filesystems on hosts that alerted.

            • DigitalDilemma · 1 point · 1 year ago

              Sounds pretty similar to what I do now, but I’ve never needed the -x. Guess that might be quicker when you’re nested somewhere with a bunch of NFS/SMB stuff mounted.

              • @pete_the_cat · 2 points · 1 year ago

                We’d do it from root (/) and drill down from there. It was usually /var/lib or /var/log that was filling up, but occasionally someone would upload a 4.5 GB file to their home folder, which had a quota of 5 GB.

                Using ncdu would have been the best way, but that would require it being installed on about 7 thousand machines.
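
                The drill-down loop described above, sketched with plain du (the directory names are just the usual suspects):

                ```shell
                # One level at a time: find the hungriest directory, then
                # repeat inside it. -x stays on one filesystem.
                du -xh --max-depth=1 /var 2>/dev/null | sort -hr | head
                du -xh --max-depth=1 /var/lib 2>/dev/null | sort -hr | head
                ```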

    • aname · 2 points · 1 year ago

      Or head instead of less to get the top entries

      • DigitalDilemma · 1 point · 1 year ago

        With an ascending sort -h, the biggest ones end up at the bottom already, which is often what most people care about.

    • KNova · 3 points · 1 year ago

      Yeah, I got turned onto ncdu recently and I’ve been installing it on every VM I work on now.

  • @[email protected] · 9 points · 1 year ago

    A 70 GB log file?? Am I misunderstanding something, or wouldn’t that be hundreds of millions of lines?

    • @[email protected] · 7 points · 1 year ago

      I’ve definitely had to handle 30 GB plain-text files before, so I’m inclined to believe twice that is just as possible.
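
      Back-of-envelope on the line count, assuming an average of ~100 bytes per log line (pure assumption):

      ```shell
      # 70 GiB at ~100 bytes per line is on the order of 750 million
      # lines, so "hundreds of millions" checks out.
      echo $((70 * 1024 * 1024 * 1024 / 100))   # prints 751619276
      ```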

  • @donio · 7 points · 1 year ago (edited)

    Maybe other tools support this too but one thing I like about xdiskusage is that you can pipe regular du output into it. That means that I can run du on some remote host that doesn’t have anything fancy installed, scp it back to my desktop and analyze it there. I can also pre-process the du output before feeding it into xdiskusage.

    I also often work with textual du output directly, just sorting it by size is very often all I need to see.
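
    That remote workflow needs nothing but du, ssh/scp, and sort; the hostname and filenames below are illustrative:

    ```shell
    # On the remote host this would be: ssh somehost 'du -k /var' > somehost.du
    # Locally, the raw "size<TAB>path" output slices easily:
    du -k . > snapshot.du
    sort -rn snapshot.du | head -n 20   # biggest directories first
    rm snapshot.du
    ```

    The same snapshot.du file can be fed straight into xdiskusage, as described above.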