Just wanted to share a really cool program for Linux: ncdu.
It's perfect for hunting down and deleting the large files that are eating all your space. It's incredibly fast (it only reads filesystem metadata, so it can cover terabytes' worth of files in seconds), and it's super simple to use.
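If you haven't tried it yet, a typical invocation looks like this (the -x flag keeps the scan on one filesystem so network mounts don't get pulled in; from the ncdu man page as I remember it):

ncdu -x /
# arrow keys to navigate, Enter to descend into a directory, d to delete the selected item, q to quit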
Cool!
I have an alias around the base 'du' command with some sorting and human-readability logic; it's very useful on small hosted boxes with limited space. Lots of npm projects, and SSH'ing in via VS Code, will create large folders without you realizing it.
du ~ -h -d 5 | sort -r --human-numeric-sort | head -n 15
Gives you the 15 largest directories (du only reports directories unless you add -a), up to five levels deep, with human-readable sizes. From there I usually just rm -rf the offender.
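If you want that as a one-word command, here's a sketch of what it could look like in ~/.bashrc (the name 'duh' is just a placeholder, and this assumes GNU du/sort; -a makes du report individual files as well as directories, and 2>/dev/null hides permission errors):

# hypothetical helper; pass a path, or it defaults to $HOME
duh() { du -ah -d 5 "${1:-$HOME}" 2>/dev/null | sort -rh | head -n 15; }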
reply
Yeah, I actually found ncdu while searching for a command like that.
To be honest, ncdu is a game changer. It works equally well on servers and local machines because it's entirely terminal-based, yet it's still very intuitive.
I recovered 500GB with it in a few seconds. It's great for finding temp files, caches, and all the other things that just keep growing.
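One server trick worth knowing: ncdu can export a scan and let you browse it elsewhere. If I'm remembering the man page right, -o- writes the scan to stdout and -f- reads one from stdin, so something like this scans a remote box while you browse locally (user@host is a placeholder):

ssh -C user@host ncdu -o- / | ncdu -f-
# -C compresses the ssh stream; the local ncdu just renders the remote scan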
reply
Thanks, will try it out, since Docker images always seem to take up all the space on my small server.
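For the Docker part specifically, the Docker CLI has built-in disk-usage helpers that pair well with ncdu (standard docker commands, but double-check before pruning since it actually deletes things):

docker system df        # show space used by images, containers, volumes, and build cache
docker image prune -a   # remove images not referenced by any container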
reply