Comment by BanterTrouble
9 hours ago
The memory usage is quite large compared to C/C++ when compiling. I use virtual machines for demos on my YouTube channel, and compiling something large in Rust requires 8GB+.
In C/C++ I don't even have to worry about it.
I can't agree; I've had C/C++ builds of well-known open source projects try to use >100GB of memory...
Maybe something else is going on then. I've done builds of some large open source projects, and most of the time they were maxing the cores (I was building with -j32), but memory usage was fine.
Out of interest what were they?
> Out of interest what were they?
The one that immediately comes to mind is cvc5... not super recently though.
I suspect that "tried to" is doing a bit of work here. The fact that it was failing and swapping out probably meant that the more memory-heavy g++ processes were going slower than the memory-light ones, so more of them were running simultaneously than would likely have happened in a normal successful build. Still, this was on a system with 32GB of RAM, so it was using roughly that much before swapping started slowing down the more memory-intensive processes.
I can't say the same. Telling people to use `-j$(nproc)` in lieu of `-j` to avoid the wrath of the OOM killer is a rite of passage.
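For anyone unfamiliar with the distinction, here's a minimal sketch of the two invocations being contrasted, assuming GNU make and coreutils `nproc`:

```sh
# Bare -j puts no limit on parallel jobs, so make may spawn as many
# compiler processes as there are pending targets - easy way to hit the OOM killer
# on a large C++ build.
make -j

# -j$(nproc) caps the job count at the number of available CPUs,
# keeping peak memory roughly proportional to the core count.
make -j"$(nproc)"
```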