Comment by jmyeet
3 years ago
So this particular problem is severely time-constrained. Performance doesn't really matter. In situations like that you tend to use the highest-level language you can, so this isn't surprising.
The author argues it's similar for startups. I agree up to a point. For GC in particular, I think the costs are underestimated: processes blowing up because of memory leaks (which can still happen under GC), stop-the-world ("STW") pauses, and inconsistent performance are all real problems.
So much of this is situational. For example, I think Objective-C/Swift was so successful on iOS (vs Java on Android) in part because it opts for reference counting instead of GC. That gives predictable performance and is less complicated. But RC isn't necessarily appropriate for a server, as you may create hot spots and degrade performance that way.
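To make the "predictable performance" point concrete, here's a minimal Rust sketch (not from the comment, just an illustration) of what reference counting buys you: the object is freed at the exact instant the last reference goes away, rather than at some later GC pause.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Track deallocations so we can observe *when* they happen.
thread_local! {
    static FREED: RefCell<u32> = RefCell::new(0);
}

struct Buffer(Vec<u8>);

impl Drop for Buffer {
    fn drop(&mut self) {
        // With RC, this runs deterministically when the count hits zero.
        FREED.with(|f| *f.borrow_mut() += 1);
    }
}

fn main() {
    let a = Rc::new(Buffer(vec![0; 1024]));
    let b = Rc::clone(&a); // count = 2, no data copied
    assert_eq!(Rc::strong_count(&a), 2);

    drop(b); // count = 1, buffer still alive
    assert_eq!(Rc::strong_count(&a), 1);
    assert_eq!(FREED.with(|f| *f.borrow()), 0);

    drop(a); // count = 0 -> Drop runs right here, not at a GC pause
    assert_eq!(FREED.with(|f| *f.borrow()), 1);
    println!("buffer freed deterministically");
}
```

The server-side "hot spot" concern is also visible here: every `Rc::clone`/`drop` touches the same shared counter, which is exactly the kind of contention that hurts under heavy concurrent load (with `Arc`, those become atomic operations on a shared cache line).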
Rust, by virtue of borrow checking, isn't a panacea. It does, however, greatly reduce the chances of a whole class of really important bugs (i.e. memory-safety violations and all that entails, like buffer overruns). Comparing it to Python, Go or Java doesn't necessarily make sense; Rust lives in the same domain as C/C++.
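A small sketch of the bug class the borrow checker rules out (my illustration, not from the comment): a dangling reference, i.e. a use-after-free. The first version is shown as comments because it is rejected at compile time; the second is the shape the compiler forces you into.

```rust
fn main() {
    // Rejected by the borrow checker -- the reference would outlive the data:
    //
    //     let dangling: &String;
    //     {
    //         let s = String::from("hello");
    //         dangling = &s; // error[E0597]: `s` does not live long enough
    //     }
    //     println!("{}", dangling); // would be a use-after-free in C/C++
    //
    // The version that compiles: the reference cannot outlive `s`,
    // so reading through it is always safe.
    let s = String::from("hello");
    let r = &s;
    assert_eq!(r.len(), 5);
    println!("{}", r);
}
```

The point isn't that this tiny case is hard to spot by hand; it's that the same lifetime check scales to references threaded through structs, iterators, and threads, where C/C++ reviewers routinely miss the equivalent bug.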
One funny side note:
> Google used 15.5 terawatt hours of electricity in 2020
Bitcoin used 200 TWh in 2022 [1]. Think about that.
[1]: https://www.cnet.com/personal-finance/crypto/bitcoin-mining-...