Comment by WorldMaker
2 years ago
The author's final suggested solution at the bottom of the article still relies on logarithms.
> doing it with logarithms will be slower and more complicated for a computer
This is a fascinating point of view, and while it isn't wrong from certain "low-level optimization golf" viewpoints, it is in part based on old, wrong assumptions from early chipsets that haven't been true in decades. Most FPUs in modern computers will do basic logarithms in nearly as many cycles as any other floating point math. It is marvelous technology. That many languages wrap these CPU features in what look like library function calls, like Math.log(), instead of having some sort of "log operator" is as much a historical accident of mathematical notation as a reflection of the fact that logarithms were extremely slow for a human.
Logarithms used to be the domain of lookup books (you might have one or more volumes, if not a shelf-full) and were one of the keys to the existence of slide rules, and why an engineer would actually have a set of slide rules in different logarithmic bases. Mathematicians would spend lifetimes doing the complex calculations to fill a lookup book of logarithmic data.
Today's computers excel at it. Early CPU designs saved transistors and made logarithms a domain of application/language design. Some of the most famous game designs did interesting hacks of pre-computing logarithm tables for a specific set of needs and embedding them in ROM, in useful memory-versus-CPU-time trade-offs. Today's CPU designs have plenty of transistors, and logarithm support in hardware is just about guaranteed. (And that's just CPU designs; GPU designs can be logarithmic monsters in how many logarithms they can compute and how fast they compute them.)
Yesterday's mathematicians envy the speed at which a modern computer can calculate logarithms.
In 2023, if you are trying to optimize an algorithm away from logarithms to some other mix of arithmetic, you are either writing retro games for a classic chipset like the MOS 6502, stuck by your bosses in a history-challenged backwards language such as COBOL, or massively prematurely optimizing what the CPU can already optimize better for you. I wish that were something any competent programmer would know, or at least suspect. It's 2023; it's okay to learn to use logarithms like a mathematician, because you aren't going to need that "optimization" of bit shifts and addition/subtraction/multiplication/division that obscures your actual high-level algorithmic need and complexity.
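The kind of bit-trick "optimization" being argued against can be sketched in JavaScript (the function names below are illustrative assumptions, not from the original article):

```javascript
// The "clever" bit-trick version: floor(log2(n)) for a 32-bit integer n > 0,
// computed as 31 minus the count of leading zero bits.
function ilog2BitTrick(n) {
  return 31 - Math.clz32(n);
}

// The readable version that states the actual high-level intent directly.
function ilog2Math(n) {
  return Math.floor(Math.log2(n));
}

// Both agree for 32-bit positive integers; the second makes the
// algorithmic need (a base-2 logarithm) obvious at a glance.
for (const n of [1, 2, 3, 255, 256, 1 << 20]) {
  console.assert(ilog2BitTrick(n) === ilog2Math(n), `mismatch at ${n}`);
}
```

On a modern CPU the readable version is plenty fast; the bit-trick form mostly buys obscurity, which is the comment's point.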