Comment by crazygringo

1 day ago

> The whole point of NumPy is to make things much, much faster than interpreted Python, whether you're GPU-accelerated or not.
>
> Even code you write now, you may need to GPU-accelerate later, as your simulations grow.
>
> Falling back on loops is against the entire reason for using NumPy in the first place.

I really disagree. That's not the only point of NumPy. A lot of people use it like MATLAB: to answer questions with minimal coding time, not minimal runtime.
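For what it's worth, both positions are easy to see in a toy sketch (illustrative only; the function names are mine, and exact speedups vary by machine and array size). The vectorized form is simultaneously the much faster one and the shorter one, so it serves both camps:

```python
import numpy as np

x = np.random.default_rng(0).random(1_000_000)

# Plain interpreted-Python loop: one bytecode dispatch per element.
def sum_squares_loop(a):
    total = 0.0
    for v in a:
        total += v * v
    return total

# Vectorized equivalent: the loop runs in compiled code inside NumPy.
def sum_squares_numpy(a):
    return float(np.dot(a, a))

# Same answer (up to float rounding), wildly different runtimes on large arrays.
assert abs(sum_squares_loop(x) - sum_squares_numpy(x)) < 1e-6 * sum_squares_numpy(x)
```

On a typical laptop the loop version is two to three orders of magnitude slower; timing it with `timeit` will show the gap on your own machine.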

  • I mean, sure: the fact that it's performant means tons of functionality has been built on top of it that's hard to find elsewhere.

    But the point is still that the main purpose in building it was performance. To be accelerated. Even if that's not why you're personally using it.

    I mean, I use my M4 Mac's Spotlight to do simple arithmetic. That's not the main point of building the M4 chip, though.

    • As best I can see, you weren't originally talking about the reason for creating NumPy. Instead you seemed to be talking about the reason for using NumPy.

I used it once purely because I figured out that "turn a sequence of per-tile bitmap data into tiles, then produce the image corresponding to those tiles in order" is equivalent to swapping the two inner dimensions of a four-dimensional array. (And, therefore, so is the opposite operation.) The task was extremely non-performance-critical.
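A sketch of what that looks like (the names and the toy 2×2 sizes are mine, not the original code): with tiles indexed as (grid row, grid column, pixel row, pixel column), assembling the image is a swap of the two inner axes followed by a reshape, and splitting an image back into tiles is the same trick in reverse.

```python
import numpy as np

# Hypothetical example: a 2x2 grid of 2x2 grayscale tiles.
# tiles[r, c] is the bitmap of the tile at grid position (r, c).
rows, cols, th, tw = 2, 2, 2, 2
tiles = np.arange(rows * cols * th * tw).reshape(rows, cols, th, tw)

# Assembling the image: swap the two inner dimensions
# (tile column <-> pixel row), then merge the grid axes.
image = tiles.swapaxes(1, 2).reshape(rows * th, cols * tw)

# The opposite operation: split the image back into tiles.
tiles_again = image.reshape(rows, th, cols, tw).swapaxes(1, 2)
assert np.array_equal(tiles, tiles_again)
```

`swapaxes` returns a view, so neither direction copies pixel data until the final reshape forces a contiguous layout.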

Of course I wasn't happy about bringing in a massive dependency just to simplify a few lines of code. Hopefully one day I'll have a slimmer alternative, perhaps one that isn't particularly concerned with optimization.