Comment by PaulHoule

1 year ago

Knuth's Sorting and Searching book talks a lot about that sort of algorithm.

The old IBM 360 had a 16MB address space at best, but in 1968 there were very few installations anywhere near filling it. This kind of tape

https://en.wikipedia.org/wiki/9-track_tape

could have a capacity of 80 MB or so, which was (1) much larger than RAM, and (2) mainly sequential (it could fast-forward a bit and try to find a particular block, but that was pretty slow). Contrast this with the floppy drives in the 70–140 KB range that the 8-bit micros had. Thus in the early days there was a lot of literature on external memory algorithms that would work on tapes -- and similar methods are still of interest today for "big data": RAM is much faster when you access it sequentially, you want to minimize round trips in distributed systems, etc.
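The classic tape-era technique is external merge sort: sort memory-sized runs, spill each run to sequential storage, then do a k-way merge that reads every run strictly front to back. Here's a minimal sketch in Python (the `run_size` parameter stands in for "how much fits in RAM"; temp files stand in for tapes -- names are mine, not from any particular source):

```python
import heapq
import os
import tempfile


def _write_run(run):
    """Write one sorted run to a temp file, one value per line ('a tape')."""
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "w") as f:
        for v in run:
            f.write(f"{v}\n")
    return path


def external_sort(values, run_size):
    """Yield values in sorted order without ever holding more than
    run_size items in memory at once."""
    # Phase 1: build sorted runs that each fit in "RAM".
    run_paths = []
    run = []
    for v in values:
        run.append(v)
        if len(run) == run_size:
            run_paths.append(_write_run(sorted(run)))
            run = []
    if run:
        run_paths.append(_write_run(sorted(run)))

    # Phase 2: k-way merge -- each run file is read purely sequentially,
    # which is exactly the access pattern tapes were good at.
    files = [open(p) for p in run_paths]
    try:
        streams = [(int(line) for line in f) for f in files]
        yield from heapq.merge(*streams)
    finally:
        for f in files:
            f.close()
        for p in run_paths:
            os.remove(p)
```

The real tape algorithms in Knuth (polyphase merge, cascade merge) are cleverer about how runs are distributed across a fixed number of drives, but the two-phase run/merge structure is the same.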

(It was funny, though, when I talked to computer scientists around 2004 and was told to forget about external memory algorithms because 'main memory' was the thing; people realized, for instance, that Google Maps could store all the tiles in RAM in half a rack of 1U servers, and if utilization was high that was much cheaper than serving the tiles off disk. Hadoop came along as a blast from the past, and was itself obsolete in just a few years.)