Comment by geocar

7 years ago

People where I work have workstations with more than 1TB of ram.

I think the immediate value of a larger pointer would be tags, not memory density, just as it was with 64-bit systems -- my old Alpha only had 256MB of RAM, and maxed out at 512MB IIRC.

> People where I work have workstations with more than 1TB of ram.

Wow. What kinds of workstation tasks need this kind of memory, if I may ask?

  • Analysis; some model building.

    It is popular to buy a bunch of servers (or worse, host them on AWS), plus a bunch of sysadmins, so you can support a couple of smart analysts with some complex Hadoop Java-streaming something-or-other. But it is much cheaper to just buy them beefy workstations and use awk: an HP Z8 G4 with 1.5TB of RAM is under £30k, and it's hard to hire even one sysadmin with any brains for that money, let alone two plus the servers...

  • Genome assembly and computational fluid dynamics are the two that come immediately to mind.

    • Wow. They need that much data in memory at once? I could understand needing a ton of secondary storage for both, but this much primary storage would seem to imply they need random access to terabytes of data, which is a bit hard to swallow!