Comment by myself248

7 years ago

This has always annoyed the piss out of me. It wouldn't have been Bill's or Microsoft's call to make, in the first place. The hardware memory map is not set by software.

The 640K limitation derives from the 1MB address space of the IBM PC, and as the name implies, IBM did the hardware design. They did it around a particular Intel chip, which had a 1MB address space. IBM could've put in hardware support for bank switching (as some EMS/XMS add-in cards later did), they could've used a chip with more than 20 address lines, they could've done a lot of things.
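The arithmetic behind that limit is simple. A small illustrative sketch (in Python, purely back-of-the-envelope; the 640K/384K split is the conventional IBM PC memory map):

```python
# 20 address lines -> 2^20 addressable bytes
address_lines = 20
address_space = 2 ** address_lines          # 1,048,576 bytes = 1 MB

# The IBM PC memory map reserved the top 384K for video memory, BIOS ROM,
# and adapter cards, leaving the bottom 640K ("conventional memory")
# for DOS and application programs.
conventional = 640 * 1024                   # 0x00000 - 0x9FFFF
reserved     = 384 * 1024                   # 0xA0000 - 0xFFFFF

assert conventional + reserved == address_space
print(hex(conventional))                    # 0xa0000 -- where the reserved area starts
```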

But they didn't. IBM wasn't designing a mainframe-killer, they were designing a personal computer. It was competing with 16k and 64k 8-bit machines, and the first IBM PCs shipped with 64k and later 128k of RAM. Using the top 384K for peripherals and allocating 640K for programs must've seemed insanely generous at the time. But whoever made that decision, it was on the hardware side, not anyone at Microsoft.

Bill Gates' famous comment isn't really a decision, though. As it's usually cited, it's just an opinion - '640Kb should be enough for anyone' means "I don't think any programs will need more than that!". If someone at IBM decided that there should be 640Kb of available RAM for programs it's believable that Bill might have simply been agreeing with them.

The main thing that's annoying about the quote is that it's trotted out as an example of how Bill was wrong, as if being wrong is something terrible that he should be ashamed of decades later. That's nonsense. Being wrong is fine so long as you change your mind when you understand that you are wrong.

  • It wasn't even wrong. There wasn't a need for that much memory on a desktop then. You could already do wonders with 64KB on an 8-bit micro: edit text, run spreadsheets, play games. You would need more for multimedia or web surfing, but that was still in the future.

    • Exactly! It's like saying today that 64GB of RAM should be enough for just about everyone in their personal machine.

      In 10 years, that might be a hilariously small number, but for right now, it's way more than the vast majority will need.

      13 replies →

    • Re: It wasn't even wrong. There wasn't a need for that much memory on a desktop then.

      What more RAM does is allow programmers to "be lazy": slap libraries together and use powerful but resource-hungry abstractions rather than hand-tune details.

      We have a similar pattern today with JavaScript and CSS layering in web pages, making them slow and bloated.

      It's increasing hardware usage in order to reduce greyware (human programmer) usage. Whether that is "smart" or economical makes for an interesting debate. It appears consumers prefer cheaper software over cheaper hardware for some reason. Software developers who are fastidious about hardware resource usage (RAM & CPU) are not sufficiently rewarded. The group that slaps existing APIs together to get their product out quick and cheap seems to come out ahead.

      I remember how MS-Office for DOS shipped on 7 floppy disks. I kept wondering why people tolerated loading 7 floppies. It's because the alternatives of the time were harder to use or cost more.

      4 replies →

  • At one of the Microsoft company meetings in the middle to late 2000s (I recall it was at Safeco Field) he claimed that when IBM was developing the PC he tried to convince them to use an MC68000 instead of the 8088. He said going with the 8088 set the industry back ten years. Assuming he wasn't making the story up, it's hard to imagine him making that quote or even agreeing with it.

    • > He said going with the 8088 set the industry back ten years.

      That's what it always felt like as an Amiga user. Not until DOOM was there much on the PC that I liked.

      Amiga 1000 release: July '85
      DOOM release: December '93

      11 replies →

    • Here's the thing though. Engineering workstations existed. There's a good argument that the "right" approach was to use an MC68K and, while you were at it, a "real" operating system whether a Unix or one of the 16-bit operating systems in use on minicomputers at the time. But there's also a good argument that, had you done a more open and mass market-oriented engineering workstation (whatever that meant exactly) at, what?, 2x the price point of an IBM PC--which, remember, didn't even always have a hard disk at the time--you'd not have been competitive with Z80 or 6502 machines.

      Even using the 8088 vs. the 8086 was a cost-saving move. A premium IBM PC might well have simply flopped rather than accelerating the industry.

      6 replies →

    • Note that IBM introduced a 68000 machine just over a month before the PC came out, which caused some confusion at that time.

      http://www.old-computers.com/museum/computer.asp?st=1&c=623

      But 1981 was just too early for non-workstation 68000 machines, since the early chips were slow and expensive. The 1983 Lisa ran its 68000 at only 5MHz, for example, in contrast to the 1984 Macintosh's 8MHz. I remember the price of the chips dropping from over $100 to less than $20 in just one year back then.

    • This I can believe. It's reasonably well known that the first PC design was deliberately crippled in order not to impact on sales of the dedicated word processor, the IBM Displaywriter.

      The 8086 was thought too powerful and would compete against existing IBM products, so the 8088 was chosen. Other changes to expansion and bus architecture were on the same basis.

      I spent over a decade completely failing to understand how it could succeed against Amiga.

  • >If someone at IBM decided that there should be 640Kb of available RAM for programs it's believable that Bill might have simply been agreeing with them.

    Yeah, "640k should be enough for anybody (specifically in the context of a conversation about current hardware of software and not necessarily in perpetuity)"

  • To follow up on that: as if stating anything that’s perfectly acceptable today but not years from now is somehow outrageous.

    “8 core CPUs or 32GB of RAM are more than enough for gaming” (c) 2018

    Facts change and opinions change with them. Wonder if Moore’s law will get the same treatment now.

    • Maybe that's why Google Chrome was created, to prevent anyone ever again suggesting that any particular amount of RAM should be enough.

  • No, the point of the 'quote' is people trying to make the point that 'even Bill Gates' couldn't predict that technology requirements would keep expanding.

    Of course that's wrong, and most of the time it is used by people who don't understand the trade offs made in incremental advancements in tech standards.

  • I'm not even sure he was as wrong as people claim. They used to do a lot with very limited memory. Now, modern computers "need" hundreds of megabytes for a chat application, and slow down for the latest Gmail revamp. The quote serves as a reminder of how well we used to economize on memory.

  • I don't think the point is that Bill Gates should be ashamed. It's a reminder how fast technology changes and that the assumptions we make today could be invalidated just as quickly.

  • He didn't understand he was wrong. He never learned the lesson. Years later, when Microsoft was launching the Zune audio player, he played the fool on stage with a bunch of people, acting as if he had never heard of the iPod.

And the first PC design was deliberately spec-reduced in order not to impact on sales of the dedicated word processor, the IBM Displaywriter.

The 8086 was thought too powerful and would compete against existing IBM products, so the 8088 was chosen. Other changes to expansion and bus architecture were on the same basis.

The PC was not supposed to be the benchmark design on which the entire future of computing was built. If you were looking for one of those better off starting with an Amiga. :)

I was doing OS coding in those days - and writing both drivers and apps on top of Windows. People forget what was happening back then.

Hardware had removed the 640K memory limit - but Windows stuck with it for years afterward. Not for any technical reason, but because they dominated the market, and neither needed to change nor wanted to change - no matter how hard that made things for developers.

I worked with Microsoft guys, and they were very blunt about not wanting to change their working OS code if they didn't need to - so Bill held back the entire industry for years.

  • MS has always been famously conservative about making sure old software still functions on newer iterations. They launched a completely new OS (NT) just so they wouldn't have to kill true DOS compatibility. They put special-case code in their OS to recognize specific applications and ensure they kept working.

    It's easy to say they held back "the industry" for years, if we define the industry as chip design, but I think it's just as easy to make a case that they greatly expanded the industry over this time by giving people a platform and OS combo that could continue to run their software from a few years back even after upgrading every component. In my opinion, the consumer confidence this created likely outweighed the drawbacks, as it allowed economies of scale to really be reached and spurred the whole industry forward through massive demand.

    • What we are arguing about is whether there were any options other than those chosen by Apple (abandon developers with Motorola 68K) and Microsoft (don't support anything other than segmented architecture). Since I worked with the Microsoft guys, Mach engineers (since I was one), and did OS development, I would say that there were definitely other choices for Microsoft. We added backward compatibility via Mach, which was picked up by Microsoft. But the Intel guys (who I also worked with) had provided newer generation chips to Microsoft years before - and they allowed development with either segmented architecture or with a linear virtual address space. They described conversations with Bill where Bill had refused to make any changes since it wouldn't increase his revenue.

      4 replies →

    • > platform and OS combo that could continue to run their software from a few years back

      So, here's a thought for ya: MS Excel from back then won't run anymore, 16-bit apps aren't recognized by modern Windows, etc.

      But a .XLS created back then sure will. And lots of business logic is codified in Excel spreadsheets.

      I posit that .XLS is a more stable platform than Windows itself.

      2 replies →

That's right. My first PC had 512K (and a NEC V40 CPU, 80188 compatible). The first time I met someone at school owning a PC with 640K I found it really weird, like it was an unusual number, coming as I did from an 8-bit ZX Spectrum with 128K of RAM.

To your point:

Entire businesses have been run on systems that have 512k of RAM.

The IBM Series 1 had up to 128k: it was designed by Don Estridge, more famously known as the father of the PC...

It was a software limit, inasmuch as the software insisted on running in real mode even though by 1989 you had a modern 32-bit CPU with essentially all the architectural features that Windows relies on to this day.

> But whoever made that decision, it was on the hardware side, not anyone at Microsoft.

The hardware supported more than 1MB, as later CPUs proved. However, MS-DOS didn't really support it, as EMS/XMS proved:

https://www.filfre.net/2017/04/the-640-k-barrier/

I'd say that you've got it backwards, in my opinion. And in the opinion of that fine article I linked.
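To make the EMS idea concrete, here's a hypothetical toy model of EMS-style bank switching (the class and method names are invented, not the real LIM EMS / INT 67h API): a small window of fixed-size pages in the address space, with page mappings swapped to reach a much larger pool.

```python
PAGE_SIZE = 16 * 1024   # EMS pages were 16K
FRAME_SLOTS = 4         # the page frame was a 64K window in upper memory

class ExpandedMemory:
    """Toy model of EMS-style bank switching (illustrative only)."""
    def __init__(self, total_pages):
        self.pages = [bytearray(PAGE_SIZE) for _ in range(total_pages)]
        self.frame = [0, 1, 2, 3]   # which pages are currently mapped in

    def map_page(self, slot, page):
        # Roughly the role of the real "map handle page" EMS call
        self.frame[slot] = page

    def read(self, offset):
        slot, off = divmod(offset, PAGE_SIZE)
        return self.pages[self.frame[slot]][off]

    def write(self, offset, value):
        slot, off = divmod(offset, PAGE_SIZE)
        self.pages[self.frame[slot]][off] = value

em = ExpandedMemory(total_pages=512)   # 8 MB of expanded memory
em.write(0, 42)          # lands in page 0
em.map_page(0, 100)      # bank-switch a different page into slot 0
em.map_page(0, 0)        # ...and back
assert em.read(0) == 42  # the data survived, outside the 64K window
```

The program only ever addresses the 64K window, which is why real-mode DOS software could use it without any CPU support for more than 1 MB.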

  • "The hardware" did not support more than 1 MB. The 8086/8088 had a 20-bit address space (reflected both in the segmentation model and the address bus), so that limits it to 1 MB.

    • It was actually 1MB+64kB, due to how real mode segmenting works. The infamous A20 line could be set up to alias that top 64kB to the first 64kB.

      The 80286 supported up to 16 MB of RAM, but it had to be switched into protected mode to be able to use it. The problem was that there was no way to switch back except by a reset, which may be why MS-DOS never supported this mode.

      With 80386, it was possible to run in protected mode AND run real mode binaries in VM86 mode, which is exactly what Windows 3.x used.

      2 replies →

    • Read the article... Yes, obviously those CPUs didn't support it. But later CPUs and PCs did, and yet the software (MS-DOS) still didn't really support it. They had to use some gross hacks to access more memory.

      That excuse is valid for 8086, but it's no longer valid for the 386.

      4 replies →
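The segment:offset arithmetic under discussion in this subthread can be sketched as follows (illustrative Python, not production code):

```python
def phys(segment, offset):
    # 8086 real mode: physical address = segment * 16 + offset
    return (segment << 4) + offset

assert phys(0x0000, 0x0000) == 0x00000
assert phys(0xA000, 0x0000) == 0xA0000      # start of the reserved 384K

# FFFF:FFFF arithmetically reaches just past 1 MB:
top = phys(0xFFFF, 0xFFFF)                  # 0x10FFEF = 1 MB + 64K - 16
assert top == 0x10FFEF

# On an 8086, with only 20 address lines, such addresses wrap around to the
# bottom of memory; on later CPUs with the A20 line enabled they address
# real memory just above 1 MB (the HMA).
wrapped = top & 0xFFFFF                     # what the 8086 actually addressed
assert wrapped == 0x0FFEF
```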