Comment by onion2k
7 years ago
> But whoever made that decision, it was on the hardware side, not anyone at Microsoft.
Bill Gates' famous comment isn't really a decision, though. As it's usually cited it's just an opinion: '640KB should be enough for anyone' means "I don't think any programs will need more than that!" If someone at IBM decided that there should be 640KB of RAM available for programs, it's believable that Bill was simply agreeing with them.
The main thing that's annoying about the quote is that it's trotted out as an example of how Bill was wrong, as if being wrong is something terrible that he should be ashamed of decades later. That's nonsense. Being wrong is fine so long as you change your mind when you understand that you are wrong.
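For what it's worth, the 640KB figure does fall straight out of the PC's hardware design. A quick sketch of the arithmetic, assuming the standard IBM PC memory map:

    # Back-of-envelope: the 640KB "conventional memory" figure on the
    # original IBM PC. Assumes the standard PC memory map.

    ADDRESS_LINES = 20                  # the 8088 has a 20-bit address bus
    address_space = 2 ** ADDRESS_LINES  # 1,048,576 bytes = 1MB total
    reserved_kb = 384                   # top 384KB reserved by IBM for video
                                        # RAM, adapter ROMs, and the BIOS
    conventional_kb = address_space // 1024 - reserved_kb

    print(conventional_kb)              # 640

The 384KB reservation was IBM's memory-map decision, which is exactly the "hardware side" point above.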
It wasn't even wrong. There wasn't a need for that much memory on a desktop then. You could already do wonders with 64KB on an 8-bit micro: edit text, run spreadsheets, play games. You would need more for multimedia or web surfing, but that was still in the future.
Exactly! It's like saying today that 64GB of RAM should be enough for just about everyone's personal machine.
In 10 years that might be a hilariously small number, but right now it's far more than the vast majority will ever need.
There is a limit though. By the time you get to the point where you can store more high-definition movies than you can possibly watch in a lifetime you have definitely passed the point of diminishing returns. And with non-volatile storage, we're pretty much already there.
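To put a rough number on "more movies than you can possibly watch in a lifetime" (every input below is an assumed round figure, picked just for illustration):

    # Rough arithmetic for a lifetime of HD movie watching.
    # All inputs are assumptions, not measurements.

    hours_per_day = 4      # generous daily viewing habit
    years = 60             # viewing lifetime
    movie_hours = 2        # typical feature length
    movie_gb = 8           # a plausible HD encode size

    movies = hours_per_day * 365 * years / movie_hours
    total_tb = movies * movie_gb / 1000

    print(f"~{movies:,.0f} movies, ~{total_tb:,.0f} TB")  # ~43,800 movies, ~350 TB

Under those assumptions it's a few hundred terabytes: big, but no longer an unimaginable amount of non-volatile storage.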
> It wasn't even wrong. There wasn't a need for that much memory on a desktop then.
What more RAM does is allow programmers to "be lazy": slap libraries together and use powerful but resource-hungry abstractions rather than hand-tune details.
We have a similar pattern today with JavaScript and CSS layering in web pages, making them slow and bloated.
It's increasing hardware usage in order to reduce greyware (human programmer) usage. Whether that's "smart" or economical makes for an interesting debate. For some reason, consumers appear to prefer cheaper software over cheaper hardware. Software developers who are fastidious about hardware resource usage (RAM and CPU) are not sufficiently rewarded; the group that slaps existing APIs together to get their product out quick and cheap seems to come out ahead.
I remember how MS-Office for DOS shipped on 7 floppy disks. I kept wondering why people tolerated loading 7 floppies. It's because the alternatives of the time were harder to use or cost more.
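As a toy sketch of that "slap it together vs hand-tune" trade-off (Python just for concreteness; nothing in the thread specifies a language):

    # Two correct versions of the same computation. The eager one is the
    # "lazy programmer" style: simple, but it materializes everything in
    # RAM. The streaming one is hand-tuned: same result, constant memory.

    def total_eager(n: int) -> int:
        squares = [i * i for i in range(n)]   # allocates an n-element list
        return sum(squares)

    def total_streaming(n: int) -> int:
        return sum(i * i for i in range(n))   # generator: one value at a time

    assert total_eager(10**6) == total_streaming(10**6)

The streaming version takes a little more thought; the eager one ships faster. Scale that pattern across a whole codebase and you get the bloat being described.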
> I remember how MS-Office for DOS shipped on 7 floppy disks.
You must be remembering something else because there were no DOS versions of Office.
At one of the Microsoft company meetings in the mid-to-late 2000s (I recall it was at Safeco Field), he claimed that when IBM was developing the PC he tried to convince them to use an MC68000 instead of the 8088. He said going with the 8088 set the industry back ten years. Assuming he wasn't making the story up, it's hard to imagine him making that quote, or even agreeing with it.
> He said going with the 8088 set the industry back ten years.
That's what it always felt like as an Amiga user. Not until DOOM was there much I liked on the PC.
Amiga 1000 release: July '85
DOOM release: December '93
I saw an Amiga once in person around '86 and spent the next decade disappointed, but advocating that the future of computing was going to be great. Emotionally I still feel like we haven't caught up to the Amiga, but I imagine that isn't really true, heh.
Similar story with me. I got an Amiga 1000 and did a fair bit of assembly coding on it, then ended up writing some 16-bit x86 assembly for school later on. Being used to having sixteen 32-bit registers, then all of a sudden having to use AX, BX, CX, and DX (and don't forget they all have slightly different purposes!) was like being brutally shoved back into the '80s.
I must have been 12 or 13 years old when I first met an IBM PC. I was used to 8-bit micros having BASIC right away (and colour). But you couldn't do anything with a PC without loading two floppies: first DOS, then whatever you actually wanted to run. It just seemed awful. Why would anyone want one of those?
Here's the thing though. Engineering workstations existed. There's a good argument that the "right" approach was to use an MC68K and, while you were at it, a "real" operating system, whether a Unix or one of the 16-bit operating systems in use on minicomputers at the time. But there's also a good argument that, had you built a more open and mass-market-oriented engineering workstation (whatever that meant exactly) at, say, 2x the price point of an IBM PC (which, remember, didn't even always have a hard disk at the time), you'd not have been competitive with Z80 or 6502 machines.
Even using the 8088 vs. the 8086 was a cost-saving move. A premium IBM PC might well have simply flopped rather than accelerating the industry.
It's not clear to me that at the time the m68k was that much more expensive than x86. It certainly wasn't by '84/'85, when the Atari ST was shipping as a sub-$1000 cheap home computer based around it.
I think the bigger compelling piece for x86 was its continuity with the top-selling 8080/Z80 CP/M machines that were the effective standard at the time. IBM offered both PC-DOS (cheap) and CP/M-86 (expensive), and wasn't sure which was going to win out. And PC-DOS was basically a kind of clone of CP/M, down to the API call names.
It was not (or not just) for cost-saving reasons. It was due to a cross-licensing agreement between Intel and IBM for a technology called bubble memory, which turned out to be a flop, but IBM didn't know at the time that it would.
"Next came the 8088, the processor for the first IBM PC. Even though IBM engineers at the time wanted to use the Motorola 68000 in the PC, the company already had the rights to produce the 8086 line (by trading rights to Intel for its bubble memory)"
https://www.ibm.com/developerworks/library/pa-microhist/inde...
Note that IBM introduced a 68000 machine just over a month before the PC came out, which caused some confusion at that time.
http://www.old-computers.com/museum/computer.asp?st=1&c=623
But 1981 was just too early for non-workstation 68000 machines, since the early chips were slow and expensive. The 1983 Lisa was limited to 5MHz, for example, in contrast to the 1984 Macintosh's 8MHz. I remember the price of the chips dropping from over $100 to less than $20 in just one year back then.
This I can believe. It's reasonably well known that the first PC design was deliberately crippled so as not to impact sales of IBM's dedicated word processor, the Displaywriter.
The 8086 was thought too powerful and would compete against existing IBM products, so the 8088 was chosen. Other changes to the expansion and bus architecture were made on the same basis.
I spent over a decade completely failing to understand how it could succeed against Amiga.
> If someone at IBM decided that there should be 640KB of available RAM for programs it's believable that Bill might have simply been agreeing with them.
Yeah, "640k should be enough for anybody (specifically in the context of a conversation about current hardware of software and not necessarily in perpetuity)"
To follow up on that: it's as if stating something that's perfectly acceptable today, but won't be years from now, is somehow outrageous.
"8-core CPUs or 32GB of RAM are more than enough for gaming" (c) 2018
Facts change and opinions change with them. Wonder if Moore’s law will get the same treatment now.
Maybe that's why Google Chrome was created, to prevent anyone ever again suggesting that any particular amount of RAM should be enough.
No, the point of the 'quote' is to argue that 'even Bill Gates' couldn't predict how far technology requirements would expand.
Of course that's wrong, and most of the time it is used by people who don't understand the trade-offs made in incremental advancements in tech standards.
I'm not even sure he was as wrong as people claim. They used to do a lot with very limited memory. Now, modern computers "need" hundreds of megabytes for a chat application, and slow down for the latest Gmail revamp. The quote serves as a reminder of how well we used to economize on memory.
I don't think the point is that Bill Gates should be ashamed. It's a reminder of how fast technology changes and that the assumptions we make today could be invalidated just as quickly.
He didn't understand he was wrong. He never learned the lesson. Years later, when Microsoft was launching its Zune audio player, he played the fool on stage with a bunch of people, acting as if he'd never heard of the iPod.