I came to write exactly this comment.
The thing runs instantly. And that's in a VM in Javascript.
Office really chugged on the PCs of the time, though. We can debate whether modern Excel actually delivers enough extra value over historical Excel to justify being as resource-hungry, and thus as slow to load, as it is. But historical Excel appears fast on modern hardware, even in emulation, because CPUs, RAM, and permanent storage have had 30 years to evolve since it was released. Contemporary 386s and 486s would not have been that snappy.
I beg to differ. I'm 55.
Let's go back to, say, around 1994/5. I'd just got a job as the first dedicated IT bod for a pie factory near Plymouth (Devon, not MA)! Win 3.11 was pretty much everywhere and was almost reliable - patching wasn't really a thing then in the MS world. By then the Pentium (586) was a thing, but the majority of machines were 80486s; 80386s were still useful. There were also the 386/486 SX/DX and DX2 variants, Cyrix chips, and so on.
The planning spreadsheets were a series of Lotus 1-2-3 jobbies with a lot of manual copying and pasting, and I gradually ported them to an Excel VBA job. To cut a long story short, I was running Win 3.11 and Excel on a Pentium 75 with 16MB RAM and an IDE HDD. Excel was way quicker to start than on a modern PC running Win 11 with an SSD.
Yes, a lot of things took a while, but I ended up with a finite capacity plan in VBA for an entire factory that took less than five minutes per run. That covered meat and dough prep, make, bake, wrap, and dispatch for 150-odd finished product lines. It generated a labour plan as well and ran entirely from a forecast (which it also produced). Pasties, sossie rolls etc are generally made to forecast - they take a while to get through the plant and have to be delivered into depot with enough code (shelf life) for the customer (store) to be able to sell them and for the consumer not to be given a dose of the trots. As reality kicked in, you input the actual orders etc and it refined the plan.
OK, not the best tool for the job, but I hope it shows that a spreadsheet back in the day was more than capable of doing useful things. I've just fired up LO Calc on my laptop with an SSD and it took longer than I remember old-school Excel taking to start up, or perhaps about the same.
The world runs on Excel. It is the largest development environment by far, and no fancy language/framework can come close to touching it. The reason is that it acts as the glue to get real-life things done everywhere from large governments, militaries, and corporations all the way down to the small bed and breakfast operation, across the entire world. Normal people have got real processes built by just twiddling around in Excel.
Yep, and that was Windows, which introduced levels of latency and waiting times much worse than equivalent DOS software, but with easier-to-use, more intuitive menus instead of the usual DOS UI routine of either no menus, or menus summoned with key combos. Power users knew many key combinations that weren't strictly necessary but both accelerated things and impressed newbies into thinking computers were too hard for them.
On a 486, Lotus 1-2-3 was essentially instant - even run from floppy disks it would start faster than Excel does today on a top-of-the-line machine.
I did earn some bread with VBA as well, and I always advocate for efficiency, but I just opened a 12MB xlsx file in LO, and it took a couple of seconds on a 2024 ThinkPad.
As far as I remember, my Win 3.11 machine (a 486 DX with 4MB RAM and a 30MB HDD) wouldn't have been able to store or open such a file, let alone recognize the extension. Also, it would have called the file 2026022~.XL~ or something. And it took more than a couple of seconds to load Office programs, for sure. It would take well over a minute to load a workbook from a 1.44MB floppy.
Anyway, software and computers have come a long way and I'm grateful for it.
This is just not true. The only chugging back then was reading from disks, and the entire Office suite was only a handful of 3.5" floppies. If you had already started Excel earlier, then it was likely still cached in RAM and would start nearly instantly. If not, then it was still only a few seconds.
Now, what was slow was actual computation. Try recalculating a big spreadsheet in Excel or counting words in a big Word document on that hardware: it took a very long time, while on modern hardware it's nearly instant.
This does not match my memory of using Windows 3.1. Excel would likely not have been cached in RAM from a previous run because a typical Windows 3.1 machine only had 4 megabytes of RAM.
This isn't true at all, in my experience. Contemporary 386s were kind of slow, I guess, but these programs did not chug on a 486. I spent tons and tons of time in Excel and Access and writing VB/VBScript/macros.
They haven't released the old Excels as open source, right?
I wonder if it's feasible to reverse-engineer the old version using LLMs, vibe-code it to run on modern platforms, and then shoehorn in support for the modern XLSX format. At the rate LLMs are improving, I hope someone will eventually take up this challenge!
> I wonder if it's feasible to reverse-engineer the old version using LLMs, vibe-code it to run on modern platforms, and then shoehorn in support for the modern XLSX format.
Oh no, it isn't. Photoshop PSD and the legacy Office file formats have one thing in common: they are raw dumps of the C in-memory structs representing the contents. That's how they save and load so fast [1], in contrast to the modern formats, which are a bunch of XMLs in a ZIP in a trenchcoat. Unfortunately, that makes them a challenge not just to reverse-engineer but also to reimplement, because you have to recreate Microsoft's original engines piece by piece, quirk by quirk.
And that's before wading into the mess that is OLE or - yes, older readers will shudder - ActiveX. Or the wonders that VBA macros could achieve, including calling functions directly from kernel32.dll. I'm reasonably sure you could import the DirectX DLLs into an Office VBA macro and implement a full-blown 3D shooter engine rendered with DirectX instead of Excel cells.
And that's also why conversion in either direction almost always carries loss potential. Simply put, not every quirk of the legacy format has been carried over to the "new" XML storage format, and certainly not into OpenOffice XML.
[1] https://www.joelonsoftware.com/2008/02/19/why-are-the-micros...
I mean, if people are reverse-engineering entire N64 games back into source code that can target the original SGI compilers, then it's possible to reverse this other code too. I don't think there's the drive to do so, though. That's where I hope some future LLM could help lower the barrier for people already well experienced in reversing.
> And that's also why conversion in either direction almost always carries loss potential. Simply put, not every quirk of the legacy format has been carried over to the "new" XML storage format, and certainly not into OpenOffice XML.
Can modern Office reliably open the old formats? If so, they must have implemented the parsers correctly, no?