Comment by bitwize
12 hours ago
Office really chugged on the PCs of the time, though. We can debate whether modern Excel actually delivers enough more value than historical Excel to justify being as resource-hungry, and thus slower to load, as it is. But historical Excel appears fast on modern hardware, even in emulation, because CPUs, RAM, and permanent storage have had 30 years to evolve since it was released. Contemporary 386s and 486s would not have been that snappy.
I beg to differ. I'm 55.
Let's go back to, say, around 1994/5. I'd just got a job as the first dedicated IT bod for a pie factory near Plymouth (Devon, not MA)! Win 3.11 was pretty much everywhere and was almost reliable; patching wasn't really a thing then in the MS world. By then the Pentium (586) was a thing, but the majority of machines were 80486s, and 80386s were still useful. There were also the 386/486 SX/DX and DX2 variants, Cyrix chips, and so on.
The planning spreadsheets were a series of Lotus 1-2-3 jobbies with a lot of manual copying and pasting, and I gradually ported them to an Excel VBA job. To cut a long story short, I was running Win 3.11 and Excel on a Pentium 75 with 16MB RAM and an IDE HDD. Excel was way quicker to start than on a modern PC running Win 11 with an SSD.
Yes, a lot of things took a while, but I ended up with a finite capacity plan in VBA for an entire factory that took less than five minutes per run. That covered meat and dough prep, make, bake, and wrap and dispatch for 150-odd finished product lines. It generated a labour plan as well and ran entirely to forecast (which it also produced). Pasties, sossie rolls, etc. are generally made to forecast: they take a while to get through the plant and have to be delivered into depot with enough code (shelf life) for the customer (store) to be able to sell them and for the consumer not to be given a dose of the trots. As reality kicked in, you input the actual orders etc. and it refined the plan.
OK, not the best tool for the job, but I hope this shows that a spreadsheet back in the day was more than capable of doing useful things. I've just fired up LO Calc on my laptop with an SSD, and it took longer than I remember old-school Excel taking to start up, or perhaps about the same time.
The world runs on Excel. It is the largest development environment by far, and no fancy language/framework comes close to touching it. The reason is that it acts as the glue that gets real-life things done everywhere from large governments, militaries, and large corporations all the way down to the small bed and breakfast operation, across the entire world. Normal people have got real processes built just by twiddling around in Excel.
Sadly, Excel and the dumbness of its environment have generated disasters in genomics (and in tons of other research areas too), causing millions if not billions in losses. Hint: skewed experiments/data and so on, making years of effort worthless.
That wouldn't happen under BioPython/BioPerl/R and a custom dedicated interface with no data mangling at all.
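For what it's worth, the classic genomics failure was Excel silently coercing gene symbols such as SEPT2 and MARCH1 into dates on import. A minimal sketch of the plain-text approach being argued for here, using Python's csv module so every field stays an untouched string (the tiny inline TSV is made up for illustration):

```python
import csv
import io

# a tiny made-up TSV with gene symbols that Excel historically coerced to dates
tsv = "gene\texpression\nSEPT2\t1.5\nMARCH1\t2.7\n"

reader = csv.DictReader(io.StringIO(tsv), delimiter="\t")
rows = list(reader)

# every field comes back as an untouched string: no date coercion possible
for row in rows:
    print(row["gene"], row["expression"])
```

Because nothing here guesses at types, "SEPT2" stays "SEPT2"; any numeric parsing is an explicit, later step under the programmer's control.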
People in the '90s joked about how MS set the whole IT industry back 20 years. Now it's literal, and not just in IT.
And that's sad, because you had Turbo Pascal, Windows NT, the VB6 IDE backed by C/C++ libraries... good products on MS platforms where data correctness was guaranteed by low-level libraries called from VB. BLAS/LAPACK certainly existed in the '90s as libraries for Visual C/C++.
Reusing MS Office for advanced tasks is at the root of the shitty computing we've been suffering in tons of places, such as the idiots using Excel tables for Covid patients instead of a proper SQL database. Even SQLite (IDK about the constraints; maybe it fits) would have been a better choice.
People said of Unix that "worse is better". Heh, nowadays even NDB 'databases' would grant you more correctness on scientific data (it's plain text with tuples) than these rotten binary, proprietary, Office-bound pseudo-databases and spreadsheets. Or even AWK with CSVs/TSVs.
Yep, and that was Windows, which introduced levels of latency and waiting much worse than equivalent DOS software, but with easier, more intuitive menus instead of the usual DOS UI routine: either no menus at all, or menus invoked with key combos, with power users knowing many key combinations that weren't strictly necessary but both sped things up and impressed newbies into thinking computers were too hard for them.
On a 486, Lotus 1-2-3 was essentially instant; even run from floppy disks it was faster than Excel is today on a top-of-the-line machine.
I did earn some bread with VBA as well, and I always advocate for efficiency, but I just opened a 12MB .xlsx file in LO, and it took a couple of seconds on a 2024 ThinkPad.
As far as I remember, my Win 3.11 machine (a 486 DX with 4MB RAM and a 30MB HDD) wouldn't have been able to store or open such a file, let alone recognize the extension. Also, it would call the file 2026022~.XL~ or something. And it took more than a couple of seconds to load Office programs, for sure. It would take well over a minute to load a workbook from a 1.44MB floppy.
Anyway, software and computers have come a long way and I'm grateful for it.
That file name thing came from Win95 keeping DOS (FAT) compatibility via VFAT: long file names were truncated to the standard 8.3 names from DOS (and, by extension, Win 3.11 too), with filesystem extensions used to store the long names alongside the DOS ones. Win95 would show the long names; DOS would show the short ones.
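A rough sketch of that 8.3 truncation in Python; this illustrates the general scheme (uppercase, strip illegal characters, six-character stem plus ~N), not the exact VFAT short-name algorithm:

```python
import re

def short_name(long_name, existing=frozenset()):
    """Sketch of DOS-style 8.3 short-name generation (not the exact VFAT rules)."""
    stem, _, ext = long_name.rpartition(".")
    if not stem:                      # no dot in the name at all
        stem, ext = long_name, ""
    # uppercase and strip characters that are illegal in 8.3 names
    clean = lambda s: re.sub(r"[^A-Z0-9_]", "", s.upper())
    stem, ext = clean(stem), clean(ext)[:3]
    if len(stem) <= 8:                # already fits: no mangling needed
        return f"{stem}.{ext}" if ext else stem
    # truncate the stem and append ~1, ~2, ... until the name is unique
    for i in range(1, 10):
        candidate = f"{stem[:6]}~{i}" + (f".{ext}" if ext else "")
        if candidate not in existing:
            return candidate
    raise ValueError("more collisions than this sketch handles")

print(short_name("2026-02-22 planning workbook.xlsx"))  # -> 202602~1.XLS
```

The `existing` set stands in for the names already present in the directory, which is what forces the real algorithm to bump ~1 to ~2 and so on.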
Also, file formats back then were binary and optimized, compared to the current XML behemoths compressed with zip. So a 12MB file in 1993 was probably something like 100k+ rows; try that out today.
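To put toy numbers on that difference (this illustrates packed binary versus tagged XML in general, not the real .xls or .xlsx on-disk formats, and the table is made up):

```python
import struct
import zlib
import xml.etree.ElementTree as ET

# a made-up 10,000-row, 5-column table of numbers
values = [float(i) for i in range(10_000 * 5)]

# old style: fixed-width binary records, 8 bytes per IEEE-754 double
binary = struct.pack(f"<{len(values)}d", *values)

# new style: one XML element per cell, then deflate the whole thing
root = ET.Element("sheet")
for v in values:
    ET.SubElement(root, "c").text = repr(v)
xml = ET.tostring(root)
zipped = zlib.compress(xml)

print(f"binary: {len(binary):>9,} bytes")
print(f"xml:    {len(xml):>9,} bytes")
print(f"zipped: {len(zipped):>9,} bytes")
```

The zip step claws back most of the XML tag overhead, which is why a modern .xlsx isn't as huge as its raw XML, but that compression and XML parsing is extra work paid at load time.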
This is just not true. The only chugging back then was reading from disk, and the entire Office suite was only a handful of 3.5" floppies. If you had already started Excel earlier, it was likely still cached in RAM and would start nearly instantly. If not, it was still only a few seconds.
Now, what was slow was actual computation. Try recalculating a big spreadsheet in Excel or counting words in a big Word document on that hardware: it takes a very long time, while on modern hardware it's nearly instant.
This does not match my memory of using Windows 3.1. Excel would likely not have been cached in RAM from a previous run, because a typical Windows 3.1 machine had only 4 megabytes of RAM.
This isn't true at all, based on my experience. Contemporary 386s were kind of slow, I guess, but these programs did not chug on a 486. I spent tons and tons of time in Excel and Access and writing VB/VBScript/macros.