I sometimes wonder what the alternate reality where semiconductor advances ended in the eighties would look like.
We might have had to manage with just a few MB of RAM and efficient ARM cores running at maybe 30 MHz or so. Would we still get web browsers? How about the rest of the digital transformation?
One thing I do know for sure. LLMs would have been impossible.
For me the interesting alternate reality is where CPUs got stuck in the 200-400MHz range for speed, but somehow continued to become more efficient.
It’s kind of the ideal combination in some ways. It’s fast enough to competently run a nice desktop GUI, but not so fast that you can get overly fancy with it. Eventually you’d end up with OSes that look like highly refined versions of System 7.6/Mac OS 8 or Windows 2000, which sounds lovely.
My alternate reality "one of these days" projects is to have a RISC-V RV32E core on a small FPGA (or even emulated by a different SOC) that sits on a 40- or 64-pin DIP carrier board, ready to be plugged into a breadboard. You could create a Ben Eater-style small computer around this, with RAM, a UART, maybe something like the VERA board from the Commander X16...
It would probably need a decent memory controller, since it wouldn't be able to dedicate 32 pins to a data bus; loads and stores would need to be done either 8 or 16 bits at a time, depending on how many pins you want to use for that.
I loved System 7 for its simplicity yet all of the potential it had for individual developers.
Hypercard was absolutely dope as an entry-level programming environment.
I sometimes drop my CPU down to the 400MHz-800MHz range. 400 is rough. 800, not so bad. It runs fine, with something like i3 or sway.
If we really got stuck in the hundreds of MHz range, I guess we’d see many-core designs coming to consumers earlier. Could have been an interesting world.
Although, I think it would mostly be impossible. Or maybe we’re in that universe already. If you are getting efficiency but not speed, you can always add parallelism. One form of parallelism is pipelining. We’re at like 20 pipeline stages nowadays, right? So in the ideal case, if we weren’t able to parallelize in that dimension, we’d be at something like 6GHz/20 = 300MHz. That’s pretty hand-wavey, but maybe it is a fun framing.
The Game Boy Advance could run 2D games (and some 3D demos) on 2 AA batteries for 16 hours. I wonder if we could get something more efficient with modern tech? It seems research made things faster but more power hungry, and we compensate with better batteries instead. I guess we could, and it's a design-goal problem; I also do love a backlit screen.
Given enough power and space efficiency you would start putting multiple CPUs together for specialized tasks. Distributed computing could have looked different.
The alternative reality I wish we could move to, across the universe, is the one where SGI were the first to build a titanium laptop and became the world's #1 Unix laptop vendor...
Or if 640k was not only all you'd ever need, it was all we'd ever get.
There's something to this. The 200-400MHz era was roughly where hardware capability and software ambition were in balance — the OS did what you asked, no more.
What killed that balance wasn't raw speed, it was cheap RAM. Once you could throw gigabytes at a problem, the incentive to write tight code disappeared. Electron exists because memory is effectively free. An alternate timeline where CPUs got efficient but RAM stayed expensive would be fascinating — you'd probably see something like Plan 9's philosophy win out, with tiny focused processes communicating over clean interfaces instead of monolithic apps loading entire browser engines to show a chat window.
The irony is that embedded and mobile development partially lives in that world. The best iOS and Android apps feel exactly like your description — refined, responsive, deliberate. The constraint forces good design.
I'm in the early phases of working on a game that explores that.
The backstory is that in the late 2050s, when AI has its hands in everything, humans lose trust in it. There are a few high-profile incidents based on AI decisions, which cause public opinion to change, and an initiative is brought in to ensure important systems run hardware and software that can be trusted and human-reviewed.
A 16-bit CPU architecture, with no pipelining, speculative execution, etc., is chosen, as it's powerful enough to run such systems but also simple enough that a human can fully understand the hardware and software.
The goal is to make a near-future space exploration MMO. My MacBook Pro can simulate 3000 CPU cores simultaneously, and I have a lot of fun ideas for it. The irony is that I'm using LLMs to build it :D
We had web browsers, kinda, in that we'd call up BBSes, and use ansi for menus and such.
My Vic20 could do this, and a C64 easily, really it was just graphics that were wanting.
I was sending electronic messages around the world via FidoNet and PunterNet, downloaded software, was on forums, and that all on BBSes.
When I think of the web of old, it's the actual information I love.
And a terminal connected to a bbs could be thought of as a text browser, really.
I even connected to CompuServe in the early 80s via my C64 through "datapac", a dial gateway via telnet.
ANSI was a standard too, it could have evolved further.
Heavy webpages are the main barrier for projects like this. We need something that is just reader view for everything without the overhead of also being able to do non reader view. Like w3m or lynx but with sane formatting, word wrap etc.
> graphics that were wanting
Prodigy established a (limited) graphical online service in 1988.
I think the boring answer is that we waste computing resources simply because if memory and CPU cycles are abundant and cheap, developers don't find it worth their time to optimize nearly as much as they needed to optimize in the 1980s or 1990s.
Had we stopped with 1990s tech, I don't think that things would have been fundamentally different. 1980s would have been more painful, mostly because limited memory just did not allow for particularly sophisticated graphics. So, we'd be stuck with 16-color aesthetics and you probably wouldn't be watching movies or editing photos on your computer. That would mean a blow to social media and e-commerce too.
Apart from transputers mentioned already, there’s https://greenarrays.com/home/documents/g144apps.php
Both the hardware and the Forth software.
APIs in a B2B style would likely be much more prevalent, less advertising (yay!) and less money in the internet so more like the original internet I guess.
GUIs like https://en.wikipedia.org/wiki/SymbOS
and https://en.wikipedia.org/wiki/Newton_OS
show that we could have had quality desktops and mobile devices.
I want to chime in on SymbOS, which I think is the perfect reply to the GP's curiosity.
https://www.symbos.org/shots.htm
This is what slow computers with a few hundred kB of RAM can do.
I remember using the web on 25MHz computers. It ran about as fast as it does today with a couple GHz. Our internet was a lot slower then as well.
> I remember using the web on 25mhz computers. It ran about as fast as it does today with a couple ghz.
I know it’s a meme on HN to complain that modern websites are slow, but this is a perfect example of how completely distorted views of the past can get.
No, browsing the web in the early 90s was slooow. Even simple web pages took a long time to load. As you said, internet connections were very slow too. I remember visiting pages with photos that would come with a warning about the size of the page, at which point I’d get up and go get a drink or take a break while it loaded. Then scrolling pages with images would feel like the computer was working hard.
It’s silly to claim that 90s web browsers ran about as fast as they do today.
It crashed a lot more, the fonts (and screens) were uglier, and Javascript was a lot slower. The good thing was that there was very little Javascript.
I remember using the web in the 90s. I often left to make a sandwich while pages loaded.
Try opening Gmail on one of those. Won’t be fun.
No, if we had the web it would be more like what gopher was. Or maybe lynx.
Edit: oh I thought you meant if we were stuck in 6502-style stuff. With megabytes of RAM we'd be able to do a lot more. When I was studying we ran 20 X terminals with NCSA Mosaic on a server with a few CPUs and 128MB of RAM or so. Graphic browsing would be fine.
Only when Java and JavaScript came on the scene did things get unbearably slow. I guess in that scenario most processing would have stayed server-side.
> Would we still get web browsers?
https://en.wikipedia.org/wiki/PLATO_(computer_system) is from the 1960s, so, technically, it certainly is possible. Whether it would make sense commercially to support a billion users would depend on whether we would stay stuck on prices of the eighties, too.
Also, there’s mobile usage. Would it be possible to build a mobile network with thousands of users per km² with tech from the eighties?
> One thing I do know for sure. LLMs would have been impossible.
We had ELIZA, and that was enough for people to anthropomorphize their teletype terminals.
I always think the Core 2 Duo was the inflexion point for me. Before that, current software always seemed to struggle on current hardware, but after it, things were generally fine.
As much as I like my Apple Silicon Mac I could do everything I need to on 2008 hardware.
It's remarkable how a modern $50 SBC outperforms the old Core 2 Duo line.
Beyond the power of a single core, that era also brought the adoption of multicore and the move from 32 to 64 bit for the general user, which enabled more than 4GB of memory and let lots of processes co-exist more gracefully.
Transputers. Lots and lots and lots of transputers. (-:
And https://en.wikipedia.org/wiki/Connection_Machine
1 reply →
I don't think there's really a credible alternate reality where Moore's law just stops like that when it was in full swing.
The ones that "could have happened" IMO are the transistor never being invented, or even mechanical computers becoming much more popular much earlier (there's a book about this alternate reality, The Difference Engine).
I don't think transistors being invented was that certain to happen, we could've got better vacuum tubes, or maybe something else.
As someone has brought up, Transputers (an early parallel architecture) were a thing in the 1980s because people thought CPU speed was reaching a plateau. They were kind of right (which is why modern CPUs are multicore) but were a decade or so too early, so transputers failed in the market.
When the MC68030 (1986) was introduced, I remember reading that computers probably wouldn't get much faster, because PCB signal integrity would not allow further improvements.
People at the time were not actually sure how long the improvements would go on.
Teletext existed in the 80s and was widely in use, so we'd have some kind of information network.
BBSes existed at the same time and if you were into BBSes you were obsessive about it.
We did have web browsers, I had Internet Explorer on Windows 3.1, 33mhz 8mb RAM.
I still remember the Mosaic from NCSA. Internet in a box.
Probably was "Windows 3.11, For Workgroups" as iirc Windows 3.1 didn't ship with a TCP/IP stack
You'd probably get much more multiprocessor stuff much earlier. There's probably 2 or 3 really good interfaces to wire an almost arbitrary number of CPUs together and run some software across all of them (AMP not SMP).
This is basically the premise of the Fallout universe. I think in the story it was that the transistor was never invented, though.
And imagine if telecom had topped out around ISDN somewhere, with perhaps OC-3 (155Mbps) for the bleeding-fastest network core links.
We'd probably get MP3 but not video to any great or compelling degree. Mostly-text web, perhaps more gopher-like. Client-side stuff would have to be very compact, I wonder if NAPLPS would've taken off.
Screen reader software would probably love that timeline.
You are wrong. The Windows 3.11 era used CPUs at like 33MHz, and yet we had TONS of graphical applications, including web browsers, Photoshop, CAD, Excel, and instant messengers.
The only thing that killed the web for old computers is JAVASCRIPT.
I have a Hayes 9600bps modem for web surfing.
I remember when I went from a 286 to a 486DX2; the difference was impressive, being able to run a lot of graphical applications smoothly.
Ironically, now I'm using an ESP32-S3, 10x more powerful, just to run IoT devices.
It's probably possible to develop analog ADSL chips with 1990 semiconductor tech, but it would be pretty difficult.
Depends how pervasive OC3 would have gotten. A 1080p video stream is only about 7 Mbps today.
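Rough arithmetic on that, taking the 7 Mbps figure at face value (a quick sketch, ignoring SONET overhead and real-world contention):

    #include <stdio.h>

    int main(void)
    {
        const double oc3_mbps = 155.0; /* OC-3 line rate */
        const double hd_mbps  = 7.0;   /* a typical 1080p stream today, per the comment above */

        /* How many such streams fit through one OC-3 link, ignoring overhead. */
        printf("concurrent 1080p streams per OC-3: ~%.0f\n", oc3_mbps / hd_mbps);
        return 0;
    }

So a single OC-3 core link could carry a couple of dozen such streams at best.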
You should definitely watch Maniac: https://en.wikipedia.org/wiki/Maniac_(miniseries)
There are web browsers for 8-bit machines today, and there were web browsers for, e.g., Amigas with 68000 CPUs from 1979 back in the day.
> One thing I do know for sure. LLMs would have been impossible.
Maybe they could, as ASICs in some laboratories :)
Honestly, I think we could’ve pulled off a lot earlier if GPU development had invested in GPGPU earlier.
I can see it now… a national lab can run ImageNet, but it takes so many nodes with unobtanium 3dfx stuff that you have to wait 24 hours for a run to be scheduled and completed.
I was doing Schematic Capture and Layout on a 486 with <counts voice> one two three four five six seven eight 8 megabytes of RAM ah haha.
>I sometimes wonder what the alternate reality where semiconductor advances ended in the eighties would look like.
We would have seen far fewer desktop apps written using JavaScript frameworks.
tbh we'd probably just have really good Forth programmers instead of LLMs. same vibe, fewer parameters.
> Would we still get web browsers?
Yes, just that they would not run millions of lines of JavaScript for some social media tracking algorithm, newsletter signup, GDPR popup, newsletter popup, ad popup, etc. and you'd probably just be presented with the text only and at best a relevant static image or two. The web would be a place to get long-form information, sort of a massive e-book, not a battleground of corporations clamoring for 5 seconds of attention to make $0.05 off each of 500 million people's doom scrolling while on the toilet.
Web browsers existed back then; the web in the days of NCSA Mosaic was basically exactly the above.
The whitewashing of the past in this thread is something else.
Did everyone forget the era of web browsing when pages were filled with distracting animated banner ads?
The period when it was common for malicious ads to just hijack the session and take you to a different page?
The pop-up tornados where a page would spawn pop ups faster than you could close them? Pop unders getting left behind to discover when you closed your window?
Heavy flash ads causing your browser to slow to a crawl?
The modern web browsing experience without an ad blocker feels tame compared to the early days of Internet ads.
Actually real AI isn’t going to be possible unless we return to this arch. Contemporary stacks are wasting 80% of their energy which we now need for AI. Graphics and videos are not a key or necessary part of most computing workflows.
Well, we wouldn't have ads and tracking.
Prodigy launched online ads from the 1980s. AOL as well.
HotWired (Wired's first online venture) sold their first banner ads in 1994.
DoubleClick was founded in 1995.
Neither were limited to 90's hardware:
Web browsers were available for machines like the Amiga, launched in 1985, and today you can find people who have made simple browsers run on 8-bit home computers like the C64.
If such an alternate reality has internet of any speed above "turtle in a mobility scooter" then there for sure would be ads and tracking.
The young WWW had garish flashing banner ads.
Wait, there is an 800x480 display connected, but the thing only has 46k of RAM. There's no explanation of the display approach being used.
The extended graphics commands seem to allow X/Y positioning with an 8-bit color.
I think the picture shows an 80x25 screen?
What gives here? Anyone know what's going on?
The display controller they are using (RA8875 or RA8889) has several hundred KB of internal memory. So you can write to the screen and the image will "stay there", as it were; you don't have to store a framebuffer or keep writing out the image like with a CRT.
It probably has a character-mapped display, so you can only display 256 different (ASCII and graphics) characters in a memory-mapped 80*25 = 2000-byte display buffer.
EDIT: I can now see that it does have bitmapped graphics. It must have a built-in serial-like terminal with graphics capabilities.
EDIT2: Probably using this chip: https://www.adafruit.com/product/1590
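To make the character-mapped idea concrete, here's a minimal C sketch of an 80x25 text buffer (the layout is hypothetical and host-side, not the RA8875's actual register interface): one byte per cell, 2000 bytes total, which the display hardware scans out on its own.

    #include <stdint.h>
    #include <string.h>

    #define COLS 80
    #define ROWS 25

    /* One byte per character cell; on a real machine this would sit at a
     * fixed, memory-mapped address that the video hardware reads directly. */
    static uint8_t text_buffer[ROWS * COLS];   /* 80 * 25 = 2000 bytes */

    /* Place one character at a column/row position. */
    static void put_char(int col, int row, char c)
    {
        if (col >= 0 && col < COLS && row >= 0 && row < ROWS)
            text_buffer[row * COLS + col] = (uint8_t)c;
    }

    /* Write a string starting at a position, without wrapping. */
    static void put_string(int col, int row, const char *s)
    {
        while (*s && col < COLS)
            put_char(col++, row, *s++);
    }

    int main(void)
    {
        memset(text_buffer, ' ', sizeof text_buffer);
        put_string(0, 0, "HELLO, 6502");
        return 0;
    }

The appeal for small machines is obvious: a whole screen of text fits in 2000 bytes, versus hundreds of KB for a bitmapped framebuffer at 800x480.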
:-)
I love this.
It always mildly tickles me when retrocomputer designs use anachronistic processors way more powerful than the CPU in their design - in this case, there’s an ATmega644 as a keyboard controller (64K ROM - although only 4K RAM, up to 20MHz) and presumably something pretty powerful in the display board.
3D printer beds have been getting bigger, but slicers don’t seem to account for curling as large prints cool. The problem is long linear runs on bottom infill and perimeters shrinking. I’ve been cutting my large parts into puzzle-like shapes, but printing them fully assembled. This adds curved perimeters throughout the bottom layer, reducing the distance stress can travel before finding a seam to deform.
That said, a retro laptop this thick would look really nice in stained wood.
Stunning work! Astounding progress, since it's under 3 months old, from PCB to this result.
Funnily enough, I've been musing this past month whether I'd separate work better if I had a limited Amiga A1200 PC for anything other than work! This would fit nicely.
Please do submit to HackaDay; I'm sure they'd salivate over this, and it's amazing when you have the creator in the comments. Even if just to explain that no, a 555 wouldn't quite achieve the same result. No, not even a 556...
> Yes, I know I'm crazy, but
Any time I see this phrase I know these are my people.
Crazy for wanting a computer that's actually yours.
I believe there will come a day where people who can do this will be selling these on the black market for top dollar.
neat. not something i'd hanker for. i saw a 16 core z80 laptop years ago and i often think about it because it can multitask. https://hackaday.com/2019/12/10/laptop-like-its-1979-with-a-...
I implemented "multitasking" (well, two-tasking) between a BASIC program and native code on a Z80, using a "supervisor" driven by hardware interrupts. There's just so much you can pack in a 4MHz CPU with a 4-bit ALU (yes, not 8-bit). It worked for soft-realtime tasks, but would be a rather weak desktop.
The follow-on to CP/M, which ran on the Z80, is MP/M, which is a multitasking OS.
I love the super clunky retro esthetic!
Takes me back to a time when a laptop would encourage the cat to share a couch because of the amount of heat it emitted.
Amazingly quick as well. Pointless projects are so much better and more fun when they don't take forever!
This was very interesting until I saw it had a Pi Pico for some reason
I love the case material. What is it? It looks like what they make the bulk post boxes out of here (if you ship a lot of material via post, they give you these boxes to put them in to/from the delivery centre), or corflute material (election candidates posters around here).
Looks like 3D printed PLA.
Maybe this can achieve RYF certification.
What I would really love: modern devices (continuously built, on less than 10-year-old tech) that are RYF-certified.
Brilliant! I love it. Bonus points for using the eWoz monitor. It’s giving me the itch to build it.
I wonder how long the battery lasts. The LCD backlight probably draws more power than the CPU (<0.1W, even with no special low-power idle modes.)
Awesome! Gives me mnt pocket reform vibes.
https://shop.mntre.com/products/mnt-pocket-reform
lol hi merlin, was peeking in the comments wondering if anyone would say this
Recently purchased a Pocket8086 and I can say – these sorts of things are _very_ fun.
> 46K RAM
Not 64?
(Edit: I see part of the address space is reserved for ROM, but it still seems a bit wonky.)
The 6502 doesn't have separate I/O addresses, so you need to fit all devices in the 64K address space, not just ROM.
The Atari 130XE used bank switching to handle more memory along with the I/O-reserved memory (i.e. you had an address, $D301, where you would change bits for the memory bank, and it would redirect $4000-$7FFF to another bank in the extended memory).
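A rough C model of that kind of bank switching, just to illustrate the mechanism (the register bits and bank count here are simplified, so don't treat it as an exact 130XE memory map): writing the bank-select register decides which 16KB extended bank appears through the $4000-$7FFF window.

    #include <stdint.h>
    #include <stdio.h>

    #define BANK_SIZE 0x4000            /* 16KB window at $4000-$7FFF */
    #define NUM_BANKS 4                 /* simplified: four extended banks */

    static uint8_t base_ram[0x10000];   /* the normal 64K address space */
    static uint8_t ext_ram[NUM_BANKS][BANK_SIZE];
    static int     current_bank = -1;   /* -1: window shows base RAM */

    /* Model a write to a (simplified) bank-select register like $D301. */
    static void write_bank_register(uint8_t value)
    {
        if (value & 0x10)                         /* hypothetical "extended RAM off" bit */
            current_bank = -1;
        else
            current_bank = (value >> 2) & 0x03;   /* bits 2-3 pick the bank */
    }

    /* CPU read: addresses in the window may be redirected to extended RAM. */
    static uint8_t cpu_read(uint16_t addr)
    {
        if (current_bank >= 0 && addr >= 0x4000 && addr <= 0x7FFF)
            return ext_ram[current_bank][addr - 0x4000];
        return base_ram[addr];
    }

    int main(void)
    {
        ext_ram[2][0] = 0xAB;
        write_bank_register(2 << 2);    /* select bank 2, extended RAM on */
        printf("$4000 reads $%02X\n", (unsigned)cpu_read(0x4000));
        return 0;
    }

The nice part is that the 6502 itself never knows: it just reads and writes $4000-$7FFF, and the glue logic decides which physical RAM answers.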
This would have been absolutely mind blowing back in the day!
this post made me smile. why not!!! 6502 my first processor. <3
Nice one - the prototype sure reminds me of the early OpenPandora days ..
Wow. It's fresh as a rose! Congratulations!
How about a cassette tape storage?
Serious question; Why 6502?
BBC Micro, Acorn Atom, Commodore PET, all kinds of home computers. So prime retro material, late 70s/early 80s. There's a CP/M port, so you can use PIP, which traces its lineage back to the DEC TOPS-10 operating system if not before (it's a peripheral I/O command model, although I think CP/M's PIP only shares the name).
Add a DIN plug and record programs in Kansas City Standard on a cassette recorder. Could be a Walkman. A floppy (full 8" type) was a luxury. Almost a megabyte! Imagine what you can do... when a program is the amount of text you can fit in the VBI of a ceefax/teletext broadcast, or is typed in by hand in hex. Kansas City Standard is 300 bits/second and the tape plays in real-time, so a standard C60 is like 160kb on both sides if you were lucky: it misread and miswrote a LOT.
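Quick back-of-the-envelope on that capacity, since the numbers are easy to check (my figures, ignoring leaders, gaps and retries): 300 baud for 30 minutes a side, with Kansas City Standard spending roughly 11 bits per byte (start + 8 data + 2 stop).

    #include <stdio.h>

    int main(void)
    {
        const double baud          = 300.0;     /* KCS bit rate */
        const double side_seconds  = 30 * 60;   /* one side of a C60 */
        const double bits_per_byte = 11.0;      /* start + 8 data + 2 stop */

        double raw_bits_per_side = baud * side_seconds;           /* 540,000 */
        double bytes_per_side    = raw_bits_per_side / bits_per_byte;
        double kb_both_sides     = 2.0 * bytes_per_side / 1024.0;

        printf("raw bits per side: %.0f\n", raw_bits_per_side);
        printf("bytes per side   : %.0f\n", bytes_per_side);
        printf("KB on both sides : %.0f\n", kb_both_sides);       /* ~96 KB */
        return 0;
    }

So call it on the order of 100KB of usable data across both sides, which is still a fortune when a program fits in a teletext VBI.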
I used to do tabular GOTO jump table text adventures, and use XOR screen line drawing to do moving moire pattern interference fringes. "mod scene" trippy graphics!
That's a Mandelbrot in ASCII on the web page, the best I've seen. Super stuff.
People wrote tiny languages for the 6502: integer-only but C-like syntax, or Pascal or ALGOL. People did real science in BASIC; a one-weekend course got you what you needed to do some maths for a Master's or PhD in some non-CS field.
My friends did a lot more of this than me. Because I had access to a DEC-10 and a PDP-11 at work, and later VAX/VMS and BSD UNIX systems, I didn't see the point of home machines. A wave I wish I'd ridden, but not seeing the future emerge has been a constant failure of mine.
I wrote (mostly copied from printed code and altered) games in BASIC. Too bad I didn't have enough understanding of what could have been done in assembly language... Now I keep rediscovering them, but it's only for the sake of nostalgia (and personal development).
The 6502 is the best 8-bit CPU for learning stuff. There's a lot you could add to it, but there is very little you could take away. It's minimal, but you have everything you need.
6502 based computers shouldn’t have a “dir” command. It’s “catalog” for detailed info or “cat” for the short one.
No, it should be
Way cool! When can I buy one?
Legend!!!
TIL the Atari Lynx was a handheld competitor to the Game Boy... It was launched with a 65C02 processor.
https://en.wikipedia.org/wiki/Atari_Lynx
Complete madness! But, I love it.
I love this! I’ve been working on a 6502 kernel. I have an arch trick to give the 6502 tons of memory so it can do a kind of Genera-like babashka lisp machine.
Good timing. My current weekend project is constructing something similar to the first third of Ben Eater's 6502 design (last weekend was the clock module plus some eccentricities).
It occurred to me that, given the 6502's predictable clock cycle timings, it should be possible to create a realtime disassembler using e.g. an Arduino Mega 2560 plus a character LCD display attached to the 6502's address/data/etc. pins.
Of course, this would only be useful in single-stepping/very slow clock speeds. Still, I think it could be useful in learning how the 6502 works.
Is there relevant prior work? I'm struggling with my google fu.
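For what it's worth, the decode side is pretty small. Here's a hedged C sketch of just that part (the opcode table is deliberately tiny and the bus capture is faked with an array, since the actual pin wiring depends on your Arduino setup): sample address and data each clock, and whenever the 6502's SYNC pin says "this is an opcode fetch", look the byte up and print it.

    #include <stdint.h>
    #include <stdio.h>

    /* Deliberately incomplete opcode table -- just enough to show the idea. */
    static const char *mnemonic(uint8_t opcode)
    {
        switch (opcode) {
        case 0xA9: return "LDA #imm";
        case 0xAD: return "LDA abs";
        case 0x8D: return "STA abs";
        case 0x4C: return "JMP abs";
        case 0x20: return "JSR abs";
        case 0x60: return "RTS";
        case 0xEA: return "NOP";
        default:   return "???";
        }
    }

    /* One bus sample per clock cycle; a stand-in for reading the 6502's
     * address/data pins plus SYNC from a microcontroller's GPIOs. */
    struct bus_sample {
        uint16_t address;
        uint8_t  data;
        int      sync;   /* 1 while the CPU fetches an opcode byte */
    };

    static void trace(const struct bus_sample *s, int count)
    {
        for (int i = 0; i < count; i++)
            if (s[i].sync)
                printf("%04X  %02X  %s\n", (unsigned)s[i].address,
                       (unsigned)s[i].data, mnemonic(s[i].data));
    }

    int main(void)
    {
        /* A made-up capture of LDA #$01, STA $6000, JMP $8000. */
        struct bus_sample capture[] = {
            { 0x8000, 0xA9, 1 }, { 0x8001, 0x01, 0 },
            { 0x8002, 0x8D, 1 }, { 0x8003, 0x00, 0 }, { 0x8004, 0x60, 0 },
            { 0x8005, 0x4C, 1 }, { 0x8006, 0x00, 0 }, { 0x8007, 0x80, 0 },
        };
        trace(capture, (int)(sizeof capture / sizeof capture[0]));
        return 0;
    }

On real hardware you'd replace the capture array with GPIO reads taken on each manually stepped clock edge, which is exactly where the 6502's predictable cycle timing helps.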
And it mostly runs Microsoft software, too... Basic from 1977 :-P
It does not run Microsoft software at all, as far as I can tell. EhBASIC isn't Microsoft BASIC; EhBASIC was written by Lee Davison. And this particular version was further enhanced (see GitHub). And Wozmon was obviously written by Woz... not Microsoft.
There has been some discussion around this, and Lee Davison is no longer with us, so that makes it more difficult. It appears from the source code that Lee's independent BASIC is heavily based on Microsoft BASIC. I'm sure it is no longer an issue, especially as Microsoft has provided a free license for Microsoft 6502 BASIC, but the licensing situation is not entirely clear.
It's Commodore 64-ish. I like it.
More like Commodore 46.
Wow! Now this is cool!