Comment by SilverSlash

3 days ago

The title made me think Carmack was criticizing poorly optimized software and advocating for improving performance on old hardware.

When in fact, the tweet is absolutely not about either of the two. He's talking about a thought experiment where hardware stopped advancing and concludes with "Innovative new products would get much rarer without super cheap and scalable compute, of course".

> "Innovative new products would get much rarer without super cheap and scalable compute, of course".

Interesting conclusion. I'd argue we haven't seen much innovation since the smartphone (18 years ago now), and that's largely because capital has relied on hardware advances to sell consumers what is essentially the same product they already have.

Of course, I can't read anything past the first tweet.

  • We have self driving cars, amazing advancement in computer graphics, dead reckoning of camera position from visual input...

    In the meantime, hardware has had to go wide on threads as single core performance has not improved. You could argue that's been a software gain and a hardware failure.

    • > single core performance has not improved.

      Single core performance has improved, but at a much slower rate than I experienced as a kid.

      Over the last 10 years, we've seen something like a 120% improvement in single core performance.

      And, not for nothing, efficiency has become much more important. More CPU performance hasn't been a major driving factor vs having a laptop that runs for 12 hours. It's simply easier to add a bunch of cores and turn them all off (or slow them down) to gain power efficiency.

      That's not to say the performance story would be vastly different had the focus been on performance over efficiency, but I'd say it does have an effect on design choices.
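      Back of the envelope, that ~120% figure works out to a fairly modest annual rate (a quick sketch, assuming steady compounding over the full decade):

      ```python
      # Annualized rate implied by a ~120% total single-core improvement
      # over 10 years (i.e., 2.2x total performance), assuming steady compounding.
      total_gain = 2.2
      years = 10
      annual_rate = total_gain ** (1 / years) - 1
      print(f"~{annual_rate:.1%} per year")  # roughly 8% per year
      ```

      A far cry from the roughly 50% annual gains often cited for the 90s.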


  • And I'd argue that we've seen tons of innovation in the past 18 years aside from just "the smartphone" but it's all too easy to take for granted and forget from our current perspective.

    First up, the smartphone itself had to evolve a hell of a lot over 18 years or so. Go try to use an iPhone 1 and you'll quickly see all of the roadblocks and what we now consider poor design choices littered everywhere, vs improvements we've all taken for granted since then.

    18 years ago was 2007? Then we didn't have (for better or for worse on all points):

    * Video streaming services

    * Decent video game marketplaces or app stores. Maybe "Battle.net" with like 5 games, lol!

    * VSCode-style IDEs (you really would not have appreciated Visual Studio or Eclipse of the time..)

    * Mapping applications on a phone (there were some stand-alone solutions like Garmin and TomTom just getting off the ground)

    * QR Codes (the standard did already exist, but mass adoption would get nowhere without being carried by the smartphone)

    * Rideshare, food, or grocery delivery services (aside from taxis and whatever pizza or chinese places offered their own delivery)

    * Voice-activated assistants (including Alexa and other standalone devices)

    * EV Cars (that anyone wanted to buy) or partial autopilot features aside from 1970's cruise control

    * Decent teleconferencing (Skype's featureset was damn limited at the time, and any expensive enterprise solutions were dead on the launchpad due to lack of network effects)

    * Decent video displays (flatscreens were still busy trying to mature enough to push CRTs out of the market at this point)

    * Color printers were far worse during this period than today, though that tech will never run out of room for improvement.

    * Average US Internet speeds to the home were still ~1Mbps, and cellphone speeds of 100kbps were quite luxurious. Average PCs had 2GB RAM and 50GB of hard drive space.

    * Naturally: the tech everyone loves to hate, such as AI, cryptocurrencies, social network platforms, "the cloud" and SaaS, JS frameworks, Python (at least 3.0, and realistically even heavy adoption of 2.x), node.js, etc. Again, "is this a net benefit to humanity" and/or "does this get poorly or maliciously used a lot" doesn't speak to whether a given phenomenon is innovative, and all of these objectively are.

    • > * Video streaming services

      Netflix video streaming launched in 2007.

      > * VSCode-style IDEs (you really would not have appreciated Visual Studio or Eclipse of the time..)

      I used VS2005 a little bit in the past few years, and I was surprised to see that it contains most of the features that I want from an IDE. Honestly, I wouldn't mind working on a C# project in VS2005 - both C# 2.0 and VS2005 were complete enough that they'd only be a mild annoyance compared to something more modern.

      > partial autopilot features aside from 1970's cruise control

      Radar cruise control was a fairly common option on mid-range to high-end cars by 2007. It's still not standard in all cars today (even though it _is_ standard on multiple economy brands). Lane departure warning was also available in several cars. I will hand it to you that L2 ADAS didn't really exist the way it does today though.

    • The future is unevenly distributed.

      > Video streaming services

      We watched a stream of the 1994 World Cup. There was a machine at MIT which forwarded the incoming video to an X display window

          xhost +machine.mit.edu
      

      and we could watch it from several states away. (The internet was so trusting in those days.)

      To be sure, it was only a couple of frames per second, but it was video, and an audience collected to watch it.

      > EV Cars (that anyone wanted to buy)

      People wanted to buy the General Motors EV1 in the 1990s. Quoting Wikipedia, "Despite favorable customer reception, GM believed that electric cars occupied an unprofitable niche of the automobile market. The company ultimately crushed most of the cars, and in 2001 GM terminated the EV1 program, disregarding protests from customers."

      I know someone who managed to buy one. It was one of the few which had been sold rather than leased.

    • > ...TomTom just getting off the ground

      TomTom was founded in 1991 and released their first GPS device in 2004. By 2007 they were pretty well established.

    • I worked for a 3rd party food delivery service in the summer of 2007. Ordering was generally done by phone, then the office would text us (the drivers) order details for pickup & delivery. They provided GPS navigation devices, but they were stand-alone units that were slower & less accurate than modern ones, plus they charged a small fee for using it that came out of our pay.

    • Your post seems entirely anachronistic.

      2007 is the year we did get video streaming services: https://en.wikipedia.org/wiki/BBC_iPlayer

      Steam was selling games, even third party ones, for years by 2007.

      I'm not sure what a "VS-Code style IDE" is, but I absolutely did appreciate Visual Studio ( and VB6! ) prior to 2007.

      2007 was in fact the peak of TomTom's profit, although GPS navigation isn't really the same as general purpose mapping application.

      Grocery delivery was well established, Tesco were doing that in 1996. And the idea of takeaways not doing delivery is laughable, every establishment had their own delivery people.

      Yes, there are some things on that list that didn't exist, but the top half of your list is dominated by things that were well established by 2007.


    • Most of that list is iteration, not innovation, like going from "crappy colour printer" to "not-so-crappy colour printer".

    • >netflix

      >steam

      >Sublime (Of course ed, vim, emacs, sam, acme already existed for decades by 2007)

      >No they weren't TomTom already existed for years, GPS existed for years

      >You're right that they already existed

      >Again, already existed, glad we agree

      >Tech was already there just putting it in a phone doesn't count as innovation

      >NASA was driving electric cars on the moon while Elon Musk was in diapers

      >I was doing that in the early 80s, but Skype is a fine pre-2007 example, thanks again

      >You're right we didn't have 4k displays in 2007, but that's not exactly a software innovation. This is a good example of a hardware innovation used to sell essentially the same product

      >Are you sure you didn't have a bad printer? There have been good color printers since the 90s, let alone 2007. The price to performance arguably hasn't changed since 2007; you are just paying more in running costs than upfront.

      >This is definitely hardware. Scripting language 3.0 or FOTM framework isn't innovative in that there is no problem being solved and no economic gain; if they didn't exist, people would use something else and that would be that. With AI, the big story was that there WASN'T a software innovation, and what few innovations do exist will die to the Bitter Lesson.

  • There has been a lot of innovation, but it's focused on niches, so if you're not in a niche you don't see it and wouldn't care if you did. Most of the major things we need have already been invented. I recall word processors as a kid, so they date back to at least the 1970s; we still need word processors, and there is a lot of polish that can be added, but the innovation is in niche things that the majority of us wouldn't have a use for even if we knew about them.

    Of course innovation is always in bits and spurts.

I think it's a bad argument, though. If we had to stop adding features for a little while and created some breathing room, the features would come roaring back. There'd be a downturn, sure, but not a continuous one.

A subtext here may be his current AI work. In OP, Carmack is arguing, essentially, that 'software is slow because good smart devs are expensive and we don't want to pay for them to optimize code and systems end-to-end as there are bigger fish to fry'. So, an implication here is that if good smart devs suddenly got very cheap, then you might see a lot of software suddenly get very fast, as everyone might choose to purchase them and spend them on optimization. And why might good smart devs become suddenly available for cheap?

This is exactly the point. People ignore that "bloat" is not (just) "waste"; it is a developer-productivity increase motivated by economics.

The ability to hire and have people be productive in a less complicated language expands the market for workers and lowers cost.