Comment by ghaff

6 months ago

I think it's complicated.

A lot of large US tech corporations do have sizable research arms.

Bell Labs is certainly celebrated as part of what was then a telephone monopoly, though AT&T actually pulled out of operating system development when it withdrew from Multics, and Unix was pretty much a semi-off-hours project by Ritchie and Thompson.

It's true that you tend not to have firms as dominant as in the past. But companies like Microsoft still have significant research organizations. Maybe head-turning research advances are harder to come by than they used to be. Don't know. But some large tech firms are still putting lots of money into longer-term advances.

Yeah, F# and TypeScript are very impressive. We just got used to tonnes of innovation. It ain't UNIX, but I'd say TypeScript is as impressive: an exoskeleton for JS with a type system that rivals Haskell's.
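To make that claim a bit more concrete, here is a minimal sketch (an illustration, not code from the thread) of the kind of Haskell-flavoured modelling TypeScript layers on top of plain JS: a discriminated union used as an algebraic data type, with exhaustive handling checked by the compiler.

```typescript
// A discriminated union acting like an ADT: each variant is tagged by `kind`.
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "rect"; width: number; height: number };

function area(s: Shape): number {
  switch (s.kind) {
    case "circle":
      return Math.PI * s.radius ** 2;
    case "rect":
      return s.width * s.height;
    default: {
      // If a new variant is added to Shape and not handled above,
      // `s` no longer narrows to `never` and this assignment fails to compile.
      const unreachable: never = s;
      return unreachable;
    }
  }
}

console.log(area({ kind: "circle", radius: 2 })); // ~12.566
```

All of this erases to plain JavaScript at runtime; the guarantees exist only at compile time, which is what makes the "exoskeleton" description apt.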

See also VSCode and WSL.

And if we ain't impressed with LLMs then wtf! I mean maybe it is just nostalgia for the old times.

Lots of great stuff is coming out. Quantum computing. Open source revolution producing Tor, Bitcoin, Redis, Linux.

I think we are in a golden age!

And it is not all from one place, which is better.

  • The way SQL Server was ported to Linux, for example, makes use of Drawbridge.

    .NET and Java also started as research projects, as did GraalVM, Maxine, LLVM, many GHC features, OCaml improvements, ...

It makes sense that fundamental advances have become rarer as our cumulative scientific progress grows: in order to contribute something fundamentally new, people have to travel further / climb higher to see past what has already been done.

But US tech corps are probably not as free to pursue lines of thinking with no link to financial benefit. Academia is pretty broken. People are generally not free; they are economically bound to think about things more grounded in practical needs. The shackles of the almighty dollar.

So maybe rarer for a combination of reasons?

  • Probably. My former employer had a pretty active research collaboration with a number of universities, partly because they weren't that huge (especially at the time) and partly because one professor in particular at a local university wanted the grounding of industry collaboration.

    It's a balancing act. Many PhD students also care about working on stuff that real people actually care about.

If innovation could be reliably reproduced, it wouldn't be innovation.

All stories about great inventions happen the same way: being in the right place at the right time under the right circumstances. In other words, you can curate the "right" environment all you want, but you still need quite a bit of luck. And success does not imply it can be repeated.

Btw, the latest round of significant technological advancements came from a monopoly as well. After all, it was mostly at Google that researchers did the pioneering AI work of recent years.