Comment by sealeck

1 month ago

I think you're suffering from some survivorship bias here. There are a lot of technologies that don't work out.

Computation isn't one of them so far. Do you believe this is the end of computing efficiency improvements?

  • No, but there's really very little reason to think that makes the ol' magic robots less shit in any well-defined way. Like, it certainly _looks_ like they've plateaued.

    I often suspect that the tech industry's perception of reality is skewed by Moore's Law. Moore's Law is, quibbles aside, basically real, and has of course had a dramatic impact on the tech industry. But there is a tendency to assume that that sort of scaling is _natural_, and the norm, and should just be expected in _everything_. And, er, that is not the case. Moore's Law is _weird_.