Comment by phaedrus

3 days ago

This framework is a good example of something I call "there's plenty of room in the middle" (by analogy to Feynman's "plenty of room at the bottom" about nanotechnology).

Much like how cosmic inflation in the early universe left the imprint of the microwave background radiation written exponentially large across the sky, I believe the exponential expansion of computing during the Moore's law era left whole sections of scale in software not fully explored.

Specifically, as "average app sizes" went from 1K to 10K to 100K to 1M to 10M to 100M+, you could imagine the 1K-10K range got fully explored because the space wasn't that big to begin with, and the 1M-10M range got, if not explored, at least inhabited. But it could be that the middle range of neat applications weighing in at ~100K but under 1M never got completely explored, because we didn't spend enough time in that era of computing.

A similar thing may be playing out in AI scaling: we went from tiny to medium to huge model sizes, but now some new participants are going back and doing neat things with medium-sized models.