Comment by FuriouslyAdrift
2 days ago
LLMs are lossy compression of a corpus with a really good parser as a front end. As human-made content dries up (due to LLM use), AI products will plateau.
I see inference as the much bigger technology, although much better RAG loops for local customization could be a very lucrative product for a few years.
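For a sense of what a "RAG loop for local customization" means in practice, here is a minimal sketch: retrieve the most relevant local documents for a query, then stuff them into a prompt for the model. Everything here is illustrative, not from the comment — it uses a toy bag-of-words similarity in place of a real embedding model, and the documents and function names are hypothetical.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real RAG loop would use a dense
    # embedding model (e.g. a locally hosted sentence-transformer).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank local documents by similarity to the query, keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def rag_prompt(query: str, docs: list[str]) -> str:
    # Stuff retrieved local context into the prompt; the LLM inference
    # call itself (local or hosted) is left out of this sketch.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical local knowledge base:
docs = [
    "Our VPN config lives in /etc/wireguard on the office gateway.",
    "Lunch orders are collected every Friday at noon.",
    "The office gateway runs Debian and hosts the WireGuard VPN.",
]
print(rag_prompt("Where is the VPN config?", docs))
```

The customization value the comment points at lives in `docs`: the retrieval corpus is private, local data the base model never saw, so improving the loop (better chunking, embeddings, reranking) pays off without retraining the model.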