Comment by written-beyond
10 hours ago
Gemini 1.5 announced the 1 million token context window in 2024. I admire this forward-looking view of new technologies, especially given how bad people can be at predictions, as a look through historical HN posts/comments shows.
If we look back 2 years, companies weren't investing so heavily in training their LLMs on code. Any code they got their hands on was whatever happened to be in the training corpus. It's well known that the most recent improvements in LLM productivity came after they spent millions paying various labs to produce more coding datasets for them.
So LLMs have gotten a lot better at not needing the entire codebase in context at once: their weights are now so well tuned to development environments that they can infer and index things as needed. However, I fail to see how the context window limitation would no longer be an issue, since it's a fundamental constraint of the real world. Will we get better and more efficient ways of splitting and indexing context windows? Surely. Will that reduce our fear of soiling our contexts with bad prompt-response cycles? Probably not...