Comment by gessha
7 months ago
I believe this is a case of “20% of the work requiring 80% of the effort”. The current progress on LLMs and the products built on top of them is impressive, but I’ll believe the blog’s claims when we have solid building blocks to build on, not APIs and assumptions that break all the time.
The volume of kool aid surrounding this industry is crazy to me. It’s truly ruining an industry I used to have a lot of enthusiasm for. All we have left is snake oil salesmen, like the Salesforce CEO telling lies about no longer hiring software engineers while they have over 900 software engineering roles on their careers page.
This entire blog article describes something that failed almost completely, with just about zero tangible success, hand-waved away with “clear paths” to fix it.
I’m just kind of sitting here stunned that the basic hallucination problem isn’t fixed yet. We are taking a natural language interface tool that isn’t really designed for doing anything quantitative and trying to shoehorn in that functionality by begging the damn thing to cooperate by tossing in more prompts.
I perused Andon Labs’ page and they have this golden statement:
> Silicon Valley is rushing to build software around today's AI, but by 2027 AI models will be useful without it. The only software you'll need are the safety protocols to align and control them.
That AI 2027 study that everyone cites endlessly is going to be hilarious to watch fall apart in embarrassment. 2027 is a year and a half away, and these scam AI companies are claiming that you won’t even need software by then.
Insanely delusional, and honestly, the whole industry should be under investigation for defrauding investors.
It seems like recent trends end up like this... it’s like we are desperate for any kind of growth, and it’s causing all kinds of pathologies with over-promising and over-investing...
Not just recent. All hype cycles are like this.