Comment by nerdsniper

6 days ago

Oh yeah, LLMs currently spew a lot of garbage. Everything has to be double-checked. I mainly use them for gathering sources and pointing out a few considerations I might have otherwise overlooked. I often run them a few times, because they go off the rails in different directions, but sometimes those directions are helpful for me in expanding my understanding.

I still have to synthesize everything from scratch myself. Every report I get back is like "okay well 90% of this has to be thrown out" and some of them elicit a "but I'm glad I got this 10%" from me.

For me it's less about saving time, and more about potentially unearthing good sources that my google searches wouldn't turn up, and occasionally giving me a few nuggets of inspiration / new rabbit holes to go down.

Also, Google changed their business from Search to Advertising. Kagi does a much better job for me these days, and is easily worth the $5/mo I pay.

> For me it's less about saving time, and more about potentially unearthing good sources that my google searches wouldn't turn up, and occasionally giving me a few nuggets of inspiration / new rabbit holes to go down.

Yeah, I see the value here. And for personal stuff, that's totally fine. But these tools are being sold to businesses as productivity boosters, and I'm not buying it right now.

I really, really want this to work, though, as it would be such a massive boost to human flourishing. Maybe LLMs are the wrong approach; certainly the current models aren't doing a good job.