Comment by anon7000
20 hours ago
You’re hitting on the core problem. Experts have done the intensive research to create the guides on the Internet that ChatGPT is trained on. Take car repairs: ChatGPT can walk you through a lot of known issues. But who is going to take the time to seriously investigate and research a brand-new issue in a brand-new model of car? An expert, not an AI model. And as more people delegate their thinking to AI models, we end up with fewer experts.
ChatGPT is not an expert; it’s just statistically likely to regurgitate something very similar to what existing experts (or maybe amateurs or frauds!) have already said online. It isn’t creating any new information itself.
So if we end up with fewer people willing to do the hard work of producing the underlying expert knowledge these AI models are so generously trained on, we’ll see progress stagnate.
So encouraging people to write books and do real investigative research, digging for the truth, is more important than ever. A chatbot’s value proposition is repackaging that truth in a way you can understand and surfacing it when you might not have found it otherwise. Without people researching the truth, that already fragile foundation crumbles.
> You’re hitting at the core problem.
Are you writing in the style of an LLM as a gag, or do you interact with LLMs so much that it’s become ingrained?