Comment by yesco
3 months ago
I'd argue their brand might be too strong: ChatGPT has already begun to enter the same semantic space as "Velcro". Everyone I know seems to have tried it, but you quickly realize that for most people ChatGPT == LLM; people say they're "using ChatGPT" on completely different platforms.
In the end, I suppose, regardless of technical understanding, people will always shop around on price if the feature sets are similar enough.
The thing is, laypeople aren't using anything other than Google Search even for LLM answers.
If I want an LLM answer to "is erythritol bad for you", I'm not firing up ChatGPT. I'm just typing it into Google, and the LLM answer it spits out is pretty good.
ChatGPT needs to be significantly more compelling for most people to use it for one-shot LLM answers over Google Search. And the minute Google removes the one-shottedness of its search answers, it's over for ChatGPT.
Imo, in the search engine space, ChatGPT is just "a feature, not a product", as the adage goes.
I have no idea on the data, but anecdotally I'd dispute this. Regular folks in my life (i.e. non-techies) routinely talk to an LLM and use it to answer questions that 5 years ago they'd have searched for.
The techies are the ones who are steering clear and sticking with search engines in my experience.
It's a Google without ads right now. Worth using until that day comes.
ChatGPT, though, isn't where the profit is going to come from. Businesses using LLMs are, and Amazon (AWS) is now selling access via Bedrock, as is Google (GCP). Models are becoming a commodity. In *every* implementation I've done, one of the requirements has been the ability to easily switch between multiple models.
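That multi-model requirement usually ends up looking like a thin provider-agnostic layer so business logic never hard-codes a single vendor's client. A minimal sketch (all names here are hypothetical, and the backends are stubs standing in for real vendor SDK calls):

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Completion:
    model: str
    text: str

# Each backend is just a callable: prompt -> text. In a real app these
# would wrap the OpenAI / Bedrock / Vertex SDKs; here they are stubs.
_backends: Dict[str, Callable[[str], str]] = {}

def register(name: str, fn: Callable[[str], str]) -> None:
    """Register a backend under a model name so it can be swapped by config."""
    _backends[name] = fn

def complete(model: str, prompt: str) -> Completion:
    """Route a prompt to whichever backend is configured for this model name."""
    if model not in _backends:
        raise KeyError(f"unknown model backend: {model}")
    return Completion(model=model, text=_backends[model](prompt))

# Stub backends; switching vendors is just a different string at the call site.
register("vendor-a", lambda p: f"[a] {p}")
register("vendor-b", lambda p: f"[b] {p}")

print(complete("vendor-a", "hello").text)  # [a] hello
print(complete("vendor-b", "hello").text)  # [b] hello
```

The point of the indirection is exactly the commoditization argument above: if models are interchangeable, the switch should be a config change, not a rewrite.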