Comment by tyleo
9 hours ago
TBH "The Oracle" personality doesn't always bother me. There's a bunch of stuff that gets asked in corporate chat that makes me think, "does this person know how to use Google?" I think the same can be said for the chatbots at this point.
I feel like we just need the equivalent of "Let me Google That" for LLMs.
Interesting, that's the one I hate the most.
Caveat: I'm talking about questions that are in depth or require some nuance beyond surface-level information. The responders don't know the answer themselves, so rather than saying "oh, sorry, I don't know, but this document or person might help," they paste in chatbot output.
As an example, I might post in our Teams chat that I've seen an issue on our physical hardware: there's a step change in the telemetry (increased vibration, or some other erratic behavior on a thermocouple). Has anyone had experience with what can cause this on this type of installation?
Then you get a Copilot paste of the prompt "what causes high vibration on rotating machinery".
If somebody asks a question that could clearly be answered by a quick Google search or LLM prompt, then fair enough, but I'm after specific product knowledge from our technical team. On reflection, I might start being very specific in my questions so other members of the team can't go down that route as easily.
"Let me google that" already mostly gets you LLM output, because Google is fairly ruined at this point. Also, if someone asks me a question, I think it would be rude of me to respond with, "Maybe I know the answer, maybe I don't, but why don't you send your question to a machine that will give you an answer that just might be right by coincidence?"