Comment by quickthrowman
3 hours ago
> I've often wondered how LLMs cope with basically waking up from a coma to answer maybe one prompt and then get reset, or a series of prompts.
The same way a light fixture copes with being switched off.
3 hours ago
> The same way a light fixture copes with being switched off.
Oh, these binary one-layer neural networks are so useful. Glad for your insight on the matter.
By comparing an LLM’s inner mental state to a light fixture, I was making an absurd comparison to say that I don’t think LLMs are sentient, and nothing more than that. I am not saying an LLM and a light switch are functionally equivalent; a single-pole switch has only two states.
I don’t really understand your response to my post. My interpretation is that you think LLMs do have an inner mental state and that I’m wrong? I may be misreading you.