Comment by somenameforme

9 days ago

Setting the temperature to 0 doesn't make an LLM non-probabilistic. Once again, LLMs are inherently probabilistic. All setting the temperature to 0 does is make it always choose the highest-probability token instead of using a weighted randomization. You'll still get endless hallucinations and the same inherent limitations, including the inability to reliably simulate a Turing machine.
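To make the point concrete, here's a minimal sketch of what the sampling step looks like (the `sample_token` helper is hypothetical, not any particular library's API): temperature 0 just degenerates to argmax over the same probability distribution the model already produced, so the model itself is no less probabilistic.

```python
import math
import random

def sample_token(logits, temperature):
    """Pick a token index from raw logits (hypothetical helper for illustration)."""
    if temperature == 0:
        # "Temperature 0": greedy decoding, always the highest-probability token.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Otherwise: softmax over temperature-scaled logits, then a weighted draw.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.0, 0.5]
# Greedy decoding is deterministic: index 0 every time...
print(sample_token(logits, 0))
# ...but the model's output is still the probability distribution itself;
# greedy decoding only changes how we read it off.
```

Note that the distribution over tokens is computed either way; temperature only changes the selection rule applied to it.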

As for the rest of your post, you are once again, consciously or not, trying to say 'give me a calculable function that isn't a calculable function.' I obviously agree that the idea of trying to 'calculate' consciousness is essentially a non-starter. That's precisely the point.