Comment by Lexiesst

2 years ago

An LLM with temperature 0 (greedy decoding) will always return the same output for the same input.

Considering this, LLMs are Markov chains; it's just that the entire output sequence can be treated as a single element, and the entire context can be treated as the "one previous element."

So the whole block of input text is the previous element, and the whole block of output text is the next element.
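A minimal sketch of that view: treat the whole context string as the Markov state, and one transition as "append the greedily chosen next token." The `greedy_next_token` function below is a hypothetical, toy stand-in for a real model's argmax over logits, only meant to show that a deterministic next-step function makes the chain's next state depend solely on the current state.

```python
def greedy_next_token(context: str) -> str:
    """Toy stand-in for temperature-0 decoding: a deterministic
    function from the full context to a single next token."""
    vocab = ["the", "cat", "sat", "on", "mat"]
    # Stable, deterministic "score" in place of real logits.
    score = sum(ord(c) for c in context)
    return vocab[score % len(vocab)]

def step(state: str) -> str:
    """One Markov transition: the whole context is the state,
    and the next state is the context extended by one token."""
    return state + " " + greedy_next_token(state)

# Same state always leads to the same next state, so the process
# is a (deterministic) Markov chain over context strings.
s1 = step("hello world")
s2 = step("hello world")
assert s1 == s2
```

Under this framing, "temperature 0" just means every transition probability is concentrated on a single successor state, which is the degenerate case of a Markov chain.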

Isn't it?