
Comment by Deklomalo

4 hours ago

You state a lot of things without testing them first.

An LLM has structures in its latent space that allow it to do basic math, and it has seen enough data that it probably also has structures for detecting basic trends.

An LLM doesn't just generate a stream of tokens. It generates an embedding, searches/does something in its latent space, and then returns tokens.
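
To be concrete about what I mean, here is a toy sketch of one autoregressive step: embed the context, compute in latent space, project back to tokens. This is my own illustration with made-up tiny weights, not any real model's internals.

    import numpy as np

    rng = np.random.default_rng(0)
    VOCAB, DIM = 50, 16                      # hypothetical tiny vocabulary / hidden width
    W_embed = rng.normal(size=(VOCAB, DIM))  # token id -> latent vector
    W_mix   = rng.normal(size=(DIM, DIM))    # stand-in for the transformer layers
    W_out   = rng.normal(size=(DIM, VOCAB))  # latent vector -> next-token logits

    def next_token(context):
        h = W_embed[context].mean(axis=0)    # crude "embedding" of the whole context
        h = np.tanh(h @ W_mix)               # the work happens here, in latent space
        logits = h @ W_out
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        return int(rng.choice(VOCAB, p=probs))  # only now does a token come out

    tokens = [1, 2, 3]
    for _ in range(5):
        tokens.append(next_token(tokens))
    print(tokens)

The point is that the token stream is the output of that latent computation, not the computation itself.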

And you don't even know what LLM interfaces do in the background. Gemini creates sub-agents; there could easily already be a 'trend detector' in there.

I even did a test: I generated random data with a trend and fed it to ChatGPT. The output was coherent and correct.
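
Roughly the kind of test I mean (the exact slope and noise level I used aren't the point, these numbers are just for illustration):

    import numpy as np

    rng = np.random.default_rng(42)
    x = np.arange(100)
    y = 0.5 * x + rng.normal(scale=5.0, size=x.size)  # linear trend + noise

    # Paste something like this into the chat and ask whether there's a trend:
    prompt = "Here are 100 data points, is there a trend?\n" + ", ".join(f"{v:.2f}" for v in y)
    print(prompt[:200])

    # Sanity check of what the model should recover: least-squares slope ~ 0.5
    slope = np.polyfit(x, y, 1)[0]
    print("fitted slope:", round(slope, 3))

It correctly identified the upward trend in my run, which is exactly the kind of thing you'd want to check before claiming it can't.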