Comment by stickynotememo

20 days ago

> Grok based transformer

Is Grok not an LLM? Or do they have other models under that brand?

> > Grok based transformer

> Is Grok not an LLM?

The transformer is the underlying architecture for (most) LLMs (GPT stands for “Generative Pre-trained Transformer”).
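
For anyone curious what “transformer” means concretely, here's a minimal sketch of a single transformer block, the repeated building unit inside GPT-style LLMs. This assumes PyTorch, and the dimensions are purely illustrative, not taken from Grok or any real model:

```python
# Minimal sketch of one transformer block (assumes PyTorch; sizes are illustrative).
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x):
        # Self-attention with a residual connection
        attn_out, _ = self.attn(x, x, x, need_weights=False)
        x = self.norm1(x + attn_out)
        # Position-wise feed-forward with a residual connection
        return self.norm2(x + self.mlp(x))

# A GPT-style LLM is, roughly, a token embedding, a stack of blocks like this,
# and a final projection back to the vocabulary.
x = torch.randn(1, 16, 512)          # (batch, sequence length, model dim)
print(TransformerBlock()(x).shape)   # torch.Size([1, 16, 512])
```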