Comment by dragonwriter
20 days ago
> > Grok based transformer
> Is Grok not an LLM?
The transformer is the underlying architecture of (most) LLMs (GPT stands for “Generative Pre-Trained Transformer”).
Right. How is this one based on Grok?