Comment by dragonwriter
1 month ago
> > Grok based transformer
> Is Grok not an LLM?
Transformer is the underlying technology for (most) LLMs (GPT stands for “Generative Pre-Trained Transformer”)
1 month ago
> > > Grok based transformer
> > Is Grok not an LLM?
> Transformer is the underlying technology for (most) LLMs (GPT stands for “Generative Pre-Trained Transformer”)
Right. How is this one based on Grok?