Comment by dragonwriter
7 hours ago
> > Grok based transformer
> Is Grok not an LLM?
The transformer is the underlying technology for (most) LLMs (GPT stands for “Generative Pre-Trained Transformer”).
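
For anyone curious what that underlying architecture boils down to, here is a minimal sketch of the scaled dot-product self-attention step at the heart of a transformer; the weights and dimensions below are purely illustrative and not taken from Grok, GPT, or any particular model:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Each token's embedding is projected to a query, key, and value;
    # attention weights let every token mix in information from every other token.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

# Toy example: 4 tokens, model dimension 8, random weights (illustration only).
rng = np.random.default_rng(0)
d = 8
X = rng.standard_normal((4, d))                       # token embeddings
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)            # -> (4, 8)
```

A full transformer stacks many such attention layers (plus feed-forward layers and normalization), and an LLM like GPT is that stack pre-trained on text to predict the next token.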