Comment by NetRunnerSu
10 days ago
In fact, what you need is a dynamic, sparse, highly fragmented Transformer MoE (many small, live experts), not an RNN-style design that is destined to fall behind...
Neurobiologically, the Transformer architecture is a better match for the highly interconnected, global receptive field of real neurons.
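To make "dynamic sparse MoE" concrete, here is a minimal sketch of a top-k routed mixture-of-experts layer: each token's router picks only k of the experts, so compute is sparse and the routing changes per input. All names here (SparseMoE, router, the dimensions) are illustrative, not from any specific library or the comment itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Illustrative top-k routed mixture-of-experts layer."""
    def __init__(self, d_model=64, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                             # x: (tokens, d_model)
        logits = self.router(x)                       # (tokens, n_experts)
        weights, idx = logits.topk(self.k, dim=-1)    # keep k experts per token
        weights = F.softmax(weights, dim=-1)          # renormalize over the k picks
        out = torch.zeros_like(x)
        for slot in range(self.k):                    # dispatch each token to its chosen experts
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

tokens = torch.randn(16, 64)
print(SparseMoE()(tokens).shape)  # torch.Size([16, 64])
```

The "dynamic sparse" property the comment points at is the per-token top-k routing: only k of n_experts run for any given token, unlike a dense feed-forward block.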