Comment by joaogui1
1 day ago
Mixture of Experts isn't about using multiple models with different specialties; it's more of a sparsity technique, where you massively increase the number of parameters but use only a subset of the weights in each forward pass. A rough sketch of the routing idea is below.
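A minimal sketch of what that looks like, assuming a standard top-k routed MoE layer written in PyTorch. All names, sizes, and the specific routing scheme here are illustrative, not taken from any particular model:

```python
# Illustrative top-k routed Mixture-of-Experts layer (names/sizes are made up).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_hidden=2048, n_experts=8, top_k=2):
        super().__init__()
        # Many experts -> the total parameter count grows with n_experts...
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])
        self.router = nn.Linear(d_model, n_experts)  # learned gating network
        self.top_k = top_k

    def forward(self, x):  # x: (n_tokens, d_model)
        logits = self.router(x)                         # (n_tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # pick k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # ...but each token is only processed by its top-k experts, so only a
        # small subset of the weights is actually used per forward pass.
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(x[token_ids])
        return out

# Usage: every token goes through only 2 of the 8 expert MLPs.
y = MoELayer()(torch.randn(4, 512))
```

With top_k=2 and 8 experts, the layer holds roughly 4x the expert parameters it actually applies to any given token, which is the sparsity trade-off being described.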