Comment by lagrange77
2 days ago
> But with Attention mechanism
I would think LeCun was aware of that. Also, prior sequence-to-sequence models such as RNNs already incorporated information about the more distant past.