Comment by lagrange77
1 day ago
> But with Attention mechanism
I would think LeCun was aware of that. Also, prior sequence-to-sequence models such as RNNs already incorporated information about the more distant past.