Comment by sailingparrot
4 days ago
Just for training and for processing the existing context (the prefill phase). But when doing inference, token t has to be sampled before token t+1 can be, so it's still sequential.
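The distinction can be sketched with a toy example (hypothetical names, and a deterministic stand-in for a real model's forward pass): prefill scores every known position in one batched call, while decoding must loop because each new token feeds the next step.

```python
# Toy illustration of prefill vs. autoregressive decoding.
# toy_logits is a hypothetical stand-in for a transformer forward pass:
# it returns one deterministic "logit" per input position.

def toy_logits(tokens):
    return [(t * 31 + len(tokens)) % 7 for t in tokens]

def prefill(context):
    # Training / prefill: all context tokens are already known,
    # so every position is computed in a single parallel call.
    return toy_logits(context)

def decode(context, n_new):
    # Inference: token t must exist before the logits for t+1 can be
    # computed, so generation is inherently sequential.
    tokens = list(context)
    for _ in range(n_new):
        next_token = toy_logits(tokens)[-1]  # depends on all prior tokens
        tokens.append(next_token)
    return tokens[len(context):]
```

A real decoder amortizes the repeated work with a KV cache, but the data dependency between steps remains.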