Comment by sailingparrot
2 months ago
Just for training and for processing the existing context (the prefill phase). At inference time, token t has to be sampled before token t+1 can be, so decoding is still sequential.
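A minimal sketch of that sequential dependency, using a hypothetical toy scoring function in place of a real transformer forward pass (all names here are made up for illustration):

```python
import random

random.seed(0)

VOCAB = ["the", "cat", "sat", "on", "mat", "<eos>"]

def toy_logits(context):
    # Hypothetical stand-in for a model forward pass. During
    # training/prefill a real model scores every position of
    # `context` in one parallel pass; at decode time we only
    # need the distribution for the next token.
    return [random.random() for _ in VOCAB]

def sample(logits):
    # Greedy sampling: pick the highest-scoring token.
    return VOCAB[logits.index(max(logits))]

def generate(prompt, max_new_tokens=5):
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        # Token t+1 cannot be sampled until token t exists:
        # each loop iteration depends on the previous output,
        # which is why decoding cannot be parallelized this way.
        nxt = sample(toy_logits(tokens))
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return tokens

out = generate(["the", "cat"])
print(out)
```

The loop is the point: unlike prefill, where all context positions are processed at once, each generated token must be fed back in before the next one can be sampled.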