Comment by maest
6 years ago
I'm not at all familiar with arithmetic encoding (or the adaptive version thereof), but, after reading some guides, it seems to me that the novel thing here is using GPT2 to generate a character probability distribution?
The theory being that GPT2 should have a distribution closely matching "reality" and thus minimizing the output size?
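That's the gist: an arithmetic coder works with whatever probability model you hand it, and the better the model predicts the next symbol, the fewer bits the output needs. A toy Python sketch of the idea, using exact `Fraction` arithmetic and a fixed three-symbol distribution as a stand-in for GPT2 (a real coder would renormalize and emit bits incrementally, and would query the model for P(next symbol | context) at each step):

```python
from fractions import Fraction

# Stand-in for GPT2: a fixed next-character distribution.
# A real implementation would return P(next char | context) from the model.
def model_probs(context):
    return {"a": Fraction(7, 10), "b": Fraction(2, 10), "c": Fraction(1, 10)}

def encode(text):
    """Narrow the interval [low, high) by each symbol's probability slice."""
    low, high = Fraction(0), Fraction(1)
    for i, ch in enumerate(text):
        probs = model_probs(text[:i])
        cum, span = Fraction(0), high - low
        for sym in sorted(probs):
            p = probs[sym]
            if sym == ch:
                high = low + span * (cum + p)
                low = low + span * cum
                break
            cum += p
    return (low + high) / 2  # any number inside the final interval decodes back

def decode(code, length):
    """Replay the same interval narrowing, picking the slice containing code."""
    out = []
    low, high = Fraction(0), Fraction(1)
    for _ in range(length):
        probs = model_probs("".join(out))
        cum, span = Fraction(0), high - low
        for sym in sorted(probs):
            p = probs[sym]
            if low + span * cum <= code < low + span * (cum + p):
                out.append(sym)
                high = low + span * (cum + p)
                low = low + span * cum
                break
            cum += p
    return "".join(out)
```

High-probability symbols shrink the interval less, so a sequence the model expects ends up in a wide final interval that takes few bits to pin down; that's why a model close to the true distribution minimizes output size.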