Comment by anarmorarm
2 days ago
Edit: GPT-2, not GPT-2 Medium. The 2nd paragraph should read:
"With 23.8 PPL on WikiText-103, WaveletLM beats both GPT-2, which was trained on 80× more data, and Transformer-XL Standard, which..."