Comment by embedding-shape
7 hours ago
Alternatively, get them straight from Wikimedia; those are the dumps I'm using. They're trivial to parse concurrently and in an easy format too: multistream XML inside bz2. The latest text-only dump is from 2026-01-01 and weighs 24.1 GB. https://dumps.wikimedia.org/enwiki/20260101/ There are also splits together with indexes, so you can grab just the few sections you want if 24 GB is too large.
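The multistream trick is that the dump is many independent bz2 streams concatenated into one file, and the index file records each stream's byte offset, so you can seek to one stream and decompress it in isolation (which is also what makes concurrent parsing easy). A minimal sketch of that idea, using a tiny synthetic in-memory "multistream" instead of a real enwiki dump (the titles, page XML, and offset list here are made up for illustration):

```python
import bz2
import xml.etree.ElementTree as ET

# Build a tiny in-memory "multistream": several independent bz2 streams
# concatenated back to back, mimicking pages-articles-multistream.xml.bz2.
streams = []
offsets = []   # the real index file maps "offset:page_id:title" lines
pos = 0
for title in ["Alpha", "Beta", "Gamma"]:
    xml_block = (
        f"<page><title>{title}</title>"
        f"<revision><text>Text of {title}</text></revision></page>"
    )
    blob = bz2.compress(xml_block.encode())
    offsets.append(pos)
    streams.append(blob)
    pos += len(blob)
multistream = b"".join(streams)

def read_stream(data: bytes, offset: int) -> str:
    """Decompress the single bz2 stream starting at `offset`.

    BZ2Decompressor stops at the end of its stream; bytes belonging to
    the following streams land in .unused_data and are ignored here.
    """
    decomp = bz2.BZ2Decompressor()
    return decomp.decompress(data[offset:]).decode()

# Jump straight to the second stream without touching the others.
xml_text = read_stream(multistream, offsets[1])
page = ET.fromstring(xml_text)
print(page.findtext("title"))  # → Beta
```

Against the real dump you'd read offsets from the companion `-index.txt.bz2` file and hand each (offset, next_offset) chunk to a worker process, since every stream decompresses independently.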