Comment by sli

8 days ago

I've had a lot of fun training Markov chains using Simple English Wikipedia. I'm guessing the restricted vocabulary leads to more overlapping phrases in the training data, which gives the chain more places to branch. Anything too advanced or technical has too many unique phrases, and the output degrades almost immediately.
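
For anyone curious what I mean by overlapping phrases: here's a minimal word-level sketch in Python (the toy corpus and function names are just placeholders, not how I actually trained mine). Each `order`-word state maps to the words seen after it; when a corpus reuses the same short phrases a lot, more states have multiple followers, so a random walk branches instead of just replaying one article verbatim.

```python
import random
from collections import defaultdict

def train(text, order=2):
    """Map each `order`-word state to the words observed to follow it."""
    words = text.split()
    table = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        table[state].append(words[i + order])
    return table

def generate(table, length=30):
    """Random-walk the chain from a random starting state."""
    state = random.choice(list(table.keys()))
    out = list(state)
    for _ in range(length):
        followers = table.get(state)
        if not followers:  # dead end: this state only appears at the end
            break
        out.append(random.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)

# Tiny toy corpus: states like ("sat", "on") and ("on", "the") have
# multiple followers, which is where the chain can branch.
corpus = "the cat sat on the mat the dog sat on the rug"
print(generate(train(corpus), length=10))
```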