GPT-2’s training data was scraped from the internet, but with a twist: rather than crawling indiscriminately, the researchers only followed outbound Reddit links that had received at least 3 karma, using upvotes as a rough human filter for quality.
— Read on singularityhub.com/2019/03/07/openais-eerily-realistic-new-text-generator-writes-like-a-human/
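For a rough sense of what that filter looks like, here’s a minimal sketch in Python. The `Submission` record and sample data are hypothetical stand-ins for whatever a real Reddit crawl would return; this is not OpenAI’s actual WebText pipeline, just the karma-threshold idea in code.

```python
# Minimal sketch of the WebText-style filter described above: keep outbound
# (non-self) links from Reddit submissions that received at least 3 karma.
# `Submission` and the sample data below are hypothetical stand-ins for a
# real crawl's output, not OpenAI's actual pipeline.
from dataclasses import dataclass

@dataclass
class Submission:
    url: str        # the outbound link the post points to
    score: int      # net upvotes ("karma") the submission received
    is_self: bool   # True for text-only self posts (no outbound link)

def webtext_candidates(submissions, min_karma=3):
    """Yield outbound URLs worth scraping, using karma as a crude quality proxy."""
    for post in submissions:
        if post.is_self:
            continue                 # self posts have no external page to scrape
        if post.score >= min_karma:  # the reported threshold: at least 3 karma
            yield post.url

if __name__ == "__main__":
    sample = [
        Submission("https://example.com/good-article", score=57, is_self=False),
        Submission("https://example.com/ignored", score=1, is_self=False),
        Submission("", score=120, is_self=True),
    ]
    for url in webtext_candidates(sample):
        print(url)
```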