Beginner’s Guide to Retrain GPT-2 (117M) to Generate Custom Text Content

Ng Wai Foong
May 13, 2019
Image taken from https://openai.com/blog/better-language-models/

In this article, we will explore the steps required to retrain GPT-2 (117M) on a custom text dataset on Windows. To start, GPT-2 is a large transformer-based language model trained to generate synthetic text samples conditioned on a variety of user prompts. Check out the official blog post to find out more about GPT-2: