Hands-on with OpenAI’s famous GPT-2 deep fake text AI

Mark Farragher
5 min read · Mar 19, 2019

By now you’ve probably read about OpenAI’s famous GPT-2 neural network, which can produce utterly convincing ‘deep fake’ articles, stories, and reviews on virtually any subject.

I’ve always wanted to get my hands on an AI that can produce realistic text, so I couldn’t wait to take GPT-2 for a spin and run a few tests of my own.

There are four versions of GPT-2, ranging from 117 million to 1.5 billion parameters:

The full version that OpenAI is keeping under wraps is a monster of a neural network with 1.5 billion trained parameters and 48 layers.

What they’ve released is the smallest model with 117 million parameters and 12 neural network layers.

This small model is still very good, achieving state-of-the-art results on the LAMBADA, CBT-CN, CBT-NE, and WikiText-2 benchmarks.
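For a sense of scale: the released checkpoint carries its architecture in a small hparams.json file inside its model directory. For the 117M model that file looks roughly like this (values quoted from memory, so treat the exact numbers as an assumption):

{
  "n_vocab": 50257,
  "n_ctx": 1024,
  "n_embd": 768,
  "n_head": 12,
  "n_layer": 12
}

That’s where the 12 layers (n_layer) come from, and together with the 768-dimensional embeddings and the attention weights it all adds up to those 117 million parameters.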

So let’s download the AI and have some fun!

GPT-2 runs on Python 3, so the first thing I need to do is install it on my Mac using Homebrew:

$ brew install python3
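A quick sanity check confirms that the interpreter and pip are both on the path (the exact version reported will depend on what Homebrew ships at the time):

$ python3 --version
$ pip3 --version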

Next up is downloading the GPT-2 code from GitHub: https://github.com/openai/gpt-2
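Cloning the repository with git is the easiest way to get the code (this assumes git is already installed):

$ git clone https://github.com/openai/gpt-2.git
$ cd gpt-2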

And installing the package dependencies:

$ pip3 install fire
$ pip3 install regex
$ pip3 install requests
$ pip3 install tqdm
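The repo also leans on TensorFlow, which the README at the time pinned to version 1.12:

$ pip3 install tensorflow==1.12.0

With the dependencies in place, two helper scripts in the repository do the heavy lifting: one downloads the model weights, the other generates unconditional text samples. The download script has changed name between repo revisions (download_model.sh in early versions, download_model.py later), so take this as a sketch rather than gospel:

$ python3 download_model.py 117M
$ python3 src/generate_unconditional_samples.py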
