Bonus: I dare not meet the emperor
GPT-2 is great at completing lists. I’ve compiled several more here.
They’re a mix of outputs from the tiniest version of GPT-2 (117M) and the second-tiniest version (345M). The neural net was pretrained by OpenAI on a huge selection of webpages, and then I used a Colaboratory notebook that Max Woolf set up to run the prompts.
Prompts are in bold.
…