Bonus: A neural Gothic
I’ve been having fun generating stuff using the 117-million-parameter GPT-2 model from OpenAI. I’ve written about it before - basically, it’s a neural net-powered text generator that learned from internet text, and it learned a HUGE variety of text.
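If you want to generate your own, here’s a minimal sketch of the idea - this assumes the Hugging Face transformers library (which hosts the small GPT-2 checkpoint under the name "gpt2"), and the prompt and sampling parameters are just illustrative, not the exact settings used here:

```python
# Minimal sketch: sampling text from the small GPT-2 model.
# Assumes the Hugging Face transformers library; "gpt2" is the
# small checkpoint (the one OpenAI originally described as 117M).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical prompt - any opening line works
prompt = "It was a dark and stormy night"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; top_k and temperature control how
# predictable vs. weird the output gets
outputs = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_k=40,
    temperature=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Run it a few times and you’ll get a different continuation each pass, since the model samples rather than picking the single most likely next word.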
And apparently, that included lots of fiction, including public-domain novels from the 1800s and earlier. With …