This work is a collaboration with GPT-2, a neural network model designed to predict the next word in a given block of text, based on its study of eight million web pages. In this application, I fed a text file of my own prose from the past twenty years into GPT-2. It then generated new writing in a similar style. I selected, arranged, and lightly edited the resulting output.
Web browser | 01.10.2020
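For readers curious about the mechanics, the workflow described above might be sketched roughly as follows. This is a minimal sketch, assuming the gpt-2-simple library, the small 124M GPT-2 checkpoint, and a hypothetical corpus file named "prose.txt"; the actual toolchain used for this piece may have differed.

```python
# Sketch of fine-tuning GPT-2 on a personal corpus and sampling new text.
# Assumptions: gpt-2-simple is installed, "prose.txt" is a plain-text file
# of the author's own writing (hypothetical filename), and the 124M model
# is used. This is illustrative, not the piece's actual code.
import gpt_2_simple as gpt2

model_name = "124M"  # smallest public GPT-2 checkpoint
gpt2.download_gpt2(model_name=model_name)

sess = gpt2.start_tf_sess()

# Fine-tune the pretrained model on the personal corpus so that
# generated text picks up its style and vocabulary.
gpt2.finetune(sess,
              dataset="prose.txt",
              model_name=model_name,
              steps=1000)

# Sample new passages in the learned style; a human then selects,
# arranges, and lightly edits the output.
gpt2.generate(sess, length=200, temperature=0.9, nsamples=5)
```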