FUN WITH RECURRENT NEURAL NETWORKS:

Fake pop cultural catchphrases that demonstrate the limits of characterwise text generation

(If you aren't already familiar with recurrent neural networks, why not see Andrej Karpathy's excellent blog?)

These days, thanks to The Wonders of Science[TM], we can train neural networks to imitate different styles of text by showing them some examples. Often the results are gibberish, but occasionally in this gibberish there is a nugget of... less gibberish. There are many fine Python libraries out there for running RNN experiments: I am using textgenrnn, and fine-tuning its stock model on data of my own whimsical fancy. Here is a selection of the most interesting, perplexing, or otherwise notable outputs.

We've sure had some good times with neural networks, boy howdy. We've seen them learn some interesting patterns, and we've met a lot of sluggles on the way. But all of the examples we've looked at so far have involved relatively short pieces of text to learn from and emulate: titles, colors, words, the true nature of consciousness and its ultimate place in eternity and infinity. These networks have generated text one character at a time, based on the last few. On these short examples, this approach has worked quite well! Many of the results are at least plausible, and for many more you can at least get a sense of what the network was up to. But what are its limits? It can't remember what it was doing too far into the past, characterwise. It has no understanding of semantics. If we use this approach to learn from longer pieces of text with more grammar, like whole sentences, each word or two of the machine's imitation will likely be reasonable on its own, but the longer it runs, the less coherent we should expect it to become.
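The "last few characters" idea is easiest to see in a toy model. The sketch below is not textgenrnn's actual LSTM; it is a bare-bones character-level frequency model (the function names and the tiny context window are my own inventions for illustration). It literally cannot see anything earlier than its last `context` characters, which is the limitation in its most extreme form:

```python
import random
from collections import Counter, defaultdict

def train_char_model(text, context=3):
    """Count which character follows each `context`-length window."""
    model = defaultdict(Counter)
    for i in range(len(text) - context):
        model[text[i:i + context]][text[i + context]] += 1
    return model

def generate(model, seed, length=40, context=3, rng=None):
    """Extend `seed` one character at a time. The model only ever sees
    the last `context` characters -- everything earlier is forgotten."""
    rng = rng or random.Random(0)
    out = seed
    for _ in range(length):
        counts = model.get(out[-context:])
        if not counts:
            break  # never saw this window during training
        chars, weights = zip(*counts.items())
        out += rng.choices(chars, weights=weights)[0]
    return out

model = train_char_model("to infinity and beyond! to infinity and beyond!")
print(generate(model, "to ", length=20))
```

With a window of three characters, each local transition looks fine, but the model has no way to know how the sentence started, so longer outputs drift. A real RNN carries a learned hidden state instead of a fixed window, which helps, but does not make the problem go away.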

To test this hypothesis, I scraped together several hundred pop cultural catchphrases from film and television, and let a network chew on them. Some were only a word or two; others were a few sentences. Sure enough, the network got a little out of its depth. The words are mostly ... words ... but it simply cannot maintain enough context to produce a convincing sentence.

Feel free to adopt these catchphrases as your own. Try them out at parties, with your family, or at a solemn religious observance.

There was one in particular, though, that really spoke to me: