Text Generation for Char LSTM models
https://hackernoon.com/text-generation-for-char-lstm-models-685dc186e319?source=rss----3a8144ea
Train a character-level language model on a corpus of jokes. I decided to experiment with approaches to this problem, which I found on #openai's Request for Research blog. You can have a look at the code here. It is written in PyTorch and heavily inspired by Fast.ai's fantastic lesson on implementing RNNs from scratch.

Data preparation

I started off with the dataset provided by OpenAI. The text was converted to lowercase, and for an initial run I selected the top-rated jokes with a word length of less than 200. Here's an example of all the tokens encountered:

Explicit words ahead! This particular dataset contains explicit words/content, so those come up in the model's output predictions. Another interesting problem to work on would be filtering inappropriate words out of the output (...)
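The preprocessing described above (lowercasing, filtering by word length, and building a character vocabulary) can be sketched roughly as follows. This is a minimal illustration, not the post's exact code: the toy joke list, the `max_words` threshold of 200, and the choice to reserve index 0 for padding are all assumptions.

```python
def build_char_vocab(texts):
    # Collect every distinct character across the corpus.
    # Index 0 is reserved for padding here (an assumption, not
    # necessarily the scheme used in the original post).
    chars = sorted(set("".join(texts)))
    stoi = {ch: i + 1 for i, ch in enumerate(chars)}
    itos = {i: ch for ch, i in stoi.items()}
    return stoi, itos

def prepare(jokes, max_words=200):
    # Lowercase and keep only jokes under the word-length cutoff,
    # mirroring the filtering step described in the post.
    kept = [j.lower() for j in jokes if len(j.split()) < max_words]
    stoi, itos = build_char_vocab(kept)
    # Encode each joke as a list of integer character indices,
    # ready to feed into an embedding layer / char-level RNN.
    encoded = [[stoi[c] for c in j] for j in kept]
    return encoded, stoi, itos

# Toy usage with made-up jokes
jokes = ["Why did the chicken cross the road?",
         "I told my computer a joke. No response."]
encoded, stoi, itos = prepare(jokes)
decoded = "".join(itos[i] for i in encoded[0])
```

From here, the integer sequences would typically be batched into fixed-length windows and fed to an embedding layer followed by an LSTM; the round trip through `itos` is handy for decoding the model's sampled output back into text.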
#programming #machine-learning #artificial-intelligence #python