character-level-rnn

This script implements a multi-layer LSTM for training character-level language models. The model learns to predict the next character in a sequence.
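
As a rough illustration of that next-character setup (the helper name and toy text below are my own, not from the repo), training pairs can be built by sliding a fixed-length window over the text and one-hot encoding each character; the target for each window is the character that immediately follows it:

```python
import numpy as np

def make_dataset(text, seq_len):
    # Map each distinct character to an index for one-hot encoding.
    chars = sorted(set(text))
    char_to_idx = {c: i for i, c in enumerate(chars)}
    n = len(text) - seq_len
    X = np.zeros((n, seq_len, len(chars)), dtype=np.float32)
    y = np.zeros((n, len(chars)), dtype=np.float32)
    for i in range(n):
        # Input: a window of seq_len characters, one-hot per time step.
        for t, c in enumerate(text[i:i + seq_len]):
            X[i, t, char_to_idx[c]] = 1.0
        # Target: the single character that follows the window.
        y[i, char_to_idx[text[i + seq_len]]] = 1.0
    return X, y, chars

X, y, chars = make_dataset("hello world, hello keras", seq_len=5)
```

A real run would use a large corpus (e.g. the Shakespeare text from the original blog post) rather than a toy string, and typically an embedding or integer encoding for memory efficiency; dense one-hot arrays are just the simplest form to show.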

The language model was initially described in this blog post and written in Torch. I have rewritten the model in Keras.

Training

  • Initially, the model was a 2-layer LSTM with 512 hidden units, trained with a batch size of 100 on sequences of 20 characters.
  • After 60 iterations, the model had learned basic English phrases and punctuation, but not Shakespearean prose. I have since changed the model to a 3-layer LSTM with 512 hidden units, a batch size of 100, and sequences of 60 characters.
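
The current architecture can be sketched in Keras roughly as follows (a minimal sketch, not the repo's actual script; the vocabulary size and optimizer are assumptions):

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

SEQ_LEN = 60   # characters per training sequence (from the bullets above)
VOCAB = 65     # assumed number of distinct characters in the corpus

model = Sequential([
    Input(shape=(SEQ_LEN, VOCAB)),       # one-hot character windows
    LSTM(512, return_sequences=True),    # layer 1: pass full sequence onward
    LSTM(512, return_sequences=True),    # layer 2
    LSTM(512),                           # layer 3: final hidden state only
    Dense(VOCAB, activation="softmax"),  # distribution over the next character
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
# model.fit(X, y, batch_size=100) then trains on one-hot (window, next-char) pairs.
```

The first two LSTM layers use `return_sequences=True` so each layer feeds a full sequence to the next; only the last layer collapses to a single vector for the softmax over the next character.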
