# Language-Modelling (Generating English text with an LSTM-RNN)

This model uses a Long Short-Term Memory (LSTM) recurrent neural network to generate complex sequences with long-range structure by predicting one data point at a time.

Here I have trained the model on Sir Arthur Conan Doyle's Sherlock Holmes stories, so it generates text that mimics the author's writing style.
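
Below is a minimal sketch of this kind of character-level LSTM language model, written with Keras. It is an illustration of the general approach, not the exact code in this repository: the framework, the corpus filename `sherlock.txt`, and all hyperparameters (sequence length, layer size, epochs) are assumptions for the example.

```python
# Hypothetical character-level LSTM language model (Keras); the actual
# repository may use a different framework and hyperparameters.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Assumed corpus file containing the Sherlock Holmes text.
text = open("sherlock.txt", encoding="utf-8").read().lower()
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

seq_len, step = 40, 3
# Slice the corpus into fixed-length input sequences, each paired
# with the character that immediately follows it.
X, y = [], []
for i in range(0, len(text) - seq_len, step):
    X.append([char_to_idx[c] for c in text[i:i + seq_len]])
    y.append(char_to_idx[text[i + seq_len]])

X = keras.utils.to_categorical(X, num_classes=len(chars))
y = keras.utils.to_categorical(y, num_classes=len(chars))

# One LSTM layer followed by a softmax over the character vocabulary.
model = keras.Sequential([
    layers.Input(shape=(seq_len, len(chars))),
    layers.LSTM(128),
    layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=20)

# Generate text one character at a time, feeding each prediction back in.
generated = text[:seq_len]
for _ in range(400):
    x = keras.utils.to_categorical(
        [[char_to_idx[c] for c in generated[-seq_len:]]],
        num_classes=len(chars))
    probs = model.predict(x, verbose=0)[0].astype("float64")
    probs /= probs.sum()  # renormalise to guard against float rounding
    generated += chars[np.random.choice(len(chars), p=probs)]
print(generated)
```

Sampling from the softmax distribution (rather than always taking the most likely character) keeps the generated text varied; a temperature parameter is often added at this step to trade off coherence against diversity.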