Recurrent language model implementations using numpy
Implemented models:
- Vanilla RNN LM
- LSTM LM
- Original Seq2Seq
- Seq2Seq with Dot-Product Attention
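As a rough sketch of the first model above, one forward step of a vanilla RNN language model in numpy could look like the following. The function and parameter names here are illustrative assumptions, not the repository's actual code:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def rnn_lm_step(x_t, h_prev, params):
    """One forward step of a vanilla RNN LM.
    x_t: one-hot input vector (vocab,), h_prev: previous hidden state (hidden,)."""
    Wxh, Whh, Why, bh, by = params
    h_t = np.tanh(Wxh @ x_t + Whh @ h_prev + bh)  # recurrent hidden update
    y_t = softmax(Why @ h_t + by)                 # next-token distribution
    return h_t, y_t

# Toy usage with random weights (vocab size 10, hidden size 8)
rng = np.random.default_rng(0)
V, H = 10, 8
params = (rng.normal(0, 0.1, (H, V)),   # Wxh
          rng.normal(0, 0.1, (H, H)),   # Whh
          rng.normal(0, 0.1, (V, H)),   # Why
          np.zeros(H), np.zeros(V))     # bh, by
x = np.eye(V)[3]                        # one-hot for token id 3
h, y = rnn_lm_step(x, np.zeros(H), params)
```

The LSTM and attention variants follow the same pattern, with the `tanh` cell replaced by gated updates and, for attention, a softmax over dot products between decoder and encoder states.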
References:
- Long Short-Term Memory, Sepp Hochreiter and Jürgen Schmidhuber.
- Sequence to Sequence Learning with Neural Networks, Sutskever et al.
- Neural Machine Translation by Jointly Learning to Align and Translate, Bahdanau et al.
- Effective Approaches to Attention-based Neural Machine Translation, Luong et al.
- This repository is developed and maintained by Yonghee Cheon ([email protected]).
- Repository: https://github.com/yonghee12/word2vec
- LinkedIn profile: https://www.linkedin.com/in/yonghee-cheon-7b90b116a/