Text generation with n-gram RNN and Transformer language models.
Just head into Text_Generation.ipynb to follow along.
Summary of this notebook (each part is sketched briefly after the list):
- Tokenization
  - Tokenize with BPE (from scratch)
  - Tokenize with Hazm (pretrained)
- N-gram RNN Language Model
- Transformer Language Model (TLM)
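
Below is a minimal sketch of what BPE training from scratch looks like, in the spirit of the classic merge-the-most-frequent-pair algorithm. The helper names (`get_pair_stats`, `merge_pair`, `train_bpe`) are illustrative, not the notebook's exact functions.

```python
# Minimal BPE training sketch: repeatedly merge the most frequent adjacent pair.
import re
from collections import Counter


def get_pair_stats(vocab):
    """Count how often each adjacent symbol pair occurs across the vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs


def merge_pair(pair, vocab):
    """Merge every occurrence of `pair` into a single symbol."""
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    merged = "".join(pair)
    return {pattern.sub(merged, word): freq for word, freq in vocab.items()}


def train_bpe(corpus, num_merges=10):
    """Learn `num_merges` merge rules from a whitespace-split corpus."""
    # Represent each word as space-separated characters plus an end-of-word marker.
    vocab = Counter(" ".join(list(w)) + " </w>" for w in corpus.split())
    merges = []
    for _ in range(num_merges):
        stats = get_pair_stats(vocab)
        if not stats:
            break
        best = max(stats, key=stats.get)
        vocab = merge_pair(best, vocab)
        merges.append(best)
    return merges


if __name__ == "__main__":
    merges = train_bpe("low lower lowest newer newest", num_merges=8)
    print(merges)  # learned merge rules, most frequent pairs first
```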
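
For the pretrained route, Hazm ships a normalizer and a word tokenizer for Persian text. A small usage sketch (assuming `pip install hazm`) could look like the following; the notebook may combine different Hazm components.

```python
# Normalize Persian text with Hazm, then split it into word-level tokens.
from hazm import Normalizer, word_tokenize

normalizer = Normalizer()

text = "این یک متن نمونه برای توکن‌سازی است."   # "This is a sample text for tokenization."
normalized = normalizer.normalize(text)        # unify characters, spacing, etc.
tokens = word_tokenize(normalized)             # word-level tokens
print(tokens)
```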
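
The n-gram RNN language model reads a fixed-length window of previous tokens and predicts the next one. Here is a compact PyTorch sketch with illustrative names and hyperparameters (`NGramRNNLM`, a GRU encoder); it is not the notebook's exact architecture.

```python
# Next-token prediction from a fixed-length (n-gram style) context with a GRU.
import torch
import torch.nn as nn


class NGramRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context):
        # context: (batch, n) token ids for the previous n tokens
        emb = self.embedding(context)          # (batch, n, embed_dim)
        _, hidden = self.rnn(emb)              # hidden: (1, batch, hidden_dim)
        return self.fc(hidden[-1])             # (batch, vocab_size) logits


# Toy usage: predict the next token id from a 3-token context.
vocab_size = 1000
model = NGramRNNLM(vocab_size)
context = torch.randint(0, vocab_size, (4, 3))   # batch of 4 trigram contexts
target = torch.randint(0, vocab_size, (4,))      # next-token ids
loss = nn.functional.cross_entropy(model(context), target)
loss.backward()
```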
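
Finally, a causal Transformer language model applies self-attention with a mask so each position only attends to earlier tokens. The sketch below builds one from `torch.nn.TransformerEncoder` layers with a manual causal mask and learned positional embeddings; sizes and training details are placeholders, not the notebook's TLM.

```python
# Decoder-only (causal) Transformer LM built from TransformerEncoder layers.
import torch
import torch.nn as nn


class TransformerLM(nn.Module):
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2, max_len=512):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)   # learned positional embeddings
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) token ids
        batch, seq_len = tokens.shape
        positions = torch.arange(seq_len, device=tokens.device)
        x = self.token_emb(tokens) + self.pos_emb(positions)
        # Causal mask: -inf above the diagonal, so position i only sees positions <= i.
        mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=tokens.device), diagonal=1
        )
        x = self.encoder(x, mask=mask)
        return self.lm_head(x)                          # (batch, seq_len, vocab_size) logits


# Toy usage: next-token loss over shifted targets.
vocab_size = 1000
model = TransformerLM(vocab_size)
tokens = torch.randint(0, vocab_size, (2, 16))
logits = model(tokens[:, :-1])
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1)
)
loss.backward()
```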