Notebook with multiple attention techniques (Bahdanau and Luong) with different alignment models.
For local attention, followed this repo: https://github.com/uzaymacar/attention-mechanisms
Took help from these papers:
for Luong: https://arxiv.org/abs/1508.04025
for Bahdanau: https://arxiv.org/abs/1409.0473
---> Attention architectures used: Bahdanau and Luong
---> Alignment model approaches used: dot, general, concat
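The three alignment approaches above can be sketched as plain score functions, following the formulas in the Luong (2015) and Bahdanau (2014) papers. This is a minimal NumPy sketch, not the notebook's actual code; the function names, vector shapes, and randomly initialized weights (`W_a`, `v_a`) are illustrative assumptions.

```python
import numpy as np

def score_dot(h_t, h_s):
    # Luong "dot": score(h_t, h_s) = h_t . h_s
    return h_t @ h_s

def score_general(h_t, h_s, W_a):
    # Luong "general": score(h_t, h_s) = h_t^T W_a h_s
    # W_a is a learned (d, d) matrix (random here, for illustration)
    return h_t @ W_a @ h_s

def score_concat(h_t, h_s, W_a, v_a):
    # "concat" (Bahdanau-style): score = v_a^T tanh(W_a [h_t; h_s])
    # W_a is (d, 2d), v_a is (d,), both learned in practice
    return v_a @ np.tanh(W_a @ np.concatenate([h_t, h_s]))

def attention_weights(scores):
    # Softmax over source positions turns scores into alignment weights
    e = np.exp(scores - np.max(scores))
    return e / e.sum()
```

With `W_a` set to the identity matrix, the "general" score reduces to the "dot" score, which is one way to sanity-check an implementation.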