Releases: lucidrains/x-transformers
0.8.2
add an assert for relative positional keyword arguments
0.8.1
fix residual gating
0.8.0
add gating at residuals, from DeepMind's paper for stabilizing txl for…
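The gating referenced here comes from DeepMind's GTrXL work ("Stabilizing Transformers for Reinforcement Learning", Parisotto et al.), which replaces the plain residual sum with a GRU-style gate initialized near the identity. A minimal NumPy sketch of that gate, assuming `x` is the skip-connection stream, `y` is the sublayer output, and the weight matrices are hypothetical parameters (not the library's actual variable names):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gated_residual(x, y, Wr, Ur, Wz, Uz, Wg, Ug, bg=2.0):
    # GRU-style residual gate (GTrXL): combines skip stream x and
    # sublayer output y instead of simply adding them.
    r = sigmoid(y @ Wr + x @ Ur)            # reset gate
    z = sigmoid(y @ Wz + x @ Uz - bg)       # update gate; bias bg pushes z
                                            # toward 0 at init (near-identity)
    h = np.tanh(y @ Wg + (r * x) @ Ug)      # candidate state
    return (1.0 - z) * x + z * h            # interpolate between skip and candidate
```

With a large gate bias `bg`, `z` is close to 0 and the block behaves like a pure skip connection at initialization, which is the stabilizing property the paper relies on.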
0.7.4
fix bug with prenorm, introduced when adding residual attention
0.7.3
allow floats for ff_mult
0.7.2
fix bug with default empty memories for txl
0.7.1
fix some more issues with T5 rel pos bias, thanks to @adrian-spataru
0.7.0
untie embedding, after learning T5 switched to untied classifier weig…
0.6.7
bump bug fix release
0.6.6
avoid potential issue with PAR and transformer-xl recurrence