
Architecture differences: Keras LSTM vs. LSTM_bidi.py #13

Open
memphisbarbecue opened this issue Dec 17, 2021 · 0 comments

memphisbarbecue commented Dec 17, 2021

I've been reviewing the bidirectional LSTM used in this example because I'd like to apply the LRP technique to my own model. However, I noticed that the weights and biases in a Keras LSTM are organized differently from the ones used here, and I'd like to verify that my observations are correct.

  1. In this example, separate weight matrices are applied to the outputs of the left and right LSTMs. Is that the implementation of the Dense layer?

  2. In this example, there are separate bias arrays for the h_Left/h_Right (recurrent) terms and the x_Left/x_Right (input) terms. Keras provides only a single bias array for both, so I was wondering whether different LSTM architectures are being followed (see the sketch below). I also noticed that alewarne (at https://github.com/alewarne/Layerwise-Relevance-Propagation-for-LSTMs/blob/master/lstm_network.py) implemented this code with a single bias array.
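
For concreteness, here is a small NumPy sketch of the two equivalences I have in mind; the shapes and variable names are my own illustrations, not identifiers from either codebase:

```python
import numpy as np

# Minimal NumPy sketch of the two points above. All shapes and variable
# names are illustrative assumptions, not taken from LSTM_bidi.py or Keras.
rng = np.random.default_rng(0)
d, c = 8, 3                          # hidden size per direction, number of classes

# (1) A Dense layer on the concatenated left/right outputs is the same
# as two per-direction weight matrices applied separately and summed:
h_left = rng.normal(size=d)
h_right = rng.normal(size=d)
W = rng.normal(size=(c, 2 * d))      # Dense kernel, viewed as (classes, 2d)
W_left, W_right = W[:, :d], W[:, d:]
dense_out = W @ np.concatenate([h_left, h_right])
split_out = W_left @ h_left + W_right @ h_right
assert np.allclose(dense_out, split_out)

# (2) The gate pre-activation only ever sees the *sum* of the input and
# recurrent biases, so a single Keras bias can be split arbitrarily into
# two (e.g. b_x = bias, b_h = 0) without changing the forward pass:
bias_keras = rng.normal(size=4 * d)  # Keras gate order: i, f, c, o
b_x, b_h = bias_keras.copy(), np.zeros(4 * d)
assert np.allclose(b_x + b_h, bias_keras)
```

If point 2 holds, the two conventions should be interchangeable: PyTorch's nn.LSTM, for example, keeps two bias vectors (bias_ih and bias_hh) that only enter the gates through their sum, while Keras fuses them into a single array.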

Thanks for providing this reference code - I appreciate it.
