I've been reviewing the bidirectional LSTM used in this example because I'd like to apply the LRP technique to my own model. However, I noticed that the weights and biases in a Keras LSTM are organized differently from the ones here, and I would like to verify that my observations are correct.
In this example, there are weights applied to the outputs of the Left and Right LSTMs. Are those weights the implementation of the final Dense layer?
In this example, there are separate biases for the h_Left/h_Right arrays and the x_Left/x_Right arrays. Keras provides only a single bias array covering both, so I was wondering whether different LSTM architectures are being followed. I also noticed that alewarne (at https://github.com/alewarne/Layerwise-Relevance-Propagation-for-LSTMs/blob/master/lstm_network.py) implemented this code with a single bias array.
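For what it's worth, my current understanding is that the two conventions should be mathematically equivalent: since the gate pre-activation is x·W + b_x + h·U + b_h, using a single bias b = b_x + b_h (as Keras does) gives the same result. Here's a minimal numpy sketch of what I mean, for a single gate, with made-up sizes and variable names:

```python
import numpy as np

# Hypothetical sizes, for illustration only.
n_in, n_hidden = 4, 3
rng = np.random.default_rng(0)

x = rng.standard_normal(n_in)        # input at one timestep
h = rng.standard_normal(n_hidden)    # previous hidden state

# One gate's worth of parameters (a full LSTM has four such sets).
W = rng.standard_normal((n_in, n_hidden))      # input weights
U = rng.standard_normal((n_hidden, n_hidden))  # recurrent weights

# Two-bias convention, as in this example: separate b_x and b_h.
b_x = rng.standard_normal(n_hidden)
b_h = rng.standard_normal(n_hidden)
pre_two_bias = x @ W + b_x + h @ U + b_h

# Single-bias convention, as in Keras: one bias equal to the sum.
b = b_x + b_h
pre_one_bias = x @ W + h @ U + b

# The gate pre-activations match, so downstream gate values match too.
assert np.allclose(pre_two_bias, pre_one_bias)
```

So if I'm reading this right, one could port Keras weights into the two-bias form by setting, say, b_x to the Keras bias and b_h to zeros. Please correct me if I've misunderstood the architecture here.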
Thanks for providing this reference code - I appreciate it.