
attention weights should be reordered as well as the outputs in case inputs were sorted and packed when doing batch inference #25

Open
levhaikin opened this issue Nov 6, 2019 · 1 comment

Comments

@levhaikin

```python
if reorder_output:
```

During batched inference, if reorder_output is True (i.e. input_seqs is not a PackedSequence), the outputs are correctly reordered to match the original input order.

However, if return_attention is requested (set to True), the attention weights are returned in an order that does not match the inputs; that's the bug.

A possible fix could go in the following section of the code:

```python
if reorder_output:
```
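(For context, the output reordering under that branch looks roughly like this; a sketch based on the same perm_idx idiom, not a verbatim quote of the repo:)

```python
# existing behaviour (sketch): scatter the LSTM outputs back to the
# original input order using the permutation from the earlier sort
reordered = Variable(outputs.data.new(outputs.size()))
reordered[perm_idx] = outputs
outputs = reordered
```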

Similarly to how the outputs are reordered (see the sketch above), we can add the attention reordering under that same if statement:

```python
# allocate a tensor with the same type and shape, then scatter the
# attention rows back to their original (pre-sort) positions
reordered_for_weights = Variable(att_weights.data.new(att_weights.size()))
reordered_for_weights[perm_idx] = att_weights
att_weights = reordered_for_weights
```
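The scatter-assign trick is easy to verify in isolation. A minimal, self-contained check (hypothetical shapes and data; perm_idx stands in for the permutation returned by torch.sort when the batch was sorted by length):

```python
import torch

# three "attention rows" in original input order (hypothetical data)
att = torch.tensor([[0.1, 0.9],
                    [0.5, 0.5],
                    [0.8, 0.2]])

lengths = torch.tensor([1, 3, 2])
_, perm_idx = torch.sort(lengths, descending=True)  # order used for packing
sorted_att = att[perm_idx]                          # what the model computes

# scatter each row back to its pre-sort position
restored = torch.empty_like(sorted_att)
restored[perm_idx] = sorted_att

assert torch.equal(restored, att)  # matches the original input order
```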

Does that make sense?
Am I missing anything?

@levhaikin levhaikin changed the title attention weights should be reordered as well as the outputs in case inputs were sorted and packed attention weights should be reordered as well as the outputs in case inputs were sorted and packed when doing batch inference Nov 6, 2019
@n0obcoder

I added

```python
reordered_for_weights = Variable(att_weights.data.new(att_weights.size()))
reordered_for_weights[perm_idx] = att_weights
att_weights = reordered_for_weights
```

after

```python
if reorder_output:
```

but it doesn't seem to make any change to the ordering of the attention weights.

@levhaikin can you please help me with this, in case you have already solved this bug?

Thanks in advance!
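(If the added lines genuinely run but nothing changes, two things are worth checking: that they are indented under the `if reorder_output:` branch and execute before att_weights is returned, and that reorder_output is actually True for your inputs. On a recent PyTorch, where Variable is a no-op wrapper, an equivalent sketch of the fix would be:)

```python
# must sit inside `if reorder_output:` in forward(), before the return
reordered_for_weights = att_weights.new_empty(att_weights.size())  # same dtype/device
reordered_for_weights[perm_idx] = att_weights
att_weights = reordered_for_weights
```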
