PosEmbed bug when the T_out > T_in #49

Open
zuliangfang opened this issue Jun 8, 2023 · 1 comment
Labels
bug Something isn't working

Comments


zuliangfang commented Jun 8, 2023

[Screenshot: PosEmbed.forward, which requires T <= self.maxT]
Here we have to ensure that T <= self.maxT.

But this is not satisfied when T_out > T_in, so we get an error in the PosEmbed of the decoder.
[Screenshot: decoder building its hierarchical PosEmbed with maxT = self.mem_shapes[i][0]]
Here self.mem_shapes[i][0] = T_in = self.maxT, but the actual T passed to PosEmbed is T_out. When T_out > T_in, we get an error.
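
A minimal sketch of the mismatch (this PosEmbed is a simplified, hypothetical stand-in; only the T <= self.maxT check mirrors the actual module):

import torch
from torch import nn

class PosEmbed(nn.Module):
    # Simplified stand-in, not the actual module: one learnable
    # embedding per time step, capped at maxT.
    def __init__(self, embed_dim, maxT):
        super().__init__()
        self.maxT = maxT
        self.T_embed = nn.Embedding(maxT, embed_dim)

    def forward(self, x):
        # x: (B, T, H, W, C)
        B, T, H, W, C = x.shape
        assert T <= self.maxT  # fails in the decoder when T = T_out > T_in
        t_idx = torch.arange(T, device=x.device)
        return x + self.T_embed(t_idx).reshape(1, T, 1, 1, C)

T_in, T_out, C = 5, 10, 16
pos_embed = PosEmbed(embed_dim=C, maxT=T_in)  # decoder built with maxT = T_in
x = torch.zeros(1, T_out, 4, 4, C)            # but the decoder input has T = T_out
pos_embed(x)                                  # AssertionError, since T_out > T_in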

gaozhihan (Contributor) commented Jun 11, 2023

Thank you very much for pointing it out! It's fixed in #51.

if self.hierarchical_pos_embed:
    self.hierarchical_pos_embed_l = nn.ModuleList([
        PosEmbed(embed_dim=self.mem_shapes[i][-1], typ=pos_embed_type,
                 maxT=target_temporal_length, maxH=self.mem_shapes[i][1], maxW=self.mem_shapes[i][2])
        for i in range(self.num_blocks - 1)])
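
The key change is that maxT now comes from target_temporal_length (the decoder's output length T_out) rather than from self.mem_shapes[i][0] (the encoder's T_in), so the check T <= self.maxT inside PosEmbed holds even when T_out > T_in.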

gaozhihan added the bug label Jun 11, 2023