First of all, thank you for sharing the code of your work with us.
The publication states that Fast-TGCN uses approx. 4.13 million learnable parameters. However, when I instantiate the model from the code published here and count its trainable parameters with the snippet below, I get approx. 24.44 million.
def count_parameters(model):
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

model = Baseline()
print(f"Nr of parameters in million: {count_parameters(model) / 1e6}")
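A per-module breakdown might help pinpoint where the discrepancy comes from. This is just a sketch, assuming the same Baseline class from this repository; count_parameters_per_module is a hypothetical helper, not part of the repo.

def count_parameters_per_module(model):
    # Sum trainable parameters for each top-level submodule of the model.
    return {
        name: sum(p.numel() for p in module.parameters() if p.requires_grad)
        for name, module in model.named_children()
    }

model = Baseline()
for name, count in count_parameters_per_module(model).items():
    print(f"{name}: {count / 1e6:.2f} M parameters")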
Can you tell me whether you used a different model configuration (and therefore a different number of parameters) in the experiments you conducted?
Thanks