Please refer to line 557 in LoGoHead_kitti.py and line 134 in attention_utils.py.
Thanks! One more question: the input_sp_tensor passed to spconv.SparseConvTensor in the 3D backbone has spatial shape [41, 1600, 1408]. Does this mean the number of voxels along the Z, Y, and X axes respectively?
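Yes, spconv's spatial_shape is ordered [Z, Y, X]. As a minimal sketch (assuming the usual OpenPCDet-style KITTI config values of point-cloud range [0, -40, -3, 70.4, 40, 1] and voxel size [0.05, 0.05, 0.1]; check your own config), the shape [41, 1600, 1408] falls out of the range and voxel size, with OpenPCDet padding Z by one extra voxel:

```python
# Hedged sketch: how [41, 1600, 1408] can arise from an assumed
# KITTI config (verify against your own yaml).
point_cloud_range = [0.0, -40.0, -3.0, 70.4, 40.0, 1.0]  # [x_min, y_min, z_min, x_max, y_max, z_max]
voxel_size = [0.05, 0.05, 0.1]                            # [vx, vy, vz] in metres

# Number of voxels along each axis = extent / voxel size.
grid_x = round((point_cloud_range[3] - point_cloud_range[0]) / voxel_size[0])  # 70.4 / 0.05 = 1408
grid_y = round((point_cloud_range[4] - point_cloud_range[1]) / voxel_size[1])  # 80.0 / 0.05 = 1600
grid_z = round((point_cloud_range[5] - point_cloud_range[2]) / voxel_size[2])  #  4.0 / 0.10 = 40

# spconv wants [Z, Y, X]; OpenPCDet additionally pads Z by 1 voxel.
sparse_shape = [grid_z + 1, grid_y, grid_x]
print(sparse_shape)  # → [41, 1600, 1408]
```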
Hello, I have a question: where is the code for the Feature Dynamic Aggregation Module?
I have seen this code in LoGoHead_kitti.py:

`pooled_features = pooled_features + localgrid_densityfeat_fuse.permute(0, 2, 1)`

and

`attention_output = pooled_features + attention_output`

If this code corresponds to the sum of the three features F^B_p + F^B_l + F^B_g, then I don't see a self-attention module applied before

`shared_features = self.shared_fc_layer(pooled_features.view(batch_size_rcnn, -1, 1))`
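For reference, here is a minimal NumPy sketch of the additive fusion pattern those two lines implement: pooled RoI-grid features (F_p) plus permuted local density features (F_l), followed by a residual self-attention term (F_g). The shapes, the random stand-in weights, and the toy `self_attention` helper are all assumptions for illustration, not the repo's actual module (which lives in attention_utils.py):

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(x):
    """Toy single-head self-attention over grid points.
    Random matrices stand in for trained Q/K/V projections."""
    d = x.shape[-1]
    wq, wk, wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)            # (B, N, N)
    weights = np.exp(scores - scores.max(-1, keepdims=True))  # stable softmax
    weights /= weights.sum(-1, keepdims=True)
    return weights @ v                                        # (B, N, C)

# Assumed shapes: (num_rois, num_grid_points, channels).
pooled_features = rng.standard_normal((2, 216, 96))             # F_p, layout (B, N, C)
localgrid_densityfeat_fuse = rng.standard_normal((2, 96, 216))  # F_l, layout (B, C, N)

# F_p + F_l: the permute/transpose aligns F_l with the (B, N, C) layout.
pooled_features = pooled_features + localgrid_densityfeat_fuse.transpose(0, 2, 1)

# F_g added residually: attention over the fused grid features.
attention_output = self_attention(pooled_features)
fused = pooled_features + attention_output
print(fused.shape)  # (2, 216, 96)
```

The point is that the self-attention producing `attention_output` runs on the already-fused `pooled_features`, so the final tensor carries all three terms before it reaches the shared FC layer.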