The computational consumption may be underestimated #5
Hi,
I tried to migrate the model to TensorFlow 1 and used a 512×512×3 image to calculate the FLOPs and parameters of the TF1 model with TF1's toolkits. Here is the result:

For comparison, I built a DMCNN model with TensorFlow 1. Here are its FLOPs and parameters:

which are smaller than ESDNet's, even though ESDNet has smaller MACs and params than DMCNN as shown in the paper.
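For reference, the TF1 numbers come from the standard profiler, along these lines (a minimal sketch, not the exact script; `build_model` is a hypothetical stand-in for the migrated network):

```python
import tensorflow as tf  # TF 1.x

with tf.Graph().as_default() as g:
    # 512x512x3 input, batch size 1
    x = tf.placeholder(tf.float32, [1, 512, 512, 3])
    y = build_model(x)  # hypothetical stand-in for the migrated network

    # Count FLOPs over the whole graph
    flops = tf.profiler.profile(
        g, options=tf.profiler.ProfileOptionBuilder.float_operation())
    print('FLOPs :', flops.total_float_ops)

    # Count trainable parameters
    params = tf.profiler.profile(
        g, options=tf.profiler.ProfileOptionBuilder.trainable_variables_parameter())
    print('Params:', params.total_parameters)
```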
I think the reason is that TF1 counts the computation cost of up-sampling operations (e.g. `tf.image.resize_bilinear`), while most PyTorch toolkits (e.g. thop) ignore the cost of operations that have no pre-defined counter (e.g. `F.interpolate`). Since every SAM module contains more than one `F.interpolate`, I think the computational consumption is underestimated, and this cost cannot be ignored for a fair comparison (a minimal demonstration with thop is sketched below).

The performance of this work is very impressive. Thanks for your sharing! :)
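Here is the sketch mentioned above (assuming thop's `custom_ops` hook; the 11-FLOPs-per-output-element count for bilinear resizing is my own rough convention, and the two wrapper modules are illustrative): a functional `F.interpolate` call is invisible to thop's module hooks and reports zero ops, while the same resize expressed as an `nn.Upsample` module can be counted:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from thop import profile

class FunctionalUp(nn.Module):
    """Resize via the functional API, as in the SAM module."""
    def forward(self, x):
        return F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=False)

class ModuleUp(nn.Module):
    """The same resize, but as an nn.Upsample module that thop can hook."""
    def __init__(self):
        super().__init__()
        self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)
    def forward(self, x):
        return self.up(x)

def count_bilinear(m, x, y):
    # Rough convention (an assumption): ~11 FLOPs per output element
    m.total_ops += torch.DoubleTensor([11 * y.numel()])

inp = torch.randn(1, 64, 256, 256)
ops_fn, _ = profile(FunctionalUp(), inputs=(inp,))
ops_mod, _ = profile(ModuleUp(), inputs=(inp,), custom_ops={nn.Upsample: count_bilinear})
print(ops_fn)   # 0: the functional F.interpolate call is never counted
print(ops_mod)  # ~11 * 64 * 512 * 512: the module version is counted
```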
Comments

Thank you very much!

That's right. ESDNet still has a small cost and much better performance than the other methods, even when measured with TensorFlow 1 toolkits. Again, thanks for your code-sharing.

I checked my TF code again. There was a different kernel size in the SAM module. Here is the result after correcting it:

Yup! This result looks more normal.