Hello, I think there may be a problem with your `_inverted_residual_block`? #14
Comments
The difference is in the for loop: `t=t` was changed to `t=1`. Did I misunderstand something?
The expansion factor t is unrelated to the conditions listed above; the value of t for each `_inverted_residual_block` layer is given by the authors in the network architecture table in the paper.
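The point above can be sketched in code. This is a minimal illustration of how the paper's architecture table fixes the expansion filters for every block, not the repository's actual implementation; the table rows and helper names below are written out for this example:

```python
# MobileNetV2 architecture table from the paper, one row per sequence:
# (t, c, n, s) = (expansion factor, output channels, repeats, first stride).
MOBILENETV2_TABLE = [
    (1, 16, 1, 1),
    (6, 24, 2, 2),
    (6, 32, 3, 2),
    (6, 64, 4, 2),
    (6, 96, 3, 1),
    (6, 160, 3, 2),
    (6, 320, 1, 1),
]

def expanded_filters(in_channels, t):
    """The 1x1 expansion conv uses in_channels * t filters (k * t)."""
    return in_channels * t

def walk_table(table, in_channels=32):
    """Return (expansion_filters, out_channels, stride) for every block.

    Every repeat inside a sequence keeps the sequence's t; only the
    input channel count changes after the first block, which is why
    the expansion width of repeated blocks differs from the first.
    """
    plan = []
    for t, c, n, s in table:
        for i in range(n):
            stride = s if i == 0 else 1   # only the first block strides
            plan.append((expanded_filters(in_channels, t), c, stride))
            in_channels = c               # later blocks see c input channels
    return plan

plan = walk_table(MOBILENETV2_TABLE)
```

For example, the second sequence (t=6, c=24, n=2, s=2) receives 16 channels, so its first block expands to 16 * 6 = 96 filters and its repeat expands to 24 * 6 = 144, both still using t=6.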
Hello, how do you load this model after it is saved? Could you share your test code?
Hello, could you send the git address? I don't remember this.
According to your description on Jianshu:
+ Each bottleneck is repeated n times.
+ All layers in the same sequence have the same number of output channels.
+ The first layer of each sequence uses stride s; all other layers use stride 1.
+ All spatial convolutions use 3 * 3 kernels.
+ The expansion factor t is always applied to the input size: if the tensor fed into a layer has k channels, the number of filters used in that layer is k * t.
But your code is:

But according to the description, shouldn't it be like this?
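The channel arithmetic implied by the description above can be sketched as follows. The helper below is hypothetical, written only to make the rule "filters = k * t applied to the input" concrete; it is not the repository's code:

```python
def inverted_residual_block_shapes(in_channels, out_channels, t):
    """Filter counts of the three stages the description implies:
    1x1 expansion -> 3x3 depthwise -> 1x1 linear projection."""
    expand = in_channels * t   # t applied to the input channel count k
    depthwise = expand         # depthwise conv keeps the channel count
    project = out_channels     # projection sets the sequence's output channels
    return [expand, depthwise, project]
```

With t=1 the expansion stage degenerates to `in_channels` filters, which is exactly the behaviour the comment about `t=t` versus `t=1` questions.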