
Hello, I think there may be a problem with your _inverted_residual_block? #14

Open
LoveSophia opened this issue Apr 20, 2019 · 4 comments

@LoveSophia

According to your description on Jianshu:
+ each bottleneck is repeated n times
+ all layers in the same sequence have the same number of output channels
+ the first layer of each sequence uses stride s; all other layers use stride 1
+ all spatial convolutions use 3 × 3 kernels
+ the expansion factor t is always applied to the input size: if the tensor entering a layer has k channels, then the number of filters used in that layer is k * t

But your code is:

def _inverted_residual_block(inputs, filters, kernel, t, strides, n):
    x = _bottleneck(inputs, filters, kernel, t, strides)

    for i in range(1, n):
        x = _bottleneck(x, filters, kernel, t, 1, True)

    return x

But according to that description, shouldn't it be like this?

def _inverted_residual_block(inputs, filters, kernel, t, strides, n):
    x = _bottleneck(inputs, filters=filters, kernel=kernel, t=t, s=strides)

    for i in range(1, n):
        x = _bottleneck(x, filters, kernel, t=1, s=1, shortcut=True)

    return x
@LoveSophia (Author)

The only difference is in the for loop, where t=t becomes t=1. Did I misunderstand something?

@xiaochus (Owner)

The value of the expansion factor t is unrelated to the conditions listed above; the t for each inverted_residual_block is given by the authors in the network architecture table of the paper.
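
(For context, a minimal sketch of how those per-stage values would drive the calls; the (t, c, n, s) tuples below come from Table 2 of the MobileNetV2 paper, the call signature follows the code quoted above, and x is assumed to be the feature map from the preceding layer:)

# Per-stage hyperparameters from Table 2 of the MobileNetV2 paper:
# t = expansion factor, c = output channels, n = repeats, s = stride of first layer.
stages = [
    (1, 16, 1, 1),
    (6, 24, 2, 2),
    (6, 32, 3, 2),
    (6, 64, 4, 2),
    (6, 96, 3, 1),
    (6, 160, 3, 2),
    (6, 320, 1, 1),
]

for t, c, n, s in stages:
    # t is fixed per stage by the table; it does not change between the
    # first bottleneck and its repeats within the same stage.
    x = _inverted_residual_block(x, c, (3, 3), t, s, n)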

@q739554534

Hello, how do you load this model after saving it? Could you share your test code?
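
(For anyone with the same question, a minimal sketch assuming the model was saved with Keras's model.save; MobileNetV2 uses the relu6 activation, so the saved graph may reference it and it then has to be supplied through custom_objects. The file name model.h5 is a placeholder:)

import numpy as np
from keras import backend as K
from keras.models import load_model

def relu6(x):
    # ReLU capped at 6, as used throughout MobileNetV2.
    return K.relu(x, max_value=6.0)

# 'model.h5' is a placeholder path; pass any custom objects the saved
# graph references so Keras can rebuild those layers.
model = load_model('model.h5', custom_objects={'relu6': relu6})

# Run inference on a dummy 224x224 RGB batch to check that loading worked.
dummy = np.zeros((1, 224, 224, 3), dtype='float32')
preds = model.predict(dummy)
print(preds.shape)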

@LoveSophia (Author)

LoveSophia commented Apr 27, 2021 via email
