
Bug of init bert-based-uncased #743

Open
jacazjx opened this issue Oct 17, 2024 · 0 comments
Labels
bug Something isn't working

Comments

jacazjx commented Oct 17, 2024

from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained('bert-base-uncased')

Environment: adapters-1.0.0, tokenizers-0.19.1, transformers-4.43.4

Bug:
Run this code:

modules = {}
for n, m in model.named_modules():
    for np, p in m.named_parameters(recurse=False):
        if p is None:
            continue
        key = n + '.' + np
        if key in modules:
            assert id(p) == id(modules[key][0]), (n, np, p.shape, modules[key][0].shape)
            continue
        modules[key] = (p, m)

# Bug: named_parameters() does not return all parameters
n_params = len(list(model.named_parameters()))
assert len(modules) == n_params, (len(modules), n_params)

Output:
134 != 133

So there is one parameter that is not registered in named_parameters(): it is heads.default.3.weight.
The corresponding bias, heads.default.3.bias, is reported normally.
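
For reference, here is a minimal diagnostic sketch to confirm which of the head's parameters are missing. It assumes the same bert-base-uncased AutoAdapterModel loaded above and that the head's linear layer is reachable at the submodule path "heads.default.3" (taken from the parameter name reported here):

# Compare parameters attached directly to the head module with those
# reported by model.named_parameters().
reported = {name for name, _ in model.named_parameters()}
head_linear = model.get_submodule("heads.default.3")  # path assumed from the report above
for local_name, _ in head_linear.named_parameters(recurse=False):
    full_name = f"heads.default.3.{local_name}"
    status = "registered" if full_name in reported else "MISSING"
    print(full_name, status)

If the issue reproduces, this should print heads.default.3.weight as MISSING while heads.default.3.bias shows up as registered.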

jacazjx added the bug label on Oct 17, 2024