
Details about the code #4

Open
Hon-Chen opened this issue Jan 4, 2024 · 2 comments

Comments


Hon-Chen commented Jan 4, 2024

Really solid work!
I noticed that the proportion of non-zero parameters was counted after quantization. What is the motivation for this?

import torch

# After quantization, report the fraction of exact-zero parameters per tensor
quant_sequential(model, dataloader, device)
for n, p in model.named_parameters():
    print(n, torch.mean((p == 0).float()))
    if "fc2" in n:
        break
hahnyuan (Owner) commented Jan 4, 2024

Thank you for bringing up this observation. The check of the non-zero parameter proportion after quantization was originally added for debugging purposes. On further evaluation, this code segment is no longer necessary and can be safely removed.

Hon-Chen (Author) commented Jan 4, 2024


Thanks for your reply; I also noticed that this code is unnecessary. But when I made some modifications to the mask-search method, I found that the proportion of zero values became higher. So I want to understand the meaning behind this piece of code: what specifically were the debugging purposes you mentioned? As I understand it, binarized weights should no longer contain any zeros, so are the zeros introduced by the salient weights?
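For reference, a minimal toy sketch of why exact zeros can survive a mixed binarization at all (hypothetical names and masking rule, not the repo's actual `quant_sequential` logic): `torch.sign` maps an exactly-zero weight to zero rather than ±1, and any weights excluded from binarization pass through unchanged, so the zero-fraction check is a quick sanity signal on what the quantizer actually touched.

```python
import torch

def binarize_with_salient(weight: torch.Tensor, salient_mask: torch.Tensor) -> torch.Tensor:
    """Toy mixed binarization (illustrative only): salient weights are kept in
    full precision, the rest become a scaled sign."""
    alpha = weight.abs().mean(dim=1, keepdim=True)  # crude per-row scale
    binary = alpha * torch.sign(weight)             # torch.sign(0.) == 0., so exact zeros survive
    return torch.where(salient_mask, weight, binary)

torch.manual_seed(0)
w = torch.randn(4, 8)
w[0, 0] = 0.0                    # plant one exact-zero weight
mask = w.abs() > 1.0             # pretend large-magnitude weights are "salient"
qw = binarize_with_salient(w, mask)

# same zero-fraction check as in the snippet above
print(torch.mean((qw == 0).float()).item())  # 1 zero out of 32 weights -> 0.03125
```

In this toy setup the only zero after quantization is the weight that was exactly zero before binarization; whether that matches the repo's behavior depends on how its mask search and salient-weight handling are actually implemented.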
