
ValueError: some parameters appear in more than one parameter group #169

Open
rohit901 opened this issue Mar 8, 2021 · 2 comments


rohit901 commented Mar 8, 2021

Hi, I was trying to run the network code on some custom data, and I'm getting the error ValueError: some parameters appear in more than one parameter group when I try to initialize my optimizer.

I'm running the code in a Jupyter notebook, so I have only taken the parts of the code necessary to build the backbone, ArcFace, bottleneck_ir_se, and ir.

After I put this code in my cell:

optimizer = optim.SGD([
    {'params': paras_wo_bn + [head.kernel], 'weight_decay': 5e-4},
    {'params': paras_only_bn}
], lr = lr, momentum = momentum)

I get the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-15-7f153141da86> in <module>
----> 1 optimizer = optim.SGD([
      2     {'params': paras_wo_bn + [head.kernel], 'weight_decay': 5e-4},
      3     {'params': paras_only_bn}
      4 ], lr = lr, momentum = momentum)

~/anaconda3/lib/python3.8/site-packages/torch/optim/sgd.py in __init__(self, params, lr, momentum, dampening, weight_decay, nesterov)
     66         if nesterov and (momentum <= 0 or dampening != 0):
     67             raise ValueError("Nesterov momentum requires a momentum and zero dampening")
---> 68         super(SGD, self).__init__(params, defaults)
     69 
     70     def __setstate__(self, state):

~/anaconda3/lib/python3.8/site-packages/torch/optim/optimizer.py in __init__(self, params, defaults)
     50 
     51         for param_group in param_groups:
---> 52             self.add_param_group(param_group)
     53 
     54     def __getstate__(self):

~/anaconda3/lib/python3.8/site-packages/torch/optim/optimizer.py in add_param_group(self, param_group)
    251 
    252         if not param_set.isdisjoint(set(param_group['params'])):
--> 253             raise ValueError("some parameters appear in more than one parameter group")
    254 
    255         self.param_groups.append(param_group)

ValueError: some parameters appear in more than one parameter group

Can anyone help me figure out how to resolve this? Sorry if it's trivial; I'm new to PyTorch.
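For context, the traceback shows where the check lives: PyTorch's Optimizer.__init__ registers groups one by one via add_param_group, which keeps a set of every parameter already registered and raises this exact ValueError as soon as a new group shares one. A minimal pure-Python sketch of that check, with plain objects standing in for torch Parameters (the helper name add_param_groups is mine, not PyTorch API):

```python
# Sketch of the disjointness check behind this ValueError.
# Plain objects stand in for torch.nn.Parameter objects.

def add_param_groups(param_groups):
    """Register groups one by one, rejecting any parameter seen twice."""
    seen = set()
    for group in param_groups:
        params = set(id(p) for p in group['params'])
        if not seen.isdisjoint(params):
            raise ValueError("some parameters appear in more than one parameter group")
        seen |= params
    return len(seen)

class FakeParam:
    pass

w, b, bn = FakeParam(), FakeParam(), FakeParam()

# Disjoint groups: accepted, three parameters registered.
assert add_param_groups([{'params': [w, b]}, {'params': [bn]}]) == 3

# `b` appears in both groups: same error as in the question.
try:
    add_param_groups([{'params': [w, b]}, {'params': [b, bn]}])
except ValueError as e:
    print(e)  # some parameters appear in more than one parameter group
```

So the fix is to make sure paras_wo_bn, [head.kernel], and paras_only_bn contain no parameter object more than once between them.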

@maywander

I ran into the same problem and found that the cause was the order of the data-parallel wrapping and optim.SGD being reversed.
When written as:

model = Backbone()
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model, device_ids=[0, 1, 2, 3])
model.to(conf.device)
optimizer = optim.SGD(...)

the error occurs.

After changing it to the structure below, the error goes away:

model = Backbone()
optimizer = optim.SGD(...)
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model, device_ids=[0, 1, 2, 3])
model.to(conf.device)


arminbiglari commented Feb 3, 2024

Hello, I resolved the problem by changing the code to:

model = MobileFaceNet(embedding_size).to(device)

head = Arcface(embedding_size=embedding_size, classnum=6056).to(device)
head_params = [param for name, param in head.named_parameters()]

# Manually create parameter groups
paras_only_bn = [param for name, param in model.named_parameters() if 'batchnorm' in name.lower()]
paras_wo_bn = [param for name, param in model.named_parameters() if 'batchnorm' not in name.lower()]

# Specify the parameters and groups for the optimizer
optimizer = optim.SGD([
    {'params': paras_wo_bn[:-1], 'weight_decay': 4e-5},
    {'params': [paras_wo_bn[-1]] + head_params, 'weight_decay': 4e-4},
    {'params': paras_only_bn, 'weight_decay': 4e-4}
], lr=learning_rate, momentum=momentum)
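This works because every model parameter lands in exactly one of the two name-based lists, and head_params come from a separate module, so the three groups are disjoint by construction. A small sanity check you can run on any candidate groups before handing them to the optimizer (the helper name assert_disjoint_groups is mine, not PyTorch API; plain objects stand in for the real tensors):

```python
from itertools import combinations

def assert_disjoint_groups(*groups):
    """Fail fast if any parameter object appears in more than one group."""
    for a, b in combinations(range(len(groups)), 2):
        shared = set(map(id, groups[a])) & set(map(id, groups[b]))
        if shared:
            raise ValueError(f"groups {a} and {b} share {len(shared)} parameter(s)")

# Stand-ins for the real tensors; in the snippet above these would be
# paras_wo_bn[:-1], [paras_wo_bn[-1]] + head_params, and paras_only_bn.
p1, p2, p3, p4 = object(), object(), object(), object()
assert_disjoint_groups([p1, p2], [p3], [p4])  # no overlap: passes silently
```

If this raises, one of your collection steps (e.g. filtering before and after wrapping the model) is putting the same parameter object into two lists.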
