
Error(s) in loading state_dict for DataParallel in generate_samples.py #43

Open
giusepperizzo opened this issue Mar 29, 2021 · 0 comments

Comments

@giusepperizzo
Copy link

giusepperizzo commented Mar 29, 2021

After a full training run with

$ python3 sourcecode/train.py --depth=7 --latent_size=256 --images_dir=DATASET --sample_dir=sampledir/exp2 --model_dir=modeldir/exp2

when launching

$ python3 sourcecode/generate_samples.py --generator_file=modeldir/exp2/GAN_GEN_100.pth --depth=7 --out_dir=outputdir

I got:

Creating generator object ...
Loading the generator weights from: modeldir/exp2/GAN_GEN_100.pth
Traceback (most recent call last):
  File "sourcecode/generate_samples.py", line 134, in <module>
    main(parse_arguments())
  File "sourcecode/generate_samples.py", line 105, in main
    gen.load_state_dict(
  File "~/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for DataParallel:
    Missing key(s) in state_dict: "module.layers.7.conv_1.weight", "module.layers.7.conv_1.bias", "module.layers.7.conv_2.weight", "module.layers.7.conv_2.bias", "module.layers.8.conv_1.weight", "module.layers.8.conv_1.bias", "module.layers.8.conv_2.weight", "module.layers.8.conv_2.bias", "module.rgb_converters.7.weight", "module.rgb_converters.7.bias", "module.rgb_converters.8.weight", "module.rgb_converters.8.bias".

Both training and generation were executed on the same machine, with multiple GPUs enabled and available.
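
For reference, the missing keys all belong to module.layers.7/8 and module.rgb_converters.7/8, so the generator object built by generate_samples.py seems to expect more depth stages than the saved checkpoint contains. A quick way to check which layer indices the checkpoint actually holds (this is only a sketch, and it assumes GAN_GEN_100.pth is a plain state_dict saved with torch.save, as the load_state_dict call suggests):

import torch

# Load the checkpoint on CPU; assumes the file is a raw state_dict
# (an OrderedDict of parameter tensors), not a wrapped training snapshot.
state = torch.load("modeldir/exp2/GAN_GEN_100.pth", map_location="cpu")

# Collect the layer indices present under "module.layers.<i>." to see how
# deep the saved generator actually is.
layer_ids = sorted({int(k.split(".")[2]) for k in state
                    if k.startswith("module.layers.")})
print("layer indices in checkpoint:", layer_ids)
print("total keys:", len(state))

If the printed indices stop at a smaller value than the layers the error lists as missing, the generator created at generation time is deeper than the one that was trained and saved.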
