Model files are too large #15

Open · baisong666 opened this issue Jul 3, 2021 · 14 comments

@baisong666

The four model files generated in the first training step are 256 MB in size, and the model file generated in the second step is around 500 MB. Why are they so large? Isn't this model supposed to be deployed on a phone, and how can model files this large be deployed? Could the author please reply when you see this? Many thanks.

@baisong666 (Author)

Hello, could you take a look at my question?

@baisong666 (Author)

Could you take a look?

@baisong666 (Author)

Help!

@baisong666 (Author)

help

@nightsnack (Owner) commented Jul 13, 2021

Saw it. The second-step checkpoint file also stores things like the optimizer state and training log, which is why it is larger. Change the line

torch.save(chkpt, last)

so that only the model is saved. Deploying to a phone requires the compiler, and the compiler has not been open-sourced.
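For reference, a minimal sketch of saving only the model weights instead of the full checkpoint. It assumes an ultralytics-style train.py where the whole chkpt dict (model, optimizer state, training log) is written with torch.save(chkpt, last); the network and file name below are placeholders, not the repo's actual variables.

import torch
import torch.nn as nn

# Placeholder network standing in for the trained (or EMA) YOLO model.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU())
last = 'last_weights_only.pt'  # placeholder output path

# Save only the weights; the optimizer state and training log are what
# inflate the second-step checkpoint.
state = model.module.state_dict() if hasattr(model, 'module') else model.state_dict()
torch.save({'model': state}, last)

# Restore later by rebuilding the model and loading the state dict.
ckpt = torch.load(last, map_location='cpu')
model.load_state_dict(ckpt['model'])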

@baisong666 (Author)

Should it be changed to torch.save(last) and torch.save(best)?

@baisong666 (Author) commented Jul 14, 2021 via email

@baisong666 (Author)

chkpt = {
    # 'epoch': epoch,
    # 'best_fitness': best_fitness,
    # 'training_results': f.read(),
    'model': ema.ema.module.state_dict() if hasattr(model, 'module') else ema.ema.state_dict(),
    # 'optimizer': None if final_epoch else optimizer.state_dict()
}

@baisong666 (Author)

Is this the right way to comment out the other entries?

@baisong666 (Author)

Which pretrained model should I choose?

@baisong666 (Author)

help

@nightsnack (Owner)

What is the problem with 256 MB? That is the saved .pt file; after pruning, the zeros in the sparse model are still stored explicitly. Only when the model runs on the phone is it re-stored in an index-value format.
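To illustrate the point (this is only a demonstration, not the repo's compiler): a pruned weight tensor saved densely in a .pt file takes the same space as the unpruned one, because the zeros are written out, while an index-value representation only pays for the nonzeros.

import os
import torch

# 1024x1024 float32 weight matrix (~4 MB when stored densely).
w = torch.randn(1024, 1024)
mask = torch.rand_like(w) > 0.9                       # prune ~90% of the weights to zero
w_pruned = w * mask

torch.save(w, 'dense.pt')                             # unpruned, dense storage
torch.save(w_pruned, 'pruned_dense.pt')               # pruned, but zeros still stored
torch.save(w_pruned.to_sparse(), 'pruned_sparse.pt')  # index-value (COO) storage

for name in ('dense.pt', 'pruned_dense.pt', 'pruned_sparse.pt'):
    print(name, os.path.getsize(name) // 1024, 'KB')
# The two dense files are the same size; only the sparse file shrinks, and the
# on-device index-value format the author mentions is presumably packed more
# tightly than the COO layout used here.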

@baisong666 (Author)

What do you mean, haha? So training with this code is just expected to produce a 256 MB model?

@baisong666 (Author)

Could you explain that in more detail?
