About the pretraining #17
Many thanks for your attention to our work. Could you please share more details, such as your pre-training logs and settings?
I just used the command `sh train.sh` in this repo.
Hi, would you please share your pre-training logs and downstream performance?
Here are the logs and MSD results: https://www.dropbox.com/scl/fo/do7vnjaxxi9vidhus7gbz/AGH6SCISNnszLzMp5ieDACY?rlkey=ofhh43b4qnm5b4ae80c19rgyh&st=m189nq8m&dl=0
Many thanks for sharing! These results are very valuable. It seems there are no problems with the pre-training, and our pre-trained checkpoint can indeed improve downstream performance.
I use the MONAI framework as in https://github.com/Project-MONAI/research-contributions/tree/main/SwinUNETR/BTCV. Looking forward to your more powerful version; much appreciated.
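For readers following this thread, loading a pre-trained checkpoint into SwinUNETR under that BTCV recipe looks roughly like the sketch below. This is a minimal sketch, not the repo's exact fine-tuning script: the checkpoint filename (`VoCo_10k.pt`) is taken from this thread, while the state-dict layout is an assumption and may need adjusting.

```python
# Minimal sketch: load a VoCo pre-trained checkpoint into MONAI's SwinUNETR
# for BTCV-style fine-tuning, following the research-contributions recipe
# linked above. State-dict layout is an assumption.
import torch
from monai.networks.nets import SwinUNETR

model = SwinUNETR(
    img_size=(96, 96, 96),  # ROI size used in the BTCV recipe
    in_channels=1,
    out_channels=14,        # 13 BTCV organs + background
    feature_size=48,
    use_checkpoint=True,    # gradient checkpointing to save memory
)

ckpt = torch.load("VoCo_10k.pt", map_location="cpu")
# Unwrap if the checkpoint was saved as {"state_dict": ...} (assumption).
state_dict = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

# strict=False: the segmentation head is trained from scratch, so only the
# encoder weights are expected to match the pre-trained checkpoint.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```

Fine-tuning then proceeds with the standard training loop from that repository.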
Dear researchers, our work is now available at Large-Scale-Medical, if you are still interested in this topic. Thank you very much for your attention to our work; it encourages me a lot!
Hi, thank you for this great work! We ran pre-training with the code and data provided in the current repository, but the downstream performance was not as strong as that of the released VoCo_10k.pt model. We compared against VoCo_10k.pt on several segmentation tasks from MSD. Since the publicly released code uses a teacher-student training approach, we evaluated the teacher and student weights separately, and neither performed as well as VoCo_10k.pt. Are there differences in the default training configuration? How can we reproduce a model as strong as VoCo_10k.pt?
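For anyone attempting the same comparison, below is a hypothetical sketch of splitting a teacher-student pre-training checkpoint into separate backbone state dicts for downstream evaluation. The `student.` / `teacher.` key prefixes and the filenames are assumptions for illustration, not confirmed against this repo's checkpoint format.

```python
# Hypothetical sketch: separate teacher and student weights from a
# two-branch pre-training checkpoint so each backbone can be evaluated
# independently. Key prefixes are assumptions about the checkpoint layout.
import torch

ckpt = torch.load("pretrain_checkpoint.pt", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

def extract_branch(sd, prefix):
    """Keep only keys under `prefix`, stripping the prefix so the result
    can be loaded into a standalone backbone with strict=False."""
    return {k[len(prefix):]: v for k, v in sd.items() if k.startswith(prefix)}

student_sd = extract_branch(state_dict, "student.")
teacher_sd = extract_branch(state_dict, "teacher.")
torch.save(student_sd, "student_backbone.pt")
torch.save(teacher_sd, "teacher_backbone.pt")
```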