Results are not as good as the pretrained network after training with the pretrained option. #52

Open
Leonard-GEHIN opened this issue Jun 12, 2020 · 0 comments

@Leonard-GEHIN

I have an FOTS checkpoint, trained from scratch, which gives pretty good results on my dataset.
I want to fine-tune this checkpoint: when I use the pretrained option, training runs smoothly. But after training completes, I test the resulting checkpoint with the main_test.py script and the detection is really bad.
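For reference, a minimal sketch of how such a checkpoint could be inspected with plain PyTorch, assuming a standard checkpoint layout (the FOTSModel class name and the "state_dict" key are assumptions about my setup, not necessarily what this repo uses):

```python
import torch

# Hypothetical sketch: FOTSModel and the "state_dict" key reflect my own setup,
# not necessarily this repo's code.
from model import FOTSModel

model = FOTSModel()
ckpt = torch.load("checkpoint_from_scratch.pth", map_location="cpu")
state = ckpt.get("state_dict", ckpt)  # some checkpoints wrap the weights under "state_dict"

# With strict=False the call reports mismatched keys instead of raising, so any
# layer that silently failed to load (and stayed randomly initialised) shows up here.
missing, unexpected = model.load_state_dict(state, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```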

For example, on an image with 10 text instances that the from-scratch FOTS detects correctly, the fine-tuned FOTS produces 2000 detections before NMS.

For testing purposes, I also tried fine-tuning for only one step with a low learning rate, so that the weights should barely change, but the result was the same: 2000 detections at test time.
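To be concrete about what I mean by "one step with a low learning rate", here is a rough sketch of that experiment in plain PyTorch (FOTSModel, the checkpoint layout and the dummy loss are placeholders for illustration, not this repo's actual training code):

```python
import torch

# Placeholder sketch, not this repo's training loop: FOTSModel, the checkpoint
# layout and the dummy loss are assumptions for illustration only.
from model import FOTSModel

model = FOTSModel()
ckpt = torch.load("checkpoint_from_scratch.pth", map_location="cpu")
model.load_state_dict(ckpt.get("state_dict", ckpt))
model.train()

# Very small learning rate so a single update should leave the weights
# essentially unchanged (the real run uses the actual FOTS loss and data loader).
optimizer = torch.optim.SGD(model.parameters(), lr=1e-6)

dummy_loss = sum(p.sum() for p in model.parameters()) * 0.0  # stands in for the FOTS loss
optimizer.zero_grad()
dummy_loss.backward()
optimizer.step()

torch.save({"state_dict": model.state_dict()}, "finetuned_one_step.pth")
```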

Have you ever run into a situation like this?
