[feature request] Reproducible trainings #580
I just found my previous issue #302 for that. Is this an eScriptorium issue which does not use the kraken API correctly?
On 24/03/17 02:16PM, Stefan Weil wrote:
> I just found my previous issue #302 for that. Is this an eScriptorium issue which does not use the kraken API correctly?

We never really tried to make eScriptorium reproducible, and it is currently not possible to make training 100% reproducible because of CUDA/cuDNN limitations. You can try the deterministic training switch on ketos, but you'll still see differences between machines, library versions, and the phase of the moon.
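For context, here is a minimal sketch of what such a deterministic switch typically entails in a PyTorch Lightning stack (kraken trains through Lightning). This is an illustration under those assumptions, not kraken's actual code, and as noted above some CUDA kernels remain non-deterministic regardless:

```python
# Illustration only: what a deterministic-training switch typically does in a
# PyTorch Lightning stack. kraken trains through Lightning, but this is not
# its actual implementation.
import torch
import pytorch_lightning as pl

pl.seed_everything(42, workers=True)   # seeds Python, NumPy and torch RNGs

# Prefer deterministic cuDNN kernels; some CUDA ops simply have no
# deterministic implementation, which is the limitation mentioned above.
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False

trainer = pl.Trainer(deterministic=True, max_epochs=50)
```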
I currently struggle with eScriptorium trainings that end with a model claiming 100% accuracy, although all epochs show accuracies below 99%. When I export the final model and examine its metadata, I can see that it is always the model from epoch 0 (eScriptorium starts counting epochs at 0, so it is the result of the first epoch).
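For anyone wanting to check which checkpoint was actually exported, a hedged sketch of reading a kraken model's embedded training metadata; `TorchVGSLModel.load_model()` and the `user_metadata` dict exist in kraken, but the exact keys stored vary between versions, and `best.mlmodel` is a placeholder path:

```python
# Hedged sketch: inspect the metadata embedded in a kraken model to see
# which checkpoint was exported. The keys present (e.g. accuracy history)
# may differ between kraken versions; 'best.mlmodel' is a placeholder.
from kraken.lib.vgsl import TorchVGSLModel

nn = TorchVGSLModel.load_model('best.mlmodel')
print(nn.user_metadata)   # hyperparameters, accuracy history, ...
```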
Hmm, you can set
Ideally, a training process should be reproducible, as good scientific practice requires.
Currently, kraken training is not reproducible: two recognition trainings with the same ground truth and the same base model give different results (number of epochs, accuracies of the intermediate models).
eScriptorium shuffles the ground truth randomly but always uses the same seed, so the resulting training and validation sets are reproducible. The training, however, appears to shuffle the training set once more, and that shuffle does not seem to be reproducible.
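To illustrate the two shuffles described above with generic PyTorch (hypothetical stand-in data, not eScriptorium's code): the split below is seeded and therefore reproducible, while the DataLoader's per-epoch shuffle is only reproducible if a seeded generator is passed explicitly:

```python
import random

import torch
from torch.utils.data import DataLoader, TensorDataset

lines = list(range(1000))          # stand-in for ground-truth lines
random.Random(42).shuffle(lines)   # fixed seed: the split is reproducible
train, val = lines[:900], lines[900:]

dataset = TensorDataset(torch.tensor(train))
# Without the generator argument, the per-epoch shuffle order is drawn from
# the global RNG and differs from run to run.
loader = DataLoader(dataset, batch_size=32, shuffle=True,
                    generator=torch.Generator().manual_seed(42))
```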