The value of BLEU is always 0 when training the expert model #14
Exactly, not like this: [screenshot]
Can you double-check your source and target files? For example, they should not be ids, but strings.
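For illustration (these lines are made up, not taken from the actual corpus), a tokenized training file should contain plain text tokens:

    the cat sat on the mat

not vocabulary indices:

    12 845 301 7 12 990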
Hi Zachary, did you solve the problem? I get the same result as you: the value of BLEU is always 0, and the outputs on dev and test look like b'Again' b'Again' b'Again'......
Thank you very much if you can reply!
There may be a problem in creating the vocabulary. I used onmt-build-vocab --size max_vocab_size --save_vocab $TEXT/src-vocab.txt $TEXT/train.en to create the vocabulary, and it worked.
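A minimal sketch of that fix for both sides (assuming $TEXT points at the data directory; the 50000 size, the tgt-vocab.txt name, and the train.zh file are placeholders, not from the thread):

    # build the source (English) vocabulary from the tokenized training file
    onmt-build-vocab --size 50000 --save_vocab $TEXT/src-vocab.txt $TEXT/train.en
    # build the target (Chinese, word-segmented) vocabulary the same way
    onmt-build-vocab --size 50000 --save_vocab $TEXT/tgt-vocab.txt $TEXT/train.zh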
Hello, when I use a Chinese (word-segmented) and English (tokenized) parallel corpus to train the expert model, the value of BLEU is always 0, and the outputs on dev are all <unk>.
Like this: [screenshot]
Do you know why there is such a problem? Thanks!
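One quick check for this symptom (a sketch, not from the thread; it assumes one token per line in the vocabulary file and whitespace-tokenized training text, with paths taken from the command above) is to measure the out-of-vocabulary rate of the training data:

    # report what fraction of training tokens are missing from the vocabulary
    awk 'NR == FNR { vocab[$1]; next }   # first file: load vocabulary entries
         { for (i = 1; i <= NF; i++) { total++; if (!($i in vocab)) oov++ } }
         END { if (total) printf "OOV rate: %.2f%% (%d / %d tokens)\n",
                                 100 * oov / total, oov, total }' \
        $TEXT/src-vocab.txt $TEXT/train.en

If the OOV rate is near 100%, the vocabulary was built from the wrong file (or in the wrong format), which would explain both the all-<unk> outputs and the BLEU of 0.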