For cases like O->I, should I manually set the corresponding entries in the transition probability matrix to zero? #38
Comments
Hi @lkqnaruto, I wouldn't recommend putting these constraints on the CRF layer.
Thank you for the quick reply. Do you think putting such a constraint on the CRF layer could improve model performance compared to not using it? It looks like without the constraint, the current model actually "learns" it by itself.
Yes, I believe so. I see it as a form of model initialization, similar to adjusting the bias terms of a classification layer to produce the prior probabilities of the classes on the dataset (see "init well"), which is a good practice. It could make training easier and faster to converge, but it does not necessarily improve model performance.
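To make concrete what such a constraint does at decoding time, here is a minimal, self-contained sketch (not pytorch-crf's actual code): a plain-Python Viterbi decoder over a tiny hypothetical BIO tag set, where the illegal O -> I-PER transition is masked with a large negative log-score. The tag names, emission scores, and the `-1e4` mask value are all assumptions for illustration.

```python
# Hypothetical BIO tag set; "O" -> "I-PER" is the transition we want to forbid.
TAGS = ["O", "B-PER", "I-PER"]
NEG_INF = -1e4  # large negative log-score standing in for log(0)

# trans[i][j] = log-score of moving from tag i to tag j.
# Start neutral (0.0 everywhere), then mask the illegal O -> I-PER entry.
trans = [[0.0 for _ in TAGS] for _ in TAGS]
trans[TAGS.index("O")][TAGS.index("I-PER")] = NEG_INF

def viterbi(emissions, trans):
    """Best tag sequence for a list of per-token emission score vectors."""
    n_tags = len(trans)
    score = list(emissions[0])  # best log-score of a path ending in each tag
    back = []                   # back-pointers, one row per later token
    for em in emissions[1:]:
        prev, score, ptr = score, [], []
        for j in range(n_tags):
            best_i = max(range(n_tags), key=lambda i: prev[i] + trans[i][j])
            score.append(prev[best_i] + trans[best_i][j] + em[j])
            ptr.append(best_i)
        back.append(ptr)
    # Follow back-pointers from the best final tag.
    best = max(range(n_tags), key=lambda j: score[j])
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return [TAGS[i] for i in reversed(path)]

# Emissions that weakly prefer O then I-PER; the mask forces a legal path.
emissions = [[1.0, 0.5, 0.0], [0.0, 0.4, 0.6]]
print(viterbi(emissions, trans))  # -> ['O', 'B-PER']
```

With the mask in place, the decoder routes through B-PER even though I-PER has the higher emission score at the second token; with a neutral (all-zero) transition matrix, the same emissions would decode to the illegal ['O', 'I-PER'].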
I was trying to put the above constraint into the CRF layer, so the code I was trying to modify is:
or it should be:
Which one is the correct way to implement this? Thanks in advance.
Hi,
Again, thank you for the amazing work! I wonder, for NER tasks, in cases like O->I, should I manually set the corresponding entries in the transition probability matrix to zero? I went through the pytorch-crf code and didn't see such settings.
Thanks in advance!
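One caveat about the phrase "set the entries to zero": CRF transition matrices typically store log-scores (log-potentials), not probabilities, so a literal 0.0 entry means a probability factor of exp(0) = 1, i.e. a fully allowed transition. Forbidding a transition corresponds to log(0) = -inf; in practice a large finite negative value is used to avoid NaNs during training. A quick sketch of the arithmetic (the `-1e4` value is an assumed convention, not something pytorch-crf prescribes):

```python
import math

# A literal 0.0 log-score is NOT a forbidden transition:
# it maps to probability exp(0) = 1 before normalization.
assert math.exp(0.0) == 1.0

# Forbidding a transition means pushing its log-score toward log(0) = -inf.
# A large finite negative value underflows to an effective probability of 0
# while keeping the arithmetic finite.
FORBIDDEN = -1e4
print(math.exp(FORBIDDEN))  # -> 0.0 (underflows)
```

So if you do choose to hard-mask transitions, the entries to overwrite should be set to a large negative log-score rather than literal zero.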