About negative samples #3
Comments
The adversarial negative sampling is actually implemented in the loss function. Please check the log_rank_loss function defined in model.py (lines 120-123). The adversarial negative sampling has proven helpful for boosting the performance of (T)KGE models in many cases, including ours.
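For readers following along, here is a minimal pure-Python sketch of the self-adversarial negative sampling loss from RotatE (Sun et al., 2019) that the reply refers to. The function name, the score sign convention (score = -distance, so higher means more plausible), and the default hyperparameters are my assumptions for illustration, not the repo's actual log_rank_loss implementation:

```python
import math

def log_sigmoid(x):
    # Numerically stable log(sigmoid(x)).
    return -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))

def self_adversarial_loss(pos_score, neg_scores, gamma=6.0, alpha=1.0):
    """Self-adversarial negative sampling loss (Sun et al., 2019).

    pos_score / neg_scores are plausibility scores (higher = more plausible);
    for distance-based models like RotatE or TeRo, score = -distance.
    gamma is the margin and alpha the adversarial temperature.
    Negatives are weighted by a softmax over their own scores, so harder
    (higher-scoring) negatives contribute more to the loss.
    """
    # Softmax weights over negative scores (treated as constants in the paper,
    # i.e. no gradient flows through them).
    m = max(alpha * s for s in neg_scores)
    exps = [math.exp(alpha * s - m) for s in neg_scores]
    z = sum(exps)
    weights = [e / z for e in exps]

    # -log sigma(gamma - d(pos)) - sum_i p_i * log sigma(d(neg_i) - gamma)
    pos_term = -log_sigmoid(gamma + pos_score)
    neg_term = -sum(w * log_sigmoid(-(gamma + s))
                    for w, s in zip(weights, neg_scores))
    return pos_term + neg_term
```

Setting alpha to 0 recovers uniform weighting, i.e. plain negative sampling loss; larger alpha focuses the loss on the hardest negatives.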
Thanks for your reply, it really helps me a lot.
Generally, the adversarial negative sampling can improve MRR by 1-3 points in the cases of TeRo and RotatE.
In your paper TeRo, you have said "In this paper, we use the same loss function as the negative sampling loss proposed in (Sun et al.,2019) for optimizing our model."
But in your shared code, I can only find a simple random negative sampling method (lines 80-140 in Train.py).
What do you think about the adversarial negative sampling method used in RotatE (Sun et al., 2019)?