Loss increases instead of decreasing #1
Comments
I suspect it may be related to the dataset, or possibly the network architecture. When I wrote this code I only put the network structure together and ran it on a small dataset, so you may need to experiment with it yourself.
It does seem to be a dataset issue. I also noticed that you use only two graph layers, with the second layer doing the classification directly. Doesn't the paper say that when the final layer feeds a classifier, the multi-head outputs of that layer should be averaged rather than concatenated (the "wh" concatenation)? Your code concatenates them; I changed it to averaging and tried both variants, and the results were about the same. What is your understanding of this?
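For readers following along: the distinction discussed above is between concatenating and averaging multi-head outputs. A minimal NumPy sketch (not the repo's actual code; shapes and variable names are illustrative assumptions) of the two aggregation modes described in the original GAT paper, where intermediate layers concatenate the K head outputs and the final classification layer averages them:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, F = 8, 5, 16          # heads, nodes, per-head output features
# Pretend these are the K attention heads' outputs for one layer
head_outputs = [rng.standard_normal((N, F)) for _ in range(K)]

# Intermediate layer: concatenate head outputs -> (N, K*F)
h_concat = np.concatenate(head_outputs, axis=1)

# Final (classification) layer: average head outputs -> (N, F)
h_mean = np.mean(head_outputs, axis=0)

def softmax(x, axis=-1):
    # Numerically stable softmax
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Class probabilities from the averaged final layer
probs = softmax(h_mean)
print(h_concat.shape)  # (5, 128)
print(probs.shape)     # (5, 16)
```

Averaging keeps the final layer's output dimension equal to the number of classes, whereas concatenation would multiply it by the number of heads, which is why the paper reserves concatenation for hidden layers.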
If it's convenient for you, could we add each other on WeChat to discuss? hbtz2000
I've been a bit busy lately and haven't looked at this code or the paper in quite a while. When I have some free time, I'll look into it and try adjusting it along the lines of your structure.
Hi, when my GAT model trains to about 66% accuracy, the loss starts rising instead of falling. I tried lowering the learning rate, but it didn't help much. I'm using the dataset that comes with the repo. How can I fix this? Do the output probabilities need to be sharpened?
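On the question of "sharpening" the output probabilities: this usually refers to applying a softmax temperature T < 1 so that probability mass concentrates on the largest logit. A hedged illustration (this is not code from the repo, just a sketch of the term being asked about):

```python
import numpy as np

def softmax_with_temperature(logits, T=1.0):
    # Divide logits by temperature T; T < 1 sharpens, T > 1 flattens
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

logits = np.array([2.0, 1.0, 0.1])
p_normal = softmax_with_temperature(logits, T=1.0)
p_sharp = softmax_with_temperature(logits, T=0.5)
# The sharpened distribution puts more mass on the top class
print(p_normal.round(3), p_sharp.round(3))
```

Note that sharpening changes inference-time confidence but does not by itself fix a rising training loss; overfitting or a learning-rate schedule issue is a more common cause of that symptom.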