
How do I solve this problem??? Hoping someone can help!! #33

Open
cwl1999 opened this issue Nov 5, 2021 · 1 comment


cwl1999 commented Nov 5, 2021

keep_dims is deprecated, use keepdims instead
Traceback (most recent call last):
  File "D:/Gra_stu/Pratice/keras-gat-master/examples/gat.py", line 36, in <module>
    H = GraphAttention(8, attn_heads=8, attn_heads_reduction='concat', dropout_rate=0.6, activation='elu', kernel_regularizer=l2(5e-4), attn_kernel_regularizer=l2(5e-4))([H]+G)
  File "D:\Downloads\Anaconda\envs\keras-gcn\lib\site-packages\keras\engine\topology.py", line 603, in __call__
    output = self.call(inputs, **kwargs)
  File "D:\Gra_stu\Pratice\keras-gat-master\keras_gat\graph_attention_layer.py", line 119, in call
    mask = -10e9 * (1.0 - A)
TypeError: unsupported operand type(s) for -: 'float' and 'list'

This corresponds to this line in graph_attention_layer.py: mask = -10e9 * (1.0 - A)
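
The TypeError itself is easy to reproduce in isolation: Python cannot subtract a plain list from a float, which is exactly what happens inside call() when the adjacency input A arrives as a Python list rather than a tensor:

```python
>>> A = [[0, 1], [1, 0]]      # adjacency passed as a plain Python list
>>> mask = -10e9 * (1.0 - A)  # the failing line from graph_attention_layer.py
TypeError: unsupported operand type(s) for -: 'float' and 'list'
```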

@danielegrattarola (Owner)

Not too sure what's going on here, but you should check the inputs to your layers.
A should be a tensor-like object, not a list.

Cheers
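
For reference, here is a minimal sketch of the input plumbing the layer expects, modeled on the repo's examples/gat.py; the values of N and F below are hypothetical placeholders, and the key point is that the adjacency matrix reaches the layer as a single Input tensor rather than a nested list:

```python
from keras.layers import Input, Dropout
from keras.regularizers import l2
from keras_gat import GraphAttention

N = 2708   # number of graph nodes (hypothetical placeholder)
F = 1433   # number of node features (hypothetical placeholder)

X_in = Input(shape=(F,))   # node features, one row per node
A_in = Input(shape=(N,))   # adjacency matrix as a single Keras tensor

H = Dropout(0.6)(X_in)
# The layer receives [features, adjacency]; the second element must be
# a tensor, not a Python list, or `1.0 - A` in call() raises the
# TypeError shown in the traceback above.
H = GraphAttention(8,
                   attn_heads=8,
                   attn_heads_reduction='concat',
                   dropout_rate=0.6,
                   activation='elu',
                   kernel_regularizer=l2(5e-4),
                   attn_kernel_regularizer=l2(5e-4))([H, A_in])
```

Called this way, inputs[1] inside GraphAttention.call() is a Keras tensor, so 1.0 - A broadcasts element-wise instead of failing on a float-minus-list operation.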
