How to run GNNExplainer on users' own model and data? #3
Comments
Hi, to run on your own dataset, you should first train the GNN on the prediction task. You can use the base implementation in the repo or replace it with your own. After saving the model, you can run the explainer, passing in the model checkpoint and specifying the node/graph to explain.
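As a rough sketch of that two-stage workflow (the model class, training stub, and checkpoint path below are placeholders, not the repo's actual files):

```python
import torch
import torch.nn as nn

class TinyGNN(nn.Module):
    """Hypothetical two-layer GNN with row-normalized (mean) aggregation."""
    def __init__(self, in_dim=10, hid_dim=16, n_classes=3):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, n_classes)

    def forward(self, x, adj):
        # Mean aggregation over neighbors: normalize adjacency rows.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h = torch.relu(self.lin1((adj / deg) @ x))
        return self.lin2((adj / deg) @ h)

# Stage 1: train on your prediction task, then save a checkpoint.
model = TinyGNN()
# ... your usual training loop on your own data goes here ...
torch.save(model.state_dict(), "model_ckpt.pth")

# Stage 2: the explainer reloads the frozen model and explains a chosen
# node/graph against it.
model.load_state_dict(torch.load("model_ckpt.pth"))
model.eval()
```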
Hi @RexYing, I think it would be great if you provided an example of how to take an existing model and train it on your own data. For now, it's not clear how to provide the data to gnn-explainer (e.g. format, reading functions, etc.).
Hi, the first step is to make sure that the model's aggregation can take an edge mask value. For example, sum, mean, or attention aggregation can be adapted into a weighted sum or weighted mean. This step should not affect the model's performance at all. After that, train this model and save it. Lastly, create a file similar to explain.py, where you build trainable masks on the features and the adjacency and optimize them. Note that the mask on the adjacency can be sparse: just one value for each edge. A sketch of these steps follows below.
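Here is a minimal sketch of those steps, assuming a dense adjacency and sum aggregation (all names are illustrative, not the repo's actual API; the mask is stored densely here for simplicity, but one value per edge in sparse form works the same way):

```python
import torch
import torch.nn as nn

class MaskedGNN(nn.Module):
    """One-layer GNN whose sum aggregation accepts an optional edge mask.
    With edge_mask=None it is a plain (unweighted) sum, so adding this
    hook does not change the trained model's behavior at all."""
    def __init__(self, in_dim=10, n_classes=3):
        super().__init__()
        self.lin = nn.Linear(in_dim, n_classes)

    def forward(self, x, adj, edge_mask=None):
        if edge_mask is not None:
            adj = adj * edge_mask   # weighted sum instead of plain sum
        return self.lin(adj @ x)    # aggregate neighbors, then classify

def explain_node(model, x, adj, node_idx, epochs=100):
    """Builds trainable feature/adjacency masks and optimizes them
    against the frozen, already-trained model."""
    edge_mask = nn.Parameter(torch.randn_like(adj))   # one value per edge
    feat_mask = nn.Parameter(torch.randn(x.size(1)))  # one value per feature
    opt = torch.optim.Adam([edge_mask, feat_mask], lr=0.01)
    with torch.no_grad():
        label = model(x, adj).argmax(dim=1)[node_idx]  # prediction to preserve
    for _ in range(epochs):
        opt.zero_grad()
        logits = model(x * torch.sigmoid(feat_mask), adj,
                       edge_mask=torch.sigmoid(edge_mask))
        # Keep the model's original prediction while pushing both masks
        # toward sparsity (penalty weights are arbitrary here).
        loss = -torch.log_softmax(logits[node_idx], dim=-1)[label]
        loss = (loss + 0.005 * torch.sigmoid(edge_mask).sum()
                     + 0.1 * torch.sigmoid(feat_mask).sum())
        loss.backward()
        opt.step()
    return torch.sigmoid(edge_mask).detach(), torch.sigmoid(feat_mask).detach()
```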
Hi @RexYing, I'm talking about a different thing. I have 300 adjacency files. How do I read/prepare them for your code? How can I work with your code on my data?
@nd7141 have you figured out how to run it on your own data? EDIT: It seems like PyTorch Geometric has this implemented!
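For reference, in older PyTorch Geometric releases the usage looked roughly like this (the API has since moved to `torch_geometric.explain`, so check your version; the GCN model here is just an illustrative example):

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv, GNNExplainer

class GCN(torch.nn.Module):
    """Any PyG-style model works; it just has to accept (x, edge_index)."""
    def __init__(self, in_dim, n_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, 16)
        self.conv2 = GCNConv(16, n_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return F.log_softmax(self.conv2(x, edge_index), dim=-1)

dataset = Planetoid(root="data", name="Cora")
data = dataset[0]
model = GCN(dataset.num_features, dataset.num_classes)
# ... train the model as usual, then explain a node:
explainer = GNNExplainer(model, epochs=200)
node_feat_mask, edge_mask = explainer.explain_node(10, data.x, data.edge_index)
```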
Hi mdanb, I really appreciate your help! |
Hi,
This is a very interesting work!
The repo provides several datasets to test GNNExplainer. However, it is not obvious to me how a user can run it on his/her own model and dataset. Could you please explain how to do that?
Best,
Jingxuan