
# Directed Message Passing Based on Attention for Prediction of Molecular Properties

Authors: Gong CHEN, Yvon MADAY

D-GATs follow the common framework of message passing neural networks (MPNNs) and explore a bond-level message passing algorithm that relies entirely on the scaled dot-product attention mechanism. This approach outperforms SOTA baselines on 13 of 15 molecular property prediction tasks from the MoleculeNet benchmark.
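The building block of this bond-level message passing is standard scaled dot-product attention, softmax(QK&#8315;&#738;/&#8730;d)V. The following is a minimal, framework-agnostic sketch (in NumPy); the function name, state dimensions, and neighbour aggregation shown here are illustrative assumptions, not the exact D-GATs implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)     # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # attention-weighted sum of values

# Toy example: update the state of one directed bond from the states of
# its incoming neighbour bonds (hypothetical 4-dimensional states).
rng = np.random.default_rng(0)
bond_state = rng.normal(size=(1, 4))        # query: the bond being updated
neighbour_states = rng.normal(size=(3, 4))  # keys/values: incoming bond states
updated = scaled_dot_product_attention(bond_state, neighbour_states, neighbour_states)
print(updated.shape)  # (1, 4)
```

In a real model the queries, keys, and values would be learned linear projections of the bond states, and the aggregation would run over the molecular graph's directed-bond adjacency.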

## Dependencies

Running the code requires Jupyter, RDKit, and PyTorch.

We advise the following environment:

```shell
conda create -n D_GATs python=3.8
conda activate D_GATs
conda install -c anaconda jupyter
conda install -c rdkit rdkit
conda install -c conda-forge pytorch-gpu
# or, for a CPU-only installation:
conda install -c conda-forge pytorch
```

## Utilization

To pre-train a model, adjust the configuration in `config/pretrain_config.json`, then open `Pre-training.ipynb` with Jupyter Notebook and run the code.

To fine-tune a model, adjust the configuration in `config/config.json`, then open `finetune.ipynb` with Jupyter Notebook and run the code.
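Both notebooks read their settings from these JSON files. As a minimal sketch of the pattern, the keys below are purely hypothetical examples, not the actual schema of `config/config.json`:

```python
import json

# Hypothetical example settings; the real keys in config/config.json may differ.
example_config = {"batch_size": 32, "lr": 1e-4, "num_layers": 6}

# Write a stand-in file so the sketch is self-contained.
with open("example_config.json", "w") as f:
    json.dump(example_config, f)

# Loading mirrors how a notebook would read its configuration.
with open("example_config.json") as f:
    config = json.load(f)

print(config["batch_size"])  # 32
```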

## D-GATs' pre-training strategy

In our message passing algorithm, the atom states are updated from the directed bond states, while the bond-state updates are independent of the atom states. As a result, successfully recovering masked atom features requires the model to correctly recover the masked bond features as well. Hence, it suffices to recover atom features in the pre-training stage.
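This mask-and-recover objective can be sketched as a cross-entropy loss restricted to the masked atoms. The sketch below (NumPy) uses an assumed 15% mask ratio, categorical atom features, and random stand-in predictions; none of these choices are taken from the actual pre-training code:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy molecule: 8 atoms, each with a categorical feature in {0, ..., 4}.
n_atoms, n_classes = 8, 5
atom_labels = rng.integers(0, n_classes, size=n_atoms)

# Mask a random subset of atoms (15% here, an assumed ratio).
mask = rng.random(n_atoms) < 0.15

# Stand-in for the model's predictions: logits over atom feature classes.
logits = rng.normal(size=(n_atoms, n_classes))

def masked_cross_entropy(logits, labels, mask):
    """Cross-entropy computed only on the masked positions."""
    if not mask.any():
        return 0.0
    z = logits - logits.max(axis=-1, keepdims=True)   # stable log-softmax
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    nll = -log_probs[np.arange(len(labels)), labels]  # per-atom negative log-likelihood
    return float(nll[mask].mean())                    # average over masked atoms only

loss = masked_cross_entropy(logits, atom_labels, mask)
print(loss >= 0.0)  # True
```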

## D-GATs' performance

Results on classification tasks:

Results on regression tasks:

## Citation

Please cite this paper if you use the code.

not published yet

## License

This project is licensed under the terms of the MIT license. See LICENSE for additional details.