Add MatterSim document and examples
Lingyu-Kong committed Dec 16, 2024
1 parent 62e16b6 commit cf4d565
Showing 14 changed files with 760 additions and 50 deletions.
15 changes: 13 additions & 2 deletions docs/backbones/m3gnet.md
@@ -1,8 +1,19 @@
# M3GNet Backbone

The M3GNet backbone implements the M3GNet model architecture in MatterTune. It provides a powerful graph neural network designed specifically for materials science applications.
The M3GNet backbone implements the M3GNet model architecture in MatterTune. It provides a powerful graph neural network designed specifically for materials science applications. In MatterTune, we use the M3GNet model implemented by MatGL and pretrained on the MPTraj dataset.

## Overview
## Installation

```bash
conda create -n matgl-tune python=3.10 -y
pip install matgl
pip install torch==2.2.1+cu121 -f https://download.pytorch.org/whl/torch_stable.html
pip uninstall dgl
pip install dgl -f https://data.dgl.ai/wheels/torch-2.2/cu121/repo.html
pip install dglgo -f https://data.dgl.ai/wheels-test/repo.html
```
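
After installation, a quick import check can confirm that `matgl` and the CUDA build of `dgl` resolve correctly. A minimal sketch (the checkpoint name below is MatGL's standard identifier for its MP-trained M3GNet potential and is given only as an example; substitute whichever pretrained model you plan to fine-tune):

```python
import torch
import dgl
import matgl

# Confirm the CUDA-enabled builds are active
print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("dgl:", dgl.__version__)

# Load a pretrained M3GNet potential from MatGL to exercise the install end-to-end
potential = matgl.load_model("M3GNet-MP-2021.2.8-PES")
print(potential)
```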

## Key Features

M3GNet supports predicting:
- Total energy (with energy conservation)
101 changes: 101 additions & 0 deletions docs/backbones/mattersim.md
@@ -0,0 +1,101 @@
# MatterSim Backbone

> Note: As of the latest MatterTune update, MatterSim has only released its M3GNet model.

The MatterSim backbone integrates the MatterSim model architecture into MatterTune. MatterSim is a foundational atomistic model designed to simulate materials properties across a wide range of elements, temperatures, and pressures.

## Installation

We strongly recommend installing MatterSim from source:

```bash
git clone [email protected]:microsoft/mattersim.git
cd mattersim
```

Find line 41 of MatterSim's `pyproject.toml`, which reads `"pydantic==2.9.2",`, and change it to `"pydantic>=2.9.2",` (a command-line alternative is sketched after the install commands below). After making this change, install MatterSim by running:

```bash
mamba env create -f environment.yaml
mamba activate mattersim
uv pip install -e .
python setup.py build_ext --inplace
```
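
If you would rather make the `pyproject.toml` edit from the command line before running the commands above, a one-line substitution along these lines should work (a sketch; double-check the file afterwards, since the pinned pydantic version may change in future MatterSim releases):

```bash
sed -i 's/"pydantic==2.9.2",/"pydantic>=2.9.2",/' pyproject.toml
```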

## Key Features

- Pretrained on materials data spanning a wide range of elements, temperatures, and pressures.
- Flexible model architecture selection:
  - MatterSim-v1.0.0-1M: a smaller M3GNet variant that is faster to run.
  - MatterSim-v1.0.0-5M: a larger M3GNet variant that is more accurate.
  - TO BE RELEASED: a Graphormer model at an even larger parameter scale.
- Support for property predictions:
  - Energy (extensive/intensive)
  - Forces (conservative for M3GNet, non-conservative for Graphormer)
  - Stresses (conservative for M3GNet, non-conservative for Graphormer)
  - Graph-level properties (available with Graphormer)

## Configuration

Here's a complete example showing how to configure the MatterSim backbone:

```python
from mattertune import configs as MC
from pathlib import Path

config = MC.MatterTunerConfig(
    model=MC.MatterSimBackboneConfig(
        # Required: Pre-trained checkpoint to load
        pretrained_model="MatterSim-v1.0.0-5M",

        # Graph construction settings
        graph_convertor=MC.MatterSimGraphConvertorConfig(
            twobody_cutoff=5.0,    # Cutoff distance for two-body interactions
            has_threebody=True,    # Whether to include three-body interactions
            threebody_cutoff=4.0,  # Cutoff distance for three-body interactions
        ),

        # Properties to predict
        properties=[
            # Energy prediction
            MC.EnergyPropertyConfig(
                loss=MC.MAELossConfig(),
                loss_coefficient=1.0,
            ),

            # Force prediction (conservative)
            MC.ForcesPropertyConfig(
                loss=MC.MAELossConfig(),
                loss_coefficient=10.0,
                conservative=True,
            ),

            # Stress prediction (conservative)
            MC.StressesPropertyConfig(
                loss=MC.MAELossConfig(),
                loss_coefficient=1.0,
                conservative=True,
            ),
        ],

        # Optimizer settings
        optimizer=MC.AdamWConfig(lr=1e-4),

        # Optional: Learning rate scheduler
        lr_scheduler=MC.CosineAnnealingLRConfig(
            T_max=100,
            eta_min=1e-6,
        ),
    ),
)
```
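
Once the config is assembled (and a data config, omitted above, has been added), fine-tuning follows the same pattern used in the bundled notebooks. A minimal sketch, assuming `MatterTuner` is importable from the top-level `mattertune` package and that the returned trainer is a Lightning `Trainer`:

```python
from mattertune import MatterTuner

# Launch fine-tuning; tune() returns an output object holding the model and the trainer
tune_output = MatterTuner(config).tune()
model, trainer = tune_output.model, tune_output.trainer

# If the trainer is a Lightning Trainer, the fine-tuned weights can be checkpointed
trainer.save_checkpoint("finetuned-mattersim.ckpt")
```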

## Examples & Notebooks

A notebook tutorial on how to fine-tune and use the MatterSim model can be found in `notebooks/mattersim-waterthermo.ipynb` ([link](https://github.com/Fung-Lab/MatterTune/blob/main/notebooks/mattersim-waterthermo.ipynb)).

Under `water-thermodynamics` ([link](https://github.com/Fung-Lab/MatterTune/tree/main/examples/water-thermodynamics)), we provide a more advanced example that fine-tunes MatterSim on PES data and applies the fine-tuned model to MD simulation.

## License

The MatterSim backbone is available under the MIT License.
1 change: 1 addition & 0 deletions docs/index.md
@@ -31,6 +31,7 @@ backbones/jmp
backbones/m3gnet
backbones/orb
backbones/eqv2
backbones/mattersim
```

```{toctree}
18 changes: 18 additions & 0 deletions docs/installation.md
@@ -50,6 +50,24 @@ pip install "git+https://github.com/FAIR-Chem/fairchem.git@omat24#subdirectory=p
pip install ase "e3nn>=0.5" hydra-core lmdb numba "numpy>=1.26,<2.0" orjson "pymatgen>=2023.10.3" submitit tensorboard "torch>=2.4" wandb torch_geometric h5py netcdf4 opt-einsum spglib
```
### MatterSim
We strongly recommend installing MatterSim from source:
```bash
git clone [email protected]:microsoft/mattersim.git
cd mattersim
```
Find line 41 of MatterSim's `pyproject.toml`, which reads `"pydantic==2.9.2",`, and change it to `"pydantic>=2.9.2",`. After making this change, install MatterSim by running:
```bash
mamba env create -f environment.yaml
mamba activate mattersim
uv pip install -e .
python setup.py build_ext --inplace
```
## MatterTune Package Installation
```{important}
1 change: 1 addition & 0 deletions docs/introduction.md
@@ -14,6 +14,7 @@ Seamlessly work with multiple state-of-the-art pre-trained models including:
- EquiformerV2
- M3GNet
- ORB
- MatterSim

### Flexible Property Predictions
Support for various molecular and materials properties:
5 changes: 5 additions & 0 deletions docs/license.md
@@ -22,6 +22,11 @@ BSD 3-Clause License
Apache License 2.0
[ORB License](https://github.com/orbital-materials/orb-models/blob/main/LICENSE)

### MatterSim Backbone
MIT License
[MatterSim License](https://github.com/microsoft/mattersim/blob/main/LICENSE.txt)


```{important}
Please ensure compliance with the respective licenses when using specific model backbones in your project. For commercial use cases, carefully review each backbone's license terms or contact the respective authors for licensing options.
```
4 changes: 2 additions & 2 deletions notebooks/eqv2-omat.ipynb
@@ -1092,8 +1092,8 @@
"\n",
"\n",
"hp = hparams()\n",
"model = MatterTuner(hp).tune()\n",
"model"
"tune_output = MatterTuner(hp).tune()\n",
"model, trainer = tune_output.model, tune_output.trainer"
]
},
{
4 changes: 2 additions & 2 deletions notebooks/jmp-omat-autosplit.ipynb
@@ -922,8 +922,8 @@
"\n",
"\n",
"hp = hparams()\n",
"model = MatterTuner(hp).tune()\n",
"model"
"tune_output = MatterTuner(hp).tune()\n",
"model, trainer = tune_output.model, tune_output.trainer"
]
},
{
6 changes: 3 additions & 3 deletions notebooks/m3gnet-waterthermo.ipynb
@@ -219,7 +219,7 @@
" ## Data hparams\n",
" hparams.data = MC.AutoSplitDataModuleConfig.draft()\n",
" hparams.data.dataset = MC.XYZDatasetConfig.draft()\n",
" hparams.data.dataset.src = \"./data/water_ef.xyz\"\n",
" hparams.data.dataset.src = Path(\"../examples/water-thermodynamics/data/water_ef.xyz\")\n",
" hparams.data.train_split = 0.8\n",
" hparams.data.batch_size = 1\n",
"\n",
@@ -230,8 +230,8 @@
" return hparams\n",
"\n",
"\n",
"model = MatterTuner(hparams()).tune()\n",
"model\n",
"tune_output = MatterTuner(hp).tune()\n",
"model, trainer = tune_output.model, tune_output.trainer\n",
"\n",
"# pip install torch_sparse -f https://data.pyg.org/whl/torch-2.2.1+cu121.html"
]