
Commit

Merge pull request #86 from choderalab/fix-ci
Fix ci
yuanqing-wang authored Oct 6, 2021
2 parents 373d47e + 83b9c93 commit 68fa608
Showing 10 changed files with 162 additions and 26 deletions.
11 changes: 0 additions & 11 deletions .github/workflows/CI.yaml
Original file line number Diff line number Diff line change
@@ -1,16 +1,5 @@
name: CI

on: [push]
jobs:
build:
name: Push Sphinx Pages
runs-on: ubuntu-latest
steps:
- uses: seanzhengw/sphinx-pages@master
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
create_readme: true

on:
pull_request:
branches:
38 changes: 38 additions & 0 deletions .github/workflows/sphinx.yml
@@ -0,0 +1,38 @@
name: "Build Doc"
on:
- push

jobs:
docs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.7
uses: actions/setup-python@v2
with:
python-version: 3.7

- uses: conda-incubator/setup-miniconda@v2
with:
installer-url: ${{ matrix.conda-installer }}
python-version: ${{ matrix.python-version }}
activate-environment: test
channel-priority: true
environment-file: devtools/conda-envs/espaloma.yaml
auto-activate-base: false
use-mamba: true

- name: Install package
shell: bash -l {0}
run: |
python -m pip install --no-deps .
- name: Compile
shell: bash -l {0}
run: |
python -m pip install sphinx sphinx-rtd-theme numpydoc
cd docs && make html
- name: Deploy
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: docs/_build/html
19 changes: 12 additions & 7 deletions README.md
@@ -1,20 +1,25 @@
espaloma
espaloma: **E**xtensible **S**urrogate **P**otenti**al** **O**ptimized by **M**essage-passing **A**lgorithms
==============================
[//]: # (Badges)
[![CI](https://github.com/choderalab/espaloma/actions/workflows/CI.yaml/badge.svg?branch=master)](https://github.com/choderalab/espaloma/actions/workflows/CI.yaml)
[![Total alerts](https://img.shields.io/lgtm/alerts/g/choderalab/espaloma.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/choderalab/espaloma/alerts/)
[![Language grade: Python](https://img.shields.io/lgtm/grade/python/g/choderalab/espaloma.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/choderalab/espaloma/context:python)
[![docs stable](https://img.shields.io/badge/docs-stable-5077AB.svg?logo=read%20the%20docs)](https://espaloma.wangyq.net/)

Extensible Surrogate Potential of Ab initio Learned and Optimized by Message-passing Algorithms

Rather than:
Source code for [Wang Y, Fass J, and Chodera JD "End-to-End Differentiable Construction of Molecular Mechanics Force Fields."](https://arxiv.org/abs/2010.01196)

molecule ---(atom typing schemes)---> atom-types ---(atom typing schemes)---> bond-, angle-, torsion-types ---(table lookup)---> force field parameters
![abstract](docs/_static/espaloma_abstract_v2-2.png)

we want to have

molecule ---(graph nets)---> atom-embedding ---(pooling)---> hypernode-embedding ---(feedforward neural networks)---> force field parameters

# Paper Abstract
Molecular mechanics (MM) potentials have long been a workhorse of computational chemistry.
Leveraging accuracy and speed, these functional forms find use in a wide variety of applications in biomolecular modeling and drug discovery, from rapid virtual screening to detailed free energy calculations.
Traditionally, MM potentials have relied on human-curated, inflexible, and poorly extensible discrete chemical perception rules (_atom types_) for applying parameters to small molecules or biopolymers, making it difficult to optimize both types and parameters to fit quantum chemical or physical property data.
Here, we propose an alternative approach that uses _graph neural networks_ to perceive chemical environments, producing continuous atom embeddings from which valence and nonbonded parameters can be predicted using invariance-preserving layers.
Since all stages are built from smooth neural functions, the entire process---spanning chemical perception to parameter assignment---is modular and end-to-end differentiable with respect to model parameters, allowing new force fields to be easily constructed, extended, and applied to arbitrary molecules.
We show that this approach is not only sufficiently expressive to reproduce legacy atom types, but that it can learn and extend existing molecular mechanics force fields, construct entirely new force fields applicable to both biopolymers and small molecules from quantum chemical calculations, and even learn to accurately predict free energies from experimental observables.

# Manifest

@@ -48,6 +53,6 @@ This software is licensed under [MIT license](https://opensource.org/licenses/MI

Copyright (c) 2020, Chodera Lab at Memorial Sloan Kettering Cancer Center and Authors:
Authors:
- Yuanqing Wang
- [Yuanqing Wang](http://www.wangyq.net)
- Josh Fass
- John D. Chodera
3 changes: 3 additions & 0 deletions devtools/conda-envs/espaloma.yaml
@@ -4,6 +4,7 @@ channels:
- dglteam
- openeye
- defaults
- anaconda
dependencies:
# Base dependencies
- python
@@ -29,3 +30,5 @@ dependencies:
- nose-timer
- coverage
- qcportal
- sphinx
- sphinx_rtd_theme
Binary file added docs/_static/espaloma_abstract_v2-2.png
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -16,7 +16,7 @@
import os
import sys

sys.path.insert(0, os.path.abspath('../espaloma'))
sys.path.insert(0, os.path.abspath('..'))

import espaloma
from espaloma import mm, nn, data, graphs
74 changes: 73 additions & 1 deletion docs/getting_started.rst
@@ -1,4 +1,76 @@
Getting Started
===============

This page details how to get started with espaloma.
.. image:: _static/espaloma_abstract_v2-2.png

Paper Abstract
--------------
Molecular mechanics (MM) potentials have long been a workhorse of computational chemistry.
Leveraging accuracy and speed, these functional forms find use in a wide variety of applications in biomolecular modeling and drug discovery, from rapid virtual screening to detailed free energy calculations.
Traditionally, MM potentials have relied on human-curated, inflexible, and poorly extensible discrete chemical perception rules (*atom types*) for applying parameters to small molecules or biopolymers, making it difficult to optimize both types and parameters to fit quantum chemical or physical property data.
Here, we propose an alternative approach that uses *graph neural networks* to perceive chemical environments, producing continuous atom embeddings from which valence and nonbonded parameters can be predicted using invariance-preserving layers.
Since all stages are built from smooth neural functions, the entire process---spanning chemical perception to parameter assignment---is modular and end-to-end differentiable with respect to model parameters, allowing new force fields to be easily constructed, extended, and applied to arbitrary molecules.
We show that this approach is not only sufficiently expressive to reproduce legacy atom types, but that it can learn and extend existing molecular mechanics force fields, construct entirely new force fields applicable to both biopolymers and small molecules from quantum chemical calculations, and even learn to accurately predict free energies from experimental observables.

Minimal Example
---------------
::

import torch, dgl, espaloma as esp

# retrieve QM dataset used to train OpenFF 1.0.0 ("parsley") small molecule force field
dataset = esp.data.dataset.GraphDataset.load("parsley").view(batch_size=128)

# define Espaloma stage I: graph -> atom latent representation
representation = esp.nn.Sequential(
layer=esp.nn.layers.dgl_legacy.gn("SAGEConv"), # use SAGEConv implementation in DGL
config=[128, "relu", 128, "relu", 128, "relu"], # 3 layers, 128 units, ReLU activation
)

# define Espaloma stage II and III:
# atom latent representation -> bond, angle, and torsion representation and parameters
readout = esp.nn.readout.janossy.JanossyPooling(
in_features=128,
config=[128, "relu", 128, "relu", 128, "relu"],
out_features={ # define modular MM parameters Espaloma will assign
1: {"e": 1, "s": 1},
2: {"coefficients": 2}, # bond linear combination
3: {"coefficients": 3}, # angle linear combination
4: {"k": 6}, # torsion barrier heights (can be positive or negative)
},
)

# compose all three Espaloma stages into an end-to-end model
espaloma_model = torch.nn.Sequential(
representation,
readout,
esp.mm.geometry.GeometryInGraph(),
esp.mm.energy.EnergyInGraph(),
esp.nn.readout.charge_equilibrium.ChargeEquilibrium(),
)

# define training metric
metrics = [
esp.metrics.GraphMetric(
base_metric=torch.nn.MSELoss(), # use mean-squared error loss
between=['u', "u_ref"], # between predicted and QM energies
level="g",
),
esp.metrics.GraphMetric(
base_metric=torch.nn.MSELoss(), # use mean-squared error loss
between=['q', "q_hat"], # between predicted and reference charges
level="n1",
)
]

# fit Espaloma model to training data
results = esp.Train(
ds_tr=dataset, net=espaloma_model, metrics=metrics,
device=torch.device('cuda:0'), n_epochs=5000,
optimizer=lambda net: torch.optim.Adam(net.parameters(), 1e-3), # use Adam optimizer
).run()




10 changes: 5 additions & 5 deletions espaloma/mm/functional.py
@@ -161,12 +161,12 @@ def periodic(

k = k[:, None, :].repeat(1, x.shape[1], 1)

energy = (k * (1.0 + cos_n_theta_minus_phases)).sum(dim=-1)
# energy = (k * (1.0 + cos_n_theta_minus_phases)).sum(dim=-1)

# energy = (
# torch.nn.functional.relu(k) * (cos_n_theta_minus_phases + 1.0)
# -torch.nn.functional.relu(0.0-k) * (cos_n_theta_minus_phases - 1.0)
# ).sum(dim=-1)
energy = (
torch.nn.functional.relu(k) * (cos_n_theta_minus_phases + 1.0)
-torch.nn.functional.relu(0.0-k) * (cos_n_theta_minus_phases - 1.0)
).sum(dim=-1)


return energy
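The sign-split form adopted above can be checked in isolation: for k ≥ 0 it reduces to the original k·(1 + cos(nθ − φ)), while for k < 0 it differs from the original only by the constant −2k, so the energy stays non-negative and the forces (derivatives with respect to θ) are unchanged. A minimal sketch of this check, using plain Python scalars as a stand-in for the tensor code:

```python
import math

def relu(x):
    # scalar stand-in for torch.nn.functional.relu
    return max(x, 0.0)

def energy_old(k, theta, n=1, phase=0.0):
    # original form: goes negative when k < 0
    c = math.cos(n * theta - phase)
    return k * (1.0 + c)

def energy_new(k, theta, n=1, phase=0.0):
    # ReLU-split form from this commit: non-negative for any sign of k
    c = math.cos(n * theta - phase)
    return relu(k) * (c + 1.0) - relu(-k) * (c - 1.0)

# identical for positive k; shifted by the constant -2k for negative k
assert energy_new(10.0, 1.0) == energy_old(10.0, 1.0)
k = -10.0
for theta in (0.0, 1.0, 2.5):
    assert abs(energy_new(k, theta) - (energy_old(k, theta) - 2 * k)) < 1e-12
```

Since the two forms differ only by a constant offset per term, any downstream quantity that depends on energy differences or gradients is unaffected.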
2 changes: 1 addition & 1 deletion espaloma/mm/tests/test_openmm_consistency.py
@@ -48,7 +48,7 @@ def _create_torsion_sim(
# TODO: mark this properly: want to test periodicities 1..6, +ve, -ve k
# @pytest.mark.parametrize(periodicity=[1,2,3,4,5,6], k=[-10 * omm_energy_unit, +10 * omm_energy_unit])
def test_periodic_torsion(
periodicity=4, k=-10 * omm_energy_unit, n_samples=100
periodicity=4, k=10 * omm_energy_unit, n_samples=100
):
""" Using simulated torsion scan, test if espaloma torsion energies and
OpenMM torsion energies agree.
29 changes: 29 additions & 0 deletions espaloma/nn/readout/janossy.py
@@ -423,3 +423,32 @@ def forward(self, g):
)

return g


class ExpCoefficients(torch.nn.Module):
def forward(self, g):
import math
g.nodes['n2'].data['coefficients'] = g.nodes['n2'].data['log_coefficients'].exp()
g.nodes['n3'].data['coefficients'] = g.nodes['n3'].data['log_coefficients'].exp()
return g

class LinearMixtureToOriginal(torch.nn.Module):
def forward(self, g):
import math
g.nodes['n2'].data['k'], g.nodes['n2'].data['eq'] = esp.mm.functional.linear_mixture_to_original(
g.nodes['n2'].data['coefficients'][:, 0][:, None],
g.nodes['n2'].data['coefficients'][:, 1][:, None],
1.5, 6.0,
)

g.nodes['n3'].data['k'], g.nodes['n3'].data['eq'] = esp.mm.functional.linear_mixture_to_original(
g.nodes['n3'].data['coefficients'][:, 0][:, None],
g.nodes['n3'].data['coefficients'][:, 1][:, None],
0.0, math.pi
)

g.nodes['n3'].data.pop('coefficients')
g.nodes['n2'].data.pop('coefficients')
return g
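The `linear_mixture_to_original` call above converts two per-term coefficients anchored at fixed endpoints (1.5 and 6.0 for bonds, 0 and π for angles) into a conventional force constant and equilibrium value. The exact convention lives in `esp.mm.functional`; one plausible reading, treating the coefficients as weights of two harmonic wells u(x) = c1·(x − b1)² + c2·(x − b2)², is sketched below. The function body here is an illustrative assumption derived by completing the square, not the library's verified implementation (it may differ by, e.g., a factor of 1/2 in k):

```python
def linear_mixture_to_original(c1, c2, b1, b2):
    # Hypothetical sketch: combining two harmonic wells anchored at b1, b2
    # yields a single well with summed curvature and a curvature-weighted
    # equilibrium, up to a constant energy offset.
    k = c1 + c2                            # combined force constant
    eq = (c1 * b1 + c2 * b2) / (c1 + c2)   # weighted equilibrium value
    return k, eq

def mixture(x, c1, c2, b1, b2):
    return c1 * (x - b1) ** 2 + c2 * (x - b2) ** 2

def single_well(x, k, eq):
    return k * (x - eq) ** 2

# sanity check: the mixture and the recovered single well differ only by
# a constant offset, so they produce identical forces
c1, c2, b1, b2 = 2.0, 3.0, 1.5, 6.0
k, eq = linear_mixture_to_original(c1, c2, b1, b2)
offsets = {round(mixture(x, c1, c2, b1, b2) - single_well(x, k, eq), 9)
           for x in (0.0, 1.0, 4.2)}
assert len(offsets) == 1
```

Under this reading, exponentiating the learned `log_coefficients` (as `ExpCoefficients` does) keeps both weights positive, which guarantees a well-defined equilibrium between the two anchor points.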

