`metatensor-torch`-based workflows for training descriptor-based equivariant neural networks to predict at DFT-level accuracy:
- real-space electronic density scalar fields decomposed on a basis (molecular & periodic systems)
- electronic density of states (DOS) (periodic systems)
Authors:
- Joseph W. Abbott, PhD Student @ Lab COSMO, EPFL
- Wei Bin How, PhD Student @ Lab COSMO, EPFL
Note: under active development, breaking changes are likely!
Electronic densities, such as the electron density and local density of states, are central quantities for understanding the electronic properties of molecules and materials on the atomic scale. First-principles quantum simulations such as density-functional theory (DFT) are able to accurately predict such fields as a linear combination of single-particle solutions to the Kohn-Sham equations. While reliable and accurate, such methods scale unfavourably with the number of electrons in the system.
Machine learning methods offer a complementary solution to probing the electronic structure of matter on the atomic scale. With a sufficiently expressive model, one can learn the mapping between nuclear geometry and real-space electronic density and predict such quantities with more favourable scaling. Typically, predictions can be used to accelerate DFT by providing initial guesses, or to probe electronic structure directly.
There are many approaches to learn the aforementioned mapping. In the density fitting approach, the real-space target electronic density $\rho(\mathbf{r})$ is decomposed onto an atom-centered basis:

$$\rho(\mathbf{r}) \approx \sum_{i} \sum_{nlm} c^{i}_{nlm} \, \phi_{nlm}(\mathbf{r} - \mathbf{r}_i),$$

where $c^{i}_{nlm}$ are expansion coefficients and $\phi_{nlm}$ are basis functions (typically products of radial functions and spherical harmonics) centered on atom $i$ at position $\mathbf{r}_i$.

An equivariant model is then trained to predict the coefficients $c^{i}_{nlm}$ directly from the nuclear coordinates, from which the real-space density can be rebuilt.
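To make the expansion concrete, the sketch below evaluates such a decomposition on a real-space grid from a set of coefficients, using isotropic Gaussians as stand-in basis functions. It is an illustration of the formula above only, not `rholearn`'s reconstruction routine; all arrays and the basis choice are placeholders.

```python
import numpy as np

def basis_fn(points, center, alpha):
    """Stand-in atom-centered basis function: an isotropic Gaussian."""
    return np.exp(-alpha * np.sum((points - center) ** 2, axis=-1))

# Toy system: two atoms, one basis function and one coefficient per atom
centers = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])  # atomic positions r_i
coeffs = np.array([0.8, 1.2])                            # expansion coefficients c_i
alphas = np.array([1.0, 0.5])                            # basis widths

# Real-space grid on which to evaluate the reconstructed field
axis = np.linspace(-3.0, 4.0, 40)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
points = grid.reshape(-1, 3)

# rho(r) ~ sum_i c_i * phi(r - r_i), following the expansion above
rho = sum(c * basis_fn(points, r_i, a) for c, r_i, a in zip(coeffs, centers, alphas))
print(rho.shape)  # (64000,) field values on the flattened grid
```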
For one of the original workflows for predicting the electron density under the density-fitting framework, readers are referred to SALTED. This uses a symmetry-adapted Gaussian process regression (SA-GPR) method, via sparse kernel ridge regression, to learn and predict the basis expansion coefficients.
`rholearn` also operates under the density fitting approach, learning the mapping from the nuclear coordinates to the basis expansion coefficients of the target density. `rholearn` is integrated with the electronic structure code FHI-aims for both data generation and building of real-space fields from predicted coefficients.
`rholearn` aims to improve the scalability of the density-fitting approach to learning electronic densities. It is built on top of a modular software ecosystem, with the following packages forming the main components of the workflow:
- `metatensor` (GitHub) is used as the self-describing block-sparse data storage format, wrapping multidimensional tensors with metadata. The subpackages `metatensor-operations` and `metatensor-learn` provide convenient sparse operations and ML building blocks, respectively, that operate on the `metatensor.TensorMap` object (a minimal example is sketched after this list).
- `rascaline` (GitHub) is used to transform the nuclear coordinates into local equivariant descriptors that encode physical symmetries and geometric information for input into the neural network.
- PyTorch is used as the learning framework, allowing definition of arbitrarily complex neural networks that can be trained by minibatch gradient descent.
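As a flavour of the metatensor data format mentioned above, the snippet below assembles a tiny TensorMap with a single block. It is a minimal sketch assuming the metatensor Python API (`Labels`, `TensorBlock`, `TensorMap`); the key and label names are arbitrary placeholders rather than the metadata conventions used by `rholearn`.

```python
import numpy as np
from metatensor import Labels, TensorBlock, TensorMap

# Values for one block: 2 samples (atoms) x 3 components (m = -1, 0, 1) x 4 properties (n)
values = np.random.rand(2, 3, 4)

block = TensorBlock(
    values=values,
    samples=Labels("atom", np.array([[0], [1]], dtype=np.int32)),
    components=[Labels("m", np.array([[-1], [0], [1]], dtype=np.int32))],
    properties=Labels("n", np.arange(4, dtype=np.int32).reshape(-1, 1)),
)

# The keys carry the block-sparse metadata, here a single angular channel l = 1
tensor = TensorMap(Labels("l", np.array([[1]], dtype=np.int32)), [block])
print(tensor.block(0).values.shape)  # (2, 3, 4)
```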
Leveraging the speed- and memory-efficient operations of `torch`, and building on top of `metatensor` and `rascaline`, descriptors, models, and learning methodologies can be flexibly prototyped and customized for a specific learning task.
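For instance, a quick prototype of the final regression step — mapping per-atom descriptor vectors to basis expansion coefficients by minibatch gradient descent — might look like the sketch below. The random tensors, layer widths, and plain multilayer perceptron are placeholders for illustration, not the model shipped with `rholearn`.

```python
import torch

# Random stand-ins for per-atom descriptors (inputs) and basis coefficients (targets)
n_samples, n_features, n_coeffs = 256, 128, 32
X = torch.randn(n_samples, n_features)
y = torch.randn(n_samples, n_coeffs)

model = torch.nn.Sequential(
    torch.nn.Linear(n_features, 64),
    torch.nn.SiLU(),
    torch.nn.Linear(64, n_coeffs),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(X, y), batch_size=32, shuffle=True
)

# Minibatch gradient descent on a mean-squared-error loss
for epoch in range(10):
    for descriptors, coeffs in loader:
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(descriptors), coeffs)
        loss.backward()
        optimizer.step()
```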
With a working `conda` installation, first set up an environment:
conda create -n rho python==3.12
conda activate rho
Then clone and install `rholearn`:
git clone https://github.com/lab-cosmo/rholearn.git
cd rholearn
# Specify CPU-only torch
pip install --extra-index-url https://download.pytorch.org/whl/cpu .
Running `tox` from the top directory will run linting and formatting.
To run some tests (currently limited to testing `rholearn.loss`), run `pytest tests/rholearn/loss.py`.
For generating reference data using the `aims_interface` of `rholearn`, a working installation of FHI-aims >= 240926 is required. FHI-aims is not open source but is free for academic use. Follow the instructions on their website fhi-aims.org/get-the-code to get and build the code. The end result should be an executable, compiled for your specific system.
There are also useful tutorials on the basics of running FHI-aims here.
In a run directory, user options are defined in YAML files named "dft-options.yaml", "hpc-options.yaml", and "ml-options.yaml". Any options specified in these files overwrite the defaults.
Default options can be found in the rholearn/options/ directory, and some templates for user options can be found in the examples/options/ directory.
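Conceptually, the override behaviour amounts to loading the defaults and then letting any keys found in the user files take precedence. The sketch below illustrates that semantics with PyYAML and a recursive dictionary merge; it is not `rholearn`'s actual option-loading code, the option keys shown are hypothetical, and whether nested sections merge recursively or are replaced wholesale is an assumption here.

```python
import yaml

def merge(defaults: dict, overrides: dict) -> dict:
    """Recursively overwrite default options with user-specified ones."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# Hypothetical default option keys, for illustration only
defaults = {"training": {"batch_size": 16, "n_epochs": 100}}

# User options from the run directory take precedence over the defaults
with open("ml-options.yaml") as f:
    user = yaml.safe_load(f) or {}
print(merge(defaults, user))
```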
For the electron density workflow, data can be generated with the following:
rholearn_run_scf # run SCF with FHI-aims
rholearn_process_scf # process SCF outputs
rholearn_setup_ri_fit # setup RI fitting calculation
rholearn_run_ri_fit # run RI fitting with FHI-aims
rholearn_process_ri_fit # process RI outputs
and model training and evaluation run with:
rholearn_train # train model
rholearn_eval # evaluate model
For the DOS workflow, data can be generated with the following:
doslearn_run_scf # run SCF with FHI-aims
doslearn_process_scf # process SCF outputs
and model training and evaluation run with:
doslearn_train # train model
doslearn_eval # evaluate model
For a more in-depth walkthrough of the functionality, see the following tutorials:
- rholearn tutorial on data generation using FHI-aims and model training using `rholearn` to predict the electron density decomposed on a basis.
- doslearn tutorial on data generation using FHI-aims and model training using `doslearn` to predict the electronic density of states.
`rholearn` can be cited as:
@software{abbott_2024_13891847,
author = {Abbott, Joseph W. and
How, Wei Bin and
Fraux, Guillaume and
Ceriotti, Michele},
title = {lab-cosmo/rholearn: rholearn v0.1.0},
month = oct,
year = 2024,
publisher = {Zenodo},
version = {v0.1.0},
doi = {10.5281/zenodo.13891847},
url = {https://doi.org/10.5281/zenodo.13891847}
}