Create a `torch.nn.Module` for the composition weights #269
Labels: Discussion (issues to be discussed by the contributors), Infrastructure: Miscellaneous (general infrastructure issues), Priority: Medium (important issues to address after high priority)
Composition weights are used by almost every architecture to subtract the energy of the chemical composition of the dataset, which makes it easier for the actual architecture to learn the targets. In metatrain, there is a utility function to compute the composition weights:
https://github.com/lab-cosmo/metatrain/blob/40d4d6a30d9add0fa4d1e9d2ce9e2635423fcc48/src/metatrain/utils/composition.py#L9
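As a rough illustration of what such a utility does (a minimal sketch with plain tensors, not the actual metatrain function, which operates on `Systems` and `TensorMap`s): per-species baseline energies can be obtained from a least-squares fit of the total energies against the species counts of each structure.

```python
import torch

def fit_composition_weights(compositions: torch.Tensor, energies: torch.Tensor) -> torch.Tensor:
    """Least-squares fit of per-species baseline energies.

    compositions: (n_structures, n_species) counts of each species per structure.
    energies: (n_structures,) total energies.
    Returns weights w such that compositions @ w approximates energies.
    """
    # torch.linalg.lstsq solves min ||compositions @ w - energies||^2
    return torch.linalg.lstsq(compositions, energies.unsqueeze(-1)).solution.squeeze(-1)

# Toy example: two species, energies exactly linear in composition,
# so the fit recovers the true per-species energies.
comps = torch.tensor([[2.0, 1.0], [1.0, 3.0], [4.0, 0.0]])
w_true = torch.tensor([-1.5, -3.0])
energies = comps @ w_true
w = fit_composition_weights(comps, energies)
```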
However, so far each architecture has to apply these composition weights on its own by looping over the `Systems` and creating the output `TensorBlock`s. See for example the GAP architecture: https://github.com/lab-cosmo/metatrain/blob/40d4d6a30d9add0fa4d1e9d2ce9e2635423fcc48/src/metatrain/experimental/gap/model.py#L242
I think it would be useful to create a `torch.nn.Module` for the `CompositionEnergy` to make this essential part of an architecture easier to use for the devs. Even though we could, I wouldn't expose this as a public architecture to users right now. The idea of such a `Module` is also in line with the short-range `Module` discussed in #265, where we concluded to also create a `torch.nn.Module`. My idea for the design below is basically copied from the metatensor atomistic tutorial.
Once we have this `Module`, an architecture basically has to call the `compute_weights` function to store the weights, and can then use the `forward` function to apply the composition energies.