
LaM-SLidE (**La**tent Space **M**odeling of Spatial Dynamical **S**ystems via **Li**nke**d** **E**ntities)


Project Page | Paper

Implementation of LaM-SLidE (Latent Space Modeling of Spatial Dynamical Systems via Linked Entities).

Note: This repository is provided for research reproducibility only and is not intended for use in application workflows.

News

🔥February 18, 2025: The training code and paper preprint are released.

Setup

Installation

```shell
mamba env create -f environment.yaml
mamba activate pyt25
```

Env variables

Create a .env file and set the parameters for logging with wandb. An example can be found here.
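A minimal sketch of such a .env file, assuming standard wandb environment variables (the values below are placeholders, not real credentials):

```shell
# .env — example wandb logging configuration (placeholder values)
WANDB_API_KEY=your-api-key
WANDB_ENTITY=your-entity
WANDB_PROJECT=lam-slide
```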


Data

The data for all experiments will be located in the data directory.

```shell
mkdir data
```

Workflow

Our method relies on a two-stage approach:

  1. First stage: encoder/decoder
  2. Second stage: latent model

We therefore retrieve the first-stage model information directly from the wandb API, which simplifies the workflow considerably: for second-stage training, we only need to provide the run ID of the first-stage run.
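As a sketch of how such a lookup works: wandb identifies a run by the path entity/project/run_id, which wandb.Api().run() accepts. The helpers below (run_path, load_first_stage) are hypothetical illustrations, not part of this repository:

```python
# Sketch of the first-stage lookup; `run_path` and `load_first_stage`
# are hypothetical helpers, not part of this repository.

def run_path(entity: str, project: str, run_id: str) -> str:
    """Build the 'entity/project/run_id' path that wandb.Api().run() expects."""
    return f"{entity}/{project}/{run_id}"

def load_first_stage(entity: str, project: str, run_id: str) -> dict:
    """Fetch a first-stage run's config from the wandb API (requires wandb login)."""
    import wandb  # imported lazily so the pure helper above has no dependency
    run = wandb.Api().run(run_path(entity, project, run_id))
    return dict(run.config)
```

Second-stage training then only needs the run ID and project name of the first-stage run, as in the training commands of each experiment.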

Experiments

MD17

Data Preparation

Download the MD17 dataset in .npz format from here. The dataset should be placed in data/md17.

Training

```shell
# First stage (Encoder-Decoder)
python experiment=md17/first-stage

# Second stage (Diffusion)
python experiment=md17/second-stage first_stage_settings.run_id=[WB_RUN_ID] first_stage_settings.project=[WB_PROJECT]
```

Pedestrian

Data Preparation

Follow the instructions here to download and preprocess the data. Then move the preprocessed files from the folder processed_data_diverse into data/pedestrian_eqmotion.

Training

```shell
# First stage (Encoder-Decoder)
python experiment=pedestrian/first-stage

# Second stage (Diffusion)
python experiment=pedestrian/second-stage first_stage_settings.run_id=[WB_RUN_ID] first_stage_settings.project=[WB_PROJECT]
```

NBA

Data preparation

Download the data from here.

Process the data with the following commands.

```shell
# Train
python scripts/nba/process_data.py --data_dir data/social_vae_data/nba/score/train
python scripts/nba/process_data.py --data_dir data/social_vae_data/nba/rebound/train

# Val
python scripts/nba/process_data.py --data_dir data/social_vae_data/nba/score/val
python scripts/nba/process_data.py --data_dir data/social_vae_data/nba/rebound/val
```
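The four preprocessing calls only differ in the game/split directory, so they can be enumerated in one loop. A sketch, assuming scripts/nba/process_data.py (as in the Val command above) is the entry point for every directory; nba_commands is a hypothetical helper:

```python
# Sketch: enumerate the four NBA preprocessing commands
# (assumes scripts/nba/process_data.py handles every game/split directory)
from itertools import product

def nba_commands(root: str = "data/social_vae_data/nba") -> list[str]:
    return [
        f"python scripts/nba/process_data.py --data_dir {root}/{game}/{split}"
        for game, split in product(("score", "rebound"), ("train", "val"))
    ]

for cmd in nba_commands():
    print(cmd)  # run each with subprocess.run(cmd.split()) to execute
```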

Training

```shell
# First stage (Encoder-Decoder)
python experiment=nba/first-stage

# Second stage (Diffusion)
python experiment=nba/second-stage first_stage_settings.run_id=[WB_RUN_ID] first_stage_settings.project=[WB_PROJECT]
```

Tetrapeptide - 4AA

Follow the instructions here to download the data.

Data preparation

Process the data with the following commands.

```shell
# Train
python scripts/peptide/process_4AA.py --split data/mdgen/splits/4AA_train.csv --outdir data/mdgen/4AA_sims_processed/train --sim_dir data/mdgen/4AA_sims

# Val
python scripts/peptide/process_4AA.py --split data/mdgen/splits/4AA_val.csv --outdir data/mdgen/4AA_sims_processed/val --sim_dir data/mdgen/4AA_sims

# Test
python scripts/peptide/process_4AA.py --split data/mdgen/splits/4AA_test.csv --outdir data/mdgen/4AA_sims_processed/test --sim_dir data/mdgen/4AA_sims
```
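The three commands above differ only in the split name. A sketch that derives all three from the paths shown above (peptide_command is a hypothetical helper, not part of this repository):

```python
# Sketch: build the 4AA preprocessing command for a given split,
# using the data paths from the commands above.
def peptide_command(split: str) -> str:
    return (
        "python scripts/peptide/process_4AA.py"
        f" --split data/mdgen/splits/4AA_{split}.csv"
        f" --outdir data/mdgen/4AA_sims_processed/{split}"
        " --sim_dir data/mdgen/4AA_sims"
    )

for split in ("train", "val", "test"):
    print(peptide_command(split))  # pipe to a shell, or run via subprocess
```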

Training

```shell
# First stage (Encoder-Decoder)
python experiment=peptide/first-stage

# Second stage (Diffusion)
python experiment=peptide/second-stage first_stage_settings.run_id=[WB_RUN_ID] first_stage_settings.project=[WB_PROJECT]
```

Acknowledgments

Our source code was inspired by previous work:

  • mdgen - Latent space conditioning/masking.
  • flux - Latent space model architecture.
  • SiT - Stochastic interpolants framework.
  • UPT - Encoder-decoder architecture.

Citation

If you like our work, please consider giving it a star 🌟 and citing us:

```bibtex
@misc{sestak2025lamslidelatentspacemodeling,
      title={LaM-SLidE: Latent Space Modeling of Spatial Dynamical Systems via Linked Entities},
      author={Florian Sestak and Artur Toshev and Andreas Fürst and Günter Klambauer and Andreas Mayr and Johannes Brandstetter},
      year={2025},
      eprint={2502.12128},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2502.12128},
}
```
