First of all, make sure the project structure matches the one below:
├── fswp_train.yml
├── fswp_run.yml
├── _data
│ └── # pickle files for model training
├── checkpoints
│ └── # .pth models
├── models
│ ├── unet # unet trainer
│ └── convlstm # convlstm trainer
├── utils
└── docker
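The folder layout above can be created in one step; this is a minimal sketch using the directory names from the tree (the pickle files and .pth checkpoints still have to be downloaded separately):

```shell
# Create the expected directory skeleton (names taken from the tree above)
mkdir -p _data checkpoints/unet checkpoints/convlstm models/unet models/convlstm utils docker
```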
Download the convlstm training data and the unet training data, and put the downloaded datasets in the _data folder.
Download the unet model and put it in the checkpoints/unet folder.
Download the convlstm model and put it in the checkpoints/convlstm folder.
Install the Python dependencies:
pip install -r requirements.txt
We use crafting to automate our experiments. You can find an example of such a pipeline in the run.yaml file.
You need Docker, NVIDIA drivers, and the crafting package installed.
The crafting package is available on PyPI:
pip install crafting
To build the image, run the command below in the docker folder:
sh build.sh
To run an experiment, specify the target command in the command field of the run.yaml file and call crafting:
crafting configs/docker_run.yaml
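A minimal sketch of what such a config might contain; only the command field is documented above, so this fragment is an assumption rather than the package's full schema:

```yaml
# Hypothetical minimal run.yaml: only the `command` field is described
# in this README; any other keys would be assumptions.
command: python3 utils/predict_on_tdataset.py ./configs/run_solar_fswp.yaml
```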
Example target commands:
train.sh -fswp_train.yml
python3 utils/predict_on_tdataset.py ./configs/run_solar_fswp.yaml
python3 utils/run_on_env.py configs/run_star_fswp.yaml False checkpoints/convlstm/CONV_LSTM_run_EricWright_94ed9438-8fae-4fc8-8afa-85409d0c6f46.pth