This repository has been archived by the owner on Jul 29, 2023. It is now read-only.
merge recent changes into pytorch_implementation (#237)
* added primitive xy scaling feature to inference
* fixed opset mismatches
* changes to model + scripts for exporting to onnx
* fixed dropout incompatibility, moved logger to utils
* removed inference session from model export script
* use builtins to normalize
* caching argument not doing anything yet
* profiling script
* separate augmentation
* Revert "separate augmentation" (reverts commit c30ed61)
* remove unused import
* let zarr auto-detect multi-processing
* remove unused import
* copy zarr store to memory
* allow cache and preload
* update profiling script
* format
* remove preload (this is not practical for larger datasets)
* cleanup
* configurable number of log samples
* split train and validation dataset at fov level
* fix shuffling
* fix dropout layers initialization
* fix filter hyperparameter check
* disable dropout for the head
* stronger default augmentation
* formalized onnx exporting and moved script to CLI, updated documentation in inference readme
* fix merge conflict
* isort
* combine training CLI with others
* Multi-GPU training (#235)
  * sync log metrics
  * example of using more GPUs
  * revised data format
  * updated data org
  * remove profile output (this was accidentally tracked)
* Inference in lightning (#236)
  * rename lightning cli main script
  * add docstring
  * predict stage in data module
  * write predict result with callback
  * remove old inference module
  * rename tests

Signed-off-by: Ziwen Liu <[email protected]>
Co-authored-by: Christian Foley <[email protected]>
Co-authored-by: Ziwen Liu <[email protected]>
1 parent 00375b0 · commit 3d8eb24 · 18 changed files with 303 additions and 890 deletions.
@@ -58,4 +58,5 @@ data:
```yaml
    - 256
  augment: true
  caching: false
  normalize_source: false
ckpt_path: null
```
@@ -0,0 +1,69 @@
```yaml
# lightning.pytorch==2.0.1
predict:
  seed_everything: true
  trainer:
    accelerator: auto
    strategy: auto
    devices: auto
    num_nodes: 1
    precision: 32-true
    callbacks:
      - class_path: micro_dl.light.prediction_writer.HCSPredictionWriter
        init_args:
          output_store: null
          write_input: false
          write_interval: batch
    fast_dev_run: false
    max_epochs: null
    min_epochs: null
    max_steps: -1
    min_steps: null
    max_time: null
    limit_train_batches: null
    limit_val_batches: null
    limit_test_batches: null
    limit_predict_batches: null
    overfit_batches: 0.0
    val_check_interval: null
    check_val_every_n_epoch: 1
    num_sanity_val_steps: null
    log_every_n_steps: null
    enable_checkpointing: null
    enable_progress_bar: null
    enable_model_summary: null
    accumulate_grad_batches: 1
    gradient_clip_val: null
    gradient_clip_algorithm: null
    deterministic: null
    benchmark: null
    inference_mode: true
    use_distributed_sampler: true
    profiler: null
    detect_anomaly: false
    barebones: false
    plugins: null
    sync_batchnorm: false
    reload_dataloaders_every_n_epochs: 0
    default_root_dir: null
  model:
    model_config: {}
    loss_function: null
    lr: 0.001
    schedule: Constant
    log_num_samples: 8
  data:
    data_path: null
    source_channel: null
    target_channel: null
    z_window_size: null
    split_ratio: null
    batch_size: 16
    num_workers: 8
    yx_patch_size:
      - 256
      - 256
    augment: true
    caching: false
    normalize_source: false
  return_predictions: null
  ckpt_path: null
```
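As a sanity check, a config in this shape can be loaded with PyYAML to confirm the nesting before handing it to the Lightning CLI. This is a minimal sketch using an abbreviated version of the config above, not the full file:

```python
# Sketch: verify the nesting of a LightningCLI-style predict config
# with PyYAML. The snippet is a shortened excerpt of the full config.
import yaml

snippet = """
predict:
  trainer:
    accelerator: auto
    precision: 32-true
  data:
    batch_size: 16
    yx_patch_size: [256, 256]
  ckpt_path: null
"""

cfg = yaml.safe_load(snippet)

# Scalars keep their YAML types: "32-true" is a string, null becomes None.
assert cfg["predict"]["trainer"]["precision"] == "32-true"
assert cfg["predict"]["data"]["yx_patch_size"] == [256, 256]
assert cfg["predict"]["ckpt_path"] is None
```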
@@ -0,0 +1,15 @@
# CLI

## Exporting models to ONNX

To run inference with ONNX Runtime, models can be exported to ONNX using `micro_dl/cli/onnx_export_script.py`. Example usage of this script with a 5-input-stack model:

```bash
python micro_dl/cli/onnx_export_script.py --model_path path/to/your/pt_model.pt --stack_depth 5 --export_path intended/path/to/model/export.onnx --test_input path/to/test/input.npy
```

**Notes:**

* Because of CPU-sharing constraints, running an ONNX model requires a dedicated node on HPC or a non-distributed system (for example, a personal laptop or other device).
* Test inputs are optional, but they help verify that the exported model can run, provided you export from the intended usage device.
* To be initialized, models must be located in a Lightning training logs directory containing a valid `config.yaml`. This can be "hacked" by placing the model in a directory called `checkpoints` beneath a valid config's directory.
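The `--test_input` file is a NumPy array saved to `.npy`. A minimal sketch of creating one is below; the 5-D layout (batch, channel, stack depth, Y, X) and the 256x256 patch size are assumptions based on the example configs in this repository, not a documented contract of the export script:

```python
# Sketch: create a .npy test input for onnx_export_script.py.
# Assumed layout (batch, channel, stack_depth, Y, X); adjust to match
# the model you are exporting.
import numpy as np

stack_depth = 5  # matches the --stack_depth argument above
test_input = np.random.rand(1, 1, stack_depth, 256, 256).astype(np.float32)

assert test_input.shape == (1, 1, 5, 256, 256)
np.save("test_input.npy", test_input)
```

Pass the resulting path as `--test_input path/to/test_input.npy` so the script can run the exported model once and confirm it loads.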