Releases · okunator/cellseg_models.pytorch
v0.1.17 — 2022-12-29
Features
- Add transformer modules
- Add exact, slice, and memory-efficient (xformers) self-attention computations
- Add transformer modules to `Decoder` modules
- Add common transformer MLP activation functions: star-relu, geglu, approximate-gelu (see the first sketch after this list)
- Add the Linformer self-attention mechanism
- Add support for model initialization from a YAML file in `MultiTaskUnet`
- Add a new cross-attention long-skip module; works with `long_skip='cross-attn'` (see the second sketch after this list)
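
The new activations follow published definitions: star-relu is `s * relu(x)^2 + b` with learnable scalars (from the MetaFormer paper), geglu is a GELU-gated linear unit, and approximate-gelu is the tanh-based GELU built into PyTorch. A minimal sketch of those definitions follows; the module names here are illustrative, not the library's own:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StarReLU(nn.Module):
    """s * relu(x)**2 + b with learnable scale and bias.

    Defaults are illustrative; the MetaFormer paper suggests specific inits.
    """
    def __init__(self, scale: float = 1.0, bias: float = 0.0):
        super().__init__()
        self.scale = nn.Parameter(torch.tensor(scale))
        self.bias = nn.Parameter(torch.tensor(bias))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scale * F.relu(x) ** 2 + self.bias

class GEGLU(nn.Module):
    """Gated GELU: split the last dim into value and gate halves."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        val, gate = x.chunk(2, dim=-1)
        return val * F.gelu(gate)

# Approximate GELU is the tanh-based variant built into PyTorch:
def approximate_gelu(x: torch.Tensor) -> torch.Tensor:
    return F.gelu(x, approximate="tanh")
```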
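The idea behind a cross-attention long skip is that the decoder feature map queries the encoder skip feature map instead of concatenating or summing it. Below is a hedged sketch of that pattern using `nn.MultiheadAttention`; the actual module's internals may differ:

```python
import torch
import torch.nn as nn

class CrossAttnSkip(nn.Module):
    """Fuse an encoder skip into a decoder feature map via cross-attention."""
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, dec: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
        # dec, skip: (B, C, H, W) feature maps with matching channel counts.
        B, C, H, W = dec.shape
        q = dec.flatten(2).transpose(1, 2)    # (B, H*W, C) queries
        kv = skip.flatten(2).transpose(1, 2)  # (B, H'*W', C) keys/values
        out, _ = self.attn(self.norm(q), kv, kv)
        # Residual connection, then back to the spatial layout.
        return (q + out).transpose(1, 2).reshape(B, C, H, W)
```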
Refactor
- Added more verbose error messages for the abstract wrapper modules in `modules.base_modules`
- Added more verbose error catching for `xformers.ops.memory_efficient_attention` (see the sketch below)
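
As an illustration of the kind of verbose error catching described above, the wrapper below re-raises xformers failures with the input shapes attached. This is a sketch, not the library's actual wrapper; the layout convention in the comment is an assumption:

```python
import torch

def mem_eff_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Compute attention with xformers, re-raising failures with context."""
    try:
        from xformers.ops import memory_efficient_attention
    except ImportError as e:
        raise ImportError(
            "Memory-efficient attention requires `xformers`. "
            "Install it with: pip install xformers"
        ) from e
    try:
        # q, k, v: (batch, seq_len, num_heads, head_dim)
        return memory_efficient_attention(q, k, v)
    except (ValueError, NotImplementedError) as e:
        raise RuntimeError(
            f"memory_efficient_attention failed for shapes "
            f"q={tuple(q.shape)}, k={tuple(k.shape)}, v={tuple(v.shape)}. "
            "Check that the head dim is supported and the inputs live on "
            "a CUDA device in a supported dtype (fp16/bf16)."
        ) from e
```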
v0.1.14 — 2022-12-01
Performance
- Remove unnecessary parts of the cellpose post-processing pipeline that added overhead without contributing to the results.
Refactor
- Refactor the whole cellpose post-processing pipeline for readability.
- Refactor the multiprocessing code to be reusable and move it under `utils`.
Features
- Add exact Euler integration (on CPU) for cellpose post-processing (see the first sketch after this list).
- Add more `pathos.Pool` options for parallel processing: `ThreadPool`, `ProcessPool` & `SerialPool` (see the second sketch after this list).
- Add all the mapping methods for each Pool object, i.e. `amap`, `imap`, `uimap`, and `map`.
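
Cellpose post-processing recovers instances by following the network's predicted flow field until pixels cluster at cell centers. Below is a minimal CPU sketch of Euler integration over such a flow field; it is illustrative only, and the library's exact routine may sample the flows differently:

```python
import numpy as np

def euler_integrate(flows: np.ndarray, mask: np.ndarray, n_iter: int = 200) -> np.ndarray:
    """Follow a (2, H, W) flow field from every foreground pixel.

    Returns the final (y, x) positions of shape (2, N); pixels belonging
    to the same cell converge toward that cell's center.
    """
    H, W = mask.shape
    ys, xs = np.nonzero(mask)
    pos = np.stack([ys, xs]).astype(np.float32)  # (2, N) point cloud
    for _ in range(n_iter):
        # Sample the flow at each point's nearest grid location.
        yi = np.clip(pos[0].round().astype(int), 0, H - 1)
        xi = np.clip(pos[1].round().astype(int), 0, W - 1)
        # One Euler step: move each point along the flow at its location.
        pos += flows[:, yi, xi]
        pos[0] = np.clip(pos[0], 0, H - 1)
        pos[1] = np.clip(pos[1], 0, W - 1)
    return pos
```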
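The pool options and mapping methods come straight from `pathos`: `map` is blocking and ordered, `imap` is a lazy ordered iterator, `uimap` is a lazy unordered iterator, and `amap` is asynchronous, returning a handle to collect later. A small usage sketch (the worker function is made up for demonstration):

```python
from pathos.pools import ProcessPool, SerialPool, ThreadPool

def square(x: int) -> int:
    return x * x

pool = ThreadPool(nodes=4)                    # or ProcessPool / SerialPool
print(pool.map(square, range(5)))             # blocking, ordered: [0, 1, 4, 9, 16]
print(list(pool.imap(square, range(5))))      # lazy iterator, ordered
print(sorted(pool.uimap(square, range(5))))   # lazy iterator, unordered
res = pool.amap(square, range(5))             # asynchronous; returns a handle
print(res.get())                              # collect results when ready
pool.close()
pool.join()
```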