Improve parallelisation of OME-Zarr writing #31
jluethi added this to the Yokogawa to OME-Zarr extension: sparsely recorded fields milestone on May 4, 2022
See here for the discussion on zarr writing task efficiency & memory usage: #57 (comment)
Zarr writing performance is very good overall and was improved by the dask parallelisation in fractal-analytics-platform/fractal-tasks-core#20. There is no need to have Fractal distribute this further (and it would in any case be hard or impossible with the single-FOV-per-well strategy).
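For context, here is a minimal sketch of the kind of dask-based parallel zarr write referred to above. This is not the fractal-tasks-core implementation; the shape, chunking, and output path are illustrative only.

```python
import dask.array as da

# Hypothetical CZYX stack, chunked one z-plane per chunk so that each
# plane becomes an independent write task.
shape = (3, 10, 2160, 2560)
chunks = (1, 1, 2160, 2560)
stack = da.zeros(shape, dtype="uint16", chunks=chunks)  # stand-in for lazily loaded image data

# da.to_zarr turns every chunk into its own task, so the default threaded
# scheduler overlaps compression and disk I/O across chunks.
da.to_zarr(stack, "plate.zarr/B/03/0", component="0", overwrite=True)
```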
Repository owner moved this from Ready to Done in Fractal Project Management on Aug 4, 2022
jacopo-exact pushed a commit that referenced this issue on Nov 16, 2022: Update pyproject (remove README from bumpver)
Currently, parallelisation is by well. For the test case /data/active/fractal/3D/PelkmansLab/CardiacMultiplexing/Cycle1_subset/, that results in a ~30 minute run time for a single well. Options to parallelise further would therefore be very helpful for small to medium datasets, as well as for running these kinds of test cases more quickly. See here for the original parallelisation discussion: #16
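To make the current granularity concrete, a sketch of a per-well fan-out over worker processes. This is hypothetical: `convert_well` stands in for the actual Yokogawa-to-OME-Zarr conversion task, and the one-directory-per-well input layout is assumed.

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def convert_well(well_dir: Path) -> str:
    # Placeholder: parse this well's images and write one OME-Zarr group.
    return f"converted {well_dir.name}"

def convert_plate(input_dir: Path, max_workers: int = 4) -> list[str]:
    wells = sorted(p for p in input_dir.iterdir() if p.is_dir())
    # Wells map to independent OME-Zarr groups, so they can be converted
    # in separate processes without any coordination between workers.
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(convert_well, wells))

if __name__ == "__main__":
    print(convert_plate(Path("Cycle1_subset")))
```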
Given the zarr architecture, the well should be the primary split point. Time point would be the next option, though current datasets often contain only a single timepoint (once there are more, parallelising over them would make sense). Below that come the pyramids, so parallelisation below the pyramid level may be a bit more complicated. Still, it may make sense to parallelise by channel or by z plane after that (see the sketch below).
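A sketch of what per-channel or per-z-plane parallelisation could look like at the highest-resolution pyramid level. Names and shapes are illustrative; the key property is that non-overlapping zarr chunks can be written concurrently without locking.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

import numpy as np
import zarr

n_c, n_z, n_y, n_x = 3, 10, 2160, 2560
# Highest-resolution level of a hypothetical well, one chunk per (c, z) plane.
level0 = zarr.open(
    "well.zarr/0",
    mode="w",
    shape=(n_c, n_z, n_y, n_x),
    chunks=(1, 1, n_y, n_x),
    dtype="uint16",
)

def write_plane(idx: tuple[int, int]) -> None:
    c, z = idx
    plane = np.zeros((n_y, n_x), dtype="uint16")  # stand-in for one loaded plane
    level0[c, z] = plane  # touches exactly one chunk, so writers never collide

with ThreadPoolExecutor() as pool:
    list(pool.map(write_plane, product(range(n_c), range(n_z))))
```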