I have 35 annual CCI Sea Surface Temperature (SST) Level-4 Zarr datasets, each comprising 4 data variables with dimensions (time=365, lat=3600, lon=7200) and chunking (time=16, lat=900, lon=1800). I want to append them to generate a single Zarr (with equal or similar chunking), but none of my nc2zarr jobs has succeeded so far. The append job is always terminated by an out-of-memory (OOM) exception, even when I assign it 256 GB of RAM. The issue is complicated by the fact that every append step takes 30 minutes, and the OOM may occur only after appending ~20 years. This indicates one or more memory leaks in nc2zarr, xarray, dask, and/or zarr.