The browser can fail with an out-of-memory error when trying to download a
very large data matrix (hundreds of gigabytes in my tests).
This patch uses several strategies to increase the download size at which
that happens:
- Gets access windows one row at a time and only for the range of
requested columns. This minimizes tile cache memory needed.
- Converts rows to tsv format on the fly, so we don't need to convert
the entire matrix to tsv format at once.
- Constructs a blob using a vector of the row tsv data. This is more
memory efficient than manually building a data URL.
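The streaming strategy above can be sketched roughly as follows. This is an illustrative sketch only, not the actual NG-CHM code: `buildTsvParts` stands in for the row-at-a-time access-window loop, and the matrix is shown as a plain array of rows.

```javascript
// Sketch: convert one row at a time to TSV, so the full matrix is never
// held in memory as a single TSV string. The row array stands in for
// per-row access windows restricted to the requested column range.
function buildTsvParts(matrix, startCol, endCol) {
  const parts = [];
  for (let row = 0; row < matrix.length; row++) {
    // Only the requested columns of this row are converted at once.
    parts.push(matrix[row].slice(startCol, endCol + 1).join("\t") + "\n");
  }
  return parts;
}

// Sketch: construct the Blob directly from the vector of row strings.
// The browser can assemble the Blob without us building one giant
// concatenated string or a manual data: URL.
function makeDownloadBlob(parts) {
  return new Blob(parts, { type: "text/tab-separated-values" });
}
```

In the browser, the Blob would then be saved via `URL.createObjectURL(blob)` on a temporary anchor element (and released with `URL.revokeObjectURL` afterwards).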
These steps help but don't eliminate the problem. (I don't think that's
possible purely in the browser.) A future patch will display a warning notice
to the user for very large download sizes.
If the number of array elements to download exceeds a threshold (currently
one million), show a warning to the user that the download may crash the
browser due to memory exhaustion.
If the user chooses to proceed, a progress bar is displayed.
If the warning dialog is displayed, the user can cancel the download.
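The size guard described above amounts to a simple element-count check. The names below (`WARNING_THRESHOLD`, `shouldWarnBeforeDownload`) are hypothetical, chosen for illustration; the dialog and progress-bar plumbing are omitted.

```javascript
// Hypothetical threshold: warn when more than one million array elements
// would be downloaded, since the conversion may exhaust browser memory.
const WARNING_THRESHOLD = 1_000_000;

// Decide whether to show the warning dialog before starting the download.
// The user can then cancel, or proceed and watch a progress bar.
function shouldWarnBeforeDownload(numRows, numCols) {
  return numRows * numCols > WARNING_THRESHOLD;
}
```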
For large downloads there is a noticeable delay between when we have
finished all processing (and hide the dialog) and when the browser is
ready to save the file. The browser can still crash during this time.
I'm not sure if there's anything that we can do about that.
Downloading data from very large NG-CHMs as TSV data can crash the browser in ways that we can't detect or prevent.
Warn the user about this if they attempt to download a large data matrix.