Inference on larger areas may need a bit more memory than the free Azure runner and Wasm (with its strict 2 GB limit) can provide. We see failures of this sort when running accessibility:
```
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 509. MiB for an array with shape (10721, 6229) and data type float64
```
I could cast values to float32, saving some memory and hoping that is enough, but we will hit the same issue again on any larger area. Refactoring to avoid allocating this array would mean a complete rewrite of accessibility and most likely a significant drop in performance, leading to minutes-long wait times for inference.
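For reference, the allocation size in the traceback checks out, and a float32 cast would roughly halve it. A quick NumPy sketch (the shape is taken from the error message above; nothing here is specific to the accessibility code):

```python
import numpy as np

# Shape from the error message.
shape = (10721, 6229)

# float64 footprint: rows * cols * 8 bytes, expressed in MiB.
mib_f64 = shape[0] * shape[1] * np.dtype(np.float64).itemsize / 2**20
print(round(mib_f64))  # 509, matching the "Unable to allocate 509. MiB" error

# Casting to float32 halves the footprint, at the cost of precision
# (~7 significant decimal digits instead of ~16).
mib_f32 = shape[0] * shape[1] * np.dtype(np.float32).itemsize / 2**20
print(round(mib_f32))  # 255
```

Whether the precision loss is acceptable depends on the value ranges involved; for distances or travel times it usually is.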
The solution for Azure is easy: we just need to bump the instance memory (if we have the credits to do that). For Wasm, I am not sure a solution exists if we want inference that is both fast and very memory-efficient.