Multiple GPUs not supported with Tensorflow Backend #424
Labels
area: third-party (Issues related to dependencies and third-party package integrations)
bug: major (A major bug)
Description
Trying to use the multi-GPU functions with TensorFlow models raises an error within RedisAI. This appears to be an incompatibility in the call to the TensorFlow backend. Note that multi-GPU deployment is confirmed to work for Torch models.
How to reproduce
Use `set_model_multigpu` to load a TensorFlow model into the database, as in the sketch below.
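A minimal reproduction sketch follows. It assumes the SmartRedis Python client, a RedisAI-enabled database reachable at a placeholder address, and a frozen TensorFlow graph already saved to disk; the model path, tensor names, and exact `set_model_multigpu` signature are taken from the SmartRedis documentation and may differ between versions.

```python
# Hedged reproduction sketch: the database address, model file, and tensor
# names below are placeholders, not values from the original report.
from smartredis import Client

DB_ADDRESS = "127.0.0.1:6379"  # hypothetical RedisAI database address

client = Client(address=DB_ADDRESS, cluster=False)

# Read a frozen TensorFlow GraphDef (serialized protobuf) from disk.
with open("model.pb", "rb") as model_file:
    model_bytes = model_file.read()

# Spread the model across GPUs 0..num_gpus-1. The equivalent call with
# backend="TORCH" works; with backend="TF" it raises an error inside RedisAI.
client.set_model_multigpu(
    "tf_model",            # key under which the model is stored
    model_bytes,           # serialized TensorFlow graph
    "TF",                  # TensorFlow backend
    first_gpu=0,
    num_gpus=2,
    inputs=["input"],      # TF models require input tensor names
    outputs=["output"],    # and output tensor names
)
```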
Expected behavior
Users should be able to set TensorFlow models and deploy them on multi-GPU machines.