Multiple GPUs not supported with Tensorflow Backend #424

Open
ashao opened this issue Nov 3, 2023 · 0 comments
Labels
area: third-party · bug: major

ashao (Collaborator) commented Nov 3, 2023

Description

Trying to use the multi-GPU functions with TensorFlow models raises an error within RedisAI. This appears to be an incompatibility in how the TensorFlow backend is called. Note that multi-GPU deployment is confirmed to work for Torch models.

How to reproduce

  1. Create a TensorFlow model
  2. Use set_model_multigpu to load it into the database (see the sketch after this list)
  3. An error is raised in the database
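
A minimal reproduction sketch, assuming the SmartRedis Python client is used; the serialization helper and the exact parameter names of set_model_multigpu below are assumptions for illustration, not confirmed API:

```python
# Reproduction sketch (assumed SmartRedis Python client; parameter names and
# the smartsim.ml.tf.serialize_model helper are assumptions).
import tensorflow as tf
from smartredis import Client
from smartsim.ml.tf import serialize_model  # assumed helper returning (bytes, inputs, outputs)

# Build and serialize a trivial Keras model; RedisAI's TF backend needs the
# frozen graph bytes plus the input/output tensor names.
model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(2,))])
model_bytes, inputs, outputs = serialize_model(model)

# Load the model across multiple GPUs -- this is the call that triggers the
# RedisAI error with the TF backend (the same pattern works for Torch models).
client = Client(address="127.0.0.1:6379", cluster=False)
client.set_model_multigpu(
    "tf_model",
    model_bytes,
    "TF",          # backend
    first_gpu=0,
    num_gpus=2,
    inputs=inputs,
    outputs=outputs,
)
```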

Expected behavior

Users should be able to set TensorFlow models and deploy them on multi-GPU machines.
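
For illustration, once this is fixed the same client should also be able to run the TensorFlow model across GPUs. Continuing the sketch above (reusing `client`); the parameter names of run_model_multigpu are likewise assumptions:

```python
# Continues the reproduction sketch above (same `client`); parameter names
# for run_model_multigpu are assumptions.
import numpy as np

client.put_tensor("x", np.random.rand(1, 2).astype(np.float32))
client.run_model_multigpu(
    "tf_model",
    inputs=["x"],
    outputs=["y"],
    offset=0,      # per-caller offset used to select one of the GPUs
    first_gpu=0,
    num_gpus=2,
)
result = client.get_tensor("y")
```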

ashao added the area: third-party and bug: major labels Nov 3, 2023