Does the project support multi-GPU training?
If yes, how? By default, it only uses one GPU, and I am unable to find any parameter that controls this.
Although this is not very well tested, AlphaZero.jl will attempt to use Distributed to parallelise some parts of data generation across all available Julia processes. Each process can then be configured to use its own GPU. Evaluation stages are not parallelised this way, though, and may become a bottleneck if too many processes are used.
Once again, this is not very well tested or documented, so it may require some digging. Don't hesitate to report on your experience and contribute some documentation if you manage to make it work for you.
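For reference, here is a minimal sketch (not taken from the AlphaZero.jl documentation) of how one might start several Julia worker processes and pin each one to its own GPU before launching training. It assumes CUDA.jl is installed and that the machine has at least as many GPUs as worker processes; the AlphaZero.jl training entry points themselves are not shown.

```julia
# Hypothetical setup sketch: one worker process per GPU.
using Distributed

addprocs(2)  # adjust to the number of GPUs on your machine

@everywhere begin
    using CUDA
    # Assign a distinct GPU to each process (process ids start at 1 for the
    # master). Assumes CUDA.jl can see all GPUs on every process.
    gpu_id = (myid() - 1) % length(CUDA.devices())
    CUDA.device!(gpu_id)
end

# Data generation distributed via Distributed should then run on these
# workers; see the AlphaZero.jl docs for the actual training invocation.
```

Whether this plays well with the current release is exactly the kind of feedback that would be useful to report back here.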