Worker dying issue with controller training #43

Open

rohanb2018 opened this issue Jul 23, 2022 · 0 comments

@rohanb2018
Hello, has anyone run into an issue where one or more of the workers created by `python traincontroller.py` dies without explanation, causing the entire script to hang while it waits for the dead workers to finish their evaluations?

I've checked the logs created in tmp for each of the worker processes, but unfortunately the .err logs are either uninformative or empty.

I suspect it might be a GPU memory issue or something related to the modifications I made to the CarRacing environment, but the lack of any error logging is concerning. It also doesn't happen consistently, which is strange.
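In case it helps with debugging, here is a minimal sketch (not the repository's actual code; the names `evaluate_rollouts`, `worker_main`, and the `tmp/worker_*.err` paths are hypothetical placeholders) of one way to make worker deaths visible: wrap each worker's body so that Python exceptions and hard faults are flushed to a per-worker .err file, and check each worker's exit code in the parent instead of waiting on it indefinitely.

```python
import faulthandler
import os
import sys
import traceback
from multiprocessing import Process


def evaluate_rollouts(worker_id):
    # Hypothetical stand-in for the real rollout evaluation; raises so the
    # exception visibly ends up in the worker's .err file.
    raise RuntimeError(f"worker {worker_id}: simulated crash")


def worker_main(worker_id):
    os.makedirs("tmp", exist_ok=True)
    err_path = os.path.join("tmp", f"worker_{worker_id}.err")
    with open(err_path, "w") as err:
        # Dump tracebacks for hard faults such as segfaults.
        # (A SIGKILL from the OOM killer still cannot be logged this way.)
        faulthandler.enable(file=err)
        try:
            evaluate_rollouts(worker_id)
        except Exception:
            traceback.print_exc(file=err)
            err.flush()
            sys.exit(1)


if __name__ == "__main__":
    procs = [Process(target=worker_main, args=(i,)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        # A join timeout plus an exitcode check makes a silently dead worker
        # visible instead of blocking the parent forever.
        p.join(timeout=600)
        if p.exitcode not in (0, None):
            print(f"worker exited with code {p.exitcode}")
```

If a worker is being killed by the OOM killer, nothing will appear in the .err file even with this in place, but the non-zero exit code (typically -9) reported by the parent would at least confirm that theory.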

Thanks!
