
separate the data parallelization from model parallelization #34

Open
bobye opened this issue Nov 25, 2014 · 2 comments
bobye commented Nov 25, 2014

Change backpropagate() into two versions: one sequential in data, one parallel in data.
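As a rough illustration of that split (assuming Scala, and hypothetical Sample/Gradient types and a backpropagateOne helper that are not from this repo), the two variants could look like:

```scala
// Hypothetical sketch of the two proposed variants of backpropagate():
// one sequential in data, one parallel in data. Sample, Gradient, and
// backpropagateOne are illustrative stand-ins, not names from this repo.
object BackpropVariants {
  type Sample   = (Vector[Double], Vector[Double]) // (input, target)
  type Gradient = Vector[Double]                   // flattened gradient

  // Pure per-sample backward pass (placeholder arithmetic).
  def backpropagateOne(s: Sample): Gradient =
    s._1.zip(s._2).map { case (x, t) => x - t }

  private def add(a: Gradient, b: Gradient): Gradient =
    a.zip(b).map { case (x, y) => x + y }

  // Version 1: sequential in data.
  def backpropagateSeq(batch: Seq[Sample]): Gradient =
    batch.map(backpropagateOne).reduce(add)

  // Version 2: parallel in data, via Scala parallel collections.
  def backpropagatePar(batch: Seq[Sample]): Gradient =
    batch.par.map(backpropagateOne).reduce(add)
}
```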


bobye commented Nov 25, 2014

Another workaround is to pass the derivatives as explicit outputs, and use aggregate to obtain the overall gradient.
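A minimal sketch of this approach, reusing the same hypothetical types as above. The aggregate here is the standard Scala parallel-collections method: seqop folds each sample's gradient into a partition-local partial, and combop merges the partials.

```scala
// Hypothetical sketch of the second idea: the backward pass returns its
// derivative instead of mutating internal state, and aggregate() combines
// the per-sample gradients into the overall gradient.
object GradientAggregation {
  type Sample   = (Vector[Double], Vector[Double])
  type Gradient = Vector[Double]

  def backpropagateOne(s: Sample): Gradient =
    s._1.zip(s._2).map { case (x, t) => x - t }      // placeholder math

  private def add(a: Gradient, b: Gradient): Gradient =
    a.zip(b).map { case (x, y) => x + y }

  // seqop folds each sample's gradient into a partition-local accumulator;
  // combop merges accumulators across partitions. No buffer is mutated,
  // so the network's state stays immutable during the backward pass.
  def overallGradient(batch: Seq[Sample], dim: Int): Gradient =
    batch.par.aggregate(Vector.fill(dim)(0.0))(
      (acc, s) => add(acc, backpropagateOne(s)),
      add
    )
}
```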

bobye added this to the The First Release milestone Nov 25, 2014

bobye commented Nov 25, 2014

I like the second solution for handling data parallelization: it always keeps the state immutable.
