Can it be used with accelerate? #8
Comments
Thanks! As of the recent version 0.2, it should be usable with accelerate, provided the types all have a `Backprop` instance. The bigger piece of work is probably porting all of the linear algebra operations together with their gradients, which is a bit of a boilerplatey undertaking (see hmatrix-backprop). Accelerate types might work, but you would also need to lift the linear algebra functions (dot products, matrix multiplications, etc.) by providing their individual gradients for the user. This is something I'm also working on getting out soon!
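To make "providing their individual gradients" concrete, here is a minimal sketch in the hmatrix-backprop style: a statically-sized dot product lifted into backprop with `op2`/`liftOp2`. The `Backprop` instance is spelled out only to keep the snippet self-contained (hmatrix-backprop ships proper instances), and the exact hmatrix-static names (`<.>`, `konst`) may vary slightly between hmatrix versions.

```haskell
{-# LANGUAGE DataKinds #-}

import GHC.TypeLits (KnownNat)
import Numeric.Backprop
import Numeric.LinearAlgebra.Static (R, konst, (<.>))

-- R n's Num instance is element-wise, with literals becoming constant
-- vectors, so a Num-based Backprop instance is safe for the static types.
instance KnownNat n => Backprop (R n) where
  zero _ = 0
  add    = (+)
  one _  = 1

-- Dot product lifted into backprop: the forward value is x <.> y, and the
-- reverse pass sends an incoming gradient d to (d * y, d * x).
dot :: (KnownNat n, Reifies s W)
    => BVar s (R n) -> BVar s (R n) -> BVar s Double
dot = liftOp2 . op2 $ \x y ->
  ( x <.> y
  , \d -> (konst d * y, konst d * x)
  )
```

With a family of such lifted operations in place, `gradBP` / `gradBP2` recover gradients for anything assembled from them; that is essentially what hmatrix-backprop does for the hmatrix API, and what an accelerate binding would need to repeat.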
@mstksg Awesome! Thanks for the pointers to the hmatrix integration - I'll have a look. I can't promise you much help as of now, unfortunately, but this might change in the near future.
Have you seen https://github.com/tmcdonell/accelerate-blas? It could be a good place to start augmenting with backprop variables.
Hi all, has anyone tried to use backprop with accelerate?
I am trying to figure it out now. My problem is: if a neural network with fully connected layers is represented as a chain of matrix operations over Accelerate arrays, then how do I represent it in terms of the backprop library's `BVar`s?

An additional problem is the discrepancy between pure equations (the backprop library) and impure, GPU-specific operations (the Accelerate DSL), which might ultimately result in speed poor enough to annihilate the advantages of using the GPU.
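For reference, with hmatrix-backprop the usual way to write such a network looks roughly like the sketch below (the record name, its fields, and the 784/100/10 sizes are illustrative only); the open question is what the analogue would be when the weights live in `Acc` arrays on the GPU.

```haskell
{-# LANGUAGE DataKinds       #-}
{-# LANGUAGE DeriveGeneric   #-}
{-# LANGUAGE TemplateHaskell #-}

import Control.Lens (makeLenses)
import GHC.Generics (Generic)
import Numeric.Backprop
import Numeric.LinearAlgebra.Static.Backprop ((#>))   -- lifted matrix-vector product
import qualified Numeric.LinearAlgebra.Static as H

-- Parameters of a 784 -> 100 -> 10 fully connected network.
data Net = Net
  { _w1 :: H.L 100 784
  , _b1 :: H.R 100
  , _w2 :: H.L 10 100
  , _b2 :: H.R 10
  }
  deriving Generic

instance Backprop Net    -- derived generically from the fields
makeLenses ''Net

logistic :: Floating a => a -> a
logistic x = 1 / (1 + exp (-x))

-- The forward pass is written once over BVars; gradBP then produces the
-- gradient with respect to every parameter in Net at the same time.
runNet :: Reifies s W => BVar s Net -> BVar s (H.R 784) -> BVar s (H.R 10)
runNet n x = logistic ((n ^^. w2) #> y + (n ^^. b2))
  where
    y = logistic ((n ^^. w1) #> x + (n ^^. b1))
```

Training then just calls `gradBP` on a loss built from `runNet`; the question in this thread is how to get the same structure when `H.L`/`H.R` are replaced by Accelerate arrays.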
I am using the matrix-matrix multiplication from Accelerate.
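For concreteness, the matrix-matrix product in plain Accelerate is the classic replicate/zipWith/fold formulation below (accelerate-blas provides a BLAS-backed version of the same thing). Note that despite targeting the GPU it is an ordinary pure value: nothing runs until the expression is handed to a backend's `run`.

```haskell
{-# LANGUAGE TypeOperators #-}

import Data.Array.Accelerate as A

-- Naive dense matrix product in the Accelerate DSL: replicate both
-- operands so every (row, column) pair lines up, multiply elementwise,
-- then fold away the inner dimension.
matMul :: A.Num e => Acc (Matrix e) -> Acc (Matrix e) -> Acc (Matrix e)
matMul a b = A.fold (+) 0 (A.zipWith (*) aRep bRep)
  where
    Z :. rowsA :. _     = unlift (shape a) :: Z :. Exp Int :. Exp Int
    Z :. _     :. colsB = unlift (shape b) :: Z :. Exp Int :. Exp Int
    aRep = A.replicate (lift (Z :. All   :. colsB :. All)) a
    bRep = A.replicate (lift (Z :. rowsA :. All   :. All)) (A.transpose b)
```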
One approach might be to give the Accelerate types `BVar`-lifted versions of their linear-algebra functions, each with its gradient.
As long as you can assemble the DSL purely, it should work. One issue that might come up, however, is the `Backprop` instance the `Acc` types would need for combining gradients.
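A sketch of that approach, with the assumptions stated up front: gradients are carried as whole `Acc` expressions, the `Backprop` instance below is hand-rolled for the sketch (no published accelerate-backprop binding exists as of this thread), and the matrix-product primitive is passed in, so it can be the plain-Accelerate `matMul` above or accelerate-blas's.

```haskell
{-# LANGUAGE FlexibleInstances #-}

import Data.Array.Accelerate as A
import Numeric.Backprop

-- Carry whole (unevaluated) Accelerate expressions in BVars.  zero, add
-- and one just build more AST; nothing touches the GPU until the final
-- expression is run by a backend such as accelerate-llvm-ptx.
instance Backprop (Acc (Matrix Float)) where
  zero = A.map (\_ -> 0)
  add  = A.zipWith (+)
  one  = A.map (\_ -> 1)

-- Lift any matrix-product primitive over Acc matrices into backprop by
-- supplying its gradients: for C = A * B, dA = dC * B^T and dB = A^T * dC.
liftMatMul
  :: Reifies s W
  => (Acc (Matrix Float) -> Acc (Matrix Float) -> Acc (Matrix Float))
  -> BVar s (Acc (Matrix Float))
  -> BVar s (Acc (Matrix Float))
  -> BVar s (Acc (Matrix Float))
liftMatMul mm = liftOp2 . op2 $ \a b ->
  ( mm a b
  , \dC -> (mm dC (A.transpose b), mm (A.transpose a) dC)
  )
```

Since every step only assembles more of the DSL, the combined forward-and-backward expression stays pure; whether the backend can fuse and run it efficiently is exactly the performance concern raised above.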
I would really love to try, but I am really, really afraid of getting a result similar to this one. Are there any limitations that you are aware of?
This looks fantastic! One (possibly) trivial question I have is: can this be used with accelerate? My Haskell got a little bit rusty - I can't tell for sure just from looking at the types... Thanks!