Forward and Backward Propagation #28

Open
ksarker1205 opened this issue Apr 21, 2016 · 1 comment

Comments

@ksarker1205

I am interested in seeing where forward and backward propagation happen in the code. Can you point me to that specific portion of the code?

@hma02 (Contributor) commented Apr 22, 2016

Hi @ksarker1205, once the computation graph is compiled into a Theano function, forward and backward propagation happen each time that function is called, at this line:
https://github.com/uoguelph-mlrg/theano_alexnet/blob/master/train_funcs.py#L165

The function takes the input and forward-propagates it through the graph, and the `updates` argument supplied when the function is compiled specifies how the gradients computed during backward propagation are applied to the parameters.

You can see how the Theano function is constructed here:
https://github.com/uoguelph-mlrg/theano_alexnet/blob/master/alex_net.py#L216
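For illustration, here is a minimal, self-contained sketch of the same pattern using a toy linear model (not the AlexNet graph from this repo): the forward pass and cost are defined symbolically, `T.grad` builds the backward pass, and the `updates` argument ties the gradient to a parameter update that runs on every call.

```python
import numpy as np
import theano
import theano.tensor as T

# Symbolic inputs (hypothetical toy model, not the repo's AlexNet graph).
x = T.matrix('x')
y = T.vector('y')

# A single shared parameter vector, initialized to zeros.
W = theano.shared(np.zeros(5, dtype=theano.config.floatX), name='W')

# Forward pass: prediction and mean-squared-error cost, defined symbolically.
pred = T.dot(x, W)
cost = T.mean((pred - y) ** 2)

# Backward pass: Theano derives the gradient of the cost w.r.t. W.
grad_W = T.grad(cost, W)

# The `updates` argument tells the compiled function how the gradient is
# applied (here, plain SGD with a fixed learning rate).
learning_rate = 0.01
train = theano.function(
    inputs=[x, y],
    outputs=cost,
    updates=[(W, W - learning_rate * grad_W)],
)

# Each call runs forward propagation, backward propagation, and the update.
data_x = np.random.randn(8, 5).astype(theano.config.floatX)
data_y = np.random.randn(8).astype(theano.config.floatX)
print(train(data_x, data_y))
```

Calling `train(...)` once corresponds to one training step in train_funcs.py: the forward pass, the backward pass, and the parameter update all happen inside that single function call.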
