AutoDiff

My implementation of auto-differentiation, similar to PyTorch's autograd.
Start with optimization.ipynb, a Jupyter notebook that defines a function and finds its minimum using gradient descent. The gradients are computed by auto-differentiation (AD), using a small AD module that works on scalars only.
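To give a feel for what such a scalar AD module involves, here is a minimal sketch of scalar reverse-mode AD driving gradient descent. The class name Scalar and the names backward() and grad are assumptions for illustration, not necessarily the API used in the notebook.

```python
class Scalar:
    """A value that records how it was computed, so gradients can flow back.
    (Hypothetical sketch; the repo's actual scalar AD module may differ.)"""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # pairs of (parent node, local derivative)

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        return Scalar(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __sub__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        return Scalar(self.value - other.value, ((self, 1.0), (other, -1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        return Scalar(self.value * other.value,
                      ((self, other.value), (other, self.value)))

    def backward(self):
        # Order the graph topologically, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for parent, _ in node._parents:
                    visit(parent)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, local in node._parents:
                parent.grad += local * node.grad

# Gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
x_val, lr = 0.0, 0.1
for _ in range(100):
    x = Scalar(x_val)            # fresh graph each step, grads start at zero
    f = (x - 3.0) * (x - 3.0)
    f.backward()                 # fills x.grad with df/dx = 2 * (x - 3)
    x_val -= lr * x.grad
print(x_val)                     # converges to ~3.0
```

Each arithmetic operation records its inputs along with their local derivatives; backward() then walks the graph in reverse topological order, accumulating gradients by the chain rule.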

The notebook Advanced_NN.ipynb demonstrates how a 2D matrix version of AD is used to train a neural network; a sketch of how the scalar idea extends to matrices follows.
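Below is a minimal sketch of the matrix case, assuming a NumPy-backed node. The class name Mat and its methods are hypothetical, not the notebook's actual API. The key change from the scalar case is that each operation's local derivative becomes a matrix expression; for C = A @ B, the chain rule gives dL/dA = dL/dC @ B.T and dL/dB = A.T @ dL/dC.

```python
import numpy as np

class Mat:
    """A matrix node for reverse-mode AD (hypothetical sketch)."""
    def __init__(self, value, parents=()):
        self.value = np.asarray(value, dtype=float)
        self.grad = np.zeros_like(self.value)
        self._parents = parents  # pairs of (parent node, backward function)

    def __matmul__(self, other):
        # For C = A @ B: dL/dA = dL/dC @ B.T and dL/dB = A.T @ dL/dC
        return Mat(self.value @ other.value,
                   ((self, lambda g: g @ other.value.T),
                    (other, lambda g: self.value.T @ g)))

    def backward(self):
        # Same reverse topological sweep as the scalar case.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for parent, _ in node._parents:
                    visit(parent)
                order.append(node)
        visit(self)
        # Seed with ones, i.e. treat the loss as the sum of this node's entries.
        self.grad = np.ones_like(self.value)
        for node in reversed(order):
            for parent, back in node._parents:
                parent.grad = parent.grad + back(node.grad)

# One linear layer: y = x @ W; backward() fills W.grad with x.T @ dL/dy.
x = Mat(np.random.randn(4, 3))
W = Mat(np.random.randn(3, 2))
y = x @ W
y.backward()
print(W.grad.shape)  # (3, 2), matching W
```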
