A tiny autograd engine that implements backpropagation (reverse-mode autodiff) and a small MLP library on top of it with a PyTorch-like API. With only about 150 lines of code, the engine operates over scalar values.
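To illustrate what "operates over scalar values" means, here is a minimal sketch of such an engine; this is not the repository's actual code, just an illustration of the same idea: each `Value` records its inputs and a local backward rule, and `backward()` applies the chain rule in reverse topological order.

```python
# Minimal sketch (not the repo's actual code) of a scalar autograd node.
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # local chain-rule step
        self._prev = set(_children)     # inputs that produced this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(out)/d(self) = 1
            other.grad += out.grad      # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(out)/d(self) = other
            other.grad += self.data * out.grad   # d(out)/d(other) = self
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then run each local rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(-3.0)
c = a * b + a        # c = a*b + a = -4.0
c.backward()
print(a.grad)        # dc/da = b + 1 = -2.0
print(b.grad)        # dc/db = a = 2.0
```

A full engine like this repository's additionally supports operations such as `tanh` or `pow` (each with its own local backward rule), which is enough to train a small MLP.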
KushGabani/toy-micrograd
About
An educational toy implementation of PyTorch's autograd API from scratch, using Python and Graphviz, to build a solid grasp of PyTorch's underlying autograd engine.