Repo for a step-by-step walkthrough of the implementation of Automatic Differentiation
- Tracer
- How we walk through the graph (forward/backward)
We can easily design a variable in our graph as follows:
class Var:
    def __init__(self, value):
        self.value = value        # raw value held by this variable
        self.children = []        # links to downstream variables; these edges form the graph
        self.grad_value = None    # gradient, filled in during the backward pass
The children of a variable are the graph representation.
Each time you add an operation to the graph, you get a new variable with an extended graph.
In this way, we have to overload every operation on our variables so that each one records itself in the graph, as in the sketch below.
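For example, here is a minimal, hedged sketch of how `__add__` and `__mul__` could be overloaded so that every operation appends an edge to the graph. This is an illustrative toy, not necessarily the repo's exact code; the `(local_gradient, child)` edge layout and the `grad` helper are assumptions for this example.

```python
class Var:
    def __init__(self, value):
        self.value = value
        self.children = []        # (local_gradient, child_var) edges to downstream nodes
        self.grad_value = None

    def __add__(self, other):
        out = Var(self.value + other.value)
        self.children.append((1.0, out))            # d(out)/d(self)  = 1
        other.children.append((1.0, out))           # d(out)/d(other) = 1
        return out

    def __mul__(self, other):
        out = Var(self.value * other.value)
        self.children.append((other.value, out))    # d(out)/d(self)  = other
        other.children.append((self.value, out))    # d(out)/d(other) = self
        return out

    def grad(self):
        # Accumulate gradients flowing back from all downstream nodes.
        if self.grad_value is None:
            self.grad_value = sum(w * child.grad() for w, child in self.children)
        return self.grad_value

x1, x2 = Var(2.0), Var(3.0)
y = x1 * x2 + x1          # builds the graph as a side effect
y.grad_value = 1.0        # seed the output gradient
print(x1.grad())          # 4.0  (dy/dx1 = x2 + 1)
print(x2.grad())          # 2.0  (dy/dx2 = x1)
```

This is the cost of the tracer approach: every primitive the user might call has to be overloaded or wrapped, which is what the Func_wrapper described below does for library functions.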
- Node: representation of a node in the graph; its parents form a linked list, which is the computational graph (see the sketch after this list)
- Container: a value-type container used as the atomic unit in the forward/backward pass; its advantage is that it lets you conveniently customize your own data container and related functions
- The forward pass travels through the graph and records each invocation of a function
- Func_wrapper is mainly used to construct a function object which will:
  - unbox the container
  - calculate the value with the raw function
  - box a new container
- The reason for doing this is to build the graph silently behind the scenes (see the `func_wrapper` sketch after this list)
- vjp (vector-Jacobian product) is used to get the gradient of a node with respect to the inputs of its function
- vjp rules are defined separately for whichever backend library you want to use (see the sketch below)
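As a rough sketch of the Node/Container idea above (the field names and constructor signatures here are assumptions for illustration, not necessarily the repo's exact API):

```python
class Node:
    """A node in the computational graph; `parents` links back to the nodes
    this value was computed from, which is how the graph is represented."""
    def __init__(self, value, func, args, parents):
        self.value = value      # result produced at this node
        self.func = func        # raw function that produced the value
        self.args = args        # raw arguments the function was called with
        self.parents = parents  # upstream Nodes (the linked computational graph)


class Container:
    """Atomic unit passed through the forward/backward pass: it pairs a raw
    value with the Node that tracks where that value came from. Swapping in a
    different Container lets you customize the data type being traced."""
    def __init__(self, value, node):
        self.value = value
        self.node = node
```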
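Continuing the sketch above, a Func_wrapper-style helper might look roughly like this; `func_wrapper` is an illustrative name, and the unbox / compute / box steps mirror the list above.

```python
import numpy as np

def func_wrapper(raw_func):
    """Wrap a raw numeric function so every call is recorded in the graph."""
    def wrapped(*args):
        # 1. unbox: pull raw values out of any Container arguments
        raw_args = [a.value if isinstance(a, Container) else a for a in args]
        # 2. calculate the value with the raw (untraced) function
        raw_out = raw_func(*raw_args)
        # 3. box: wrap the result in a new Container whose Node records the
        #    call; this is what builds the graph behind the scenes
        parents = [a.node for a in args if isinstance(a, Container)]
        return Container(raw_out, Node(raw_out, raw_func, raw_args, parents))
    return wrapped

traced_sin = func_wrapper(np.sin)   # e.g. a traced version of numpy's sin
```

Because the wrapper returns a new Container for every call, the forward pass only has to evaluate the user's function on boxed inputs, and the whole invocation history falls out as a graph.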
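Finally, a hedged sketch of how per-primitive vjp rules might be registered for a particular backend; the registry name `vjp_rules` and these specific rules are assumptions, following the standard reverse-mode recipe.

```python
import numpy as np

# Each primitive gets a VJP rule: given the gradient g flowing in from the
# output and the input x the primitive was called with, return the gradient
# with respect to that input.
vjp_rules = {
    np.sin: lambda g, x: g * np.cos(x),   # d/dx sin(x) = cos(x)
    np.exp: lambda g, x: g * np.exp(x),   # d/dx exp(x) = exp(x)
    np.log: lambda g, x: g / x,           # d/dx log(x) = 1/x
}

# During the backward pass, each recorded Node would look up the rule for the
# raw function it wrapped, e.g.:
print(vjp_rules[np.sin](1.0, 0.5))        # cos(0.5) ≈ 0.8776
```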