Move and add images
tostenzel committed Jan 6, 2024
1 parent ccf3bc6 commit 6159e1b
Showing 6 changed files with 17 additions and 14 deletions.

<p align="center">
<img src="img/edugrad-header.png" alt="drawing" width="300"/>
</p>

**edugrad** is the simplest and most accessible implementation of a deep learning framework. Its purpose is to reveal the
relatively minor. The autograd mechanism is inspired by Andrej Karpathy's
[micrograd](https://github.com/karpathy/micrograd).

## The Code

<p align="center">
<img src="img/edugrad-code-i.png" alt="drawing" width="400"/>
</p>

### I. Low-level (`data.py`), Mid-level (`function.py`) and High-level (`tensor.py`) Operations

The computation processes are structured across different levels of operations, namely low-level, mid-level, and high-level operations.

#### 1. Low-Level Operations
- **Module**: `data.py` (`TensorData` class)
- **Purpose**: Execution of the most basic tensor operations.
- **Characteristics**:
- Core tensor data manipulation using NumPy arrays.
- Implements elementary tensor operations like addition, multiplication, and reshaping.
- Immediate execution of operations on the CPU, leveraging `numpy.array`'s capabilities. Using a different backend like PyTorch or JAX would only require reimplementing the 17 operations enumerated in `ops.py`.
- Operations at this level do not involve gradient computations or the autograd mechanism.
- Acts as the foundational building block for higher-level operations.
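To make this concrete, here is a minimal sketch of what a NumPy-backed low-level layer can look like. This is an illustrative toy, not edugrad's actual `TensorData` code; the class and method names are made up for the example:

```python
import numpy as np

class ToyTensorData:
    """Illustrative stand-in for a TensorData-style class: a thin wrapper around a NumPy array."""

    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

    # elementwise op: executed eagerly on the CPU, no gradient bookkeeping
    def add(self, other):
        return ToyTensorData(self.data + other.data)

    # movement op: a pure metadata change in NumPy
    def reshape(self, shape):
        return ToyTensorData(self.data.reshape(shape))

a = ToyTensorData([[1.0, 2.0], [3.0, 4.0]])
b = ToyTensorData([[10.0, 20.0], [30.0, 40.0]])
c = a.add(b).reshape((4,))
```

Note that nothing here records how `c` was produced; tracking that history is the job of the mid-level ops.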

#### 2. Mid-Level Operations (Mid-Level Ops)
- **Module**: `function.py` (`Function` class)
- **Purpose**: Define operations that pair a forward computation with its backward (gradient) computation.
- **Characteristics**:
- Compose low-level ops from `data.py` to define more complex operations.
- Each operation (e.g., `Add`, `Mul`, `Sin`) encapsulates a forward pass and a corresponding backward pass for gradient computation.
- Serves as the backbone of edugrad's autograd system, allowing for automatic differentiation of models defined with `edugrad.Tensor`.
- Mid-level operations are used as nodes to build complex computational graphs during the forward pass, storing necessary information for the backward pass.
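The forward/backward pairing can be sketched in a few lines. This toy multiplication op (not edugrad's code; names are illustrative) shows the key idea: the forward pass saves whatever the backward pass will need:

```python
import numpy as np

class ToyMul:
    """Illustrative mid-level op: forward saves what backward will need."""

    def forward(self, x, y):
        self.saved = (x, y)   # backward needs both inputs
        return x * y

    def backward(self, grad_output):
        x, y = self.saved
        # chain rule: d(x*y)/dx = y and d(x*y)/dy = x, each scaled by grad_output
        return grad_output * y, grad_output * x

f = ToyMul()
out = f.forward(np.array([2.0, 3.0]), np.array([4.0, 5.0]))
grad_x, grad_y = f.backward(np.array([1.0, 1.0]))
```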

#### 3. High-Level Operations (High-Level Ops)
- **Module**: `tensor.py` (`Tensor` class)
- **Purpose**: Provide a user-friendly interface for tensor operations and enable building and training neural network models.
- **Characteristics**:
- High-level abstraction for tensor operations.
- Utilizes mid-level ops from `function.py` to implement tensor methods and matrix algebra, enabling automatic differentiation without defining a backward function.
- Includes a broad range of operations commonly used in neural networks, like matrix multiplication, activation functions, and loss functions.
- Facilitates the construction and manipulation of larger computational graphs through tensor operations.
- This level is where most users interact with the edugrad library, building and training models using a familiar, PyTorch-like API.
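How the high-level layer dispatches to the mid-level ops can be sketched as follows. This is a toy illustration of the pattern, not edugrad's implementation; the `apply` and `_ctx` names mirror the structure described in this README but the code is invented for the example:

```python
import numpy as np

class Mul:
    """Toy mid-level op (cf. function.py)."""
    def forward(self, x, y):
        self.saved = (x, y)
        return x * y

class ToyTensor:
    """Toy high-level tensor (cf. tensor.py): operators dispatch to mid-level functions."""

    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)
        self._ctx = None  # set by apply(): (function instance, parent tensors)

    @classmethod
    def apply(cls, fn_cls, *tensors):
        fn = fn_cls()
        out = cls(fn.forward(*[t.data for t in tensors]))
        out._ctx = (fn, tensors)  # remember the parents for a later backward pass
        return out

    def __mul__(self, other):
        return ToyTensor.apply(Mul, self, other)

a = ToyTensor([2.0, 3.0])
b = ToyTensor([4.0, 5.0])
c = a * b  # PyTorch-like syntax; the user never writes a backward function
```

The result `c` carries its parents in `_ctx`, which is exactly the bookkeeping the next section discusses.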

<p align="center">
<img src="img/edugrad-code-ii.png" alt="drawing" width="400"/>
</p>

### II. Computational Graphs in edugrad: Forward and Backward Passes

In edugrad, the handling of the computational graph, particularly the relationships between nodes (tensors) during the forward and backward passes, is crucial for understanding how automatic differentiation works. Let's delve into the details of how the parents of each node are stored in `Tensor._ctx` and how they are utilized during the backward pass by functions in `autograd.py`.

Consider an operation `z = x + y`, where `x` and `y` are tensors. The `Add` function:
```python
class Add(Function):
    def forward(self, x: TensorData, y: TensorData) -> TensorData:
        return x.elementwise(ops.BinaryOps.ADD, y)

    def backward(self, grad_output: TensorData) -> Tuple[Optional[TensorData], Optional[TensorData]]:
        return grad_output if self.needs_input_grad[0] else None, grad_output if self.needs_input_grad[1] else None
```

```python
z = x + y  # Internally calls Add.apply(x, y)
```
During the backward pass, gradients are computed in reverse order, starting from the final output tensor and propagating through its ancestors.

#### Backward Function in `autograd.py`:
- When `backward()` is called on the final output tensor, usually the cost/loss, `autograd.collect_backward_graph()` starts traversing the computational graph in reverse.
- It begins with the tensor on which `backward()` was called and recursively visits the parent tensors stored in `_ctx`.
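The traversal these bullets describe can be sketched with micrograd-style scalars (a toy, not edugrad's tensor code): the graph is collected in topological order by following parents, then each node's local backward runs in reverse, accumulating gradients via the chain rule:

```python
class Value:
    """Micrograd-style scalar illustrating the reverse traversal."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # stands in for edugrad's Tensor._ctx
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward
        return out

    def backward(self):
        # collect the graph in topological order, then apply the chain rule in reverse
        topo, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    visit(p)
                topo.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x, y = Value(2.0), Value(3.0)
z = x * y + x   # dz/dx = y + 1, dz/dy = x
z.backward()
```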

#### Gradient Computation:
Binary file removed: edugrad-dependencies.png
Binary file added: img/edugrad-code-i.png
Binary file added: img/edugrad-code-ii.png
Binary file added: img/edugrad-dependencies.png
File renamed without changes.
