A colab-style DL tutorial for how to build and train DNNs (with and without autograd packages) from scratch.

TomGeorge1234/DeepLearningTutorial

Deep Learning Tutorial

In this tutorial (deeplearning.ipynb) - designed for the TReND CaMinA summer school - we will get hands-on building deep neural networks (DNNs) and training them via backpropagation. Initially the goal is to avoid autograd packages (such as PyTorch or JAX) at all costs: coding a deep neural network by hand will help us understand the mathematics going on behind the scenes. At the end we will use PyTorch to build a much deeper network and see how it performs. Here's the plan:

  1. Set-up: Generate some data for a neuroscience-inspired task our networks will try to learn
  2. Linear regression: A simple model with an analytic solution. We'll use this as a comparison later on.
  3. Deep neural networks (by hand): Derive analytically the learning rules for a two hidden layer DNN and code this by hand.
  4. Deep neural networks (with PyTorch): Use an autograd package to show how these models can be scaled efficiently.
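To give a flavour of steps 2 and 3, here is a minimal NumPy sketch: an analytic least-squares fit, then a two-hidden-layer DNN trained with hand-derived gradients. The synthetic regression data is a stand-in assumption - the tutorial's actual neuroscience-inspired task will differ - and the layer sizes and learning rate are illustrative choices, not the notebook's.

```python
import numpy as np

# Hypothetical synthetic task (the tutorial's real data will differ):
# a nonlinear function of a random linear projection of the inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))             # 200 samples, 5 features
y = np.sin(X @ rng.normal(size=(5, 1)))   # nonlinear target

# --- Step 2: linear regression, analytic least-squares solution ---
Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
w = np.linalg.lstsq(Xb, y, rcond=None)[0]       # closed-form fit
lin_mse = np.mean((Xb @ w - y) ** 2)

# --- Step 3: two-hidden-layer DNN with hand-coded backprop ---
def relu(z):
    return np.maximum(0, z)

sizes = [5, 32, 32, 1]                    # illustrative layer widths
Ws = [rng.normal(scale=np.sqrt(2 / m), size=(m, n))   # He init
      for m, n in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros((1, n)) for n in sizes[1:]]

lr = 1e-2
for step in range(2000):
    # forward pass, caching pre-activations for the backward pass
    a0 = X
    z1 = a0 @ Ws[0] + bs[0]; a1 = relu(z1)
    z2 = a1 @ Ws[1] + bs[1]; a2 = relu(z2)
    yhat = a2 @ Ws[2] + bs[2]             # linear output layer
    if step == 0:
        init_mse = np.mean((yhat - y) ** 2)

    # backward pass: gradients of the mean-squared error, layer by layer
    n = X.shape[0]
    d3 = 2 * (yhat - y) / n               # dL/dyhat
    dW2 = a2.T @ d3; db2 = d3.sum(0, keepdims=True)
    d2 = (d3 @ Ws[2].T) * (z2 > 0)        # ReLU derivative is a 0/1 mask
    dW1 = a1.T @ d2; db1 = d2.sum(0, keepdims=True)
    d1 = (d2 @ Ws[1].T) * (z1 > 0)
    dW0 = a0.T @ d1; db0 = d1.sum(0, keepdims=True)

    # plain gradient-descent update
    for W, dW in zip(Ws, [dW0, dW1, dW2]):
        W -= lr * dW
    for b, db in zip(bs, [db0, db1, db2]):
        b -= lr * db

dnn_mse = np.mean((yhat - y) ** 2)
```

Every gradient here comes from applying the chain rule by hand - exactly the derivation the tutorial walks through before handing the job over to an autograd package.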

We recommend cloning the repository and running it in your local IDE (it isn't compute-heavy and won't require GPUs), but you can also run it remotely on Google Colab via the "Open in Colab" badge.
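For step 4 of the plan, the same two-hidden-layer network can be sketched in PyTorch, where autograd derives all the gradients and deepening the model becomes a one-line change. As above, the data, widths, optimizer, and learning rate are illustrative assumptions, not the notebook's exact choices.

```python
import torch
import torch.nn as nn

# Hypothetical data standing in for the tutorial's task.
torch.manual_seed(0)
X = torch.randn(200, 5)
y = torch.sin(X @ torch.randn(5, 1))

# Same architecture as the hand-coded version; adding layers
# requires no new gradient derivations.
model = nn.Sequential(
    nn.Linear(5, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

with torch.no_grad():
    init_loss = loss_fn(model(X), y).item()

for step in range(1000):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()   # autograd replaces the hand-derived backward pass
    opt.step()
```

Comparing this loop with the NumPy version makes clear exactly what autograd automates: the forward pass is written once, and `loss.backward()` produces every gradient the by-hand derivation computed explicitly.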
