Taebaek00/CS231n

You can check out the summaries of the lectures and assignments through the following links.


Lecture Notes

1. Image Classification

Data-driven Approach, K-Nearest Neighbor, train/validation/test splits
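
As a rough illustration of the data-driven approach, a minimal k-nearest-neighbor classifier might look like the sketch below (NumPy; variable names and shapes are assumptions, not the assignment's code):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=1):
    """Predict labels for X_test by majority vote among the k nearest
    training points under L2 distance. y_train holds integer class labels."""
    preds = np.empty(X_test.shape[0], dtype=y_train.dtype)
    for i, x in enumerate(X_test):
        dists = np.sqrt(np.sum((X_train - x) ** 2, axis=1))  # distance to every training point
        nearest = np.argsort(dists)[:k]                       # indices of the k closest points
        preds[i] = np.bincount(y_train[nearest]).argmax()     # majority vote
    return preds
```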

2. Linear Classification

Support Vector Machine, Softmax
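
For reference, a minimal sketch of the two loss functions on a single example (a score vector `scores` and correct class `y`); this is illustrative only, not the vectorized assignment version:

```python
import numpy as np

def svm_loss(scores, y, delta=1.0):
    # multiclass hinge loss: penalize classes scoring within `delta` of the correct class
    margins = np.maximum(0, scores - scores[y] + delta)
    margins[y] = 0
    return margins.sum()

def softmax_loss(scores, y):
    # cross-entropy loss: negative log-probability of the correct class
    shifted = scores - scores.max()                  # shift for numerical stability
    probs = np.exp(shifted) / np.exp(shifted).sum()
    return -np.log(probs[y])
```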

3. Optimization

Stochastic Gradient Descent
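
A minimal sketch of a vanilla minibatch SGD loop, assuming hypothetical `sample_minibatch` and `loss_and_gradient` helpers:

```python
def sgd(W, data, learning_rate=1e-3, num_iters=1000, batch_size=256):
    """Vanilla stochastic gradient descent on parameters W."""
    for _ in range(num_iters):
        X_batch, y_batch = sample_minibatch(data, batch_size)   # hypothetical helper
        loss, grad = loss_and_gradient(W, X_batch, y_batch)     # hypothetical helper
        W -= learning_rate * grad                               # step against the gradient
    return W
```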

4. Backpropagation, Intuitions

chain rule interpretation, real-valued circuits, patterns in gradient flow
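
The chain-rule bookkeeping can be seen on a small real-valued circuit such as f(x, y, z) = (x + y) * z, in the spirit of the example worked in the note:

```python
# forward pass
x, y, z = -2.0, 5.0, -4.0
q = x + y        # q = 3
f = q * z        # f = -12

# backward pass: apply the chain rule node by node
df_dq = z              # d(q*z)/dq
df_dz = q              # d(q*z)/dz
df_dx = df_dq * 1.0    # dq/dx = 1
df_dy = df_dq * 1.0    # dq/dy = 1
print(df_dx, df_dy, df_dz)   # -4.0 -4.0 3.0
```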

5. Neural Networks Part 1: Setting up the Architecture

model of a biological neuron, activation functions, neural net architecture, representational power
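
A sketch of the forward pass of a 2-layer fully-connected net with a ReLU hidden layer (shapes and names are assumptions for illustration):

```python
import numpy as np

def two_layer_forward(X, W1, b1, W2, b2):
    """X: (N, D) inputs, W1: (D, H), W2: (H, C); returns class scores of shape (N, C)."""
    hidden = np.maximum(0, X.dot(W1) + b1)   # ReLU activation
    scores = hidden.dot(W2) + b2             # class scores
    return scores
```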

6. Neural Networks Part 2: Setting up the Data

preprocessing, weight initialization, batch normalization, regularization (L2/dropout), loss functions
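
A short sketch of zero-centering and fan-in-scaled weight initialization on toy data (batch normalization itself is sketched under Assignment #2 below; sizes are illustrative):

```python
import numpy as np

X_train = np.random.randn(500, 3072)    # toy stand-ins for flattened CIFAR-10 images
X_val = np.random.randn(50, 3072)

# zero-center using statistics computed on the training set only
mean_image = X_train.mean(axis=0)
X_train -= mean_image
X_val -= mean_image

# weight initialization: scale random weights by the fan-in
# (np.sqrt(2.0 / fan_in) is the usual recommendation for ReLU layers)
fan_in, fan_out = 3072, 100
W = np.random.randn(fan_in, fan_out) / np.sqrt(fan_in)
b = np.zeros(fan_out)
```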

7. Neural Networks Part 3: Learning and Evaluation

gradient checks, sanity checks, babysitting the learning process, momentum (+ Nesterov), second-order methods, Adagrad/RMSprop, hyperparameter optimization, model ensembles
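
Two of the covered update rules as minimal sketches (hyperparameter values are illustrative defaults):

```python
import numpy as np

def momentum_update(w, dw, v, learning_rate=1e-2, mu=0.9):
    """Classic momentum: accumulate a velocity and step along it."""
    v = mu * v - learning_rate * dw
    return w + v, v

def rmsprop_update(w, dw, cache, learning_rate=1e-3, decay=0.99, eps=1e-8):
    """RMSprop: scale the step by a running average of squared gradients."""
    cache = decay * cache + (1 - decay) * dw ** 2
    return w - learning_rate * dw / (np.sqrt(cache) + eps), cache
```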

8. Convolutional Neural Networks: Architectures, Pooling Layers

layers, spatial arrangement, computational considerations
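
The spatial-arrangement arithmetic in one small helper, output size = (W - F + 2P) / S + 1 (argument names are assumptions):

```python
def conv_output_size(W, F, P, S):
    """Spatial output size of a conv layer: input width W, filter size F,
    zero-padding P, stride S; the result must be an integer."""
    assert (W - F + 2 * P) % S == 0, "hyperparameters do not tile the input"
    return (W - F + 2 * P) // S + 1

# e.g. a 32x32 CIFAR-10 image, 3x3 filters, padding 1, stride 1 -> 32 (size preserved)
print(conv_output_size(32, 3, 1, 1))
```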

9. Convolutional Neural Networks: Layer Patterns, Case studies

layer sizing patterns, AlexNet/ZFNet/VGGNet case studies


Assignments

#1

K-Nearest Neighbor

SVM (Support Vector Machine)

Softmax

Two Layer Net

Higher Level Representations: Image Features

#2

Fully-connected Neural Network

Use modular layer design to implement fully-connected networks of arbitrary depth.
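
A sketch of the kind of forward/backward pair the modular design is built from; the function names follow the assignment's convention, but this is a simplified version:

```python
import numpy as np

def affine_forward(x, w, b):
    """Forward pass of a fully-connected layer; returns output and a cache for backprop."""
    out = x.reshape(x.shape[0], -1).dot(w) + b
    return out, (x, w, b)

def affine_backward(dout, cache):
    """Backward pass: gradients w.r.t. input, weights, and bias."""
    x, w, b = cache
    x_flat = x.reshape(x.shape[0], -1)
    dx = dout.dot(w.T).reshape(x.shape)
    dw = x_flat.T.dot(dout)
    db = dout.sum(axis=0)
    return dx, dw, db
```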

Fully-connected Neural Network 2

Implement several popular update rules to optimize these models.
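
One representative update rule, Adam, as a simplified sketch (dictionary keys and default values are assumptions):

```python
import numpy as np

def adam_update(w, dw, config):
    """One Adam step: running first/second moment estimates with bias correction."""
    config['t'] += 1
    config['m'] = config['beta1'] * config['m'] + (1 - config['beta1']) * dw
    config['v'] = config['beta2'] * config['v'] + (1 - config['beta2']) * dw ** 2
    m_hat = config['m'] / (1 - config['beta1'] ** config['t'])   # bias correction
    v_hat = config['v'] / (1 - config['beta2'] ** config['t'])
    return w - config['lr'] * m_hat / (np.sqrt(v_hat) + config['eps'])

config = dict(lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, t=0, m=0.0, v=0.0)
w, dw = np.zeros(5), np.random.randn(5)   # toy parameter and gradient
w = adam_update(w, dw, config)
```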

Batch Normalization

Implement batch normalization and use it to train deep fully-connected networks.
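
A training-time batch normalization forward pass in miniature (the running-mean/variance bookkeeping needed at test time is omitted):

```python
import numpy as np

def batchnorm_forward_train(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the minibatch, then scale and shift
    with the learned gamma/beta parameters."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```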

Dropout

Implement Dropout and explore its effects on model generalization.
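
A sketch of inverted dropout with keep-probability p, which needs no extra rescaling at test time:

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True):
    """Drop units at train time with probability 1 - p and rescale the
    survivors by 1/p; at test time the input passes through unchanged."""
    if not train:
        return x
    mask = (np.random.rand(*x.shape) < p) / p
    return x * mask
```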

Convolutional Networks

Implement several new layers that are commonly used in convolutional networks.
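
A naive, loop-based convolution forward pass, in the spirit of an unvectorized reference implementation (array layout and names are assumptions):

```python
import numpy as np

def conv_forward_naive(x, w, b, stride=1, pad=1):
    """x: (N, C, H, W) input, w: (F, C, HH, WW) filters, b: (F,) biases."""
    N, C, H, W_in = x.shape
    F, _, HH, WW = w.shape
    H_out = (H + 2 * pad - HH) // stride + 1
    W_out = (W_in + 2 * pad - WW) // stride + 1
    x_pad = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))  # zero-pad spatial dims
    out = np.zeros((N, F, H_out, W_out))
    for n in range(N):
        for f in range(F):
            for i in range(H_out):
                for j in range(W_out):
                    patch = x_pad[n, :, i*stride:i*stride+HH, j*stride:j*stride+WW]
                    out[n, f, i, j] = np.sum(patch * w[f]) + b[f]
    return out
```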

PyTorch on CIFAR-10

Learn how PyTorch works, culminating in training a convolutional network on CIFAR-10.
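
For flavor, a minimal PyTorch training loop on CIFAR-10; the architecture and hyperparameters are illustrative, not the assignment's:

```python
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as T
from torch.utils.data import DataLoader

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

train_set = torchvision.datasets.CIFAR10(root='./data', train=True, download=True,
                                         transform=T.ToTensor())
loader = DataLoader(train_set, batch_size=64, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 10),
).to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
criterion = nn.CrossEntropyLoss()

for x, y in loader:                      # one epoch of training
    x, y = x.to(device), y.to(device)
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```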

TensorFlow on CIFAR-10

Learn how TensorFlow works, culminating in training a convolutional network on CIFAR-10.
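
A comparable Keras-style TensorFlow sketch (again illustrative, not the assignment's notebook):

```python
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.cifar10.load_data()
x_train = x_train.astype('float32') / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, padding='same', activation='relu',
                           input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])
model.compile(optimizer='sgd',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=64, epochs=1)
```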

#3

Image Captioning with Vanilla RNNs

Build an image captioning system on MS-COCO using vanilla recurrent networks.
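
A single vanilla RNN time step as used in the captioning model; the name follows the assignment's convention, but this is a simplified sketch:

```python
import numpy as np

def rnn_step_forward(x, prev_h, Wx, Wh, b):
    """One time step of a vanilla RNN: h_t = tanh(x_t Wx + h_{t-1} Wh + b).
    In the captioning model x_t is a word embedding and h_0 comes from image features."""
    return np.tanh(x.dot(Wx) + prev_h.dot(Wh) + b)
```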

Image Captioning with LSTMs

Implement Long Short-Term Memory (LSTM) RNNs and apply them to image captioning on MS-COCO.
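
And a single LSTM time step, with the gates computed from one combined affine transform (simplified sketch; H is the hidden dimension):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step_forward(x, prev_h, prev_c, Wx, Wh, b):
    """Compute input/forget/output gates and the candidate cell value,
    then update the cell and hidden states."""
    H = prev_h.shape[1]
    a = x.dot(Wx) + prev_h.dot(Wh) + b          # shape (N, 4H)
    i = sigmoid(a[:, 0:H])                      # input gate
    f = sigmoid(a[:, H:2*H])                    # forget gate
    o = sigmoid(a[:, 2*H:3*H])                  # output gate
    g = np.tanh(a[:, 3*H:4*H])                  # candidate cell value
    next_c = f * prev_c + i * g
    next_h = o * np.tanh(next_c)
    return next_h, next_c
```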
