# Machine Learning with TensorFlow 2 [Convergence New Technology Education Program 2 (AA0020)]

This git repository contains summarized notes from the lectures and hands-on labs of the Inha University Innovation Center for Engineering Education course Machine Learning with TensorFlow 2 [Convergence New Technology Education Program 2 (AA0020)].

## Index

1. Basic
    1. Python
    2. Numpy
    3. Pandas
  2. KNN
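The KNN entry above can be sketched as a minimal k-nearest-neighbors classifier in plain Python (Euclidean distance, majority vote); the function name and data layout here are illustrative, not taken from the course materials:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    train: list of (feature_tuple, label) pairs.
    """
    # Sort training points by squared Euclidean distance to the query.
    nearest = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))
    # Majority vote over the labels of the k closest points.
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]
```

For example, `knn_predict(train, (1, 1))` returns `"A"` because the three closest training points all carry that label.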

  3. Decision Tree

4. Regression Analysis
    1. Regression
        - Linear Regression
        - Least Square Method
        - Logistic Regression
        - Cost Function / Sigmoid Function
        - Gradient Descent Method
    2. Using TensorFlow
        - Linear Regression
        - Logistic Regression
        - MNIST Datasets
        - Categorical Encoding / Softmax Function
        - SGD / BGD
        - Epoch / Batch
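The core regression ideas above (a least-square cost minimized by gradient descent) fit in a few lines. This is a plain-Python sketch rather than the course's TensorFlow version; the data and learning rate are illustrative:

```python
# Fit y = w*x + b by batch gradient descent on the mean-squared-error cost
# J(w, b) = (1/n) * sum((w*x_i + b - y_i)^2)  -- the least square criterion.
def fit_linear(xs, ys, lr=0.01, epochs=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Partial derivatives of J with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        # Step against the gradient (the gradient descent update rule).
        w -= lr * dw
        b -= lr * db
    return w, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # generated by y = 2x + 1
w, b = fit_linear(xs, ys)        # converges toward w = 2, b = 1
```

The same loop is what `tf.GradientTape` automates in the TensorFlow versions: compute the cost, differentiate it, and step the variables.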
5. Neural Network
    1. Neural Network
        - Neuron Cell / Perceptron
        - Neural Network
        - Hidden Layer
        - Backpropagation
    2. Using TensorFlow
        - Gradient Vanishing Problem
        - ReLU / LeakyReLU / ELU Function
        - Optimizers
            - Xavier / He Initialization
            - Dropout / Batch Normalization
            - Momentum / Nesterov Momentum
            - AdaGrad / RMSProp / Adam
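A dependency-free sketch of the activation functions listed above, plus the sigmoid derivative whose saturation causes the gradient vanishing problem (the `alpha` defaults follow common convention, not necessarily the course's):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative s(x) * (1 - s(x)) peaks at 0.25 and shrinks toward 0
    # for large |x| -- repeated through many layers, gradients vanish.
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    # Keeps a derivative of 1 for all positive inputs.
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small nonzero slope for negative inputs avoids "dead" neurons.
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # Smooth negative branch that saturates at -alpha.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```

Comparing `sigmoid_grad(10.0)` (on the order of 1e-5) with the constant unit slope of `relu` on positive inputs shows why the ReLU family helps deep networks train.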
6. Convolutional Neural Network
    1. CNN
        - Convolutional Layer
        - Pooling Layer
        - Toy Image
    2. MNIST CNN
    3. MNIST CNN DNN
    4. Example: Fashion MNIST
    5. Example: CIFAR10 Dataset
        1. CIFAR10 w/ Dropout (approx. 74%)
        2. CIFAR10 w/ Dropout, Batch Normalization (approx. 79%)
        3. CIFAR10 w/ Dropout, Batch Normalization, Kernel Regularizer (approx. 84%)
        4. CIFAR10 w/ Dropout, Batch Normalization, Kernel Regularizer, ImageDataGenerator (approx. 88%)
    6. Example: CatsVsDogs Dataset
        1. Split Data Set
        2. CNN_Train
        3. CNN_Test
        4. ImageGenerator_Train
        5. ImageGenerator_Test
        6. VGG16_Train
        7. VGG16_Test
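The Convolutional Layer / Pooling Layer / Toy Image items can be illustrated with a plain-Python sketch of valid-mode 2D convolution and 2x2 max pooling on a toy image (helper names are my own; TensorFlow's `Conv2D` and `MaxPooling2D` layers do the same operations on batched tensors):

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation of a 2D image with a 2D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    # Each output pixel is the elementwise product-sum of the kernel
    # with the image patch it currently covers.
    return [[sum(image[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(out_w)] for i in range(out_h)]

def max_pool2x2(fmap):
    """Downsample a feature map by taking the max of each 2x2 window."""
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

toy = [[1, 1, 1, 1]] * 4
feature_map = conv2d(toy, [[1, 1], [1, 1]])   # 3x3 map, every entry 4
```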
  7. Auto Encoder

8. Recurrent Neural Network
    1. RNN
        - Time Sequence Forecasting
        - RNN
            - one to one
            - one to many
            - many to one
            - many to many
            - Multiple Layer RNN
        - Example: Character RNN
        - TimeDistributed Layer
        - Embedding Layer
        - Projection Layer
    2. RNN
        - Gradient Vanishing in RNN
        - Long Short Term Memory
        - Stacked RNN
    3. Example: Stock Data
    4. Example: IMDB Dataset
    5. Word2Vec
    6. Seq2Seq
        - Attention
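The recurrence at the heart of the RNN section reduces to one update, h_t = tanh(w_x * x_t + w_h * h_{t-1} + b). A scalar-state sketch of the many-to-one pattern listed above (the weights are illustrative constants, not trained values):

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.0):
    # One step of a vanilla RNN cell: mix the current input with the
    # previous hidden state, then squash through tanh.
    return math.tanh(w_x * x + w_h * h + b)

def rnn_many_to_one(xs):
    h = 0.0                 # initial hidden state
    for x in xs:            # unroll the cell over the time sequence
        h = rnn_step(x, h)
    return h                # final hidden state summarizes the whole sequence
```

Because the same `w_h` multiplies the state at every step, gradients through long sequences shrink or blow up geometrically, which is the "Gradient Vanishing in RNN" problem that LSTM cells are designed to mitigate.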
9. Appendix
    1. VAE
    2. Image / Video Preprocessing