CMSC389A: Practical Deep Learning


Course Description

“Deep Learning” systems, typified by deep neural networks, are increasingly taking over AI tasks, ranging from language understanding, speech and image recognition, and machine translation to planning, game playing, and even autonomous driving. As a result, expertise in deep learning is fast becoming a prerequisite in many advanced academic and research settings, and a large advantage in the industrial job market.

This course provides a comprehensive, practical introduction to modern deep learning networks and their applications to AI tasks. Specifically, the course will cover basic concepts in optimization, neural networks, convolutional neural networks (CNN), and recurrent neural networks (RNN). By the end of the course, it is expected that students will have a strong familiarity with the subject and be able to design and develop deep learning models for a variety of tasks.

Course Details

  • Course: [Website] [GitHub] [Piazza] [Testudo]
  • Prerequisites: C- or better in CMSC330 and CMSC250, Proficiency in Python, Basic knowledge of Machine Learning
  • Credits: 1
  • Seats: 30
  • Lecture Time: Fridays, 12:00-12:50 PM
  • Location: CSIC 2118
  • Semester: Spring 2018
  • Textbook: None
  • Course Facilitator: Sujith Vishwajith
  • Faculty Advisor: Jordan Boyd-Graber
  • Syllabus Last Updated: January 24, 2018

The course assumes that you know:

  • Linear Algebra, Calculus (vectors, matrices, basic integrals)
  • Probability (Bayes theorem, expectation, variance)
  • Basic machine learning (linear models, regression, decision trees)
  • Coding (python, numpy, sklearn)

If you're "not sure" about some of them, it's most likely okay: you'll be able to grasp the missing concepts as you go through the course.
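
As a rough gauge of the assumed coding background, here is a minimal sketch, not course material, that fits a linear classifier with NumPy and scikit-learn. The toy dataset, its shape, and the train/test split are made up purely for illustration.

```python
# Illustrative only: fit a logistic regression on a toy dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy data: 200 samples, 5 features, binary labels derived from two features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression()
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

If reading and writing code at roughly this level feels comfortable, you have the coding background the course assumes.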

Textbooks

Required: None

Recommended:

Topics Covered

  • Optimization
    • Linear models review (logistic regression, etc.)
    • Gradient descent
    • Stochastic gradient descent
  • Neural Networks
    • Multilayer perceptron
    • Backpropagation
    • Gradients/Weights
    • Learning rates and data normalization
    • Activation functions, Optimizers, Regularization
    • Dropout, Momentum, BatchNorm, etc.
  • Convolutional Neural Networks
    • Motivation
    • Convolution operations
    • Pooling
    • Image classification
    • Modern CNN architectures (VGG, ResNet, etc.)
  • Recurrent Neural Networks
    • Motivation
    • Vanishing/Exploding gradient problem
    • Applications to sequences (text)
    • Modern RNN architectures (LSTM, GRU, etc.)
  • Tuning/Debugging Neural Networks
    • Parameter search
    • Overfitting
    • Visualizations
  • Pretrained Models
    • Word2Vec, GloVe, etc.
    • Using pretrained models for different tasks
  • Libraries
    • Keras
    • PyTorch
    • Numpy
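
To make the topics above concrete, here is a minimal sketch, not course-provided code, of the kind of model the course builds toward: a small multilayer perceptron with dropout, trained with stochastic gradient descent in Keras. The toy data, layer sizes, and hyperparameters are placeholder assumptions for illustration.

```python
# Illustrative only: a small MLP trained with SGD on random toy data.
import numpy as np
from tensorflow import keras

# Toy data standing in for a real dataset: 1,000 samples, 20 features, binary labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

# A multilayer perceptron with dropout for regularization.
model = keras.Sequential([
    keras.layers.Input(shape=(20,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(1, activation="sigmoid"),
])

# Stochastic gradient descent with momentum; binary cross-entropy loss.
model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```

By the end of the semester you should be able to write, tune, and debug models like this one on real data, and swap the dense layers for convolutional or recurrent ones as the task demands.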

Grading

Grades will be maintained on the CS Department grades server.

You are responsible for all material discussed in lecture and posted on the class repository, including announcements, deadlines, policies, etc.

Your final course grade will be determined according to the following percentages:

| Percentage | Title | Description |
| --- | --- | --- |
| 40% | Projects | Weekly individual projects that teach practical skills and real-life applications. |
| 20% | Midterm | Examination. |
| 40% | Final Project | Final project to demonstrate mastery of all topics learned and apply knowledge to create a new application from scratch. |

Any request for reconsideration of any grading on coursework must be submitted within one week of when it is returned. No requests will be considered afterwards.

Timeline

| Week | Topic | Assignment |
| --- | --- | --- |
| 1 (1/26) | Intro to Deep Learning + Logistic Regression | |
| 2 (2/2) | Perceptrons + Environment Setup | P1 OUT |
| 3 (2/9) | MLPs + Backpropagation | |
| 4 (2/16) | Neural Networks | P1 DUE, P2 OUT |
| 5 (2/23) | Neural Networks (cont.) | |
| 6 (3/2) | Tuning & Debugging Neural Networks | |
| 7 (3/9) | Deep Learning for Images | P2 DUE |
| 8 (3/16) | Deep Learning for Images (cont.) | E1 DUE, P3 OUT |
| 9 (3/23) | SPRING BREAK | |
| 10 (3/30) | Deep Learning for Sequences | |
| 11 (4/6) | Pretrained Models + Advanced Architectures | P3 DUE, Final Project Proposals DUE |
| 12 (4/13) | EXAM | |
| 13 (4/20) | Advanced Architectures (cont.) | P4 OUT |
| 14 (4/27) | Siamese Networks | |
| 15 (5/4) | Reinforcement Learning | P4 DUE (5/6), Checkpoint DUE |
| 16 (5/11) | Deep Reinforcement Learning | Final Project DUE |

The timeline is not final and is subject to change.

Projects

Projects must be submitted electronically following the instructions given in each project assignment. Projects may not be submitted by any other means (e.g., please do not email your projects to us). It is your responsibility to test your program and verify that it works properly before submitting. All projects are due at 11:59 PM on the day indicated on the project assignment.

Projects may be submitted up to 24 hours late for a 10% penalty. If you submit both on time and late, your project will receive the maximum of the penalty-adjusted scores. You may submit multiple times.

Unlike lower-level programming classes, we will not provide you with test cases (e.g., public tests) before projects are due. You will be responsible for developing your own tests and for using appropriate testing techniques. Also, we expect your projects to use proper style and documentation.

Final Project

The final project will be an opportunity to apply the course material to a deep learning application that you will build from the ground up to accomplish a task of your choosing. Some example tasks are facial recognition, predicting stock prices, and disease classification. Students will also be required to submit a writeup and/or a 2-5 minute video highlighting how their application works. More information on the final project will be available as the course progresses.

Outside-of-class communication with course staff

We will interact with students outside of class primarily in two ways: in person during office hours and on Piazza. Email should only be used for emergencies, not class-related questions (e.g., projects). You can enroll in the Piazza discussion through here. Please use Piazza as the main method of communication for content and class questions. For personal discussions, please email Sujith Vishwajith first and he will escalate the matter to Dr. Boyd-Graber if necessary.

Course Facilitator:

Sujith Vishwajith - [email protected]

Advisor:

Dr. Jordan Boyd-Graber - [email protected]

Excused Absence and Academic Accommodations

See the section titled "Attendance, Absences, or Missed Assignments" available at Course Related Policies.

Disability Support Accommodations

See the section titled "Accessibility" available at Course Related Policies.

Academic Integrity

Note that academic dishonesty includes not only cheating, fabrication, and plagiarism, but also includes helping other students commit acts of academic dishonesty by allowing them to obtain copies of your work. In short, all submitted work must be your own. Cases of academic dishonesty will be pursued to the fullest extent possible as stipulated by the Office of Student Conduct.

It is very important for you to be aware of the consequences of cheating, fabrication, facilitation, and plagiarism. For more information on the Code of Academic Integrity or the Student Honor Council, please visit http://www.shc.umd.edu.

Course Evaluations

If you have a suggestion for improving this class, don't hesitate to tell the instructor or TAs during the semester. At the end of the semester, please don't forget to provide your feedback using the campus-wide CourseEvalUM system. Your comments will help make this class better.

Thanks to the writers of this syllabus and this syllabus for the wording of much of this document.
