
Deep Learning

Contents

  • Description
  • Learning Outcomes
  • Assessment
  • Contacts
  • Delivery of the Learning Module
  • Schedule
  • Requirements
  • Resources
  • Folder Structure

Description

This module offers both the fundamental understanding and the practical skills necessary to develop, implement, test, and validate various deep learning models. The curriculum delves into the core concepts of Deep Learning, emphasizing its application across diverse domains. Participants will explore the intricacies of neural networks, backpropagation, and the advanced architectures used in image processing, natural language processing, and more.

Learning Outcomes

By the end of this learning module, participants will be able to:

  1. Apply principles of neural networks, including architectures like CNNs and RNNs.
  2. Implement deep learning models for tasks in image processing, NLP, and recommendation systems.
  3. Utilize advanced techniques such as sequence-to-sequence models and attention mechanisms.
  4. Evaluate and address challenges in model training, imbalanced classification, and metric learning.
  5. Use Keras and TensorFlow to build and train models, with an emphasis on reproducible research (see the sketch after this list).
  6. Explain the ethical implications of deep learning models effectively to diverse audiences.
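
As a taste of the Keras/TensorFlow workflow referenced in outcome 5, the sketch below shows one common pattern for reproducible experiments: fix the random seeds, build a small model, train it, and evaluate it. This is a minimal example assuming a recent TensorFlow 2.x installation; the data, layer sizes, and seed value are arbitrary choices made for illustration and are not taken from the module materials.

```python
# A minimal reproducibility sketch, assuming a recent TensorFlow 2.x install.
# The data and architecture here are toy examples chosen for illustration;
# they are not part of the module's assignments.
import numpy as np
import tensorflow as tf

# Seed Python, NumPy, and TensorFlow RNGs in one call so that weight
# initialization and shuffling are repeatable across runs.
tf.keras.utils.set_random_seed(42)

# Toy data: 200 samples with 10 features and a binary label.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 10)).astype("float32")
y = (X.sum(axis=1) > 0).astype("int32")

# A small fully connected network for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# With the seeds fixed, re-running this script on the same hardware and
# library versions should print the same loss and accuracy.
print(model.evaluate(X, y, verbose=0))
```

Note that bit-for-bit reproducibility can still vary across hardware and library versions, so recording both the seed and the environment is good practice.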

Assessment

The assessment for this module is based on three components: assignments, Jupyter notebook completion, and class participation.

| Assessment        | Number | Individual Weight | Cumulative Weight |
|-------------------|--------|-------------------|-------------------|
| Assignments       | 2      | 35%               | 70%               |
| Jupyter Notebooks | 10     | 2%                | 20%               |
| Participation     | NA     | NA                | 10%               |
  • Assignments consist of two major tasks, one due at the end of each of the module's two weeks.
  • Jupyter Notebooks are to be completed throughout the module. Completion of these notebooks is pass/fail.
  • Participation includes engagement in class discussions, activities, and overall contribution to the module environment.

Assignments

Assignments are a vital part of this module, focusing on the application of deep learning concepts. Two main assignments are scheduled, one at the end of each of the two weeks. These assignments will be introduced in a live session and can be discussed with the Technical Facilitator or Learning Support during office hours, work periods, or via Slack. They should be completed independently.

| Assessment   | Link     | Due Date |
|--------------|----------|----------|
| Assignment 1 | Notebook | TBD      |
| Assignment 2 | Notebook | TBD      |
| Workbooks    |          | TBD      |

You may submit assignments multiple times before the deadline. The last submission will be graded.

Notebook Completion

Participants are expected to complete the Jupyter notebooks associated with each session. Completion includes actively coding along with the Technical Facilitator and answering any questions in the notebooks. These notebooks are due by the end of the module, but it is highly recommended to complete them as you progress through the material to stay on top of the content. Notebooks are to be submitted for pass/fail assessment.

Submitting Notebooks

Notebooks are to be submitted together at the end of the module. You may submit notebooks multiple times before the deadline. The last submission will be graded.

Participation

We hope all members of the module participate regularly. We define participation broadly to include attendance, asking questions, answering others' questions, participating in discussions, and so on.

Contacts

Questions can be submitted to the #cohort-3-help channel on Slack.

  • Technical Facilitator: {Name} {Pronouns}. Emails to the Technical Facilitator can be sent to {first_name.last_name}@mail.utoronto.ca.
  • Learning Support Staff: {Name} {Pronouns}. Emails to the Learning Support Staff can be sent to {first_name.last_name}@mail.utoronto.ca.

Delivery of the Learning Module

This module will include live learning sessions and optional, asynchronous work periods. During live learning sessions, the Technical Facilitator will introduce and explain key concepts and demonstrate core skills. Learning is facilitated during this time. Before and after each live learning session, the instructional team will be available for questions related to the core concepts of the module. The Technical Facilitator will introduce concepts through a collaborative live coding session using the Python notebooks found under /01_materials/slides. The Technical Facilitator will also upload live coding files to this repository for participants to revisit under ./04_cohort_three/live_code.

Optional work periods are to be used to seek help from peers and the Learning Support team, and to work through the homework and assignments in the learning module with access to live help. Content is not facilitated; this time is driven by participants. We encourage participants to come to these work periods with questions and problems to work through.

Participants are encouraged to engage actively throughout the learning module. The key to developing the core skills in each learning module is practice: the more participants code along with the instructional team and apply the skills in each module, the more likely it is that these skills will solidify.

This module's materials are adapted from the Deep Learning module taught at Master Year 2 Data Science IP-Paris. The module includes comprehensive lectures and lab notebooks covering fundamental and advanced topics in Deep Learning. While there is no designated textbook for this module, the adapted materials provide a thorough exploration of the subject, incorporating a blend of theoretical knowledge and practical applications.

Schedule

| Live Learning Session | Date | Topic | Slides | Workbooks | Suggested Additional Material |
|-----------------------|------|-------|--------|-----------|-------------------------------|
| 1 | TBD | Introduction to Deep Learning | Slides | Lab 1 Workbook | |
| 2 | TBD | Neural Networks and Backpropagation | Slides | Lab 2 Workbook | 3Blue1Brown Neural Networks |
| 3 | TBD | Embeddings and Recommender Systems | Slides | Lab 3 Workbook | |
| 4 | TBD | Convolutional Neural Networks for Image Classification | Slides | Lab 4 Workbook | |
| 5 | TBD | Deep Learning for Object Detection and Image Segmentation | Slides | Lab 5 Workbook | |
| 6 | TBD | Recurrent Neural Networks and NLP | Slides | Lab 6 Workbook | |

Requirements

  • Participants are expected to have completed the Shell, Git, and Python; Linear Regression, Classification, and Resampling; Production; and Algorithms & Data Structures learning modules.
  • Participants are encouraged to ask questions, and collaborate with others to enhance their learning experience.
  • Participants must have a computer and an internet connection to participate in online activities.
  • Participants must not use generative AI such as ChatGPT to generate code in order to complete assignments. Generative AI should be used only as a supportive tool to seek out answers to questions participants may have.
  • We expect participants to have completed the instructions mentioned in the onboarding repo.
  • We encourage participants to default to having their camera on at all times, and turning the camera off only as needed. This will greatly enhance the learning experience for all participants and provides real-time feedback for the instructional team.
  • Participants must have VSCode installed with the following extensions:

Resources

Feel free to use the following as resources:

Documents

Videos

How to get help

See steps_to_ask_for_help.png in the repository root for a step-by-step guide on how to ask for help.

Folder Structure

.
├── .github
├── 01_materials
├── 02_activities
├── 03_instructional_team
├── 04_cohort_three
├── .gitignore
├── LICENSE
├── README.md
└── steps_to_ask_for_help.png
  • .github: Contains issue templates and pull request templates for the repository.
  • 01_materials: Module slides and interactive notebooks (.ipynb files) used during learning sessions.
  • 02_activities: Graded assignments, exercises, and homework to practice concepts covered in the learning module.
  • 03_instructional_team: Resources for the instructional team.
  • 04_cohort_three: Additional materials and resources for cohort three.
  • .gitignore: Specifies files excluded from version control, as determined by the Technical Facilitator.
  • LICENSE: The license for this repository.
  • README.md: This file.
  • steps_to_ask_for_help.png: Guide on how to ask for help.
