CSCE 5218 – Deep Learning

Spring 2024    


Basic information:


Course description

This course covers the basics of modern deep neural networks. The first part introduces fundamental concepts in neural networks, including network architecture, activation functions, loss functions, and optimization. The second part describes specific types of deep neural networks, such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), and attention-based Transformers, along with their applications in computer vision and natural language processing. The final part briefly discusses recent advanced topics in deep learning, including graph neural networks, unsupervised representation learning, deep reinforcement learning, and generative adversarial networks (GANs). Hands-on practice implementing deep learning algorithms in Python is provided through the homework assignments and the course project.
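To give a flavor of this hands-on component, below is a minimal, illustrative sketch in PyTorch (the framework introduced in the tutorial session on the schedule). The network, the synthetic data, and the hyperparameters are made up for illustration only and are not taken from any assignment; the sketch simply shows how the pieces named above (architecture, activation function, loss, and optimization) fit together.

    # Illustrative only: a tiny fully connected classifier trained on random data.
    import torch
    import torch.nn as nn

    # Architecture: two linear layers with a ReLU activation in between.
    model = nn.Sequential(
        nn.Linear(20, 64),   # 20 input features -> 64 hidden units
        nn.ReLU(),           # activation function
        nn.Linear(64, 3),    # 64 hidden units -> 3 class scores
    )

    loss_fn = nn.CrossEntropyLoss()                           # loss
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)   # optimization

    # Synthetic data standing in for a real dataset (128 examples, 3 classes).
    x = torch.randn(128, 20)
    y = torch.randint(0, 3, (128,))

    for step in range(100):
        optimizer.zero_grad()         # clear gradients from the previous step
        loss = loss_fn(model(x), y)   # forward pass and loss computation
        loss.backward()               # backpropagation
        optimizer.step()              # gradient descent update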


Textbooks

We will have required readings from the following textbook:

  • Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, MIT Press, 2016 (cited as "Deep Learning" in the schedule below).

The following textbooks are also useful as additional references:

In addition to the textbooks, extra reading materials will be provided as we cover topics. Check the course website regularly for updated reading materials.


Announcements

Links


Paper review list

Important: Read the requirements for paper reviews (click here). A review example is also provided for your reference. (Paper review lists will be added gradually.)

Paper review list 1 (due on 2/8):

  1. A. Krizhevsky, I. Sutskever, and G. Hinton, ImageNet Classification with Deep Convolutional Neural Networks, NeurIPS, 2012.
  2. A. Paszke, et al., PyTorch: An Imperative Style, High-Performance Deep Learning Library, NeurIPS, 2019.

Paper review list 2 (due on 2/20):

  1. K. Simonyan and A. Zisserman, Very Deep Convolutional Networks for Large-Scale Image Recognition, ICLR, 2015.
  2. K. He, X. Zhang, S. Ren, and J. Sun, Deep Residual Learning for Image Recognition, CVPR, 2016.

Paper review list 3 (due on 2/29):

  1. R. Girshick, J. Donahue, T. Darrell, and J. Malik, Rich feature hierarchies for accurate object detection and semantic segmentation, CVPR, 2014.
  2. R. Girshick, Fast R-CNN, ICCV, 2015.

Paper review list 4 (due on 3/10):

  1. S. Ren, K. He, R. Girshick, and J. Sun, Faster R-CNN: Towards real-time object detection with region proposal networks, NeurIPS, 2015.
  2. J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, You only look once: Unified, real-time object detection, CVPR, 2016.
  3. J. Long, E. Shelhamer, and T. Darrell, Fully Convolutional Networks for Semantic Segmentation, CVPR, 2015.

Paper review list 5 (due on 3/28):

  1. J. Chung, C. Gulcehre, K. Cho, and Y. Bengio, Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling, NeurIPS Workshop, 2014.
  2. J. Donahue, L. Hendricks, S. Guadarrama, M. Rohrbach, S. Venugopalan, K. Saenko, and T. Darrell, Long-term recurrent convolutional networks for visual recognition and description, CVPR, 2015.

Paper review list 6 (due on 4/11):

  1. D. Bahdanau, K. Cho, and Y. Bengio, Neural machine translation by jointly learning to align and translate, ICLR, 2015.
  2. A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. Gomez, Ł. Kaiser, and I. Polosukhin, Attention is all you need, NeurIPS, 2017.
  3. A. Dosovitskiy, L. Beyer, A. Kolesnikov, D. Weissenborn, X. Zhai, T. Unterthiner, M. Dehghani, M. Minderer, G. Heigold, S. Gelly, J. Uszkoreit, and N. Houlsby, An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale, ICLR, 2021.


Schedule and class notes (being updated)

Date | Lecture | Reading | Note

Week 1
1/16 | Introduction | - | -
1/18 | Machine learning overview | Deep Learning Ch 1-5 | -

Week 2
1/23 | Neural network basics-1 | Deep Learning Ch 4.2, 4.3, 6 | -
1/25 | Neural network basics-2 | Deep Learning Ch 4.2, 4.3, 6 | -

Week 3
1/30 | Canceled | - | -
2/1 | Deep neural network training-1 | Deep Learning Ch 7, 8, 11 | -

Week 4
2/6 | Neural network basics-2 | Deep Learning Ch 4.2, 4.3, 6 | -
2/8 | PyTorch Tutorial | - | Guest lecture by Xiaoqiong Liu

Week 5
2/13 | Deep neural network training-3 | Deep Learning Ch 7, 8, 11 | -
2/15 | Convolutional Neural Networks (CNNs): Convolution and Pooling | Deep Learning Ch 9 | -

Week 6
2/20 | CNNs: Convolution and Pooling (cont.) | Deep Learning Ch 9 | -
2/22 | CNNs: Architectures and Applications | Deep Learning Ch 9 | -

Week 7
2/27 | CNNs: Architectures and Applications (cont.) | Deep Learning Ch 10 | -
2/29 | Recurrent Neural Networks (RNNs): Basics and Architecture | Deep Learning Ch 10 | -

Week 8
3/5 | Project Proposal Preparation | - | -
3/7 | Project Proposal Preparation | - | -

Week 9
Spring Break (no classes)

Week 10
3/19 | RNNs: Basics and Architecture (cont.) | Deep Learning Ch 10 | -
3/21 | RNNs: Applications | Deep Learning Ch 10 | -

Week 11
3/26 | Transformers: Background; Self- and Cross-attention | Ref 1, Ref 2, Ref 3, Ref 4 (a blog), Ref 5 | -
3/28 | Transformers: Self- and Cross-attention (cont.); Applications beyond language | Ref 1, Ref 2, Ref 3, Ref 4 (a blog), Ref 5 | -

Week 12
4/2 | Advanced Topics: Graph Convolutional Networks; Self-supervised Learning | Ref 1, Ref 2 | -
4/4 | Advanced Topics: Reinforcement Learning | Ref 1, Ref 2, Ref 3 | -

Week 13
4/9 | Advanced Topics: Generation; Generative Adversarial Networks | Ref 1 | -
4/11 | Advanced Topics: Bias and Ethics (optional) | - | -

Week 14
4/16 | Project Presentation | - | -
4/18 | Project Presentation | - | -

Week 15
4/23 | Project Presentation | - | -
4/25 | Project Presentation | - | -

Week 16
4/30 | Project Presentation | - | -

Final Exam: 10:30 am - 12:30 pm, May 9, at K110


Grading policy

Grading will be based on the following components:


Resources and Acknowledgments

This course is inspired by the following courses: