Syllabus

All of the video lectures can be found in this YouTube Playlist.

Lecture Slides are password protected. 

  • Week 01: Introduction, course description, administrative details, feature spaces, k-nearest neighbors, classification
  • Week 02: Machine Learning Pipeline, Linear Classifiers, Classifier Evaluation
  • Week 03: Loss Functions, Optimization, Gradient Descent
  • Week 04: Neural Networks, Perceptron Layers, Backpropagation
  • Week 05: Convolutional, Pooling and Softmax Layers, Convolutional Neural Networks (CNNs)
  • Week 06: Analysis of a complete network (AlexNet) and a practical session on deep learning coding
    • Practicum: Using the (password protected) code, we will train our first classification network. This code will serve as the base for your projects (a generic sketch of such a training loop appears at the end of this syllabus).
  • Week 07: Training CNNs, activation functions, initialization, batch normalization...
    • Lecture Slides (Stanford CS231n Lecture 6 Slides)
    • Here is a good article on activation functions.
    • And here is another nice article on data preprocessing (whitening).
    • Also, here is another nice article on weight initialization.
  • Week 08: More on Training, Solvers, Regularization Techniques, Data Augmentation, Transfer Learning.
    • Lecture Slides (Stanford CS231n Lecture 7 Slides)
    • Here is a useful article on optimization algorithms (solvers) in CNNs.
  • Week 09: Classification Architectures: AlexNet, VGG, GoogLeNet, ResNet...
  • Week 10: Analysis of a complete Deep Learning Project
    • (dataset construction, training, results analysis, etc.)
    • Here is some code.
  • Week 11: Introduction to Recurrent Neural Networks
    • Lecture Slides (Stanford CS231n Lecture 10 Slides)
    • Those who are interested can find a very nice (both mathematical and philosophical) introduction to the subject here.
  • Week 12: Midterm week.
  • Week 13: Memory, GRU and LSTM, Advanced CNN architectures.
  • Week 14: Project Presentations
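
A note on the Week 06 practicum: the actual training script used in class is password protected and distributed separately. Purely as an illustration of what such a classification training loop typically looks like, here is a minimal, self-contained sketch in PyTorch; the toy data, network size, learning rate, and number of epochs are placeholder assumptions for this example, not the course's settings.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Toy stand-in data: 256 samples, 20 features, 3 classes (placeholders).
    x = torch.randn(256, 20)
    y = torch.randint(0, 3, (256,))

    # A small fully connected classifier (placeholder architecture).
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for epoch in range(20):
        optimizer.zero_grad()        # reset gradients from the previous step
        logits = model(x)            # forward pass
        loss = loss_fn(logits, y)    # cross-entropy loss
        loss.backward()              # backpropagation
        optimizer.step()             # gradient descent update

    accuracy = (model(x).argmax(dim=1) == y).float().mean().item()
    print(f"final loss {loss.item():.3f}, training accuracy {accuracy:.2f}")

The real practicum code will differ (real datasets, data loaders, GPU usage, validation splits), but the forward / loss / backward / step structure shown here is the part that the lectures on loss functions, backpropagation, and optimization build on.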