This is an introductory undergraduate course in machine learning. It will briefly cover topics in regression, classification, mixture models, neural networks, deep learning, ensemble methods, and reinforcement learning.

Prerequisites: You should understand basic probability and statistics (STA 107, 250) and college-level algebra and calculus. For example, it is expected that you know about standard probability distributions (Gaussian, Poisson) and how to calculate derivatives. Knowledge of linear algebra is also expected, and knowledge of the mathematics underlying probability models (STA 255, 261) will be useful. For the programming assignments, you should have some background in programming (CSC 270), and it would be helpful if you know Matlab or Python. Some introductory material for Matlab will be available on the course website as well as in the first tutorial.
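As a rough self-check on this background, here is a small Python sketch (Python being one of the recommended assignment languages) that evaluates a Gaussian density and a Poisson probability and verifies a derivative numerically; the specific numbers are illustrative only:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Density of a Gaussian N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def poisson_pmf(k, lam):
    """Probability of k events under a Poisson(lam) distribution."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# For the standard normal, d/dx pdf(x) = -x * pdf(x).
# Check this at x = 1 with a central finite difference.
h = 1e-6
numeric = (gaussian_pdf(1 + h) - gaussian_pdf(1 - h)) / (2 * h)
analytic = -1 * gaussian_pdf(1)
print(numeric, analytic)  # the two values should agree closely
```

If expressions like these look unfamiliar, the probability review in the first tutorial is a good place to catch up.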


  • October 6th: New pdf and code for Assignment 1 with problem fixes. Due date Oct. 19 at noon.
  • October 2nd: Assignment 1 posted at the bottom of the page
  • September 27th: Don't forget to complete your Form at
  • September 27th: Slide on Precision and Recall in lecture 3 has been corrected
  • September 18th: Piazza for this course can be found at
  • August 25th: Creation of Webpage

    Lectures: Monday, Wednesday 12-1 (section 1), 3-4 (section 2), Thursday 6-8 (section 3)

    Lecture Room: MP134 (section 1), SS2106 (section 2), BA1200 (section 3)

    Instructor: Raquel Urtasun (section 1 and 2), Ruslan Salakhutdinov (section 3)

    Tutorials: Friday 12-1 (section 1), 3-4 (section 2), Thursday 8-9 (section 3)

    Tutorial Room: MP134 (section 1), SS2106 (section 2), BA1200 (section 3)

    Office Hours: Raquel Urtasun: Monday 4:10-5:40, Pratt Building, Room 290E. Ruslan Salakhutdinov: Thursdays 1-2pm, Pratt Building, Room 290F. You can also ask questions about the course to the CSC2515 instructor, Rich Zemel: Thursday 4-5, Pratt Building, Room 290D.

    TAs: TBA


    The format of the class will be lecture, with some discussion. We strongly encourage interaction and questions. There are assigned readings for each lecture that are intended to prepare you to participate in the class discussion for that day.

    The final grade will consist of the following:
    Assignments 40% (3 assignments, first two worth 12.5%, last 15%)
    Mid-Term Exam 25%
    Final Exam 35%
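The final grade is a weighted sum of these components. A quick Python illustration of the arithmetic (the component scores below are made-up placeholders, not real grades):

```python
# Weights from the grading scheme above; scores are hypothetical.
weights = {"a1": 0.125, "a2": 0.125, "a3": 0.15, "midterm": 0.25, "final": 0.35}
scores = {"a1": 80, "a2": 90, "a3": 85, "midterm": 70, "final": 88}

assert abs(sum(weights.values()) - 1.0) < 1e-12  # weights cover 100%
final_grade = sum(weights[k] * scores[k] for k in weights)
print(final_grade)
```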

    Homework assignments

    The best way to learn about a machine learning method is to program it yourself and experiment with it. So the assignments will generally involve implementing machine learning algorithms and experimenting with them on some data. You will be asked to summarize your work and analyze the results in brief (3-4 page) write-ups. The implementations may be done in any language, but Matlab and Python are recommended. A brief tutorial on Matlab is included here. You may also use Octave.
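To give a flavour of this implement-and-experiment workflow, here is a minimal Python sketch that fits linear regression by gradient descent on toy data; the data, step size, and iteration count are arbitrary choices for illustration, not part of any assignment:

```python
import random

# Toy data: y = 2x + 1 plus a little Gaussian noise.
random.seed(0)
xs = [i / 10.0 for i in range(20)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.05) for x in xs]

# Fit y = w*x + b by gradient descent on the mean squared error.
w, b = 0.0, 0.0
lr = 0.1  # step size
n = len(xs)
for _ in range(2000):
    # Gradients of (1/n) * sum (w*x + b - y)^2 w.r.t. w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should land close to the true slope 2 and intercept 1
```

The experimentation part of an assignment would then vary things like the step size, the number of iterations, or the noise level, and report how the fit changes.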

    Collaboration on the assignments is not allowed. Each student is responsible for his or her own work. Discussion of assignments and programs should be limited to clarification of the handout itself, and should not involve any sharing of pseudocode or code or simulation results. Violation of this policy is grounds for a semester grade of F, in accordance with university regulations.

    The schedule of assignments is included in the syllabus. Assignments are due at the beginning of class/tutorial on the due date. Because they may be discussed in class that day, it is important that you have completed them by that day. Assignments handed in late but before 5 pm of that day will be penalized by 5% (i.e., total points multiplied by 0.95); a late penalty of 10% per day will be assessed thereafter. Extensions will be granted only in special situations, and you will need a Student Medical Certificate or a written request approved by the instructor at least one week before the due date.


    There will be a mid-term in class on TBA, which will be a closed book exam on all material covered up to that point in the lectures, tutorials, required readings, and assignments.

    The final will not be cumulative, except insofar as concepts from the first half of the semester are essential for understanding the later material.


    We expect students to attend all classes and all tutorials. This is especially important because we will cover material in class that is not included in the textbook. The tutorials, in addition to review and answering questions, will also cover new material.


    There is no required textbook for this course. There are several recommended books. We will recommend specific chapters from two books: Introduction to Machine Learning by Ethem Alpaydin, and Pattern Recognition and Machine Learning by Chris Bishop. We will also recommend other readings.


    Click on the syllabus


    Sept 14: Course Introduction [lecture1]
    Sept 16: Linear Regression (Bishop, Chapter 1.0-1.1; 3.1) [lecture2]
    Sept 18: Tutorial: Review on Probability [tutorial1]
    Sept 21: Linear Classification (Bishop, pages 179-195) [lecture3]
    Sept 23: Logistic Regression (Bishop, pages 203-207) [lecture4]
    Sept 25: Tutorial: Gradient Descent [tutorial2]
    Sept 28: Non-parametric Methods (Bishop, pages 120-127) [lecture5]
    Sept 30: Decision Trees [lecture6]
    Oct 2: Tutorial: K-NN and Decision Trees [tutorial3]
    Oct 5: Multi-class Classification (Bishop 4.1.2, 4.3.4) [lecture7]
    Oct 7: Probabilistic Classifiers I (Bishop 4.2.2) [lecture8]
    Oct 9: Probabilistic Classifiers II (Bishop, pages 380-381) [lecture9]
    Oct 14: Neural Networks I (Bishop 5.1-5.3) [lecture10]
    Oct 16: Neural Networks II [lecture11]
    Oct 19: Tutorial: Naive Bayes [tutorial4]
    Oct 21: Tutorial: Neural Networks [tutorial5]
    Oct 23: Tutorial: Midterm Review [tutorial6]
    Oct 26: MIDTERM
    Oct 28: Clustering (Bishop 9.1) [lecture12]
    Oct 30: Tutorial: Clustering [tutorial7]
    Nov 2: Mixture of Gaussians (Bishop 9.2, 9.3) [lecture13]
    Nov 4: PCA & Autoencoders (Bishop 12.1) [lecture14]
    Nov 9: NO CLASS
    Nov 11: SVM (Bishop, Chapter 7, pages 325-337) [lecture15]
    Nov 13: Tutorial: PCA [tutorial8]
    Nov 16: Kernels [lecture16]
    Nov 18: Ensemble Methods I (Bishop 14.2-14.3) [lecture17]
    Nov 20: Tutorial: SVM [tutorial9]
    Nov 23: Ensemble Methods II [lecture18]
    Nov 30: Reinforcement Learning [lecture19]


  • Assignment 1: document and code. Due date Oct. 19 at noon.
  • Assignment 2: document and code. Due date Nov. 16 at noon.
  • Assignment 3: document. Due date Dec. 3 at midnight.