Philippos Mordohai
Assistant Professor
Department of Computer Science
Stevens Institute of Technology

Office: Lieb 215
Phone Number: +1 201 216 5611
E-mail: mordohai_at_cs.stevens.edu

CS 559: Machine Learning: Fundamentals and Applications

Fall 2016




Location
McLean 105

Time
Wednesdays 6:15-8:45 PM.

Office Hours
Tuesdays 5-6 and by appointment.

Pre-requisites
Basic knowledge of probability and statistics. Past experience has shown that students without this background struggle in CS 559.
Basic programming skills in Matlab, Python, C/C++, or Java. This is crucial for the final project, which requires the implementation of machine learning techniques.

Syllabus

Textbooks
The primary textbook is the following. I refer to it as Barber in the class outline.
Bayesian Reasoning and Machine Learning
by David Barber
Cambridge University Press, 2012.
It is available online free of charge here.

I will also use the following textbook, which I refer to as HTF.
The Elements of Statistical Learning (2nd edition)
by Trevor Hastie, Robert Tibshirani and Jerome Friedman
Springer, 2009.
This book is also available online free of charge here.



Evaluation
Final project (25%)
Each student will select a project related to machine learning, which must be approved by me for relevance and feasibility. I will provide pointers and suggestions for potential projects. Students actively involved in machine learning research may select a project related to their research, but new work must be done during the semester. Large projects may be carried out by groups of two students. Each student will briefly present his or her project in 3-5 minutes during Week 9. Final project reports and presentations are due in Week 14.

Homework assignments (20%)
Homework sets will tentatively be assigned in Weeks 3, 5, 9, and 12, and will be due one week later.

Pop quizzes and participation (10%)
A simple quiz will be given at the beginning of each class.

Midterm (20%)
The midterm is scheduled for Week 7 (October 12).

Final (25%)
The final will take place during the final exam period and will be cumulative.

Resources

The following links should be useful in case you need to refresh your math or Matlab knowledge.

Outline

Week 1: Introduction; Probability theory overview (Barber Ch. 1, 8 and 13)
Notes pt. 1 (pdf)

Week 2: Linear algebra review; Basic graph concepts; Belief networks (Notes and Barber Ch. 2 and 3)
Notes pt. 2 (pdf)

Week 3: Bayesian decision theory; Maximum Likelihood Estimation; Bayesian methods (Barber Ch. 8 and 13 and notes)
Notes pt. 3 (pdf)
Homework 1 (pdf) is due on 9/21.

Week 4: Naive Bayes; Non-parametric techniques (Barber Ch. 8, 10 and 14)
Notes pt. 4 (pdf)

Week 5: Project proposals; Principal Component Analysis; Eigenfaces (Barber Ch. 15)
Notes pt. 5 (pdf)
Homework 2 (pdf) is due on 10/5.

Week 6: Fisher's Linear Discriminant; Generative and Discriminative Approaches (Barber Ch. 16 and 13)
Notes pt. 6 (pdf)

Week 7: Midterm


Week 8: Linear regression (Barber Ch. 17 and notes)
Notes pt. 8 (pdf)

Week 9: Logistic regression; Perceptron; Support Vector Machines (Barber Ch. 17, HTF Ch. 4 and 12 and notes)
Notes pt. 9 (pdf)
Homework 3 (pdf) is due on 11/2.

Week 10: Bagging; Random Forests; Boosting (HTF Ch. 10 and 15 and notes)
Notes pt. 10 (pdf)

Week 11: Hidden Markov Models (Barber Ch. 23 and notes)
Notes pt. 11 (pdf)
Homework 4 (pdf) is due on 11/28.
Final Project Progress Report (pdf) is due on 11/22.

Week 12: Deep Learning (Notes)
Notes pt. 12 (pdf)

Week 13: Unsupervised Learning; Expectation Maximization (HTF Ch. 14 and Barber Ch. 11)
Notes pt. 13 (pdf)

Week 14: Project presentations