Term: 2018 Spring
Instructor: Dr. Chengjiang Long
Time: Tuesday and Friday, 2:00pm – 3:20pm
Building/Room: JEC 4107, Rensselaer Polytechnic Institute (RPI).
Office Hour: Tuesday and Friday, 3:20pm – 4:00pm, by appointment
Office Hour Location: JEC 6045
Course Assistant: Il-Young Son
Course Website: www.chengjianglong.com/teachings.html
Course Overview: This course gives an introduction to the large and diverse field of pattern recognition. Topics include: introduction to pattern recognition; probability theory and linear algebra review; basic graph concepts and belief networks; Bayesian decision theory; maximum-likelihood estimation; Bayesian methods; nonparametric techniques; principal component analysis; Fisher's linear discriminant; linear models for regression; linear models for classification and support vector machines; bagging, random forests, and boosting; introduction to neural networks and multilayer neural networks; introduction to deep learning (deep feedforward networks and convolutional neural networks); and unsupervised learning and clustering. On completion of the course, students should be sufficiently familiar with the formal theoretical structure, notation, and vocabulary of pattern recognition to read and understand current technical literature. They will also gain experience in the design and implementation of pattern recognition systems and be able to apply those methods to solve practical problems.
Prerequisites: Basic probability and statistics, some linear algebra, and basic programming skills. Working familiarity with MATLAB, Python, Java, or C/C++ is expected.
Textbook: Richard O. Duda, Peter E. Hart, and David G. Stork, “Pattern Classification”, 2nd Edition, John Wiley & Sons, 2001.
Auxiliary Textbooks: Trevor Hastie, Robert Tibshirani, and Jerome Friedman, “The Elements of Statistical Learning”, 2nd Edition, Springer, 2009.
David Barber, “Bayesian Reasoning and Machine Learning”, Cambridge University Press, 2012.
Ian Goodfellow, Yoshua Bengio, and Aaron Courville, “Deep Learning”, MIT Press, 2016.
Grading: Students will be graded on course participation (5%), four homework assignments (20% total, some requiring programming), a midterm exam (20%), a final exam (20%), and a final project/presentation (35%).
Final grade: A (>=92), A- (>=90), B+ (>=87), B (>=82), B- (>=80), C+ (>=77), C (>=72), C- (>=70), D+ (>=67), D (>=62), D- (>=60), and F (<60).
Late submission policy: exponential penalty. A submission one day late loses half of its credit, two days late loses half of the remainder, and so on; in general, a submission d days late retains (1/2)^d of its credit.
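For illustration only (a minimal sketch in Python, not part of the official grading code), the penalized score for a submission d days late could be computed as:

    def penalized_score(raw_score: float, days_late: int) -> float:
        # Each full day late halves whatever credit remains: score * (1/2)^d.
        return raw_score * (0.5 ** max(days_late, 0))

    # Example: a 100-point homework submitted 2 days late keeps 25 points.
    print(penalized_score(100, 2))  # 25.0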
Topics:
- Introduction to Pattern Recognition.
- Probability theory and Linear algebra review.
- Basic graph concepts and Belief Network.
- Bayesian decision theory.
- Maximum-likelihood estimation.
- Bayesian methods.
- Nonparametric techniques.
- Dimension reduction: Principal Component Analysis and Fisher's linear discriminant.
- Linear models for regression.
- Linear models for classification and support vector machines.
- Bagging, Random Forests and Boosting.
- Introduction to Neural Networks and Multilayer Neural Networks.
- Introduction to deep learning: Deep Feedforward Network and Convolutional Neural Network.
- Unsupervised learning and clustering.
Course schedule:
Class | Date | Topic | Reading | Homework/Project | Slides |
0 | 1/16/2018 | Short class | | | |
1 | 1/19/2018 | Introduction to Pattern Recognition | Duda Ch. 1 | | Lecture_1 |
2 | 1/23/2018 | Probability theory and Linear algebra review | Duda Ch. 2 | | Lecture_2 |
3 | 1/26/2018 | Bayesian decision theory | Duda Ch. 2.3-4 | | Lecture_3 |
4 | 1/30/2018 | Bayesian decision theory and Maximum-Likelihood Estimation | Duda Ch. 2.5-6, 3.1-2, Barber Ch. 8 | | Lecture_4 |
5 | 2/2/2018 | Bayesian Estimation Methods and Naive Bayes Classifier | Duda Ch. 3.3-5, Barber Ch. 13 | HW 1 assigned | Lecture_5 |
6 | 2/6/2018 | Non-Parametric Methods – Parzen Estimation | Duda Ch. 4.1-3 | | Lecture_6 |
7 | 2/9/2018 | Non-Parametric Methods – KNN | Duda Ch. 4.4-5 | HW 1 due | Lecture_7 |
8 | 2/13/2018 | Dimensionality Reduction – PCA and Fisher's Linear Discriminant | Duda Ch. 3.7-8, Barber Ch. 15-16 | HW 2 assigned | Lecture_8 |
9 | 2/16/2018 | Linear Discriminant Functions (1) | Duda Ch. 5.1-2, Barber Ch. 17 | | Lecture_9 |
| 2/20/2018 | No class (Presidents' Day schedule adjustment) | | | |
10 | 2/23/2018 | Linear Discriminant Functions (2) | Duda Ch. 5.3-7, Barber Ch. 17, HTF Ch. 12 | HW 2 due | Lecture_10 |
11 | 2/27/2018 | Support Vector Machine | Duda Ch. 5.8-9 | HW 3 assigned | Lecture_11 |
| 3/2/2018 | No class due to snowstorm | | | |
12 | 3/6/2018 | Midterm Exam Review | | | Lecture_12 |
13 | 3/9/2018 | Midterm exam | | Midterm exam and HW 3 due | |
| 3/13/2018 | Spring break | | | |
| 3/16/2018 | Spring break | | | |
14 | 3/20/2018 | Bagging; Random Forests; Boosting (1) | HTF Ch. 10 and 15 | | Lecture_13 |
15 | 3/23/2018 | Bagging; Random Forests; Boosting (2) | HTF Ch. 10 and 15 | Project Proposal Submission | Lecture_14 |
16 | 3/27/2018 | Basic graph concepts, Belief Network and Hidden Markov Models | Duda Ch. 2.5-6, 3.10 | | Lecture_15 |
17 | 3/30/2018 | Final Project Proposal Presentation | Duda Ch. 6.1 | Final Project Proposal Presentation | |
18 | 4/3/2018 | Introduction to Neural Networks, Multilayer Neural Networks, and Back-propagation | Duda Ch. 6.2-8 | | Lecture_16 |
19 | 4/6/2018 | Convolutional Neural Networks | Goodfellow Ch. 9 | | Lecture_17 |
20 | 4/10/2018 | Case Study on ConvNets and Training Neural Networks | Goodfellow Ch. 6-9 | | Lecture_18 |
21 | 4/13/2018 | Concepts of Neural Networks | Goodfellow Ch. 6-9 | | Lecture_19 |
22 | 4/17/2018 | Regression | Duda Ch. 6, Barber Ch. 17, HTF Ch. 3 | HW 4 assigned | Lecture_20 |
23 | 4/20/2018 | Unsupervised learning and Clustering Algorithms | Duda Ch. 10.1-14, HTF Ch. 14 | | Lecture_21 |
24 | 4/24/2018 | Gaussian Mixture Model and EM algorithm | Duda Ch. 10.9-14, HTF Ch. 14.3 | | Lecture_22 |
25 | 4/27/2018 | Final Exam Review | | HW 4 due | Lecture_23 |
26 | 5/1/2018 | Presentation of Project Reports to Class | | Final Project Presentation | |
27 | 5/7/2018 | Final exam | | Final exam | |