Created: Wed 04 Sep 2013
Last modified:
Note: This is an approximate syllabus; it may change at any time.
- Lecture 01, Wed, Sep 04 2013
- Administrivia
- Introduction to Machine Learning
- Reading: Murphy Chap. 1; DHS Chap. 1
- Lecture 02, Mon, Sep 09 2013
- Probability primer
- Homework 01 assigned
- Reading: Murphy Chap. 2; DHS Chap. A.4
- Lecture 03, Wed, Sep 11 2013
- Finish probability primer
- Bayesian methods
- Parameter estimation
- Reading: Murphy Chap. 3; DHS Chap. 2
- Lecture 04, Mon, Sep 16 2013
- Finish Bayesian methods
- Finish parameter estimation
- Reading: Murphy Chap. 3; DHS Chap. 2
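A minimal sketch of the Bayesian parameter estimation covered in Lectures 03-04, using the Beta-Bernoulli conjugate pair to estimate a coin's bias (the data and uniform prior are invented for illustration, not taken from the course materials):

```python
# Illustrative sketch: Bayesian parameter estimation for a coin's bias
# with a conjugate Beta(alpha, beta) prior over Bernoulli data.
def beta_bernoulli_posterior(flips, alpha=1.0, beta=1.0):
    """Return posterior (alpha, beta) after observing a list of 0/1 flips."""
    heads = sum(flips)
    tails = len(flips) - heads
    return alpha + heads, beta + tails

def posterior_mean(alpha, beta):
    """Posterior mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# 3 heads, 1 tail, starting from a uniform Beta(1, 1) prior
a, b = beta_bernoulli_posterior([1, 1, 0, 1])
print(posterior_mean(a, b))  # (1+3)/(2+4) = 2/3
```

Conjugacy is what makes the update a simple count: the posterior stays in the Beta family, so estimation reduces to adding observed heads and tails to the prior pseudo-counts.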
- Lecture 05, Wed, Sep 18 2013
- Linear and logistic regression
- Homework 01 due
- Homework 02 assigned
- Reading: Murphy Chap. 7, 8; DHS Chap. 9.6, 5, 6
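For the linear regression topic above, a hedged sketch of ordinary least squares via the normal equations, w = (X^T X)^{-1} X^T y; the tiny dataset is invented so the fit is exact:

```python
# Illustrative sketch: ordinary least squares via the normal equations.
import numpy as np

X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # bias column + one feature
y = np.array([1.0, 3.0, 5.0])                        # generated by y = 1 + 2x

# Solve (X^T X) w = X^T y; better conditioned than forming the inverse.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # ≈ [1.0, 2.0]
```

Logistic regression has no closed form like this; it is typically fit by iterative methods (e.g. gradient-based optimization), as the Murphy Chap. 8 reading covers.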
- Lecture 06, Mon, Sep 23 2013
- Lecture 07, Wed, Sep 25 2013
- Perceptrons and neural networks
- Reading: Murphy 8.5.4, 16.5; DHS Chap. 6
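A sketch of the classic perceptron update for the lecture above, on a toy linearly separable dataset (the AND function, labels in {-1, +1}); the data and learning-rate choice are invented for illustration:

```python
# Illustrative sketch: perceptron learning rule on linearly separable data.
def train_perceptron(data, epochs=20, lr=1.0):
    """data: list of (features, label) pairs with label in {-1, +1}."""
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # misclassified (or on the boundary): update
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]  # AND
w, b = train_perceptron(data)
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

Because the data are linearly separable, the perceptron convergence theorem guarantees the loop stops making updates after finitely many mistakes.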
- Lecture 08, Mon, Sep 30 2013
- Finish neural networks: backpropagation
- Reading: Murphy 8.5.4, 16.5; DHS Chap. 6
- Lecture 09, Wed, Oct 02 2013
- Information Theory primer
- Intro to decision trees
- Homework 02 due
- Homework 03 assigned
- Reading: Murphy 2.8, 16.2; DHS Chap. A.7, 8
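The information theory primer and decision trees meet in the split criterion: a sketch of Shannon entropy and information gain, on a made-up four-example label set:

```python
# Illustrative sketch: entropy and information gain, the quantities a
# decision-tree learner uses to choose splits.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split_groups):
    """Entropy reduction from partitioning labels into the given groups."""
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in split_groups)
    return entropy(labels) - remainder

labels = ['+', '+', '-', '-']
print(entropy(labels))                                     # 1.0 bit
print(information_gain(labels, [['+', '+'], ['-', '-']]))  # 1.0: a perfect split
```

A split that produces pure groups recovers the full entropy of the parent node, which is why greedy tree construction picks the attribute with maximum gain.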
- Lecture 10, Mon, Oct 07 2013
- Lecture 11, Wed, Oct 09 2013
- Bias and variance
- Start ensemble methods: bagging, boosting
- Homework 03 due
- Reading: AdaBoost; Murphy 16.2.5, 16.4; DHS Chap. 9.3, 9.5
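For the boosting topic, a sketch of one round of AdaBoost's example reweighting; the weak hypothesis and data are hand-picked for illustration (labels and predictions in {-1, +1}):

```python
# Illustrative sketch: one AdaBoost round, given a weak learner's predictions.
from math import log, exp

def adaboost_round(weights, preds, labels):
    """Return (alpha, new_weights) from one boosting round."""
    # Weighted error of the weak hypothesis (assumed to be in (0, 1))
    eps = sum(w for w, p, y in zip(weights, preds, labels) if p != y)
    alpha = 0.5 * log((1 - eps) / eps)   # weight of this weak hypothesis
    # Down-weight correctly classified examples, up-weight mistakes
    new = [w * exp(-alpha * p * y) for w, p, y in zip(weights, preds, labels)]
    z = sum(new)
    return alpha, [w / z for w in new]   # renormalize to a distribution

labels = [1, 1, -1, -1]
preds  = [1, 1, 1, -1]          # one mistake, so epsilon = 0.25
alpha, w1 = adaboost_round([0.25] * 4, preds, labels)
```

After the update, the misclassified examples carry half of the total weight, which forces the next weak learner to focus on them.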
- Lecture 12, Wed, Oct 16 2013
- Finish boosting
- Homework 04 assigned
- Reading: Murphy 16.4; DHS Chap. 9.5
- Lecture 13, Mon, Oct 21 2013
- On-line learning:
- halving algorithm
- randomized halving algorithm
- weighted majority algorithm
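The weighted majority algorithm listed above can be sketched as follows; the penalty factor beta = 1/2 and the expert pool are invented for illustration:

```python
# Illustrative sketch: deterministic weighted majority over a pool of experts.
def weighted_majority(expert_preds, outcomes, beta=0.5):
    """expert_preds[t][i]: expert i's 0/1 prediction in round t."""
    n = len(expert_preds[0])
    weights = [1.0] * n
    mistakes = 0
    for preds, outcome in zip(expert_preds, outcomes):
        # Predict with the weighted majority vote
        vote1 = sum(w for w, p in zip(weights, preds) if p == 1)
        vote0 = sum(w for w, p in zip(weights, preds) if p == 0)
        guess = 1 if vote1 >= vote0 else 0
        if guess != outcome:
            mistakes += 1
        # Multiplicatively penalize every expert that was wrong
        weights = [w * beta if p != outcome else w
                   for w, p in zip(weights, preds)]
    return mistakes, weights

expert_preds = [[1, 0, 0], [1, 0, 1], [0, 0, 1]]  # expert 0 is always right
outcomes = [1, 1, 0]
mistakes, weights = weighted_majority(expert_preds, outcomes)
```

The halving algorithm is the special case beta = 0 over consistent experts; the multiplicative penalty here is what yields the mistake bound relative to the best expert.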
- Lecture 14, Wed, Oct 23 2013
- To be announced...
jaa@ccs.neu.edu