Schedule and Readings

The dates on which readings are assigned, and the dates on which we will discuss COURSERA and reading material, are listed below. Many of the readings are quite old, but they remain valuable: because they often introduce new algorithms and paradigms, they are pedagogically at about the right level for new machine learning students; they are source papers, which students should get into the habit of reading; and they are at about the same conceptual level as we might expect in a textbook presentation of machine learning.

Monday, August 20: COURSERA Machine Learning course begins (https://www.coursera.org/course/ml ). See “Preview” for the list of lectures and topics.

Wednesday, August 22 (5:00 – 5:30) Short orientation meeting; No quiz; Decision and Regression Tree papers assigned (2 papers for 2 weeks):

“Induction of Decision Trees” (1986) by Ross Quinlan: http://www.springerlink.com/content/ku63wm5513224245/ ;

“Machine learning approaches to estimating software development effort” (1995) by Srinivasan and Fisher: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=345828&tag=1

Come to the next meeting with an understanding of the assigned papers. Also look for descendants of the Quinlan paper and skim at least one of them; be prepared to state concisely in what way that descendant purports to extend, evaluate, or otherwise improve Quinlan’s basic ID3 algorithm. In reading the Srinivasan and Fisher paper, the discussion of regression trees is most germane to the next class meeting. That paper is also an example of a previous student’s machine learning class project (one that looked at neural networks only, with a simpler experimental design) which later evolved, with more work, into a journal paper; there are many other such instances.
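For concreteness, here is a minimal Python sketch of the information-gain split criterion at the heart of ID3; the toy dataset and attribute names are illustrative assumptions, not taken from Quinlan’s paper:

    # Information gain, the criterion ID3 uses to choose a split attribute.
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a list of class labels."""
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def information_gain(examples, attribute):
        """Entropy reduction from partitioning examples on one attribute."""
        base = entropy([label for _, label in examples])
        partitions = {}
        for features, label in examples:
            partitions.setdefault(features[attribute], []).append(label)
        remainder = sum(len(part) / len(examples) * entropy(part)
                        for part in partitions.values())
        return base - remainder

    # ID3 greedily splits on the attribute with the highest gain:
    data = [({"outlook": "sunny", "windy": False}, "no"),
            ({"outlook": "sunny", "windy": True}, "no"),
            ({"outlook": "overcast", "windy": False}, "yes"),
            ({"outlook": "rain", "windy": False}, "yes"),
            ({"outlook": "rain", "windy": True}, "no")]
    print(max(["outlook", "windy"], key=lambda a: information_gain(data, a)))

Regression trees keep the same greedy recursive structure but replace entropy with a measure such as the variance of the continuous target within each partition.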

Wednesday, August 29 (5:00 – 6:30) No meeting

Wednesday, September 5 (5:00 – 6:30) Quiz. Discussion of decision/regression trees and COURSERA material through week 2 (multivariate linear regression); ensemble learning paper assigned

“An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization” by Dietterich: http://www.csd.uwo.ca/faculty/ling/cs860/papers/mlj-randomized-c4.pdf
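To make the ensemble idea concrete before the discussion, here is a minimal from-scratch sketch of bagging, the first of Dietterich’s three methods: each tree is trained on a bootstrap resample and the ensemble predicts by majority vote. The base learner (scikit-learn’s decision tree) and the data handling are illustrative assumptions, not the C4.5 setup used in the paper:

    import numpy as np
    from collections import Counter
    from sklearn.tree import DecisionTreeClassifier

    def bagged_trees(X, y, n_trees=25, seed=0):
        """Fit n_trees trees, each on a bootstrap sample of (X, y)."""
        rng = np.random.default_rng(seed)
        n = len(X)
        trees = []
        for _ in range(n_trees):
            idx = rng.integers(0, n, size=n)   # draw n rows with replacement
            trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
        return trees

    def vote(trees, x):
        """Majority vote over the ensemble for a single example x."""
        preds = [t.predict(x.reshape(1, -1))[0] for t in trees]
        return Counter(preds).most_common(1)[0][0]

Boosting differs in that each successive tree is trained on a reweighting that emphasizes previously misclassified examples; randomization differs in that a single training set is used but the split choices inside each tree are randomized.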

Wednesday, September 12 (5:00 – 6:30) Quiz. Discussion of decision tree over-fitting and ensembles and COURSERA material through week 3 (logistic regression/regularization); learning as search paper assigned

“Generalization as Search” (1982) by Mitchell. Google “generalization as search mitchell” to find a copy.

This is an “ancient” paper, but an important one. Almost all of the ML methods covered by Professor Ng, and in the readings, assume that an “object” (datum, etc.) is a feature vector. As you read the Mitchell paper, understand that in one important way it describes a more sophisticated algorithm than (most of) the others we will study: each “object” is an unordered set of components, and each component is a feature vector, so each object is a set of feature vectors. There are more complex object descriptions still! In any case, throughout the semester we will speculate on how the various other methods, which assume a single-feature-vector representation for each datum, could be expanded to work with more sophisticated representations of data.
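As a concrete anchor, here is a minimal sketch of the “specific boundary” of Mitchell’s version-space framework, for the simple special case where each object is a single feature vector and hypotheses are conjunctions of attribute tests (“?” meaning any value). The toy examples are illustrative assumptions:

    def generalize(hypothesis, positive):
        """Minimally generalize a conjunctive hypothesis to cover a positive example."""
        return tuple(h if h == v else "?" for h, v in zip(hypothesis, positive))

    positives = [("sunny", "warm", "high"),
                 ("sunny", "warm", "low")]
    s = positives[0]              # start at the most specific hypothesis
    for example in positives[1:]:
        s = generalize(s, example)
    print(s)                      # ('sunny', 'warm', '?') — generalized just enough

Mitchell’s full algorithm also maintains a general boundary that negative examples push downward, and with his set-of-feature-vectors representation the generalization step requires matching components between objects, which is itself a search.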

Wednesday, September 19 (5:00 – 6:30) Quiz. Discussion of learning as search paper COURSERA material through week 4 (neural network representation); multi-task learning paper assigned

“Multitask learning” (1997) by Caruana: http://www.springerlink.com/content/x4q010h7342j4p15/

Some neural network systems can be viewed as multitask learning, as can some unsupervised learning systems to be studied later.
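For a concrete picture of the shared-representation mechanism Caruana describes, here is a minimal numpy sketch of a multitask network: one shared hidden layer feeds a separate linear output head per task, trained jointly by gradient descent on the summed squared error. Sizes, data, and hyperparameters are illustrative assumptions, not from the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, h, tasks = 100, 5, 8, 3
    X = rng.normal(size=(n, d))
    Y = rng.normal(size=(n, tasks))              # one target column per task

    W1 = rng.normal(scale=0.1, size=(d, h))      # hidden weights shared by ALL tasks
    W2 = rng.normal(scale=0.1, size=(h, tasks))  # one linear output head per task
    lr = 0.01
    for _ in range(500):
        H = np.tanh(X @ W1)                      # shared internal representation
        err = H @ W2 - Y                         # residuals, one column per task
        gW2 = H.T @ err / n                      # gradient for the task heads
        gW1 = X.T @ ((err @ W2.T) * (1 - H**2)) / n   # backprop through tanh
        W1 -= lr * gW1
        W2 -= lr * gW2

The inductive transfer lives entirely in W1: every task’s error signal shapes the same hidden features, which is how related tasks can serve as a training bias for one another.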

Wednesday, September 26 (5:00 – 6:30) Quiz. Discussion of multi-task learning and neural networks and COURSERA material through week 5 (neural network learning);

No paper assigned, BUT for Wednesday, October 10, prepare a short presentation outlining an application-oriented project using supervised machine learning: describe the application, the data (inputs), and the class or continuous prediction variables (outputs); relate the project to the material studied in the papers or COURSERA as appropriate; and motivate it by appeal to a larger project or other context.

Wednesday, October 3 (5:00 – 6:30) No meeting; Fall break starts (but COURSERA material marches on through weeks 6 and 7)

Wednesday, October 10 (5:00 – 6:30) No quiz. Discussion of COURSERA material through weeks 6 and 7; present project ideas (these are NOT intended to limit you, but rather to encourage you to take a step back, synthesize, and use your imagination); clustering paper assigned

“Iterative optimization and simplification of hierarchical clusterings” (1996) by Fisher: http://www.cs.washington.edu/research/jair/abstracts/fisher96a.html

In contrast to relocating single objects between clusters, as K-means does, Fisher’s method relocates many objects simultaneously by exploiting groups, at multiple levels of granularity, found in a hierarchical clustering. In addition, the paper illustrates clustering as a means of promoting what amounts to multitask prediction, predating that term. Of all the readings and the material covered by the COURSERA course, this paper is the closest (other than perhaps Mitchell’s paper and neural network learning) to an earlier tradition that regarded machine learning as a means of organizing an autonomous agent’s experiences, a tradition that has returned with machine learning’s infusion into computer vision, robotics, and language processing.
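To make the contrast concrete, here is a minimal K-means sketch in which each reassignment decision considers one object in isolation; Fisher’s method instead moves whole subtrees of a hierarchical clustering at once. The data, k, and initialization are illustrative assumptions:

    import numpy as np

    def kmeans(X, k, iters=20, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # relocation step: each object, on its own, moves to its nearest center
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
            labels = d2.argmin(axis=1)
            # update step: recompute each center (keep the old one if a cluster empties)
            centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        return labels, centers

    labels, centers = kmeans(np.random.default_rng(1).normal(size=(30, 2)), k=3)

Because each relocation decision looks at a single object, the search can stall in local optima that moving a coherent group of objects together would escape; that is the leverage Fisher’s iterative optimization gains from the hierarchy.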

Besides clustering and PCA (and similar dimensionality reduction methods), there are at least two other major unsupervised learning paradigms: induction of belief networks and association-rule learning (http://www.springerlink.com/content/h216477830q7415v/ ); the former is typically covered in an undergraduate AI class.
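Since association-rule learning is only mentioned in passing, a tiny sketch of its two core statistics may help; the market-basket transactions below are illustrative assumptions:

    transactions = [{"bread", "milk"},
                    {"bread", "butter"},
                    {"bread", "milk", "butter"},
                    {"milk"}]

    def support(itemset):
        """Fraction of transactions containing every item in itemset."""
        return sum(itemset <= t for t in transactions) / len(transactions)

    def confidence(lhs, rhs):
        """Estimate of P(rhs | lhs): support of the union over support of lhs."""
        return support(lhs | rhs) / support(lhs)

    print(support({"bread", "milk"}))       # 0.5
    print(confidence({"bread"}, {"milk"}))  # 2/3: milk is in 2 of the 3 bread baskets

Rule miners such as Apriori search for all rules whose support and confidence exceed user-set thresholds.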

Wednesday, October 17 (5:00 – 6:30) Quiz. Discussion of iterative optimization in clustering and COURSERA material through week 8 (clustering and dimensionality reduction); relational learning (inductive logic programming) paper assigned

“Learning logical definitions from relations” by Quinlan (1990): http://www.cs.washington.edu/education/courses/cse546/10wi/hw2/quinlan90.pdf

This paper is one of the founding papers of the inductive logic programming paradigm. Like Mitchell’s paper and a host of others that describe learning in the context of planning and problem solving (which this CS 390 is not focusing on), Quinlan’s paper, and ILP generally, assume relational representations.
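To see what “relational” means here, consider a minimal example in the spirit of FOIL: the data are ground facts over a relation rather than fixed-length feature vectors, and the hypothesis is a Horn clause such as grandparent(X, Z) :- parent(X, Y), parent(Y, Z). The facts, and the hand-written clause evaluated below, are illustrative assumptions, not from the paper:

    # Ground facts: the parent relation as a set of (parent, child) pairs.
    parent = {("ann", "bob"), ("bob", "carol"), ("bob", "dave")}

    def grandparent(x, z):
        """Evaluate the clause body by joining the parent relation with itself."""
        return any((x, y) in parent and (y, z) in parent
                   for y in {child for _, child in parent})

    print(grandparent("ann", "carol"))  # True: ann -> bob -> carol

FOIL’s job is to invent such a clause automatically, greedily adding body literals that maximize an information-based gain over the positive and negative tuples.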

Wednesday, October 24 (5:00 – 6:30) Quiz. Discussion of relational learning and COURSERA material through week 9 (anomaly detection and recommender systems); knowledge-biased learning paper assigned

“Inductive Process Modeling” (2008) by Bridewell, Langley, Todorovski, & Dzeroski: http://www.springerlink.com/content/v562101k85324170/

Many of the CS 390 participants are biomedical informatics folks. Can you imagine personalized medical applications of the knowledge-constrained learning strategies described by Bridewell et al.?

Wednesday, October 31 (5:00 – 6:30) No quiz. Discussion of knowledge-biased learning and COURSERA material through week 10 (large scale ML), and final projects

Wednesday, November 7 (5:00 – 6:30) No quiz. Discussion of projects

Wednesday, November 14 (5:00 – 6:30) No quiz. Discussion of projects

Wednesday, November 21 (5:00 – 6:30) No meeting (Thanksgiving break)

Wednesday, November 28 (5:00 – 6:30) No quiz. Discussion of projects

Wednesday, December 5 (5:00 – 6:30) No quiz. Discussion of projects

Friday, December 7 (1:00 PM until all students done) Project Presentations

 
