Professor:
Dr. Thomas R. Ioerger
Office: 322C HRBB
Phone: 845-0161
Email: ioerger@cs.tamu.edu
Office hours: Wed 1:30-2:30 (or by appt., set up via email)
Class Time: Tues/Thurs, 9:35-10:50
Room: 105B Zachry
Course WWW page: http://www.cs.tamu.edu/faculty/ioerger/cs633-spr10/index.html
Textbook: Machine Learning. Tom Mitchell (1997). McGraw-Hill.
Teaching Assistant:
Joshua Johnston
Email: joshua.b.johnston@gmail.com
Office hours: MWF, 10-11am, 911 Richardson
Topics:
- Symbolic learning: version spaces, decision trees, rule induction, explanation-based learning, inductive logic programming
- Nearest-neighbor (non-parametric) algorithms
- Feature selection and feature weighting: filters and wrappers, entropy, principal-component analysis, constructive induction
- Linear classifiers (covered lightly): neural networks, multi-layer perceptrons, and gradient descent; support vector machines, maximum-margin optimization
- Bayesian classifiers
- Computational learning theory: inductive bias, hypothesis space search, PAC model (probably-approximately correct), algorithmic complexity, sample complexity
- Unsupervised learning (data mining): clustering, association rules
- Reinforcement learning
Prerequisites:
We will be relying on standard concepts in AI, especially heuristic search algorithms, propositional logic, and first-order predicate calculus. Either the graduate or undergraduate AI class (or a similar course at another university) will count as satisfying this prerequisite.
In addition, the course will require some background in analysis of algorithms (big-O notation), and some familiarity with probability and statistics (e.g. standard deviation, confidence intervals, linear regression, Binomial distribution).
Grading:
Your grade at the end of the course will be based on a weighted average of points accumulated during the semester. The weights will be distributed approximately as 45% exams, 45% projects, and 10% other (homeworks, quizzes, participation in class discussions), though they may be adjusted slightly to reflect the relative effort each component requires. The cutoff for an A will be at most 90%, at most 80% for a B, and at most 70% for a C.
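As an illustration, with hypothetical scores under the 45/45/10 weighting above: an exam average of 85, a project average of 90, and 95 on other work would come to 0.45(85) + 0.45(90) + 0.10(95) = 88.25, which falls in the B range under the stated cutoffs.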
The late-assignment policy for homeworks and projects will be incremental: -5% per day, down to a maximum penalty of -50%. If an assignment is turned in at any time before the end of the semester, you can still earn up to 50% (minus any points marked off in grading).
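For example, a project turned in 4 days late would lose 20% of its graded score, while one turned in 10 or more days late hits the -50% cap and can earn at most half credit (before any other deductions).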
Schedule:
Date | Reading | Topic
Tues, Jan 19: | | first day of class; Perspectives on Machine Learning |
Thurs, Jan 21: | Ch. 1 | choices in designing a learning system |
Tues, Jan 26: | Ch. 2 | Searching Hypothesis Space |
Thurs, Jan 28: | | Candidate elimination, bias |
Tues, Feb 2: | Ch. 3 | Decision Trees, ID3 |
Thurs, Feb 4: | (Mingers, 1989) | bias, pruning |
Tues, Feb 9: | Ch. 5 | Empirical Methods |
Thurs, Feb 11: | | empirical methods, continued |
Tues, Feb 16: | | empirical methods, continued |
Thurs, Feb 18: | Ch. 4 | Neural networks; perceptrons |
Tues, Feb 23: | (Caruana et al., NIPS-2000) | back-propagation |
Thurs, Feb 25: | | Project 1 due |
Tues, Mar 2: | Ch. 8 | Instance-based Learning, notes |
Thurs, Mar 4: | | Mid-term exam |
Tues, Mar 9: | | (class cancelled) |
Thurs, Mar 11: | Ch. 6 | Bayesian Learning |
Mar 15-19: | | Spring Break |
Tues, Mar 23: | | Naive Bayes algorithm |
Thurs, Mar 25: | PCA, sec 8.4 | PCA; RBFs; Project 2 due |
Tues, Mar 30: | sec. 10.1-10.5 | rule learning (CN2 and FOIL) |
Thurs, Apr 1: | (Craven and Slattery, 2001) | relational learning and predicate invention |
Tues, Apr 6: | (Burges, 1998), sec. 1-3 | support vector machines; Project 3 due (k-NN) |
Thurs, Apr 8: | (Burges, 1998), sec. 4 | non-linear kernels |
Tues, Apr 13: | (Breiman, 1996) | bagging |
Thurs, Apr 15: | (Freund and Schapire, 1996) | boosting |
Tues, Apr 20: | Ch. 13 | Reinforcement Learning; Project 4 due (NB) |
Thurs, Apr 22: | | |
Tues, Apr 27: | (Lagoudakis and Parr, 2003) | value-function approximation, LSPI |
Thurs, Apr 29: | | Final Exam (last class) |
Mon, May 10: | | Final Project due |