I. General Information
Instructor: Dr. Yoonsuck Choe
Email: choe(a)tamu.edu
Office: HRBB 322B
Phone: 845-5466
Office hours: Tue/Thu 2pm-3pm
TA: Hanzi Mao
Email: hzmao(a)tamu.edu
Office: WEB 332G
Phone: None
Office hours: Mon/Wed 10:00am to 11:30am
Prerequisites: CPSC 420, CPSC 625, or consent of instructor.
Lectures: TR 11:10am–12:25pm, THOM 107A.
Course description: Machine learning is the study of self-modifying computer systems that can acquire new knowledge and improve their own performance. This course surveys machine learning techniques, including induction from examples, Bayesian learning, artificial neural networks, instance-based learning, genetic algorithms, reinforcement learning, unsupervised learning, and biologically motivated learning algorithms. Prerequisite: CPSC 420 or 625.
The goals of this course are to
- learn various problems and solution strategies in machine learning, and
- learn practical methodology for applying ML algorithms to a problem domain of your choice.
See the Weekly Schedule section for more details. The content closely follows a combination of the Alpaydin and Mitchell textbooks.
The cutoff for an 'A' will be at most 90% of the total score, 80% for a 'B', 70% for a 'C', and 60% for a 'D'. However, these cutoffs may be lowered at the end of the semester to accommodate the actual distribution of grades.
AGGIE HONOR CODE: An Aggie does not lie, cheat, or steal or tolerate those who do. Upon accepting admission to Texas A&M University, a student immediately assumes a commitment to uphold the Honor Code, to accept responsibility for learning, and to follow the philosophy and rules of the Honor System. Students will be required to state their commitment on examinations, research papers, and other academic work. Ignorance of the rules does not exclude any member of the TAMU community from the requirements or the processes of the Honor System.
For additional information please visit: http://aggiehonor.tamu.edu/
Local Course Policy:
- All work should be done individually unless otherwise allowed by the instructor.
- Discussion is only allowed immediately before, during, or immediately after the class, or during the instructor's office hours.
- If you find solutions to homework or programming assignments on the web (or in a book, etc.), you may or may not be allowed to use them; check with the instructor first.
- Assignments turned in that are significantly similar will be reported to the Aggie Honor System Office.
- There will be no make-up exams except in the case of a genuine emergency. Events that do not count as an emergency include: merely visiting the doctor's office without an explicit note from the office requesting absence; interview trips scheduled at the last moment; etc. All make-up exams, if given, will differ from the original exam, and the instructor may choose to give an oral exam instead of a written exam.
The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact the Department of Student Life, Services for Students with Disabilities, in Cain Hall or call 845-1637.
III. Weekly Schedule and Class Notes
Week | Date | Topic | Reading | Assignments | Slides
-----|------|-------|---------|-------------|-------
1 | 1/17 | Introduction | Alpaydin chap 1; Mitchell 1.1–1.2, 1.3–1.5 | | slide01.pdf, slide01b.pdf
1 | 1/19 | " | | |
2 | 1/24 | Supervised Learning (general) | Alpaydin chap 2; Mitchell 7.1–7.2, 7.4 | | slide02.pdf
2 | 1/26 | " | | |
3 | 1/31 | Multilayer perceptrons | Alpaydin chap 11; Mitchell chap 4 | Homework 1 announced | slide03.pdf
3 | 2/2 | " | | |
4 | 2/7 | Guest lecture: Randall Reams, "Tool Use with Evolving Neural Networks" | | |
4 | 2/9 | " | | |
5 | 2/14 | Reinforcement learning | Alpaydin chap 18; Mitchell chap 13 | Homework 2 announced; Homework 1 due | slide04.pdf
5 | 2/16 | " | | |
6 | 2/21 | " | | |
6 | 2/23 | " (advanced topics) | | | slide04-sida.pdf
7 | 2/28 | Decision tree learning | Alpaydin chap 9; Mitchell chap 3 | Homework 2 due | slide05.pdf
7 | 3/2 | " | | |
8 | 3/7 | Genetic Algorithms | Mitchell chap 9 | | slide06.pdf
8 | 3/9 | Midterm Exam (in class) | | |
9 | 3/14 | Spring break (no class) | | Homework 3 announced |
9 | 3/16 | Spring break (no class) | | |
10 | 3/21 | Genetic Algorithms: advanced topics | | | slide-cons.pdf
10 | 3/23 | Dimensionality reduction | Alpaydin chap 6: 6.1–6.3, 6.7, 6.8 | | slide08.pdf
11 | 3/28 | " | | |
11 | 3/30 | Local models | Alpaydin chap 12 | | slide09.pdf
12 | 4/4 | " | | |
12 | 4/6 | Bayesian learning | Mitchell chap 6 | Homework 4 announced | slide10.pdf
13 | 4/11 | " | | Homework 3 due | slide10.pdf
13 | 4/13 | " | | | slide10.pdf
14 | 4/18 | Deep learning | Optional reading: Jürgen Schmidhuber's Deep Learning Page [LINK]; Hinton's tutorial on deep belief networks | | slide-dl.pdf
14 | 4/20 | " | | | slide-dl.pdf
15 | 4/25 | Support vector machines | | | slide11.pdf
15 | 4/27 | " | | Homework 4 due 5/2 (Tue) | slide11.pdf
| 5/4 (Thu) | Final Exam (3–5pm) | | |