CPSC 633-600 Machine Learning:
Spring 2017

Syllabus

NEWS: 4/22/17, 04:59PM (Sat)
  • [04/22/2017] Final exam: 5/4 Thursday, 3-5pm. Same rules as the midterm. Materials will be from slide06, slide-cons, slide08, slide09, slide10, slide-dl, slide11.
  • [02/26/2017] Old midterm exam posted to eCampus.
  • [01/24/2017] Single-page slides: http://faculty.cs.tamu.edu/choe/633-slideXX.pdf
  • ---------------------------------------------------------
  • For older announcements, see the archive (Read-Only Bulletin Board): 1/14/17, 03:05PM (Sat)

Page last modified: 4/5/17, 07:55PM Wednesday.

Contents: General Information | Resources | Weekly Schedule | Lecture Notes

I. General Information

Instructor:

Dr. Yoonsuck Choe
Email: choe(a)tamu.edu
Office: HRBB 322B
Phone: 845-5466
Office hours: Tue/Thu 1pm–3pm

TA:

Hanzi Mao
Email: hzmao(a)tamu.edu
Office: WEB 332G
Phone: None
Office hours: Mon/Wed 10:00am to 11:30am

Prerequisite/Restrictions:

CPSC 420, 625, or consent of instructor.

Lectures:

TR 11:10am–12:25pm, THOM 107A.

Introduction:

Machine learning is the study of self-modifying computer systems that can acquire new knowledge and improve their own performance. This course surveys machine learning techniques, including induction from examples, Bayesian learning, artificial neural networks, instance-based learning, genetic algorithms, reinforcement learning, unsupervised learning, and biologically motivated learning algorithms. Prerequisite: CPSC 420 or 625.

Goal:

The goal of this course is to:
  1. learn the various problems and solution strategies in machine learning; and
  2. learn a practical methodology for applying ML algorithms to a problem domain of your choice.

Textbook:

Administrative details:

  1. Computer accounts: if you do not have a Unix account, request one on the CS web page.
  2. Permitted programming languages: C/C++, Java, Matlab (or Octave), Python, etc. Your code must be executable on CS Unix hosts or other public systems in the departmental lab.

Topics to be covered:

See the Weekly Schedule section for more details. The content will closely follow a combination of the Alpaydin and Mitchell textbooks.

Grading:

  1. 4 assignments (including written and programming components), 10% each = 40%
  2. 2 exams, 30% each = 60%
The cutoff for an `A' will be at most 90% of the total score, 80% for a `B', 70% for a `C', and 60% for a `D'. However, these cutoffs might be lowered at the end of the semester to accommodate the actual distribution of grades.
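As a sketch, the weighting and nominal cutoffs above can be expressed in a few lines of Python. The function names and structure here are illustrative only, not part of the official grading procedure, and the cutoffs are the nominal ones (the syllabus notes they may be lowered).

```python
def course_total(homework_scores, exam_scores):
    """Weighted course total: 4 homeworks at 10% each (40%),
    2 exams at 30% each (60%). Scores are percentages in [0, 100]."""
    hw = sum(homework_scores) / 4 * 0.40   # homework average, weighted 40%
    ex = sum(exam_scores) / 2 * 0.60       # exam average, weighted 60%
    return hw + ex

def letter_grade(total):
    """Nominal letter-grade cutoffs from the syllabus (may be lowered)."""
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if total >= cutoff:
            return letter
    return "F"
```

For example, homework scores of 80 on all four assignments and 90 on both exams give a total of 0.40 × 80 + 0.60 × 90 = 86, a nominal `B'.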

Academic Integrity Statement:

AGGIE HONOR CODE: An Aggie does not lie, cheat, or steal or tolerate those who do.

Upon accepting admission to Texas A&M University, a student immediately assumes a commitment to uphold the Honor Code, to accept responsibility for learning, and to follow the philosophy and rules of the Honor System. Students will be required to state their commitment on examinations, research papers, and other academic work. Ignorance of the rules does not exclude any member of the TAMU community from the requirements or the processes of the Honor System.

For additional information please visit: http://aggiehonor.tamu.edu/

Local Course Policy:

Students with Disabilities:

The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact the Department of Student Life, Services for Students with Disabilities, in Cain Hall or call 845-1637.

II. Resources:

  1. UCI Machine Learning Repository: datasets to test machine learning algorithms.

III. Weekly Schedule and Class Notes

Week | Date | Topic | Reading | Assignments | Notices and Dues | Notes
-----|------|-------|---------|-------------|------------------|------
1 | 1/17 | Introduction | Alpaydin chap 1; Mitchell 1.1–1.2, 1.3–1.5 | | | slide01.pdf, slide01b.pdf
1 | 1/19 | | | | |
2 | 1/24 | Supervised learning (general) | Alpaydin chap 2; Mitchell 7.1–7.2, 7.4 | | | slide02.pdf
2 | 1/26 | | | | |
3 | 1/31 | Multilayer perceptrons | Alpaydin chap 11; Mitchell chap 4 | Homework 1 announced | | slide03.pdf
3 | 2/2 | | | | |
4 | 2/7 | Guest lecture: Randall Reams, Tool Use with Evolving Neural Networks | | | |
4 | 2/9 | | | | |
5 | 2/14 | Reinforcement learning | Alpaydin chap 18; Mitchell chap 13 | Homework 2 announced | Homework 1 due | slide04.pdf
5 | 2/16 | | | | |
6 | 2/21 | | | | |
6 | 2/23 | Advanced topics | | | | slide04-sida.pdf
7 | 2/28 | Decision tree learning | Alpaydin chap 9; Mitchell chap 3 | Homework 3 announced | Homework 2 due | slide05.pdf
7 | 3/2 | | | | |
8 | 3/7 | Genetic algorithms | Mitchell chap 9 | | | slide06.pdf
8 | 3/9 | Midterm Exam (in class) | | | |
9 | 3/14 | Spring break (no class) | | Homework 3 announced | |
9 | 3/16 | Spring break (no class) | | | |
10 | 3/21 | Genetic algorithms: advanced topics | | | | slide-cons.pdf
10 | 3/23 | Dimensionality reduction | Alpaydin chap 6: 6.1–6.3, 6.7, 6.8 | | Homework 3 due | slide08.pdf
11 | 3/28 | | | | |
11 | 3/30 | Local models | Alpaydin chap 12 | | | slide09.pdf
12 | 4/4 | | | | |
12 | 4/6 | Bayesian learning | Mitchell chap 6 | Homework 4 announced | Homework 3 due | slide10.pdf
13 | 4/11 | | | | Homework 3 due | slide10.pdf
13 | 4/13 | Bayesian learning | | | | slide10.pdf
14 | 4/18 | Deep learning | Optional: Jürgen Schmidhuber's Deep Learning Page [LINK]; Hinton's tutorial on deep belief networks | | | slide-dl.pdf
14 | 4/20 | Deep learning | | | Homework 4 due | slide-dl.pdf
15 | 4/25 | Support vector machines | | | Homework 4 due | slide11.pdf
15 | 4/27 | Support vector machines | | | Homework 4 due 5/2 Tuesday | slide11.pdf
  | 5/4 (Thu) | Final Exam (3–5pm) | | | |

