CPSC 633-600 Machine Learning:
Spring 2015

Syllabus


Page last modified: 2/23/15, 04:34PM Monday.


I. General Information

Instructor:

Dr. Yoonsuck Choe
Email: choe(a)tamu.edu
Office: HRBB 322B
Phone: 845-5466
Office hours: TR 1-2pm

TA:

Manisha Srivastava
Email: manisha.m.srivastava(a)gmail.com
Office: HRBB 339
Phone: None
Office hours: MWF 11am-12pm

Grader:

Yujia Liu
Email: liuyujia(a)email.tamu.edu

Prerequisite/Restrictions:

CPSC 420, 625, or consent of instructor.

Lectures:

TR 3:55pm–5:10pm, ETB 1020.

Introduction:

Machine learning is the study of self-modifying computer systems that can acquire new knowledge and improve their own performance. This course surveys machine learning techniques, including induction from examples, Bayesian learning, artificial neural networks, instance-based learning, genetic algorithms, reinforcement learning, unsupervised learning, and biologically motivated learning algorithms.

Goal:

The goal of this course is to

  1. learn about various problems and solution strategies in machine learning.
  2. learn a practical methodology for applying ML algorithms to a problem domain of your choice.

Textbook:

Administrative details:

  1. Computer accounts: if you do not have a unix account, request one via the CS web page.
  2. Programming languages permitted: C/C++, Java, or Matlab (or Octave). Programs must be executable on CS unix hosts or other public systems in the departmental lab.

Topics to be covered:

See the Weekly Schedule section for more details. The content will closely follow a combination of Alpaydin and Mitchell.

Grading:

  1. 4 assignments (each with written and programming components), 15% each = 60%
  2. 2 exams (in class), 15% each = 30%
  3. Class participation, 10% (repeated absences will count most heavily against this score).

The cutoff for an "A" will be at most 90% of the total score, 80% for a "B", 70% for a "C", and 60% for a "D". However, these cutoffs may be lowered at the end of the semester to accommodate the actual distribution of grades.
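The weighting and cutoffs above can be sketched as a short computation. This is only an illustration of how the components combine; the function names and the example scores are hypothetical, and the actual cutoffs may be lowered as noted.

```python
def final_score(homeworks, exams, participation):
    """Weighted total: 4 homeworks at 15% each, 2 exams at 15% each,
    class participation at 10%. All component scores are on a 0-100 scale."""
    assert len(homeworks) == 4 and len(exams) == 2
    return (0.15 * sum(homeworks)
            + 0.15 * sum(exams)
            + 0.10 * participation)

def letter_grade(score):
    """Nominal cutoffs; per the syllabus, these may be lowered."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

# Hypothetical example: perfect homeworks and exams, 80/100 participation.
# total = 0.15*400 + 0.15*200 + 0.10*80 = 60 + 30 + 8 = 98
total = final_score([100, 100, 100, 100], [100, 100], 80)
```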

Academic Integrity Statement:

AGGIE HONOR CODE: An Aggie does not lie, cheat or steal, or tolerate those who do.

Upon accepting admission to Texas A&M University, a student immediately assumes a commitment to uphold the Honor Code, to accept responsibility for learning, and to follow the philosophy and rules of the Honor System. Students will be required to state their commitment on examinations, research papers, and other academic work. Ignorance of the rules does not exclude any member of the TAMU community from the requirements or the processes of the Honor System.

For additional information please visit: http://aggiehonor.tamu.edu/

Local Course Policy:

Students with Disabilities:

The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact the Department of Student Life, Services for Students with Disabilities, in Cain Hall or call 845-1637.

II. Resources:

  1. UCI Machine Learning Repository: datasets to test machine learning algorithms.
  2. Research resources page

III. Weekly Schedule and Class Notes

Week | Date | Topic | Reading | Assignments | Notices and Dues | Notes
-----|------|-------|---------|-------------|------------------|------
1 | 1/20 | Introduction | Alpaydin chap 1; Mitchell 1.1–1.2, 1.3–1.5 | | | slide01.pdf, slide01b.pdf
1 | 1/22 | | | | |
2 | 1/27 | Supervised Learning (general) | Alpaydin chap 2; Mitchell 7.1–7.2, 7.4 | | | slide02.pdf
2 | 1/29 | | | | |
3 | 2/3 | Multilayer perceptrons | Alpaydin chap 11; Mitchell chap 4 | | | slide03.pdf
3 | 2/5 | | | | |
4 | 2/10 | | | Homework 1 announced | |
4 | 2/12 | | | | |
5 | 2/17 | Reinforcement learning | Alpaydin chap 18; Mitchell chap 13 | | | slide04.pdf
5 | 2/19 | | | | |
6 | 2/24 | Advanced topics | | Homework 2 announced | Homework 1 due | slide04-sida.pdf
6 | 2/26 | Decision tree learning | Alpaydin chap 9; Mitchell chap 3 | | | slide05.pdf
7 | 3/3 | | | | |
7 | 3/5 | Genetic Algorithms | Mitchell chap 9 | | | slide06.pdf
8 | 3/10 | | | Homework 3 announced | Homework 2 due |
8 | 3/12 | Exam #1 | | | |
9 | 3/17 | Spring break (no class) | | | |
9 | 3/19 | Spring break (no class) | | | |
10 | 3/24 | Genetic Algorithms: Advanced topics | | | | slide07.pdf
10 | 3/26 | Dimensionality reduction | Alpaydin chap 6: 6.1–6.3, 6.7, 6.8 | | | slide08.pdf
11 | 3/31 | | | Homework 4 announced | Homework 3 due |
11 | 4/2 | Local models | Alpaydin chap 12 | | | slide09.pdf
12 | 4/7 | | | | |
12 | 4/9 | Bayesian learning | Mitchell chap 6 | | | slide10.pdf
13 | 4/14 | | | | | slide10.pdf
13 | 4/16 | Online guest lecture: "Deep Learning RNNaissance" with Dr. Jürgen Schmidhuber | | | |
14 | 4/21 | Bayesian learning | | | | slide10.pdf
14 | 4/23 | Exam #2 | | | |
15 | 4/28 | Special topic: Texture Processing using Biologically Inspired Neural Networks | | | Homework 4 due | slide-texture.pdf
15 | 4/30 | Deep learning | Optional reading: Jürgen Schmidhuber's Deep Learning Page [LINK]; Hinton's tutorial on deep belief networks | | | slidedl.pdf, slidebm.pdf

