CPSC 636-600 Neural Networks
Spring 2017


I. General Information

Instructor:

Dr. Yoonsuck Choe
Email: choe(a)tamu.edu
Office: HRBB 322B
Phone: 979-845-5466
Office hours: Tue/Thu 2pm to 3pm

TA:

Randall Reams
Email: rcr344 at tamu.edu
Office: HRBB 339
Office hours: Mon/Wed/Fri 1pm to 2pm

Prerequisite/Restrictions:

Math 304 (linear algebra) and 308 (differential equations) or approval of instructor. (Actually, if you are mildly familiar with linear algebra and have taken calculus, you should be fine.)

Prior programming experience is not a prerequisite, but there will be programming assignments. It is preferred that you have already taken 633 (Machine Learning).

Lectures:

TR 8am-9:15am, HRBB 113

Synopsis:

Basic concepts in neural computing; functional equivalence and convergence properties of neural network models; associative memory models; associative, competitive and adaptive resonance models of adaptation and learning; selective applications of neural networks to vision, speech, motor control and planning; neural network modeling environments.

Textbook:

The official textbook for this class will be Simon Haykin, Neural Networks and Learning Machines (3rd edition). However, much of the same material appears in the older edition, Neural Networks: A Comprehensive Foundation (2nd edition), so that can be a good, cheaper alternative.

Other books: see slide01.pdf.

Computer Accounts and Usage:

  1. Computer accounts: if you do not have a Unix account, request one on the CS department web page.

Topics to be covered:

See the Weekly Schedule section for more details.

Grading:

  1. Exams: 60% (midterm: 30%, final: 30%).
  2. Assignments: 40% (4 written+programming assignments, 10% each). Late submissions: 1 point (out of 100) deduction per hour late.
Grading will be on an absolute scale. The cutoff for an 'A' will be 90% of the total score, 80% for a 'B', 70% for a 'C', 60% for a 'D', and below 60% for an 'F'.

Attendance will be checked on a random basis. More than 3 absences will result in a deduction of 5 points (out of 100) from the final weighted total (a small worked example is sketched below).
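
To make the arithmetic above concrete, here is a minimal Octave/Matlab sketch of how a weighted total and letter grade could be computed under this scheme. All scores, variable names, and late-hour counts below are made-up illustrations, not actual course data.

    % Hypothetical example of the grading scheme described above.
    % Weights: midterm 30%, final 30%, four homeworks 10% each.
    midterm    = 85;            % midterm exam score out of 100 (example value)
    final_exam = 78;            % final exam score out of 100 (example value)
    hw         = [90 82 88 95]; % four homework scores out of 100 (example values)
    hours_late = [0 5 0 2];     % hours each homework was late (example values)
    absences   = 4;             % recorded absences (example value)

    % Late penalty: 1 point (out of 100) deduction per hour late, per assignment.
    hw_adj = max(hw - hours_late, 0);

    % Weighted total: exams 60% (30% each), homeworks 40% (10% each).
    total = 0.3*midterm + 0.3*final_exam + sum(0.1*hw_adj);

    % Attendance: more than 3 absences deducts 5 points from the weighted total.
    if absences > 3
      total = total - 5;
    end

    % Absolute-scale letter grade cutoffs: 90/80/70/60.
    if total >= 90
      grade = 'A';
    elseif total >= 80
      grade = 'B';
    elseif total >= 70
      grade = 'C';
    elseif total >= 60
      grade = 'D';
    else
      grade = 'F';
    end

    fprintf('Weighted total: %.1f, letter grade: %s\n', total, grade);

With these example numbers the weighted total comes out to 78.7, which falls in the 'C' range.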

Academic Integrity Statement:

AGGIE HONOR CODE: An Aggie does not lie, cheat or steal, or tolerate those who do.

Upon accepting admission to Texas A&M University, a student immediately assumes a commitment to uphold the Honor Code, to accept responsibility for learning, and to follow the philosophy and rules of the Honor System. Students will be required to state their commitment on examinations, research papers, and other academic work. Ignorance of the rules does not exclude any member of the TAMU community from the requirements or the processes of the Honor System.

For additional information please visit: http://www.tamu.edu/aggiehonor/

Local Course Policy:

Students with Disabilities:

The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact Disability Services, currently located in the Disability Services building at the Student Services at White Creek complex on west campus or call 979-845-1637. For additional information, visit http://disability.tamu.edu.

II. Resources

  1. Matlab code for the examples in the textbook (2nd edition): Download the ZIP archive.
  2. Linear algebra review (by Eero Simoncelli)
  3. Matlab Primer by Kermit Sigmon, University of Florida
  4. GNU Octave http://www.octave.org (compatible with Matlab)
  5. My general resources page
  6. Neural networks resources:

III. Weekly Schedule and Class Notes

Week | Date | Topic | Reading | Assignments | Notices and Dues | Notes
1 | 1/17 | Introduction | Chap 1 (Intro chapter, 3rd ed) | | | slide01.pdf
1 | 1/19 | Introduction | Chap 1 (Intro chapter, 3rd ed) | | | slide01.pdf
2 | 1/24 | Learning process | Chap 2 (Intro chapter sections 8, 9) | | | slide02.pdf
2 | 1/26 | Learning process | Chap 2 (Intro chapter sections 8, 9) | | | slide02.pdf
3 | 1/31 | Single-layer perceptrons | Chap 3 (Chap 1, Chap 3) | Homework 1 assigned | | slide03.pdf
3 | 2/2 | Single-layer perceptrons | Chap 3 (Chap 1, Chap 3) | | | slide03.pdf
4 | 2/7 | Guest lecture: Jaewook Yoo, "Development of target reaching gesture map in the cortex and its relation to the motor map: A simulation study" | | | | slide.pdf (TBA)
4 | 2/9 | Single-layer perceptrons | Chap 3 (Chap 1, Chap 3) | | | slide03.pdf
5 | 2/14 | Multi-layer perceptrons | Chap 4 | | | slide04.pdf
5 | 2/16 | Multi-layer perceptrons | Chap 4 | Homework 2 assigned | Homework 1 due | slide04.pdf
6 | 2/21 | Multi-layer perceptrons | Chap 4 | | | slide04-suppl.pdf
6 | 2/23 | Radial-basis functions | Chap 5 | | | slide05.pdf
7 | 2/28 | Radial-basis functions | Chap 5 | | | slide05.pdf
7 | 3/2 | Special topic: Biologically inspired models (will revisit as needed) | | | Homework 2 due | slide06.pdf
8 | 3/7 | Midterm Exam (in class) | | | | slide.pdf (TBA)
8 | 3/9 | Deep learning | | Homework 3 assigned | | slide-dl.pdf
9 | 3/14 | Spring Break (no class) | | Homework 3 assigned | | slide.pdf (TBA)
9 | 3/16 | Spring Break (no class) | | | | slide.pdf (TBA)
10 | 3/21 | Deep learning | | | | slide-dl.pdf
10 | 3/23 | Deep learning | | | | slide-dl.pdf
11 | 3/28 | Self-organizing maps | Chap 9 | | | slide07.pdf
11 | 3/30 | Self-organizing maps | Chap 9 | | Homework 3 due | slide07-suppl.pdf
12 | 4/4 | Neurodynamics | Chap 14 (3rd ed. Chap 13) | | | slide08.pdf
12 | 4/6 | Neurodynamics | Chap 14 (3rd ed. Chap 13) | Homework 4 assigned | Homework 3 due | slide08.pdf
13 | 4/11 | Boltzmann machine | | | | slide-bm.pdf
13 | 4/13 | Principal components analysis | Chap 8 | | | slide10.pdf
14 | 4/18 | PCA, Information theoretic models | Chap 8, Chap 10, ICA | | | slide10.pdf, slide11.pdf
14 | 4/20 | Information theoretic models, ICA | Chap 10, ICA | | | slide11.pdf
15 | 4/25 | ICA | Chap 10, ICA | | | slide11.pdf
15 | 4/27 | Special topic: Visual saliency detection | | | Homework 4 due | slide12.pdf
 | 5/5 (Fri) | Final Exam (1-3pm) | | | |
 

