CSCE 666: Pattern Analysis
Instructor: Ricardo Gutierrez-Osuna
Course description: Introduction to methods for the analysis, classification and clustering of high-dimensional data in Computer Science applications. Course contents include density and parameter estimation, linear feature extraction, feature subset selection, clustering, Bayesian and geometric classifiers, non-linear dimensionality reduction methods from statistical learning theory and spectral graph theory, Hidden Markov models, and ensemble learning.
Prerequisites: CPSC 206, MATH 222, MATH 411 (or equivalent) and graduate standing in CPSC, CECN, ELEN, CEEN (or permission of the instructor). Basic knowledge of Linear Algebra, Probability and Statistics: algebra of matrices, geometry of Euclidean space, vector spaces and subspaces, basis, linear independence, linear transformations, eigenvalues and eigenvectors, mean, variance, probability and distributions. Programming experience in a high-level language is required.
This material is NOT intended to be comprehensive; it is a SUMMARY of the key concepts covered in the lectures. Consult the syllabus for additional reading material from the textbook, which may also be included in the tests.
Lecture slides

- Introduction to Pattern Recognition
- Review of Statistics and Probability
- Linear Algebra and MATLAB
- Fourier Analysis
- Bayesian Decision Theory
- Quadratic Classifiers
- Parameter Estimation
- Kernel Density Estimation
- Nearest Neighbors
- Linear Discriminant Functions
- Cross-validation
- Principal Components (GIF)
- Fisher's Linear Discriminants
- Feature Subset Selection
- Advanced Dimensionality Reduction
- Mixture Models and EM (MPEG)
- Statistical Clustering (MPEG)
- Independent Components Analysis
- Support Vector Machines
- SVMs and Kernel Methods
- Kernel PCA/LDA
- Discrete HMMs, Viterbi
- Baum-Welch and Entropic Training
- Ensemble Learning
Homework assignments

- Homework 1
- Homework 2
- Homework 3