
Mod01 Lec01 Introduction to Statistical Pattern Recognition

Mod01 Lec02 Overview of Pattern Classifiers

Mod02 Lec03 The Bayes Classifier for minimizing Risk

Mod02 Lec04 Estimating Bayes Error; Minimax and Neyman-Pearson classifiers

Mod03 Lec05 Implementing Bayes Classifier; Estimation of Class Conditional Densities

Mod03 Lec06 Maximum Likelihood estimation of different densities

Mod03 Lec07 Bayesian estimation of parameters of density functions, MAP estimates

Mod03 Lec08 Bayesian Estimation examples; the exponential family of densities and ML estimates

Mod03 Lec09 Sufficient Statistics; Recursive formulation of ML and Bayesian estimates

Mod04 Lec10 Mixture Densities, ML estimation and EM algorithm

Mod04 & 05 Lec11 Convergence of EM algorithm; Overview of Nonparametric density estimation

Mod05 Lec12 Nonparametric estimation, Parzen Windows, nearest neighbour methods

Mod06 Lec13 Linear Discriminant Functions; Perceptron Learning Algorithm and convergence proof

Mod06 Lec14 Linear Least Squares Regression; LMS algorithm

Mod06 Lec15 AdaLinE and LMS algorithm; General nonlinear least-squares regression

Mod06 Lec16 Logistic Regression; Statistics of least squares method; Regularized Least Squares

Mod06 Lec17 Fisher Linear Discriminant

Mod06 Lec18 Linear Discriminant functions for multiclass case; multiclass logistic regression

Mod07 Lec19 Learning and Generalization; PAC learning framework

Mod07 Lec20 Overview of Statistical Learning Theory; Empirical Risk Minimization

Mod07 Lec21 Consistency of Empirical Risk Minimization

Mod07 Lec22 Consistency of Empirical Risk Minimization; VC-Dimension

Mod07 Lec23 Complexity of Learning problems and VC-Dimension

Mod07 Lec24 VC-Dimension Examples; VC-Dimension of hyperplanes

Mod08 Lec25 Overview of Artificial Neural Networks

Mod08 Lec26 Multilayer Feedforward Neural networks with Sigmoidal activation functions

Mod08 Lec27 Backpropagation Algorithm; Representational abilities of feedforward networks

Mod08 Lec28 Feedforward networks for Classification and Regression; Backpropagation in Practice

Mod08 Lec29 Radial Basis Function Networks; Gaussian RBF networks

Mod08 Lec30 Learning Weights in RBF networks; K-means clustering algorithm

Mod09 Lec31 Support Vector Machines - Introduction, obtaining the optimal hyperplane

Mod09 Lec32 SVM formulation with slack variables; nonlinear SVM classifiers

Mod09 Lec33 Kernel Functions for nonlinear SVMs; Mercer and positive definite Kernels

Mod09 Lec34 Support Vector Regression and ε-insensitive Loss function, examples of SVM learning

Mod09 Lec35 Overview of SMO and other algorithms for SVM; ν-SVM and ν-SVR; SVM as a risk minimizer

Mod09 Lec36 Positive Definite Kernels; RKHS; Representer Theorem

Mod10 Lec37 Feature Selection and Dimensionality Reduction; Principal Component Analysis

Mod10 Lec38 No Free Lunch Theorem; Model selection and model estimation; Bias-variance tradeoff

Mod10 Lec39 Assessing Learnt classifiers; Cross Validation

Mod11 Lec40 Bootstrap, Bagging and Boosting; Classifier Ensembles; AdaBoost

Mod11 Lec41 Risk minimization view of AdaBoost