This lab is about linear Regularized Least Squares (RLS).
Extract the zip file into a directory and add that directory to the MATLAB path. Follow the instructions below, and think/try hard before you call the instructors!
Start by generating data using the provided function MixGauss:
1.B Generate a corresponding test set of 200 points per class (Xte, Yte).
1.C Add noise to the generated data by randomly flipping a percentage of the point labels (e.g. 10%), using the provided function flipLabels. You will obtain new training (Ytrn) and test (Yten) output vectors. Plot the various datasets using scatter, e.g.:
figure; hold on
scatter(Xtr(Ytr==1,1), Xtr(Ytr==1,2), 25, 'r', 'filled');
scatter(Xtr(Ytr==-1,1), Xtr(Ytr==-1,2), 25, 'b', 'filled');
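The label-flipping step (1.C) can be sketched as below, assuming flipLabels takes a label vector and the fraction of labels to flip; check the provided file for the actual signature.

```matlab
% Sketch -- flipLabels(Y, p) is assumed to flip a random fraction p
% of the +-1 labels in Y; see flipLabels.m for the real interface.
p = 0.1;                      % fraction of labels to flip (10%)
Ytrn = flipLabels(Ytr, p);    % noisy training labels
Yten = flipLabels(Yte, p);    % noisy test labels
```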
2.A Complete the code in functions regularizedLSTrain and regularizedLSTest for training and testing a regularized Least Squares classifier.
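As a reference for step 2.A, a minimal sketch of the closed-form RLS solution is given below. The signatures of the provided stubs may differ from the ones assumed here; the training step solves (X'X + lambda*n*I) w = X'Y, and the test step classifies by the sign of the linear score.

```matlab
function w = regularizedLSTrain(Xtr, Ytr, lambda)
% Closed-form regularized least squares:
%   w = (X'X + lambda * n * I)^(-1) X'Y
% Using backslash instead of an explicit inverse is both faster
% and numerically safer.
    [n, d] = size(Xtr);
    w = (Xtr' * Xtr + lambda * n * eye(d)) \ (Xtr' * Ytr);
end

function Ypred = regularizedLSTest(w, Xte)
% Predict +-1 labels from the sign of the linear score
    Ypred = sign(Xte * w);
end
```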
2.D Repeat the procedure (data generation -> parameter selection -> test) multiple times and compare the test error of RLS with that of ordinary least squares (OLS), i.e. with lambda = 0. Does regularization improve classification performance?
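The repeated comparison in 2.D can be sketched as below. To keep the example self-contained, it draws two Gaussian classes directly with randn instead of MixGauss, and it fixes lambda to an assumed value (0.1) rather than selecting it on a validation set as the exercise asks.

```matlab
% Sketch: repeat generation -> training -> test, comparing RLS vs OLS.
% lambda is fixed here for brevity; in the exercise it should come
% from the parameter-selection step.
nTrials = 20; n = 100; nTest = 200; lambda = 0.1;
errRLS = zeros(nTrials, 1); errOLS = zeros(nTrials, 1);
for t = 1:nTrials
    % two Gaussian classes centered at (0,0) and (1,1)
    Xtr = [randn(n,2)*0.5;     randn(n,2)*0.5 + 1];
    Ytr = [ones(n,1);         -ones(n,1)];
    Xte = [randn(nTest,2)*0.5; randn(nTest,2)*0.5 + 1];
    Yte = [ones(nTest,1);     -ones(nTest,1)];
    [ntot, d] = size(Xtr);
    wRLS = (Xtr'*Xtr + lambda*ntot*eye(d)) \ (Xtr'*Ytr);
    wOLS = (Xtr'*Xtr) \ (Xtr'*Ytr);          % lambda = 0
    errRLS(t) = mean(sign(Xte*wRLS) ~= Yte); % fraction misclassified
    errOLS(t) = mean(sign(Xte*wOLS) ~= Yte);
end
fprintf('mean test error  RLS: %.3f   OLS: %.3f\n', ...
        mean(errRLS), mean(errOLS));
```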
3.A Modify the regularizedLSTrain and regularizedLSTest functions to incorporate an offset in the linear model (i.e., y = <w,x> + b). Compare the solutions with and without offset on a 2-class data set where the classes are centered at (0,0) and (1,1), each with variance 0.35.
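One way to handle the offset in 3.A is to absorb b into the weights by appending a constant feature of ones. The function names below are illustrative, not the ones required by the lab stubs.

```matlab
% Sketch: append a column of ones so the last weight acts as the
% offset b. Note this also regularizes b; a common variant leaves b
% unpenalized, e.g. by centering the data instead.
function [w, b] = regularizedLSTrainOffset(Xtr, Ytr, lambda)
    [n, d] = size(Xtr);
    Xa = [Xtr, ones(n, 1)];                        % augmented data
    wa = (Xa' * Xa + lambda * n * eye(d+1)) \ (Xa' * Ytr);
    w = wa(1:d);
    b = wa(end);
end

function Ypred = regularizedLSTestOffset(w, b, Xte)
    Ypred = sign(Xte * w + b);
end
```

Without the offset the decision boundary is forced through the origin, which is exactly what hurts when the classes are centered at (0,0) and (1,1).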
3.C Modify the regularizedLSTrain and regularizedLSTest functions to handle multiclass problems.
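A standard approach for 3.C is one-vs-all: train one binary RLS classifier per class and predict the class with the highest score. The sketch below assumes Ytr contains class indices (not +-1 labels); the names and the extra classes argument at test time are illustrative.

```matlab
% Sketch: one-vs-all RLS for multiclass classification.
function [W, classes] = regularizedLSTrainMulti(Xtr, Ytr, lambda)
    [n, d] = size(Xtr);
    classes = unique(Ytr);
    W = zeros(d, numel(classes));            % one weight vector per class
    for c = 1:numel(classes)
        Yc = 2 * (Ytr == classes(c)) - 1;    % +1 for class c, -1 otherwise
        W(:, c) = (Xtr' * Xtr + lambda * n * eye(d)) \ (Xtr' * Yc);
    end
end

function Ypred = regularizedLSTestMulti(W, classes, Xte)
    [~, idx] = max(Xte * W, [], 2);          % highest-scoring class wins
    Ypred = classes(idx);
end
```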