Quote:
Originally Posted by yaser
Correct. The solution was also given in slide 11 of Lecture 12 (regularization).

Yes, my point was: how do you solve this numerically? Given that people will already have good least-squares code (doing an SVD on Z to avoid numerical ill-conditioning), there is no need to implement (poorly) a new regularized least-squares solver.
You can just append a few rows to your training data and feed it into your least-squares solver, i.e.
\lambda \|w\|^2 = \sum_{i=1}^{d} (0 - \sqrt{\lambda}\, w_i)^2
i.e. if w is d-dimensional, you append the matrix sqrt(lambda)*eye(d) to the bottom of your Z matrix and append a d-vector of zeros to your y
(eye(d) is the d-by-d identity matrix). [But this is much better explained in the notes I linked to.]
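A minimal sketch of the trick in NumPy (my own illustration, not the course code): augment Z with sqrt(lambda)*eye(d) rows and y with d zeros, call a plain SVD-based least-squares solver, and check that the result matches the closed-form ridge solution from the lecture slide. The data here is random, just for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5
Z = rng.standard_normal((n, d))   # design matrix
y = rng.standard_normal(n)        # targets
lam = 0.1                         # regularization strength lambda

# Augmented system: [Z; sqrt(lam)*I] w ~= [y; 0].
# Its least-squares objective is ||Zw - y||^2 + lam*||w||^2,
# i.e. exactly the regularized problem.
Z_aug = np.vstack([Z, np.sqrt(lam) * np.eye(d)])
y_aug = np.concatenate([y, np.zeros(d)])

# Ordinary least squares on the augmented data (SVD-based under the hood)
w_aug, *_ = np.linalg.lstsq(Z_aug, y_aug, rcond=None)

# Closed-form ridge solution for comparison: (Z^T Z + lam*I)^{-1} Z^T y
w_ridge = np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ y)

print(np.allclose(w_aug, w_ridge))  # the two solutions agree
```

So any existing least-squares routine doubles as a regularized solver at the cost of d extra rows.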