Quote:
Originally Posted by a.sanyal902
In Lecture 14, the Professor mentions that because only the support vectors count towards W (the rest have alpha = 0), the number of features decreases and thus generalization is better.
I'm not sure I got this point because I thought the VC dimension for W would be equal to d, the no. of dimensions of the space, regardless of the number of points being summed. Aren't we just summing the various "x" vectors, multiplied by alpha*y ? How does this decrease the number of features of W?
Thank You!
It decreases the effective number of parameters. If you have 5 support vectors, for example, then the vector

\mathbf{w} = \sum_{n=1}^{N} \alpha_n y_n \mathbf{x}_n

(where only the support vectors have \alpha_n > 0) lives in a 5-dimensional subspace of the d-dimensional space it is formally defined over. There is a bit of liberty taken in this explanation, since which vectors will be the support vectors is not known a priori.
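To see this numerically, here is a minimal sketch of my own (not from the lecture), assuming scikit-learn's SVC and a made-up toy data set in d = 10 dimensions. With a linear kernel, the learned weight vector can be rebuilt exactly from the support vectors and their alpha_n * y_n coefficients, so it lies in the subspace spanned by the support vectors even though it formally has d components.

Code:
# Sketch only: illustrates that w is a combination of the support vectors alone.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Toy, roughly separable data in d = 10 dimensions (assumed setup).
X, y = make_blobs(n_samples=40, centers=2, n_features=10, random_state=0)

clf = SVC(kernel="linear", C=1e6)   # large C approximates a hard-margin SVM
clf.fit(X, y)

# dual_coef_ holds alpha_n * y_n for the support vectors only.
w_from_svs = clf.dual_coef_ @ clf.support_vectors_

print("number of support vectors:", len(clf.support_vectors_))
print("w rebuilt from support vectors matches clf.coef_:",
      np.allclose(w_from_svs, clf.coef_))

Typically only a handful of the 40 points end up as support vectors, and the reconstruction matches clf.coef_ exactly, which is the sense in which the effective number of parameters is smaller than d or N.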