
#1




Clarification: Support vectors decrease the number of features of W
In Lecture 14, the Professor mentions that only the support vectors contribute to W (the rest have alpha = 0), which leads to a decrease in the effective number of features and thus better generalization.
I'm not sure I understand this point, because I thought the VC dimension for W would equal d, the number of dimensions of the space, regardless of the number of points being summed. Aren't we just summing the various "x" vectors, each multiplied by alpha*y? How does this decrease the number of features of W? Thank you!
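As a concrete check of the claim under discussion, here is a small sketch showing that most of the alphas are zero and that W can be recovered from the support vectors alone. The library (scikit-learn) and the toy data are my own choices for illustration, not something from the lecture:

```python
# Sketch: verify that only the support vectors contribute to W.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two linearly separable 2-D clusters, labels -1 / +1.
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin

# dual_coef_ holds alpha_i * y_i, but only for the support vectors;
# every other point has alpha = 0 and drops out of the sum.
w_from_sv = clf.dual_coef_[0] @ clf.support_vectors_

print(len(clf.support_), "support vectors out of", len(X), "points")
print(np.allclose(w_from_sv, clf.coef_[0]))
```

So although W lives in d dimensions either way, the hypothesis is determined by a handful of points, which is the sense in which generalization improves.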
#2




Re: Clarification: Support vectors decrease the number of features of W
#3




Re: Clarification: Support vectors decrease the number of features of W
I found a less sophisticated way of thinking about it that seems helpful to me.
If you merely assume that general points associated with a particular target (say +1) are more likely to be near sample points with that target than far from them, then the bigger the margin, the lower the probability that a general point with target +1 will be wrongly classified (because the bigger the minimum distance from any point in the sample to a point that would be classified differently). This ties in quite intuitively with the idea that distances from the support vectors (or some sort of transformed distance, if kernels are used) are the basis of the hypothesis.
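The margin intuition above can be sketched numerically: the margin is the minimum geometric distance from any sample point to the separating hyperplane, attained at the support vectors. The library (scikit-learn) and the toy data here are my own assumptions, not from the thread:

```python
# Sketch: the closest points to the hyperplane are the support vectors,
# and for a (near) hard margin their distance is 1/||w||.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.4, (15, 2)), rng.normal(2, 0.4, (15, 2))])
y = np.hstack([-np.ones(15), np.ones(15)])

clf = SVC(kernel="linear", C=1e6).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]

# Geometric distance of every point to the hyperplane w.x + b = 0.
dist = np.abs(X @ w + b) / np.linalg.norm(w)

# Minimum distance (the margin) equals 1/||w|| for a hard margin.
print(dist.min(), 1 / np.linalg.norm(w))
```

A bigger margin pushes every correctly labeled region of the input space further from the boundary, which is exactly the lower-misclassification argument made above.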
#4




Re: Clarification: Support vectors decrease the number of features of W
Thank you, Professor and Elroch, for the answers! That clears things up.

Tags 
doubt, lecture 14, support vector machines 

