  #4  
Old 07-29-2013, 12:01 AM
yaser
Caltech
 
Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,477
Re: computing w, b for soft margin SVM

Quote:
Originally Posted by hsolo View Post
Is the 'heuristic' number of parameters (the VC-dimension proxy) to use when reasoning about generalization then the number of margin support vectors, which is << the number of all support vectors?

When we use kernel functions with soft SVMs (problem 2 etc.), where there is no explicit w, does the above translate to:
* 1 ==> Use all support vectors to compute the summation term in the hypothesis function g()
* 2 ==> Use only margin support vectors for b (which is also used in g())

I was wondering if this aspect was covered in the lecture or any of the additional material -- I seem to have missed it.
The computation of {\bf w} involves all support vectors, margin and otherwise, since it involves all \alpha's that are bigger than zero. Once {\bf w} has been computed, the computation of b, for both hard and soft margins, involves any one support vector (margin support vector in the case of soft margin), since it is based on solving the equation y_n ({\bf w}^{\rm T} {\bf x}_n + b) = 1 for b.
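As a quick sketch of the above (not from the lecture; the data and \alpha values below are a hypothetical toy example chosen to satisfy the KKT conditions, not output of a real QP solver):

```python
import numpy as np

# Toy soft-margin setup: X (N x d), labels y in {-1, +1}, box constraint C,
# and alphas as a QP solver might return them (illustrative values only).
X = np.array([[1.0, 1.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
C = 1.0
alphas = np.array([0.25, 0.0, 0.25, 0.0])

eps = 1e-8
sv = alphas > eps                      # all support vectors: alpha_n > 0
margin_sv = sv & (alphas < C - eps)    # margin support vectors: 0 < alpha_n < C

# w uses ALL support vectors, margin or otherwise
w = (alphas[sv] * y[sv]) @ X[sv]

# b comes from any ONE margin support vector, via y_n (w.x_n + b) = 1,
# i.e. b = y_n - w.x_n (since y_n is +1 or -1)
n = np.flatnonzero(margin_sv)[0]
b = y[n] - w @ X[n]
```

For this toy data the recovered hyperplane is w = (0.5, 0.5), b = 0, and any other margin support vector would give the same b.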

In the case of kernels, the explicit evaluation of {\bf w} followed by taking an inner product with a point {\bf x} is replaced by evaluating the kernel with two arguments: one is a support vector (margin or otherwise) and the other is the point {\bf x}. Repeating that for all support vectors (margin or otherwise) and summing the results, weighted by \alpha_n y_n, gives the signal that replaces {\bf w}^{\rm T} {\bf x}.
__________________
Where everyone thinks alike, no one thinks very much