#1




computing w, b for soft margin SVM
When we compute w from the alphas for the soft-margin SVM, which values of alpha do we include? Only the margin support vectors (those with 0 < alpha_n < C), or all support vectors (those with 0 < alpha_n <= C)?
Similarly, when computing b, can we use any support vector, or only margin support vectors?
#2




Re: computing w, b for soft margin SVM
Quote:
1. All support vectors (those with 0 < alpha_n <= C) for computing w, since every nonzero alpha_n contributes to the sum.
2. Only margin SV's (those with 0 < alpha_n < C) for computing b, since we need an equation, not an inequality, to solve for b after knowing w.
__________________
Where everyone thinks alike, no one thinks very much 
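A minimal numpy sketch of the rule above (the function name, tolerance, and toy data are my own, not from the thread): w sums over all support vectors, while b is solved from margin support vectors only.

```python
import numpy as np

def recover_w_b(X, y, alpha, C, tol=1e-8):
    """Recover (w, b) from the dual solution alpha of a soft-margin SVM."""
    # w uses ALL support vectors: every n with alpha_n > 0,
    # including the bound ones with alpha_n = C.
    sv = alpha > tol
    w = (alpha[sv] * y[sv]) @ X[sv]
    # b uses only MARGIN support vectors (0 < alpha_n < C); for those,
    # y_n (w . x_n + b) = 1 holds with equality, giving an equation for b.
    # Averaging over all margin SVs is a common numerical stabilization.
    margin = sv & (alpha < C - tol)
    b = float(np.mean(y[margin] - X[margin] @ w))
    return w, b

# Toy 2-point problem whose dual solution is alpha = (0.5, 0.5):
X = np.array([[1.0, 0.0], [-1.0, 0.0]])
y = np.array([1.0, -1.0])
alpha = np.array([0.5, 0.5])
w, b = recover_w_b(X, y, alpha, C=1.0)  # w = (1, 0), b = 0
```

Here both alphas are strictly between 0 and C, so both points are margin support vectors and either one yields the same b.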
#3




Re: computing w, b for soft margin SVM
Quote:
When we use kernel functions with soft-margin SVMs (problem 2 etc.), where there is no explicit w, does the above translate to:
1 ==> use all support vectors to compute the summation term in the hypothesis g(x);
2 ==> use only margin support vectors for b (which also appears in g(x))?
I was wondering if this aspect was covered in the lecture or any of the additional material; I seem to have missed it.
#4




Re: computing w, b for soft margin SVM
Quote:
In the case of kernels, the explicit evaluation of w followed by taking an inner product with a point x is replaced by evaluating the kernel with two arguments; one is a support vector (margin or otherwise) and the other is the point x, and repeating that for all support vectors (margin or otherwise).
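A small sketch of the kernelized version described above (function names and the linear-kernel toy data are my own): the hypothesis sums alpha_n y_n K(x_n, x) over all support vectors, while b is still solved from margin support vectors only.

```python
import numpy as np

def svm_predict(x, X_sv, y_sv, alpha_sv, b, kernel):
    """g(x) = sign( sum_n alpha_n y_n K(x_n, x) + b ), summed over ALL SVs."""
    s = sum(a * t * kernel(xn, x) for a, t, xn in zip(alpha_sv, y_sv, X_sv))
    return np.sign(s + b)

def solve_b(X_sv, y_sv, alpha_sv, kernel, margin_mask):
    # b comes from margin SVs only, where the margin condition is an
    # equality: y_s = sum_n alpha_n y_n K(x_n, x_s) + b. Average for stability.
    bs = [ys - sum(a * t * kernel(xn, xs)
                   for a, t, xn in zip(alpha_sv, y_sv, X_sv))
          for xs, ys, m in zip(X_sv, y_sv, margin_mask) if m]
    return float(np.mean(bs))

# Linear kernel on a 2-point toy problem, so results can be checked by hand.
linear = lambda u, v: float(np.dot(u, v))
X_sv = np.array([[1.0, 0.0], [-1.0, 0.0]])
y_sv = np.array([1.0, -1.0])
alpha_sv = np.array([0.5, 0.5])
b = solve_b(X_sv, y_sv, alpha_sv, linear, margin_mask=[True, True])  # b = 0
pred_pos = svm_predict(np.array([2.0, 0.0]), X_sv, y_sv, alpha_sv, b, linear)
pred_neg = svm_predict(np.array([-3.0, 1.0]), X_sv, y_sv, alpha_sv, b, linear)
```

With the linear kernel this reproduces the explicit-w computation exactly; swapping in any other kernel (e.g. RBF) changes only the `kernel` argument, never the structure of the sum.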


