LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 8 (http://book.caltech.edu/bookforum/forumdisplay.php?f=137)
-   -   computing w, b for soft margin SVM (http://book.caltech.edu/bookforum/showthread.php?t=4312)

 Katie C. 05-24-2013 01:14 PM

computing w, b for soft margin SVM

When we compute w from the alphas for the soft-margin SVM, which values of alpha do we include? Only the margin support vectors (those with 0 < alpha_n < C), or all of them (0 < alpha_n <= C)?

Similarly, when computing b, can we use any support vector, or only margin support vectors?

 yaser 05-24-2013 05:31 PM

Re: computing w, b for soft margin SVM

Quote:
 Originally Posted by Katie C. (Post 10947) When we compute w from the alphas for the soft-margin SVM which values of alpha do we include? only margin support vectors ( those with 0 < alpha_n < C)? or do we include all of them (0 < alpha_n <= C)? Similarly, when computing b, can we use any support vector? or only margin support vectors?
1. All of them for computing w, since any vector with alpha_n > 0 will contribute to the derived solution for w.

2. Only margin SV's for computing b, since we need an equation, not an inequality, to solve for b after knowing w.
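The two rules above can be sketched in code. This is a minimal NumPy sketch, not anything from the course materials: it assumes the alphas have already been obtained from a QP solver, and the function name and tolerance are made up for illustration.

```python
import numpy as np

def recover_w_b(alphas, X, y, C, tol=1e-8):
    """Recover (w, b) from a solved soft-margin linear SVM dual.

    alphas : dual variables returned by a QP solver (assumed input)
    X, y   : training points (rows of X) and +/-1 labels
    C      : the soft-margin penalty parameter
    """
    # Rule 1: w sums over ALL support vectors, i.e. every alpha_n > 0,
    # including the non-margin ones with alpha_n = C.
    sv = alphas > tol
    w = (alphas[sv] * y[sv]) @ X[sv]

    # Rule 2: b comes from margin support vectors only (0 < alpha_n < C),
    # where y_n (w . x_n + b) = 1 holds with equality; averaging over all
    # of them is a common numerical stabilization.
    margin = (alphas > tol) & (alphas < C - tol)
    b = float(np.mean(y[margin] - X[margin] @ w))
    return w, b
```

Any single margin support vector would do for b; the averaging just smooths out solver round-off.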

 hsolo 07-28-2013 05:42 PM

Re: computing w, b for soft margin SVM

Quote:
 Originally Posted by yaser (Post 10949) 1. All of them for computing w, since any vector with alpha_n > 0 will contribute to the derived solution for w. 2. Only margin SV's for computing b, since we need an equation, not an inequality, to solve for b after knowing w.
When reasoning about generalization, is the 'heuristic' number of parameters (the VC-dimension proxy) then the number of margin support vectors, which is << the number of all support vectors?

When we use kernel functions with soft-margin SVMs (problem 2 etc.), where there is no explicit w, does the above translate to:
* 1 ==> Use all support vectors to compute the summation term in the hypothesis function g()
* 2 ==> Use only margin support vectors for b (which is also used in g())

I was wondering if this aspect was covered in the lecture or any of the additional material -- I seem to have missed it.

 yaser 07-29-2013 01:01 AM

Re: computing w, b for soft margin SVM

Quote:
 Originally Posted by hsolo (Post 11301) Is the 'heuristic' number of parameters (the VC dimension proxy) to be used while reasoning about generalization then the number of margin support vectors << the number of all support vectors? When we use kernel functions with soft SVMs (problem 2 etc), where there is no explicit w, does the above translate to : * 1==> Use all support vectors to compute the sigma term in the hypothesis function g() * 2==> Use only margin support vectors for b (which is also used in g() I was wondering if this aspect was covered in the lecture or any of the additional material -- I seem to have missed.
The computation of w involves all support vectors, margin and otherwise, since it involves all alpha_n's that are bigger than zero. Assuming w has been computed, the computation of b, for both hard and soft margins, involves any one support vector (a margin support vector in the case of soft margin) since it is based on solving the equation y_n (w . x_n + b) = 1 for b.

In the case of kernels, the explicit evaluation of w followed by taking an inner product with a point x is replaced by evaluating the kernel with two arguments; one is a support vector (margin or otherwise) and the other is the point x, and repeating that for all support vectors (margin or otherwise).
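The kernel-form hypothesis described above can be sketched as follows. This is an illustrative sketch only; the RBF kernel, function names, and tolerance are my own choices, and b is assumed to have already been computed from a margin support vector as discussed earlier in the thread.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian RBF kernel K(a, b) = exp(-gamma * ||a - b||^2)."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def g(x, alphas, X, y, b, kernel=rbf_kernel, tol=1e-8):
    """Kernel-form hypothesis: no explicit w is ever formed.

    The sum runs over ALL support vectors (alpha_n > 0); the kernel
    replaces the inner product of x with each support vector.
    """
    sv = np.flatnonzero(alphas > tol)
    s = sum(alphas[n] * y[n] * kernel(X[n], x) for n in sv)
    return np.sign(s + b)
```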

 khohi 03-04-2016 07:04 AM

Re: computing w, b for soft margin SVM

thanks :)

