LFD Book Forum computing w, b for soft margin SVM

#1
05-24-2013, 12:14 PM
 Katie C. Member Join Date: Apr 2013 Posts: 17
computing w, b for soft margin SVM

When we compute w from the alphas for the soft-margin SVM, which values of alpha do we include? Only the margin support vectors (those with 0 < alpha_n < C), or all of them (0 < alpha_n <= C)?

Similarly, when computing b, can we use any support vector, or only margin support vectors?
#2
05-24-2013, 04:31 PM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477
Re: computing w, b for soft margin SVM

Quote:
 Originally Posted by Katie C. When we compute w from the alphas for the soft-margin SVM which values of alpha do we include? only margin support vectors ( those with 0 < alpha_n < C)? or do we include all of them (0 < alpha_n <= C)? Similarly, when computing b, can we use any support vector? or only margin support vectors?
1. All of them for computing w, since any vector with alpha_n > 0 will contribute to the derived solution for w.

2. Only margin SV's for computing b, since we need an equation, not an inequality, to solve for b after knowing w.
__________________
Where everyone thinks alike, no one thinks very much
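In code, the two rules above read roughly as follows. This is a minimal sketch, assuming the dual QP has already been solved for the alphas; the function and variable names are illustrative, not from the lecture:

```python
import numpy as np

def recover_w_b(X, y, alpha, C, tol=1e-8):
    """Recover (w, b) for a soft-margin linear SVM from the dual solution.

    X: (N, d) data matrix, y: (N,) labels in {-1, +1},
    alpha: (N,) solution of the dual QP, C: soft-margin penalty.
    """
    # 1. ALL support vectors (alpha_n > 0) contribute to w.
    sv = alpha > tol
    w = (alpha[sv] * y[sv]) @ X[sv]

    # 2. Only MARGIN support vectors (0 < alpha_n < C) satisfy
    #    y_n (w . x_n + b) = 1 exactly, so solve that equation for b.
    margin = sv & (alpha < C - tol)
    n = np.argmax(margin)  # index of any one margin SV
    b = y[n] - w @ X[n]
    return w, b
```

The tolerance `tol` is there because a numerical QP solver returns alphas that are only approximately zero (or approximately C), never exactly.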
#3
07-28-2013, 04:42 PM
 hsolo Member Join Date: Jul 2013 Posts: 12
Re: computing w, b for soft margin SVM

Quote:
 Originally Posted by yaser 1. All of them for computing w, since any vector with alpha_n > 0 will contribute to the derived solution for w. 2. Only margin SV's for computing b, since we need an equation, not an inequality, to solve for b after knowing w.
Is the 'heuristic' number of parameters (the VC-dimension proxy) to use when reasoning about generalization then the number of margin support vectors (which can be << the number of all support vectors)?

When we use kernel functions with soft-margin SVMs (problem 2 etc.), where there is no explicit w, does the above translate to:
* 1 ==> Use all support vectors to compute the summation term in the hypothesis function g()
* 2 ==> Use only margin support vectors for b (which is also used in g())

I was wondering if this aspect was covered in the lecture or any of the additional material -- I seem to have missed it.
#4
07-29-2013, 12:01 AM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477
Re: computing w, b for soft margin SVM

Quote:
 Originally Posted by hsolo Is the 'heuristic' number of parameters (the VC dimension proxy) to be used while reasoning about generalization then the number of margin support vectors << the number of all support vectors? When we use kernel functions with soft SVMs (problem 2 etc), where there is no explicit w, does the above translate to : * 1==> Use all support vectors to compute the sigma term in the hypothesis function g() * 2==> Use only margin support vectors for b (which is also used in g() I was wondering if this aspect was covered in the lecture or any of the additional material -- I seem to have missed.
The computation of w involves all support vectors, margin and otherwise, since it involves all alpha_n's that are bigger than zero. Assuming w has been computed, the computation of b, for both hard and soft margins, involves any one support vector (margin support vector in the case of soft margin) since it is based on solving the equation y_n (w^T x_n + b) = 1 for b.

In the case of kernels, the explicit evaluation of w followed by taking an inner product with a point x is replaced by evaluating the kernel with two arguments; one is a support vector x_n (margin or otherwise) and the other is the point x, and repeating that for all support vectors (margin or otherwise).
__________________
Where everyone thinks alike, no one thinks very much
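To make the kernel case concrete, here is a minimal sketch, again assuming the dual has already been solved for the alphas (the names `make_g` and `kernel` are my own, not from the course material):

```python
import numpy as np

def make_g(X, y, alpha, C, kernel, tol=1e-8):
    """Build the kernel soft-margin hypothesis g(x) from a solved dual.

    The summation runs over ALL support vectors (alpha_n > 0);
    b comes from any one MARGIN support vector (0 < alpha_n < C).
    """
    sv = np.where(alpha > tol)[0]
    margin = np.where((alpha > tol) & (alpha < C - tol))[0]
    m = margin[0]  # any one margin SV gives the equation for b
    b = y[m] - sum(alpha[n] * y[n] * kernel(X[n], X[m]) for n in sv)

    def g(x):
        # w never appears explicitly; the inner product w . x becomes
        # a sum of kernel evaluations over all support vectors.
        s = sum(alpha[n] * y[n] * kernel(X[n], x) for n in sv)
        return np.sign(s + b)

    return g
```

For example, `kernel = lambda x1, x2: float(np.dot(x1, x2))` recovers the linear case, and an RBF kernel would be `lambda x1, x2: np.exp(-gamma * np.sum((x1 - x2) ** 2))`.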
#5
03-04-2016, 06:04 AM
 khohi Member Join Date: Dec 2015 Posts: 10
Re: computing w, b for soft margin SVM

thanks
