LFD Book Forum Question 13

#1
09-12-2012, 10:29 PM
 Anton Khorev Junior Member Join Date: Sep 2012 Posts: 3
Question 13

Looks like there are two answers for Q13. It's possible to get different numbers of support vectors with Octave's qp and libsvm.
#2
09-12-2012, 11:04 PM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477
Re: Question 13

Quote:
 Originally Posted by Anton Khorev Looks like there are two answers for Q13. It's possible to get different numbers of support vectors with Octave's qp and libsvm.
Interesting. Is the hypothesis identical?
__________________
Where everyone thinks alike, no one thinks very much
#3
09-13-2012, 09:40 AM
 MLearning Senior Member Join Date: Jul 2012 Posts: 56
Re: Question 13

I think it has to do with the fact that qp (and quadprog in MATLAB) return alpha values that are negligibly small for the non-support vectors. By setting an appropriate threshold, it is possible to filter out these very small values.
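A minimal sketch of that thresholding step (in Python rather than Octave, and with made-up alpha values for illustration): keep only the alphas above a small cutoff and count those as the support vectors.

```python
# Hypothetical alphas as a QP solver might return them: the tiny
# positive values (~1e-13) belong to non-support vectors and should
# be filtered out before counting SVs.
THRESHOLD = 1e-6  # assumed cutoff, well below any "real" alpha

alphas = [0.7142857, 3.2e-15, 0.3571429, 1.1e-13, 0.3571429]

support_idx = [i for i, a in enumerate(alphas) if a > THRESHOLD]
print(len(support_idx))  # number of support vectors after filtering: 3
```

Without the threshold, all five alphas would be counted, which is exactly the kind of discrepancy discussed in this thread.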

In Homework 7, one of the students introduced a trick as a means to work around the initialization problem in qp (or quadprog). When I applied this trick, qp and libsvm produced different numbers of SVs. However, when I initialize all alphas to a vector of zeros, libsvm and Octave's qp yield the same number of SVs.
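For reference, here is a small sketch (Python, with made-up toy data) of the matrices handed to qp for the hard-margin dual, minimize (1/2) a'Qa - 1'a subject to y'a = 0 and a >= 0, with Q[i][j] = y[i]*y[j]*(x[i] . x[j]) and the all-zeros initialization mentioned above:

```python
# Toy data set; the points and labels are hypothetical.
X = [[1.0, 0.0], [0.0, 1.0], [0.0, -1.0], [-1.0, 0.0]]
y = [+1, -1, -1, +1]

n = len(X)
# Q[i][j] = y_i * y_j * <x_i, x_j>: the quadratic term of the dual.
Q = [[y[i] * y[j] * sum(xi * xj for xi, xj in zip(X[i], X[j]))
      for j in range(n)] for i in range(n)]
q = [-1.0] * n   # linear term (qp *minimizes*, hence the sign flip)
a0 = [0.0] * n   # all-zeros starting point for the alphas

# Q is symmetric by construction; its near-zero eigenvalues are where
# the spuriously tiny alphas come from in qp/quadprog solutions.
assert all(Q[i][j] == Q[j][i] for i in range(n) for j in range(n))
```

In Octave these would be passed as qp(a0, Q, q, ...) together with the equality constraint y'a = 0 and the bound a >= 0.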

Last edited by MLearning; 09-13-2012 at 10:02 AM. Reason: I just checked that qp and libsvm (command line) give the same number of support vectors.
#4
09-13-2012, 10:39 AM
 Anton Khorev Junior Member Join Date: Sep 2012 Posts: 3
Re: Question 13

In this problem the vectors are placed symmetrically. In the qp solution, one of them touches the margin with alpha == 0.
#5
09-13-2012, 11:41 AM
 MLearning Senior Member Join Date: Jul 2012 Posts: 56
Re: Question 13

Quote:
 Originally Posted by Anton Khorev In this problem the vectors are placed symmetrically. In the qp solution, one of them touches the margin with alpha == 0.
Symmetric in X space, yes. But are they also symmetric in Z space?
#6
09-16-2012, 02:35 PM
 patrickjtierney Member Join Date: Jul 2012 Location: Toronto, Canada Posts: 33
Re: Question 13

This is the only question I got wrong on the final, and I would have gotten it right if I had used my libsvm version of the answer rather than my hand-built version with qp (all in Octave). My qp (wrong!) answer had one fewer support vector than I got with libsvm, and that might only be because I used 10e-012 as a threshold. (If I had omitted the threshold, I would have gotten the same number of SVs as in libsvm.)

I got w = [-0.88889, 5.0e-016] and b = -1.6667 using qp, but strangely I get
w = [0.88869, 0] and b = 1.6663 using libsvm. They both have Ein = 0, and on a thousand test runs of a million random points in [-3,3]^2 they agree on labels in 99.999% of cases on average. (For libsvm, I use svmpredict with all labels = +1, which is ~71% accurate, to get the actual prediction labels.)

The difference in sign may not be significant. I computed w and b for qp directly by following the class slides, but I used w = model.SVs' * model.sv_coef and b = -model.rho in the libsvm case (which may not be exactly correct).
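The agreement figure quoted above can be reproduced with a small sketch (Python here rather than Octave) using the two posted (w, b) pairs. Since the two solutions differ by an overall sign (libsvm orders the classes internally), labels are compared up to a global flip; the random seed and sample size are assumptions.

```python
import random

# The two hypotheses quoted in the post (qp vs libsvm).
w_qp,  b_qp  = [-0.88889, 5.0e-16], -1.6667
w_svm, b_svm = [0.88869, 0.0], 1.6663

def label(w, b, x):
    # Sign of the linear decision function w . x + b.
    s = w[0] * x[0] + w[1] * x[1] + b
    return 1 if s >= 0 else -1

random.seed(0)
pts = [(random.uniform(-3, 3), random.uniform(-3, 3)) for _ in range(100000)]
same = sum(label(w_qp, b_qp, p) == label(w_svm, b_svm, p) for p in pts)

# Agreement up to a global sign flip of one hypothesis.
agreement = max(same, len(pts) - same) / len(pts)
print(agreement)  # very close to 1.0
```

The two decision boundaries sit a few parts in a million apart, so the measured agreement matches the ~99.999% reported above.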

The values of alpha (for qp) are different from model.sv_coef, and the qp version uses all but the last of the libsvm support vectors.

So I do agree that there may be two correct answers for this question, due to numerical issues and the different ways qp and libsvm handle the calculations, all beyond the control of the student.

If required I can PM the alphas and the code I used to support the claim, or wait and post an **answer** after the deadline.
#7
09-17-2012, 08:48 PM
 Anton Khorev Junior Member Join Date: Sep 2012 Posts: 3
Re: Question 13

Quote:
 Originally Posted by yaser Interesting. Is the hypothesis identical?
The predictions of qp and libsvm are identical (for 10,000 uniformly distributed samples with x1, x2 in [-5,5]).





The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.