
LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   The Final (http://book.caltech.edu/bookforum/forumdisplay.php?f=138)
-   -   Question 12 (http://book.caltech.edu/bookforum/showthread.php?t=1505)

patrickjtierney 09-16-2012 09:09 PM

Re: Question 13
 
Using x = [1 0;0 1;0 -1.00002; -1 0; 0 2.0001; 0 -2; -2 0], I still get one fewer support vector from qp than from libsvm (i.e., the same values I get without perturbing). This remains the case when perturbing only one support vector. The most notable change is that the second weight entry grows, although the first entry and b also change.

Also, thanks to fgpancorbo for the code for getting w & b from libsvm. Useful for the future.
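
fgpancorbo's code isn't quoted in this thread; the sketch below shows one way to recover w and b from a libsvm model for the Q13 kernel K(x,x') = (1 + x'*x)^2, using its explicit feature transform. The labels y and the large-C (hard-margin) setup are assumptions, since they are not quoted here.

Code:

% Train libsvm on the (perturbed) data with the polynomial kernel (1 + x'*x)^2.
x = [1 0;0 1;0 -1.00002; -1 0; 0 2.0001; 0 -2; -2 0];
y = [-1; -1; -1; 1; 1; 1; 1];            % assumed from the problem statement
model = svmtrain(y, x, '-s 0 -t 1 -d 2 -g 1 -r 1 -c 1e6');  % large C ~ hard margin

% Explicit transform of the kernel; the constant coordinate is left to b.
phi = @(X) [X(:,1).^2, X(:,2).^2, sqrt(2)*X(:,1), sqrt(2)*X(:,2), ...
            sqrt(2)*X(:,1).*X(:,2)];

w = phi(full(model.SVs))' * model.sv_coef;  % sv_coef already holds alpha_n*y_n
b = -model.rho;
if model.Label(1) == -1                     % libsvm treats the first label seen as +1
    w = -w;  b = -b;
end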

MLearning 09-17-2012 07:47 AM

Re: Question 13
 
Quote:

Originally Posted by patrickjtierney (Post 5374)
Using x = [1 0;0 1;0 -1.00002; -1 0; 0 2.0001; 0 -2; -2 0], I still get one fewer support vector from qp than from libsvm (i.e., the same values I get without perturbing). This remains the case when perturbing only one support vector. The most notable change is that the second weight entry grows, although the first entry and b also change.

Also, thanks to fgpancorbo for the code for getting w & b from libsvm. Useful for the future.

@patrickjtierney,

In z space, X6 and X7 map to the same point: X6 = (0, -2) and X7 = (-2, 0) both map to (3, 5). I wonder whether this has any effect on the computation.
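
As a quick arithmetic check of the observation above (assuming the Q12 transform z1 = x2^2 - 2*x1 - 1, z2 = x1^2 - 2*x2 + 1 from the problem statement):

Code:

X67 = [0 -2; -2 0];                          % x6 and x7
Z67 = [X67(:,2).^2 - 2*X67(:,1) - 1, ...
       X67(:,1).^2 - 2*X67(:,2) + 1]         % both rows come out as [3 5]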

patrickjtierney 09-17-2012 10:26 AM

Re: Question 13
 
Quote:

Originally Posted by MLearning (Post 5415)
@patrickjtierney,

In z space, X6 and X7 map to the same point: X6 = (0, -2) and X7 = (-2, 0) both map to (3, 5). I wonder whether this has any effect on the computation.

I noticed that in Q12, but I believe that the z space in Q13 is defined by the polynomial kernel, not by the mapping from Q12. This can be seen on slide 10 of lecture 15, where g(x) is specified.

MCN12 09-17-2012 03:01 PM

Re: Question 13
 
Using the MATLAB interface to libsvm, I perturbed [0, -1] to [0, -0.94], and it reduced the number of support vectors by one. The w and b agree with what others have seen for libsvm.
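
For reference, a minimal sketch of this experiment with the MATLAB interface to libsvm (labels y assumed from the problem statement, large C to approximate the hard margin, kernel (1 + x'*x)^2 as in Q13):

Code:

x = [1 0; 0 1; 0 -1; -1 0; 0 2; 0 -2; -2 0];
y = [-1; -1; -1; 1; 1; 1; 1];                % assumed from the problem statement
x(3,:) = [0 -0.94];                          % the perturbation
model = svmtrain(y, x, '-s 0 -t 1 -d 2 -g 1 -r 1 -c 1e6');
fprintf('number of support vectors: %d\n', model.totalSV);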

fgpancorbo 09-17-2012 06:32 PM

Re: Question 13
 
I used libsvm and got this question right.

Anton Khorev 09-17-2012 09:48 PM

Re: Question 13
 
Quote:

Originally Posted by yaser (Post 5217)
Interesting. Is the hypothesis g identical?

The predictions of qp and libsvm are identical (for 10,000 samples with x1 and x2 drawn uniformly from [-5, 5]).
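
A sketch of how such an agreement check can be coded, assuming the QP solution (alpha, b_qp) over the training set (x, y) and a trained libsvm model are already in the workspace (those variable names are placeholders):

Code:

K  = @(U, V) (1 + U*V').^2;                    % Q13 polynomial kernel
Xt = -5 + 10*rand(10000, 2);                   % uniform samples on [-5,5] x [-5,5]
g_qp  = sign(K(Xt, x) * (alpha .* y) + b_qp);  % QP hypothesis
g_svm = svmpredict(zeros(10000,1), Xt, model); % libsvm hypothesis (dummy labels)
fprintf('disagreements: %d out of 10000\n', sum(g_qp ~= g_svm));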

JohnH 09-17-2012 09:58 PM

Re: Question 13
 
As a quick (read: minimal-effort) check of model equivalence, I compared the predicted results of 1,000,000 randomly selected points within [-3,3] x [-3,3] using the support vectors from both Octave/QP and Python/libsvm. Only three points were classified differently despite the difference in the number of support vectors returned by the two approaches. I'm certain that an analytical comparison of the support vectors would prove their equivalence; however, it hardly seems necessary given the empirical results.

marco.lehmann 06-01-2013 07:32 AM

Re: Question 13
 
I had quite some trouble with this problem, and it forced me to play around with different approaches (qp, libsvm). One more approach to consider comes from Lecture 15, slide 5:
for the kernel used in Question 13, there is a corresponding transformation, given explicitly on that slide. So why not give it a try?
I got some confidence in the result after reading the slide's title. :)
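
A minimal sketch of that route, using the explicit transform of (1 + x'*x)^2 and MATLAB's quadprog for the hard-margin primal (the labels y are again taken from the problem statement; the constant coordinate of the transform is dropped because the bias b plays that role):

Code:

x = [1 0; 0 1; 0 -1; -1 0; 0 2; 0 -2; -2 0];
y = [-1; -1; -1; 1; 1; 1; 1];                % assumed from the problem statement
Z = [x(:,1).^2, x(:,2).^2, sqrt(2)*x(:,1), sqrt(2)*x(:,2), ...
     sqrt(2)*x(:,1).*x(:,2)];
N = size(Z, 1);  d = size(Z, 2);

H = blkdiag(0, eye(d));                      % variables u = [b; w], penalize only w
f = zeros(d + 1, 1);
A = -diag(y) * [ones(N,1), Z];               % encodes y_n*(w'*z_n + b) >= 1
c = -ones(N, 1);
u = quadprog(H, f, A, c);
b = u(1);  w = u(2:end);

margins = y .* (Z*w + b);
fprintf('support vectors: %d\n', sum(abs(margins - 1) < 1e-4));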

