LFD Book Forum on the right track?

#11
03-03-2013, 10:38 AM
 Suhas Patil Senior Member Join Date: Dec 2012 Posts: 57
Re: on the right track?

I verified the 0-versus-7 case and I am getting exactly the same number of support vectors (and also the same Ein and Eout).
I haven't explored using the SV coefficients for calculating the error. I am using this API from 'svm.h':
double svm_predict(const struct svm_model *model, const struct svm_node *x);
It returns the predicted class value. I call this method in a loop over all the test points and compare against the ground truth (the y array from svm_problem) to compute Ein or Eout.
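The comparison loop described above amounts to computing the fraction of mismatched labels. A minimal Python sketch (the helper name is hypothetical; the actual predictions would come from svm_predict):

```python
def error_fraction(predicted, truth):
    """Fraction of points where the predicted class differs from the ground truth.

    `predicted` would be the outputs of svm_predict over the test (or training)
    points; `truth` is the y array from svm_problem.
    """
    assert len(predicted) == len(truth)
    mismatches = sum(1 for p, y in zip(predicted, truth) if p != y)
    return mismatches / len(truth)

# Example: 1 disagreement out of 4 points -> error 0.25
print(error_fraction([1, -1, 1, 1], [1, -1, -1, 1]))  # 0.25
```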
#12
03-04-2013, 10:25 AM
 Anne Paulson Senior Member Join Date: Jan 2013 Location: Silicon Valley Posts: 52
Re: on the right track?

Thanks, thanks, thanks!

Like Ivan Keller, I at first wasn't setting the gamma and coef0 parameters.

I know what gamma is for the radial kernel, but what does it mean for the polynomial kernel? And what is coef0? The bias? If so, why would the default be 0? Wouldn't you usually want an intercept?
#13
03-04-2013, 10:33 AM
 ilya239 Senior Member Join Date: Jul 2012 Posts: 58
Re: on the right track?

Quote:
 Originally Posted by Anne Paulson I know what gamma is for the radial kernel, but what does it mean for the polynomial kernel? And what is coef0? The bias? If so, why would the default be 0? Wouldn't you usually want an intercept?
see http://scikit-learn.org/stable/modul...rnel-functions

also, from the python docs:
|
| kernel : string, optional (default='rbf')
| Specifies the kernel type to be used in the algorithm.
| It must be one of 'linear', 'poly', 'rbf', 'sigmoid', 'precomputed' or
| a callable.
| If none is given, 'rbf' will be used. If a callable is given it is
| used to precompute the kernel matrix.
|
| degree : int, optional (default=3)
| Degree of kernel function.
| It is significant only in 'poly' and 'sigmoid'.
|
| gamma : float, optional (default=0.0)
| Kernel coefficient for 'rbf' and 'poly'.
| If gamma is 0.0 then 1/n_features will be used instead.
|
| coef0 : float, optional (default=0.0)
| Independent term in kernel function.
| It is only significant in 'poly' and 'sigmoid'.
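To make the quoted parameters concrete: for the polynomial kernel, libsvm and scikit-learn compute K(x, x') = (gamma * <x, x'> + coef0)^degree, so coef0 is the additive constant inside the power, not the bias b of the hypothesis. With coef0 = 0 the kernel contains only monomials of exactly the given degree; the homework's kernel (1 + x^T x')^Q corresponds to gamma = 1, coef0 = 1. A small sketch:

```python
def poly_kernel(x, xp, gamma=1.0, coef0=1.0, degree=2):
    """Polynomial kernel as used by libsvm: (gamma * <x, x'> + coef0) ** degree."""
    dot = sum(a * b for a, b in zip(x, xp))
    return (gamma * dot + coef0) ** degree

# (1 + <x, x'>)^2 with x = x' = (1, 0): (1 + 1)^2 = 4
print(poly_kernel([1, 0], [1, 0]))  # 4.0

# With coef0 = 0 the same inputs give just <x, x'>^2 = 1
print(poly_kernel([1, 0], [1, 0], coef0=0.0))  # 1.0
```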
#14
03-04-2013, 11:08 AM
 kartikeya_t@yahoo.com Member Join Date: Jul 2012 Posts: 17
Re: on the right track?

Quote:
 Originally Posted by butterscotch Seems good to me. Are you getting the same number of support vectors with Sendai's post? You might want to verify how you calculate the error. The sv_coefficients are not just "alpha", but "y*alpha"
Thanks, butterscotch, for pointing this out about the SV coefficients. I have been trying to compute the errors on the test data without relying on the svm-predict function. Once I run the training with svm-train, I take the resulting model file and extract the support vectors and their coefficients, taking care that the coefficients are "y*alpha".
If my understanding is correct, the support vectors are some of the points from the input data set (in particular, the points that are "supporting" the decision boundary).
So I expect the support vectors reported in the model file to be found in the raw training data, but for some reason I do not see that: none of the support vectors the package calculates are in the raw data.
Am I missing something? How would one go about constructing the final hypothesis from the support vectors and coefficients that are reported in the model file?
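For what it's worth, libsvm's two-class decision function is g(x) = sign( sum_i sv_coef_i * K(sv_i, x) - rho ), where sv_coef_i is already y_i * alpha_i and rho is the negated bias term stored in the model file. A sketch, assuming the support vectors, coefficients, and rho have already been parsed out of the model file:

```python
def decision_value(x, support_vectors, sv_coef, rho, kernel):
    """libsvm-style decision value: sum_i (y_i * alpha_i) * K(sv_i, x) - rho."""
    return sum(c * kernel(sv, x) for sv, c in zip(support_vectors, sv_coef)) - rho

def classify(x, support_vectors, sv_coef, rho, kernel):
    """Final hypothesis g(x) = sign(decision_value)."""
    return 1 if decision_value(x, support_vectors, sv_coef, rho, kernel) > 0 else -1

# Toy check with a linear kernel and two support vectors
linear = lambda u, v: sum(a * b for a, b in zip(u, v))
svs = [[1.0], [-1.0]]
coef = [0.5, -0.5]  # these are y_i * alpha_i, as stored in the model file
print(classify([2.0], svs, coef, 0.0, linear))  # 1
```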
#15
03-04-2013, 11:20 AM
 Anne Paulson Senior Member Join Date: Jan 2013 Location: Silicon Valley Posts: 52
Re: on the right track?

Now I have a different problem (sorry to bug you all, and thanks for your help). I'm getting the right (or at least the same) results as the rest of you, but now I can't get answers to Q5 and Q6: more than one statement comes out true, and the numbers are not increasing/decreasing monotonically.

Suggestions? Hints?
#16
03-04-2013, 11:24 AM
 Anne Paulson Senior Member Join Date: Jan 2013 Location: Silicon Valley Posts: 52
Re: on the right track?

Never mind: "goes down" is to be interpreted as "goes down monotonically".
#17
03-04-2013, 11:46 AM
 alternate Member Join Date: Jan 2013 Posts: 14
Re: on the right track?

As per another thread, when it says it goes up or goes down, it means it goes strictly, not monotonically.
#18
03-04-2013, 11:59 AM
 Anne Paulson Senior Member Join Date: Jan 2013 Location: Silicon Valley Posts: 52
Re: on the right track?

Right. Strictly, that's what I meant to say.
#19
03-05-2013, 07:07 AM
 boulis Member Join Date: Feb 2013 Location: Sydney, Australia Posts: 29
Re: on the right track?

Quote:
 Originally Posted by Sendai I thought it would be nice to have a way to check if we're on the right track with problems 2-5 without giving away the answers. I ran SVM (with the polynomial kernel) for a couple of cases and pasted the results below. Are others getting the same numbers? 0 vs 7 classifier, C=0.01, Q=2: number of support vectors = 861, Ein = 0.071778, Eout = 0.063241. 2 vs 8 classifier, C=0.1, Q=3: number of support vectors = 721, Ein = 0.234878, Eout = 0.291209
Very good idea. I am using LIBSVM with Python. I got the exact same results, the only slight difference being that the number of SVs for the second case was 722.

Code:
0 vs 7, Q=2, C=0.01 => Ein: 0.0717781402936 SV#: 861 Eout: 0.0632411067194
2 vs 8, Q=3, C=0.1 => Ein: 0.234878240377 SV#: 722 Eout: 0.291208791209
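For reference, these runs can be reproduced with the LIBSVM Python bindings using the standard option string (-t 1 polynomial kernel, -d degree, -g gamma, -r coef0, -c cost; gamma = coef0 = 1 gives the homework kernel (1 + x^T x')^Q). A commented sketch, since it needs libsvm and the homework data installed; the filename is hypothetical:

```python
# Sketch only: requires libsvm's python bindings and the homework data files.
# from svmutil import svm_read_problem, svm_train, svm_predict
#
# y, x = svm_read_problem('features.train.0vs7')  # hypothetical filename
# model = svm_train(y, x, '-t 1 -d 2 -g 1 -r 1 -c 0.01')
# p_labels, p_acc, p_vals = svm_predict(y, x, model)  # accuracy -> Ein
```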
#20
05-24-2013, 05:25 AM
 alasdairj Member Join Date: Mar 2013 Posts: 12
Re: on the right track?

Quote:
 Originally Posted by Suhas Patil I found the issue...thanks for the reply from butterscotch. The problem was with the way I was initializing the 'svm_node' structure after reading the training data.
I too am getting an Ein of 0.35 for the 0-versus-7 classification. What is this "svm_node" structure Suhas mentions?
