#11
08-24-2012, 05:19 AM
 jakvas Member Join Date: Jul 2012 Posts: 17

As for Matlab, this worked for me:

UB(1:N,1) = 10^5;   % upper bound on the alphas (a large stand-in for infinity)
options = optimset('LargeScale','off','MaxIter',2000);

Oh, and for clarification: Class is the +1/-1 classification of the points, and the empty matrices are unused constraints. I hope this helps a little and isn't frowned upon from the collaboration point of view; after all, you know what you are doing ;P
#12
08-24-2012, 06:08 AM
 apinde Member Join Date: Jul 2012 Posts: 12

Can one use Solver in Excel for this problem? I haven't used the other languages and platforms mentioned for quadratic programming, and I'm desperately searching for a package that can be learned and used in a few days.
#13
08-24-2012, 07:10 AM
 patrickjtierney Member Join Date: Jul 2012 Location: Toronto, Canada Posts: 33

This was discussed extensively in the previous class. A user elkka found a solution to the non-terminating problem for Octave. The trick is to run qp on an altered version of H (using all the same other parameters) and then to use the output as the initial value alpha0 for qp on the original H. The second run seems to always terminate after one iteration.

The altered H just adds a small amount to the diagonal of H (say 10^-15). This makes its determinant non-zero, which is helpful for qp. So HH = H + eye(n)*10^-15 should work. [n is just the length of Y] Also, the alpha0 I refer to goes first in the argument list for the second qp call.
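To see why the diagonal tweak helps, here is a small NumPy check (not from the original posts; n, d, and the epsilon are illustrative, and I use a larger epsilon than 10^-15 so the rank test is unambiguous). Since H = diag(Y)·(XXᵀ)·diag(Y), its rank is at most d, so for n > d points it is singular; adding ε·I makes it positive definite:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 10, 2                          # 10 training points in 2 dimensions
X = rng.standard_normal((n, d))
Y = np.sign(rng.standard_normal(n))   # +1/-1 labels

# Dual-SVM quadratic term: H[i,j] = y_i * y_j * <x_i, x_j>
H = np.outer(Y, Y) * (X @ X.T)

# H = diag(Y) @ (X @ X.T) @ diag(Y), so rank(H) = rank(X @ X.T) <= d.
# With n > d the matrix is singular, which is what trips up qp().
print(np.linalg.matrix_rank(H))                      # at most d = 2

# Adding a small multiple of the identity makes every eigenvalue positive.
eps = 1e-8
print(np.linalg.matrix_rank(H + eps * np.eye(n)))    # full rank: n = 10
```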

The results are much better, both in terms of termination and in producing noticeably higher prediction accuracy (i.e. a 10+% rise in runs that beat the PLA results).

#14
08-24-2012, 11:24 AM
 invis Senior Member Join Date: Jul 2012 Posts: 50

Wow!
I tried it for n=10 over 400 runs and had ZERO cases where qp couldn't solve!
Next I tried the same runs without the altered version of H and got 164 failures to solve...

Thanks very much!
#15
08-24-2012, 12:01 PM
 patrickjtierney Member Join Date: Jul 2012 Location: Toronto, Canada Posts: 33

I know. It's like magic, isn't it? And it appears to be a legitimate workaround for a problem with qp() on our problem.

One warning: I don't know if the answers we get are acceptable as far as the assignment goes (i.e. too good) or if we are simply fixing a problem in Octave. YMMV.

I noticed that on both N=10 and N=100 the average Eout for SVM drops by about 20% after using this technique.
#16
08-24-2012, 12:05 PM
 invis Senior Member Join Date: Jul 2012 Posts: 50

Quote:
 Originally Posted by patrickjtierney It's like magic, isn't it?
All of ML is like magic too, but it's just mathematics.
#17
08-24-2012, 01:18 PM
 jakvas Member Join Date: Jul 2012 Posts: 17

I'm a Matlab user, and adding the diagonal terms results in no significant change in the results, so it's probably something with the implementation of qp in Octave.

@patrickjtierney
A change in Eout this big should be noticeable when plotting the results (i.e. if you plot the target function, the PLA result, and the SVM result together with the training data set, you should see whether something is wrong).

@All
Watch lecture 15, the part from 1:07:35 to 1:08:00, just to make us feel better.
#18
08-24-2012, 01:52 PM
 invis Senior Member Join Date: Jul 2012 Posts: 50

Hmm, my SVs look pretty strange.
Code:
```
options = optimset("MaxIter", 400);
H = (Y*Y') .* (X*X');    % quadratic term of the dual problem
A = Y';                  % equality constraint: sum(alpha .* Y) = 0
q = -1*ones(n,1);        % linear term
b = 0;
lb = zeros(n,1);         % alpha >= 0
ub = 10^10*ones(n,1);    % effectively unbounded above
% First pass on the regularized H, then warm-start on the original H.
alpha0 = qp([], H + (eye(n)*10^-15), q, A, b, lb, ub, options);
alpha  = qp(alpha0, H, q, A, b, lb, ub, options);
```
After this I have an alpha vector of size n. Next, check which alphas are bigger than a threshold (mine is 10^-5). If alpha(2) is bigger than the threshold, this means that X(2,:) (X is my n×2 matrix of points) is one of the support vectors.
OK, look at these vectors (the small black line is my ):

Some of them are pretty good; I show here only the strange ones. Any ideas about the behavior of these support vectors?

Or is it part of the homework? Because I checked with n=100 and the vectors look better, but sometimes some of them are still strange.
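The thresholding step described above can be sketched in NumPy terms (the alpha values, point coordinates, and the 10^-5 cutoff below are made up for illustration; in practice alpha comes from the qp run):

```python
import numpy as np

# Hypothetical solver output for n = 6 points: most alphas come back as
# numerical noise, and only a few are genuinely non-zero.
alpha = np.array([3e-9, 0.42, 1e-12, 0.17, 0.25, 7e-10])
X = np.array([[ 0.1,  0.2], [ 0.9, -0.3], [ 0.5,  0.5],
              [-0.4,  0.8], [ 0.7,  0.1], [-0.2, -0.6]])   # n x 2 points

threshold = 1e-5
sv_idx = np.flatnonzero(alpha > threshold)   # indices of support vectors
support_vectors = X[sv_idx]

print(sv_idx)        # [1 3 4]
print(len(sv_idx))   # 3
```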
#19
08-24-2012, 05:51 PM
 zifmia Junior Member Join Date: Jul 2012 Posts: 4

I don't see anything odd with your support vectors. The solution may not be a great fit to your target f, but SVM doesn't see your f; it only sees 10 samples of f, and in all the cases you show, it looks like the support vectors could do a pretty good job of separating the given training points.

Try drawing the actual separating line found by SVM (from the w vector and b), then draw the parallels to this line through your support vectors to see the margin zone. I think you will find that SVM does a pretty good job of separating your points.
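As a sketch of that computation, here is a NumPy toy case with a known closed-form answer (the two points and the alphas are fabricated; in practice alpha comes from the qp run). The weights are w = Σᵢ αᵢyᵢxᵢ, b comes from y_s(w·x_s + b) = 1 for any support vector s, and the margin boundaries to draw are w·x + b = ±1:

```python
import numpy as np

# Toy separable set: x1 = (0,0) with y = -1, x2 = (2,0) with y = +1.
# The max-margin separator is the vertical line x = 1, so the exact
# answer is w = (1, 0), b = -1, with dual solution alpha = (0.5, 0.5).
X = np.array([[0.0, 0.0], [2.0, 0.0]])
Y = np.array([-1.0, 1.0])
alpha = np.array([0.5, 0.5])   # pretend this came out of the QP solver

# Weight vector from the dual solution: w = sum_i alpha_i * y_i * x_i
w = (alpha * Y) @ X

# Solve for b using any support vector s: y_s * (w . x_s + b) = 1
s = 1                          # index of a support vector
b = Y[s] - w @ X[s]

print(w)   # [1. 0.]
print(b)   # -1.0
```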

Last edited by zifmia; 08-24-2012 at 05:54 PM. Reason: correction
#20
08-24-2012, 10:59 PM
 jakvas Member Join Date: Jul 2012 Posts: 17

@invis

The support vectors look fine to me. Could you draw the final hypothesis together with the target function and the SVs? Then you should see more or less whether the SVs support the final hypothesis as they should.
