Thread: Support Vector Machines
#1
05-17-2013, 12:25 PM
Michael Reach (Senior Member) | Join Date: Apr 2013 | Location: Baltimore, Maryland, USA | Posts: 71
Support Vector Machines

I wonder if I'm missing something: I found this lecture by far the hardest to understand so far, for the following reasons:
1) I didn't see how it fits in. Does it have something to do with the cross-validation lecture that preceded it? There wasn't much discussion of how good this method is, or of what its E_out is (there's a tiny bit at the very end), or any of the discussion I'm used to from the rest of the course. As a result,
2) I didn't really know what the point of SVM is. There was a somewhat hand-waving argument that it's better to have fat margins, and then a mention that SVM allows very high-dimensional models without paying the usual penalty, because you still hope to end up with just a few support vectors. I got the impression from the argument that the answer will be more robust, so I guess it generalizes better.
3) I hadn't heard of quadratic programming before, and again it was taken for granted: after you do all this, just pass it to a quadratic programming package... I can look it up on Wikipedia, but it's still disconcerting.
4) The starting point of the lecture was PLA with linearly separable points. But I'm assuming that linear separability isn't actually required, because how useful could that be in general? Again, though, I wasn't clear on it; the simple picture of the support vectors being the points that the margin "bumps into" doesn't work (as near as I can see) when we have a non-separable set of points. What determines how many support vectors you get?
5) This whole lecture isn't in the book at all, which also contributes to my difficulty, since I usually use the book as a second input.
6) Hypothesis: We have moved into the part of the course where we are looking at particular methods, and I missed the transition. Everyone but me knows all about SVM and how great it is, so that was taken for granted in the lecture.
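To make point 3 concrete for myself, I tried writing out what "pass it to a quadratic programming package" might look like. This is only a sketch: the toy data is made up, and I'm using SciPy's general-purpose SLSQP solver rather than a dedicated QP package, applied to the hard-margin dual from the lecture:

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data (made up for illustration)
X = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 2.0], [2.0, 3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
n = len(y)

# Hard-margin dual: minimize (1/2) a^T Q a - sum(a)
# subject to a_i >= 0 and sum_i a_i y_i = 0,
# where Q_ij = y_i y_j (x_i . x_j)
Yx = y[:, None] * X
Q = Yx @ Yx.T

def objective(a):
    return 0.5 * a @ Q @ a - a.sum()

res = minimize(objective, np.zeros(n), method="SLSQP",
               bounds=[(0, None)] * n,
               constraints={"type": "eq", "fun": lambda a: a @ y})
alpha = res.x

# Recover the hyperplane: w = sum_i alpha_i y_i x_i,
# and b from the support vectors (points with alpha_i > 0)
w = (alpha * y) @ X
sv = alpha > 1e-6
b = np.mean(y[sv] - X[sv] @ w)

print("support vector indices:", np.flatnonzero(sv))
print("w =", w, "b =", b)
```

If I understood the lecture right, the alphas that come out nonzero pick out the support vectors, and w ends up being a combination of only those points.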
Not complaining, but I would like to get my bearings back.
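Edit: trying to answer part of my own question 4, at least for the separable case. A small experiment with scikit-learn (toy data of my own; the very large C is meant to approximate the hard margin from the lecture) seems to confirm that only the points the margin "bumps into" come out as support vectors:

```python
import numpy as np
from sklearn.svm import SVC

# Toy separable data, made up: three points per class
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])
y = np.array([-1, -1, -1, 1, 1, 1])

# Very large C approximates the hard margin: only points the
# margin touches should end up with nonzero alpha
clf = SVC(kernel="linear", C=1e6).fit(X, y)
print("number of support vectors:", len(clf.support_vectors_))
print("support vector indices:", clf.support_)
```

My tentative understanding for the non-separable case is that the soft-margin version (smaller C) lets points violate the margin, and those violators also count as support vectors, which would explain why the number can grow; corrections welcome.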