LFD Book Forum  

Old 05-17-2013, 01:25 PM
Michael Reach
Senior Member

Join Date: Apr 2013
Location: Baltimore, Maryland, USA
Posts: 71

Support Vector Machines

I wonder if I'm missing something - I found this lecture by far the hardest to understand so far, for the following reasons:
1) I didn't see how it fit in. Does it have something to do with the cross-validation lecture that preceded it? There wasn't much discussion of how good this method is, or what its Eout is (there's a tiny bit at the very end), or the kind of analysis I'm used to from the rest of the course. As a result,
2) I didn't really know what the point of SVM is. There was a somewhat hand-waving argument that fat margins are better, and then a mention that SVM allows very high-dimensional models without paying the usual penalty, because you still hope to end up with just a few support vectors. I got the impression from the argument that the answer will be more robust, so I guess that it generalizes better. (I've tried to write out the formulation as I understood it, below this list.)
3) I hadn't heard of quadratic programming before, and again it was taken for granted: after you do all this, just pass it to a quadratic programming package... I can look it up on Wikipedia, but it's again disconcerting. (I've put a sketch below of what I think that step looks like.)
4) The starting point of the lecture was PLA with linearly separable points. But I'm assuming that separability isn't actually required, because how useful would that be in general? Again, though, I wasn't clear on it; the simple picture of the support vectors being the points that the margin "bumps into" doesn't work (as near as I can tell) when the points aren't separable. What determines how many support vectors you get? (See my soft-margin guess below.)
5) This whole lecture isn't in the book at all, which also contributes to my difficulty, since I usually use the book as a second input.
6) Hypothesis: We have moved into the part of the course where we are looking at particular methods, and I missed the transition. Everyone but me knows all about SVM and how great it is, so that was taken for granted in the lecture.
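
In case it helps anyone orient me, here's the formulation as I understood it from the lecture - please correct me if I've garbled it. The problem was to maximize the margin, which turned into minimizing the length of the weight vector subject to every point being classified with room to spare:

\min_{w,\,b} \; \tfrac{1}{2} w^\mathsf{T} w \quad \text{subject to} \quad y_n\,(w^\mathsf{T} x_n + b) \ge 1, \qquad n = 1, \dots, N.

And the "tiny bit at the very end" about Eout was, I think, the bound

\mathbb{E}[E_\text{out}] \;\le\; \frac{\mathbb{E}[\text{number of support vectors}]}{N - 1},

which I suppose is why ending up with few support vectors means good generalization even in a high-dimensional space.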
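And here is a sketch of what I think "pass it to a quadratic programming package" means in practice. The lecture didn't name a package, so the cvxopt library below (and all the variable names) are just my guess at one way to set it up, not anything from the course:

[code]
import numpy as np
from cvxopt import matrix, solvers

def hard_margin_svm(X, y):
    """Solve the hard-margin SVM dual with a generic QP solver.

    The dual problem (as I understood it from the lecture):
        minimize   (1/2) a^T Q a - 1^T a
        subject to y^T a = 0  and  a >= 0,
    where Q[n, m] = y_n * y_m * (x_n . x_m).
    X: (N, d) array of points; y: (N,) array of +/-1 labels.
    """
    N = X.shape[0]
    Q = np.outer(y, y) * (X @ X.T)
    Q = Q + 1e-10 * np.eye(N)        # tiny ridge so the solver doesn't choke on a singular Q
    P = matrix(Q)
    q = matrix(-np.ones(N))
    G = matrix(-np.eye(N))           # encodes -a <= 0, i.e. a >= 0
    h = matrix(np.zeros(N))
    A = matrix(y.reshape(1, -1).astype(float))   # encodes y^T a = 0
    b = matrix(0.0)
    sol = solvers.qp(P, q, G, h, A, b)
    alpha = np.array(sol['x']).ravel()

    sv = alpha > 1e-6                # support vectors: nonzero multipliers (up to tolerance)
    w = ((alpha[sv] * y[sv])[:, None] * X[sv]).sum(axis=0)
    b0 = y[sv][0] - w @ X[sv][0]     # recover b from any support vector
    return w, b0, sv
[/code]

If that's roughly right, then the support vectors are exactly the points whose multiplier comes back nonzero, and every other point might as well not be there.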
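Also, on the non-separable question: from reading around (not from the lecture), I gather the standard fix is a "soft margin", where points may violate the margin but you pay for each violation:

\min_{w,\,b,\,\xi} \; \tfrac{1}{2} w^\mathsf{T} w + C \sum_{n=1}^{N} \xi_n \quad \text{subject to} \quad y_n\,(w^\mathsf{T} x_n + b) \ge 1 - \xi_n, \quad \xi_n \ge 0.

If that's right, the support vectors are then all the points with nonzero multipliers, which includes the margin violators as well as the points the margin bumps into, so the count depends on the constant C and on how tangled the two classes are. Is that the right picture?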
Not complaining, but I would like to get my bearings back.
Comments?