LFD Book Forum Probability estimate from soft margin SVMs

#1
03-28-2013, 07:57 AM
 sbgaucho Junior Member Join Date: Mar 2013 Posts: 1
Probability estimate from soft margin SVMs

Apologies if this has already been covered either in the forums or in a lecture, but I don't recall it in any lecture and couldn't find anything in the forum.

It seems to me that it would be nice, after training an SVM, to get a probability estimate that a given x (particularly an out-of-sample x) corresponds to y=1. For noisy data that is not linearly separable, it seems ideal to combine the probabilistic output of logistic regression with the power of the SVM. I googled this and found a couple of presentations/references, but there doesn't seem to be a clear-cut answer. Am I way off base? If not, what is the simplest/easiest direction to go in terms of learning about and implementing such a thing? Is it easiest just to use something like libsvm or weka?

Thanks
#2
03-29-2013, 11:41 PM
 htlin NTU Join Date: Aug 2009 Location: Taipei, Taiwan Posts: 601
Re: Probability estimate from soft margin SVMs

Quote:
 Originally Posted by sbgaucho Apologies if this has already been covered either in the forums or in a lecture, but I don't recall it in any lecture and couldn't find anything in the forum. It seems to me that it would be nice after using SVM to get a probability estimate that a given x (particularly for out of sample x's) corresponds to y=1. For noisy but non linearly separable data it seems like it would be ideal to combine the probabilistic output of a logistic regression with the power of SVM. I googled this and found a couple presentations/references, but it doesn't seem like there is a clear-cut answer. Am I way off base? If not what is the simplest/easiest direction to go in terms of learning about and implementing such a thing? Is it easiest just to use something like libsvm or weka? Thanks
SVM with probabilistic outputs is useful for some applications. The most popular technique was proposed by Platt. It essentially runs a variant of logistic regression to post-process the outputs of the SVM. An earlier work of mine improves Platt's algorithm from an optimization perspective:

http://www.csie.ntu.edu.tw/~htlin/pa.../plattprob.pdf

Hope this helps.
__________________
When one teaches, two learn.
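[Editor's note: Platt's post-processing fits a two-parameter sigmoid P(y=1|f) = 1/(1+exp(A·f+B)) to the SVM decision values f. Below is a minimal sketch using plain batch gradient descent on the cross-entropy; Platt's paper (and the improved algorithm linked above) uses a more careful Newton-style solver with regularized targets, so treat this as illustrative only. Function names are made up for the example.]

```python
import numpy as np

def fit_platt(f, y, lr=0.01, n_iter=5000):
    """Fit P(y=1|f) = 1 / (1 + exp(A*f + B)) by batch gradient descent
    on the cross-entropy.  f: SVM decision values, y: labels in {-1, +1}.
    Returns the fitted sigmoid parameters (A, B); A comes out negative
    for a classifier whose decision values are positively correlated
    with the +1 class."""
    t = (y + 1) / 2.0                          # map labels {-1,+1} -> targets {0,1}
    A, B = 0.0, 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(A * f + B))    # current estimate of P(y=1 | f_i)
        # gradient of the cross-entropy w.r.t. A and B:
        #   dE/dA = sum((t - p) * f),  dE/dB = sum(t - p)
        A -= lr * np.sum((t - p) * f)
        B -= lr * np.sum(t - p)
    return A, B

def platt_proba(f, A, B):
    """Probability estimate P(y=1|f) under the fitted sigmoid."""
    return 1.0 / (1.0 + np.exp(A * f + B))
```

One would fit (A, B) on decision values from a held-out set (not the SVM's own training set, which would bias the calibration), then apply `platt_proba` to out-of-sample decision values.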
#3
03-30-2013, 12:47 AM
 yaser Caltech Join Date: Aug 2009 Location: Pasadena, California, USA Posts: 1,477
Re: Probability estimate from soft margin SVMs

A related question asked by email

Quote:
 Can you please tell me if the following would be a good idea for post-processing after performing SVM: use the same z-space but instead of maximizing the margin, use logistic regression (in z space) and also allow the width of the logistic function to be a free parameter (let the cross-entropy be the objective function and use gradient descent). The solution from SVM could be used as the initial guess. Would this be a good idea (ie. improve the SVM result)?
and the answer from htlin (my colleague Professor Hsuan-Tien Lin):

Quote:
 Post-processing the outputs of SVM by logistic regression formulation has been explored for getting probabilistic (soft) outputs from SVMs. The formulation comes with two parameters: the width (scaling) of the SVM output as you suggest, and an additional "bias" term. You can check http://www.csie.ntu.edu.tw/~htlin/pa.../plattprob.pdf and the earlier work of John Platt for some additional information. Hope this helps.
__________________
Where everyone thinks alike, no one thinks very much
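[Editor's note: on the original question of whether it is easiest to just use libsvm — libsvm does ship Platt-style probability estimates, and scikit-learn exposes them through its libsvm-backed `SVC` via `probability=True` / `predict_proba`. A hedged sketch on synthetic data, with illustrative parameter choices:]

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated Gaussian clusters as toy data (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# probability=True turns on libsvm's internal Platt-style calibration,
# fitted by cross-validation on the training data.
clf = SVC(kernel="rbf", C=1.0, probability=True)
clf.fit(X, y)

# Columns of predict_proba follow clf.classes_ (here [-1, 1]).
proba = clf.predict_proba([[2.0, 2.0]])
```

Note that the calibrated probabilities come from an internal cross-validation fit, so `predict_proba` may disagree slightly with `predict` near the boundary.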
