12-13-2017, 02:38 PM
htlin
Re: How does regularized logistic regression regularize perceptron model?

Originally Posted by ntvy95
When I use regularized logistic regression to train a perceptron model with threshold \frac{1}{2}, the regularization means I have placed a soft-order constraint on \mathbf{w}. However, for the perceptron model, as suggested in Exercise 4.6, a soft-order constraint on \mathbf{w} doesn't seem to regularize the perceptron at all. Meanwhile, in practice I see that regularized logistic regression does regularize the perceptron model, and Andrew Ng's online course also suggests that regularized logistic regression fights overfitting on classification problems.

What did I miss here?

Thank you very much in advance.
That is because the hard-classification output \text{sign}(\mathbf{w}^T \mathbf{x}) is not affected by the magnitude of \mathbf{w}, while the soft output \theta(\mathbf{w}^T \mathbf{x}) is. Regularization shrinks \|\mathbf{w}\|, which changes the logistic probabilities (and hence the logistic loss being minimized) even though the signs, and therefore the hard classifications, can stay exactly the same. Hope this helps.
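A quick numerical illustration of this point (a minimal sketch; the weight vector and input below are made up for illustration, not taken from the exercise):

```python
import math

def sigmoid(s):
    """Logistic function theta(s) = 1 / (1 + e^{-s})."""
    return 1.0 / (1.0 + math.exp(-s))

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Hypothetical weight vector and input point.
w = [2.0, -1.0]
x = [1.0, 0.5]

# Scale w by different constants c > 0, as regularization
# (shrinking the magnitude of w) effectively does.
for c in (0.1, 1.0, 10.0):
    s = dot([c * wi for wi in w], x)
    hard = 1 if s > 0 else -1          # sign output: unchanged by c
    soft = sigmoid(s)                  # theta output: changes with c
    print(f"c={c:4}: sign={hard:+d}, theta={soft:.4f}")
```

The sign output is +1 for every positive scale c, so the hard classifier is invariant to the magnitude of \mathbf{w}; the sigmoid value moves toward \frac{1}{2} as c shrinks, which is exactly what the regularized logistic loss responds to.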
When one teaches, two learn.