Thread: Exercise 4.6
  #3  
11-09-2016, 06:32 AM
magdon
RPI
 
Join Date: Aug 2009
Location: Troy, NY, USA.
Posts: 595
Re: Exercise 4.6

Yes, the soft-order constraint does not impact classification. It is better to regularize with the hard-order constraint, or to use the soft-order constraint with the "regression for classification" algorithm.
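For concreteness, here is a minimal sketch (Python/numpy, not from the book) of "regression for classification" combined with a soft-order (weight-decay) constraint: fit the +/-1 labels by regularized least squares, then classify with sign. The toy data and the value of lam are illustrative assumptions.

import numpy as np

# Toy example (made-up data): x has a bias coordinate, labels y are +/-1.
rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])
y = np.sign(X @ np.array([0.3, 1.0, -2.0]) + 0.1 * rng.normal(size=100))

lam = 0.5  # illustrative soft-order (weight-decay) parameter lambda

# Regularized least squares on the +/-1 targets:
#   w_reg = (X^T X + lambda*I)^{-1} X^T y
w_reg = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Classify with the sign of the regression output.
y_hat = np.sign(X @ w_reg)
print("in-sample classification error:", np.mean(y_hat != y))

Here the soft-order constraint does real work because the squared regression error, unlike the classification error, is sensitive to the size of the weights.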

Quote:
Originally Posted by ntvy95
Hello, I have this answer for Exercise 4.6, but I'm not sure if it's right.

Because sign(w^{T}x) = sign(\alpha w^{T}x) for any \alpha > 0, very small weights are just as powerful as large weights (all that matters is the numerical accuracy the computer can achieve). That also means a hyperplane can be represented by many hypotheses, and constraining the weights can reduce the number of hypotheses representing the same hyperplane. Hence the soft-order constraint should be able to reduce the var component while likely not compromising the bias component.
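A quick numerical check of this scale invariance (a sketch with toy numbers, just for illustration):

import numpy as np

# sign(w^T x) = sign(alpha * w^T x) for any alpha > 0,
# so shrinking the weights does not change the classifier.
rng = np.random.default_rng(1)
w = rng.normal(size=3)        # an arbitrary weight vector
X = rng.normal(size=(5, 3))   # a few sample points
alpha = 1e-6                  # a tiny positive scale factor

print(np.array_equal(np.sign(X @ w), np.sign(X @ (alpha * w))))  # prints True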

----------------------------------------

Edit: I have just remembered that the growth function has already taken care of the issue of many hypotheses representing the same hyperplane (and this issue does not affect the var component anyway(?)). So in this case the answer should be the hard-order constraint...? I'm really confused right now.
__________________
Have faith in probability