melipone (Senior Member), 02-16-2013, 12:00 PM
Re: Question on regularization for logistic regression

Thanks. Okay, so if I take the derivative of the regularization term \frac{\lambda}{2N}w^Tw, I just add \frac{\lambda}{N}w to the gradient in each weight update of stochastic gradient descent. Is that correct? For concreteness, here is roughly what I have in mind for a single SGD step (just a sketch; the function name, the \pm 1 label convention, and the logistic loss form are my own assumptions):
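[code]
import numpy as np

def sgd_step_l2(w, x, y, lr, lam, N):
    """One stochastic gradient step on a single example (x, y), y in {-1, +1},
    for logistic regression with the penalty (lambda / 2N) * w^T w.
    (Illustrative sketch; names and label convention are assumptions.)"""
    # Gradient of the logistic loss ln(1 + exp(-y * w^T x)) for this one example
    grad_loss = -y * x / (1.0 + np.exp(y * np.dot(w, x)))
    # The L2 penalty contributes (lambda / N) * w, i.e. the derivative of (lambda / 2N) w^T w
    grad_penalty = (lam / N) * w
    return w - lr * (grad_loss + grad_penalty)
[/code]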

I was also looking into L1 and L2 regularization. The above would be L2 regularization. My understanding is that L1 regularization would just add a constant-size penalty term to the gradient, regardless of the magnitude of the weight itself. Is my understanding correct? To make concrete what I mean (again a rough sketch with made-up names, treating the L1 subgradient as \frac{\lambda}{N}\,\mathrm{sign}(w)):
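[code]
import numpy as np

def penalty_gradient(w, lam, N, kind="l2"):
    """Gradient (or subgradient) of the regularization term alone.
    L2: (lambda / N) * w        -- shrinks in proportion to the weight
    L1: (lambda / N) * sign(w)  -- constant magnitude, only the sign depends on w
    (Illustrative helper; the function name and 'kind' flag are my assumptions.)"""
    if kind == "l2":
        return (lam / N) * w
    # Subgradient of (lambda / N) * ||w||_1; at w_i == 0 any value in
    # [-lambda/N, +lambda/N] is valid, and np.sign conveniently gives 0 there.
    return (lam / N) * np.sign(w)
[/code]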

TIA