#8  07-17-2012, 12:19 PM
JohnH
Member
 
Join Date: Jul 2012
Posts: 43
Default Re: Perceptron: should w_0 (bias) be updated?

It is a mistake to talk about w_0 converging on its own; it is the full weight vector w, of which the bias w_0 is one component, that converges. w should not be normalized after each update, because normalization changes the relative scale of the correction applied at each iteration. I suspect this could even prevent convergence in cases where the training set is linearly separable.
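To illustrate the point, here is a minimal sketch of the perceptron learning algorithm with the bias folded into w (by prepending a constant x_0 = 1 to each input), so the whole vector, bias included, is updated together and never normalized. The function name and the labels-in-{-1, +1} convention are my own choices, not from the thread:

```python
import numpy as np

def pla(X, y, max_iters=1000):
    """Perceptron learning algorithm (sketch).

    X: (n, d) inputs; y: (n,) labels in {-1, +1}.
    The bias w_0 is the first component of w; it participates
    in every update along with the rest of the vector.
    """
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # x_0 = 1 carries the bias
    w = np.zeros(Xb.shape[1])
    for _ in range(max_iters):
        wrong = np.where(np.sign(Xb @ w) != y)[0]
        if wrong.size == 0:
            return w  # converged: every point classified correctly
        i = wrong[0]
        w = w + y[i] * Xb[i]  # standard PLA update; w_0 moves with the rest
        # Deliberately NO `w /= np.linalg.norm(w)` here -- rescaling w
        # changes the relative size of subsequent corrections.
    return w
```

On a linearly separable set this loop terminates with every point on the correct side; note that sign(w . x) is unchanged by positive scaling of w, so normalization alters only the update dynamics, not any single classification.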