07-17-2012, 09:41 AM
yaser
Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,478
Default Re: Perceptron: should w_0 (bias) be updated?

Originally Posted by Randy:
The problem is that w0 performs a random walk over the integers without ever converging to a meaningful value, at least if you start it at 0.

Since w1 and w2 determine the orientation of the dividing line between the positive and negative points, and w0 determines its location relative to the origin, it seems to me that this update rule can never find a good solution if the true dividing line does not pass through (x1=0, x2=0).
Hint: The perceptron with weight vector \mathbf{w} is equivalent to the one with weight vector \alpha\mathbf{w} for any \alpha > 0.
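A minimal sketch of the point behind the hint (my own illustration, not from the post): fold the bias into the weight vector via a constant input x_0 = 1 and run PLA on data whose true dividing line misses the origin. Even though w_0 moves in integer-sized steps, the perceptron only depends on the ratios among w_0, w_1, w_2, since \mathbf{w} and \alpha\mathbf{w} define the same hypothesis, so the standard update rule still finds a separator. The target line and margin filter below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: f(x) = sign(x1 + x2 - 1), a line that does NOT pass
# through the origin, so any separating w needs a nonzero bias term.
X = rng.uniform(-1, 1, size=(200, 2))
margin = X[:, 0] + X[:, 1] - 1.0
keep = np.abs(margin) > 0.1        # keep a margin so PLA converges quickly
X, y = X[keep], np.sign(margin[keep])

Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend constant x_0 = 1
w = np.zeros(3)                             # start from w = 0, as Randy did

updates = 0
while True:
    mis = np.flatnonzero(np.sign(Xb @ w) != y)
    if mis.size == 0:
        break                      # all points classified correctly
    i = rng.choice(mis)            # pick a misclassified point
    w += y[i] * Xb[i]              # PLA update; bias w_0 updated like any weight
    updates += 1

print(updates, w)  # w separates the data; only its direction matters
```

The final w is not unique: any positive multiple of it classifies the points identically, which is why the absolute value that w_0 wanders through is not meaningful on its own.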
Where everyone thinks alike, no one thinks very much