#2, 07-12-2012, 03:34 PM
yaser (Caltech) | Join Date: Aug 2009 | Location: Pasadena, California, USA | Posts: 1,477
Re: Perceptron: should w_0 (bias) be updated?

Quote:
Originally Posted by fredrmueller@gmail.com
The zeroth term is just a clever way to simplify the notation by adding the threshold/bias term as another term in the sum. The value of the threshold/bias, however, is not an observed quantity; it was chosen. So I am assuming that when updating the weights, we should NOT update the zeroth weight (the threshold/bias). Is this correct?
In fact it is just like all the other weights, and should be updated in the same way (which happens automatically when you use the PLA update rule and take x to include the zero coordinate x_0 = 1). The intuitive reason is that some thresholds work better than others in separating the data (just as some weights work better than others), so making the threshold part of the learning update results in a better value.
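To make the point concrete, here is a minimal sketch of PLA in NumPy (function and variable names are my own, not from the course materials): each input is augmented with x_0 = 1, so w[0] is the threshold/bias and gets updated by the exact same rule as every other weight.

```python
import numpy as np

def pla_fit(X, y, max_iters=1000):
    """Perceptron learning: w[0] (the bias) is learned like any other weight."""
    # Prepend the zero coordinate x_0 = 1 to every example.
    X_aug = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(X_aug.shape[1])
    for _ in range(max_iters):
        preds = np.sign(X_aug @ w)
        misclassified = np.where(preds != y)[0]
        if misclassified.size == 0:
            break  # all points correctly classified
        i = misclassified[0]
        # PLA update: w <- w + y_i * x_i.
        # Because x_i[0] = 1, this also updates w[0], the threshold.
        w += y[i] * X_aug[i]
    return w
```

Note that no special case is needed for the bias: the update w += y[i] * X_aug[i] moves w[0] by y[i] * 1 on every correction, which is exactly the threshold adjustment the learning needs.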
__________________
Where everyone thinks alike, no one thinks very much