#2, 07-12-2012, 03:34 PM
yaser (Caltech)

Re: Perceptron: should w_0 (bias) be updated?

Quote:
Originally Posted by fredrmueller@gmail.com:
The zeroth term is just a clever way to simplify the notation by adding the threshold/bias term as another term in the sum. The value of the threshold/bias, however, is not an observed quantity; it was chosen. So I am assuming that when updating the weights, we should NOT update the zeroth weight (the threshold/bias). Is this correct?
In fact, w_0 is just like all the other weights, and should be updated in the same way (which happens automatically when you use the PLA update rule and take the input vector x to include the zero coordinate x_0 = 1). The intuitive reason is that some thresholds work better than others at separating the data, just as some weights work better than others, so making the threshold part of the learning update results in a better value.
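
For concreteness, here is a minimal Python sketch of this point (the function name pla and its structure are illustrative, not taken from the course materials): the update w <- w + y*x is applied to the full augmented weight vector, so w_0 moves along with everything else.

Code:
import numpy as np

def pla(X, y, max_iters=1000):
    """Perceptron Learning Algorithm sketch.
    X: (N, d) inputs, y: (N,) labels in {-1, +1}.
    Illustrative only; assumes the data are linearly separable."""
    N, d = X.shape
    X_aug = np.hstack([np.ones((N, 1)), X])  # prepend x_0 = 1 to every point
    w = np.zeros(d + 1)                      # w[0] is the threshold/bias w_0
    for _ in range(max_iters):
        misclassified = np.where(np.sign(X_aug @ w) != y)[0]
        if misclassified.size == 0:
            break                            # all points classified correctly
        i = np.random.choice(misclassified)  # pick any misclassified point
        w += y[i] * X_aug[i]                 # updates ALL weights, w_0 included
    return w

Note that the first component of the update is y[i] * x_0 = y[i] * 1, which is exactly the adjustment to the threshold: no special case is needed for w_0.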
__________________
Where everyone thinks alike, no one thinks very much