Quote:
Originally Posted by carpdiem
Since the perceptron algorithm updates each of the weights with y_n * x_n, then it updates w[0] with
w[0] ← w[0] + y_n * x_n[0]
and y_n is ±1, and we have defined x_n[0] = 1.
Then, it seems that w[0] may only ever attain values that differ from its initial value by an integer. For example, if we initialized w[0] = 0, then w[0] could never attain a value of 1/2.
This seems like an odd limitation to the perceptron algorithm. Am I interpreting it correctly that this limitation exists, and does this limitation have any other side effects?

You are right, but this is not really a limitation, since scaling the weight vector up or down by a positive constant leads to an equivalent perceptron: the prediction sign(w · x) is unchanged. Therefore, an integer value of w[0] is equivalent to a non-integer value in a properly scaled version of the weight vector.
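To make the scaling argument concrete, here is a small sketch (not from the course material; the weight values and data points are made up for illustration). It checks that a weight vector with an integer w[0] and a rescaled copy with a non-integer w[0] classify every point identically, since sign(c * w · x) = sign(w · x) for any c > 0:

```python
def sign(z):
    # perceptron output convention: +1, -1, or 0 on the boundary
    return 1 if z > 0 else -1 if z < 0 else 0

def predict(w, x):
    # perceptron prediction: sign of the dot product w . x
    return sign(sum(wi * xi for wi, xi in zip(w, x)))

w = [3.0, -1.0, 2.0]               # integer-valued bias weight w[0] = 3
w_scaled = [0.5 * wi for wi in w]  # non-integer w[0] = 1.5 after scaling

# each point has x[0] = 1, matching the convention in the question
points = [[1.0, 0.7, -0.2], [1.0, -1.3, 0.4], [1.0, 2.0, 2.0]]

same = all(predict(w, x) == predict(w_scaled, x) for x in points)
print(same)  # True: both weight vectors define the same classifier
```

So although the perceptron updates only ever shift w[0] by integer amounts (given x_n[0] = 1 and integer initialization), the family of classifiers it can represent is not restricted by this.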