#37
04-15-2012, 04:43 PM
zsero
Junior Member

Join Date: Apr 2012
Posts: 6
Re: Perceptron Learning Algorithm

Quote:
Originally Posted by mephistoyourfriend
The scalar product between x and w+x is always greater than the scalar product between x and w (unless x is a zero vector), which is obvious once you express it in terms of the angles between the vectors and their lengths. Similarly, the product <x,w-x> is always less than <x,w>. Since the sign of the scalar product is the only thing you're interested in, changing the sign from -1 to +1 is achieved by making the scalar product more positive (or rather, less negative), which you do by adding x to w. It's not guaranteed that you can accomplish that in one step (if the length of x is smaller than the length of w and the angle between them is 180 degrees, you certainly can't), but given a sufficient number of steps, you will be able to nudge it into submission. That's how I see it at least.
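For what it's worth, the quoted claim can be checked without any trigonometry, since <x, w+x> = <x, w> + <x, x> = <x, w> + ||x||^2. A quick numeric check (the vectors here are made up purely for illustration):

Code:
import numpy as np

# Made-up vectors, only to verify <x, w + x> = <x, w> + ||x||^2
w = np.array([2.0, -1.0, 0.5])
x = np.array([0.5, 1.0, -2.0])

lhs = np.dot(x, w + x)
rhs = np.dot(x, w) + np.dot(x, x)
print(lhs, rhs)          # both 4.25: the product grows by exactly ||x||^2
print(np.dot(x, w - x))  # -6.25: subtracting x shrinks it by the same amount

So adding x to w always pushes <x, w> up by ||x||^2, and subtracting x pushes it down by the same amount; whether that is enough to flip the sign in one step is a separate question.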
There is a point that I think is not clearly explained. It took me a long time to realize that it is not true that exactly one point gets corrected in each step. The true statement is that at most one point gets corrected in each step. The illustration in the lecture, as well as various comments in this forum, would suggest that the vector addition always "corrects" a point. That is not true. The vector addition always "tunes" the weights so that one point gets "better", but that doesn't mean the point will be classified correctly after that step. A small sketch of this is below.
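Here is a minimal sketch of what I mean (the numbers are made up; x has true label +1 but starts out badly misclassified, so the same point has to be picked again and again):

Code:
import numpy as np

# Toy example: each update improves <w, x> by ||x||^2 = 1.25,
# but the sign only flips after several updates.
w = np.array([5.0, 0.0])
x = np.array([-1.0, 0.5])
y = 1.0                          # true label of x

updates = 0
while np.sign(np.dot(w, x)) != y:
    w = w + y * x                # the PLA update for a misclassified point
    updates += 1
    print(f"after update {updates}: <w, x> = {np.dot(w, x):.2f}")
print(f"x needed {updates} updates before it was classified correctly")

Running it, <w, x> goes -5.00, -3.75, -2.50, -1.25, 0.00, 1.25, so this single point takes five updates before its sign finally flips. Each step makes the point "better", but "better" is not the same as "correct".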