04-07-2012, 11:08 AM
yaser
Re: Perceptron Learning Algorithm

Originally Posted by lacamotif:

I have a question about how weighting is assigned and the meaning of the PLA. For point a1, which has assignment 1, does w(a1.y) + w(a1.x) = 1? ('.' denotes subscript)

And then, for point a2, which has assignment -1, would
w(a1.y) + w(a1.x) + w(a1.x) + w(a2.x) = -1, and so on?

To adjust the weighting of w for misclassified points, is w.x2 = w.x1 + x.2 * y.2?

Thank you for the help!
Let me use the book notation to avoid confusion. You have two points {\bf x}_1 and {\bf x}_2 (which you called a1 and a2) and their target outputs (which you called assignment) are f({\bf x}_1)=+1 and f({\bf x}_2)=-1.

Either point, call it just {\bf x} for simplicity, is a vector with d components: {\bf x}=(x_1,x_2,\cdots,x_d). Notice that bold {\bf x} denotes a full data point, while italic x denotes a component of that data point. We add a constant component x_0=1 to each data point {\bf x} to simplify the expression for the perceptron. If the weight vector of the perceptron is {\bf w}=(w_0,w_1,w_2,\cdots,w_d) (where w_0 takes care of the threshold value of the perceptron), then the perceptron implements

{\rm sign}(w_0x_0+w_1x_1+w_2x_2+\cdots+w_dx_d)

where `{\rm sign()}' returns +1 if its argument is positive and returns -1 if its argument is negative.
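As a concrete illustration, here is a minimal NumPy sketch of this expression. The weight values below are hypothetical, chosen only so the example has something to evaluate:

```python
import numpy as np

def perceptron_output(w, x):
    """Implements sign(w_0 x_0 + w_1 x_1 + ... + w_d x_d), returning +1 or -1."""
    return 1 if np.dot(w, x) > 0 else -1

# Data point (3, 4) with the constant component x_0 = 1 prepended
x = np.array([1.0, 3.0, 4.0])

# Hypothetical weight vector (w_0 plays the role of the threshold)
w = np.array([-0.5, 0.2, 0.1])

print(perceptron_output(w, x))  # -0.5 + 0.6 + 0.4 = 0.5 > 0, so output is +1
```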

Example: Say the first data point {\bf x}_1=(3,4) (two dimensional, so d=2). Add the constant component x_0=1 and you have {\bf x}_1=(1,3,4). Therefore, the perceptron's output on this point is {\rm sign}(w_0+3w_1+4w_2). If this formula returns -1, which is different from the target output +1, the PLA adjusts the values of the weights trying to make the perceptron output agree with the target output for this point {\bf x}_1. It uses the specific PLA update rule to achieve that.
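To make the update concrete, here is a minimal sketch of the PLA update rule {\bf w} \leftarrow {\bf w} + y{\bf x} applied to a misclassified point, using the same point (1,3,4); the starting weights are hypothetical:

```python
import numpy as np

def pla_update(w, x, y):
    """PLA update for a misclassified point: w <- w + y * x."""
    return w + y * x

x1 = np.array([1.0, 3.0, 4.0])   # the point (3, 4) with x_0 = 1 prepended
y1 = 1                            # target output f(x_1) = +1
w = np.array([0.0, -1.0, 0.0])   # hypothetical starting weights

# sign(0 - 3 + 0) = -1, which disagrees with the target +1, so update:
w = pla_update(w, x1, y1)
print(w)                          # [1. 2. 4.]
```

After the update, {\rm sign}(1+2\cdot 3+4\cdot 4)=+1, which agrees with the target on this point; in general a single update is not guaranteed to fix the point immediately, only to move the weights in the right direction.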
Where everyone thinks alike, no one thinks very much