LFD Book Forum: Problem with simple perceptron implementation
#1
07-15-2012, 03:58 AM
 lorddoskias Junior Member Join Date: Jul 2012 Posts: 9
Problem with simple perceptron implementation

So I wanted to experiment a little with the perceptron algorithm to understand it better, and I came up with a simple matchmaking scenario. Essentially I have a 100x2 training data matrix: the first feature is height, between 165 and 185 cm, and the second feature is weight (as in the physical weight of a person), between 60 and 80 kg. My target function is a very simple one:

Code:
```
function [res] = target(X)

m = size(X, 1);     % number of examples
res = zeros(m, 1);  % was hard-coded to 100

for j = 1:m
    if (X(j, 1) > 170 && X(j, 2) > 65)
        res(j) = 1;
    else
        res(j) = -1;
    end
end

end
```
It returns 1 if a person is taller than 170 cm and heavier than 65 kg, and -1 otherwise. I use this function to label an X matrix consisting of random values within the aforementioned boundaries.
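For reference, the data generation described above can be sketched in Python (a hypothetical NumPy equivalent of the Octave code; the seed and variable names are illustrative assumptions):

```python
import numpy as np

def target(X):
    # +1 if height > 170 and weight > 65, else -1
    return np.where((X[:, 0] > 170) & (X[:, 1] > 65), 1, -1)

# 100 random examples within the stated bounds
rng = np.random.default_rng(0)
heights = rng.uniform(165, 185, size=100)  # cm
weights = rng.uniform(60, 80, size=100)    # kg
X = np.column_stack([heights, weights])    # 100x2 training matrix
y = target(X)                              # labels in {-1, +1}
```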

The next logical step is to learn a hypothesis g(X) that behaves much like the target function, which I treat as unknown. So here is my implementation of the perceptron:

Code:
```
function [w] = perceptron(X, y)

X = [ones(size(X, 1), 1) X]; % add the bias term
w = zeros(size(X, 2), 1);    % init weights to zero
m = length(y);
iterations = 0;

while (true)
    iterations = iterations + 1;
    wrong = 0;
    for j = 1:m
        if (sign(X(j, :) * w) ~= y(j)) % ~= works in both MATLAB and Octave
            w = w + (y(j) * X(j, :))'; % PLA update on a misclassified point
            wrong = 1;
            break;
        end
    end % inner for

    if (wrong == 0)
        break; % no misclassified points left
    end

end % outer loop

end % function
```
Unfortunately it doesn't converge. Are my assumptions about the problem wrong, or is there a fault in my implementation?
#2
07-15-2012, 01:29 PM
 JohnH Member Join Date: Jul 2012 Posts: 43
Re: Problem with simple perceptron implementation

I haven't looked at your implementation of the perceptron; however, I think there will be a problem with convergence regardless of the implementation, since the target function is not linearly separable.
#3
07-16-2012, 07:15 AM
 lorddoskias Junior Member Join Date: Jul 2012 Posts: 9
Re: Problem with simple perceptron implementation

I think you are mistaken. It is linearly separable, since the 170/65 values act as the thresholds, i.e. that's where the linear separation occurs.
#4
07-16-2012, 10:40 AM
 JohnH Member Join Date: Jul 2012 Posts: 43
Re: Problem with simple perceptron implementation

The function, height > 170 and weight > 65, is not linearly separable because it does not bisect the plane but instead defines an infinite rectangular region on the plane. There is no single line that separates this region from the rest of the plane.
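This is easy to verify directly: if a separating line existed, then whenever two points fall on the -1 side, every point on the segment between them would also have to be -1. A quick Python check with hypothetical points chosen inside the stated ranges (both endpoints are -1, yet their midpoint is +1):

```python
def target(height, weight):
    # +1 if taller than 170 and heavier than 65, else -1
    return 1 if (height > 170 and weight > 65) else -1

a = (169.0, 80.0)   # not tall enough  -> -1
b = (180.0, 64.0)   # not heavy enough -> -1
mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # (174.5, 72.0)

print(target(*a), target(*b), target(*mid))  # -1 -1 1
```

Since the midpoint of two -1 points is labeled +1, no single line can separate the two classes.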
#5
07-16-2012, 10:01 PM
 Ubermensch Junior Member Join Date: Jun 2012 Posts: 4
Re: Problem with simple perceptron implementation

John is right. It can't be separated by a straight line. This is what I get with your data (the points are randomly chosen, and sorry for not including the axes, legend and labels).

So the decision boundary isn't a single straight line, and a simple perceptron can't converge on it.
#6
07-17-2012, 12:17 PM
 lorddoskias Junior Member Join Date: Jul 2012 Posts: 9
Re: Problem with simple perceptron implementation

Thanks for the answers. So, if I have understood you correctly, in this case I will need to train two perceptrons, one for each feature, and then, when a new sample is given, run each perceptron on its feature (height/weight); my final hypothesis would return 1 only if both perceptrons return 1 on their respective features. Is this correct?
#7
07-17-2012, 01:53 PM
 JohnH Member Join Date: Jul 2012 Posts: 43
Re: Problem with simple perceptron implementation

At this point the course hasn't provided us with sufficient tools to address the problem you pose, but I think you are on the right track for a method of resolving the categories. Two perceptrons, one for height and one for weight, could feed a third perceptron that performs the logical operation (AND is a linearly separable logical operation). This approach would probably require back-propagation in order to train the network.
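To illustrate the suggestion above (representation only, not training): two fixed threshold units, one per feature, feed a third unit computing AND. This is a minimal Python sketch with hand-chosen weights for this particular target, not weights learned by any algorithm:

```python
def step(x):
    # sign-like activation mapping to {-1, +1}
    return 1 if x > 0 else -1

def network(height, weight):
    u1 = step(height - 170)     # height unit: +1 iff height > 170
    u2 = step(weight - 65)      # weight unit: +1 iff weight > 65
    return step(u1 + u2 - 1.5)  # AND unit: +1 only if both inputs are +1

print(network(175, 70), network(169, 70), network(175, 64))  # 1 -1 -1
```

The AND unit works because with inputs in {-1, +1}, the sum u1 + u2 exceeds 1.5 only when both units fire; as noted above, actually learning all three units jointly would require something like back-propagation.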

 Tags perceptron, pla


