  #4  
01-17-2013, 09:27 AM
yaser
Caltech
 
Join Date: Aug 2009
Location: Pasadena, California, USA
Posts: 1,478
Re: Regression then PLA

Quote:
Originally Posted by Anne Paulson
I'm coming up against this as well. Maybe I have a bug, but I'm finding that even though the regression finds an almost perfect line, usually with very few points misclassified, when I give the regression weights to PLA as initial weights, the PLA line bounces all over the place before settling down.

Scaling the regression weights up by a factor of 10 or 100 would speed up the PLA a lot, I think, by preventing the PLA update from moving the weights so much. That would have a similar effect to using a small alpha. But we're not supposed to do either thing, right?
You are right; there is no scaling in Problem 7. Here, and in all homework problems, you are encouraged to explore outside the statement of the problem, as you have done here, but the choice of answer should follow the problem statement.
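
For anyone trying to reproduce the experiment, here is a minimal sketch of the regression-then-PLA setup in Python/NumPy. This is not the official homework solution: the target and data-generation helpers below are my own assumptions, roughly matching the homework setup (a random line in [-1, 1]^2, N = 10 points), and the standard PLA update with step size 1 is used. The last two calls simply illustrate the scaling exploration discussed above, which is outside the problem statement.

Code:
    import numpy as np

    rng = np.random.default_rng(0)

    def random_target():
        """Hypothetical target: a random line through two points in [-1, 1]^2."""
        p, q = rng.uniform(-1, 1, (2, 2))
        # Coefficients of the line through p and q, as w . (1, x1, x2) = 0
        return np.array([p[0] * q[1] - p[1] * q[0], p[1] - q[1], q[0] - p[0]])

    def generate_data(n, w_target):
        # Points in [-1, 1]^2 with a leading 1 for the bias coordinate
        X = np.column_stack([np.ones(n), rng.uniform(-1, 1, (n, 2))])
        y = np.sign(X @ w_target)
        return X, y

    def linear_regression(X, y):
        # One-step least-squares fit via the pseudo-inverse
        return np.linalg.pinv(X) @ y

    def pla(X, y, w_init, max_iters=10000):
        """PLA starting from w_init; returns final weights and iteration count."""
        w = w_init.copy()
        for t in range(max_iters):
            misclassified = np.nonzero(np.sign(X @ w) != y)[0]
            if misclassified.size == 0:
                return w, t
            i = rng.choice(misclassified)
            w = w + y[i] * X[i]   # standard PLA update, step size 1
        return w, max_iters

    w_target = random_target()
    X, y = generate_data(10, w_target)
    w_reg = linear_regression(X, y)

    # As the problem states: feed the regression weights to PLA unscaled.
    _, iters_unscaled = pla(X, y, w_reg)

    # Exploration only (not the problem setup): scaling the initial weights up
    # makes each unit-size update perturb the separating line relatively less.
    _, iters_scaled = pla(X, y, 100 * w_reg)

    print("iterations from regression weights:       ", iters_unscaled)
    print("iterations from scaled regression weights:", iters_scaled)

Averaging the two iteration counts over many random runs is what shows the effect Anne describes; a single run can go either way.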
__________________
Where everyone thinks alike, no one thinks very much