LFD Book Forum: LR and PLA with scaled input space
#1
10-13-2013, 09:59 PM
yaser (Caltech) | Join Date: Aug 2009 | Location: Pasadena, California, USA | Posts: 1,478
LR and PLA with scaled input space

A post at another forum:
Quote:
 While answering question 7, I mistakenly took my test points from [-2, 2] x [-2, 2], which resulted in the number of iterations being approximately double what it was when I corrected it to [-1, 1] x [-1, 1]. I thought this might simply be because doubling the interval provided more area, making the points more loosely scattered and allowing a larger number of lines to satisfy the test cases. However, when I tested my hypothesis on both larger and smaller intervals, I found that the number of iterations was least for [-1, 1] and went up in either direction from there. The points which defined my line were also taken from the larger intervals. Can anyone help explain why this might be the case?
If you scale x up, then the linear regression solution w_lin scales down in the opposite direction (other things being equal), since it is trying to make w^T x match the same target value (+1 or -1). Now if you take the LR solution and use it as the initial condition for PLA, the impact of each PLA iteration scales up with x, since you are adding y_n x_n to the weight vector at each iteration.
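As a quick numerical sanity check of the first point (my own illustration, not from the post): for least squares with a bias coordinate x0 = 1, scaling the input features by a factor alpha leaves the bias weight unchanged and divides the feature weights by alpha, because the column space of the data matrix is unchanged. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# 50 points on the unit square, labeled by an arbitrary linear target.
Z = rng.uniform(-1, 1, (50, 2))
y = np.sign(Z[:, 0] + 0.3 * Z[:, 1] + 0.1)

def lr_weights(Z, y):
    # Linear regression for classification: least squares on (1, x1, x2).
    Xb = np.column_stack([np.ones(len(Z)), Z])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

alpha = 2.0
w1 = lr_weights(Z, y)          # original inputs
w2 = lr_weights(alpha * Z, y)  # same labels, inputs scaled up by alpha

print(w1)
print(w2)  # bias weight matches w1; feature weights are w1's divided by alpha
```

So with inputs scaled up by alpha, the LR starting vector's feature components shrink by 1/alpha while each PLA update y_n x_n grows by alpha, which is the mismatch described above.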

Put these together and you conclude that, as x scales up and down, the impact of the LR solution vector on PLA goes down and up, respectively, and significantly so. At the large extreme, the LR solution behaves like the zero vector, so you get the original number of PLA iterations. As x gets smaller, w_lin kicks in as a good initial condition (with non-trivial size relative to the updates) and you save some PLA iterations. As x diminishes further, PLA takes longer to correct the misclassified points that LR didn't get, simply because each PLA update becomes relatively small in the movement that it creates.
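The combined effect can be reproduced with a small experiment in the spirit of question 7 (my own sketch, assuming the HW1-style setup: a random linear target and points uniform in [-scale, scale]^2; names and parameters here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, scale, rng):
    # Target f: a random line through two points in the square (HW1-style).
    p, q = rng.uniform(-scale, scale, (2, 2))
    a, b = q[1] - p[1], p[0] - q[0]
    c = -(a * p[0] + b * p[1])
    X = rng.uniform(-scale, scale, (n, 2))
    y = np.sign(a * X[:, 0] + b * X[:, 1] + c)
    y[y == 0] = 1
    Xb = np.column_stack([np.ones(n), X])  # prepend bias coordinate x0 = 1
    return Xb, y

def pla(Xb, y, w, max_iter=1_000_000):
    # Standard PLA: correct the first misclassified point each pass.
    for t in range(max_iter):
        mis = np.flatnonzero(np.sign(Xb @ w) != y)
        if mis.size == 0:
            return w, t
        w = w + y[mis[0]] * Xb[mis[0]]
    return w, max_iter

results = {}
for scale in (0.25, 1.0, 4.0):
    Xb, y = make_data(20, scale, rng)
    w_lin, *_ = np.linalg.lstsq(Xb, y, rcond=None)  # LR initial condition
    w, iters = pla(Xb, y, w_lin.copy())
    results[scale] = iters
    print(f"scale {scale}: PLA iterations from LR start = {iters}")
```

Averaging over many random targets (a single run is noisy), the iteration count should be smallest near scale 1 and grow in both directions, matching the observation in the quoted post.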
__________________
Where everyone thinks alike, no one thinks very much


