LFD Book Forum

LFD Book Forum > Course Discussions > Online LFD course > Homework 2
#1
01-20-2013, 02:57 PM
dobrokot
Junior Member
Join Date: Jan 2013
Posts: 3

PLA (question 7): initial weight is too weak hint

If I take the output of linear regression as the initial weights, I usually have only a few (1-5 out of 100) misclassified points. But it does not help PLA, because the first addition of x_i to W destabilizes the "almost good" value of W. I can even invert the sign of the linear regression output and it does not affect the number of iterations (even though the number of misclassified points on the first iteration changes from 1-5 to 95-99; I verified this).

But if I use (linear regression output) * N, it does help PLA and greatly reduces the number of iterations.

Is it expected that using the linear regression output as the initial W should greatly change the number of iterations? If yes, does that mean I should look for other bugs in my code?

Should the output of linear regression be used as-is, or should it be multiplied to make it "stronger"?
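
For concreteness, here is a minimal sketch of the setup being discussed (my own illustration, not the homework's official code): run linear regression once, use its weights to start PLA, and optionally scale them. It assumes each point carries a leading x0 = 1 coordinate and labels in {-1, +1}; the function names are illustrative only.

Code:
import numpy as np

def linear_regression_weights(X, y):
    # One-shot linear regression: w = pseudo-inverse(X) . y
    return np.linalg.pinv(X) @ y

def pla(X, y, w_init, max_iters=10000):
    # Run PLA starting from w_init; return final weights and update count.
    w = w_init.copy()
    for t in range(max_iters):
        misclassified = np.where(np.sign(X @ w) != y)[0]
        if misclassified.size == 0:
            return w, t
        i = np.random.choice(misclassified)  # pick a random misclassified point
        w = w + y[i] * X[i]                  # standard PLA update
    return w, max_iters

# Starting weights as-is vs. scaled (the "*N" variant described above):
# w0 = linear_regression_weights(X, y)
# w0_scaled = len(y) * w0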
#2
01-20-2013, 03:12 PM
sanbt
Member
Join Date: Jan 2013
Posts: 35

Re: PLA (question 7): initial weight is too weak hint

Quote:
Originally Posted by dobrokot
If I take the output of linear regression as the initial weights, I usually have only a few (1-5 out of 100) misclassified points. But it does not help PLA, because the first addition of x_i to W destabilizes the "almost good" value of W. I can even invert the sign of the linear regression output and it does not affect the number of iterations (even though the number of misclassified points on the first iteration changes from 1-5 to 95-99; I verified this).

But if I use (linear regression output) * N, it does help PLA and greatly reduces the number of iterations.

Is it expected that using the linear regression output as the initial W should greatly change the number of iterations? If yes, does that mean I should look for other bugs in my code?

Should the output of linear regression be used as-is, or should it be multiplied to make it "stronger"?
I think for this question you should set N = 10 and just use the output of linear regression as-is. It should reduce the number of iterations of PLA. I tried it for N = 100 and agree with you that linear regression doesn't help improve the iteration count there.
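
A rough way to run that comparison is sketched below (again my own code, reusing the pla() and linear_regression_weights() helpers from the sketch above). It assumes the homework's setup of a random target line and inputs drawn uniformly from [-1, 1] x [-1, 1], averaged over many random data sets.

Code:
import numpy as np
# Assumes pla() and linear_regression_weights() from the earlier sketch.

def random_target():
    # Random line through two points in [-1, 1]^2, as weights (w0, w1, w2).
    p, q = np.random.uniform(-1, 1, (2, 2))
    return np.array([q[0] * p[1] - q[1] * p[0], q[1] - p[1], p[0] - q[0]])

def make_data(N, w_target):
    # N inputs uniform in [-1, 1]^2 with a leading x0 = 1 coordinate.
    X = np.column_stack([np.ones(N), np.random.uniform(-1, 1, (N, 2))])
    return X, np.sign(X @ w_target)

def avg_iterations(N, use_lr_init, runs=1000):
    # Average PLA iteration count over many random targets and data sets.
    total = 0
    for _ in range(runs):
        X, y = make_data(N, random_target())
        w0 = linear_regression_weights(X, y) if use_lr_init else np.zeros(3)
        _, iters = pla(X, y, w0)
        total += iters
    return total / runs

# Example comparison:
# print("N=10 :", avg_iterations(10, False), "vs", avg_iterations(10, True))
# print("N=100:", avg_iterations(100, False), "vs", avg_iterations(100, True))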
#3
01-20-2013, 07:50 PM
Anne Paulson
Senior Member
Join Date: Jan 2013
Location: Silicon Valley
Posts: 52

Re: PLA (question 7): initial weight is too weak hint

This issue is discussed in another thread. Short answer: scaling the weights by N might make a difference, but for the homework question, we should follow the directions as given and not scale.
#4
04-03-2013, 08:25 PM
Michael Reach
Senior Member
Join Date: Apr 2013
Location: Baltimore, Maryland, USA
Posts: 71

Re: PLA (question 7): initial weight is too weak hint

Anne! Welcome back! (I was in your Data Analysis course.) Are you like a TA here or something? I know you took this course already.
#5
04-10-2013, 11:46 AM
Anne Paulson
Senior Member
Join Date: Jan 2013
Location: Silicon Valley
Posts: 52

Re: PLA (question 7): initial weight is too weak hint

Hi Michael!

My post was from the last iteration of the course (look at the date). I'm not taking it this time. I only happened to see your post because I was randomly surfing around. Good luck to all the DA students taking this course now.
#6
04-10-2013, 12:11 PM
Michael Reach
Senior Member
Join Date: Apr 2013
Location: Baltimore, Maryland, USA
Posts: 71

Re: PLA (question 7): initial weight is too weak hint

Oh, my bad. I did see your exalted Senior Member status and I thought that meant T.A. or such.
#7
04-15-2013, 03:02 AM
wangkexue
Junior Member
Join Date: Apr 2013
Posts: 5

Re: PLA (question 7): initial weight is too weak hint

My experiment results show that when N = 10, the number of iterations is reduced from about ** to about ** with LR initialization. When N = 100, the number of iterations is reduced from about ** to **. Further, when N = 50, it goes from ** to about **. Does that mean LR initialization can only reduce the number of iterations by about **?

Admin Edit: Numbers taken out. Please start an *ANSWER* thread if you want to discuss specific answers.
#8
04-15-2013, 07:41 AM
Elroch
Invited Guest
Join Date: Mar 2013
Posts: 143

*ANSWER* PLA (question 7): initial weight is too weak hint

wangkexue, could you please delete (or edit) your last post, for obvious reasons?