LFD Book Forum  

LFD Book Forum > Course Discussions > Online LFD course > Homework 6

#1
05-14-2013, 12:08 AM
Michael Reach
Senior Member
Join Date: Apr 2013
Location: Baltimore, Maryland, USA
Posts: 71
*ANSWER* questions w linear regression & weight decay

I have been running the weight-decay examples in Q2-6, but haven't seen any real improvement in the out-of-sample error compared to no regularization at all. Is that just a feature of this particular problem, or should I recheck my calculations?

Unfortunately (or not), the answers I've been getting do appear among the multiple-choice options.
#2
05-14-2013, 01:05 AM
Ziad Hatahet
Member
Join Date: Apr 2013
Location: San Francisco, CA
Posts: 23
Re: *ANSWER* questions w linear regression & weight decay

You should be seeing a change in the out-of-sample error when you vary k (for certain values of k at least). Are you using classification error as your error measure?
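For anyone checking their error measure: classification error here is the fraction of sign disagreements between the hypothesis and the labels, not a squared residual. A minimal sketch on made-up data (the matrix, labels, and weights below are illustrative only):

```r
# Toy design matrix (column of 1s plus one feature), labels in {-1, +1},
# and an arbitrary hypothetical weight vector.
Z <- cbind(1, c(0.5, -1.2, 2.0, -0.3))
y <- c(1, -1, 1, 1)
w <- c(0.1, 0.8)

# Classification error: fraction of points where sign(Z w) disagrees with y.
pred <- as.numeric(sign(Z %*% w))
class_err <- mean(pred != y)
class_err  # 0.25: only the fourth point is misclassified
```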
#3
05-14-2013, 07:29 AM
jlaurentum
Member
Join Date: Apr 2013
Location: Venezuela
Posts: 41
Re: *ANSWER* questions w linear regression & weight decay

Are you using the correct formula for the one step solution?

I was using (Z^T Z - \lambda I)\ldots instead of (Z^T Z + \lambda I)\ldots, so the regularization didn't make sense at all. I caught the error because I saw in another post that Professor Yaser corrected a student on the plus sign.
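For reference, the one-step weight-decay solution with the plus sign is w_reg = (Z^T Z + lambda*I)^{-1} Z^T y. A minimal sketch on random toy data (Z, y, and lambda here are illustrative, not the homework's):

```r
set.seed(1)
Z <- cbind(1, rnorm(10))   # toy design matrix (10 x 2)
y <- rnorm(10)             # toy targets

lambda <- 0.1
# Note the PLUS sign in front of lambda * diag(...):
w_reg <- solve(t(Z) %*% Z + lambda * diag(ncol(Z))) %*% t(Z) %*% y
```

With a minus sign, Z^T Z - lambda*I can even become singular or turn the penalty into a reward, which is why the results stop making sense.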
#4
05-14-2013, 07:54 AM
Michael Reach
Senior Member
Join Date: Apr 2013
Location: Baltimore, Maryland, USA
Posts: 71
Re: *ANSWER* questions w linear regression & weight decay

Quote:
Originally Posted by Ziad Hatahet
You should be seeing a change in the out-of-sample error when you vary k (for certain values of k at least). Are you using classification error as your error measure?
Well, I am seeing a change, just not really a reduction. Some are the same, some are bigger. I was expecting a dramatic drop in the out-of-sample error.

And yes, I have been using classification error, but that is a good point: I started out using the regression residuals and such, but at least I caught that mistake.
#5
05-14-2013, 01:56 PM
Michael Reach
Senior Member
Join Date: Apr 2013
Location: Baltimore, Maryland, USA
Posts: 71
Re: *ANSWER* questions w linear regression & weight decay

As I suspected, all my answers on these were wrong. Does anyone have code (R if possible) that I could use for comparison? I suspect my problem was something dumb; even the original linear regression was wrong, and I had checked that one against the R lm() function.

I'm especially concerned since HW 7 uses all the same data again - so I really need to track this down.
#6
05-14-2013, 04:10 PM
Elroch
Invited Guest
Join Date: Mar 2013
Posts: 143
Re: *ANSWER* questions w linear regression & weight decay

Quote:
Originally Posted by Michael Reach
As I suspected, all my answers on these were wrong. Does anyone have code (R if possible) that I could use for comparison? I suspect my problem was something dumb; even the original linear regression was wrong, and I had checked that one against the R lm() function.

I'm especially concerned since HW 7 uses all the same data again - so I really need to track this down.
Are you using lambda from 0.001 to 1000? I suppose it might be possible to forget to calculate the power, i.e., to use k itself instead of 10^k. If you use the full range, the added term in the matrix described in this thread can hardly fail to have a significant effect.
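In other words, the regularization strengths are powers of ten; forgetting the power would mean feeding k itself in as lambda. A trivial sketch of the mapping:

```r
# lambda = 10^k for k = -3, ..., 3 gives 0.001 up to 1000.
k <- -3:3
lambda <- 10^k
# Using k directly instead would even make lambda negative for k < 0.
```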
#7
05-14-2013, 05:06 PM
jlaurentum
Member
Join Date: Apr 2013
Location: Venezuela
Posts: 41
Re: *ANSWER* questions w linear regression & weight decay

Michael:

Here you go:

Code:
#READ IN THE FILES.
datos1 <- read.table("in.dta")
names(datos1) <- c("X1", "X2", "Y")
datos2 <- read.table("out.dta")
names(datos2) <- c("X1", "X2", "Y")

#FOR THE FOLLOWING QUESTIONS, SET UP THE MATRICES
#(nonlinear transform: 1, x1, x2, x1^2, x2^2, x1*x2, |x1 - x2|, |x1 + x2|)
Z <- with(datos1,
          cbind(rep(1, nrow(datos1)), X1, X2,
                X1^2, X2^2, X1*X2, abs(X1 - X2), abs(X1 + X2)))
Z <- as.matrix(Z)
Zout <- with(datos2,
             cbind(rep(1, nrow(datos2)), X1, X2,
                   X1^2, X2^2, X1*X2, abs(X1 - X2), abs(X1 + X2)))
Zout <- as.matrix(Zout)

#NOW FIT WITH WEIGHT DECAY USING LAMBDA = 10^-3
lambda <- 10^(-3)

#NOTE THE PLUS SIGN: M = Z^T Z + LAMBDA * I (8x8 IDENTITY)
M <- t(Z) %*% Z + lambda * diag(ncol(Z))
w <- solve(M) %*% t(Z) %*% datos1$Y

#IN-SAMPLE AND OUT-OF-SAMPLE CLASSIFICATION ERRORS
Ym <- as.numeric(sign(Z %*% w))
Ein <- mean(datos1$Y != Ym)
Ym <- as.numeric(sign(Zout %*% w))
Eout <- mean(datos2$Y != Ym)
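To answer the k-sweep questions, the fit above can be wrapped in a function and repeated for k = -3, ..., 3. This sketch uses synthetic data so it runs stand-alone; with the homework files, pass in the Z, Zout, and Y built in the code above (the function and variable names here are my own):

```r
fit_weight_decay <- function(Z, y, lambda) {
  # One-step solution: w = (Z^T Z + lambda*I)^{-1} Z^T y
  solve(t(Z) %*% Z + lambda * diag(ncol(Z))) %*% t(Z) %*% y
}

class_err <- function(Z, y, w) {
  # Fraction of sign disagreements
  mean(as.numeric(sign(Z %*% w)) != y)
}

# Synthetic stand-in for the homework data (illustrative only).
set.seed(42)
Ztoy <- cbind(1, matrix(rnorm(200), ncol = 2))                     # 100 x 3 design
ytoy <- as.numeric(sign(Ztoy %*% c(0.2, 1, -1) + rnorm(100, sd = 0.5)))

for (k in -3:3) {
  w <- fit_weight_decay(Ztoy, ytoy, 10^k)
  cat(sprintf("k = %2d  Ein = %.3f\n", k, class_err(Ztoy, ytoy, w)))
}
```

On the real data you would also report class_err(Zout, datos2$Y, w) for each k and pick the answer option closest to the smallest Eout.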
The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.