LFD Book Forum Exercise 3.4

#1
06-14-2013, 10:08 AM
 xuewei4d Junior Member Join Date: May 2013 Posts: 4
Exercise 3.4

I didn't get the correct answer to Exercise 3.4(c).

For Exercise 3.4(b), I think the answer would be

For Exercise 3.4(c), by independence between the different $\epsilon_n$, I have

Where am I wrong?
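(Editor's note: the inline formulas in this thread were images that did not survive. For readers without the book at hand, the setup of Exercise 3.4, as I understand it from the text, is the linear model with i.i.d. zero-mean noise:)

```latex
% Setup of Exercise 3.4: linear target plus noise, least-squares fit.
y = Xw^* + \epsilon, \qquad
\mathbb{E}[\epsilon] = 0, \quad
\mathbb{E}[\epsilon\epsilon^T] = \sigma^2 I_N, \qquad
\hat{y} = Hy, \quad H = X(X^TX)^{-1}X^T
```

where $X$ is the $N \times (d+1)$ input matrix. The results the thread works toward are $\mathbb{E}[E_{\text{in}}] = \sigma^2\left(1 - \frac{d+1}{N}\right)$ for part (c) and $\mathbb{E}[E_{\text{test}}] = \sigma^2\left(1 + \frac{d+1}{N}\right)$ for part (e).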
#2
06-17-2013, 08:27 PM
 htlin NTU Join Date: Aug 2009 Location: Taipei, Taiwan Posts: 601
Re: Exercise 3.4

You can consider double-checking your answer of 3.4(b). Hope this helps.
__________________
When one teaches, two learn.
#3
10-06-2013, 02:16 PM
 i_need_some_help Junior Member Join Date: Sep 2013 Posts: 4
Re: Exercise 3.4

I am not sure how to approach part (a). Are we supposed to explain why that in-sample estimate intuitively makes sense, or (algebraically) manipulate expressions given earlier into it?
#4
10-06-2013, 09:18 PM
 magdon RPI Join Date: Aug 2009 Location: Troy, NY, USA. Posts: 595
Re: Exercise 3.4

Algebraically manipulate earlier expressions and you should get 3.4(a). It is essentially a restatement of an earlier equation in the text.
__________________
Have faith in probability
#5
10-07-2013, 12:00 AM
 Sweater Monkey Junior Member Join Date: Sep 2013 Posts: 6
Re: Exercise 3.4

I'm not sure if I'm going about part (e) correctly.

I'm under the impression that

where, as derived earlier,
and

This led me to

I carried out the expansion of this expression and then simplified it into the relevant terms, but my final answer comes out wrong because the N term cancels out.

Am I starting out correctly up until this expansion or is my thought process off from the start? And if I am heading in the right direction is there any obvious reason that I may be expanding the expression incorrectly? Any help would be greatly appreciated.
#6
10-07-2013, 01:46 AM
 ddas2 Junior Member Join Date: Oct 2013 Posts: 3
Re: Exercise 3.4

I got $y^{\prime}=y-\epsilon+\epsilon^{\prime}$
and $\hat{y}-y^{\prime}=H\epsilon-\epsilon^{\prime}$.
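(Editor's note: a sketch of how these follow, assuming the setup from the text — $y = Xw^* + \epsilon$, and $y' = Xw^* + \epsilon'$ for the same inputs with fresh noise $\epsilon'$, together with 3.4(a), $\hat{y} = Xw^* + H\epsilon$. Note the minus sign on $\epsilon'$ in the second line:)

```latex
y' = Xw^* + \epsilon' = (y - \epsilon) + \epsilon' \\
\hat{y} - y' = (Xw^* + H\epsilon) - (Xw^* + \epsilon') = H\epsilon - \epsilon'
```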
#7
10-07-2013, 05:53 AM
 magdon RPI Join Date: Aug 2009 Location: Troy, NY, USA. Posts: 595
Re: Exercise 3.4

You got it mostly right. Your error is in assuming that both terms, the H term and the one without the H, give an N to cancel the N in the denominator. One term gives an N and the other gives a (d+1).

__________________
Have faith in probability
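(Editor's note: this point can be checked numerically. For $H = X(X^TX)^{-1}X^T$, $\mathrm{trace}(H) = d+1$ while $\mathrm{trace}(I_N) = N$, which is why only one of the two expansion terms contributes an $N$. A quick sketch with a random $X$ — not from the thread:)

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 50, 3
# Random input matrix with the usual column of 1s, so X is N x (d+1).
X = np.hstack([np.ones((N, 1)), rng.standard_normal((N, d))])
H = X @ np.linalg.inv(X.T @ X) @ X.T  # hat matrix

# trace(H) = trace((X^T X)^{-1} X^T X) = trace(I_{d+1}) = d + 1
print(np.trace(H))          # -> 4.0 (up to floating point)
print(np.trace(np.eye(N)))  # -> 50.0
```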
#8
10-07-2013, 09:09 AM
 Sweater Monkey Junior Member Join Date: Sep 2013 Posts: 6
Re: Exercise 3.4

Yes, I realize that only one term should have the N, so the issue must be in how I'm expanding the expression.

I think my problem is how I'm looking at the trace of the matrix.

I'm under the impression that $\epsilon\epsilon^T$ produces an $N \times N$ matrix with a diagonal of all $\sigma^2$ values (in expectation) and 0 elsewhere. I come to this conclusion because the $\epsilon_n$ are all independent, so when multiplied together the covariance of any two should be zero, while the covariance of any $\epsilon_n$ with itself should be the variance of $\epsilon_n$, namely $\sigma^2$. So then the trace of this matrix should have a sum along the diagonal of $N\sigma^2$, shouldn't it?
#9
10-07-2013, 09:18 AM
 aaoam Junior Member Join Date: Oct 2013 Posts: 1
Re: Exercise 3.4

I'm having a bit of difficulty with 3.4(b). I take $\hat{y} - y$ and multiply by $(XX^T)^{-1}XX^T$, which ends up reducing the expression to just $H\epsilon$. However, I then can't use 3.3(c) in the simplification, which makes me think I did something wrong. Can somebody give me a pointer?

Also, it'd be great if there were instructions somewhere on how to post in math mode. Perhaps I just missed them?
#10
10-07-2013, 09:19 AM
 magdon RPI Join Date: Aug 2009 Location: Troy, NY, USA. Posts: 595
Re: Exercise 3.4

Yes, that is right. You have to be more careful, but use similar reasoning with the term involving H.

__________________
Have faith in probability
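(Editor's note: putting the thread together, the target for part (e), $\mathbb{E}_{\epsilon,\epsilon'}[E_{\text{test}}] = \sigma^2\left(1 + \frac{d+1}{N}\right)$, can be checked by simulation on a fixed $X$ — a sketch assuming the standard setup $y = Xw^* + \epsilon$, $y' = Xw^* + \epsilon'$ with fresh noise on the same inputs; names are mine:)

```python
import numpy as np

rng = np.random.default_rng(2)
N, d, sigma, trials = 30, 2, 0.3, 20_000
X = np.hstack([np.ones((N, 1)), rng.standard_normal((N, d))])
w_star = rng.standard_normal(d + 1)
pinv = np.linalg.inv(X.T @ X) @ X.T  # least-squares solver for this X

e_test = 0.0
for _ in range(trials):
    eps = sigma * rng.standard_normal(N)
    eps_prime = sigma * rng.standard_normal(N)  # fresh noise, same inputs
    w_lin = pinv @ (X @ w_star + eps)           # fit on the noisy y
    y_prime = X @ w_star + eps_prime            # freshly noisy targets
    e_test += np.mean((X @ w_lin - y_prime) ** 2)
e_test /= trials

print(e_test)                        # ~ 0.099
print(sigma**2 * (1 + (d + 1) / N))  # -> 0.099
```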



