LFD Book Forum  

10-10-2017, 09:49 AM
ghbcode
Junior Member
Join Date: Oct 2017
Posts: 4

Exercise 4.3

The only Chapter 4 posting that touches on this topic is the one below, though it does not explicitly cover Exercise 4.3 and is somewhat hand-wavy rather than exact:
http://book.caltech.edu/bookforum/showthread.php?t=503

Exercise 4.3 asks:
Deterministic noise depends on H, as some models approximate f better than others.
(a) Assume H is fixed and we increase the complexity of f. Will deterministic noise in general go up or down? Is there a higher or lower tendency to overfit?
(b) Assume f is fixed and we decrease the complexity of H. Will deterministic noise in general go up or down? Is there a higher or lower tendency to overfit? [Hint: There is a race between two factors that affect overfitting in opposite ways, but one wins.]

The hint implies to me that the responses to (a) and (b) would move in different directions. This is what I have for an answer:
a) By increasing the target function complexity, deterministic noise will increase, since H remains fixed while f becomes more complex. I would expect lower overfitting on the out-of-sample data. As a matter of fact, this runs counter to the summary table on page 124; however, it does not make sense to me that, keeping all else constant, increasing the target complexity increases overfitting. If anything, by increasing the target complexity, your fixed H would underfit.

b) By lowering the complexity of H (with f fixed), deterministic noise would increase and there would be a lower tendency to overfit.

I'm not sure that my answer is correct, so if you could enlighten me that would be most helpful. In the thread I posted above there was some discussion that used the formula:

$E_{\text{out}} = \sigma^2 + \text{bias} + \text{var}$
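
For reference, this is how I read that decomposition, with each term labelled (my labels, going by the bias-variance analysis in Chapter 2; as I understand it, the bias term is roughly the deterministic noise, since $\bar{g}$, the average hypothesis over data sets, is about the best that H can do):

$\mathbb{E}_{\mathcal{D}}\!\left[E_{\text{out}}\!\left(g^{(\mathcal{D})}\right)\right] = \underbrace{\sigma^2}_{\text{stochastic noise}} + \underbrace{\mathbb{E}_{\mathbf{x}}\!\left[\left(\bar{g}(\mathbf{x}) - f(\mathbf{x})\right)^2\right]}_{\text{bias (deterministic noise)}} + \underbrace{\mathbb{E}_{\mathbf{x}}\!\left[\mathbb{E}_{\mathcal{D}}\!\left[\left(g^{(\mathcal{D})}(\mathbf{x}) - \bar{g}(\mathbf{x})\right)^2\right]\right]}_{\text{var}}$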

Question 1:
In this exercise, do we assume that $E_{\text{out}}$ remains fixed, so that the terms of the expression shift as we change H or f?

Question 2:
In part (a), if you keep H fixed and increase the complexity of f, what happens to deterministic noise and what happens to overfitting?

Question 3:
In part (b), if you keep f fixed and decrease the complexity of H, what happens to deterministic noise and what happens to overfitting?
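
For what it's worth, here is a quick numerical sketch I put together to check my intuition (Python/NumPy; the choice of Legendre-polynomial targets and polynomial hypothesis sets is mine, purely for illustration, not from the book). With no stochastic noise and a dense grid, the error of the best fit in H is essentially the deterministic noise:

[CODE]
import numpy as np

# Dense grid on [-1, 1] to approximate the expectation over x
x = np.linspace(-1, 1, 1000)

def det_noise(f_vals, degree):
    """Mean squared error of the best polynomial of the given degree
    fitted to f on the grid. With no stochastic noise and a dense grid,
    this is essentially the deterministic noise (bias)."""
    coeffs = np.polyfit(x, f_vals, degree)
    return np.mean((f_vals - np.polyval(coeffs, x)) ** 2)

# Part (a): H fixed at degree 2, target complexity Q_f increasing
print("H fixed (degree 2), f a degree-Q_f Legendre combination:")
for Q_f in [2, 4, 6, 8, 10]:
    f_vals = np.polynomial.legendre.legval(x, np.ones(Q_f + 1))
    print(f"  Q_f = {Q_f:2d}: deterministic noise ~ {det_noise(f_vals, 2):.4f}")

# Part (b): f fixed at degree 10, H complexity (degree d) decreasing
f_vals = np.polynomial.legendre.legval(x, np.ones(11))
print("f fixed (degree 10), H = polynomials of degree d:")
for d in [10, 8, 6, 4, 2]:
    print(f"  d = {d:2d}: deterministic noise ~ {det_noise(f_vals, d):.4f}")
[/CODE]

In both directions the measured deterministic noise goes up, which matches what I wrote above; it's the overfitting part of the answer that I'm unsure about.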