LFD Book Forum Exercise 4.3

#1
10-10-2017, 10:49 AM
 ghbcode Junior Member Join Date: Oct 2017 Posts: 4
Exercise 4.3

The only Chapter 4 posting that touches on this topic is the one below; it does not explicitly cover Exercise 4.3, though, and is somewhat loose rather than exact.

Deterministic noise depends on H, as some models approximate f better
than others.
(a) Assume H is fixed and we increase the complexity of f. Will deterministic noise in general go up or down? Is there a higher or lower tendency to overfit?
(b) Assume f is fixed and we decrease the complexity of H. Will deterministic noise in general go up or down? Is there a higher or lower tendency to overfit? [Hint: There is a race between two factors that affect overfitting in opposite ways, but one wins.]

The hint implies to me that the responses to (a) and (b) would move in different directions. This is what I have for an answer:
a) By increasing the target function's complexity, deterministic noise will increase, since H remains fixed while f becomes more complex. There will be less overfitting on the out-of-sample data. In fact this runs counter to the summary table on page 124; however, it does not make sense to me that, keeping everything else constant, increasing the target complexity increases the overfit. If anything, by increasing the target complexity, your fixed H would underfit.

b) By lowering the complexity of H (with f fixed), deterministic noise would increase, and there would be a tendency toward less overfitting.
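To make my reading of "deterministic noise" concrete, here is a small sketch I put together (my own illustration, not from the book): I take deterministic noise to be the squared error of the best hypothesis in H against a noiseless target f, and measure it while varying the complexity of f (part a) and of H (part b). The choice of a Legendre series with all coefficients 1 as the target is my own assumption, made so the results are deterministic.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 400)

def deterministic_noise(f_deg, h_deg):
    """Squared error of the best degree-h_deg polynomial fit to a
    noiseless target f: a Legendre series of degree f_deg (all
    coefficients set to 1, an arbitrary but fixed choice)."""
    y = np.polynomial.legendre.legval(x, np.ones(f_deg + 1))
    h_star = np.polynomial.polynomial.Polynomial.fit(x, y, h_deg)
    return np.mean((h_star(x) - y) ** 2)

# (a) H fixed at degree 2; making f more complex raises deterministic noise.
print([round(deterministic_noise(q, 2), 3) for q in (2, 5, 10)])

# (b) f fixed at degree 10; shrinking H raises deterministic noise.
print([round(deterministic_noise(10, d), 3) for d in (10, 5, 2)])
```

In both directions the unexplainable part of f grows, which is the "deterministic noise goes up" half of the story; what the exercise asks about overfitting is the part I am unsure of.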

I'm not sure that my answer is correct, so if you could enlighten me that would be most helpful. In the thread I mentioned above there was discussion that used the formula:

$E_{out}=\sigma^2+bias+var$
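To see how the three terms of this decomposition can be estimated, here is a sketch of the usual simulation (my own setup; the target $f(x)=\sin(\pi x)$, the noise level, and the sample size are assumptions, not from the exercise): fit H = degree-2 polynomials to many noisy datasets, average the fits to get $\bar{g}$, and read off bias (the deterministic-noise part) and var.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 200)
f = np.sin(np.pi * x)           # assumed target function
sigma = 0.3                     # assumed stochastic noise level
N, runs, deg = 20, 500, 2       # sample size, datasets, H = degree-2 polys

fits = np.empty((runs, x.size))
for r in range(runs):
    xs = rng.uniform(-1.0, 1.0, N)
    ys = np.sin(np.pi * xs) + sigma * rng.standard_normal(N)
    p = np.polynomial.polynomial.Polynomial.fit(xs, ys, deg)
    fits[r] = p(x)

g_bar = fits.mean(axis=0)                 # average hypothesis
bias = np.mean((g_bar - f) ** 2)          # deterministic-noise term
var = np.mean(fits.var(axis=0))           # variance term
print(bias, var, sigma**2 + bias + var)   # the three pieces of E_out
```

Changing `deg` (the complexity of H) or swapping in a more wiggly target moves bias and var in opposite directions, which is how I understand the "race" in the hint.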

Question 1:
In this exercise do we assume that $E_{out}$ remains fixed, so that the expression is tweaked by changing H or f?

Question 2:
In part (a), if you keep H fixed and increase the complexity of f, what happens to the deterministic noise and what happens to the overfit?

Question 3:
In part (b), if you keep f fixed and decrease the complexity of H, what happens to the deterministic noise and what happens to the overfit?
