11-03-2016, 01:45 PM
CountVonCount (Member — Join Date: Oct 2016, Posts: 17)
Re: Overfitting with Polynomials: deterministic noise

I think the confusion comes from comparing Figure 4.4 with the figures that illustrate stochastic noise.

Here you write that the shading is the deterministic noise, since it is the difference between the best fit of the current model and the target function. This shading is exactly \bar{g}(x) - f(x) from the bias-variance analysis, so the value of the deterministic noise is directly related to the bias.

When you talk about stochastic noise, you say that the out-of-sample error increases with model complexity, and you relate this to the area between the final hypothesis g(x) and the target f(x). The reader might therefore conclude that the bias increases with model complexity. However, the bias depends on \bar{g}(x), not on g(x). The reason this area grows is the stochastic noise: without any noise, the final hypothesis has a better chance to fit the target (depending on the positions of the samples).

In fact (this is not really clear from the text, but it follows from Exercise 4.3), on a noiseless target the shaded area in Figure 4.4 shrinks as the model complexity increases, and thus the bias decreases.
My suggestion is to make it clearer that in the case of stochastic noise you are talking about the actual final hypothesis, while in the case of deterministic noise you are talking about the best-fitting hypothesis, which is related to \bar{g}(x).
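This noiseless-target claim is easy to check with a quick Monte Carlo sketch. The target sin(pi x), the sample sizes, and the use of least-squares fits via numpy.polyfit are my own illustrative choices, not from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical noiseless target; any smooth target shows the same effect.
def f(x):
    return np.sin(np.pi * x)

def bias_of_degree(d, n_points=20, n_datasets=300):
    """Estimate the bias of degree-d polynomial fits on a noiseless target."""
    xs = np.linspace(-1, 1, 200)
    preds = np.empty((n_datasets, xs.size))
    for i in range(n_datasets):
        x = rng.uniform(-1, 1, n_points)
        y = f(x)                             # no stochastic noise added
        preds[i] = np.polyval(np.polyfit(x, y, d), xs)
    g_bar = preds.mean(axis=0)               # the average hypothesis \bar{g}(x)
    return np.mean((g_bar - f(xs)) ** 2)     # squared shaded area ~ bias

bias_simple = bias_of_degree(1)
bias_complex = bias_of_degree(5)
print(bias_simple, bias_complex)  # the more complex model has the smaller bias
```

With no noise, the degree-5 fit tracks the target almost exactly, so its shaded area (and hence its bias) is far smaller than the linear model's.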

From my understanding I would say:
Overfitting does not apply to the best fit of the model (\bar{g}(x)) but to the actual hypothesis learned from a data set (g^{(D)}(x)). In the bias-variance analysis we saw that the variance increases with model complexity (at a fixed number of samples). So I think overfitting is mainly a matter of the variance, whether it is driven by stochastic noise or by deterministic noise.
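The variance side can be illustrated with the same kind of sketch, now with stochastic noise added. Again, the target, the noise level sigma, and the small sample size are my own illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    return np.sin(np.pi * x)  # hypothetical target, as before

def bias_and_var(d, sigma=0.3, n_points=10, n_datasets=1000):
    """Estimate bias and variance of degree-d polynomial fits on noisy data."""
    xs = np.linspace(-1, 1, 200)
    preds = np.empty((n_datasets, xs.size))
    for i in range(n_datasets):
        x = rng.uniform(-1, 1, n_points)
        y = f(x) + rng.normal(0.0, sigma, n_points)   # stochastic noise
        preds[i] = np.polyval(np.polyfit(x, y, d), xs)
    g_bar = preds.mean(axis=0)                        # \bar{g}(x)
    bias = np.mean((g_bar - f(xs)) ** 2)
    var = np.mean((preds - g_bar) ** 2)               # spread of g^{(D)} around \bar{g}
    return bias, var

_, var_simple = bias_and_var(1)
_, var_complex = bias_and_var(5)
print(var_simple, var_complex)  # variance grows with model complexity
```

At a fixed, small sample size the complex model's hypotheses g^{(D)} scatter much more widely around \bar{g}, which is exactly the variance term that overfitting feeds on.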