#3, 05-25-2013, 10:39 PM
yaser (Caltech)
Re: Doubt in Lecture 11

Quote:
Originally Posted by a.sanyal902
Say the complexity of our hypothesis set matches that of the target function (or the set includes the target function). So, there is no deterministic noise. Moreover, let us assume there is no stochastic noise either.

However, due to a finite data set, we may still not be able to generalize very well. Is this still called overfitting? We referred to overfitting when the algorithm tries to select a hypothesis that fits the "noise", stochastic or deterministic. But there is no noise in the example above. We may call it variance, because we have many possible choices and few data points, but are we "overfitting"?
Since the model is fixed (and assuming no "early stopping" within this model), this is just bad generalization, rather than overfitting. Recall that overfitting means getting a worse E_{\rm out} as you get a better E_{\rm in}, not just getting a bad E_{\rm out} in absolute terms.
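
To make the distinction concrete, here is a small sketch (not from the lecture; it assumes NumPy, a degree-10 polynomial hypothesis set, and the toy target x^2 - 1 chosen just for illustration). With only five noiseless points, many hypotheses in the set achieve E_{\rm in} = 0, and the one the algorithm happens to pick can still have a large E_{\rm out}:

[CODE]
import numpy as np

rng = np.random.default_rng(0)

# Five noiseless training points from a target that lies inside the
# hypothesis set (polynomials of degree <= 10): no stochastic noise,
# no deterministic noise.
target = np.poly1d([1.0, 0.0, -1.0])      # f(x) = x^2 - 1
x_train = rng.uniform(-1, 1, 5)
y_train = target(x_train)

# Hypothesis 1: the target itself -- E_in = 0 and E_out = 0.
h1 = target

# Hypothesis 2: the target plus a degree-5 polynomial that vanishes at
# every training point. It also lies in the degree-10 hypothesis set and
# also has E_in = 0, but it disagrees with the target away from the data.
bump = 20.0 * np.poly1d(x_train, r=True)  # 20 * prod(x - x_i)
h2 = target + bump

x_test = np.linspace(-1, 1, 1000)
for name, h in [("h1 (the target)", h1), ("h2 (also fits the data)", h2)]:
    e_in = np.mean((h(x_train) - y_train) ** 2)
    e_out = np.mean((h(x_test) - target(x_test)) ** 2)
    print(f"{name}: E_in = {e_in:.3g}, E_out = {e_out:.3g}")
[/CODE]

Both hypotheses fit the data perfectly, yet only one of them generalizes; that is bad generalization from too few data points. Overfitting would require E_{\rm out} to deteriorate as E_{\rm in} improves, which is not what happens here.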
__________________
Where everyone thinks alike, no one thinks very much