Doubt in Lecture 11
Posted by a.sanyal902 (Member; joined Apr 2013; 11 posts) on 05-25-2013, 11:08 AM

A previous thread (here) discussed this, but I still have a nagging doubt.
Suppose the complexity of our hypothesis set matches that of the target function (or the set contains the target function), so there is no deterministic noise. Moreover, assume there is no stochastic noise either.

However, because the data set is finite, we may still fail to generalize well. Is this still called overfitting? We said the algorithm overfits when it selects a hypothesis that fits the "noise", stochastic or deterministic, but there is no noise in the example above. We might call it variance, since there are many candidate hypotheses and few data points, but are we "overfitting"?
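To make the scenario concrete, here is a small Python sketch (my own construction, not from the lecture). It assumes a cubic target, a hypothesis set of cubics (so the target is realizable and there is no deterministic noise), noiseless labels, and only N = 3 training points, fewer than the 4 parameters, fitted by minimum-norm least squares:

import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # The target f is itself a cubic, so it lies inside the hypothesis set.
    return 1.0 - 2.0 * x + 0.5 * x**2 + 3.0 * x**3

def fit_cubic(x, y):
    # Least-squares cubic fit; with N < 4 points the system is
    # underdetermined and lstsq returns the minimum-norm perfect fit.
    X = np.vander(x, 4, increasing=True)   # columns: 1, x, x^2, x^3
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def e_out(w, n_test=10_000):
    # Squared-error E_out estimated on fresh test points.
    x = rng.uniform(-1, 1, n_test)
    X = np.vander(x, 4, increasing=True)
    return np.mean((X @ w - target(x)) ** 2)

N = 3  # fewer points than parameters: data cannot pin down the hypothesis
errors = []
for _ in range(1000):
    x = rng.uniform(-1, 1, N)
    y = target(x)          # noiseless labels: no stochastic noise
    w = fit_cubic(x, y)
    errors.append(e_out(w))

print(f"mean E_out over runs: {np.mean(errors):.3f}")
print(f"std  E_out over runs: {np.std(errors):.3f}")

On every run E_in is exactly 0, yet E_out is typically large and fluctuates a lot from data set to data set. That is exactly the situation I am asking about: zero noise of either kind, perfect in-sample fit, poor generalization driven purely by variance.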