  #2  
05-11-2012, 03:58 PM
AqibEjaz
Junior Member
 
Join Date: May 2012
Posts: 7
Re: doubt in lecture 11, deterministic noise

This is indeed confusing, and after spending some time thinking about it I believe I have finally understood it (I hope). Deterministic noise is essentially the bias of the hypothesis set: the part of the target function that the model cannot capture no matter how much data it sees. So a more complex model will indeed have less deterministic noise (smaller bias), but that does not imply it will also have a smaller Eout. The expected Eout decomposes roughly into bias plus variance (plus any stochastic noise), and for small N the variance of the more complex model is large, so its Eout ends up large. If the sample size N is sufficiently large, however, both the variance and the bias (the deterministic noise) of the complex model are small, and in that case it outperforms the simpler model.

So the lesson learnt is: a complex model is better than a simple model provided we have sufficient data. For small data sets, complex models overfit and it is better to choose a simple model.
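Here is a minimal numerical sketch of that picture (my own illustration, not from the lecture; the sin target, the polynomial degrees, and the sample sizes are just assumptions picked for demonstration). A fixed, noiseless but complicated target is fit by a simple (degree-2) and a complex (degree-10) polynomial, for a small and a large N, and the average out-of-sample squared error is estimated over many random training sets.

[CODE]
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Noiseless but complex target: the part of it that a given hypothesis
    # set cannot capture is the deterministic noise for that hypothesis set.
    return np.sin(4 * np.pi * x)

def avg_eout(degree, n_train, n_trials=200):
    # Average squared out-of-sample error of a degree-`degree` polynomial
    # least-squares fit, over n_trials independent training sets of size n_train.
    # (np.polyfit may warn about conditioning for the high-degree fit.)
    x_test = np.linspace(-1, 1, 1000)
    y_test = target(x_test)
    errs = []
    for _ in range(n_trials):
        x = rng.uniform(-1.0, 1.0, n_train)
        coeffs = np.polyfit(x, target(x), degree)
        errs.append(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    return float(np.mean(errs))

for n in (15, 500):
    print(f"N={n:3d}  Eout simple (deg 2): {avg_eout(2, n):8.3f}   "
          f"Eout complex (deg 10): {avg_eout(10, n):8.3f}")
[/CODE]

With a setup like this you should see the complex model come out far worse than the simple one for the small sample (its variance dominates) and better for the large sample (both its bias and its variance are then small), which is exactly the trade-off described above.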