LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Homework 6 (http://book.caltech.edu/bookforum/forumdisplay.php?f=135)
-   -   Doubt in Lecture 11 (http://book.caltech.edu/bookforum/showthread.php?t=4314)

a.sanyal902 05-25-2013 11:08 AM

Doubt in Lecture 11
 
There was a previous thread (here) which discussed this, but I still had a nagging doubt.
Say the complexity of our hypothesis set matches that of the target function (or the set includes the target function). So, there is no deterministic noise. Moreover, let us assume there is no stochastic noise either.

However, due to a finite data set, we may still not be able to generalize very well. Is this still called overfitting? We referred to overfitting when the algorithm tries to select a hypothesis which fits the "noise", stochastic or deterministic. But there is no noise in the example above. We may call it variance, because we have many possible choices and few data points, but are we "overfitting"?
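
To make it concrete, here is a minimal sketch of the situation I mean (the degree-10 target, the sample size, and the other parameters are just illustrative choices): the target is itself in the degree-10 hypothesis set and the labels are noiseless, yet 5 data points leave the fit underdetermined.

Code:

import numpy as np

rng = np.random.default_rng(0)

# Target: a fixed degree-10 polynomial; the hypothesis set is also
# degree-10 polynomials, so the target is in the set (no deterministic
# noise) and the labels carry no stochastic noise.
f = np.polynomial.Polynomial(rng.standard_normal(11))

# Only 5 noiseless points for an 11-parameter model: the fit is
# underdetermined, and least squares returns one of many interpolants.
# (numpy may warn that the fit is rank-deficient; that is the point.)
x_train = rng.uniform(-1, 1, 5)
y_train = f(x_train)
g = np.polynomial.Polynomial.fit(x_train, y_train, deg=10)

x_test = rng.uniform(-1, 1, 10_000)
E_in = np.mean((g(x_train) - y_train) ** 2)
E_out = np.mean((g(x_test) - f(x_test)) ** 2)
print(f"E_in  = {E_in:.2e}")   # essentially 0: the data are reproduced exactly
print(f"E_out = {E_out:.2e}")  # typically much larger: poor generalization from variance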

Elroch 05-25-2013 03:57 PM

Re: Doubt in Lecture 11
 
Quote:

Originally Posted by a.sanyal902 (Post 10951)
There was a previous thread (here) which discussed this, but I still had a nagging doubt.
Say the complexity of our hypothesis set matches that of the target function (or the set includes the target function). So, there is no deterministic noise. Moreover, let us assume there is no stochastic noise either.

However, due to a finite data set, we may still not be able to generalize very well. Is this still called overfitting? We referred to overfitting when the algorithm tries to select a hypothesis which fits the "noise", stochastic or deterministic. But there is no noise in the example above. We may call it variance, because we have many possible choices and few data points, but are we "overfitting"?

Firstly, you can have deterministic noise even if the exact target function is in the hypothesis set. The definition is based on the difference between an average hypothesis and the target function, and this average hypothesis is not definable in terms of the hypothesis set alone: it requires a set D of possible data sets, a probability distribution \Psi on D, and an algorithm M which associates a hypothesis with each element of D. The average hypothesis is then the function whose value at each point is the average, over D with respect to \Psi, of the values of the hypotheses M generates.
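
In symbols (one way to write it, using g^{(D)} = M(D) for the hypothesis M produces from a data set D drawn according to \Psi):

\bar{g}(x) = \mathbb{E}_{D \sim \Psi}\left[ g^{(D)}(x) \right]

Deterministic noise then measures the discrepancy between \bar{g} and the target f, e.g. pointwise as \left( \bar{g}(x) - f(x) \right)^2.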

Even if there is no deterministic noise, this certainly doesn't preclude the possibility of overfitting: overfitting merely means that, by comparison with some other machine M', the machine M gives a lower in-sample error E_{\rm in} but a greater out-of-sample error E_{\rm out}.
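
For instance (a rough sketch in the spirit of the overfitting experiment from the lectures; the target, noise level, and sample size are my own illustrative choices):

Code:

import numpy as np

rng = np.random.default_rng(1)

# A degree-10 target with noisy labels, fit by two machines:
# M' (degree-2 least squares) and M (degree-10 least squares).
f = np.polynomial.Polynomial(rng.standard_normal(11))
x_train = rng.uniform(-1, 1, 15)
y_train = f(x_train) + 0.5 * rng.standard_normal(15)   # stochastic noise
x_test = rng.uniform(-1, 1, 10_000)

for deg in (2, 10):
    g = np.polynomial.Polynomial.fit(x_train, y_train, deg=deg)
    E_in = np.mean((g(x_train) - y_train) ** 2)
    E_out = np.mean((g(x_test) - f(x_test)) ** 2)
    print(f"degree {deg:2d}: E_in = {E_in:.3f}, E_out = {E_out:.3f}")

# Typically the degree-10 fit has the lower E_in but the higher E_out:
# M beats M' in sample and loses to it out of sample, which is
# overfitting in the comparative sense above.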

yaser 05-25-2013 10:39 PM

Re: Doubt in Lecture 11
 
Quote:

Originally Posted by a.sanyal902 (Post 10951)
Say the complexity of our hypothesis set matches that of the target function (or the set includes the target function). So, there is no deterministic noise. Moreover, let us assume there is no stochastic noise either.

However, due to a finite data set, we may still not be able to generalize very well. Is this still called overfitting? We referred to overfitting when the algorithm tries to select a hypothesis which fits the "noise", stochastic or deterministic. But there is no noise in the example above. We may call it variance, because we have many possible choices and few data points, but are we "overfitting"?

Since the model is fixed (and assuming no "early stopping" within this model), this is just bad generalization rather than overfitting. Recall that overfitting means getting a worse E_{\rm out} as you get a better E_{\rm in}, not just getting a bad E_{\rm out} in absolute terms.
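
For a numerical illustration of the distinction (a sweep with illustrative parameter choices, reusing the noisy degree-10 setup from earlier in the thread): at any one degree, a bad E_{\rm out} by itself is just bad generalization; overfitting is the regime where increasing the complexity keeps improving E_{\rm in} while E_{\rm out} gets worse.

Code:

import numpy as np

rng = np.random.default_rng(2)

# Noisy degree-10 target; sweep the model complexity and watch both
# errors.  The overfitting regime is where E_in keeps improving while
# E_out deteriorates -- not merely where E_out is bad.
f = np.polynomial.Polynomial(rng.standard_normal(11))
x_train = rng.uniform(-1, 1, 15)
y_train = f(x_train) + 0.5 * rng.standard_normal(15)
x_test = rng.uniform(-1, 1, 10_000)

for deg in range(1, 13):
    g = np.polynomial.Polynomial.fit(x_train, y_train, deg=deg)
    E_in = np.mean((g(x_train) - y_train) ** 2)
    E_out = np.mean((g(x_test) - f(x_test)) ** 2)
    print(f"degree {deg:2d}: E_in = {E_in:.3f}  E_out = {E_out:.3f}")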

