Problem 1.10: Expected Off-Training-Set Error
Hi,
If I understand it correctly, in a noiseless setting with a fixed data set D, if all target functions f are equally likely, then the expected off-training-set error of any hypothesis h is 0.5 (part (d) of Problem 1.10, page 37), and hence any two learning algorithms are equivalent in terms of expected off-training-set error (part (e) of the same problem).
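To convince myself of part (d), I wrote a small Python sketch (my own illustration, not from the book) that enumerates every possible binary target on a handful of off-training points and averages the off-training-set error of one fixed hypothesis:

```python
# Sketch (my own, not from the book): in the noiseless binary setup of
# Problem 1.10, enumerate all 2^K targets on K off-training points and
# average the off-training-set error of a fixed hypothesis h.
from itertools import product

K = 5                      # number of off-training points (arbitrary choice)
h = [+1] * K               # any fixed hypothesis' predictions on those points

errors = []
for f in product([-1, +1], repeat=K):   # every possible target on the K points
    err = sum(h_i != f_i for h_i, f_i in zip(h, f)) / K
    errors.append(err)

print(sum(errors) / len(errors))        # prints 0.5, regardless of the choice of h
```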
My question is: doesn't this contradict the generalization guarantee from the Hoeffding inequality? Specifically, the following point is bothering me.
By Hoeffding: even for a large (finite) number of hypotheses, E_in approaches E_out (i.e., they differ by at most a small epsilon with high probability) once N grows sufficiently large. This would imply that expected(E_out) should be approximately the same as expected(E_in), and not a constant (0.5).
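For reference, the bound I have in mind is the finite-hypothesis-set version from Chapter 1 (if I am reading it correctly), where M is the number of hypotheses and g is the final hypothesis:

\[
\Pr\big[\,|E_{\mathrm{in}}(g) - E_{\mathrm{out}}(g)| > \epsilon\,\big] \;\le\; 2M e^{-2\epsilon^2 N}
\]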
Can you please provide some insight into this? Perhaps my comparison is erroneous.
Thanks.