Quote:
Originally Posted by matthijs
I'm having trouble understanding the role of noise. The generalization bound depends on N, the VC dimension of H, and delta.
I notice that in later lecture slides, noise forms an explicit term in the bias-variance decomposition, i.e. more noise increases the expected E_out (apologies for referring to slides that haven't been discussed yet).
Why doesn't it feature in the generalization bound? Is it because it is captured in the E_in term, i.e. more noise will increase our training error?

Your understanding is correct. Noise increases both E_in and E_out. Generalization error is the difference between the two. The more critical impact of noise, that of overfitting, will be discussed in Lecture 11.