04-17-2013, 11:21 AM
yaser
Re: The role of noise

Originally Posted by matthijs
I'm having trouble understanding the role of noise. The generalization bound depends on N, the VC dimension of H, and delta.

I notice that in later lecture slides, noise appears as an explicit term in the bias-variance decomposition, i.e., more noise increases the expected E_out (apologies for referring to slides that haven't been discussed yet).

Why doesn't it feature in the generalization bound? Is it because it is captured in the E_in term, i.e. more noise will increase our training error?
Your understanding is correct. Noise increases both E_{\rm in} and E_{\rm out}, while the generalization bound controls only the difference between the two, so noise does not need to appear in the bound explicitly. The more critical impact of noise, namely overfitting, will be discussed in Lecture 11.
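For reference, the decomposition mentioned in the question can be sketched as follows (this is the standard squared-error version, assuming a noisy target y = f(\mathbf{x}) + \epsilon with \epsilon zero-mean and variance \sigma^2, and \bar{g}(\mathbf{x}) = \mathbb{E}_{\mathcal{D}}[g^{(\mathcal{D})}(\mathbf{x})] the average hypothesis over data sets):

```latex
\mathbb{E}_{\mathcal{D},\epsilon}\!\left[\big(g^{(\mathcal{D})}(\mathbf{x}) - y\big)^2\right]
  = \underbrace{\big(\bar{g}(\mathbf{x}) - f(\mathbf{x})\big)^2}_{\text{bias}}
  \;+\; \underbrace{\mathbb{E}_{\mathcal{D}}\!\left[\big(g^{(\mathcal{D})}(\mathbf{x}) - \bar{g}(\mathbf{x})\big)^2\right]}_{\text{var}}
  \;+\; \underbrace{\sigma^2}_{\text{noise}}
```

The \sigma^2 term is irreducible: it inflates the expected E_{\rm out} no matter which hypothesis is chosen, which is why it shows up in this decomposition but cancels out of a bound on E_{\rm out} - E_{\rm in}.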
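The point that noise inflates E_{\rm in} and E_{\rm out} together can be checked numerically. Below is a minimal sketch (not from the lecture) using numpy: it fits a line by least squares to samples of a hypothetical target f(x) = x, with and without additive Gaussian noise, and averages the squared in-sample and out-of-sample errors over many data sets.

```python
import numpy as np

rng = np.random.default_rng(0)

def in_out_errors(sigma, n_train=50, n_test=5000, n_trials=200):
    """Average (E_in, E_out) of a least-squares line fit to noisy
    samples of the target f(x) = x, with noise level sigma."""
    e_in, e_out = [], []
    for _ in range(n_trials):
        # training set: y = x + noise
        x = rng.uniform(-1, 1, n_train)
        y = x + sigma * rng.standard_normal(n_train)
        a, b = np.polyfit(x, y, 1)          # fit y ~ a*x + b
        e_in.append(np.mean((a * x + b - y) ** 2))
        # fresh test set drawn from the same noisy target
        xt = rng.uniform(-1, 1, n_test)
        yt = xt + sigma * rng.standard_normal(n_test)
        e_out.append(np.mean((a * xt + b - yt) ** 2))
    return np.mean(e_in), np.mean(e_out)

quiet = in_out_errors(sigma=0.0)   # noiseless target
noisy = in_out_errors(sigma=0.5)   # noisy target
```

With sigma = 0 both errors are essentially zero; with sigma = 0.5 both rise to roughly sigma^2, while their gap (the generalization error) stays small in both cases.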