04-17-2013, 08:31 AM
matthijs (Junior Member; Join Date: Jul 2012; Posts: 1)

The role of noise

I'm having trouble understanding the role of noise. The generalization bound depends only on N, the VC dimension of H, and the tolerance delta.
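To make sure I'm reading the bound correctly, here is a small sketch of how I would evaluate it numerically. This is my own code, not anything from the course, and it uses the polynomial bound m_H(N) <= N^d_vc + 1 on the growth function from the lectures. Noise appears nowhere in it:

import math

def vc_bound_term(N, d_vc, delta):
    """Error bar of the VC bound: sqrt((8/N) * ln(4 * m_H(2N) / delta))."""
    growth = (2 * N) ** d_vc + 1  # polynomial bound on m_H(2N)
    return math.sqrt((8.0 / N) * math.log(4.0 * growth / delta))

# Only N, d_vc, and delta enter; e.g.:
print(vc_bound_term(N=10000, d_vc=3, delta=0.05))  # ~0.165

So with N = 10000 points and d_vc = 3, the bound says E_out <= E_in + 0.165 or so, regardless of how noisy the data is.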

I notice that in later lecture slides, noise appears as an explicit term in the bias-variance decomposition, i.e., more noise increases the expected E_out (apologies for referring to slides that haven't been discussed yet).
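Writing out what I think those slides say, for squared error with noisy targets y = f(x) + epsilon, where epsilon has zero mean and variance sigma^2:

\mathbb{E}_{\mathcal{D}}\big[E_{\text{out}}(g^{(\mathcal{D})})\big]
  = \underbrace{\sigma^2}_{\text{noise}}
  + \underbrace{\mathbb{E}_x\big[(\bar{g}(x) - f(x))^2\big]}_{\text{bias}}
  + \underbrace{\mathbb{E}_{x,\mathcal{D}}\big[(g^{(\mathcal{D})}(x) - \bar{g}(x))^2\big]}_{\text{variance}}

So sigma^2 is an irreducible floor on the expected E_out: no amount of data removes it.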

Why doesn't noise feature in the generalization bound? Is it because it is captured in the E_in term, i.e., more noise will increase our training error? In earlier lectures, N was written in terms of the growth function to see how much data we need, and the rule of thumb N >= 10 * d_vc was given. I'd like to understand quantitatively how our need for data grows with noise, but I don't see how to do that using either the generalization bound or the bias-variance decomposition.
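The closest I can get on my own is to invert the bound: given a target error bar epsilon, find the smallest N that achieves it. This is a hypothetical sketch (the function name, the target epsilon, and the fixed-point iteration are all mine, not from the course); N appears on both sides of the inequality, so I iterate:

import math

def sample_complexity(d_vc, delta, epsilon, n_iter=20):
    """Roughly the smallest N with sqrt((8/N) * ln(4 * m_H(2N) / delta)) <= epsilon."""
    N = 1000.0                        # initial guess
    for _ in range(n_iter):           # fixed-point iteration on the bound
        growth = (2 * N) ** d_vc + 1  # polynomial bound on m_H(2N)
        N = (8.0 / epsilon ** 2) * math.log(4.0 * growth / delta)
    return math.ceil(N)

print(sample_complexity(d_vc=3, delta=0.05, epsilon=0.10))  # ~30000
print(sample_complexity(d_vc=3, delta=0.05, epsilon=0.05))  # ~134000

But even here, noise enters only indirectly: if it inflates E_in, I would need a smaller epsilon to reach the same E_out, and hence more data. Is that the right way to think about it, or is there a more direct quantitative link between noise and the amount of data required?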