#3  07-18-2012, 05:24 AM
itooam (Senior Member; Join Date: Jul 2012; Posts: 100)

Re: Hoeffding’s inequality

Going back to my questions, I now feel I grasp the principles. Because the right-hand side of the inequality doesn't depend on Eout (or Ein), it bounds the probability of poor generalisation using only N, epsilon and M: a large N (big sample size) will tend to mean better generalisation, whereas a large M has the adverse effect, weakening the guarantee. But M will be infinite when we deal with learning techniques such as perceptron learning.
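
For reference, here is the form of the bound as I understand it from the lectures (please correct me if I have written it down wrongly):

P\big[\,|E_{\text{in}}(g) - E_{\text{out}}(g)| > \epsilon\,\big] \;\le\; 2M\,e^{-2\epsilon^{2}N}

The exponential factor shrinks as N grows, while M just multiplies the bound up; for example, with N = 1000, epsilon = 0.1 and M = 10 the right-hand side is 2 \cdot 10 \cdot e^{-20} \approx 4 \times 10^{-8}.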

Because we were told M will be infinite for "learning techniques", I assume we will be shown a way to get round this problem in future lectures?

I believe epsilon gives us a tolerance (a confidence-interval width) on how closely the sample quantity Ein tracks Eout over the overall population. Please could somebody confirm the following:

If I set epsilon to, say, 0.01 in Hoeffding’s inequality, is this equivalent to asking "what is the probability that Ein DOES NOT generalise to Eout by at least 99%, given the parameters N and M"?
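
In symbols, I think what I am asking is whether setting epsilon = 0.01 gives

P\big[\,|E_{\text{in}}(g) - E_{\text{out}}(g)| > 0.01\,\big] \;\le\; 2M\,e^{-2(0.01)^{2}N} = 2M\,e^{-0.0002\,N}

i.e. a bound on the probability that Ein misses Eout by more than 0.01, given N and M.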

Last edited by itooam; 07-18-2012 at 05:34 AM. Reason: incorrect wording in my last paragraph