cls2k — 02-22-2013, 02:51 PM
Hoeffding inequality for multiple classifiers

I'm having some trouble understanding how Hoeffding's inequality applies to the case of multiple classifiers (bins). Shouldn't the final picked hypothesis g* still be bound by Hoeffding's inequality, since it's just like any other hypothesis in the set? How does the process of picking the hypothesis based on the data affect the Hoeffding bound? What if I pick the worst hypothesis instead of the best one? Shouldn't Hoeffding's bound apply to that too?

While I understand the mathematics behind the union bound, it seems unintuitive that the bound on g* should be a union of all the bounds on the h's in the set, since the final g* has nothing to do with the other, unpicked hypotheses.
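To make the question concrete, here is a small simulation sketch of the bins setup (my own, with made-up parameters N, M, and eps, not from the course). It compares a coin fixed *before* seeing the data with the coin picked for the lowest frequency of heads, against the single-hypothesis Hoeffding bound:

```python
import math
import random

random.seed(0)

N = 10        # flips per coin (sample size)
M = 1000      # number of coins (hypotheses)
eps = 0.4     # deviation tolerance
trials = 500  # repetitions of the whole experiment

# Single-hypothesis Hoeffding bound: P[|nu - mu| > eps] <= 2 exp(-2 eps^2 N)
hoeffding = 2 * math.exp(-2 * eps**2 * N)

bad_fixed = 0   # fixed coin deviates from mu = 0.5 by more than eps
bad_picked = 0  # data-selected (min-frequency) coin deviates by more than eps
for _ in range(trials):
    # nu = fraction of heads for each fair coin (true mu = 0.5 for all)
    nus = [sum(random.random() < 0.5 for _ in range(N)) / N for _ in range(M)]
    if abs(nus[0] - 0.5) > eps:    # coin chosen before looking at the data
        bad_fixed += 1
    if abs(min(nus) - 0.5) > eps:  # coin picked for the lowest "Ein"
        bad_picked += 1

print(f"Hoeffding bound:    {hoeffding:.3f}")
print(f"P[bad] fixed coin:  {bad_fixed / trials:.3f}")
print(f"P[bad] picked coin: {bad_picked / trials:.3f}")
```

Analytically, with these settings the fixed coin's failure rate is about 0.002, comfortably under the bound of about 0.08, while the picked coin fails with probability about 0.62 — far above the single-hypothesis bound. That gap is what the union bound has to cover.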

I do understand the coin example: the chance of getting 10 heads in a row for one coin is very low, but it's actually high if you repeat the experiment with 1000 coins. However, I'm unsure how this relates to the learning scenario. Getting 10 heads on a sample would be equivalent to getting an Ein of 0, but it's mentioned again and again that this is a "bad" event. How does this relate to the Hoeffding bound?
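For the numbers in that example (assuming 1000 fair coins flipped 10 times each), the arithmetic works out as:

```python
# Chance of 10/10 heads for one fair coin, vs. for at least one of 1000 coins.
p_one = 0.5 ** 10                # ~0.001: rare for any single, fixed coin
p_any = 1 - (1 - p_one) ** 1000  # ~0.62: likely somewhere among 1000 coins
print(f"one coin: {p_one:.5f}   at least one of 1000: {p_any:.3f}")
```

So the same event that is negligible for one fixed coin becomes more likely than not once you get to pick over 1000 of them.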

Any insight into this will be greatly appreciated.