Quote:
Originally Posted by grozhd
I have the same concern as scottedwards2000 and I still don't understand how it is resolved.
As I understand it, the bin symbolizes the probability space of all possible inputs X. The sample of balls drawn randomly from the bin symbolizes our training set D.
Now we pick a hypothesis h_1 (suppose we are running PLA). We look at our sample D, compute \nu_1 and use Hoeffding's inequality. We do one step of PLA and come up with a new hypothesis h_2, which automatically gives us \nu_2, and the professor is saying that we can write down Hoeffding's inequality for h_2 and \nu_2?
I guess we can. But that inequality tells us something about the random variable \nu_2 : X^N \rightarrow [0;1], i.e. about P[\,|\nu_2 - \mu_2| > \epsilon\,] \le 2e^{-2\epsilon^2 N}, where \nu_2 is evaluated on a randomly drawn sample. But it seems like we are using \nu_2(D), where D is hardly random with regard to h_2, since we built h_2 using that very sample.
Here is an example that illustrates my point: say we tried some random h_1, compared it with the target function f on our training sample D, and wrote down Hoeffding's inequality. Now let's construct h_2 as follows: h_2(x) = f(x) for x \in D and h_2(x) = h_1(x) for x \notin D. Let's write down Hoeffding's inequality for this hypothesis. If we are indeed using \nu_2(D), then here it would be equal to 1 (h_2 agrees with f on every sample point, since h_2 = f on D), and Hoeffding would tell us that with high probability \mu_2 is close to 1, i.e. that the out-of-sample error of h_2 is small. So somehow we are saying with high probability that h_2 does an excellent job out of sample, though we didn't change it much from h_1. This example shouldn't be correct, right? If it isn't, how is the one with PLA correct?
It is a subtle point, so let me try to explain it in the terms you outlined. Let us take the sample \mathcal{D} (what you call D, just to follow the book notation). Now evaluate \nu for all hypotheses h in your model \mathcal{H}. We didn't start at one h and move to another. We just evaluated \nu for all h \in \mathcal{H}. The question is, does Hoeffding's inequality apply to each of these h's by itself? The answer is clearly yes, since each of them could in principle be the hypothesis you started with (which you called h_1).
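To make the "each hypothesis by itself" point concrete, here is a small Monte Carlo sketch (my own illustration, not from the book or the lectures): treat one fixed hypothesis as a bin whose unknown error probability is mu, draw many independent samples of size N, and check that the observed frequency of |nu - mu| > eps stays below the single-bin bound 2e^{-2 eps^2 N}. The specific values of mu, N, eps and n_trials are arbitrary choices for this toy experiment.

```python
import numpy as np

# One FIXED hypothesis, viewed as a bin:
#   mu = its out-of-sample error (probability of drawing a "red marble")
#   nu = its in-sample error on a random sample of size N
rng = np.random.default_rng(0)

mu, N, eps = 0.3, 100, 0.1
n_trials = 100_000

# Draw n_trials independent samples of size N and compute nu for each one.
errors = rng.random((n_trials, N)) < mu      # True where the hypothesis errs
nu = errors.mean(axis=1)                     # in-sample error frequency per sample

empirical = np.mean(np.abs(nu - mu) > eps)   # estimate of P[|nu - mu| > eps]
bound = 2 * np.exp(-2 * eps**2 * N)          # single-bin Hoeffding bound

print(f"empirical P[|nu - mu| > eps]   : {empirical:.4f}")
print(f"Hoeffding bound 2e^(-2 eps^2 N): {bound:.4f}")
# The empirical probability sits well below the bound, and the same would be
# true for any other fixed hypothesis (any other mu) checked this way.
```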
Hoeffding states what the probabilities are before the sample is drawn. When you choose one of these hypotheses because of its small \nu, as in the scenario you point out, the probability that applies now is conditioned on the sample having small \nu. We can try to get a conditional version of Hoeffding to deal with the situation, or we can try to get a version of Hoeffding that applies regardless of which h we choose and how we choose it. The latter is what we did using the union bound.
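For reference, here is the union-bound step written out in the \nu/\mu notation of this thread, as a sketch of that argument; the finite hypothesis set \mathcal{H} = \{h_1, \dots, h_M\} is an assumption made for the illustration, and g denotes whichever h_m the learning algorithm picks after seeing the data:

\begin{align*}
\mathbb{P}\big[\,|\nu_g - \mu_g| > \epsilon\,\big]
  &\le \mathbb{P}\big[\,|\nu_1 - \mu_1| > \epsilon \ \text{ or } \ \cdots \ \text{ or } \ |\nu_M - \mu_M| > \epsilon\,\big] \\
  &\le \sum_{m=1}^{M} \mathbb{P}\big[\,|\nu_m - \mu_m| > \epsilon\,\big]
   \;\le\; 2 M e^{-2\epsilon^2 N}.
\end{align*}

The first inequality is what makes the bound indifferent to how g is chosen: whichever hypothesis the data leads us to, its "bad event" is contained in the union of the bad events of the fixed h_m's, and each of those obeys the single-bin Hoeffding.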
Finally, taking the example you illustrated, any hypothesis you use has to be in \mathcal{H} (which is decided before the sample is drawn). The one you constructed is not guaranteed to be in \mathcal{H}. Of course you can guarantee that it is in \mathcal{H} by taking \mathcal{H} to be the set of all possible hypotheses, but in this case \mathcal{H} is thoroughly infinite (M = \infty), and the multiple-bin Hoeffding does not guarantee anything at all.
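To see numerically why choosing a hypothesis because of its small \nu breaks the single-bin guarantee while the multiple-bin (union-bound) version survives, here is a toy simulation of my own, in the spirit of the coin-flipping exercise in the book: M hypotheses that are all equally useless out of sample (\mu = 0.5), and on every trial we report the one with the smallest in-sample \nu. The values of M, N, eps and n_trials are arbitrary choices for this sketch.

```python
import numpy as np

# M hypotheses, each with true out-of-sample error mu = 0.5 (none is any good).
# On every trial we draw ONE sample of size N and pick the hypothesis with the
# smallest in-sample error -- exactly the "chosen because nu is small" scenario.
rng = np.random.default_rng(1)

M, N, eps = 1000, 20, 0.3
mu = 0.5
n_trials = 2_000

single_bin_bound = 2 * np.exp(-2 * eps**2 * N)           # valid for one FIXED hypothesis
union_bound = min(2 * M * np.exp(-2 * eps**2 * N), 1.0)  # valid however the choice is made

violations = 0
for _ in range(n_trials):
    nu = (rng.random((M, N)) < mu).mean(axis=1)   # in-sample error of every hypothesis
    nu_selected = nu.min()                        # the one we would be tempted to report
    if abs(nu_selected - mu) > eps:
        violations += 1

print(f"P[|nu_selected - mu| > eps] ~ {violations / n_trials:.3f}")
print(f"single-bin Hoeffding bound  = {single_bin_bound:.3f}")
print(f"multiple-bin (union) bound  = {union_bound:.3f}")
# The selected hypothesis blows straight past the single-bin bound, because the
# probability that applies to it is conditioned on its nu being small.  It can
# never exceed the union bound, which holds no matter how the selection is made.
```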