LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Chapter 5 - Three Learning Principles (http://book.caltech.edu/bookforum/forumdisplay.php?f=112)
-   -   Problem 5.1b (http://book.caltech.edu/bookforum/showthread.php?t=4459)

 Sandes 11-10-2013 03:25 PM

Problem 5.1b

I get as far as this
P ≥ 1 - 4 mH(N)^2 e^(-N/32)

but I'm not certain how to go about showing that P ≥ 1 - mH(N)/2^N.

Maybe I have totally the wrong approach:
I started with
Eout(h) ≤ Ein(h) + sqrt((8/N) ln(4 mH(N)^2 / δ)), which holds with probability ≥ 1 - δ.
So perhaps there's another formula that would work better. I feel like I'm making a silly mistake somewhere but I'm not sure what it could be.
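For reference, the N/32 exponent above is consistent with taking ε = 1/2 in the VC generalization bound and using the fact that a dichotomy on 2N points restricts to one dichotomy on each half, so mH(2N) ≤ mH(N)^2:

```latex
% Setting \epsilon = 1/2 in the VC bound and using m_H(2N) \le m_H(N)^2:
\Pr\!\big[\,|E_{\text{in}}(g) - E_{\text{out}}(g)| > \tfrac{1}{2}\,\big]
  \;\le\; 4\, m_H(2N)\, e^{-\epsilon^2 N / 8}
  \;\le\; 4\, m_H(N)^2\, e^{-N/32}.
```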

 magdon 11-17-2013 10:51 AM

Re: Problem 5.1b

If the data is generated from a random (arbitrary) target function, then every dichotomy is equally likely. Since you can implement at most mH(N) of these dichotomies, the number of dichotomies you cannot implement is at least 2^N - mH(N). Each of these dichotomies you cannot implement has probability 1/2^N, so the probability that the target is a dichotomy you cannot implement is at least (2^N - mH(N))/2^N = 1 - mH(N)/2^N.
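This counting argument can be sanity-checked numerically. A minimal sketch, assuming a toy hypothesis set that implements m = 6 distinct dichotomies on N = 5 points (the particular set is hypothetical; any set of m distinct dichotomies gives the same count):

```python
import random

def failure_probability(N, dichotomies, trials=100_000, seed=0):
    """Estimate the probability that a uniformly random target dichotomy
    on N points is NOT among the given implementable dichotomies."""
    rng = random.Random(seed)
    implementable = set(dichotomies)
    misses = 0
    for _ in range(trials):
        # Random target function => each of the 2^N dichotomies equally likely
        target = tuple(rng.randint(0, 1) for _ in range(N))
        if target not in implementable:
            misses += 1
    return misses / trials

N = 5
# Toy hypothesis set: 6 distinct dichotomies chosen from all 2^N of them
all_dichotomies = [tuple((i >> k) & 1 for k in range(N)) for i in range(2 ** N)]
implementable = random.Random(1).sample(all_dichotomies, 6)

est = failure_probability(N, implementable)
exact = 1 - len(set(implementable)) / 2 ** N  # = 1 - m/2^N for distinct dichotomies
```

The estimate converges to 1 - m/2^N = 1 - 6/32, matching the counting argument.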


 samarhabibi 01-07-2018 01:50 AM

Re: Problem 5.1b

I just want to say thank you; my problem is solved.
