LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Chapter 5 - Three Learning Principles (http://book.caltech.edu/bookforum/forumdisplay.php?f=112)
-   -   Problem 5.1b (http://book.caltech.edu/bookforum/showthread.php?t=4459)

Sandes 11-10-2013 04:25 PM

Problem 5.1b
 
I get as far as this:
P ≥ 1 - 4 mH(N)^2 / e^(N/32)

but I'm not certain how to go about showing that P ≥ 1 - mH(N)/2^N.

Maybe I have totally the wrong approach. I started with
Eout(h) ≤ Ein(h) + sqrt((8/N) ln(4 mH(N)^2/δ)), which holds with probability P ≥ 1 - δ.
So perhaps there's another formula that would work better. I feel like I'm making a silly mistake somewhere, but I'm not sure what it could be.
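For what it's worth, here is a quick numerical comparison of the two bounds (my own sanity check, assuming mH(N) = N + 1 as an example growth function, e.g. positive rays; the real growth function depends on the hypothesis set):

Code:

import math

def m_H(N):
    return N + 1  # example growth function (positive rays); problem-dependent in general

for N in (10, 100, 1000):
    # VC-style lower bound I derived: P >= 1 - 4 m_H(N)^2 / e^(N/32)
    vc_style = 1 - 4 * m_H(N) ** 2 / math.exp(N / 32)
    # Bound the problem asks for: P >= 1 - m_H(N) / 2^N
    target = 1 - m_H(N) / 2 ** N
    print(f"N={N:5d}  VC-style bound: {vc_style:12.4g}   target bound: {target:.10f}")

For small N the VC-style bound is vacuous (negative), which makes me suspect it cannot give the 1 - mH(N)/2^N result directly.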

magdon 11-17-2013 11:51 AM

Re: Problem 5.1b
 
If the data is generated by a random (arbitrary) target function, then every dichotomy is equally likely. Since you can implement at most mH(N) of these dichotomies, the number of dichotomies you cannot implement is at least 2^N - mH(N). Each of the dichotomies you cannot implement has probability 1/2^N, so the probability of getting a data set you cannot fit is at least (2^N - mH(N))/2^N = 1 - mH(N)/2^N, which is exactly the bound you want.
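For concreteness, here is a small sketch of this counting argument in Python, using positive rays (for which mH(N) = N + 1) as an example hypothesis set; the hypothesis set and the value of N are chosen purely for illustration:

Code:

import random

# Example hypothesis set: positive rays h(x) = sign(x - a), whose growth
# function is m_H(N) = N + 1.
N = 10
points = sorted(random.uniform(-1, 1) for _ in range(N))

# One threshold below all points, one above, and one between each adjacent
# pair realizes every dichotomy a positive ray can produce on these points.
thresholds = ([points[0] - 1]
              + [(points[i] + points[i + 1]) / 2 for i in range(N - 1)]
              + [points[-1] + 1])
implementable = {tuple(1 if x > a else -1 for x in points) for a in thresholds}

total = 2 ** N                    # number of possible dichotomies on N points
m_H = len(implementable)          # at most the growth function value, here N + 1
cannot_fit = total - m_H          # dichotomies no hypothesis can reproduce

# A random target makes each dichotomy equally likely (probability 1/2^N), so
# P[data set cannot be fit] >= (2^N - m_H(N)) / 2^N = 1 - m_H(N) / 2^N.
print("m_H(N)           =", m_H)
print("P[cannot fit]    =", cannot_fit / total)
print("1 - m_H(N)/2^N   =", 1 - (N + 1) / total)

The last two numbers coincide here because positive rays realize exactly N + 1 dichotomies; in general mH(N) is only an upper bound on the number of dichotomies realized on a particular data set, which is where the inequality comes from.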

Quote:

Originally Posted by Sandes (Post 11605)
I get as far as this:
P ≥ 1 - 4 mH(N)^2 / e^(N/32)

but I'm not certain how to go about showing that P ≥ 1 - mH(N)/2^N.

Maybe I have totally the wrong approach. I started with
Eout(h) ≤ Ein(h) + sqrt((8/N) ln(4 mH(N)^2/δ)), which holds with probability P ≥ 1 - δ.
So perhaps there's another formula that would work better. I feel like I'm making a silly mistake somewhere, but I'm not sure what it could be.


samarhabibi 01-07-2018 02:50 AM

Re: Problem 5.1b
 
I just want to say thank you; my problem is solved.

