LFD Book Forum

LFD Book Forum (http://book.caltech.edu/bookforum/index.php)
-   Chapter 2 - Training versus Testing (http://book.caltech.edu/bookforum/forumdisplay.php?f=109)
-   -   The other side of |Eout - Ein|<= epsilon (http://book.caltech.edu/bookforum/showthread.php?t=4223)

manish 04-19-2013 06:56 PM

The other side of |Eout - Ein|<= epsilon
 
I don't understand how Eout(g) >= Ein(g) - epsilon
implies that there is no other hypothesis better than g.

yaser 04-19-2013 08:11 PM

Re: The other side of |Eout - Ein|<= epsilon
 
Quote:

Originally Posted by manish (Post 10497)
I don't understand how Eout(g) >= Ein(g) - epsilon implies that there is no other hypothesis better than g?

You are right. The statement that is relevant to the conclusion that "there is no other hypothesis better than g" is E_{\rm out} (h) \ge E_{\rm in} (h) - \epsilon for all h\in{\cal H}. Since g is one of the h's, the same relationship holds for g in particular, but that by itself says something else. The reason the "for all h" version gives the conclusion is that g was chosen to minimize the in-sample error, so for every h we have E_{\rm out} (h) \ge E_{\rm in} (h) - \epsilon \ge E_{\rm in} (g) - \epsilon; no hypothesis can do much better out of sample than what g's in-sample error already indicates. In the other direction, it is also true that E_{\rm out} (h) \le E_{\rm in} (h) + \epsilon for all h\in{\cal H}, but we write it in terms of g only, since that is sufficient for the conclusion we need in that direction.
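To see the two directions numerically, here is a small sketch (not from the thread; the coin-style model, variable names, and parameter values are illustrative assumptions): each hypothesis is modeled by a known true error E_out(h), E_in(h) is its observed error frequency on N samples, and g is the in-sample minimizer. The tolerance \epsilon = \sqrt{\frac{1}{2N}\ln\frac{2M}{\delta}} comes from Hoeffding's inequality combined with the union bound over the M hypotheses.

```python
import math
import random

random.seed(0)

# Illustrative setup (an assumption, not the book's exact construction):
# M hypotheses, each with a known true error rate e_out[h]; e_in[h] is
# the observed error frequency over N independent samples.
M, N, delta = 100, 1000, 0.05

# Hoeffding + union bound: with probability >= 1 - delta,
# |E_out(h) - E_in(h)| <= eps holds simultaneously for all M hypotheses.
eps = math.sqrt(math.log(2 * M / delta) / (2 * N))

e_out = [random.uniform(0.2, 0.8) for _ in range(M)]
e_in = [sum(random.random() < p for _ in range(N)) / N for p in e_out]

# g is chosen by minimizing the in-sample error.
g = min(range(M), key=lambda h: e_in[h])

# Direction 1 (about g alone): E_out(g) <= E_in(g) + eps, so g's small
# in-sample error does not hide a much larger true error.
# Direction 2 (about every h): E_out(h) >= E_in(h) - eps for all h;
# combined with E_in(h) >= E_in(g), no hypothesis can beat g out of
# sample by more than roughly 2 * eps.
print(f"eps = {eps:.4f}, E_in(g) = {e_in[g]:.3f}, E_out(g) = {e_out[g]:.3f}")
```

Note that the "for all h" event is exactly where the union bound (the M in the formula for eps) is needed: the one-hypothesis Hoeffding bound applied only to g would not cover the hypotheses we did not pick.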


The contents of this forum are to be used ONLY by readers of the Learning From Data book by Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin, and participants in the Learning From Data MOOC by Yaser S. Abu-Mostafa. No part of these contents is to be communicated or made accessible to ANY other person or entity.