 LFD Book Forum Hoeffding inequality for multiple hypothesis
#1
 kostya3312 Junior Member Join Date: Jan 2015 Posts: 2 Hoeffding inequality for multiple hypothesis

It's clear to me how the inequality works for each hypothesis separately, but I don't understand why we need the Hoeffding inequality for multiple hypotheses. If I have a training data set of size N, then (for a fixed tolerance ε) the Hoeffding upper bound is determined for each hypothesis. The only thing that remains is to find the hypothesis with the minimal in-sample error rate. Why do we need to consider all hypotheses simultaneously? What information does the Hoeffding inequality with the factor M in it give us? I understand the example with the coins, but I cannot relate it to the learning problem.

Sorry for my English, and thanks.
#2 magdon RPI Join Date: Aug 2009 Location: Troy, NY, USA. Posts: 595 Re: Hoeffding inequality for multiple hypothesis

Hoeffding for a single hypothesis h tells you that, with high probability, E_in(h) is close to E_out(h):

P[ |E_in(h) - E_out(h)| > ε ] ≤ 2e^{-2ε²N}.

As you point out, "The only thing that remains is to find the hypothesis with minimal in-sample rate." Why would one do this? Because one is confident that E_in is close to E_out for every hypothesis, and so if we find the hypothesis with minimum E_in, it will likely have minimum E_out. So, to be justified in picking the hypothesis with minimum E_in, we require that

|E_in(h) - E_out(h)| ≤ ε simultaneously for every h in the hypothesis set.

Equivalently, that |E_in(h) - E_out(h)| > ε for no h. The factor of M comes from using the union bound:

P[ |E_in(h_1) - E_out(h_1)| > ε or ... or |E_in(h_M) - E_out(h_M)| > ε ]
    ≤ sum over m of P[ |E_in(h_m) - E_out(h_m)| > ε ]
    ≤ 2M e^{-2ε²N}.
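The coin analogy mentioned in the thread can be sketched numerically. The following Python snippet (an illustration, not part of the original posts) estimates the probability that at least one of M fair coins shows a heads fraction far from 0.5 in N flips, and compares it with the union bound 2M e^{-2ε²N}; the chance of "some coin deviating" grows with M, which is exactly why the single-hypothesis bound is not enough:

```python
import random
from math import exp

def deviation_prob(M, N, eps, trials=2000, seed=0):
    """Monte Carlo estimate of P[ some coin among M fair coins has
    |nu - 0.5| > eps ], where nu is the heads fraction in N flips."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        for _ in range(M):
            nu = sum(rng.random() < 0.5 for _ in range(N)) / N
            if abs(nu - 0.5) > eps:
                bad += 1      # at least one coin deviated in this trial
                break
    return bad / trials

N, eps = 100, 0.1
single = 2 * exp(-2 * eps**2 * N)          # Hoeffding bound for one coin
for M in (1, 10, 100):
    est = deviation_prob(M, N, eps)
    print(f"M={M:3d}  P[some coin off by >{eps}] ~ {est:.3f}  "
          f"union bound {min(1.0, M * single):.3f}")
```

With M = 1 the deviation is rare, but with M = 100 some coin almost always deviates, even though each individual coin still obeys the single-coin bound.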
__________________
Have faith in probability
#3
 kostya3312 Junior Member Join Date: Jan 2015 Posts: 2 Re: Hoeffding inequality for multiple hypothesis

Thank you, Professor!

I do not quite understand the following:
Quote:
 Originally Posted by magdon The factor of M comes from using the union bound I thought that the goal is to get an upper bound on the probability of the event |E_in(g) - E_out(g)| > ε. That is, for learning to be feasible, the probability of this event should be small. In my opinion, the two events |E_in(g) - E_out(g)| > ε (mine) and |E_in(h) - E_out(h)| ≤ ε for every h (yours) are different events. Am I right?

My last question is as follows. The LHS of the Hoeffding inequality for M hypotheses is P[ |E_in(g) - E_out(g)| > ε ]. It implies that the event |E_in(g) - E_out(g)| > ε and the event "|E_in(h_m) - E_out(h_m)| > ε for some m" (your event, if you are right) are equal. Though I understand the meaning of the event |E_in(h_m) - E_out(h_m)| > ε, the meaning of the event |E_in(g) - E_out(g)| > ε isn't so clear to me. What does it literally mean? I think it means |E_in(h) - E_out(h)| > ε for at least one h. Am I right?
#4 magdon RPI Join Date: Aug 2009 Location: Troy, NY, USA. Posts: 595 Re: Hoeffding inequality for multiple hypothesis

Sorry, there was a typo in my previous message. Yes, they are different events, but they are very closely related. Let A be the event that |E_in(h) - E_out(h)| > ε for some hypothesis h, and let B be the event that |E_in(h) - E_out(h)| ≤ ε for every h. Then B is the complement of A, so

P[B] = 1 - P[A] ≥ 1 - 2M e^{-2ε²N}.
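The complement relation can be checked numerically. This short Python sketch (an illustration, not from the original post) estimates P[A] for the coin version of the problem and verifies that P[B] = 1 - P[A] stays above the union-bound guarantee 1 - 2M e^{-2ε²N}:

```python
import random
from math import exp

def estimate_pA(M, N, eps, trials=3000, seed=1):
    """Estimate P[A], where A = 'some coin among M fair coins deviates
    from mu = 0.5 by more than eps in N flips'."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        devs = (abs(sum(rng.random() < 0.5 for _ in range(N)) / N - 0.5)
                for _ in range(M))
        if any(d > eps for d in devs):
            hits += 1
    return hits / trials

M, N, eps = 10, 50, 0.2
pA = estimate_pA(M, N, eps)
pB = 1 - pA                                  # B is the complement of A
guarantee = 1 - 2 * M * exp(-2 * eps**2 * N)  # union-bound lower bound on P[B]
print(f"P[A] ~ {pA:.3f}   P[B] = {pB:.3f} >= {guarantee:.3f}")
```

The estimated P[B] sits comfortably above the guaranteed lower bound, as the inequality promises.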

