LFD Book Forum  

Chapter 2 - Training versus Testing
#1 - 05-09-2012, 03:27 PM
goodnight (Junior Member, Join Date: Apr 2012, Posts: 2)
Multiple bins

Hi All,

I'm a little confused about the multiple-bin analogy, so I'm trying to approach it from a practical point of view. Could you please check whether my understanding is correct?

My example is the following:
We have a large dataset provided by our customer. We choose to use 80% of the data for training an ML algorithm (say, a neural network with a fixed topology: a constant number of layers and neurons). We want to use the remaining 20% for testing, and we do not use that part of the data in the training.
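To make the setup concrete, here is roughly what I mean by the split (in Python; the dataset size and number of features are just made-up numbers for illustration):

[CODE]
import numpy as np

# Hypothetical customer dataset: X holds the inputs, y the labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))        # 1000 examples, 10 features (made up)
y = rng.integers(0, 2, size=1000)

# Shuffle once, then keep 80% for training and hold out 20% for testing.
idx = rng.permutation(len(X))
n_train = int(0.8 * len(X))
X_train, y_train = X[idx[:n_train]], y[idx[:n_train]]
X_test,  y_test  = X[idx[n_train:]], y[idx[n_train:]]

# The 20% test set is set aside and never used during training.
[/CODE]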

The training process means that we choose the neural network that produces the best result on the training data set. The hypothesis set 'H' is the set of all possible neural networks of this topology, and a hypothesis 'h(i)' is a network with one particular set of weights. N is the number of data points in the training set. For this training we can apply the Hoeffding inequality with multiple bins; M can be calculated somehow, which I'll understand later.
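To write down what I mean by 'Hoeffding with multiple bins', I believe the two forms of the inequality are roughly (in the book's notation):

P[\,|E_{\rm in}(h) - E_{\rm out}(h)| > \epsilon\,] \le 2\,e^{-2\epsilon^2 N}   (one bin: h is fixed before the data are seen)

P[\,|E_{\rm in}(g) - E_{\rm out}(g)| > \epsilon\,] \le 2\,M\,e^{-2\epsilon^2 N}   (multiple bins: g is picked out of M hypotheses using the data)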

Hypothesis 'g' is the winning neural network.

1) Is my understanding correct that the original Hoeffding inequality can be applied to 'g' on the test data set (which was not used for the training)? If the error of 'g' on the test set is close to its E(in) on the training set, we are happy and can deliver 'g' to our customer.
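To spell out what I mean in 1): once 'g' is frozen, the test set of size N_test acts like a fresh sample for a single, pre-chosen hypothesis, so I believe the plain one-bin bound applies to it, something like

P[\,|E_{\rm test}(g) - E_{\rm out}(g)| > \epsilon\,] \le 2\,e^{-2\epsilon^2 N_{\rm test}}

(writing E_test for the error of 'g' measured on the held-out 20%).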

2) My confusion is that the original one-bin Hoeffding inequality cannot be applied to 'g' on the training data!
Why not? Let's say that I don't run any training on the data set. After the customer provides me the data set, I meet a magician. The magician gives me a hypothesis 'h(i)'. He forgets to mention that if I ran the learning, my 'g' would be equal to his 'h(i)'. He tells me: 'Try this hypothesis on your data set; the E(in) will be very small.' He also tells me: 'You know Hoeffding. N is constant and epsilon is fixed, so you can calculate that, with good probability, this is a very nice hypothesis.'
Then a friend of mine comes to me and tells me that he did run the training and found that 'g' is equal to the 'h(i)' provided by the magician. That's why he suggests not applying Hoeffding to it, because he cannot do it!

What should I do?
#2 - 05-09-2012, 10:20 PM
markland (Member, Join Date: Apr 2012, Posts: 30)
Re: Multiple bins

Interesting question. I think what's wrong with #2 is the order of events. Hoeffding applies when you pick your h(i) or H first and then get the data. If you or the magician has used the data to pick h(i), then it doesn't apply. But if you took the magician's h(i) and then sampled new data points that even the magician hadn't seen yet, then one-bin Hoeffding would apply again, though of course it might turn out that Ein is no longer small on the new data sample.
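A quick way to convince yourself is the coin-flipping version of the multiple-bin picture: every 'hypothesis' is a fair coin, Ein is the fraction of heads in N flips, and Eout is 0.5 for all of them. A rough simulation sketch (the numbers M=1000, N=10 and so on are arbitrary, just for illustration):

[CODE]
import numpy as np

# Each of M "hypotheses" is a fair coin; E_in is the fraction of heads in
# N flips, and the true E_out is 0.5 for every coin.
rng = np.random.default_rng(1)
M, N, trials, eps = 1000, 10, 2000, 0.3

fixed_dev = 0    # coin fixed before the data are seen (one bin)
chosen_dev = 0   # coin with the smallest E_in, picked after seeing the data

for _ in range(trials):
    flips = rng.integers(0, 2, size=(M, N))   # 1 = heads
    e_in = flips.mean(axis=1)
    if abs(e_in[0] - 0.5) > eps:              # hypothesis chosen in advance
        fixed_dev += 1
    if abs(e_in.min() - 0.5) > eps:           # "g" selected using the data
        chosen_dev += 1

print("one-bin Hoeffding bound      :", 2 * np.exp(-2 * eps**2 * N))
print("pre-chosen coin, P[deviation]:", fixed_dev / trials)
print("best-E_in coin,  P[deviation]:", chosen_dev / trials)
[/CODE]

With numbers like these, the pre-chosen coin should stay well within the one-bin bound, while the minimum-Ein coin violates it on almost every run; picking the hypothesis after looking at the data is exactly what the extra factor of M in the multiple-bin bound is paying for.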