#1




How To Interpret Cross Entropy Error Training Versus Test?
A likelihood analysis of logistic regression yields an expression involving (1/probability), so the terminology of entropy can be applied. That much I understand. What I'm not clear on is what the calculated number means for in-sample versus out-of-sample performance. For the training data, the weights are chosen to minimize Ein(w), in accord with the likelihood viewpoint (with these weights, the training data is the most likely given the final hypothesis). OK. But does it make any sense to compare the Ein calculated for the final weights with the Eout obtained by applying those weights to the test data set?
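To make the question concrete, here is a minimal sketch of what I mean (the data-generating model, weights, and helper names are all made up for illustration): the same cross-entropy error formula, E(w) = (1/N) * sum ln(1 + exp(-y_n w.x_n)), evaluated once on the training set (Ein) and once on a test set (Eout) with the weights found by gradient descent on Ein.

```python
import numpy as np

def cross_entropy_error(w, X, y):
    # E(w) = (1/N) * sum_n ln(1 + exp(-y_n * w.x_n)), for labels y_n in {-1, +1}.
    # logaddexp(0, z) computes ln(1 + exp(z)) without overflow.
    return float(np.mean(np.logaddexp(0.0, -y * (X @ w))))

def fit_logistic(X, y, lr=0.5, steps=2000):
    # Plain batch gradient descent on the cross-entropy error above.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        s = y * (X @ w)
        # gradient of E(w): -(1/N) * sum_n y_n x_n * sigmoid(-y_n w.x_n)
        grad = -(X * (y / (1.0 + np.exp(s)))[:, None]).mean(axis=0)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)

def make_data(n):
    # Hypothetical noisy target: labels drawn from a logistic model.
    X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, 2))])  # bias coordinate
    true_w = np.array([0.3, 2.0, -1.5])
    p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
    y = np.where(rng.random(n) < p, 1.0, -1.0)
    return X, y

X_in, y_in = make_data(200)      # training set
X_out, y_out = make_data(1000)   # test set: fresh points from the same target
w = fit_logistic(X_in, y_in)
E_in = cross_entropy_error(w, X_in, y_in)
E_out = cross_entropy_error(w, X_out, y_out)
```

As I understand it, both numbers are on the same scale (e.g. w = 0 gives ln 2 ~ 0.693 for either set), which would seem to make the comparison meaningful, with Ein an optimistically biased estimate of Eout. Is that the right way to read it?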
I appreciate any clarification that may be provided. Thanks. 
#2




Re: How To Interpret Cross Entropy Error Training Versus Test?

